Load table data

Operation name

Load table data

Function overview

Gets data from a file on Google Cloud Storage and writes it to a table on Google BigQuery.
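
As an illustration only (this isn't how the connector is implemented), the operation behaves much like a BigQuery load job that reads a file from a Cloud Storage URI. The sketch below uses the google-cloud-bigquery Python client; the bucket, path, project, dataset, and table names are hypothetical placeholders.

    from google.cloud import bigquery

    # Hypothetical values standing in for the connector properties.
    gcs_uri = "gs://example-bucket/exports/daily/sales.csv"
    table_id = "example-project.example_dataset.example_table"

    client = bigquery.Client()

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,  # Input file format
        autodetect=True,                          # let BigQuery infer the schema
    )

    # Start the load job and wait until it completes.
    load_job = client.load_table_from_uri(gcs_uri, table_id, job_config=job_config)
    load_job.result()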

Properties

Remarks

For details on the use of variables, refer to Variables.

Basic settings

Item name

Required/Optional

Use of variables

Description

Remarks

Name

Required

Not available

Enter a name that is used on the script canvas.

 

Required settings

Item name

Required/Optional

Use of variables

Description

Remarks

Destination

Required

Not available

Select a connection resource.

  • Add: A new connection resource can be added.

  • Edit list: Connection resource settings can be edited in HULFT INTEGRATE > Connections.

 

Bucket name

Required

Available

Select or enter the bucket name.

 

Folder path

Required

Available

Select or enter a folder path.

  • Specify a slash "/" at the beginning and the end.

  • Consecutive slashes "//" can't be included.

Note: Specify an absolute path on Google Cloud Storage for the folder path.

File name

Required

Available

Select or enter the file name.

  • "/" can't be included.

Destination project ID

Required

Available

Select or enter the destination project ID.

 

Destination dataset name

Required

Available

Select or enter the destination dataset name.

 

Destination table name

Required

Available

Select or enter the destination table name.

  • If the specified table doesn't exist, a new table with the specified name is created, and data is written to that table.
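
For illustration only, the settings above combine into a Cloud Storage URI and a fully qualified BigQuery table ID roughly as follows (all names are hypothetical examples):

    # Hypothetical property values for this operation.
    bucket_name = "example-bucket"
    folder_path = "/exports/daily/"          # starts and ends with "/", no "//"
    file_name = "sales.csv"                  # must not contain "/"
    project_id = "example-project"
    dataset_name = "example_dataset"
    table_name = "example_table"

    # Source object on Google Cloud Storage.
    gcs_uri = f"gs://{bucket_name}{folder_path}{file_name}"
    # -> gs://example-bucket/exports/daily/sales.csv

    # Destination table on Google BigQuery.
    table_id = f"{project_id}.{dataset_name}.{table_name}"
    # -> example-project.example_dataset.example_table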

Property actions

Item name

Description

Remarks

Update the list of bucket names

Get buckets in the specified destination and set them in Bucket name.

  • If Destination is specified/edited or a bucket has been added/edited on Google Cloud Storage, the changes can be reflected using this property action.

Update the list of folder paths

Get folder paths in the specified bucket and set them in Folder path.

  • If Bucket name is specified/edited or a folder has been added/edited on Google Cloud Storage, the changes can be reflected using this property action.

  • If a variable is set to Bucket name, this action will be invalid.

Update the list of file names

Get files in the specified folder path and set them in File name.

  • If Folder path is specified/edited or a file has been added/edited on Google Cloud Storage, the changes can be reflected using this property action.

  • If a variable is set to Bucket name or Folder path, this action will be invalid.

Update the list of destination project IDs

Get projects in the specified destination and set them in Destination project ID.

  • If Destination is specified/edited or projects have been added/edited on Google BigQuery, the changes can be reflected using this property action.

Update the list of destination dataset names

Get datasets in the specified project and set them in Destination dataset name.

  • If Destination project ID is specified/edited or a dataset has been added/edited on Google BigQuery, the changes can be reflected using this property action.

  • If a variable is set to Destination project ID, this action will be invalid.

Update the list of destination table names

Get tables in the specified dataset and set them in Destination table name.

  • If Destination dataset name is specified/edited or a table has been added/edited on Google BigQuery, the changes can be reflected using this property action.

  • If a variable is set to Destination project ID or Destination dataset name, this action will be invalid.
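
As a rough illustration of what these list updates correspond to (not the connector's internal code), the equivalent listing calls in the google-cloud-storage and google-cloud-bigquery Python clients look approximately like this; the client setup and the example names are assumptions.

    from google.cloud import bigquery, storage

    storage_client = storage.Client()
    bigquery_client = bigquery.Client()

    # Bucket names available to the connection.
    buckets = [b.name for b in storage_client.list_buckets()]

    # Folder paths and file names under a bucket (Cloud Storage has no real
    # folders, so "folders" are derived from object name prefixes).
    blobs = storage_client.list_blobs("example-bucket", prefix="exports/daily/")
    file_names = [blob.name.rsplit("/", 1)[-1] for blob in blobs]

    # Projects, datasets, and tables visible on Google BigQuery.
    projects = [p.project_id for p in bigquery_client.list_projects()]
    datasets = [d.dataset_id for d in bigquery_client.list_datasets("example-project")]
    tables = [t.table_id for t in bigquery_client.list_tables("example-project.example_dataset")]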

Read settings

Item name

Required/Optional

Use of variables

Description

Remarks

Input file format

Optional

Not available

Select an input file format.

  • CSV (Default): Read data from Google Cloud Storage in CSV format.

  • TSV: Read data from Google Cloud Storage in TSV format.

 

Do not read the first line as a value

Optional

Not available

Select whether to handle the first row of the specified file on Google Cloud Storage as data.

  • Selected: Don't handle the first row as data.

  • Not selected (Default): Handle the first row as data.
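
For illustration, in a BigQuery load job these two settings roughly correspond to the field delimiter and the number of leading rows to skip; the sketch below assumes the google-cloud-bigquery Python client, and the connector's exact behavior may differ.

    from google.cloud import bigquery

    # Hypothetical mapping of the read settings to a load configuration.
    input_file_format = "TSV"    # "CSV" or "TSV"
    skip_first_line = True       # "Do not read the first line as a value"

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,  # TSV is handled as CSV with a tab delimiter
        field_delimiter="\t" if input_file_format == "TSV" else ",",
        skip_leading_rows=1 if skip_first_line else 0,
    )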

 

Write settings

Item name

Required/Optional

Use of variables

Description

Remarks

Write settings of destination tables

Optional

Not available

Select an option to use when writing.

  • Add to table (Default): Add data to the end of the existing table.

  • Overwrite table: Delete all the data in the destination table, then write the data.
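
As a hedged illustration, these two options behave like the standard BigQuery write dispositions; the names below come from the google-cloud-bigquery Python client, not from the connector itself.

    from google.cloud import bigquery

    # "Add to table"    -> append rows to the existing table.
    append_config = bigquery.LoadJobConfig(
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    )

    # "Overwrite table" -> replace the table contents with the loaded data.
    overwrite_config = bigquery.LoadJobConfig(
        write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    )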

 

Comment

Item name

Required/Optional

Use of variables

Description

Remarks

Comment

Optional

Not available

You can write a short description of this connector.

 

Schemas

Input schema

None.

Output schema

None.

Transaction

Transaction isn't supported.

Parallel Stream Processing

PSP isn't supported.

Available component variables

Component variable name

Description

Remarks

message_category

When an error occurs, the category of the message code corresponding to the error is stored.

  • The default value is null.

message_code

When an error occurs, the code of the message code corresponding to the error is stored.

  • The default value is null.

message_level

When an error occurs, the severity of the message code corresponding to the error is stored.

  • The default value is null.

error_type

When an error occurs, the error type is stored.

  • The default value is null.

  • The format of the error type is as follows.

    Example: java.io.FileNotFoundException

error_message

When an error occurs, the error message is stored.

  • The default value is null.

error_trace

When an error occurs, the trace information for the error is stored.

  • The default value is null.
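
Conceptually, a later step can branch on these component variables once the operation finishes. The snippet below is only a Python analogy of such a check; the script canvas uses its own variable syntax, and the dictionary is a hypothetical stand-in.

    # Hypothetical snapshot of the component variables after a failed run.
    component_vars = {
        "message_category": None,
        "message_code": None,
        "message_level": None,
        "error_type": "java.io.FileNotFoundException",  # fully qualified exception class
        "error_message": "(error message text)",
        "error_trace": "(stack trace text)",
    }

    if component_vars["error_type"] is not None:
        # An error occurred: log the details before deciding to retry or stop.
        print(component_vars["error_type"], component_vars["error_message"])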

Message codes, exception messages, and limitations

For message codes, exception messages, and limitations of this operation, refer to Messages and limitations of the Google BigQuery connector.