Create Batch

Operation Name

Create Batch

Function Overview

This operation creates CSV data from the input data and then issues a batch creation request (Adding a Batch to a Job) to the specified job.
Information about the created batch can be obtained from the output schema.

This component performs the Bulk API operation as its smallest unit.
In general, if you use components such as [Bulk Write Data(Insert)], which perform job creation, batch creation, retrieval of batch results, and job closing as a single bulk process, this component does not need to be used.

For the specifications of the API used by this operation, refer to the API document for the version selected in the global resource, available from the link below.
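The following is a minimal sketch, for reference only, of what this operation does at the API level, assuming the Salesforce Bulk API Java client (force-wsc). The endpoint, session ID, job ID, and CSV content are placeholders; in practice the adapter performs the login, CSV generation, and batch splitting internally based on the component properties.

import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;

import com.sforce.async.BatchInfo;
import com.sforce.async.BulkConnection;
import com.sforce.async.JobInfo;
import com.sforce.ws.ConnectorConfig;

public class CreateBatchSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder connection settings; in the adapter these come from
        // the global resource selected in [Destination].
        ConnectorConfig config = new ConnectorConfig();
        config.setSessionId("<session id>");
        config.setRestEndpoint("https://<instance>.salesforce.com/services/async/39.0");
        BulkConnection connection = new BulkConnection(config);

        // CSV data built from the input data; the header row corresponds to
        // the items of [Schema definition].
        String csv = "Id,Name\n001xxxxxxxxxxxxAAA,Updated Name\n";

        // Refer to the existing job by the ID entered in [Job ID].
        JobInfo job = connection.getJobStatus("<job id>");

        // "Adding a Batch to a Job": register the CSV data as one batch.
        BatchInfo batch = connection.createBatchFromStream(
                job, new ByteArrayInputStream(csv.getBytes(StandardCharsets.UTF_8)));

        // These values correspond to the output schema of this component.
        System.out.println(batch.getId());          // Batch ID/batchId
        System.out.println(batch.getState());       // State/state
        System.out.println(batch.getCreatedDate()); // Start time/createdDate
    }
}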

Data Model

The data model of this component is the Table Model type.

Properties

For information about using variables, refer to "variables".
Basic Settings
Item name Required/Optional Use of Variables Description Supplement
Name Required Not Available Enter the name that appears on the script canvas.  
Input Data Required Not Available Select the component on the script canvas.  
Required Settings
Item name Required/Optional Use of Variables Description Supplement
Destination Required Not Available Select Global Resources.
  • [Add...]:
    Add new global resource.
  • [Edit...]:
    Global resource settings can be edited by [Edit Resource list].
 
Job ID Required Available Enter the ID of the job in which the batch will be created.
Table Name Required Not Available Select the Salesforce table name corresponding to the object of the specified job.
Schema definition Required - Specify the items for the header row of the CSV data to be registered with the batch.
  • The required items vary depending on the job operation type (example header rows are shown after this table).
    • In the case of "UPDATE", "ID" needs to be included.
    • In the case of "UPSERT", the "External ID field" specified in the job needs to be included.
    • In the case of "DELETE", only "ID" needs to be specified.
  • Limiting the settings to only the items that need to be written to Salesforce (deleting unnecessary items from the schema definition) helps improve processing performance.
  • Items whose data type is defined as "base64" cannot be handled. If such an item is selected, an error occurs.
Schema definition/Label Required Not Available Display the label name of the column of the table specified in [Table Name].
Schema definition/API Required Not Available Display the API name of the column of the table specified in [Table Name].
Schema definition/Type Required Not Available Display the data type of the column of the table specified in [Table Name].
Relationship definition Optional - If relationship items exist in the schema definition, set the items to be updated by external key.
  • By selecting the external key field of the related object, data can be passed through the relationship.
    For details, please refer to Relationship Definition.
Relationship definition/Base field Required Not Available Display the API name of the relationship item column of the table specified in [Table Name].
Relationship definition/Relationship name Required Not Available Display the relationship name of the relationship item column of the table specified in [Table Name].
Relationship definition/Related Object Optional Not Available Select the API name of the related object for the relationship item column of the table specified in [Table Name].
Relationship definition/Foreign key field Optional Not Available Select the external key field of the related object for the relationship item column of the table specified in [Table Name].
  • If omitted, the ID of the corresponding record of the related object is passed.
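For illustration, the header row of the CSV data for each operation type might look as follows, assuming a hypothetical object whose writable fields are "Name" and "Phone" and whose external ID field is "MyExternalId__c":

  • UPDATE: Id,Name,Phone
  • UPSERT: MyExternalId__c,Name,Phone
  • DELETE: Id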
Property Action
Item name Description Supplement
Create Schema... Allows adding and deleting the items for the header row of the CSV data to be registered with the batch.
Please refer to Create Schema for how to configure it.
 
Option Settings
Item name Required/Optional Use of Variables Description Supplement
Replace null/empty string with #N/A Required Not Available If the input data is null or an empty string, select whether or not to replace it with the string "#N/A" (see the example after this table).
  • [Checked]:(default)
    Replace the value with "#N/A".
  • [Not Checked]:
    Do not replace the value.
  • Due to the API specification, "#N/A" must be sent in order to update a field to null.
Batch size Required Available Enter the upper limit of the number of records that can be registered in one batch.

  • The default value is "2000."
  • You can set a value between 1 and 10000. An error message will be shown if a value outside this range is set.
  • As a specification of this adapter, if 100,000 records are written with a batch size of 2,000, 50 batches will be created.
    However, if a batch would exceed the API limit of 10 MB, the batch will be created with fewer records than the specified batch size.
Column name type Required Not Available Choose how column names are displayed when the schema is shown in mapping.
  • [Label]:
    The label name (item name) defined in Salesforce is displayed.
  • [API] :(default)
    The API reference name defined in Salesforce is displayed.
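For example, with this option checked, an input record whose Phone value is null (the field names here are hypothetical) is written as the following CSV row, and the Phone field of the target record is updated to null:

Id,Name,Phone
001xxxxxxxxxxxxAAA,Example Inc.,#N/A

With the option unchecked, the value is sent as an empty string, which the Bulk API ignores on update, so the existing value of the field is left unchanged.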
 
Property Action
Item name Description Supplement
Display Table Information... You can check the table structure of the object you are operating on.
For more information on how to view the table structure, please refer to Display Table Information.
  • Click [Load All Table Information] if you would like to check the table structure of other objects.
Load All Table Information Retrieves all available table information.
After running, you can verify the retrieved information from [Table Information].

Read schema definition from file... Select a file from the file chooser; the field API names written as comma-separated values on the first line of the file are read and set as the schema definition (an example file is shown after this table).
  • Please specify "UTF-8" encoding for the selected file.
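For example, a schema definition file might consist of a single line of comma-separated field API names (the names below are hypothetical), saved with UTF-8 encoding:

Id,Name,Phone,MyExternalId__c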
Large Data Processing Settings
Item name Required/Optional Use of Variables Description Supplement
Large Data Processing Required Not Available Select the Large Data Processing setting.
  • [Use the setting in the script]:(default)
    Apply the Large Data Processing setting of the script property to the adapter.
  • [Disable Large Data Processing]:
    Large Data Processing is not performed.
  • [Enable Large Data Processing]:
    Large Data Processing is performed.
 
Comment
Item name Required/Optional Use of Variables Description Supplement
Comment Optional Not Available You can write a short description of this adapter.
The description will be reflected in the specifications.
 

Schema

Input Schema

The number of columns varies depending on the [Schema Definition] settings.
Please refer to "Table Model type schema" for Schema Structure information.

Output Schema

Information about the created batch is set.
(Hereafter shown as "Label/API".)

<?xml version="1.0" encoding="UTF-8" ?>
<table>
  <row>
    <column>Batch ID/batchId</column>
    <column>State/state</column>
    <column>State message/stateMessage</column>
    <column>Start time/createdDate</column>
    <column>Input record count/inputRecordCount</column>
  </row>
  <row>
    :
  </row>
</table>
Element Name Column Name(Label/API) Description Supplement
row
-
Repeats as many times as the number of batches created.  
column Batch ID/batchId The batch ID is output.  
State/state The batch state is output.
  • [Queued]:The process has not begun.
  • [InProgress]:Currently processing.
  • [Completed]:The process has completed.
  • [Failed]:The process was not successful. Please check the [stateMessage].
  • [Not Processed]:The job was canceled before processing.
 
State message/stateMessage The batch state message is output.  
Start time/createdDate The start time of the batch is output.  
Input record count/inputRecordCount The number of records input to the batch is output.  

Reading Schema by Mapper

The schema will be read automatically.

Large Data Processing

Large Data Processing is supported.

Transaction

Transaction is not supported.

Usage on PSP Script

Cannot be used on PSP Script.

Available Component variables

Component Variable Name Description Supplement
job_id The ID of the specified job is stored.
  • The default value is null.
read_count The number of input records is stored.
  • The default value is null.
created_batch_count The number of batches created is stored.
  • The default value is null.
server_url The end point URL after Login is stored.
  • The default value is null.
session_id The session Id is stored.
  • The default value is null.
message_category In the case that an error occurs, the category of the message code corresponding to the error is stored.
  • The default value is null.
message_code In the case that an error occurs, the code of the message code corresponding to the error is stored.
  • The default value is null.
message_level In the case that an error occurs, the level of the message code corresponding to the error is stored.
  • The default value is null.
operation_api_exception_code In the case of an API error, the ExceptionCode of the error that occurred is stored.
  • The default value is null.
  • For any error other than an API Error, the value is not stored.
  • The content to be stored may change according to the version of DataSpider Servista.
operation_error_message If an error occurs, the message of the error that occurred is stored.
  • The default value is null.
  • The content to be stored may change according to the version of DataSpider Servista.
operation_error_trace When an error occurs, the trace information of the error that occurred is stored.
  • The default value is null.
  • The content to be stored may change according to the version of DataSpider Servista.

Relationship definition

In previous versions, updating a reference field required obtaining the ID of the referenced object and passing it via the Mapper.
With this setting, passing the value of the external key field of the referenced object makes it possible to obtain the ID of the record that matches the external key and update the field automatically (see the example below).
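As an illustration, assume a custom lookup field "Account__c" whose relationship name is "Account__r", and assume the related Account object has an external ID field "AccountCode__c" (all names are hypothetical). By selecting the Account object as [Related Object] and "AccountCode__c" as [Foreign key field], the CSV is generated with the relationship notation in the header, and the value of the external key is passed instead of the record ID:

Name,Account__r.AccountCode__c
Sample record,ACC-0001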

Create Schema

The data items to be read from and/or written to Salesforce can be specified when creating a schema.
By specifying only the items indispensable for processing, the data traffic to and from Salesforce can be reduced, generally improving the processing performance.
When writing to Salesforce (adding or updating), make sure that "Not Creatable" and "Not Updatable" items are not chosen.



Number in the Image Name Description Supplement
(1) Selected Fields Select the items to be written to Salesforce.
  • By specifying only the items indispensable for processing, the data traffic to and from Salesforce is reduced, improving the processing performance.
(2) UnSelected Fields Select the items not to be written to Salesforce.
  • You can filter fields by inputting or selecting values from the input above the column name.
    Text fields will be filtered as Starting With the input value.

Display Table Information

This display shows the available operations on Salesforce tables as well as field properties of the tables.
Also, the field information can be printed out.



Number in the Image Name Description Supplement
(1) Table Name Select the table whose structure is to be shown.
(2) Table Information Display the available operations on the selected table.
(3) Length Display the number of digits of the item.
(4) External ID Display whether or not the object item is set as an external ID.
(5) Creatable Display whether or not a value can be set when adding data.
(6) Updatable Display whether or not a value can be set when updating data.
(7) Nullable Display whether or not NULL can be set when adding or updating data.
(8) Default value Display whether or not Salesforce automatically sets a default value when adding data.
(9) Reference To Display the name of the referenced object if the item has a reference relationship or a master-detail relationship.

Specification Limits

Main exceptions

Exception Name Reason Resolution
ResourceNotFoundException
Resource Definition is Not Found. Name:[]
[Destination] is not specified. Specify [Destination].
ResourceNotFoundException
Resource Definition is Not Found. Name:[<Global Resource Name>]
The resource definition selected in [Destination] cannot be found. Verify the global resource specified in [Destination].
java.net.UnknownHostException This exception occurs when the proxy server specified in the global resource cannot be found. Verify the condition of the proxy server, or verify [Proxy Host] of the global resource specified in [Destination].
java.net.SocketTimeoutException
connect timed out
A time-out occurred while connecting to Salesforce. Verify the network condition and the Salesforce server condition, or check [Connection timeout(sec)] of the global resource specified in [Destination].
java.net.SocketTimeoutException
Read timed out
A time-out occurred while waiting for a response from the server after connecting to Salesforce. Verify the network condition and the Salesforce server condition, or check [Timeout(sec)] of the global resource specified in [Destination].
jp.co.headsol.salesforce.adapter.exception.SalesforceAdapterIllegalArgumentException Invalid value is set for the property of SalesforceBulk adapter. Check the error message, and verify the settings.
com.sforce.soap.partner.fault.LoginFault Login to Salesforce has failed. Check the ExceptionCode or error message, and refer to the information about this type of error in Salesforce-related documents etc.
com.sforce.async.AsyncApiException An error has occurred in the batch or job executed in the SalesforceBulk adapter. Check the ExceptionCode or error message, and refer to the information about this type of error in Salesforce-related documents etc.