Bulk Write Data(Delete) (deprecated)
Operation Name
Bulk Write Data(Delete)
Function Overview
This operation performs a Delete of the input data using the Bulk API.
Job creation, batch creation, batch result retrieval, and job closing are performed sequentially as a single operation.
In the Delete operation, the Salesforce ID is used as the key item that identifies the records to delete.
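As a rough illustration of this sequence, the following minimal Python sketch calls the XML-based Bulk API directly with the `requests` library. The instance URL, API version, session ID, object name, and record IDs are placeholder assumptions, and the naive response parsing is for brevity only; the connector performs the equivalent steps internally, so this is not its implementation.

```python
import time
import requests

# Placeholder values: in the connector these come from the connection resource.
INSTANCE = "https://yourInstance.salesforce.com"   # assumption
API_VERSION = "47.0"                               # assumption
SESSION_ID = "<session id obtained at login>"      # assumption
BASE = f"{INSTANCE}/services/async/{API_VERSION}"
HEADERS = {"X-SFDC-Session": SESSION_ID}

JOB_XML = """<?xml version="1.0" encoding="UTF-8"?>
<jobInfo xmlns="http://www.force.com/2009/06/asyncapi/dataload">
  <operation>delete</operation>
  <object>Account</object>
  <contentType>CSV</contentType>
</jobInfo>"""

# 1. Create a delete job.
r = requests.post(f"{BASE}/job", data=JOB_XML,
                  headers={**HEADERS, "Content-Type": "application/xml; charset=UTF-8"})
job_id = r.text.split("<id>")[1].split("</id>")[0]   # naive parsing, for brevity

# 2. Add a batch whose CSV contains only the Id column (the key used for deletion).
csv_body = "Id\n001xx0000000001AAA\n001xx0000000002AAA\n"
r = requests.post(f"{BASE}/job/{job_id}/batch", data=csv_body,
                  headers={**HEADERS, "Content-Type": "text/csv; charset=UTF-8"})
batch_id = r.text.split("<id>")[1].split("</id>")[0]

# 3. Close the job so that no more batches can be added.
close_xml = ('<?xml version="1.0" encoding="UTF-8"?>'
             '<jobInfo xmlns="http://www.force.com/2009/06/asyncapi/dataload">'
             '<state>Closed</state></jobInfo>')
requests.post(f"{BASE}/job/{job_id}", data=close_xml,
              headers={**HEADERS, "Content-Type": "application/xml; charset=UTF-8"})

# 4. Poll the batch state, then fetch the per-record results (Id, Success, Created, Error).
while True:
    state = requests.get(f"{BASE}/job/{job_id}/batch/{batch_id}", headers=HEADERS).text
    if "<state>Completed</state>" in state or "<state>Failed</state>" in state:
        break
    time.sleep(10)

results_csv = requests.get(f"{BASE}/job/{job_id}/batch/{batch_id}/result", headers=HEADERS).text
print(results_csv)
```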
For details on the API specification used by this operation, refer to the API documentation for the version selected in the connection resource, available from the following pages.
- "Salesforce Developer Documentation" (https://developer.salesforce.com/docs)
  - Bulk API Developer's Guide
Data Model
The data model of this component is the table model type.
Properties
For details on the use of variables, refer to Variables.
Basic Settings
| Item name | Required/Optional | Use of Variables | Description | Remarks |
|---|---|---|---|---|
| Name | Required | Not Available | Enter a name that is used on the script canvas. | |
| Input Data | Required | Not Available | Select a component on the script canvas. | |
Required Settings
| Item name | Required/Optional | Use of Variables | Description | Remarks |
|---|---|---|---|---|
| Destination | Required | Not Available | Select a connection resource. | |
| Table Name | Required | Not Available | Select the name of a table in Salesforce. | |
| Schema definition | Required | - | "ID" is set as the header row item of the CSV data registered with the batch. This cannot be edited. | |
| Schema definition/Label | Required | Not Available | Displays the label name of the input schema item. | |
| Schema definition/API | Required | Not Available | Displays the API name of the input schema item. | |
| Schema definition/Type | Required | Not Available | Displays the data type of the input schema item. | |
Option Settings
| Item name | Required/Optional | Use of Variables | Description | Remarks |
|---|---|---|---|---|
| Hard Delete(Physical Delete) | Required | Not Available | Select whether to perform a hard delete (HardDelete). | Note: If this is selected, deleted records are not stored in the Recycle Bin but are removed immediately. The "Bulk API Hard Delete" permission is disabled by default and must be enabled by a system administrator. |
| Column name type | Required | Not Available | Select the display type of the column names used when the schema is shown in the Mapper. | |
| Batch size | Required | Available | Enter the maximum number of records that can be registered in one batch. | Note: As a specification of this connector, if 100,000 records are written with a batch size of 2,000, 50 batches are created. If a single batch would exceed the API limit of 10 MB, it is created with fewer records than the specified batch size. (See the sketch following this table.) |
| Concurrency mode | Required | Not Available | Select the concurrency mode of the job. | Note: Parallel processing can cause database contention. When contention is severe, processing may fail. In serial mode, batches are processed reliably one at a time, but the processing time may increase significantly. |
| Obtain the batch results after waiting until the task is completed | Required | Not Available | Select whether to monitor the created batches until they finish and then obtain the batch results. | |
| Waiting time before the monitoring starts (sec) | Optional | Available | Enter the waiting time, in seconds, before batch state monitoring starts. | |
| Batch monitoring intervals (sec) | Optional | Available | Enter the interval, in seconds, at which the connector checks whether the batches have completed and the results can be obtained. | |
| Max. monitoring time (sec) | Required | Not Available | Enter the maximum time, in seconds, to monitor the batch state. | |
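The following minimal Python sketch illustrates how the batch-size and monitoring properties above interact: records are split into batches bounded by both the record count and the 10 MB limit, and batch states are polled with an initial wait, a fixed interval, and an overall time limit. The parameter values and the `get_state` callback are assumptions for illustration, not the connector's implementation.

```python
import time

# Hypothetical parameters mirroring the property names above (values are examples).
BATCH_SIZE = 2000                    # "Batch size": max records per batch
MAX_BATCH_BYTES = 10 * 1024 * 1024   # Bulk API limit of 10 MB per batch
INITIAL_WAIT_SEC = 5                 # "Waiting time before the monitoring starts (sec)"
POLL_INTERVAL_SEC = 10               # "Batch monitoring intervals (sec)"
MAX_MONITORING_SEC = 600             # "Max. monitoring time (sec)"

def split_into_batches(ids):
    """Split record IDs into CSV batches of at most BATCH_SIZE rows and 10 MB each."""
    batches, rows, size = [], ["Id"], len("Id\n")
    for rid in ids:
        line = rid + "\n"
        if len(rows) - 1 >= BATCH_SIZE or size + len(line) > MAX_BATCH_BYTES:
            batches.append("\n".join(rows) + "\n")
            rows, size = ["Id"], len("Id\n")
        rows.append(rid)
        size += len(line)
    if len(rows) > 1:
        batches.append("\n".join(rows) + "\n")
    return batches

def wait_for_batches(get_state, batch_ids):
    """Poll batch states until all finish or the maximum monitoring time elapses."""
    time.sleep(INITIAL_WAIT_SEC)
    deadline = time.monotonic() + MAX_MONITORING_SEC
    pending = set(batch_ids)
    while pending and time.monotonic() < deadline:
        for bid in list(pending):
            if get_state(bid) in ("Completed", "Failed", "Not Processed"):
                pending.discard(bid)
        if pending:
            time.sleep(POLL_INTERVAL_SEC)
    return not pending  # True if every batch finished within the time limit

# Example from the note above: 100,000 IDs with a batch size of 2,000 yields 50 batches.
ids = [f"001xx{i:013d}" for i in range(100_000)]
print(len(split_into_batches(ids)))  # 50
```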
Comment
| Item name | Required/Optional | Use of Variables | Description | Remarks |
|---|---|---|---|---|
| Comment | Optional | Not Available | You can write a short description of this connector. | |
Input Schema
(Hereafter shown as "Label/API")
```xml
<?xml version="1.0" encoding="UTF-8" ?>
<table>
  <row>
    <column>ID/id</column>
  </row>
  <row>
    :
  </row>
</table>
```
| Element Name | Column Name | Description | Remarks |
|---|---|---|---|
| row | | Repeats as many times as the number of input data records. | |
| column | ID/id | Specifies the Salesforce ID of the record to delete. | |
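As an illustration only (not the connector's code), the following sketch shows how rows of the table-model input, each carrying a Salesforce ID, map onto the CSV data registered with a batch. The sample IDs are hypothetical.

```python
# Hypothetical input rows from the preceding component (table model, one ID column per row).
input_rows = [
    {"id": "001xx0000000001AAA"},
    {"id": "001xx0000000002AAA"},
]

# The connector registers CSV data whose header row item is "ID"; each row
# carries the Salesforce ID of a record to delete.
csv_lines = ["ID"] + [row["id"] for row in input_rows]
batch_csv = "\n".join(csv_lines) + "\n"
print(batch_csv)
# ID
# 001xx0000000001AAA
# 001xx0000000002AAA
```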
Output Schema
The batch result information is appended to the columns of the input schema.
(Hereafter shown as "Label/API")
```xml
<?xml version="1.0" encoding="UTF-8" ?>
<table>
  <row>
    <column>ID/Id</column>
    <column>ResultInformation_InputRowNumber/RESULT_inputRowNumber</column>
    <column>ResultInformation_Batch_ID/RESULT_BATCH_id</column>
    <column>ResultInformation_Batch_State/RESULT_BATCH_state</column>
    <column>ResultInformation_Batch_StateMessage/RESULT_BATCH_stateMessage</column>
    <column>ResultInformation_Record_ID/RESULT_ROW_id</column>
    <column>ResultInformation_Record_SuccessFlag/RESULT_ROW_success</column>
    <column>ResultInformation_Record_CreationFlag/RESULT_ROW_created</column>
    <column>ResultInformation_Record_ErrorMessage/RESULT_ROW_error</column>
  </row>
  <row>
    :
  </row>
</table>
```
| Element Name | Column Name (Label/API) | Description | Remarks |
|---|---|---|---|
| row | | Repeats as many times as the number of records given to the input schema. | |
| column | ResultInformation_InputRowNumber/RESULT_inputRowNumber | A number based on the position of the record in the data given to the input schema is output. | |
| | ResultInformation_Batch_ID/RESULT_BATCH_id | The ID of the batch that processed the record is output. | |
| | ResultInformation_Batch_State/RESULT_BATCH_state | The state of the batch that processed the record is output. | |
| | ResultInformation_Batch_StateMessage/RESULT_BATCH_stateMessage | The state message of the batch that processed the record is output. | |
| | ResultInformation_Record_ID/RESULT_ROW_id | The record ID is output. | |
| | ResultInformation_Record_SuccessFlag/RESULT_ROW_success | The record's success flag is output. | |
| | ResultInformation_Record_CreationFlag/RESULT_ROW_created | The record's creation flag is output. | |
| | ResultInformation_Record_ErrorMessage/RESULT_ROW_error | The record's error message is output. If the batch results cannot be obtained, the state message of the batch that processed the record is output. | |
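Downstream processing usually needs only the success flag and the error message. The following sketch uses illustrative, hypothetical output rows (keyed by the API column names above) to pick out the records whose deletion failed; it is not part of the connector itself.

```python
# Hypothetical output rows as produced by this component (API column names).
result_rows = [
    {"Id": "001xx0000000001AAA", "RESULT_inputRowNumber": 1,
     "RESULT_BATCH_id": "751xx0000000001", "RESULT_BATCH_state": "Completed",
     "RESULT_BATCH_stateMessage": "", "RESULT_ROW_id": "001xx0000000001AAA",
     "RESULT_ROW_success": "true", "RESULT_ROW_created": "false",
     "RESULT_ROW_error": ""},
    {"Id": "001xx0000000002AAA", "RESULT_inputRowNumber": 2,
     "RESULT_BATCH_id": "751xx0000000001", "RESULT_BATCH_state": "Completed",
     "RESULT_BATCH_stateMessage": "", "RESULT_ROW_id": "",
     "RESULT_ROW_success": "false", "RESULT_ROW_created": "false",
     "RESULT_ROW_error": "ENTITY_IS_DELETED:entity is deleted:--"},  # example message
]

# Collect the records whose deletion failed, keyed by their position in the input data.
failures = [(r["RESULT_inputRowNumber"], r["Id"], r["RESULT_ROW_error"])
            for r in result_rows if r["RESULT_ROW_success"] != "true"]
for row_no, record_id, error in failures:
    print(f"input row {row_no}: {record_id} failed: {error}")
```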
Loading schema in Mapper
The schema is loaded automatically.
For details, refer to Edit Schema.
Mass data processing
Mass data processing isn't supported.
Transaction
Transaction isn't supported.
Parallel Stream Processing
PSP isn't supported.
Available Component variables
| Component Variable Name | Description | Remarks |
|---|---|---|
| job_id | The ID of the created job is stored. | |
| read_count | The number of input data records is stored. | |
| created_batch_count | The number of created batches is stored. | |
| get_result_success_count | The number of records that were processed successfully is stored. | |
| get_result_error_count | The number of records that failed to be processed is stored. | |
| server_url | The endpoint URL after login is stored. | |
| session_id | The session ID is stored. | |
| message_category | When an error occurs, the category of the message code corresponding to the error is stored. | |
| message_code | When an error occurs, the message code corresponding to the error is stored. | |
| message_level | When an error occurs, the severity of the message code corresponding to the error is stored. | |
| operation_api_exception_code | When an API error occurs, the ExceptionCode of the error is stored. | |
| operation_error_message | When an error occurs, the error message is stored. | |
| operation_error_trace | When an error occurs, the trace information of the error is stored. | |
Message codes, exception messages, and limitations
| Connector | Message code | Error message | Limitations |
|---|---|---|---|
| Salesforce(deprecated) | Refer to "Messages and limitations of the Salesforce(deprecated) connector". | | |