Messages and limitations of the Salesforce(deprecated) connector

Message code list of Salesforce(deprecated) category

These message codes correspond to errors that occur in the Salesforce(deprecated) connector.

 

SALESFORCE0001E

This message code is output when input values for the Salesforce(deprecated) connector are invalid.

Cause: Input values for the Salesforce(deprecated) connector are invalid.

Solution: Check the settings for the Salesforce(deprecated) connector according to the error message.

 

SALESFORCE0002E

This message code is output when the execution of a script for the Salesforce(deprecated) connector ends in an invalid state.

Cause: The execution of a script for the Salesforce(deprecated) connector ended in an invalid state.

Solution: Check the connection environment, the settings for Salesforce, and the input data according to the error message.

 

SCRIPT0003E

This message code is output when the settings for the script are incorrect.

Cause: The settings for the script are incorrect.

Solution: Modify the script according to the error message.

 

SALESFORCE0000E

This message code is output in the case of errors other than the above.

Cause: An error other than the above occurred. For example, this code is output when an error occurs in the subquery checks for the following operations:

  • Read Data (execute SOQL)

  • Bulk Get Data(Query)

Solution: Check the error message or the help of the operation in which the error occurred.

 

Error messages of Salesforce(deprecated) category

Error message: Resource is not defined. Name:[RESOURCE_NAME]
Cause: The "Execute SOQL to set schema" or "Extraction test" button was clicked without a destination specified.
Solution: Select a destination.

Error message: Subqueries are not allowed.
Cause: Subqueries are included in the SOQL query.
Solution: Write SOQL queries without subqueries.
Remarks: If "from" is used twice or more in a SOQL query, the query is determined to include a subquery. (See the illustrative sketch below.)
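
The remark above describes the subquery check simply as counting occurrences of "from" in the SOQL text. As an illustration only (this is not the connector's actual implementation, and the class and method names are hypothetical), a check along those lines flags the second query below but not the first:

    // Illustrative sketch: naive detection that treats any SOQL text containing
    // the keyword "from" two or more times as a query with a subquery.
    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    public class SubqueryCheckDemo {
        static boolean looksLikeSubquery(String soql) {
            Matcher m = Pattern.compile("\\bfrom\\b", Pattern.CASE_INSENSITIVE).matcher(soql);
            int count = 0;
            while (m.find()) {
                count++;
            }
            return count >= 2;
        }

        public static void main(String[] args) {
            String flat = "SELECT Id, Name FROM Account";
            String nested = "SELECT Id, (SELECT LastName FROM Contacts) FROM Account";
            System.out.println(looksLikeSubquery(flat));   // false
            System.out.println(looksLikeSubquery(nested)); // true
        }
    }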

Error message: The value of [SOQL] is not yet set.
Cause: The field for SOQL is blank.
Solution: Enter a value in the field for SOQL, which is a required field.

Error message: The [SCHEMA] in the [LINE_NUMBER] line is not set.
Cause: As shown in the message.
Solution: Enter the schema name.

Error message: A duplicate schema has been set. Schema Name[SCHEMA_NAME]
Cause: As shown in the message.
Solution: Do not set schemas with the same name.

Error message: A table name is not selected.
Cause: The field for the table name is not set.
Solution: Set the table name.

Error message: The [TABLE_NAME] set in the table name is not defined in Salesforce.
Cause: The specified table name does not exist on Salesforce.
Solution: Check whether the specified table name exists on Salesforce.
Remarks: Tables may change or be deleted after they are set. Only an existing table can be set from the screen.

Error message: Table [TABLE_NAME] is not supported to execute [TARGET_PROCESSING].
Cause: You do not have permission for the relevant table on Salesforce.
Solution: Check whether you have permission for the relevant operation on the Salesforce table specified in [Table name].

Error message: The schema definition has not been set.
Cause: The schema definition has not been set in the properties window.
Solution: Set the schema definition and execute the script.

Error message: The API [FIELD_NAME] set in the schema definition is not defined in Salesforce.
Cause: The API field name that is set in the schema definition in the properties window does not exist on Salesforce.
Solution: Check the table on Salesforce and set an existing API field name in the schema definition.

Error message: The Salesforce API returned an error. Error code:[ERROR_CODE] Error message:[ERROR_MESSAGE]
Cause: An error occurred in the SOAP API or the Bulk API.
Solution: Check the error code and the error message.
Remarks: "SOAP API Developer Guide > ExceptionCode" (https://developer.salesforce.com/docs/atlas.en-us.api.meta/api/sforce_api_calls_concepts_core_data_objects.htm#statuscode), "Bulk API 2.0 and Bulk API Developer Guide > Errors" (https://developer.salesforce.com/docs/atlas.en-us.234.0.api_asynch.meta/api_asynch/asynch_api_reference_errors.htm)

Error message: An error has occurred. Error message:[ERROR_MESSAGE]
Cause: An error occurred due to a cause other than the above.
Solution: Check the error message and then check the script and each of the settings.

Error message: The number of records of process object data based on the input data and the number of update results do not match. An API probe is necessary.
Cause: The number of input data items and the number of updated data items (including added data and deleted data) do not match.
Solution: The behavior of the Salesforce API may be incorrect. Check the data before and after updating, and check the log for problems with the update.

Error message: The maximum monitoring time for the batch has been exceeded. Maximum time (seconds) [MAXIMUM_MONITORING_TIME]
Cause: The batch did not finish within the maximum monitoring period.
Solution: Increase the maximum monitoring period in the option settings, or change the settings for SOQL, the input data, or the batch size so that the processing finishes within the maximum monitoring period.

Error message: The number of batches created does not match the number of batches returned from Salesforce. Number of batches created [NUMBER_OF_BATCHES_CREATED] Number of batches returned [NUMBER_OF_RETURN_BATCHES]
Cause: Some executed batches were not returned as completed batches from Salesforce.
Solution: Check whether there are batches that are taking a long time to process, or whether an error occurred on the Salesforce side.

Error message: For the column API name for relation, select the item of the object to which you want to relate.
Cause: A field name that does not exist is specified in the field API for relation in the schema definition (external ID).
Solution: Check the table on Salesforce and set an existing API field name for the foreign key field.
Remarks: Only valid API fields can be set; the error may have occurred due to changes in the table after the script was set. "Column relation API name" in the error message has the same meaning as "Item API for relations" on the screen.

Error message: The update key has not been set in the schema definition. Please check the setting of the update key (true, false) in the schema.
Cause: An update key is not set for the Upsert operation.
Solution: Set an update key.

Error message: Multiple update keys have been specified in the schema definition. Please check the setting of the update key (true, false) in the schema.
Cause: Two or more update keys are set for the Upsert operation.
Solution: Specify only one update key.

Error message: The item set as the update key in the schema definition [DISPLAY_LABEL_NAME,ITEM_NAME] is not set as an external ID in Salesforce.
Cause: The field that is set as the update key for the Upsert operation is not set as an external ID.
Solution: Set the API field name "Id" or a field with "External ID: ✓ (checked)" as the update key.

Error message: There is an invalid date string.
Cause: The format of a date input value used in script execution cannot be converted to a date.
Solution: Change the input value to a format that can be used for dates.

Error message: While the number of columns in the configured table is [NUMBER_OF_COLUMNS], data with the number of columns [NUMBER_OF_COLUMNS] was detected in the [NUMBER_OF_LINES] row of the input data.
Cause: The number of fields that are set in the schema definition and the number of fields in the input data do not match.
Solution: Use the same number of columns for the input data and for the table.

Error message: The password has expired.
Cause: The password for the Salesforce account has expired.
Solution: Enable the password for the Salesforce account by changing it or by other means.

Exception messages of Salesforce(deprecated) category

Exception name: ResourceNotFoundException (Resource definition could not be found. Name: [])
Cause: Destination is not specified.
Solution: Specify Destination.

Exception name: ResourceNotFoundException (Resource definition could not be found. Name: [<connection resource name>])
Cause: The resource definition selected in Destination is not found.
Solution: Check the connection resource specified in Destination.

Exception name: java.net.UnknownHostException
Cause: The PROXY server specified in the connection resource cannot be found.
Solution: Verify the condition of the PROXY server, or check Proxy host for the connection resource specified in Destination.

Exception name:
  • API 23.0 or earlier: org.apache.commons.httpclient.HttpConnection$ConnectionTimeoutException
  • API 26.0 or later: java.net.SocketTimeoutException (connect timed out)
Cause: A time-out occurred while connecting to Salesforce.
Solution: Verify the network condition and the Salesforce server condition, or check Connection timeout (sec) for the connection resource specified in Destination.

Exception name:
  • API 23.0 or earlier: org.apache.commons.httpclient.HttpRecoverableException (java.net.SocketTimeoutException: Read timed out)
  • API 26.0 or later: java.net.SocketTimeoutException (Read timed out)
Cause: A time-out occurred while waiting for a response from the server after connecting to Salesforce.
Solution: Verify the network condition and the Salesforce server condition, or check Timeout (sec) for the connection resource specified in Destination.

Exception name: java.net.SocketTimeoutException (connect timed out)
Cause: A time-out occurred while connecting to Salesforce.
Solution: Verify the network condition and the Salesforce server condition, or check Connection timeout (sec) for the connection resource specified in Destination.

Exception name: java.net.SocketTimeoutException (Read timed out)
Cause: A time-out occurred while waiting for a response from the server after connecting to Salesforce.
Solution: Verify the network condition and the Salesforce server condition, or check Timeout (sec) for the connection resource specified in Destination.

Exception name: jp.co.headsol.salesforce.adapter.exception.SalesforceAdapterIllegalArgumentException
Cause: An invalid value is set for a property of the Salesforce connector.
Solution: Check the error message and verify the settings.

Exception name: jp.co.headsol.salesforce.adapter.exception.SalesforceAdapterIllegalArgumentException
Cause: An invalid value is set for a property of the Salesforce Bulk connector.
Solution: Check the error message and verify the settings.

Exception name: com.sforce.soap.partner.fault.LoginFault
Cause: Login to Salesforce has failed.
Solution: Check the ExceptionCode or error message, and refer to the information about this type of error in the Salesforce documentation.

Exception name: com.sforce.soap.partner.fault.UnexpectedErrorFault
Cause: An unexpected error occurred while processing on Salesforce.
Solution: Check the ExceptionCode or error message, and refer to the information about this type of error in the Salesforce documentation.

Exception name: com.sforce.soap.partner.fault.InvalidFieldFault
Cause: An item included in the executed SOQL is invalid.
Solution: Check the ExceptionCode or error message, and refer to the information about this type of error in the Salesforce documentation.

Exception name: com.sforce.soap.partner.fault.MalformedQueryFault
Cause: The executed SOQL is invalid.
Solution: Check the ExceptionCode or error message, and refer to the information about this type of error in the Salesforce documentation.

Exception name: com.sforce.async.AsyncApiException
Cause: An error occurred in the batch or job executed in the Salesforce Bulk connector.
Solution: Check the ExceptionCode or error message, and refer to the information about this type of error in the Salesforce documentation.

Limitations of Salesforce(deprecated) category

Read Data(execute SOQL)

Specification limits

  • To read and write data with this connector, API access must be enabled for the Salesforce organization.

  • All permissions for reading/writing data and other operations, as well as data access control, depend on the profile, sharing rule settings, and API specifications in Salesforce.

  • The objects and fields that can be used may differ depending on the API version. For details, refer to the Salesforce documentation.

  • The COUNT() aggregate function cannot be used. Use COUNT(fieldName) instead; COUNT(Id) returns the correct number of records.

  • Values cannot be retrieved from address type fields.

  • For a junctionIdList field, only the first record ID is output.

  • Values of location type fields cannot be retrieved after API 34.0.

Notes

  • When using aggregate functions, values in the log files may appear in exponential notation.

Write Data(Insert)

Specification limits

  • To read and write data with this connector, API access must be enabled for the Salesforce organization.

  • All permissions for reading/writing data and other operations, as well as data access control, depend on the profile, sharing rule settings, and API specifications in Salesforce.

  • The objects and fields that can be used may differ depending on the API version. For details, refer to the Salesforce documentation.

Notes

  • Unlike the database connectors, this operation cannot roll back when an error occurs. Records that succeed are updated, records that fail are left unchanged, and a WARN-level processing log is output.

    Therefore, errors in individual records (constraint violations, invalid values, etc.) cannot be detected by monitoring exceptions in a script.

    If you want to use the error information for each record as data, select Get in Whether or not to get result per rows and use the result data for error detection in subsequent processing.

  • If the foreign key field in the relationship definition is a numeric type and a value beyond the precision of the Java Double data type is written, the value will be rounded.
    Specify the field as a text type to avoid rounding.

  • With the following API versions, if a value is written beyond the precision of the Java Double data type, the value will be rounded (see the sketch after this list).

    For example, if the value 1,234,567,890,123,456.78 is written to a numeric (16,2) field, it is saved as 1,234,567,890,123,456.80.

    • API 7.0

    • API 14.0

    • API 18.0

    • API 23.0

    To write the value without rounding, use an API version newer than the above.

  • The data type is defined for the fields of the object. The API sends and receives data according to the defined data type format rules, but this connector extends the rules for some data types.

    = Remarks =

    For details, refer to Data type format.
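
The rounding in the example above follows from the precision of the Java Double type, which carries roughly 15 to 16 significant decimal digits. A minimal, self-contained sketch of what happens to the example value when it passes through a double (the class name is hypothetical, and this is not connector code):

    import java.math.BigDecimal;

    public class DoublePrecisionDemo {
        public static void main(String[] args) {
            // The exact decimal value from the example above
            BigDecimal exact = new BigDecimal("1234567890123456.78");

            // The same value after passing through a Java double: the nearest
            // representable double is 1234567890123456.75, so the ".78" is lost,
            // and the example above shows it ending up stored as ...456.80
            double asDouble = exact.doubleValue();
            System.out.println(new BigDecimal(asDouble)); // 1234567890123456.75
        }
    }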

Write Data(Update)

Specification limits

  • To read and write data with this connector, API access must be enabled for the Salesforce organization.

  • All permissions for reading/writing data and other operations, as well as data access control, depend on the profile, sharing rule settings, and API specifications in Salesforce.

  • The objects and fields that can be used may differ depending on the API version. For details, refer to the Salesforce documentation.

Notes

  • Unlike the database connectors, this operation cannot roll back when an error occurs. Records that succeed are updated, records that fail are left unchanged, and a WARN-level processing log is output.

    Therefore, errors in individual records (constraint violations, invalid values, etc.) cannot be detected by monitoring exceptions in a script.

    If you want to use the error information for each record as data, select Get in Whether or not to get result per rows and use the result data for error detection in subsequent processing.

  • If the foreign key field in the relationship definition is a numeric type and a value beyond the precision of the Java Double data type is written, the value will be rounded.
    Specify the field as a text type to avoid rounding.

  • With the following API versions, if a value is written beyond the precision of the Java Double data type, the value will be rounded.

    For example, if the value 1,234,567,890,123,456.78 is written to a numeric (16,2) field, it is saved as 1,234,567,890,123,456.80.

    • API 7.0

    • API 14.0

    • API 18.0

    • API 23.0

    To write the value without rounding, use an API version newer than the above.

  • With the following API versions, a MALFORMED_QUERY error occurs when Writing method is set to Update only and the key in the schema definition is set to true for a non-Id field, or when Writing method is set to Update and Insert, and a value beyond the precision of the Java Integer type is input to the field specified as the key (see the sketch after this list).

    • API 7.0

    • API 14.0

    • API 18.0

    • API 23.0

    Choose a different field as the key, or use an API version newer than the above.

  • With the following API versions, if Writing method is set to Update and Insert and the relationship field definition is used for the Reference Type, an INVALID_FIELD error is returned when the record's reference type value is set to null or an empty string.

    • API 14.0

    • API 18.0

    • API 23.0

    To write the value as null or an empty string, use an API version newer than the above.

  • The data type is defined for the fields of the object. The API sends and receives data according to the defined data type format rules, but this connector extends the rules for some data types.

    = Remarks =

    For details, refer to Data type format.
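
For reference, "the precision of a Java Integer" in the note above corresponds to the 32-bit signed integer range, whose maximum is 2,147,483,647. A minimal sketch illustrating that limit (the class name is hypothetical, and this is not connector code):

    public class IntegerKeyRangeDemo {
        public static void main(String[] args) {
            // The largest value a Java int (Integer) can hold
            System.out.println(Integer.MAX_VALUE); // 2147483647

            // A key value such as 3000000000 exceeds that range and cannot be parsed as an int
            try {
                Integer.parseInt("3000000000");
            } catch (NumberFormatException e) {
                System.out.println("out of int range: " + e.getMessage());
            }
        }
    }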

Write Data(Delete)

Specification limits

  • To read and write data with this connector, API access must be enabled for the Salesforce organization.

  • All permissions for reading/writing data and other operations, as well as data access control, depend on the profile, sharing rule settings, and API specifications in Salesforce.

  • The objects and fields that can be used may differ depending on the API version. For details, refer to the Salesforce documentation.

Notes

  • Unlike the database connectors, this operation cannot roll back when an error occurs. Records that succeed are updated, records that fail are left unchanged, and a WARN-level processing log is output.

    Therefore, errors in individual records (constraint violations, invalid values, etc.) cannot be detected by monitoring exceptions in a script.

    If you want to use the error information for each record as data, select Get in Whether or not to get result per rows and use the result data for error detection in subsequent processing.

  • With the following API versions, a MALFORMED_QUERY error occurs when Writing method is set to Update only and the key in the schema definition is set to true for a non-Id field, or when Writing method is set to Update and Insert, and a value beyond the precision of the Java Integer type is input to the field specified as the key.

    • API 7.0

    • API 14.0

    • API 18.0

    • API 23.0

    Choose a different field as the key, or use an API version newer than the above.

  • The data type is defined for the fields of the object. The API sends and receives data according to the defined data type format rules, but this connector extends the rules for some data types.

    = Remarks =

    For details, refer to Data type format.

Write Data(Upsert)

Specification limits

  • To read and write data with this connector, API access must be enabled for the Salesforce organization.

  • All permissions for reading/writing data and other operations, as well as data access control, depend on the profile, sharing rule settings, and API specifications in Salesforce.

  • The objects and fields that can be used may differ depending on the API version. For details, refer to the Salesforce documentation.

Notes

  • Unlike the database connectors, this operation cannot roll back when an error occurs. Records that succeed are updated, records that fail are left unchanged, and a WARN-level processing log is output.

    Therefore, errors in individual records (constraint violations, invalid values, etc.) cannot be detected by monitoring exceptions in a script.

    If you want to use the error information for each record as data, select Get in Whether or not to get result per rows and use the result data for error detection in subsequent processing.

  • If the foreign key field in the relationship definition is a numeric type and a value beyond the precision of the Java Double data type is written, the value will be rounded.

    Specify the field as a text type to avoid rounding.

  • If the external key field is a numeric type and a value beyond the precision of the Java Double data type is written, the value will be rounded.

    Specify the field as a text type to avoid rounding.

  • With the following API versions, if a value is written beyond the precision of the Java Double data type, the value will be rounded.

    For example, if the value 1,234,567,890,123,456.78 is written to a numeric (16,2) field, it is saved as 1,234,567,890,123,456.80.

    • API 7.0

    • API 14.0

    • API 18.0

    • API 23.0

    To write the value without rounding, use an API version newer than the above.

  • The data type is defined for the fields of the object. The API sends and receives data according to the defined data type format rules, but this connector extends the rules for some data types.

    = Remarks =

    For details, refer to Data type format.

Read Data(Child-Parent Relation)

Specification limits

  • To read and write data with this connector, API access must be enabled for the Salesforce organization.

  • All permissions for reading/writing data and other operations, as well as data access control, depend on the profile, sharing rule settings, and API specifications in Salesforce.

  • The objects and fields that can be used may differ depending on the API version. For details, refer to the Salesforce documentation.

  • Values cannot be retrieved from address type fields.

  • Values of location type fields cannot be retrieved after API 34.0.

Notes

  • With the following API versions, the field value is obtained as null when a formula uses a child-parent relationship field or when a child-parent relationship field is used as an aggregation condition in a GROUP BY clause.

    • API 18.0

    • API 23.0

    To retrieve the field value, use an API version newer than the above.

  • With the following API versions, the field value is obtained as null when the "toLabel" or "convertCurrency" function is used.

    • API 18.0

    • API 23.0

    • API 26.0

    To retrieve the field value, use an API version newer than the above.

Bulk Write Data(Insert)

Specification limits

  • To read and write data with this connector, API access must be enabled for the Salesforce organization.

  • All permissions for reading/writing data and other operations, as well as data access control, depend on the profile, sharing rule settings, and API specifications in Salesforce.

  • The objects and fields that can be used may differ depending on the API version. For details, refer to the Salesforce documentation.

  • Items whose data type is "base64" cannot be handled by this component.

  • This connector uses the job content type "CSV".

    It is not possible to create a job with another content type, or to create batches for jobs with other content types.

Notes

  • Unlike the database connectors, this operation cannot roll back when an error occurs. Records that succeed are updated while records that fail are left unchanged.

    Therefore, errors in individual records (constraint violations, invalid values, etc.) cannot be detected by monitoring exceptions in a script.

    If you want to use the error information for each record as data, check Obtain the batch results after waiting until the task is completed and use the result data for error detection in subsequent processing.

  • If a value is written beyond the precision of the Java Double data type, the value will be rounded by the Bulk API.

    For example, if the value 1,234,567,890,123,456.78 is written to a numeric (16,2) field, it is saved as 1,234,567,890,123,456.80.

  • The data type is defined for the fields of the object. The API sends and receives data according to the defined data type format rules, but this connector extends the rules for some data types.

    = Remarks =

    For details, refer to Data type format.

Bulk Write Data(Update)

Specification limits

  • To read and write data with this connector, API access must be enabled for the Salesforce organization.

  • All permissions for reading/writing data and other operations, as well as data access control, depend on the profile, sharing rule settings, and API specifications in Salesforce.

  • The objects and fields that can be used may differ depending on the API version. For details, refer to the Salesforce documentation.

  • Items whose data type is "base64" cannot be handled by this component.

  • This connector uses the job content type "CSV".

    It is not possible to create a job with another content type, or to create batches for jobs with other content types.

Notes

  • Unlike the database connectors, this operation cannot roll back when an error occurs. Records that succeed are updated while records that fail are left unchanged.

    Therefore, errors in individual records (constraint violations, invalid values, etc.) cannot be detected by monitoring exceptions in a script.

    If you want to use the error information for each record as data, check Obtain the batch results after waiting until the task is completed and use the result data for error detection in subsequent processing.

  • If a value is written beyond the precision of the Java Double data type, the value will be rounded by the Bulk API.

    For example, if the value 1,234,567,890,123,456.78 is written to a numeric (16,2) field, it is saved as 1,234,567,890,123,456.80.

  • The data type is defined for the fields of the object. The API sends and receives data according to the defined data type format rules, but this connector extends the rules for some data types.

    = Remarks =

    For details, refer to Data type format.

Bulk Write Data(Delete)

Specification limits

  • To read and write data with this connector, API access must be enabled for the Salesforce organization.

  • All permissions for reading/writing data and other operations, as well as data access control, depend on the profile, sharing rule settings, and API specifications in Salesforce.

  • The objects and fields that can be used may differ depending on the API version. For details, refer to the Salesforce documentation.

  • This connector uses the job content type "CSV".

    It is not possible to create a job with another content type, or to create batches for jobs with other content types.

Notes

  • Unlike the database connectors, this operation cannot roll back when an error occurs. Records that succeed are updated while records that fail are left unchanged.

    Therefore, errors in individual records (constraint violations, invalid values, etc.) cannot be detected by monitoring exceptions in a script.

    If you want to use the error information for each record as data, check Obtain the batch results after waiting until the task is completed and use the result data for error detection in subsequent processing.

Bulk Write Data(Upsert)

Specification limits

  • To read and write data with this connector, API access must be enabled for the Salesforce organization.

  • All permissions for reading/writing data and other operations, as well as data access control, depend on the profile, sharing rule settings, and API specifications in Salesforce.

  • The objects and fields that can be used may differ depending on the API version. For details, refer to the Salesforce documentation.

  • Items whose data type is "base64" cannot be handled by this component.

  • This connector uses the job content type "CSV".

    It is not possible to create a job with another content type, or to create batches for jobs with other content types.

Notes

  • Unlike the database connectors, this operation cannot roll back when an error occurs. Records that succeed are updated while records that fail are left unchanged.

    Therefore, errors in individual records (constraint violations, invalid values, etc.) cannot be detected by monitoring exceptions in a script.

    If you want to use the error information for each record as data, check Obtain the batch results after waiting until the task is completed and use the result data for error detection in subsequent processing.

  • If a value is written beyond the precision of the Java Double data type, the value will be rounded by the Bulk API.

    For example, if the value 1,234,567,890,123,456.78 is written to a numeric (16,2) field, it is saved as 1,234,567,890,123,456.80.

  • The data type is defined for the fields of the object. The API sends and receives data according to the defined data type format rules, but this connector extends the rules for some data types.

    = Remarks =

    For details, refer to Data type format.

Bulk Get Data(Query)

Specification limits

  • To read and write data with this connector, API access must be enabled for the Salesforce organization.

  • All permissions for reading/writing data and other operations, as well as data access control, depend on the profile, sharing rule settings, and API specifications in Salesforce.

  • The objects and fields that can be used may differ depending on the API version. For details, refer to the Salesforce documentation.

  • This connector uses the job content type "CSV".

    It is not possible to create a job with another content type, or to create batches for jobs with other content types.

Notes

  • If a query returns zero (0) records, a message file containing "Records not found for this query" is stored on the Salesforce server instead of a CSV file.

    In this case, this operation does not output a CSV file.

  • If the batch process ends in a state other than "Completed", this operation does not output a CSV file (see the sketch after this list).

  • If an unexpected error occurs while writing to a CSV file, the content written up to the point of failure is not rolled back. To enable rollback, use the [Transaction] functionality.

  • The format of the data written to the CSV file depends on the API specification.
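
The "Completed" state referred to above is a batch state of the Salesforce Bulk API. As a minimal sketch only (not connector code), assuming the Salesforce Bulk API Java client (the com.sforce.async package, also referenced in the exception list above) and an already-executed query job and batch, a caller could check whether a CSV result can be expected as follows; the class and method names here are hypothetical:

    import com.sforce.async.AsyncApiException;
    import com.sforce.async.BatchInfo;
    import com.sforce.async.BatchStateEnum;
    import com.sforce.async.BulkConnection;

    public class BulkQueryResultCheck {
        // Returns true only when the batch reached the Completed state,
        // i.e. the only case in which a CSV result file is produced.
        static boolean hasCsvResult(BulkConnection connection, String jobId, String batchId)
                throws AsyncApiException {
            BatchInfo info = connection.getBatchInfo(jobId, batchId);
            return info.getState() == BatchStateEnum.Completed;
        }
    }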