Testing Framework

The testing framework provides a way to create and execute tests for scripts developed in DataSpider Servista. It consists of functions that support testing and a guideline for utilizing those functions.
For more details on the advantages of using the framework, refer to “Testing Script”.

The testing framework provides the following functions.

Function Purpose
Test project Separate production projects from tests
Call script of other project Test a production script before registering it as a service
Assertion adapter Verify data
Batch run scripts Execute tests
Output test result report Automated reporting

The “Guideline for Testing Framework” below describes how to utilize these functions.

Target of the testing framework

This framework is intended for script developers and assumes use in unit testing during script development.

Other testing policies are not defined in this framework. Define them yourself as necessary.

Description of terms

The following terms are used on this page.

Term Description
Production script Script to be used in service operation.
Production project Project to be used in service operation.
Test script Script to test production scripts.
Test project Project to test production projects.
It has functions specialized for testing.

Guideline for Testing Framework

This guideline assumes the following service development procedure. Among these phases, this page describes the test phase.
Refer to “Service Development” for the development phase and to “Service Operation” for registering services.



The contents of the balloon in the figure above show the structure of the test phase.
The items and their corresponding functions are described below.

No. Item Description Corresponding function
1. Create test project Create a test project to test the scripts of a production project. Test project
2. Create test script Create test scripts that call the developed production scripts and describe the tests. Also, correct production scripts as necessary. Call script of other project / Assertion adapter
3. Batch run test scripts Batch run the test scripts created in No. 2. Batch run scripts
4. Check test result report Check the contents of the test result report generated by the batch run in No. 3. Correct test scripts and production scripts according to the result. Output test result report

The details of the items are described below.

Create test project

Create a project to test production scripts. This project is called a test project.
  1. Open the “New project” window.

  2. Check [Create test project] and create a test project.
    A test project is displayed with a green icon to distinguish it from normal projects.

About test project

In a test project, you can use the following dedicated functions that support testing.
  • Call script of other project (Call Script operation)
  • Batch run scripts
  • Output test result report
Using these functions, which are not available in normal projects, you can test production scripts.

Relationship between production project and test project

There is no functional restriction requiring a one-to-one relationship between production projects and test projects.
However, a one-to-one relationship is recommended because it provides administrative advantages. For example, when a production project is named “PRJ-001” and its test project is named “PRJ-001_TEST”, the test target is immediately obvious.

Create test script

Create a script to test a production script.
  1. Create a script in the test project.

  2. Drag “Call Script” from the “Basic” - “Processing” category of the tool palette to the script canvas.

  3. When the properties window of the Call Script operation opens, select [Script of other project].

  4. Select the production project and the script to be tested in [Project] and [Script], respectively.

  5. Describe in the test script how to verify (assert) the behavior of the production script.
    Various assertion methods are possible depending on the structure of the production script. Some of them are introduced in the “Test Patterns” section.

  6. Run the script and check whether the results are as expected. Correct the test script or the production script as necessary.
Perform steps 1 through 6 above for each production script to be tested.

Calling script of other project

Because a script of another project can be called, you can test a production script while it is still under development, before registering it as a service.
This lets you detect problems in earlier phases.
For more details on the function, refer to “Call Script”.

Batch run scripts

Batch run the scripts in the test project.
  1. Select [Batch run scripts] in the right-click menu of the test project.

  2. The “Batch run scripts” dialog opens.
    Description of items
    Item name Description Remarks
    Script list Lists the test scripts in the project.
    • The display order follows the order in the project explorer.
    Script list/Target Select whether to include each script in the run.
    • [Checked]: (default)
      Include in the run.
    • [Not checked]:
      Exclude from the run.
    Script list/Script name The test script name is displayed.
    Script list/Execution result The execution result of the test script is displayed.
    The execution results are the following.
    • Success (<exit status>):
      Shows normal end of the script.
    • Failure:
      Shows failure of script execution due to an assertion exception.
    • Error (<exit status>):
      Shows abnormal end of the script.
    • Build error:
      Shows a build error in the script.
    • For more details on exit status, refer to “Exit Status”.
    • When an assertion exception occurs in the callee script of a Call Script operation, the execution result will be “Error”.
    Specify type Select whether to specify a “Type” for the test script batch run.
    • [Checked]:
      Specify a type.
    • [Not checked]: (default)
      Do not specify a type.
    Specify type for execution Select or enter a type for the test script batch run.
    • Enabled when [Specify type] is [Checked].
    • The default value is “For test”.
    • User-specified types do not appear in the drop-down list; enter them directly in the input field.
      Refer to User Specified Types for more details.
    Use default type when specified type is not found Select whether to use the default type when the global resource of the specified type is not found.
    • [Checked]:
      Use the default type.
    • [Not checked]: (default)
      A “PoolNotDefinedException” error occurs.
    • Enabled when [Specify type] is [Checked].
    Run Batch executes the scripts whose [Target] is [Checked].
    • Test scripts in the list are executed sequentially from top to bottom.
    Stop Stops the batch run.
    • It might take some time for the script to stop.
    • When the batch run is stopped partway through, the results up to the stopped script are output in the test result report.

  3. Click [Run] to batch execute the scripts.

  4. When the execution is completed, information about the test result report is displayed in the dialog.


  5. Check the details of the report and correct the test script or the production script as necessary.
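
Conceptually, the batch run works like the following minimal Python sketch (hypothetical types and names, not a DataSpider API): each script whose [Target] is checked is executed in list order, and the outcome is classified into one of the four execution results.

from dataclasses import dataclass
from typing import Callable, List, Tuple

class AssertionException(Exception):
    """Raised when an assertion fails (hypothetical stand-in)."""

class BuildError(Exception):
    """Raised when a script fails to build (hypothetical stand-in)."""

@dataclass
class TestScript:
    name: str
    target: bool                # the [Target] checkbox
    run: Callable[[], int]      # returns an exit status, may raise

def batch_run(scripts: List[TestScript]) -> List[Tuple[str, str]]:
    results = []
    for s in scripts:           # executed sequentially, top to bottom
        if not s.target:
            continue            # [Target] unchecked: excluded from the run
        try:
            status = s.run()
            results.append((s.name, f"Success ({status})"))
        except AssertionException:
            results.append((s.name, "Failure"))      # assertion exception
        except BuildError:
            results.append((s.name, "Build error"))
        except Exception:
            results.append((s.name, "Error"))        # abnormal end
    return results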

Batch running scripts

The advantage of batch running scripts is that the test scripts in a test project can be executed together and the test result output without registering the project as a service.
Batch run is available only from Designer.
The test result report is output only when scripts are batch executed.

Check test result report

Check the test result report generated by the batch run.
A test result report is output under the “/testreport” folder of DataSpider File System. The file name is “<yyyyMMdd_HHmmssSSS>_<project name>.xlsx”.
Test result reports are not deleted automatically. Because they occupy disk space, it is recommended to delete them periodically.
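
As an illustration of such periodic cleanup, here is a minimal Python sketch that deletes reports older than 30 days. It assumes, hypothetically, that the “/testreport” folder is reachable as a local directory path; adjust the path and retention period to your environment.

import os
import time

REPORT_DIR = "/testreport"            # hypothetical local path to the folder
MAX_AGE_SECONDS = 30 * 24 * 60 * 60   # retention period: 30 days

now = time.time()
for name in os.listdir(REPORT_DIR):
    if not name.endswith(".xlsx"):
        continue  # reports are named <yyyyMMdd_HHmmssSSS>_<project name>.xlsx
    path = os.path.join(REPORT_DIR, name)
    if now - os.path.getmtime(path) > MAX_AGE_SECONDS:
        os.remove(path)
        print(f"deleted old report: {name}")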

The information output in a test result report is as follows.
Description of items
Item name Description Remarks
Project The name of the batch-executed project is output.
Script The name of the batch-executed script is output.
Result The execution result of the script is output.
  • Success:
    Shows normal end of the script.
  • Failure:
    Shows failure of script execution due to an assertion exception.
  • Error:
    Shows abnormal end of the script.
  • Build error:
    Shows a build error in the script.
  • When an assertion exception occurs in the callee script of a Call Script operation, the execution result will be “Error”.
Message code The message code is output when the result is “Failure” or “Error”.
Error message The error message is output when the result is “Failure” or “Error”.
Execution ID The execution ID of the script is output.
Exit status The exit status of the script is output.
Execution time (seconds) The execution time of the script is output in seconds.
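
A report can also be inspected programmatically. The following hedged Python sketch uses the third-party openpyxl library to list non-Success rows from the latest report; the column positions are assumptions and must be adjusted to the actual report layout.

import glob
import openpyxl  # third-party: pip install openpyxl

# Pick the newest report; the timestamp prefix makes names sort chronologically.
latest = sorted(glob.glob("/testreport/*.xlsx"))[-1]
sheet = openpyxl.load_workbook(latest, read_only=True).active

# Assumed layout: a header row, then Project, Script, Result in the first columns.
for row in sheet.iter_rows(min_row=2, values_only=True):
    project, script, result = row[0], row[1], row[2]
    if result != "Success":
        print(f"{project}/{script}: {result}")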

Outputting test result report

A test result report is intended to be used as-is as the verification result (evidence) of the test phase.

Test Patterns

The role of a test script is to check whether a production script works as expected. The testing framework provides the “Assertion adapter” to support this.
This section describes several patterns for creating test scripts with the Assertion adapter.

File test pattern

This is a pattern that compares two files and checks whether their contents are the same.
This pattern assumes the following. The settings of the Compare File operation are so simple that you only need to specify the downloaded file (subject file) and a prepared file with the expected contents (object file).
The properties window of Compare File operation


The test script is also simple: it calls the production script to download a file and checks whether the downloaded file matches the expected file.
The picture of test script
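
Outside DataSpider, the assertion this pattern performs amounts to the following minimal Python sketch (the file paths are hypothetical examples): compare the subject file with the object file byte by byte and raise if they differ.

import filecmp

subject = "/data/output/downloaded.csv"    # file produced by the production script
expected = "/data/expected/expected.csv"   # prepared file with expected contents

# shallow=False forces a content comparison instead of a metadata check.
if not filecmp.cmp(subject, expected, shallow=False):
    raise AssertionError(f"{subject} does not match {expected}")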

Table model type data test pattern

This is a pattern that compares table model type data and checks whether the contents are the same.
This pattern assumes the following. Because the result data of a table model type component can be specified as the input data of the Compare Table Model operation, writing to a temporary file is not needed for the assertion.

The Excel file is assumed to have three columns.

ID      Date        Number
ID001   2017-01-01  500
ID053A  2017-02-02  1200

The picture of test script

When simple matching is available

If the Excel file does not include variable values, compare it with a CSV file that contains the expected contents.
The contents of the expected CSV file
ID001,2017-01-01,500
ID053A,2017-02-02,1200
The properties window of Compare Table Model operation

When simple matching is unavailable

If the ID column contains strings that start with “ID” followed by an auto-generated, unpredictable value, configure forward matching with the string “ID”. In this way, you can configure advanced assertion settings, such as defining detailed conditions for each column.
The contents of the expected CSV file
ID,2017-01-01,500
ID,2017-02-02,1200
The properties window of Compare Table Model operation


For more details on conditions, refer to “Compare Table Model”.
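
To illustrate the difference between exact matching and forward matching in plain Python, here is a minimal sketch (a hypothetical helper, not DataSpider's implementation): the ID column is compared by prefix while the remaining columns must match exactly.

import csv

def compare_rows(actual_rows, expected_rows, prefix_columns=frozenset()):
    """Assert row-by-row equality; columns in prefix_columns use forward matching."""
    assert len(actual_rows) == len(expected_rows), "row count differs"
    for actual, expected in zip(actual_rows, expected_rows):
        for i, (a, e) in enumerate(zip(actual, expected)):
            if i in prefix_columns:
                assert a.startswith(e), f"column {i}: {a!r} does not start with {e!r}"
            else:
                assert a == e, f"column {i}: {a!r} != {e!r}"

actual = [["ID001", "2017-01-01", "500"], ["ID053A", "2017-02-02", "1200"]]

with open("expected.csv", newline="") as f:   # hypothetical expected CSV file
    expected = list(csv.reader(f))

# Exact matching: compare_rows(actual, expected)
# Forward matching on the ID column (column 0):
compare_rows(actual, expected, prefix_columns={0})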

Variable test pattern

This is a pattern that checks whether the number of created files equals the expected value.
This pattern assumes the following. Because the number of files can be retrieved from a component variable of the Get List of File Names operation, use the Compare Variables operation, which can compare the values of component variables or script variables.
The picture of test script


Delete the files under the output directory beforehand to prevent interference from other tests.
The properties window of Compare Variables operation


In the Compare Variables operation, expect 1 for the file count component variable and 0 for the directory count component variable of the Get List of File Names operation.
If the production script fails to create a file for some reason, the file count will be 0 and an assertion exception will occur.
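
As a plain-Python analogue of this assertion (the output directory path is a hypothetical example), the following sketch counts files and directories and compares them with the expected values, much as the Compare Variables operation does with the component variables.

import os

output_dir = "/data/output"   # hypothetical directory the production script writes to

entries = [os.path.join(output_dir, n) for n in os.listdir(output_dir)]
file_count = sum(1 for p in entries if os.path.isfile(p))
dir_count = sum(1 for p in entries if os.path.isdir(p))

# Expect exactly one created file and no directories.
assert file_count == 1, f"expected 1 file, found {file_count}"
assert dir_count == 0, f"expected 0 directories, found {dir_count}"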

Abnormal case test pattern

This is a pattern that uses multiple assertions to check which of the branched flows is actually executed.
This pattern assumes the following. With the Assertion Exception operation, you can assert that an undesirable flow is not executed.
The picture of test script


If the production script ends normally, an assertion exception occurs at the Assertion Exception operation.
By setting an appropriate message in [Message] of the Assertion Exception operation, you can easily understand the situation when the error occurred.
The properties window of Assertion Exception operation
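
The same idea can be expressed in plain Python (hypothetical function names): reaching the branch that should not execute is itself the assertion failure, mirroring an Assertion Exception operation placed on the undesirable flow.

def production_logic(data):
    """Stand-in for the production script; expected to fail on invalid input."""
    if not data:
        raise ValueError("empty input")

def test_abnormal_case():
    try:
        production_logic([])    # deliberately invalid input
    except ValueError:
        return                  # expected error path: the test passes
    # Undesirable flow reached: fail with a descriptive message,
    # like setting [Message] on the Assertion Exception operation.
    raise AssertionError("production script ended normally on invalid input")

test_abnormal_case()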

Specification Limits