Testing Connectors
Test Command
The test command is used to test connector artifacts. It takes the artifact type as a subcommand; the artifacts share some common options and also have artifact-specific options. The test command supports the following subcommands:
Commands | Description |
---|---|
flow test action <name> | Tests the action named <name>. |
flow test connection <name> | Tests the connection named <name>. |
flow test lookup <name> <method> | Tests the lookup function named <method> in the lookup <name>. |
flow test oauth <name> | Tests the OAuth configuration named <name>. |
flow test trigger <name> <method> | Tests the method named <method> in the trigger <name>. |
The test command has the following options:
Options | Description | Data Type |
---|---|---|
--version | Displays the version number. | [boolean] |
--help | Displays the help. | [boolean] |
--timeout | The timeout for the operation, in milliseconds. This option ensures that the code finishes within the specified time and captures errors, such as a missed invocation of a callback. | [default: 30000] |
--logLevel,-l | Sets the log level. | [default: "info"] |
ThingWorx Flow CLI allows you to test individual artifacts before deploying the connector on the server. This helps you check whether your artifacts function correctly before you deploy the connector.
The test cases are written using the Mocha and Chai frameworks: Mocha is used to define the test suite, and Chai is used as the assertion library. All test cases are defined under the testData folder within the test folder for a particular artifact.
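As a point of reference, a Mocha test suite with Chai assertions generally follows the pattern in the sketch below. This is a generic illustration only, not the exact suite the CLI generates; the describe/it names and the add function are hypothetical.
// Generic Mocha/Chai pattern: describe() groups related tests,
// it() defines a single test case, and Chai's expect() asserts the result.
const { expect } = require('chai');

// Hypothetical function under test.
function add(a, b) { return a + b; }

describe('add', function () {
    it('returns the sum of two numbers', function () {
        expect(add(2, 3)).to.equal(5);
    });
});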
The test command allows testing of artifacts in ThingWorx Flow CLI.
To test an artifact, execute the following commands from your command prompt:
1. cd <user project root directory>
2. flow test <artifactType> <name>
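For example, assuming a connection artifact named MyConnection (a hypothetical name), the invocation would be:
flow test connection MyConnection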
Configuration for Testing OAuths and Triggers
The flow test oauth and flow test trigger commands internally start a web server. The web server accepts only HTTPS requests; to allow it to serve requests, certificates must be configured. If the certificates are missing, the following error occurs:
[2018-09-19T10:13:11.876] [ERROR] trigger - Validation of trigger failed : ENOENT: no such file or directory, open 'C:\<user-home-dir>\.flow\key.pem'
To create a self-signed certificate, do the following:
1. Download and install OpenSSL.
2. Assuming openssl is in the path, run openssl req -nodes -new -x509 -keyout key.pem -out cert.pem
3. Copy these files to the .flow directory in the user’s home directory.
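After copying, the CLI expects the files in the .flow directory, for example on Windows (the exact path depends on your user profile):
C:\<user-home-dir>\.flow\key.pem
C:\<user-home-dir>\.flow\cert.pem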
To configure the host name and port, create a file named flow.json in the .flow directory in the user's home directory. A sample file follows:
{
  "passphrase": "xyz",
  "hostname": "flow.local.rnd.ptc.com",
  "port": 443
}
All properties are optional.
• passphrase—Used when the private key requires a passphrase.
• hostname—Used to construct the redirect_uri for OAuth and to create trigger URLs. Many services may not allow localhost as a valid hostname in the redirect_uri, and some applications specifically require non-localhost URLs as webhook registration URLs. You can also match the URL of your development system with that of your production system for test purposes. For example, if the redirect_uri registered with Google is https://flow.local.rnd.ptc.com/Thingworx/Oauths/oauth/return, then the hostname in flow.json should be flow.local.rnd.ptc.com. Additionally, the host must be added to %WINDIR%\system32\drivers\etc\hosts, as in the example entry that follows.
• port—The port on which the internal web server listens.
| To edit the hosts file, run the editor as an administrator. |
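For example, assuming the flow.local.rnd.ptc.com hostname from the sample flow.json above, the hosts entry maps the name to the loopback address:
127.0.0.1    flow.local.rnd.ptc.com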
Testing an OAuth Configuration
OAuth configuration testing is done through a browser because each OAuth provider has a different form on which to accept user credentials. The CLI internally runs an Express web server to process the responses that the identity provider (IdP) sends on successful or unsuccessful authentication.
To test an OAuth configuration, do the following:
1. From your project directory, execute the following command:
flow test oauth <oauth name>
After successful login, the OAuth validator browser window opens. For more information, refer to the example in the tutorial.
2. Select the configuration to be tested and then click Validate OAuth configuration or click Exit.
Testing a Connection
The test connection command calls the validate method and then calls the connect method, tests input against the input schema, and tests output against the output schema.
To test a connection, do the following:
1. From your project directory, execute the following command:
npm install
2. Create a test data file in the package and populate it with your service provider information.
The connectionTestData file in <projectDir>\test\testData contains sample data for the test, as shown in the code that follows.
module.exports = {
    sampleInput : {
        email : 'your-email-id-here',
        subscription_id : 'your-subscription-id',
        account_url : 'your-account-url',
        token : 'your-token'
    },
    sampleOutput : {
        handle : {
            email : 'your-email-id-here',
            subscription_id : 'your-subscription-id',
            account_url : 'your-account-url',
            token : 'your-token'
        }
    }
}
3. Execute the test command using the following options:
Options | Description | Data Type |
---|---|---|
--version | Displays the version number. | [boolean] |
--help | Displays the help. | [boolean] |
--timeout | The timeout for the operation, in milliseconds. This option ensures that the code finishes within the specified time and captures errors, such as a missed invocation of a callback. | [default: 30000] |
--logLevel,-l | Sets the log level. | [default: "info"] |
--artifactVersion,-v | Version of the artifact to test. | NA |
--projectDir,-d | The parent directory of the project. | NA |
--input,-i | Name of the input variable. | NA |
--output,-o | Name of the expected output variable. | NA |
--testDataFile,-f | Path of the testData file. | NA |
--save,-s | Saves to the credential store for a user. | NA |
--noSchemaValidation,-n | Disables the schema validation. | [default: false] |
Running the test produces output if both the connect and validate methods succeed. The command also checks that the input matches the input schema and that the output matches the expected output.
The --save option stores the output of the connect command in the creds.json file in the .flow folder in the user's home directory. The output from running this command is the key against which the connection is stored in the file. Use this key later to pass the stored authentication information to another test command.
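For example, the following sketch saves the connection output and then injects it into an action test by its key. The artifact names, variable names, file paths, and the connection key are hypothetical; the flags are those listed in the options tables.
flow test connection MyConnection -i sampleInput -o sampleOutput -f test\testData\connectionTestData.js -s
flow test action MyAction -i actionInput -c <connection-key-from-creds.json>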
Testing Lookups
Lookups have a different structure than other artifacts and are not versioned in the same way. The name itself serves as the version; if newer functionality is required, a new lookup with a new name is created in the lookup JavaScript file. Execute the test command for lookups using the following command:
flow test lookup <name> <method>
The following options are available:
Options | Description | Data Type |
---|---|---|
--version | Displays the version number. | [boolean] |
--help | Displays the help. | [boolean] |
--timeout | The timeout for the operation, in milliseconds. This option ensures that the code finishes within the specified time and captures errors, such as a missed invocation of a callback. | [default: 30000] |
--logLevel,-l | Sets the log level. | [default: "info"] |
--projectDir,-d | The parent directory of the project. | NA |
--connection,-c | The uid of the connection to inject. | NA |
--access_token,-a | The access token to inject. | NA |
--input,-i | Name of the input variable. | NA |
--output,-o | Name of the expected output variable. | NA |
--testDataFile,-f | Path of the testData file. | NA |
--searchById | The id to use for the search. | NA |
--searchByValue | The value to use for the search. | NA |
--filter | The filter to use to search for items. | NA |
Lookups have two search options, searchById and searchByValue, and the following three options with respect to pagination:
• Lookup returns all data at once—No action required for this option.
• Pagination API supported by the application—The lookup code calls the getNextPage API and passes it arguments that help the service return data from the next page. This information is passed to the lookup when the user clicks Load More in the list of lookup field values.
• Pagination not supported—The application queries the data and the lookup service truncates the data to show only records for the current page and all previous pages.
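For example, a lookup test invocation might look like the following. The lookup name, method name, connection key, and file path are hypothetical; the flags are those listed in the table above.
flow test lookup MyLookup getProjects -c <connection-key> -f test\testData\lookupTestData.js --filter "proj"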
Testing Actions
Testing actions is similar to testing lookups. Lookups are not tested as part of the action test; the input should already contain the results that are otherwise provided by the lookups.
To test an Action, do the following:
1. Write a sample input and sample output JSON in a test data file (a sketch of such a file follows the options table below). For input and output samples, refer to Tutorial B.
2. From your project directory, execute the test command using the following options:
Options | Description | Data Type |
---|---|---|
--version | Displays the version number. | [boolean] |
--help | Displays the help. | [boolean] |
--timeout | The timeout for the operation, in milliseconds. This option ensures that the code finishes within the specified time and captures errors, such as a missed invocation of a callback. | [default: 30000] |
--logLevel,-l | Sets the log level. | [default: "info"] |
--artifactVersion,-v | Version of the artifact to test. | [default: "v1"] |
--connection,-c | The uid of the connection to inject. | NA |
--access_token,-a | The access token to inject. | NA |
--projectDir,-d | The parent directory of the project. | [default: "."] |
--input,-i | Name of the input variable. | NA |
--output,-o | Name of the expected output variable. | NA |
--testDataFile,-f | Path of the testData file. | NA |
--noSchemaValidation,-n | Disables the schema validation. | [default: false] |
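As an illustration, an action test data file might look like the sketch below, assuming it follows the same module.exports structure as the connection test data shown earlier. The variable names (actionInput, actionOutput) and fields are hypothetical; pass their names through the --input and --output options.
module.exports = {
    // Hypothetical input for the action; in practice this already contains
    // the values that lookups would otherwise provide.
    actionInput : {
        list_id : 'your-list-id',
        title : 'Sample item title'
    },
    // Hypothetical expected output to compare the action result against.
    actionOutput : {
        id : 'expected-item-id',
        title : 'Sample item title'
    }
}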
Testing Triggers
Testing a trigger is slightly different from testing actions, connections, and lookups. The following table describes the options available for testing triggers:
Options | Description | Data Type |
---|---|---|
--version | Displays the version number. | [boolean] |
--help | Displays the help. | [boolean] |
--timeout | The timeout for the operation, in milliseconds. This option ensures that the code finishes within the specified time and captures errors, such as a missed invocation of a callback. | [default: 30000] |
--logLevel,-l | Sets the log level. | [default: "info"] |
--artifactVersion,-v | Version of the artifact to test. | [default: "v1"] |
--connection,-c | The uid of the connection to inject. | NA |
--access_token,-a | The access token to inject. | NA |
--projectDir,-d | The parent directory of the project. | [default: "."] |
--input,-i | Name of the input variable. | NA |
--output,-o | Name of the expected output variable. | NA |
--testDataFile,-f | Path of the testData file. | NA |
--noSchemaValidation,-n | Disables the schema validation. | [default: false] |
--polling,-p | Indicates that the trigger is a polling trigger. | NA |
--event,-e | The event to test. | NA |
--mockData,-m | The name of the variable holding the mock data for an event. | NA |
--interval,-t | The interval in seconds after which to trigger. | [default: 15] |
--stopAfter,-s | Maximum number of calls to the trigger. | [default: 1] |
You must specify which method you want to test. For polling triggers, the methods are execute and activate.
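For example, a polling trigger might be tested with the sketch below. The trigger name, input variable name, and file path are hypothetical; the flags are those listed in the table above.
flow test trigger MyTrigger execute -p -i triggerInput -f test\testData\triggerTestData.js -t 30 -s 2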
For examples on testing various artifacts, refer to Tutorial B.