Persistence Providers
In ThingWorx, value streams, streams, data tables, blogs, and wikis store their run-time data through data providers. Data providers are databases that hold run-time data, that is, the data persisted once Things are created and used by connected devices to record values such as temperature, humidity, or position. Model providers store the metadata for Things. A persistence provider initially uses a single database for both its model provider and data provider, but administrators can separate them based on requirements.
* Refer to Model and Data Best Practices for additional information on selecting a persistence provider.
The amount of data that your business model requires determines how you need to handle the data. Big data requirements may call for a scalable data store. ThingWorx provides the option to choose one of the following persistence providers for your value stream, stream, and data table data:
* H2 (for more information, see Using H2 as the Persistence Provider)
* PostgreSQL
* Microsoft SQL Server
* Azure SQL Server
* InfluxDB (for more information, see Using InfluxDB as the Persistence Provider)
* ThingworxPersistenceProvider
The default persistence provider is ThingworxPersistenceProvider, which is based on the persistence provider installed with ThingWorx.
Configuration settings made to the ThingworxPersistenceProvider through Composer are not persisted and reset to the values specified in platform-settings.json on Platform restart. To make changes permanent, stop the Platform, update platform-settings.json, and restart the Platform. platform-settings.json is the source of truth for the configuration of ThingworxPersistenceProvider: on Platform start, ThingworxPersistenceProvider uses the settings specified in platform-settings.json. This behavior allows administrators to tune their settings while retaining the ability to roll back changes that adversely impact the Platform.
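The exact layout of platform-settings.json depends on your ThingWorx version and the database backing the persistence provider, so the following is only a hedged sketch of where persistence provider settings live. The connection values are placeholders, and key names such as PostgresPersistenceProviderPackage and StreamProcessorSettings reflect a typical PostgreSQL-backed installation; confirm them against the installation guide for your release.

```json
{
    "PersistenceProviderPackageConfigs": {
        "PostgresPersistenceProviderPackage": {
            "ConnectionInformation": {
                "jdbcUrl": "jdbc:postgresql://localhost:5432/thingworx",
                "username": "<database-user>",
                "password": "<database-password>"
            },
            "StreamProcessorSettings": {
                "maximumQueueSize": 250000,
                "maximumBlockSize": 2500,
                "maximumWaitTime": 10000
            }
        }
    }
}
```

After the file is updated, restarting the Platform picks up the new values, and any overrides previously made in Composer are discarded in favor of the file.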
When writing data to a stream or value stream in ThingWorx, it is a best practice not to use a 1:1 ratio of Things to streams. For example, if you have 10,000 Things in your model, use approximately 50 streams or value streams (instead of 10,000 streams) for better read performance of your data.
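As a rough illustration of that fan-in, the sketch below (plain Python, not ThingWorx API code) shows one way to map a large population of Things onto a small, fixed pool of value streams by hashing the Thing name. The bucket count and the value stream naming pattern are assumptions you would adapt to your own model.

```python
# Conceptual sketch: assign 10,000 Things to 50 value streams instead of
# creating one value stream per Thing. The count and names are illustrative.
import zlib

VALUE_STREAM_COUNT = 50  # assumed pool size, not a ThingWorx default

def value_stream_for(thing_name: str) -> str:
    """Map a Thing name to one of a fixed pool of value streams."""
    bucket = zlib.crc32(thing_name.encode("utf-8")) % VALUE_STREAM_COUNT
    return f"ValueStream_{bucket:02d}"

if __name__ == "__main__":
    things = [f"Sensor_{i}" for i in range(10_000)]
    assignments = {name: value_stream_for(name) for name in things}
    # Each value stream receives roughly 200 Things.
    print(assignments["Sensor_0"], assignments["Sensor_9999"])
```

Because the mapping is deterministic, a given Thing always logs to the same value stream, which keeps its rows co-located and keeps queries bounded to a single stream.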
Migrating Entities and Data Between Environments
Stream, value stream, and data table entities and their associated data can be migrated between different systems (for example, from development to production). Follow the steps below to migrate existing entities and data from your existing persistence provider environment to a new persistence provider environment.
If you are migrating entity definitions only (without data), perform the following steps:
1. Export entities from the source system.
2. Import entities into the destination system.
3. In the destination system, manually change the persistence provider for the necessary entities.
If you are migrating entity definitions and their data, perform the following steps:
1. Export entities and data from the source system.
2. Import the entities to the destination system.
3. In the destination system, manually change the persistence provider for the necessary entities.
4. Import data into the destination system.
Your data will be directed to the proper data store according to the persistence provider selected in the previous step.
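If you want to spot-check the result of step 4, one hedged option is to query a migrated stream over the ThingWorx REST API from the destination system. The host, application key, and stream name below are placeholders, and QueryStreamEntriesWithData is the built-in stream query service; confirm the service and parameter names for your release.

```python
# Hedged sketch: read back a few entries from a migrated stream to confirm
# the imported data is served by the new persistence provider.
import requests

BASE_URL = "https://twx-destination.example.com/Thingworx"  # placeholder host
HEADERS = {
    "appKey": "<application-key>",   # placeholder credential
    "Content-Type": "application/json",
    "Accept": "application/json",
}

def query_stream_entries(stream_name: str, max_items: int = 10) -> dict:
    """Invoke QueryStreamEntriesWithData on a stream Thing and return its rows."""
    url = f"{BASE_URL}/Things/{stream_name}/Services/QueryStreamEntriesWithData"
    resp = requests.post(url, headers=HEADERS, json={"maxItems": max_items})
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    result = query_stream_entries("MigratedTemperatureStream")  # placeholder name
    print(f"Returned {len(result.get('rows', []))} entries from the destination system")
```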
Migrating Data in Existing Environments
To migrate data in your current environment from an existing persistence provider to a new persistence provider, perform the following steps after your initial persistence provider configuration is complete.
1. Identify which entities to migrate to the new persistence provider.
2. Export data for the identified entities.
3. In Composer, change the persistence provider for the identified entities.
4. Import data.