Performance Considerations and Data Management
The SPM engine generates metrics records at scheduled intervals. Because these records are custom object records in Salesforce, they are not archived automatically. Depending on the volume of data your business handles, this can increase the storage space consumed in your Salesforce org.
The following best practices are therefore recommended:
Determine which metrics are the most relevant for your business model.
Identify the source records that are significant to your operations.
After SPM is set up and metrics are being generated, monitor the output for a period and adjust the configuration so that only the required data is processed and generated.
Set up a custom archiving/purging mechanism to remove old, unwanted data at regular intervals (see the sketch after this list).
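For reference, a minimal sketch of such a purge mechanism as an Apex batch class is shown below. The object API name SPM_Metric__c, the class name, and the 365-day retention period are placeholder assumptions, not ServiceMax API names; substitute the metric objects and retention policy that apply to your org.

// Hypothetical purge batch. SPM_Metric__c and the 365-day retention
// period are placeholders; adapt them to your org's metric objects.
global class SPMMetricPurgeBatch implements Database.Batchable<sObject>, Schedulable {

    global Database.QueryLocator start(Database.BatchableContext bc) {
        // Select metric records older than the retention period.
        return Database.getQueryLocator(
            'SELECT Id FROM SPM_Metric__c WHERE CreatedDate < LAST_N_DAYS:365'
        );
    }

    global void execute(Database.BatchableContext bc, List<sObject> scope) {
        // Delete the old records and empty them from the Recycle Bin
        // so the storage space is released immediately.
        delete scope;
        Database.emptyRecycleBin(scope);
    }

    global void finish(Database.BatchableContext bc) {
        // No post-processing required.
    }

    global void execute(SchedulableContext sc) {
        // Run with a moderate batch size to stay within governor limits.
        Database.executeBatch(new SPMMetricPurgeBatch(), 200);
    }
}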
Metric data is stored in dedicated objects and is not combined with transaction object data. The volume of metric data should therefore not adversely affect the overall performance of the ServiceMax application.
However, the SPM engine processes records using Apex batch jobs, which are subject to Salesforce governor limits. Schedule metric generation to run more frequently, so that smaller volumes of data are processed in each run.
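The same "smaller, more frequent runs" principle applies to the custom purge job sketched above, which can be scheduled with a standard cron expression from anonymous Apex. The job name and schedule below are example values only.

// Schedule the hypothetical purge batch to run daily at 2 AM.
System.schedule('SPM Metric Purge - Daily', '0 0 2 * * ?', new SPMMetricPurgeBatch());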