Use Case: Scaling Digital Twin Experiences
Overview
When you begin using Vuforia Studio to build AR experiences, you gather specific 3D models, images, and other assets, and then link them together using widgets and perhaps some JavaScript logic. Once the experience has been created and published, you’ll find that it is specific to the task and to the data you used to create it. In many cases this is appropriate: the experience might be tied to particular data or to one version of a product, so there is no problem having a unique experience mapped to it.
However, there may be situations where you want to add a few more products, or variants of a product. On a larger scale, you may need to add hundreds or thousands of new products. In that case, each product might be built to order, and the experience will need to differ for each one.
In this series we’ll investigate methods for building scalable experiences that can adapt to both context (where they are being used) and content (what they are being applied to).
You’ll start by building a simple experience that begins with the data itself, and learn how to prepare the content that will drive the AR experience. You’ll learn how using tools like Creo Illustrate, Windchill PLM, and Creo Parametric can help you include useful metadata that can later drive the experience. Using this approach, you can create a Vuforia Studio experience that will adapt automatically to the data that is loaded.
This use case will also show you how to externalize the data rather than encapsulating it in the experience. This way, when the experience starts, the necessary data is downloaded, including instance-specific values. So, along with handling different definitions (variants) of a product, you’ll see how individual instances each have their own appearance, history, operational data, and so on, which in essence is the digital twin.
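To make the idea concrete, the sketch below shows the general pattern in plain JavaScript: at startup, the experience requests a record for a specific product instance and uses the returned values to configure itself. This is illustrative only; the endpoint, field names (serial, variant, model, operationalData), and the applyToExperience() helper are assumptions, not part of the Vuforia Studio API, and the download is mocked with a resolved Promise.

```javascript
// Hypothetical payload an external service might return for one instance.
const mockServiceResponse = {
  serial: "SN-0042",
  variant: "pump-2x-high-flow",
  model: "models/pump-2x.pvz", // geometry chosen per variant
  operationalData: { hours: 1412, lastService: "2024-11-02" }
};

// Simulate downloading the instance record; a real experience would make
// an HTTP request here when it starts.
function fetchInstanceData(serial) {
  return Promise.resolve(mockServiceResponse);
}

// Apply the downloaded values so one generic experience adapts to any
// product instance instead of hard-coding models and labels.
function applyToExperience(data, experience) {
  experience.modelSource = data.model;
  experience.title = data.variant + " (" + data.serial + ")";
  experience.telemetry = data.operationalData;
  return experience;
}

fetchInstanceData("SN-0042").then(function (data) {
  const experience = applyToExperience(data, {});
  console.log(experience.title); // e.g. "pump-2x-high-flow (SN-0042)"
});
```

The key design point is that the experience itself stays generic; everything that varies per instance lives in the downloaded record.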
You’ll walk through the following sections in this use case: