The future of computer-aided design systems in an Internet of Things world is the Digital Twin. Digital Twins leverage data from CAD systems, product lifecycles, manufacturing systems, and sensors to create a realistic virtual model of your product, enabling you to predict performance, maintenance needs, and failures.
To understand what is needed to implement a Digital Twin, we spoke to Jonathan Scott, chief architect at Razorleaf. Razorleaf is a consulting firm founded in 2000 that provides technical expertise in CAD systems and PLM systems, and in how manufacturing companies can build successful digital services for the future.
Tell us about yourself and your role at Razorleaf.
My background is in mechanical engineering. I worked in the transportation and nuclear industries, and that led me down the road to working with product lifecycle management (PLM). For almost 20 years I’ve worked on the mechanical and electrical side, managing data and bills of material in PLM systems. I’ve worked with several different software platforms from Dassault Systèmes, PTC, and Autodesk.
Currently, I've moved away from the hardcore technical programming and system configuration. Nowadays I spend more time working with clients, talking about their business needs and how they can improve business processes to get real benefit and value out of their data. I still spend a fair amount of time answering fundamental questions on how to take your data to the next step and dealing with the cultural aspects of transitions.
In your opinion, what is the current state of the Digital Twin in the automation and manufacturing market? How does it relate to PLM?
We’re in the early stages of the Digital Twin concept, as organizations develop targeted Digital Twin models of their products to meet specific objectives. Some businesses want a Digital Twin to help them control their physical assets, so the Digital Twin is used to validate behavioral response to software controls. Other businesses want a Digital Twin to make more accurate predictions of as-built product performance, so they focus on different areas to increase the fidelity of their DTs. However, most organizations are still trying to define the value of a Digital Twin and to build a representative model of their virtual product—they are years away from creating twins of each physical instance of their product.
Product lifecycle management software and Digital Twin modeling have sort of an independent maturity at the moment. PLM has become fairly mature in terms of its core capabilities and what the vendors are offering. The adoption rate is where I think there's still a huge spectrum and room for growth. And of course, then that relates to the level of maturity of digital tools in the market. Digital Twin is much less mature because it stands on the shoulders of PLM.
In all cases, you need some sort of virtual model on which you can overlay the sensor and performance data that comes back from the field. There is a lot of work being done there, not just to figure out how you connect those two data sources [but how] you marry that high-volume sensor data with the low-volume data set that is the product record coming from PLM.
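The idea of marrying a high-volume sensor stream with the low-volume PLM product record can be sketched as a join keyed by serial number. The following is a minimal, hypothetical illustration; the column names, tables, and thresholds are invented for demonstration and do not come from any specific PLM or IoT platform.

```python
# Hypothetical sketch: overlaying the low-volume PLM/MES product record
# onto a high-volume field sensor stream, joined by serial number.
import pandas as pd

# Low-volume product record (one row per physical unit, from PLM/MES)
product_record = pd.DataFrame({
    "serial": ["SN-001", "SN-002"],
    "cure_temp_c": [180.0, 176.5],   # as-built manufacturing parameter
    "design_rev": ["C", "C"],
})

# High-volume sensor stream from the field (many rows per unit)
sensor_data = pd.DataFrame({
    "serial": ["SN-001", "SN-001", "SN-002"],
    "timestamp": pd.to_datetime(
        ["2024-01-01 00:00", "2024-01-01 00:01", "2024-01-01 00:00"]),
    "vibration_g": [0.12, 0.35, 0.10],
})

# Attach the product record to every sensor reading for that unit
twin_view = sensor_data.merge(product_record, on="serial", how="left")

# Flag readings outside a nominal envelope for follow-up analysis
twin_view["abnormal"] = twin_view["vibration_g"] > 0.3
print(twin_view[["serial", "vibration_g", "abnormal"]])
```

In a real deployment the sensor side would be a time-series store rather than an in-memory frame, but the shape of the problem is the same: a sparse, authoritative product record enriched onto a dense telemetry stream.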
Jonathan Scott is chief architect at Razorleaf and has almost 20 years of experience in the CAD and PLM industry.
What digital services are enabling the Digital Twin?
Management of model data from PDM through the full PLM set of capabilities (this is a huge collection of digital services) [is] the foundation of the Digital Twin. Layered on top of that are IIoT and IoT data services. The as-built data collection services of MES systems are the primary provider of IIoT data, and the IoT data stream seems more ad hoc.
Lots of people are experimenting with IoT data collection—tools like ThingWorx and MindSphere are providing these services—but others are building their own using edge data collection services, cloud databases, and other microservice elements. ERP is also adding value, particularly those ERP systems with serialization modules and service management capabilities.
One system that people tend to overlook is the manufacturing execution system (MES). Manufacturers of large capital equipment—like airplanes, for example—really are tracking every physical part through their plant or manufacturing facility. They are collecting quality and statistical process control (SPC) data, which is a rich data source. Ten to 15 years ago these systems were hitting their prime and were used to ensure products going out the door were validated.
A good example is a product made of composite material that requires baking. The MES would record the start time and finish time and ensure the product was within nominal parameters. Now that same information can feed into the Digital Twin model. For instance, correlating composite curing parameters with in-service strain data of the part may show the importance of variables not previously managed or understood. Through MES data and the Digital Twin, you create a better product.
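The correlation described above can be illustrated with a short sketch. All numbers and variable names here are made up for demonstration; the point is simply that an MES-logged manufacturing parameter can be tested against later field measurements.

```python
# Illustrative sketch: does an MES-recorded curing parameter track
# in-service strain? Values are invented for demonstration only.
import numpy as np

# One value per physical part: cure duration logged by the MES (minutes)
cure_minutes = np.array([118, 120, 122, 125, 130, 135])
# Peak in-service strain measured on the same parts (microstrain)
peak_strain = np.array([410, 405, 398, 388, 372, 355])

# Pearson correlation coefficient between the two variables
r = np.corrcoef(cure_minutes, peak_strain)[0, 1]
print(f"correlation: {r:.2f}")
```

A strong correlation (here, a strongly negative one) would suggest cure duration is a variable worth managing against field performance, even if it was not previously tracked for that purpose.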
What are the major benefits today of the Digital Twin? Do they line up with what was expected?
Asking about the benefits of the Digital Twin is a bit like asking about the benefits of a PLM system—DTs will be defined differently by different people and have very different uses. I think people are already seeing benefits from the predictive nature of DTs with capital equipment and other high-value assets. As time goes on, we will see those values trickle down to other types of products, and additional value propositions crop up as well.
Users trying to create a Digital Twin of large capital equipment are seeing the payoff. When you see that the equipment is out of temperature or vibration parameters, it allows users to pause and analyze the equipment. It isn’t necessarily a failure, but you know something's abnormal, and it allows you to go back to your virtual model to check for early failure indicators. It helps users determine if a certain repair or service mode is needed to diagnose something specific. That type of focused Digital Twinning is paying off.
Potentially, what future benefits can the Digital Twin bring?
In the long term, Digital Twins should be like a crystal ball to an organization. By feeding the right data (virtual definition, physical characteristics, and performance history) into the right analytical or behavioral model, organizations should be able to see how serial number 123 of their product will react in response to specific real-world stimuli. In the right contexts and with greatly expanded computing capacity, future-predicting models like this should be able to do amazing things, such as saving lives from catastrophic product failures.
Shown is a road map of how product data is used to create a Digital Twin.
For those starting to explore the Digital Twin, what technology or service should they invest in?
It is difficult to leverage real-world (IIoT and IoT) performance data about a product for predictive value without some sort of model in which to apply the data. So, without a relevant virtual model of a product, it is hard to get the benefit out of capturing information from the field (there are exceptions, of course).
So, I think organizations need to invest in modeling their products (to whatever level of detail is appropriate for their needs) and managing those models in a flexible framework that will allow for incorporation of performance data and for adjustment of the model fidelity as time goes on. PLM systems are currently the best toolsets available for that, so I think one concrete action that organizations can take is deploying a PLM to enable their version of a Model-Based Enterprise.
Frankly, for a lot of organizations, this is a long road. A lot of the early adopters of Digital Twin are going to be folks who were already early adopters of PLM, modeling technology, analysis, and model-based enterprise type techniques and technologies. Those users have a great foundation to go after IoT data and start to build their twins. However, a lot of our customers are still sorting out the transition to 3D CAD.
Our advice to them is to be practical about the process. You need good PLM systems in place to prepare for Digital Twin systems. They also need to target a use case that has real benefits to their organization. Are you trying to be predictive because you’re trying to sell your product more as a service? Or are you trying to use a Digital Twin for maintenance purposes to avoid downtime? I think it’s better when people have a goal in mind or have a capability in mind for their Digital Twin and build out the functionality and underlying systems.
For those looking to enter Digital Twins, we recommend the following steps:
Assess where you are. Know where you stand in the process. What tools and data do you already have available? We have tools that analyze your systems to help you define your status.
Define where you want to be. Determine how you want to use these tools and what goals make the most sense for you.
Prepare your workforce culture for change. You don’t always have to get your culture right, but you have to recognize that change will be needed. The process of putting new tools in place will impact large parts of an organization—people are going to need to be incented differently, and their roles may change. If management recognizes these impacts ahead of time, the process of change can be more comfortable for everyone.