Since the COVID-19 pandemic struck, organizations have been compelled to take reactive steps to mitigate crippling losses. As they prepare for the “new normal,” they may be rethinking existing infrastructure as a way to produce different products or offer new types of services.
But where to start?
The pandemic acts as a catalyst for extending technologies, and digital twin simulation can be a critical tool in a proactive, strategic response, according to Karen Panetta, dean of graduate engineering at Tufts University in Medford, Mass.
“The name of the game in the post-COVID era is going to be contactless delivery of products and services,” said Panetta, who believes that employing digital twins can expedite an enterprise’s efforts by giving it the ability to anticipate stress points, adapt models more efficiently and rework its processes more quickly.
Not only will manufacturers be looking inside the plant for ways to mitigate the risk of contact with anything that might propagate disease, but they will also be analyzing their supply chains for insight into what happened once delivery or transportation shut down, observed Panetta, whose current research includes the development of image and signal processing algorithms for homeland security and biomedical applications.
“Typically, when modeling, everybody looks for risk points,” said Panetta. “Whenever we develop processes, we say, ‘Oh, here’s my critical risk point. And here’s my contingency plan.’ For example, if we can’t send products via railway, then can we send them by air?
“But nobody really looks at large disruptions like we have now,” she continued. “Nobody assumes that all transportation might be closed down, or that the whole world doesn’t go in to work. Nobody’s ever modeled for unprecedented large-scale disasters like this.”
With a CV running more than 41 pages, Panetta has the experience to anticipate where the technology is headed and act on it. In addition to her academic duties, she is an IEEE fellow. In this capacity she lends her technical and policy expertise to foster programs at the technical professional organization dedicated to advancing technology in areas ranging from aerospace systems, computers and telecommunications to biomedical engineering, electric power, and consumer electronics.
In the following edited Q&A, Panetta offers insights into the potential for the expanded use of digital twins once the effects of the pandemic subside.
Machine Design: Can you provide a high-level definition of a digital twin and a bit of context for your digital twin experience?
Karen Panetta: A digital twin is a virtual replica. Think of it as a simulation of physical assets, processes, people, places, systems and devices.
I was probably one of the first people ever to develop a digital twin. We didn’t call it a digital twin back then. I created a digital twin of a million-transistor CPU design. That was so we could run all the software on this virtual machine before we even got the product built, and so that when the product shipped, we could have all the software ready to go with it…I probably had one of the first patents on digital twins, in 1992.
And then I went to work at NASA. And that’s where I saw that they were using these digital twins to actually control mechanical systems. So, their simulator had connections to real engines and things that it could control. I got to set up jet engines with my simulations and it was really cool. That was the beginning of digital twins.
MD: What would you regard as the best use of digital twins, and how has its relevance evolved?
KP: Originally, they were developed so one could create the product before spending money to build it, saving on manufacturing costs. The goal was to ensure that the products we were going to build were going to work. Now, it also allows us to co-develop the supply chain or the peripherals around the product. In this way, we look not only at what could go wrong physically with the product, but also at what could go wrong in the supply chain process, with the people providing the services and with the components used to build the product.
MD: Which industries, would you say, use digital twins effectively?
KP: The industry that essentially invented it uses it effectively—the computer industry, and any sort of semiconductor manufacturing. Now, it could also be any entity that’s manufacturing a product or providing a service, such as the communications industry or smart city development.
MD: Can you elaborate on how digital twins support and enable more efficient modeling?
KP: If we have a model that simulates the entire process, we can do a “plug-and-play” by changing and adapting the model to the new inputs. For instance, how much capacity do we have? Or, if we can no longer have a Chinese manufacturer in our supply chain, but a supplier in South America or Canada can supply the product, we can quickly adapt our model to the new inputs. Or, I can substitute different parameters to see how they are going to change the price point.
We can also substitute one model for another. For example, if I use a lighter battery, how does that affect my entire design, my entire process or the reliability of my product? And based on my customer use cases, will it change their perception? Will they still buy this product?
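The kind of component swap Panetta describes can be sketched as a toy model. Everything here (the `Battery` and `ProductModel` classes and all the numbers) is hypothetical, invented purely to illustrate how a digital twin lets you substitute one component model for another and read off the impact on the overall design:

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Battery:
    name: str
    mass_kg: float
    unit_cost: float
    failure_rate: float  # expected failures per 1,000 units

@dataclass(frozen=True)
class ProductModel:
    base_mass_kg: float
    base_cost: float
    battery: Battery

    def total_mass(self) -> float:
        return self.base_mass_kg + self.battery.mass_kg

    def total_cost(self) -> float:
        return self.base_cost + self.battery.unit_cost

# Two candidate component models: the lighter battery costs more
# and is slightly less reliable.
standard = Battery("standard", mass_kg=0.45, unit_cost=12.0, failure_rate=1.2)
lighter = Battery("lighter", mass_kg=0.30, unit_cost=15.5, failure_rate=1.8)

design = ProductModel(base_mass_kg=1.2, base_cost=80.0, battery=standard)
variant = replace(design, battery=lighter)  # swap one model for another

print(f"mass: {design.total_mass():.2f} -> {variant.total_mass():.2f} kg")
print(f"cost: {design.total_cost():.2f} -> {variant.total_cost():.2f}")
```

In a real digital twin the component models would be far richer (thermal behavior, supplier lead times, field-failure data), but the workflow is the same: swap a sub-model, re-run, and compare the resulting trade-offs before committing to the physical change.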
MD: Are digital twins suitable for every business?
KP: If companies are not getting into digital twins, they’re going to be left behind. The companies that do embrace it are going to be more dynamic—really able to adapt to different environmental conditions, different market conditions, people’s different interests in the market and what people will buy. It will help with understanding supply and demand, and how to get there.
MD: What do you perceive to be the biggest challenges associated with developing digital twins?
KP: All of these models are grounded in data. When I’m building a product, I have very detailed low-level data about every single component. I know my process. I know what goes on in my manufacturing plant. I know every job, and every task in the plant. As I move outside my plant, I now have to model a supply chain—all of the different vendors that I buy my components from. That becomes the challenge because now you’re dependent on getting real data from those suppliers in order to model it. Without it, you’re guessing based on market trends. And that’s where the risk comes in.
You want to get what we call the “ground truth,” which is a large sample based on the right assumptions of how things really work… As we move outside the realm of the physical thing we build, processes are much more variable, people’s mindsets are much more variable and economies are much more variable. So being able to dynamically model with accurate data is the foundation.
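Panetta’s point about ground truth can be illustrated with a small Monte Carlo sketch. The two supply-chain models below share the same average supplier lead time, but the one built from a market-trend guess rather than measured supplier data carries far more spread, and that tail is where the risk hides. All names and numbers are hypothetical:

```python
import random
import statistics

random.seed(7)

def sample_lead_times(mean: float, sd: float, trials: int = 10_000) -> list[float]:
    """Simulated end-to-end lead times: a well-known 5-day in-house
    build plus an uncertain supplier delivery (negative draws clamped to 0)."""
    return [5.0 + max(0.0, random.gauss(mean, sd)) for _ in range(trials)]

# Same average supplier lead time, very different confidence in the data:
measured = sample_lead_times(mean=10.0, sd=1.0)  # based on real supplier data
guessed = sample_lead_times(mean=10.0, sd=6.0)   # market-trend guess, wide spread

def p95(xs: list[float]) -> float:
    """Approximate 95th percentile of the simulated lead times."""
    return statistics.quantiles(xs, n=20)[-1]

print(f"measured data: p95 lead time ~ {p95(measured):.1f} days")
print(f"guessed data:  p95 lead time ~ {p95(guessed):.1f} days")
```

Both models would report roughly the same average, which is exactly why averages alone are misleading: the planning number that matters (the worst-case tail you promise customers against) diverges sharply when the underlying supplier data is guessed rather than measured.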
MD: What’s next for digital twin technology?
KP: The next thing is incorporating artificial intelligence, to learn from experience and to help gather data and get people to share it. One of the things that’s prevented companies from moving into artificial intelligence is the ability to share data. A company that thinks its data is proprietary says, “Well, I understand my customer base better than you, so I’m able to market better to them. Why am I going to share that data with you?” And that’s part of the problem—trying to get more data.
In the future, we’ll see these models expand to include different entities outside of our own control; meaning, you’re not part of my company, but you might be part of my supply chain, or you might be one of my customers. You’re going to see more research done into getting access to that data. And that includes government. People might even say this is important for our economy. So maybe you’ll see some initiatives for open-source sharing of data. Right now, data sharing is really only happening via the academic institutions.
MD: When would be the right time to start building digital twins?
KP: The right time is always yesterday. But the real answer to your question is, you’ve got to start with what you know...
The place to start is inside—modeling what you know and confirming your processes. Sometimes, when I’ve worked on projects, I’ve said, “I don’t know anything about how you manufacture these things. Just walk me through it.” Then I’ll go through it and there’s somebody that will say, “Well, no, that’s not really how it works.” And you find out that they really don’t know what’s going on inside. To find out that we really don’t know how we operate or how processes work internally is a valuable exercise. And you see all these inefficiencies and, right there, you can start building your strategy.
So, start with what you know at home because that’s where you have the most detail and the most accuracy. It’s where you’ll get the biggest bang for your buck because you can optimize and swap things in and out easily. From there, build your strategy and work your way up.