The digital-twin maturity curve is like many others: the term dates back decades, but digital twins only gained widespread adoption in the 2010s. So, what exactly is the current state of digital twins in manufacturing and construction?
First, a bit of a primer. Digital twins date back to the Apollo program of the 1960s, when NASA created a living model of the mission vehicle. Why? When Apollo 13's oxygen tank exploded and the main engine was feared damaged, NASA employed multiple simulators to evaluate the failure and extended the physical model of the vehicle to include digital components. Thus, the digital twin was born. NASA could now use this living model for forensic analysis and for exploring next steps, gaining greater decision support and predicting issues via a virtual counterpart.
Pretty cool, right? It wasn't until 2002 that a paper on product lifecycle management actually detailed the concept of a digital twin in more depth. The idea was expounded on further in 2011 and became more widespread in the following years.
What Is a Digital Twin?
While there are many different definitions of digital twins, especially among the marketing community, at its core a digital twin is a collection of information from disparate sources that forms a digital replica of a living or non-living physical entity. This replica could represent an actual physical asset that exists in the real world today, or a potential asset that could exist in the future.
One of the simplest definitions of a digital twin is "a digital representation of a physical asset, process, or system." But a digital twin is more than a static model: it is a live, evolving set of data that must be continuously synchronized with its physical counterpart, and it should exploit data-driven workflows to optimize performance.
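To make that definition concrete, here is a minimal, purely illustrative Python sketch of the idea: a twin object continuously synchronized with sensor readings from a physical asset, using that data to flag an issue before it becomes a failure. The asset, the fields, and the threshold are all invented for illustration, not drawn from any real system.

```python
from dataclasses import dataclass, field

@dataclass
class PumpTwin:
    """A toy digital twin of a hypothetical industrial pump."""
    asset_id: str
    max_temp_c: float = 80.0          # assumed safe operating limit
    state: dict = field(default_factory=dict)
    history: list = field(default_factory=list)

    def sync(self, sensor_reading: dict) -> None:
        """Continuously synchronize the twin with telemetry from the asset."""
        self.state.update(sensor_reading)
        self.history.append(dict(self.state))

    def predict_issue(self) -> bool:
        """Data-driven check: flag a rising temperature near the limit."""
        temps = [h["temp_c"] for h in self.history if "temp_c" in h]
        if len(temps) < 2:
            return False
        rising = temps[-1] > temps[-2]
        return rising and temps[-1] > 0.9 * self.max_temp_c

twin = PumpTwin(asset_id="pump-07")
twin.sync({"temp_c": 65.0, "rpm": 1450})
twin.sync({"temp_c": 74.0, "rpm": 1460})
print(twin.predict_issue())  # → True: temperature is rising and near the limit
```

Real digital twins, of course, fuse far richer data (3D geometry, physics simulation, maintenance records), but the core loop is the same: ingest live data, keep the replica current, and act on what it predicts.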
St. Peter’s Basilica
How can digital twins truly help in manufacturing and construction? Let's talk about what is happening in the Vatican. Talk there has focused not only on the selection of a new Pope after the death of Pope Francis on Easter Monday, but also on the work being completed using digital twins. I have written about the case study of the almost 400-year-old St. Peter's Basilica before in my article Cardinal Gets His Geek On. If you haven't read it, it's worth a look, but let me give you a recap here.
To help preserve one of the most important architectural icons in religious history, Italferr S.p.A. created a digital twin for structural monitoring of St. Peter's Basilica in Vatican City using a whole suite of software from Bentley Systems, including iTwin. The twin combines 26,000 files holding more than three terabytes of disparate data, enough to store 700 feature-length movies in high definition.
Using the technology saves hundreds of hours by reducing the need for on-site inspections. The digital twin also helps in the maintenance and preservation of the iconic Basilica, which can hold 60,000 people. With a digital twin, monitoring can be done remotely, which means the Vatican doesn't have to close.
Of course, this is only one example. In the manufacturing industry, a digital twin can be created of a factory, warehouse, or facility to help drive efficiency on the plant floor. Companies like BMW, Rolls-Royce, Toyota, Microsoft, Siemens, GE, and Lowe's are using digital twins, and cities like Singapore, Zurich, and Orlando are leveraging them as well. And, of course, this is just a sampling.
Addressing Digital Twin Challenges
Of course, creating and managing digital twins is easier said than done. In an April 2025 blog from NVIDIA, James McKenna points to some of these unique challenges, such as fragmented data pipelines, siloed tools, and the need for real-time, high-fidelity simulations.
He suggests the Mega NVIDIA Omniverse Blueprint helps address these challenges by providing a scalable reference workflow for simulating multi-robot fleets in industrial facility digital twins.
Societal challenges persist too, including rising material costs, a fragmented supply chain, the labor skills shortage, and growing demand for more complex production facilities, just to name a few. We need to work faster than ever before, which also means we need to work smarter than ever before. As if that were not a challenge already.
Enter the Digital Twin Consortium's first testbeds, part of its Digital Twin Testbed Program. The organization suggests this collaborative, member-driven program will accelerate development, validation, and implementation, demonstrating digital twin evolution with agentic AI and enabling technologies.
With this, members will be able to model, simulate, integrate, rigorously verify, deploy, and optimize digital twin solutions while collectively advancing the core technologies. The Digital Twin Testbed Program implements the Digital Twin Consortium Composability Framework, using the Business Maturity Model, Platform Stack Architecture, and Capabilities Periodic Table, alongside a capabilities-focused maturity assessment framework that incorporates the evaluation of generative AI, multi-agent systems, and other advanced technologies.
As technology continues to advance, we will need to keep a finger on its pulse, find the best ways to train people, and build processes for success. The future is certainly bright for digital twins, but how best to leverage the technology will need to be an ongoing conversation. Digital twins are changing our engineering and manufacturing environments, helping us predict issues and identify challenges and opportunities before they arise. Remember, digital transformation is a journey, not a destination.
Want to tweet about this article? Use hashtags #IoT #sustainability #AI #5G #cloud #edge #futureofwork #digitaltransformation #green #ecosystem #environmental #circularworld