
Paris Akhshi, PhD, Product Marketing Manager, Keysight Technologies
During product development, companies often aim for products that are faster, cheaper, smaller, more accurate and lower power. Despite recent improvements, most companies still take several months to correlate test data with design, significantly delaying product launches and straining engineering resources. Many companies have standardised and streamlined their design, test and build processes, including implementing gate processes to ensure consistency and repeatability. However, the tools and methods remain complex. Research shows that organisations use a variety of software tools to design, test and verify their products, with most using three to ten tools for each process. This complexity can delay product introduction and cause performance and quality issues during production and customer support.
However, change is on the horizon. A new approach to design and build workflows uses digital twins. Digital twins mimic every aspect of the product as if it were real, eliminating the gap between theory and reality.
The role of digital twins
A digital twin is a virtual model that accurately reflects a physical object. It can continuously learn and update itself to reflect the near-real-time status and operating conditions of the physical system it mirrors. Updates may come from various sources, including a human expert familiar with the system’s operation. Embedded sensors within the physical system can provide information about the system or its operating environment, and updates can also come from a connected artificial intelligence (AI) and machine learning (ML) system. The digital twin may also incorporate historical data from past operations.
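As a simple illustration of this update cycle, the sketch below (in Python, with hypothetical field names and an assumed smoothing factor) shows a twin object blending streamed sensor readings into its current state while retaining a history for later analysis; it is a conceptual aid, not a representation of any particular digital twin platform.

```python
# Minimal sketch (hypothetical names): a digital twin that refreshes its state
# from streamed sensor readings and keeps a history for later analysis.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class DigitalTwin:
    state: Dict[str, float] = field(default_factory=dict)    # current estimate of the physical system
    history: List[Dict[str, float]] = field(default_factory=list)
    smoothing: float = 0.2                                    # assumed blend factor for new readings

    def update(self, sensor_readings: Dict[str, float]) -> None:
        """Blend incoming sensor data into the twin's state (exponential smoothing)."""
        for key, value in sensor_readings.items():
            previous = self.state.get(key, value)
            self.state[key] = (1 - self.smoothing) * previous + self.smoothing * value
        self.history.append(dict(self.state))                 # retain historical data

twin = DigitalTwin()
twin.update({"temperature_C": 41.2, "vibration_g": 0.031})
twin.update({"temperature_C": 42.0, "vibration_g": 0.029})
print(twin.state)
```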
During the product life cycle, digital twins play a crucial role in reducing cost and time as customers incorporate them into design and test workflows. Industry pioneers are constantly exploring new means to import and export digital twin data, as well as to aggregate and share specific models, schemes and data across the product spectrum. They are also developing the notion of bionic digital twins, or integrated digital twins, to represent complex network systems that incorporate live operational hardware devices or hardware emulators as part of the overall digital twin.
Design verification (simulation) and hardware validation consume nearly two-thirds of product development time, making them a prime target for improvement. The two activities have much in common: both strive to characterise a design against a set of requirements. One occurs in the virtual digital twin domain, the other in the physical test domain. Connecting these tasks through digital threads allows the digital twin to deliver tangible benefits.
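The sketch below (plain Python, with made-up gain and noise-figure numbers and limits) illustrates the digital-thread idea at its simplest: the same requirement limits are applied to simulated and measured values so the two domains can be correlated directly rather than months apart.

```python
# Conceptual sketch (hypothetical data and limits): a "digital thread" check that
# places simulated predictions and measured results side by side against requirements.
simulated = {"gain_dB": 29.6, "noise_figure_dB": 3.1}   # from the virtual (digital twin) domain
measured  = {"gain_dB": 28.9, "noise_figure_dB": 3.4}   # from the physical test domain
requirements = {"gain_dB": (28.0, None), "noise_figure_dB": (None, 3.5)}  # (min, max) limits

def check(value, limits):
    lo, hi = limits
    ok = (lo is None or value >= lo) and (hi is None or value <= hi)
    return "PASS" if ok else "FAIL"

for name, limits in requirements.items():
    print(f"{name:18s} sim={simulated[name]:6.2f} ({check(simulated[name], limits)})"
          f"  meas={measured[name]:6.2f} ({check(measured[name], limits)})"
          f"  delta={measured[name] - simulated[name]:+.2f}")
```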
How AI augments digital twins
Advancements in AI and ML are playing an increasingly important role in making digital twins more powerful and capable. With AI and automated learning, a digital twin can dynamically improve the quality and validity of its predictions under various operational scenarios. When ML-based models are integrated into the digital twin, designers can scale these models to large networks with tens or hundreds of nodes and represent diverse real-world situations. A digital twin also offers the advantage of generating synthetic data with the required parametrisations while replicating previous tests. ML training depends on data sets with the informational complexity to fully exercise the underlying algorithms; producing such data with physical systems alone is nearly impossible, and high-fidelity simulation models such as digital twins are the only practical way to do so.
Synthetic data generation means producing data within a virtual world and using it to train an ML system and assess how well it performs under diverse real-world situations. A physical system cannot cover all of these tasks because of constraints on time, money and tractability. For organisations that intend to deploy AI and ML widely, these network digital twins will be crucial.
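A minimal sketch of that workflow, assuming a toy throughput-versus-SNR model stands in for a real network twin, is shown below in Python: the "twin" generates labelled data across a wide, repeatable parameter sweep, a simple regressor is trained on it, and the fit is then checked on unseen operating conditions.

```python
# Sketch (assumed toy model): use a simulation stand-in for a digital twin to generate
# labelled synthetic data across swept operating conditions, then train and evaluate a
# simple ML regressor on scenarios a physical testbed could not cover exhaustively.
import numpy as np

rng = np.random.default_rng(seed=1)

def simulate_throughput(snr_db: np.ndarray) -> np.ndarray:
    """Toy 'digital twin': throughput vs. SNR with measurement-like noise (assumed model)."""
    capacity = np.log2(1.0 + 10.0 ** (snr_db / 10.0))        # Shannon-style shape
    return capacity + rng.normal(scale=0.05, size=snr_db.shape)

# Generate synthetic training data across a wide, repeatable parameter sweep.
snr_train = rng.uniform(0.0, 30.0, size=2000)
y_train = simulate_throughput(snr_train)

# Fit a simple polynomial regressor as a stand-in for a larger ML model.
coeffs = np.polyfit(snr_train, y_train, deg=3)

# Evaluate on unseen operating conditions generated from the same twin.
snr_test = np.linspace(0.0, 30.0, 7)
predicted = np.polyval(coeffs, snr_test)
actual = simulate_throughput(snr_test)
print("mean absolute error:", np.mean(np.abs(predicted - actual)))
```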
Integrated tools for complete analysis
Remember that simply automating design will not suffice. Modern radio frequency (RF) design tools share data and measurements from simulations and real-world observations in an authoritative digital source of truth. The Keysight PathWave System Design tool integrates RF modelling, simulation, enterprise system engineering tools and Keysight measurement science for better results. The process starts with high-fidelity RF models, surrounded by an easy-to-use system architecture simulation workflow. With robust modelling and powerful simulation capabilities, system architects can understand how RF processing blocks work together, regardless of their experience with RF characteristics. The solution supports key steps in designing high-performance systems, including creating virtual prototypes, optimising subsystems and algorithms, and digitally testing and evaluating complex mission scenarios, including rapid dynamic changes.
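To give a flavour of this kind of system-level analysis, the sketch below (plain Python with illustrative block values, not PathWave's API) computes cascaded gain and noise figure for a chain of RF processing blocks using the Friis formula; a system architect would iterate on budgets like this virtually before committing to hardware.

```python
# Generic sketch (not PathWave's API): cascaded gain and noise figure for a chain of
# RF processing blocks, the kind of budget calculation a system architect iterates on
# before committing to hardware. Block values are illustrative assumptions.
import math

# Each block: (name, gain_dB, noise_figure_dB)
chain = [("LNA", 18.0, 1.2), ("filter", -1.5, 1.5), ("mixer", -7.0, 9.0), ("IF amp", 20.0, 4.0)]

def db_to_lin(db: float) -> float:
    return 10.0 ** (db / 10.0)

total_gain_lin = 1.0
total_nf_lin = 1.0
for _, gain_db, nf_db in chain:
    # Friis formula: each stage's noise contribution is divided by the gain preceding it.
    total_nf_lin += (db_to_lin(nf_db) - 1.0) / total_gain_lin
    total_gain_lin *= db_to_lin(gain_db)

print(f"cascaded gain:         {10 * math.log10(total_gain_lin):.2f} dB")
print(f"cascaded noise figure: {10 * math.log10(total_nf_lin):.2f} dB")
```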