Paris Akhshi, PhD, Product Marketing Manager, Keysight Technologies

During product development, companies aim for faster, cheaper, more accurate, lower-power, and smaller designs. To accomplish these goals, designers have relied on step-function innovations such as switching from through-hole components to surface-mount devices, moving from manual spreadsheet analysis to sophisticated simulations, and automating manual testing. Despite these improvements, most companies still take several months to correlate test data with design, significantly delaying product launches and straining engineering resources.

Many companies have standardized and streamlined their design, test, and build processes, including implementing gate processes to ensure consistency and repeatability. However, the tools and methods remain complex. Research shows that organizations use a variety of software tools to design, test, and verify their products, as shown in Figure 1. Most organizations use three to 10 software tools for each process. This complexity can delay product introduction and cause performance and quality issues during production and customer support.

Figure 1. Number of software tools used for testing and verification

But change is on the horizon. A new approach to design and build workflows uses digital twins, which mimic every aspect of the product as if it were real, eliminating the gap between theory and reality. While the digital twin concept appears simple, it proves deceptively complex once unraveled.

The role of digital twins

A digital twin is a virtual model that accurately reflects a physical object. It can continuously learn and update itself to reflect the near-real-time status and operating conditions of the physical system it is twinning. Updates may come from various sources: a human expert familiar with the system's operation, embedded sensors that report on the system or its operating environment, a connected artificial intelligence (AI) and machine learning system, or historical data from past operations.
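
To make the idea concrete, the sketch below models a twin's update loop in Python. It is purely illustrative, not any vendor's API; the class, method, and field names are hypothetical, as is the simple 50/50 blending of model predictions with the current state.

from dataclasses import dataclass, field
from statistics import mean
from typing import Dict, List

@dataclass
class DigitalTwin:
    """Minimal illustrative twin: merges updates from sensors,
    human experts, ML predictions, and historical records."""
    state: Dict[str, float] = field(default_factory=dict)
    history: List[Dict[str, float]] = field(default_factory=list)

    def update_from_sensor(self, readings: Dict[str, float]) -> None:
        # Embedded sensors report near-real-time operating conditions;
        # the prior state is archived as historical data.
        self.history.append(dict(self.state))
        self.state.update(readings)

    def update_from_expert(self, corrections: Dict[str, float]) -> None:
        # A human expert familiar with the system overrides estimates.
        self.state.update(corrections)

    def update_from_model(self, predictions: Dict[str, float]) -> None:
        # An AI/ML model contributes predicted values; here they are
        # blended 50/50 with the current estimate as a simple heuristic.
        for key, value in predictions.items():
            current = self.state.get(key, value)
            self.state[key] = mean([current, value])

twin = DigitalTwin()
twin.update_from_sensor({"temperature_c": 41.2, "vibration_g": 0.8})
twin.update_from_model({"temperature_c": 42.0})
print(twin.state)  # {'temperature_c': 41.6, 'vibration_g': 0.8}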

During the product life cycle, digital twins play a crucial role in reducing cost and time as customers incorporate them into design and test workflows. Industry pioneers are constantly exploring new means to import and export digital twin data as well as to aggregate and share specific models, schemas, and data across the product spectrum. They are also developing the notion of bionic digital twins or integrated digital twins to represent complex network systems that incorporate live hardware operational devices or hardware emulators as part of the overall digital twin. 

Design verification (simulation) and hardware validation consume nearly two-thirds of product development time, making them a prime area for improvement. The two activities have much in common: both strive to characterize a design against a set of requirements. One occurs in the virtual digital twin domain; the other takes place in the physical test domain. Connecting these tasks through digital threads allows the digital twin to deliver tangible benefits.
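
For example, a digital thread can be as simple as joining simulated and measured results on a shared requirement identifier so that deviations surface automatically. The sketch below uses made-up data, IDs, and a tolerance; none of it comes from a real toolchain.

# Hypothetical digital-thread sketch: correlate design-verification
# (simulation) results with hardware-validation measurements that
# share a common requirement ID.

simulated = {  # from the virtual digital twin domain
    "REQ-001": {"gain_db": 20.1},
    "REQ-002": {"gain_db": 14.8},
}
measured = {   # from the physical test domain
    "REQ-001": {"gain_db": 19.7},
    "REQ-002": {"gain_db": 13.9},
}

TOLERANCE_DB = 0.5  # assumed pass/fail threshold

for req_id, sim in simulated.items():
    meas = measured[req_id]
    delta = abs(sim["gain_db"] - meas["gain_db"])
    status = "OK" if delta <= TOLERANCE_DB else "INVESTIGATE"
    print(f"{req_id}: sim={sim['gain_db']} dB, "
          f"meas={meas['gain_db']} dB, delta={delta:.1f} dB -> {status}")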

How artificial intelligence augments digital twins

Advancements in artificial intelligence and machine learning are playing an increasingly important role in making digital twins more powerful and capable. With AI and machine learning, a digital twin can dynamically improve the quality and validity of its predictions under various operational scenarios. When machine learning-based models are integrated into the digital twin, designers can scale these models to large networks with tens or hundreds of nodes and represent diverse real-world situations.
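
One plausible pattern, sketched below with NumPy and scikit-learn purely as an assumption about how such integration might look, is a residual model: the machine learning model learns the gap between the twin's physics-based prediction and observed behavior, and the twin applies the learned correction at prediction time.

import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

def physics_model(load):
    # Simplified first-principles prediction from the twin.
    return 25.0 + 0.8 * load

# Synthetic "field" observations: reality deviates nonlinearly.
loads = rng.uniform(0, 100, size=500)
observed = physics_model(loads) + 0.002 * loads**2 + rng.normal(0, 0.5, 500)

# The ML model learns the residual between physics and observation.
residuals = observed - physics_model(loads)
corrector = RandomForestRegressor(n_estimators=50, random_state=0)
corrector.fit(loads.reshape(-1, 1), residuals)

def twin_predict(load):
    # Physics prediction plus learned correction.
    return physics_model(load) + corrector.predict([[load]])[0]

print(twin_predict(80.0))  # near the observed ~101.8, not physics' 89.0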

A digital twin offers the advantage of generating synthetic data with the required parameterizations while replicating previous tests. Machine learning training needs data sets of comparable variety and enough informational complexity to fully exercise the underlying algorithms. With physical systems alone, this is nearly impossible; only high-fidelity simulation models such as digital twins make it practical.

Synthetic data generation means producing data within a virtual world and using it to train a machine learning system and assess how well it performs under diverse real-world situations. A physical system cannot cover all these tasks; time, money, and tractability get in the way. If AI and machine learning are to be deployed widely, these network digital twins will be crucial.
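
As a concrete sketch, the snippet below plays the role of a network digital twin for a noisy BPSK link, a hypothetical stand-in chosen for brevity. It sweeps the signal-to-noise ratio and emits labeled samples at each operating point, yielding a training corpus that would be slow and costly to collect from physical hardware.

import numpy as np

rng = np.random.default_rng(42)

def simulate_link(snr_db, n_samples=1000):
    # Hypothetical twin of a noisy BPSK link: returns received samples
    # together with the ground-truth bits, labels a physical setup
    # rarely yields at this scale and cost.
    bits = rng.integers(0, 2, n_samples)
    symbols = 2 * bits - 1                      # map {0,1} -> {-1,+1}
    noise_std = 10 ** (-snr_db / 20)
    received = symbols + rng.normal(0, noise_std, n_samples)
    return received, bits

# Sweep the required parameterizations to build a labeled data set
# covering conditions from poor to clean channels.
dataset = []
for snr_db in range(0, 21, 5):
    rx, bits = simulate_link(snr_db)
    dataset.append((snr_db, rx, bits))
    ber = np.mean((rx > 0).astype(int) != bits)  # hard-decision error rate
    print(f"SNR {snr_db:2d} dB: {len(rx)} labeled samples, BER={ber:.3f}")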

Integrated tools for complete analysis

Simply automating design no longer suffices. Modern radio frequency (RF) design tools share data and measurements from simulations and real-world observations in an authoritative digital source of truth. The Keysight PathWave System Design tool integrates RF modeling, simulation, enterprise system engineering tools, and Keysight measurement science for better results.

The process starts with high-fidelity RF models embedded in an easy-to-use system architecture simulation workflow. With robust modeling and powerful simulation capabilities, system architects can understand how RF processing blocks work together, regardless of their experience with RF. The solution enables key steps toward designing high-performance systems: creating virtual prototypes, optimizing subsystems and algorithms, and digitally testing and evaluating complex mission scenarios, including rapid dynamic changes.