Paris Akhshi, PhD, Product Marketing Manager, Keysight Technologies

During product development, companies often aim for faster, cheaper, smaller, more accurate, and lower-power products. Despite recent improvements, most companies still take several months to correlate test data with design, significantly delaying product launches and straining engineering resources. Many companies have standardised and streamlined their design, test and build processes, including implementing gate processes to ensure consistency and repeatability. However, the tools and methods remain complex. Research shows that organisations use various software tools for designing, testing and verifying their products; most use three to ten software tools for each process. This complexity can delay product introduction and cause performance and quality issues during the production and customer support processes.

However, change is on the horizon. A new approach to design and build workflows involves using digital twins. Digital twins mimic every aspect of the product as if it were real, closing the gap between theory and reality.

The role of digital twins

A digital twin is a virtual model that accurately reflects a physical object. A digital twin can continuously learn and update itself to reflect the near-real-time status and operating conditions of the physical system it is twinning. Updates may come from various sources, including a human expert familiar with the system’s operation. Embedded sensors within the large physical system can provide information about the system or its operating environment. It is also possible to get updates from a connected artificial intelligence (AI) and machine learning (ML) system. The digital twin may also incorporate historical data from past operations.
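The update sources listed above can be pictured as streams merging into one shared state. The following minimal sketch (all class, field, and source names here are invented for illustration, not any vendor's API) shows a twin object fusing readings from sensors, a human expert, and an ML model while keeping a historical log:

```python
from dataclasses import dataclass, field

@dataclass
class DigitalTwin:
    """Toy digital twin: a near-real-time state mirror plus a history log."""
    state: dict = field(default_factory=dict)    # latest known condition of the asset
    history: list = field(default_factory=list)  # record of every past update

    def update(self, source: str, readings: dict) -> None:
        """Merge one observation into the twin and remember where it came from."""
        self.history.append((source, dict(readings)))
        self.state.update(readings)

twin = DigitalTwin()
twin.update("embedded_sensor", {"temperature_c": 71.4, "vibration_g": 0.03})
twin.update("human_expert", {"bearing_condition": "worn"})
twin.update("ml_model", {"remaining_useful_life_h": 1200})

print(twin.state)
```

A production twin would of course add time stamps, conflict resolution between sources, and a physics model in between, but the pattern of continuously folding heterogeneous updates into one authoritative state is the same.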

During the product life cycle, digital twins play a crucial role in reducing cost and time as customers incorporate them into design and test workflows. Industry pioneers are constantly exploring new means to import and export digital twin data as well as to aggregate and share specific models, schemes and data across the product spectrum. They are also developing the notion of bionic digital twins or integrated digital twins to represent complex network systems that incorporate live hardware operational devices or hardware emulators as part of the overall digital twin.

Design verification (simulation) and hardware validation take nearly two-thirds of product development time. This is an area where industries hope to improve. There are many similarities between these two activities, and both strive to characterise a design based on a set of requirements. One activity occurs within the virtual digital twin domain, and the other takes place within the physical test domain. It is possible to connect these tasks through digital threads, allowing the digital twin to deliver tangible benefits.
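One concrete form such a digital thread can take is an automated correlation step: line up simulated results from the twin with bench measurements for the same test points and flag where they diverge. The sketch below is illustrative only; the parameters, values, and tolerance are invented and do not come from any specific tool:

```python
# Simulated (digital-twin) vs. measured results for the same RF test points.
# All figures below are made up for demonstration.
simulated = {"gain_db": 20.1, "noise_figure_db": 2.4, "p1db_dbm": 14.0}
measured  = {"gain_db": 19.6, "noise_figure_db": 2.9, "p1db_dbm": 13.2}

TOLERANCE_DB = 0.5  # assumed pass/fail threshold for sim-to-measurement delta

def correlate(sim: dict, meas: dict, tol: float):
    """Return per-parameter deltas and the parameters exceeding tolerance."""
    deltas = {k: round(meas[k] - sim[k], 3) for k in sim}
    failures = [k for k, d in deltas.items() if abs(d) > tol]
    return deltas, failures

deltas, failures = correlate(simulated, measured, TOLERANCE_DB)
print(deltas)    # {'gain_db': -0.5, 'noise_figure_db': 0.5, 'p1db_dbm': -0.8}
print(failures)  # ['p1db_dbm']
```

Running this check automatically whenever either side changes is what turns two separate activities, simulation and physical test, into one connected thread.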

How AI augments digital twins

Advancements in AI and ML are playing an increasingly important role in making digital twins more powerful and capable. With AI and automated learning, a digital twin can dynamically improve the quality and validity of its predictions under various operational scenarios. When ML-based models integrate into the digital twin, designers can scale these models to large networks with tens or hundreds of nodes and represent diverse real-world situations. A digital twin offers the advantage of generating synthetic data with the required parametrisations while replicating previous tests. ML training needs data sets that are both repeatable across runs and informationally rich enough to fully exercise the underlying algorithms. With physical systems alone, this is nearly impossible; high-fidelity simulation models such as digital twins are the only practical way to achieve it.

Synthetic data generation means producing training data within a virtual world. One can use this context to train and assess how well an ML system performs under diverse situations in the real world. A physical system cannot do all these tasks because of a lack of time, money or tractability. If one intends to widely deploy AI and ML, these types of network digital twins will be crucial.
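The two properties argued for above, sweeping parametrised scenarios and exactly replaying a previous test, can be sketched in a few lines. The simple noisy-link model below is a stand-in for a real digital twin; the function, parameters, and SNR values are assumptions for illustration:

```python
import random

def simulate_link(snr_db: float, n_samples: int, seed: int):
    """Generate (noisy_measurement, label) pairs for one operating scenario."""
    rng = random.Random(seed)           # fixing the seed makes the run repeatable
    noise_scale = 10 ** (-snr_db / 20)  # higher SNR -> less noise
    data = []
    for _ in range(n_samples):
        bit = rng.randint(0, 1)         # ground-truth label, known in the virtual world
        sample = (1.0 if bit else -1.0) + rng.gauss(0, noise_scale)
        data.append((sample, bit))
    return data

# Sweep operating scenarios that would be slow or costly to set up physically.
dataset = {snr: simulate_link(snr, n_samples=1000, seed=42) for snr in (0, 10, 20)}

# Replaying with the same seed and parameters reproduces the earlier "test" exactly,
# something a physical setup can rarely guarantee.
assert simulate_link(10, 1000, seed=42) == dataset[10]
```

Because the virtual world knows the ground-truth label for every sample, the generated data is labelled for free, which is exactly what supervised ML training requires.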

Integrated tools for complete analysis

Remember that simply automating design will not suffice. Modern radio frequency (RF) design tools share data and measurements from simulations and real-world observations in an authoritative digital source of truth. The Keysight PathWave System Design tool integrates RF modelling, simulation, enterprise system engineering tools and Keysight measurement science for better results. The process starts with high-fidelity RF models, surrounded by an easy-to-use system architecture simulation workflow. With robust modelling and powerful simulation capabilities, system architects can understand how RF processing blocks work together, regardless of their experience with RF characteristics. The solution enables key steps towards designing high-performance systems: creating virtual prototypes, optimising subsystems and algorithms, and digitally testing and evaluating complex mission scenarios, including rapid dynamic changes.