Opinion: Future Aerospace Enterprises Will Demand More Advanced Modeling And Simulation
Computational modeling has already transformed aircraft design by reducing dependence on costly testing and evaluation while providing access to high-quality optimized configurations. However, as the value chain in the aerospace enterprise shifts and new businesses emerge, conventional modeling approaches will not be sufficient for future challenges. New approaches will be required, driven both by enterprise needs and the evolution of computing systems.
To understand the limitations of computational modeling (CM), consider the following: a near-complete simulation of a modern aircraft engine would consume as much power as the engine generates in flight. The success of conventional CM has been driven by the spectacular growth of so-called high-performance computing (HPC) systems. As HPC machines became more powerful, more realistic physics could be represented, which in turn increased predictive accuracy. However, more processing capability implies more energy consumption.
Another consideration is turnaround time. Even if on-demand supercomputing capability is available, such detailed simulations can take days, if not weeks, to complete. While many different approaches are being pursued to address these issues (predictive modeling, green computing, etc.), it is necessary to ask whether CM alone is sufficient for emerging aerospace needs.
The new computing needs arise from new business opportunities as well as pervasive digitalization across the aerospace enterprise. Digital threads are transforming the design, maintenance and operation of fleets through the use of virtual copies of physical assets. At a fundamental level, such copies are collections of models that together can predict the lifetime performance of an aircraft.
As autonomous and personal air vehicles proliferate, rapid flight planning that accounts for local weather effects will become necessary. Satellite data, along with sensor and flight data from other vehicles in that airspace, will have to be integrated. Even the design workflow is changing. Next-generation design will use augmented and virtual reality (AR/VR): designers will construct computer-aided-design models in virtual 3D space and obtain immediate feedback on structural characteristics and flow profiles as they alter aspects of the model (in a web browser, search "tilt brush" for an example of VR-based design).
The key difference between conventional design and these new paths is the need for speed. Computational results will have to be delivered in a matter of minutes, or in some cases even in real time. In many situations, such simulations will have to be conducted without access to a bank of HPC resources.
How can these new challenges be addressed? First, the key differentiator is data. In the operational space, the physical asset will produce streaming data from sensors and other sources. This will allow detailed physics models to be replaced with coarser but calibrated models that are faster to execute. In the design cycle, multifidelity data from component tests, experiments and varied sets of simulations can be used. This process itself relies heavily on statistics, uncertainty quantification and machine learning.
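The calibration idea described here can be sketched in a few lines. In this purely illustrative example (the model form, data and coefficient names are all hypothetical, not drawn from any particular aerospace code), a coarse two-coefficient drag model is fit by least squares to a handful of synthetic "high-fidelity" samples standing in for expensive simulations or test data; once calibrated, the coarse model is cheap enough to evaluate in real time.

```python
import numpy as np

# Hypothetical "high-fidelity" data: drag coefficient vs. angle of attack,
# standing in for expensive simulations or flight-test measurements.
alpha = np.linspace(0.0, 10.0, 11)                     # angle of attack, deg
cd_hifi = 0.02 + 0.0012 * alpha**2                     # underlying trend
rng = np.random.default_rng(0)
cd_hifi = cd_hifi + rng.normal(0.0, 1e-4, alpha.size)  # measurement noise

# Coarse model: cd = c0 + c1 * alpha^2, with coefficients to calibrate.
# The model is linear in its parameters, so least squares suffices.
A = np.column_stack([np.ones_like(alpha), alpha**2])
coeffs, *_ = np.linalg.lstsq(A, cd_hifi, rcond=None)

def cd_coarse(a):
    """Calibrated coarse model: cheap to evaluate anywhere."""
    return coeffs[0] + coeffs[1] * np.asarray(a)**2

# Residual spread gives a crude uncertainty estimate on predictions,
# the kind of statistical bookkeeping the text argues for.
resid_std = np.std(cd_hifi - A @ coeffs)
```

In practice the surrogate would be richer (Gaussian processes, neural networks) and the uncertainty treatment more formal, but the pattern is the same: expensive data in, a fast calibrated model with error bars out.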
Second, computing systems are undergoing a new revolution. With ubiquitous digitalization, computing is no longer restricted to dedicated machines but can happen at multiple levels: on the device (the AR/VR headset, for example), in the cloud and on traditional supercomputers. This new era of distributed computing will require a suite of connected models that operate on devices with disparate capabilities and exploit whatever processing power is available to deliver results in real time. Leveraging advances in connectivity (5G and beyond) and security will become important.
Finally, the simulation data should be presented in a manner that accelerates time to decision. The sheer amount of information can overwhelm the user, especially when it must be parsed in real time. AR/VR tools can provide immersive environments in which simulation data extends the senses of the human user.
Such changes to computing have major implications for aerospace education. Foremost, understanding data provenance, usage and limitations has to be at the center of the aero curriculum. Data-based modeling, supported by insight into physics as well as an understanding of computing hardware, is necessary. Since many of these applications involve operation under uncertainty, a move away from deterministic and toward statistical thinking should be encouraged.
For both research and education, maintaining the status quo in CM is not an option. If we do not adapt to these changes, we may cede our influence to other engineering and science disciplines.
Venkat Raman is a professor in the Department of Aerospace Engineering at the University of Michigan.
The views expressed are not necessarily those of Aviation Week.