AI/ML for R&D Can Drastically Accelerate Discoveries Like the Supercritical Airfoil Design
Research and Development in the Age of Artificial Intelligence and Machine Learning
Since the dawn of the scientific revolution, research and development of new technologies has been conducted almost entirely through experimentation and observation. From the telephone to pharmaceuticals, the technologies that shape our daily lives began as crude versions of the finely tuned products we use today. Engineers, scientists, and researchers have been bound to long, iterative R&D cycles of forming hypotheses, running tests, and measuring results. Until now. AI-enabled R&D has already been credited with profound discoveries such as specialized antibiotics, and it will continue to reinvent the way we invent.
The rules of R&D are being rewritten by innovators using capabilities like artificial intelligence, machine learning and deep learning, simulation, and modeling. Long before the advent of AI/ML, simulation and digital modeling were adopted by R&D teams for a variety of reasons, including increased safety (e.g., vehicle crash testing) and decreased cost (e.g., building stability testing). Trends like computer-aided design (e.g., blueprints and 3D models) and later computer-aided engineering (e.g., physics and chemistry simulations) paved the way for a new generation of computational science and engineering that applies AI/ML to new innovations.
Why do AI and ML for R&D matter? Some argue that the overall pace of innovation is slowing (World Bank), while others argue that solving big problems has gotten harder or less interesting (MIT Technology Review). In either case, researchers, engineers, and scientists need new approaches to discovery that accelerate innovation and broaden possibilities.
Shifting From Traditional Aerospace Design to AI/ML-Enabled Design
One of the most iconic stories of continuous engineering optimization in modern flight is wing shape design and the discovery of the supercritical airfoil. For over a century, aerospace engineers have worked to improve the cross-sectional shape of wings, known as the airfoil. Ideal airfoil shapes minimize air resistance and turbulence while maximizing lift. After 40 years of research and development, planes had become significantly faster thanks to the jet engine, but high speeds led to instability, especially as pioneers approached the speed of sound. Aerodynamicists iterated continuously on airfoil designs until eventually arriving at the supercritical design, which was incorporated into a wide range of aircraft from commercial to military, drastically improving safety and efficiency.
The supercritical airfoil took decades to discover, relying on physical tools like wind tunnels (since the 1930s) and eventually digital tools like computational fluid dynamics (CFD) (since the 1970s) – both approaches being highly iterative and yielding only incremental gains. This raises the question: with today’s understanding of physics combined with technologies like digital prototyping and artificial intelligence, can we achieve transformational discoveries much faster than ever before? The answer is most likely yes.
AI and related technologies like machine and deep learning, physics-informed neural networks, and surrogate models show immense potential for solving complex design problems like the supercritical airfoil. Engineers can use computational methods to generate virtually limitless design candidates, test them against a wide range of real-world conditions, and down-select to the optimal design. In recent decades, aerospace engineers have relied on CFD software and parallel computing; while that hardware and software continue to get faster, the approach remains time-intensive, which is turning attention to this new category of AI-enabled methods.
Discovering the Optimal Airfoil Design Using ML and a Surrogate Model on Rescale
To illustrate how aerodynamicists can get started designing and optimizing airfoil shapes, we have assembled a tutorial that reproduces the supercritical airfoil design much faster than traditional approaches. To begin deploying AI/ML, R&D teams need a variety of toolsets, so Rescale has built a platform with a complete stack and a simple workflow to get started. As with standard computer-aided engineering (CAE) methods like CFD or finite element analysis (FEA), engineers need access to use-case-specific software and specialized hardware (e.g., CPU, GPU, storage, networking, memory).
In addition to specific hardware and software, AI approaches rely on a range of frameworks and large data sets for training models, all of which Rescale brings together in one platform. In this particular use case, the user can leverage an existing data set of possible airfoil shapes and an open-source framework for multidisciplinary design analysis and optimization (MDAO) to perform aerodynamic shape optimization.
In this project, we applied ML techniques to accelerate CFD simulations and optimization processes on the Rescale platform. First, we applied principal component analysis (PCA) to reduce the dimensionality of a large airfoil database. Then we built a PCA-based reduced-order model that produces low-fidelity flow field predictions to accelerate the convergence of the high-fidelity simulations needed for training data generation. Finally, using this training data, we built a Gaussian-process-based surrogate model that drives a rapid optimization loop to minimize the drag of a baseline airfoil under given constraints.
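The three steps above can be sketched in a few lines of Python. This is a minimal illustration, not the actual tutorial code: the shape database, the drag function, and all dimensions are synthetic stand-ins, and in the real workflow the training drag values come from high-fidelity CFD runs.

```python
# Sketch of the PCA + Gaussian-process surrogate workflow, using synthetic
# data in place of the airfoil database and CFD results (all names and
# numbers here are illustrative assumptions).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# 1) Stand-in airfoil database: 500 shapes, each a vector of 200 surface points.
shapes = rng.normal(size=(500, 200))

# 2) PCA reduces each shape to a handful of modal coordinates.
pca = PCA(n_components=5)
coords = pca.fit_transform(shapes)          # (500, 5) reduced design space

# 3) Stand-in "high-fidelity" drag values for the training shapes.
#    In the real workflow these come from accelerated CFD simulations.
def fake_drag(z):
    return np.sum(z**2, axis=-1) + 0.1 * np.sin(z[..., 0])

train_z = coords[:100]
train_cd = fake_drag(train_z)

# 4) Gaussian-process surrogate mapping reduced coordinates -> drag.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)
gp.fit(train_z, train_cd)

# 5) Rapid optimization loop: minimize predicted drag over the reduced space.
res = minimize(lambda z: gp.predict(z.reshape(1, -1))[0], x0=np.zeros(5))

# Map the optimum back to a full airfoil shape.
best_shape = pca.inverse_transform(res.x.reshape(1, -1))[0]
print("predicted minimum drag:", res.fun)
```

Because each surrogate evaluation is nearly instantaneous, the optimization loop in step 5 can be rerun with different constraints or objectives without touching the expensive simulations again.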
Running each high-fidelity case takes around 50 seconds (8 cores per sample) when started from initial conditions, and about 4 seconds when started from a predicted flow field. When the training data is generated naively, i.e., with no acceleration from the intermediate model, generating the whole dataset takes around 17 hours. However, when a large majority of the high-fidelity simulations are initialized with intermediate predictions, data generation takes only 2 hours. The time to train the surrogate model itself is negligible compared to generating the training data. So it takes a couple of hours to generate the data needed to train the final surrogate model; but once trained, the model produces sufficiently accurate predictions in a fraction of a second each. As a result, the front-loaded cost of training data generation is amortized over the subsequent optimization studies, which may involve multiple optimizations with different constraints and objectives.
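As a back-of-the-envelope check, the sample count implied by these timings can be reconstructed. The 5% cold-start fraction below is an assumption for illustration; the text says only that a large majority of runs start from predicted flow fields.

```python
cold_s, warm_s = 50, 4        # seconds per high-fidelity case (8 cores/sample)
naive_hours = 17              # quoted naive dataset generation time

# Implied number of training cases if every run starts cold.
n_samples = round(naive_hours * 3600 / cold_s)
print(n_samples)  # → 1224

# Assume ~5% of cases still start cold to seed the intermediate model
# (hypothetical split, not stated above).
n_cold = n_samples // 20
accel_s = n_cold * cold_s + (n_samples - n_cold) * warm_s
print(f"accelerated: {accel_s / 3600:.1f} h")           # → accelerated: 2.1 h
print(f"speedup: {naive_hours * 3600 / accel_s:.0f}x")  # → speedup: 8x
```

Under these assumptions the accelerated time lands close to the 2-hour figure quoted above, with roughly an 8x reduction in data generation time.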
In this comparison of traditional CAE versus ML design methods, we demonstrated that using a surrogate model to aid or replace high-fidelity simulations greatly accelerates design and optimization studies and makes it possible to dynamically change constraints or objectives without repeating high-fidelity simulations for each study. The ML method delivered an approximately 8x speedup. If you would like to learn more or try it for yourself, check out our guided tutorial with summary video here. If you’re interested in AI/ML techniques you can deploy for other industries and use cases, get in touch with us here. We are pleased to support many pioneering customers in continually testing new R&D approaches and making new discoveries as a result.