# The Multicore Challenge: Petascale DNS of a Spatially-Developing Supersonic Turbulent Boundary Layer up to High Reynolds Numbers with Flexi

A research team from the Institute for Aerodynamics and Gas Dynamics in Stuttgart and the University of Maryland in College Park conducted a direct numerical simulation of a flat-plate boundary layer using the discontinuous Galerkin spectral element method (DGSEM) on the HLRS Cray XC40 cluster. The project aimed to demonstrate the high potential of the FLEXI framework for the investigation of wall-bounded turbulent flows.

Key Facts:

• 93,840 compute cores @ HLRS Cray XC40
• 30 TB data
• Polynomial degree N = 5
• 1.458 billion degrees of freedom per variable
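
The resolution figures above are internally consistent: with polynomial degree N = 5, each hexahedral DGSEM element carries (N+1)³ = 216 solution points per variable. A quick sanity check (the element count is inferred here, not stated in the source):

```python
# Sanity check of the DNS resolution figures (element count inferred, not stated).
N = 5                          # polynomial degree per element
dof_per_elem = (N + 1) ** 3    # (N+1)^3 solution points per hexahedral element
total_dof = 1.458e9            # degrees of freedom per variable (stated above)

n_elems = total_dof / dof_per_elem
print(f"solution points per element: {dof_per_elem}")  # 216
print(f"implied number of elements:  {n_elems:.2e}")   # 6.75e+06
```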

In contrast to today’s space transportation technologies, which need to carry tons of oxygen supplies, air-breathing supersonic and hypersonic vehicles are expected to shape the future of space flight, as they inhale atmospheric air to obtain the oxygen required for combustion. The aircraft thus becomes lighter, faster and ultimately more economical, since the payload can be drastically increased. Scramjets in particular, a special type of ramjet engine in which combustion occurs at supersonic speeds, are considered a very promising alternative to classical rocket-driven systems. The air intake of a scramjet plays a key role, since the compression of the incoming air is achieved not via moving parts such as compressors, but through a series of shock waves generated by the specific shape of the intake and the high flight velocity. Furthermore, as the supply of compressed air is of fundamental importance for the subsequent efficient combustion of the fuel-air mixture to produce thrust, the air intake also determines the operability limits of the whole system. The intake flow itself is characterized by laminar and turbulent boundary layers and their interaction with shock waves, yielding a complex, three-dimensional, unsteady flow pattern. Due to the interaction with shock waves, the boundary layer may experience intense heat loads, leading to serious damage to the aircraft. A detailed study of the turbulent boundary layer is thus crucial not only for new cooling concepts, but also to ensure the structural integrity of air-breathing propulsion systems.

The intake flow is initially dominated by laminar and turbulent boundary layers, and transition, i.e. the changeover from laminar to turbulent flow, is one of the main flow features encountered in the air intake. During the operation of a scramjet, however, the occurring shock waves traverse the spatially-developing boundary layer and exert a strong influence on the entire intake flow. Since experimental and flight data of hypersonic air-breathing vehicles are difficult and extremely expensive to obtain, numerical methods are applied to enhance our understanding of the complex physical phenomena involved. The most accurate numerical approach available is direct numerical simulation (DNS), which resolves all flow features without any modeling. Due to the high spatial resolution requirements, the DNS of turbulent flows incurs enormous computational expense. On the other hand, with the increasing computational resources available on today’s supercomputer systems, the DNS of wall-bounded turbulent flows has become more attractive, but it remains a challenging HPC problem: the applied numerical method has to enable efficient parallelization. A promising candidate that combines excellent HPC capabilities with high-order spatial accuracy and geometrical flexibility is the discontinuous Galerkin (DG) method.
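
The high-order spatial accuracy that DG methods exploit element-wise can be illustrated with a small collocation experiment: differentiating a smooth function on Legendre-Gauss nodes via a Lagrange differentiation matrix reaches near machine-precision error with only a handful of points. This is a generic illustration of spectral accuracy, not FLEXI code:

```python
import numpy as np

def diff_matrix(x):
    """Lagrange differentiation matrix on nodes x (barycentric form)."""
    n = len(x)
    # barycentric weights: w_i = 1 / prod_{j != i} (x_i - x_j)
    w = np.array([1.0 / np.prod([x[i] - x[j] for j in range(n) if j != i])
                  for i in range(n)])
    D = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i != j:
                D[i, j] = (w[j] / w[i]) / (x[i] - x[j])
        D[i, i] = -np.sum(D[i, :])  # rows of D must annihilate constants
    return D

# Legendre-Gauss nodes on [-1, 1], the per-element node set used by DGSEM
x, _ = np.polynomial.legendre.leggauss(16)
D = diff_matrix(x)

# Differentiate sin(x); with 16 nodes the error is near machine precision
err = np.max(np.abs(D @ np.sin(x) - np.cos(x)))
print(f"max derivative error with 16 nodes: {err:.1e}")
```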

Within the scope of this project, a team of researchers from the University of Stuttgart and the University of Maryland conducted a DNS of a spatially-developing supersonic turbulent boundary layer up to $Re_\theta=3878$ in order to generate a reliable database for further complex studies and to shed light on the occurring flow phenomena (e.g. the impact of shock wave/boundary layer interactions at different impingement locations). For this purpose, they applied the FLEXI framework, which is specifically tailored to HPC applications. With 1.458 billion degrees of freedom per variable, the present DNS is the largest computation within the DG community to date, which makes it necessary to handle the demanding computational costs responsibly. Figure 1 shows an instantaneous $\lambda_2$-visualization of the turbulent structures. Here, we see the footprints of the disturbances introduced at the inflow, which break down into turbulence further downstream. The DGSEM approach allowed the researchers to efficiently exploit the full computational power available on the HLRS Cray XC40 supercomputer and to run the simulation on 93,840 processors without any performance losses. The obtained results demonstrate the strong potential of the DGSEM for conducting sustainable DNS of high Reynolds number wall-bounded turbulent flows.
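
The $\lambda_2$ criterion used for such visualizations identifies vortex cores as regions where the intermediate eigenvalue of $S^2 + \Omega^2$ is negative, with $S$ and $\Omega$ the symmetric and antisymmetric parts of the velocity gradient tensor. A minimal sketch on synthetic gradient tensors (illustrative only, not data from this DNS):

```python
import numpy as np

def lambda2(grad_u):
    """Intermediate eigenvalue of S^2 + Omega^2 for a 3x3 velocity gradient."""
    S = 0.5 * (grad_u + grad_u.T)   # strain-rate tensor (symmetric part)
    O = 0.5 * (grad_u - grad_u.T)   # rotation tensor (antisymmetric part)
    A = S @ S + O @ O               # symmetric matrix -> real eigenvalues
    return np.sort(np.linalg.eigvalsh(A))[1]  # middle eigenvalue

# Solid-body swirl in the x-y plane: a clear vortex core
grad_vortex = np.array([[0.0, -1.0, 0.0],
                        [1.0,  0.0, 0.0],
                        [0.0,  0.0, 0.0]])
# Pure shear: no closed swirling motion
grad_shear = np.array([[0.0, 1.0, 0.0],
                       [0.0, 0.0, 0.0],
                       [0.0, 0.0, 0.0]])

print(lambda2(grad_vortex) < 0)  # True: point lies inside a vortex
print(lambda2(grad_shear) < 0)   # False: shear alone is not flagged
```

For the solid-body swirl, $S = 0$ and $S^2 + \Omega^2$ has eigenvalues $(-1, -1, 0)$, so $\lambda_2 = -1 < 0$; for pure shear the strain and rotation contributions cancel exactly and $\lambda_2 = 0$, which is why the criterion discriminates vortices from shear layers.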

Contact:

Muhammed Atak, atak@iag.uni-stuttgart.de