
Challenges in Fluid Flow Simulations Using Exascale Computing

  • Original Research
  • Published in SN Computer Science

Abstract

In this paper, we briefly discuss the challenges in porting hydrodynamic codes to future exascale HPC systems. In particular, we sketch the computational complexities of the finite difference (FD) method, the pseudo-spectral method, and the fast Fourier transform (FFT). Global data communication among the compute cores brings down the efficiency of pseudo-spectral codes and of the FFT. An FD solver involves relatively less data communication. However, an incompressible FD flow solver must solve a pressure Poisson equation, whose solution with a multigrid scheme is quite expensive. Hence, a comparative study between the two sets of solvers on an exascale system would be valuable. In this paper, we report a comparative performance analysis of an FD code and a spectral code on a relatively small grid using 1024 compute cores of Shaheen II; here, the FD code yields accuracy comparable to that of the spectral code, but it is relatively slower. These features need to be retested on much larger grids with many more processors.
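The accuracy comparison summarized above can be illustrated with a toy one-dimensional example (a minimal sketch, not the paper's actual solvers; the grid size and test function are our own choices): a second-order central finite difference versus a Fourier pseudo-spectral derivative on a periodic grid.

```python
import numpy as np

# Toy 1-D comparison (hypothetical setup): differentiate f(x) = sin(x) on a
# periodic grid with a 2nd-order central finite difference and with a Fourier
# pseudo-spectral derivative, then compare errors against the exact cos(x).
N = 64                           # number of grid points (illustrative choice)
L = 2 * np.pi
dx = L / N
x = np.arange(N) * dx
f = np.sin(x)

# 2nd-order central finite difference; np.roll enforces periodicity.
df_fd = (np.roll(f, -1) - np.roll(f, 1)) / (2 * dx)

# Pseudo-spectral derivative: multiply by i*k in Fourier space.
k = 2 * np.pi * np.fft.fftfreq(N, d=dx)
df_sp = np.real(np.fft.ifft(1j * k * np.fft.fft(f)))

err_fd = np.max(np.abs(df_fd - np.cos(x)))
err_sp = np.max(np.abs(df_sp - np.cos(x)))
# For this smooth periodic function, the spectral derivative is accurate to
# machine precision, while the FD error scales as dx**2 (here ~dx**2/6).
```

The spectral derivative's near machine-precision accuracy on smooth periodic data, versus the FD method's algebraic convergence, is the trade-off the abstract weighs against the FFT's global communication cost.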


Notes

  1. This simple arrangement is called a collocated grid, in contrast to a more complex one called a staggered grid, in which the velocity fields are represented at the face centres and the pressure at the centre of the cube. In this section, we assume a collocated grid for simplicity.

  2. In some CFD literature, compute cores are referred to as processors. In this paper, we reserve the word processor for a CPU that contains many compute cores.

  3. The usual definition of efficiency, \(T_\mathrm {serial}/(p T_\mathrm {parallel})\), is not suitable for large grids because such large data cannot be accommodated within a single processor; hence, a sequential run for a large grid is impossible.
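The two grid arrangements of Note 1 differ only in where each field is stored. A minimal sketch of the resulting array shapes, assuming an \(N^3\) box of cells (variable names are our own), is:

```python
import numpy as np

N = 8  # cells per direction (illustrative choice)

# Collocated grid: velocity components and pressure all live at the same
# points, so every field has the same shape.
u_col = np.zeros((N, N, N))
p_col = np.zeros((N, N, N))

# Staggered (MAC-type) grid: each velocity component sits on the cell faces
# normal to it, giving one extra plane of points along its own direction,
# while pressure stays at the cell centres.
u_stag = np.zeros((N + 1, N, N))   # x-normal faces
v_stag = np.zeros((N, N + 1, N))   # y-normal faces
w_stag = np.zeros((N, N, N + 1))   # z-normal faces
p_stag = np.zeros((N, N, N))       # cell centres
```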
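For the reason given in Note 3, scaling studies of large grids typically measure efficiency relative to the smallest core count \(p_0\) on which the grid fits, \(E(p) = p_0\, T(p_0) / (p\, T(p))\). A sketch with made-up timings (this relative definition is a common convention, not necessarily the paper's exact one):

```python
# Hypothetical wall-clock times (seconds) for one run at each core count;
# 1024 cores is assumed to be the smallest count that fits the grid.
timings = {1024: 100.0, 2048: 55.0, 4096: 30.0}

def relative_efficiency(timings, p0):
    """Efficiency relative to the base core count p0: E(p) = p0*T(p0) / (p*T(p))."""
    base = p0 * timings[p0]
    return {p: base / (p * t) for p, t in timings.items()}

eff = relative_efficiency(timings, p0=1024)
# eff[1024] == 1.0 by construction; eff decreases as communication costs
# eat into the gains from adding cores.
```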


Acknowledgements

The authors thank all the co-developers of FFTK, TARANG, and SARAS. Some of the key contributors to the codes are Anando Chatterjee, Ravi Samtaney, Fahad Anwer, Gaurav Gautam, Abhishek Kumar, Mani Chandra, Akash Anand, and Awanish Tiwari. In addition, we thank Akash Anand, Samar Aseeri, Rooh Khurram, Bilel Hadri, V. Balaji, and Preeti Malakar for discussions and ideas, and Ritu Arora, Venkatesh Shenoy, and Amitava Majumdar for organizing the wonderful conference "Software Challenges to Exascale Computing (SCEC)".

Funding

This study was funded by research Grant 6104-1 from the Indo-French Centre (CEFIPRA) and Grant STC/PHY/2018037 from the Indian Space Research Organisation. Our numerical simulations were performed on the Blue Gene/P (Shaheen I) and Cray XC40 (Shaheen II) systems of the KAUST Supercomputing Laboratory, Saudi Arabia, under Projects k1052 and k1416.

Author information


Corresponding author

Correspondence to Mahendra K. Verma.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

This article is part of the topical collection “Software Challenges to Exascale Computing” guest edited by Amit Majumdar and Ritu Arora.


Cite this article

Verma, M.K., Samuel, R., Chatterjee, S. et al. Challenges in Fluid Flow Simulations Using Exascale Computing. SN COMPUT. SCI. 1, 178 (2020). https://doi.org/10.1007/s42979-020-00184-1
