Early Experience with Scientific Applications on the Blue Gene/L Supercomputer

  • George Almasi
  • Gyan Bhanot
  • Dong Chen
  • Maria Eleftheriou
  • Blake Fitch
  • Alan Gara
  • Robert Germain
  • John Gunnels
  • Manish Gupta
  • Philipp Heidelberger
  • Mike Pitman
  • Aleksandr Rayshubskiy
  • James Sexton
  • Frank Suits
  • Pavlos Vranas
  • Bob Walkup
  • Chris Ward
  • Yuriy Zhestkov
  • Alessandro Curioni
  • Wanda Andreoni
  • Charles Archer
  • José Moreira
  • Richard Loft
  • Henry Tufo
  • Theron Voran
  • Katherine Riley
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3648)

Abstract

Blue Gene/L uses a large number of low-power processors, together with multiple integrated interconnection networks, to build a supercomputer with low cost, space, and power consumption. It uses a novel system software architecture designed with application scalability in mind. However, whether real applications will scale to tens of thousands of processors has been an open question. In this paper, we describe early experience with several applications on a 16,384-node Blue Gene/L system. This study establishes that applications from a broad variety of scientific disciplines can effectively scale to thousands of processors. The results reported in this study represent the highest performance ever demonstrated for most of these applications and, in fact, show effective scaling on thousands of processors for the first time.


References

  1. Habata, S., Yokokawa, M., Kitawaki, S.: The Earth Simulator. NEC Research and Development 44(1) (January 2003)
  2. ASC Purple: Fifth Generation ASC Platform, http://www.llnl.gov/asci/platforms/purple/
  3.
  4. Adiga, N.R., et al.: An overview of the BlueGene/L supercomputer. In: SC 2002 – High Performance Networking and Computing, Baltimore, MD (November 2002)
  5. Almasi, G., Bellofatto, R., Brunheroto, J., Cascaval, C., Castaños, J., Ceze, L., Crumley, P., Erway, C., Gagliano, J., Lieber, D., Martorell, X., Moreira, J., Sanomiya, A., Strauss, K.: An Overview of the BlueGene/L System Software Organization (Distinguished Paper). In: Kosch, H., Böszörményi, L., Hellwagner, H. (eds.) Euro-Par 2003. LNCS, vol. 2790, pp. 543–555. Springer, Heidelberg (2003)
  6. Almasi, G., Chatterjee, S., Gara, A., Gunnels, J., Gupta, M., Henning, A., Moreira, J., Walkup, B., Curioni, A., Archer, C., Bachega, L., Chan, B., Curtis, B., Brunett, S., Chukkapalli, G., Harkness, R., Pfeiffer, W.: Unlocking the Performance of the BlueGene/L Supercomputer. In: SC 2004: High Performance Computing, Networking and Storage Conference, Pittsburgh, PA (November 2004)
  7. Petrini, F., Kerbyson, D., Pakin, S.: The Case of the Missing Supercomputer Performance: Achieving Optimal Performance on the 8,192 Processors of ASCI Q. In: IEEE/ACM SC 2003, Phoenix, AZ (November 2003)
  8. Almasi, G., et al.: Scaling physics and material science applications on a massively parallel Blue Gene/L system. In: International Conference on Supercomputing, Cambridge, MA (June 2005)
  9. TOP500 Supercomputer Sites, http://www.top500.org
  10. Bachega, L., Chatterjee, S., Dockser, K., Gunnels, J., Gupta, M., Gustavson, F., Lapkowski, C., Liu, G., Mendell, M., Wait, C., Ward, T.J.C.: A High-Performance SIMD Floating Point Unit Design for BlueGene/L: Architecture, Compilation, and Algorithm Design. In: Parallel Architecture and Compilation Techniques (PACT 2004), Antibes Juan-les-Pins, France (September–October 2004)
  11. Davis, K., Hoisie, A., Johnson, G., Kerbyson, D., Lang, M., Pakin, S., Petrini, F.: A Performance and Scalability Analysis of the BlueGene/L Architecture. In: IEEE/ACM SC 2004, Pittsburgh, PA (November 2004)
  12. Fitch, B.G., Germain, R.S., Mendell, M., Pitera, J., Pitman, M., Rayshubskiy, A., Sham, Y., Suits, F., Swope, W., Ward, T.J.C., Zhestkov, Y., Zhou, R.: Blue Matter, an application framework for molecular simulation on Blue Gene. Journal of Parallel and Distributed Computing, 759–773 (2003)
  13. Germain, R.S., et al.: Early performance data on the Blue Matter molecular simulation framework. IBM Journal of Research and Development 49(2/3), 447–456 (2005)
  14. Phillips, J.C., Zheng, G., Kumar, S., Kale, L.V.: NAMD: biomolecular simulation on thousands of processors. In: Supercomputing 2002 Proceedings (2002)
  15. CPMD home page, http://www.cpmd.org
  16. Hutter, J., Curioni, A.: Dual-level parallelism for ab initio molecular dynamics: Reaching teraflop performance with the CPMD code. Parallel Computing 31, 1–17 (2005)
  17. Hutter, J., Curioni, A.: Car-Parrinello Molecular Dynamics on Massively Parallel Computers. ChemPhysChem (2005) (in press)
  18.
  19.

Copyright information

© Springer-Verlag Berlin Heidelberg 2005

Authors and Affiliations

  • George Almasi (1)
  • Gyan Bhanot (1)
  • Dong Chen (1)
  • Maria Eleftheriou (1)
  • Blake Fitch (1)
  • Alan Gara (1)
  • Robert Germain (1)
  • John Gunnels (1)
  • Manish Gupta (1)
  • Philipp Heidelberger (1)
  • Mike Pitman (1)
  • Aleksandr Rayshubskiy (1)
  • James Sexton (1)
  • Frank Suits (1)
  • Pavlos Vranas (1)
  • Bob Walkup (1)
  • Chris Ward (1)
  • Yuriy Zhestkov (1)
  • Alessandro Curioni (2)
  • Wanda Andreoni (2)
  • Charles Archer (3)
  • José Moreira (3)
  • Richard Loft (4)
  • Henry Tufo (4, 5)
  • Theron Voran (5)
  • Katherine Riley (6)
  1. IBM T.J. Watson Research Center, Yorktown Heights, USA
  2. IBM Zurich Research Laboratory, Rüschlikon, Switzerland
  3. IBM Systems and Technology Group, Rochester, USA
  4. National Center for Atmospheric Research, Boulder, USA
  5. University of Colorado at Boulder, Boulder, USA
  6. Argonne National Laboratory, Argonne, USA