MPI I/O Analysis and Error Detection with MARMOT

  • Bettina Krammer
  • Matthias S. Müller
  • Michael M. Resch
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3241)


Abstract

The most frequently used part of MPI-2 is MPI I/O. Due to the complexity of parallel programming in general, and of handling parallel I/O in particular, there is a need for tools that support the application development process. There are many situations where incorrect usage of MPI by the application programmer can be detected automatically. In this paper we describe the MARMOT tool, which uncovers some of these errors, and we analyze to what extent this is possible for MPI I/O.
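Tools of this kind typically interpose on each MPI call and validate its arguments before forwarding it to the MPI library. As a toy illustration of that checking idea (not MARMOT's actual code, which wraps the C MPI layer; constants and names here are purely illustrative), a wrapper might flag an MPI_File_open whose access-mode flags are contradictory, e.g. combining MPI_MODE_RDONLY with MPI_MODE_WRONLY, which the MPI-2 standard forbids:

```python
# Toy sketch of interposition-style argument checking, in the spirit of
# runtime MPI checkers such as MARMOT. The flag values are illustrative,
# not the real implementation-defined MPI constants.

MPI_MODE_CREATE = 1
MPI_MODE_RDONLY = 2
MPI_MODE_WRONLY = 4
MPI_MODE_RDWR   = 8
MPI_MODE_EXCL   = 64


def check_file_open_amode(amode):
    """Return a list of error messages for an MPI_File_open access mode."""
    errors = []
    # MPI-2 requires exactly one of RDONLY, RDWR, WRONLY to be set.
    access = [f for f in (MPI_MODE_RDONLY, MPI_MODE_RDWR, MPI_MODE_WRONLY)
              if amode & f]
    if len(access) != 1:
        errors.append("exactly one of RDONLY/RDWR/WRONLY must be given")
    # RDONLY may not be combined with CREATE or EXCL.
    if amode & MPI_MODE_RDONLY and amode & (MPI_MODE_CREATE | MPI_MODE_EXCL):
        errors.append("RDONLY cannot be combined with CREATE or EXCL")
    return errors


# A legal mode passes; contradictory ones are flagged.
print(check_file_open_amode(MPI_MODE_WRONLY | MPI_MODE_CREATE))  # []
print(check_file_open_amode(MPI_MODE_RDONLY | MPI_MODE_WRONLY))
```

A real checker performs such tests inside a wrapper for every MPI routine (via the PMPI profiling interface) and reports the source location of the offending call, while still executing the original operation.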




Copyright information

© Springer-Verlag Berlin Heidelberg 2004

Authors and Affiliations

  • Bettina Krammer ¹
  • Matthias S. Müller ¹
  • Michael M. Resch ¹

  1. High Performance Computing Center Stuttgart, Stuttgart, Germany
