
Cluster Computing, Volume 4, Issue 3, pp 179–188

Cactus Tools for Grid Applications

  • Gabrielle Allen
  • Werner Benger
  • Thomas Dramlitsch
  • Tom Goodale
  • Hans-Christian Hege
  • Gerd Lanfermann
  • André Merzky
  • Thomas Radke
  • Edward Seidel
  • John Shalf
Article

Abstract

Cactus is an open-source problem-solving environment designed for scientists and engineers. Its modular structure facilitates parallel computation across different architectures and collaborative code development between different groups. The Cactus Code originated in the academic research community, where it has been developed and used over many years by a large international collaboration of physicists and computational scientists. We discuss here how the intensive computing requirements of the physics applications now using the Cactus Code encourage the use of distributed computing and metacomputing, and detail how its design makes it an ideal application test-bed for Grid computing. We describe the tools that have been developed and the experiments that have already been performed with Cactus in a Grid environment, including distributed simulations, remote monitoring and steering, and data handling and visualization. Finally, we discuss how Grid portals, such as those already developed for Cactus, will open the door to global computing resources for scientific users.

Keywords: Cactus, Grid computing, Grid portals
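To make the remote monitoring and steering scenario mentioned in the abstract more concrete, the sketch below shows a client polling a running simulation's built-in web interface and requesting a change to a steerable parameter while the run continues. The host, port, endpoint paths, and parameter name are hypothetical placeholders for illustration only; they are not the actual interface exposed by the Cactus HTTPD thorn, whose published URLs and parameter names would be used in a real deployment.

```python
# Minimal sketch of remote monitoring and steering over HTTP.
# Assumptions: the simulation exposes a web interface at SIM_URL with a
# status page and a parameter-update endpoint; the paths and the parameter
# name below are hypothetical, not the real Cactus HTTPD API.
import urllib.parse
import urllib.request

SIM_URL = "http://simulation-host:5555"  # hypothetical host and port


def read_status() -> str:
    """Fetch the simulation's status page (iteration count, norms, ...)."""
    with urllib.request.urlopen(f"{SIM_URL}/status") as resp:
        return resp.read().decode()


def steer_parameter(name: str, value) -> bool:
    """Request a change to a steerable parameter of the running simulation."""
    query = urllib.parse.urlencode({name: value})
    with urllib.request.urlopen(f"{SIM_URL}/parameters?{query}") as resp:
        return resp.status == 200


if __name__ == "__main__":
    print(read_status())
    # Example: reduce the output frequency of a (hypothetical) I/O parameter.
    steer_parameter("IOHDF5::out_every", 10)
```

The point of the pattern is that monitoring and steering travel over ordinary web protocols, so a scientist can inspect and adjust a remote Grid-hosted run from any machine with a browser or a simple script.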



Copyright information

© Kluwer Academic Publishers 2001

Authors and Affiliations

  • Gabrielle Allen (1)
  • Werner Benger (1, 2)
  • Thomas Dramlitsch (1)
  • Tom Goodale (1)
  • Hans-Christian Hege (2)
  • Gerd Lanfermann (1)
  • André Merzky (2)
  • Thomas Radke (1)
  • Edward Seidel (1, 3)
  • John Shalf (3, 4)

  1. Max-Planck-Institut für Gravitationsphysik, Albert-Einstein-Institut (AEI), Golm, Germany
  2. Konrad-Zuse-Zentrum für Informationstechnik (ZIB), Berlin, Germany
  3. National Center for Supercomputing Applications (NCSA), Champaign, USA
  4. Lawrence Berkeley National Laboratory (LBNL), Berkeley, USA
