Experiences and Requirements for Interoperability Between HTC and HPC-driven e-Science Infrastructure

  • Morris Riedel
  • Achim Streit
  • Daniel Mallmann
  • Felix Wolf
  • Thomas Lippert
Conference paper

Abstract

Recently, more and more e-science projects require resources in more than one production e-science infrastructure, especially when combining HTC and HPC concepts in a single scientific workflow. However, interoperability between these infrastructures is still not seamlessly provided today, and we argue that this is due to the absence of a realistically implementable reference model in Grids. The fundamental goal of this paper is therefore to identify requirements that allow for the definition of the core building blocks of an interoperability reference model that represents a trimmed-down version of OGSA in terms of functionality and is less complex, more fine-grained, and thus easier to implement. The identified requirements are underpinned by experience gained from worldwide interoperability efforts.



Copyright information

© Springer Science+Business Media, LLC 2010

Authors and Affiliations

  • Morris Riedel (1)
  • Achim Streit (1)
  • Daniel Mallmann (1)
  • Felix Wolf (1)
  • Thomas Lippert (1)

  1. Jülich Supercomputing Centre, Forschungszentrum Jülich GmbH, Jülich, Germany

Morris Riedel is Co-Chair of the Grid Interoperation Now (GIN) Group of the Open Grid Forum (OGF).
