
HPC on the Grid: The Theophys Experience

Journal of Grid Computing 11, 265–280 (2013)

Abstract

The Grid Virtual Organization (VO) “Theophys”, associated with the INFN (Istituto Nazionale di Fisica Nucleare), is a theoretical physics community with varied computational demands, ranging from serial to SMP, MPI and hybrid jobs. Over the past 20 years this has led to the use of the Grid infrastructure for serial jobs, while multi-threaded, MPI and hybrid jobs have been executed on several small and medium-size clusters installed at different sites, accessed through standard local submission methods. This work analyzes the support for parallel jobs in scientific Grid middlewares, and then describes how the community unified the management of most of its computational needs (serial and parallel) on the Grid, through the development of a specific project that integrates serial and parallel resources in a common Grid-based framework. A centralized national cluster is deployed inside this framework, providing “Wholenodes” reservations, CPU affinity, and other new features supporting our High Performance Computing (HPC) applications in the Grid environment. Examples of the cluster’s performance for relevant parallel applications in theoretical physics are reported, focusing on the different kinds of parallel jobs that can be served by the new features introduced in the Grid.
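
The whole-node reservation mentioned above is the kind of request a user expresses in the job description submitted to the Grid. As a purely illustrative sketch (not taken from the paper), a gLite/CREAM JDL description for a whole-node MPI job could look like the following; the executable, wrapper script, node and core counts are hypothetical placeholder values:

    // Hypothetical JDL sketch for a whole-node MPI job; all values are illustrative
    Type            = "Job";
    JobType         = "Normal";
    Executable      = "mpi-start-wrapper.sh";   // assumed wrapper script, not from the paper
    Arguments       = "my_mpi_app OPENMPI";     // hypothetical application and MPI flavour
    WholeNodes      = True;                     // reserve entire worker nodes for the job
    HostNumber      = 4;                        // illustrative: request 4 nodes
    SMPGranularity  = 8;                        // illustrative: 8 cores per node
    InputSandbox    = {"mpi-start-wrapper.sh", "my_mpi_app"};
    OutputSandbox   = {"std.out", "std.err"};
    StdOutput       = "std.out";
    StdError        = "std.err";

Attributes such as WholeNodes, HostNumber and SMPGranularity are the resource-granularity hints of the kind discussed in the paper; the exact set supported depends on the middleware version deployed at a site.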

Author information


Corresponding author

Correspondence to Roberto Alfieri.


About this article

Cite this article

Alfieri, R., Arezzini, S., Ciampa, A. et al. HPC on the Grid: The Theophys Experience. J Grid Computing 11, 265–280 (2013). https://doi.org/10.1007/s10723-012-9223-6
