Initial Experiences from TUPAC Supercomputer

  • David Vinazza
  • Alejandro Otero
  • Alejandro Soba
  • Esteban Mocskos
Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 796)

Abstract

High Performance Computing centers boost the development of a wide range of disciplines in Science and Engineering. Installing a shared public facility in a country is a significant effort; such resources must be used carefully and kept as open as possible to strengthen their impact.

We describe the current status and characteristics of the TUPAC supercomputer, which is hosted by a CONICET research institute. Unlike other experiences in Argentina, TUPAC is focused on supporting external scientific and technological communities. Despite having a small operations staff, the machine provides computational resources to more than 200 external research projects.

In this work we describe the supercomputer setup and the tools and policies implemented to reach the level of efficiency needed to support this number of projects. We also characterize the jobs, users, and projects, with special emphasis on industrial applications and large computational research initiatives hosted on TUPAC.

Keywords

High performance computing facilities · Operations · Monitoring

Copyright information

© Springer International Publishing AG 2018

Authors and Affiliations

  • David Vinazza 1
  • Alejandro Otero 1,3
  • Alejandro Soba 1,4
  • Esteban Mocskos 1,2

  1. Centro de Simulación Computacional p/Aplic. Tecnológicas, CSC-CONICET, Buenos Aires, Argentina
  2. Departamento de Computación, Facultad de Ciencias Exactas y Naturales, Universidad de Buenos Aires, Buenos Aires, Argentina
  3. Facultad de Ingeniería, Universidad de Buenos Aires, Buenos Aires, Argentina
  4. Centro Atómico Constituyentes, Comisión Nacional de Energía Atómica, San Martín, Argentina