Journal of Grid Computing, Volume 14, Issue 4, pp 559–573

Development of Science Gateways Using QCG — Lessons Learned from the Deployment on Large Scale Distributed and HPC Infrastructures

  • T. Piontek
  • B. Bosak
  • M. Ciżnicki
  • P. Grabowski
  • P. Kopta
  • M. Kulczewski
  • D. Szejnfeld
  • K. Kurowski

Abstract

Today, various Science Gateways created in close collaboration with scientific communities provide access to remote and distributed HPC, Grid and Cloud computing resources and large-scale storage facilities. However, as we have observed, there are still many entry barriers for new users and various limitations for active scientists. In this paper we present our latest achievements and software solutions that significantly simplify the use of large-scale and distributed computing. We describe several Science Gateways that have been successfully created with the help of our application tools and the QCG (Quality in Cloud and Grid) middleware, in particular the Vine Toolkit, QCG-Portal and QCG-Now, which make the use of HPC, Grid and Cloud resources more straightforward and transparent. Additionally, we share the best practices and lessons learned after creating, jointly with user communities, many domain-specific Science Gateways, e.g. gateways dedicated to physicists, medical scientists, chemists, engineers and external communities performing multi-scale simulations. As our deployed software solutions have recently reached a critical mass of active users in the PLGrid e-infrastructure in Poland, we also discuss how changing technologies, visual design and user experience could influence the way we should re-design Science Gateways or even develop new, attractive tools, e.g. desktop or mobile applications, in the future. Finally, we present information and statistics regarding user behaviour to help readers understand how new capabilities and functionalities may influence the growth of user interest in Science Gateways and HPC technologies.
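The abstract refers to tools such as QCG-Now and QCG-Portal that hide the details of job submission on HPC and Grid resources. The snippet below is a minimal, purely illustrative sketch of the kind of workflow such tools automate: it writes a QCG-style job description and hands it to a command-line submission client. The directive names (#QCG walltime, #QCG output, ...) and the client command name qcg-sub are assumptions based on common QCG conventions and are not taken from the paper.

```python
import subprocess
import tempfile
import textwrap

# A minimal, QCG-Simple-style job description. The directive names below are
# illustrative assumptions, not an authoritative list of QCG keywords.
job_description = textwrap.dedent("""\
    #QCG note=example-run
    #QCG walltime=PT10M
    #QCG output=result.out
    #QCG error=result.err

    ./my_simulation --steps 1000
    """)


def submit(description: str) -> str:
    """Write the description to a temporary file and pass it to the
    (assumed) command-line submission client, returning its stdout."""
    with tempfile.NamedTemporaryFile("w", suffix=".qcg", delete=False) as f:
        f.write(description)
        path = f.name
    # 'qcg-sub' is assumed here; the actual client name and options may
    # differ between deployments of the QCG middleware.
    result = subprocess.run(["qcg-sub", path],
                            capture_output=True, text=True, check=True)
    return result.stdout


if __name__ == "__main__":
    print(submit(job_description))
```

Graphical tools such as QCG-Now essentially wrap this submit-and-monitor loop behind a desktop interface, so end users never edit job descriptions or invoke clients by hand.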

Keywords

Science gateway · GUIs · High performance computing · Grid · Cloud



Copyright information

© Springer Science+Business Media Dordrecht 2016

Authors and Affiliations

  1. Poznan Supercomputing and Networking Center, Poznan, Poland
  2. Poznan University of Technology, Poznan, Poland
