
Journal of Grid Computing, Volume 16, Issue 4, pp 683–696

umd-verification: Automation of Software Validation for the EGI Federated e-Infrastructure

  • Pablo Orviz Fernández
  • João Pina
  • Álvaro López García
  • Isabel Campos Plasencia
  • Mário David
  • Jorge Gomes

Abstract

Supporting e-Science in the EGI e-Infrastructure requires extensive and reliable software for advanced computing, deployed across approximately 300 data centers in Europe and worldwide. The Unified Middleware Distribution (UMD) and the Cloud Middleware Distribution (CMD) are the channels through which this software is delivered for consumption by the EGI e-Infrastructure. The software is compiled, validated and distributed following the Software Provisioning Process (SWPP), in which the Quality Criteria (QC) definition sets the minimum quality requirements for EGI acceptance. The growing number of software components in the UMD and CMD distributions hinders the traditional, manual validation mechanisms, driving the adoption of automated solutions. This paper presents umd-verification, an open-source tool that automatically enforces fulfillment of the QC requirements for the continuous validation of software products intended for the scientific community. The umd-verification tool has been successfully integrated into the SWPP pipeline and is progressively taking over the full validation of the products in the UMD and CMD repositories. While the cost of supporting new products depends on the availability of Infrastructure as Code solutions to handle the deployment, as well as on high test coverage, the results obtained for the products already integrated are promising: the time invested in product validation has been drastically reduced. Furthermore, the adoption of automation has improved the reliability of the process, for instance by removing human error and the risk of regressions in previously tested functionality.
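To make the workflow concrete, the sketch below shows, in schematic Python, the shape of such an automated validation run: deploy the product under test with an Infrastructure as Code tool, execute the QC checks, and aggregate a verdict. This is a hypothetical illustration and not the umd-verification API; the playbook name, port and checks are assumptions made for the example.

```python
#!/usr/bin/env python3
"""Minimal, hypothetical sketch of an automated QC validation run.

This is NOT the umd-verification API. It only illustrates the workflow the
abstract describes: deploy the product under test with an Infrastructure as
Code tool, run the Quality Criteria (QC) checks, and aggregate a verdict.
The playbook name, port and checks below are illustrative assumptions.
"""

import socket
import subprocess
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class QCResult:
    check: str
    passed: bool
    detail: str = ""


def deploy_with_ansible(playbook: str) -> bool:
    """Deployment step: hand the installation over to an Ansible playbook."""
    try:
        proc = subprocess.run(["ansible-playbook", playbook],
                              capture_output=True, text=True)
    except FileNotFoundError:
        print("ansible-playbook not found on this host.")
        return False
    return proc.returncode == 0


def check_service_listening() -> QCResult:
    """Hypothetical QC check: the deployed service accepts connections."""
    try:
        socket.create_connection(("localhost", 8443), timeout=5).close()
        return QCResult("service_listening", True)
    except OSError as exc:
        return QCResult("service_listening", False, str(exc))


def run_validation(playbook: str,
                   checks: List[Callable[[], QCResult]]) -> bool:
    """Deploy, execute every QC check and report an overall verdict."""
    if not deploy_with_ansible(playbook):
        print("Deployment failed; validation aborted.")
        return False
    results = [check() for check in checks]
    for r in results:
        print(f"{'PASS' if r.passed else 'FAIL'}  {r.check}  {r.detail}")
    return all(r.passed for r in results)


if __name__ == "__main__":
    ok = run_validation("deploy.yml", [check_service_listening])
    raise SystemExit(0 if ok else 1)
```

In the real tool the deployment recipes and the checks are pluggable per product, which is why the abstract ties the cost of onboarding a new product to the availability of Infrastructure as Code solutions and to test coverage.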

Keywords

Automation · Software verification and validation · Software quality assurance · Software quality control · Software testing · Continuous integration



Acknowledgements

This work has been partially funded by the EGI-Engage project (Engaging the Research Community towards an Open Science Commons) under grant agreement No. 654182. The authors are especially grateful to their colleagues at EGI.eu: Enol Fernández, for his contributions to the umd-verification codebase, and Vincenzo Spinoso, for his support in integrating the tool within the EGI Software Provisioning Process.


Copyright information

© Springer Nature B.V. 2018

Authors and Affiliations

  1. Instituto de Física de Cantabria, Centro Mixto CSIC - UC, Santander, Spain
  2. Laboratório de Instrumentação e Física Experimental de Partículas (LIP), Lisboa, Portugal
