Testing Idempotence for Infrastructure as Code

  • Waldemar Hummer
  • Florian Rosenberg
  • Fábio Oliveira
  • Tamar Eilam
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8275)

Abstract

Due to the competitiveness of the computing industry, software developers are pressured to quickly deliver new code releases. At the same time, operators are expected to update production systems while keeping them stable at all times. To overcome the development–operations barrier, organizations have started to adopt Infrastructure as Code (IaC) tools to efficiently deploy middleware and applications using automation scripts. These automations comprise a series of steps that should be idempotent to guarantee repeatability and convergence. Rigorous testing is required to ensure that the system idempotently converges to a desired state, starting from arbitrary states. We propose and evaluate a model-based testing framework for IaC. An abstracted system model is utilized to derive state transition graphs, based on which we systematically generate test cases for the automation. The test cases are executed in lightweight virtual machine environments. Our prototype targets one popular IaC tool (Chef), but the approach is general. We apply our framework to a large base of public IaC scripts written by operators, showing that it correctly detects non-idempotent automations.
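To illustrate the idempotence property the paper tests for (this is only a minimal sketch, not the authors' framework), the following hypothetical Python harness runs an automation task twice and checks that the second run leaves the observed system state unchanged. The helpers run_task, snapshot_state, and is_idempotent, as well as the watched file /tmp/app.conf, are assumptions introduced here for illustration.

    import subprocess
    from pathlib import Path

    def snapshot_state(paths):
        # Capture a coarse view of system state: the contents of the watched files.
        # (Hypothetical helper; a real harness would also track packages, services, etc.)
        return {p: Path(p).read_text() if Path(p).exists() else None for p in paths}

    def run_task(command):
        # Execute one automation step (e.g. a single script or recipe run).
        return subprocess.run(command, shell=True, capture_output=True).returncode

    def is_idempotent(command, watched_paths):
        # A task is observably idempotent if re-running it after it has already
        # converged produces no further change in the captured state.
        run_task(command)                        # first run: reach the target state
        before = snapshot_state(watched_paths)
        run_task(command)                        # second run: should be a no-op
        after = snapshot_state(watched_paths)
        return before == after

    # Appending to a config file is not idempotent; overwriting it is.
    print(is_idempotent("echo 'opt=1' >> /tmp/app.conf", ["/tmp/app.conf"]))   # False
    print(is_idempotent("echo 'opt=1' > /tmp/app.conf", ["/tmp/app.conf"]))    # True

In the paper's framework, such checks are not performed ad hoc as above: test cases are derived systematically from state transition graphs of an abstracted system model, start from arbitrary initial states, and are executed in lightweight virtual machine environments.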

Keywords

Middleware Deployment, Software Automation, Idempotence, Convergence, Infrastructure as Code, Software Testing



Copyright information

© IFIP International Federation for Information Processing 2013

Authors and Affiliations

  • Waldemar Hummer (1)
  • Florian Rosenberg (2)
  • Fábio Oliveira (2)
  • Tamar Eilam (2)
  1. Distributed Systems Group, Vienna University of Technology, Austria
  2. IBM T.J. Watson Research Center, Yorktown Heights, USA
