Testing Idempotence for Infrastructure as Code
Abstract
Due to the competitiveness of the computing industry, software developers are pressured to quickly deliver new code releases. At the same time, operators are expected to keep production systems updated and stable at all times. To overcome the development–operations barrier, organizations have started to adopt Infrastructure as Code (IaC) tools to efficiently deploy middleware and applications using automation scripts. These automations comprise a series of steps that should be idempotent to guarantee repeatability and convergence. Rigorous testing is required to ensure that the system idempotently converges to a desired state, starting from arbitrary states. We propose and evaluate a model-based testing framework for IaC. An abstracted system model is utilized to derive state transition graphs, based on which we systematically generate test cases for the automation. The test cases are executed in lightweight virtual machine environments. Our prototype targets one popular IaC tool (Chef), but the approach is general. We apply our framework to a large base of public IaC scripts written by operators, showing that it correctly detects non-idempotent automations.
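The notion of idempotence being tested can be illustrated with a small sketch. The following Python snippet is hypothetical (not the paper's Chef-based prototype; all names are illustrative): it checks whether a task converges by executing it twice from each of a set of initial states and comparing the resulting states, mirroring the idea of testing convergence starting from arbitrary states.

```python
# A minimal sketch (not the paper's actual prototype) of the core check:
# a task is idempotent if re-running it from any considered initial state
# produces no further state changes. All names here are illustrative.

from typing import Callable, Dict, List

State = Dict[str, str]  # abstracted system state: property -> value


def is_idempotent(apply: Callable[[State], State],
                  initial_states: List[State]) -> bool:
    """Check f(f(s)) == f(s) for every considered initial state s."""
    for start in initial_states:
        once = apply(dict(start))   # state after the first run
        twice = apply(dict(once))   # state after a repeated run
        if twice != once:           # a second run changed the state,
            return False            # so the task is not idempotent
    return True


# Example tasks: installing a package converges; blindly appending a
# line to a config file does not.
install = lambda s: {**s, "pkg": "installed"}
append = lambda s: {**s, "conf": s.get("conf", "") + "line\n"}

states: List[State] = [{}, {"pkg": "installed"}]
print(is_idempotent(install, states))  # True
print(is_idempotent(append, states))   # False
```

The actual framework generalizes this check from single tasks to task sequences derived from a state transition graph, and executes them in isolated virtual machine environments rather than on in-memory dictionaries.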
Keywords
Middleware · Deployment · Software Automation · Idempotence · Convergence · Infrastructure as Code · Software Testing