How Cost Reduction in Recovery Improves Performance in Program Design Tasks

  • Bastian Steinert
  • Robert Hirschfeld
Part of the Understanding Innovation book series (UNDINNO)


Changing source code often has undesired implications, raising the need for recovery actions. Programmers must keep recovery costs low manually, by working in a structured and disciplined manner and by regularly performing practices such as testing and versioning. Additional tool support can alleviate this constant need, but the question is whether it affects programming performance. In a controlled lab study, 22 participants improved the design of two different applications. Using a repeated measurement setup, we compared the effect of two tool settings on programming performance: a traditional setting and a setting with our recovery tool CoExist. CoExist makes it possible to easily revert to previous development states, even if they were not committed explicitly. It also allows forgoing test runs while still making it possible to understand the impact of each change later. The results suggest that additional recovery support, such as that provided by CoExist, positively affects programming performance in explorative programming tasks.
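The recovery model described above — every change is preserved implicitly, so any earlier development state can be restored without an explicit commit — can be sketched as follows. The class and method names here are illustrative only; they are not CoExist's actual API.

```python
# Minimal sketch of implicit continuous versioning: each edit creates a
# snapshot automatically, so no explicit commit is needed to recover an
# earlier state. Names (ChangeHistory, apply_change, revert_to) are
# hypothetical, chosen for illustration.

class ChangeHistory:
    def __init__(self, initial_source):
        self._snapshots = [initial_source]  # state 0: the starting point

    def apply_change(self, new_source):
        """Record every edit implicitly; no explicit commit required."""
        self._snapshots.append(new_source)
        return len(self._snapshots) - 1     # identifier of the new state

    def revert_to(self, state_id):
        """Recover any previous development state by its identifier."""
        return self._snapshots[state_id]


history = ChangeHistory("def area(w, h): return w * h")
v1 = history.apply_change("def area(w, h):\n    return w * h")
v2 = history.apply_change("def area(rect):\n    return rect.w * rect.h")
# A risky refactoring turned out badly? Revert without having committed:
restored = history.revert_to(v1)
```

In this model, reverting is cheap because every intermediate state is retained, which is the cost reduction the study attributes to CoExist.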


Source Code · Recovery Cost · Code Base · Independent Increment · Social Threat
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.



Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  1. Software Architecture Group, Hasso Plattner Institute, University of Potsdam, Potsdam, Germany
