A study on the changes of dynamic feature code when fixing bugs: towards the benefits and costs of Python dynamic features
Abstract
Dynamic features in programming languages support the modification of a program's execution status at runtime, which is often considered helpful for rapid development and prototyping. However, it has also been reported that some dynamic feature code tends to be change-prone or error-prone. We present the first study that analyzes the changes of dynamic feature code and the roles of dynamic features in bug-fix activities for the Python language. We used an AST-based differencing tool to capture fine-grained source code changes from 17,926 bug-fix commits in 17 Python projects. Using these data, we conducted an empirical study of the changes of dynamic feature code when fixing bugs in Python. First, we investigated the characteristics of dynamic feature code changes by comparing the changes of dynamic feature code and non-dynamic feature code when fixing bugs, and by comparing dynamic feature changes between bug-fix and non-bug-fix activities. Second, we examined 226 bug-fix commits to investigate the motivation for and behaviors of dynamic feature changes when fixing bugs. The results reveal that (1) changes of dynamic feature code are significantly related to bug-fix activities rather than to non-bug-fix activities; (2) compared with non-dynamic feature code, dynamic feature code is inserted or updated more frequently when fixing bugs; (3) developers often insert dynamic feature code as type checks or attribute checks to fix type errors and attribute errors; (4) the misuse of dynamic features introduces bugs in dynamic feature code, and these bugs are often fixed by adding a check or adding exception handling. These findings offer developers and researchers insight into how changes of dynamic feature code are handled when fixing bugs.
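To make findings (3) and (4) concrete, the following is a minimal, hypothetical sketch (not taken from the studied projects or the paper's data) of the fix patterns the abstract describes: a dynamic attribute access via getattr that can fail at runtime with an AttributeError or TypeError, repaired once by inserting an attribute/type check and once by adding exception handling. The Event and Dispatcher classes and the handle_* naming scheme are invented purely for illustration.

```python
# Hypothetical illustration of the bug-fix patterns described in the abstract:
# dynamic feature code (getattr-based dispatch) that fails at runtime, fixed
# either by inserting a type/attribute check or by adding exception handling.

class Event:
    def __init__(self, kind, payload):
        self.kind = kind
        self.payload = payload


class Dispatcher:
    # Buggy version: assumes a handler method exists for every event kind,
    # so an unknown kind raises AttributeError and a non-string kind raises TypeError.
    def dispatch_buggy(self, event):
        handler = getattr(self, "handle_" + event.kind)
        return handler(event.payload)

    # Fix pattern 1: insert a type check and an attribute check before the
    # dynamic access (finding 3).
    def dispatch_with_check(self, event):
        if not isinstance(event.kind, str):
            return None
        name = "handle_" + event.kind
        if not hasattr(self, name):
            return None  # fall back instead of crashing
        return getattr(self, name)(event.payload)

    # Fix pattern 2: add exception handling around the dynamic feature code
    # (finding 4).
    def dispatch_with_handler(self, event):
        try:
            return getattr(self, "handle_" + event.kind)(event.payload)
        except (AttributeError, TypeError):
            return None

    def handle_click(self, payload):
        return f"clicked: {payload}"


if __name__ == "__main__":
    d = Dispatcher()
    print(d.dispatch_with_check(Event("click", "button-1")))  # clicked: button-1
    print(d.dispatch_with_check(Event("scroll", 3)))          # None (no handler)
    print(d.dispatch_with_handler(Event("scroll", 3)))        # None (caught)
```

Both fixes keep the dynamic dispatch in place; they differ only in whether the failure is prevented by a guard or absorbed by a handler, which mirrors the two repair behaviors reported in the study.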
Keywords
Python, fine-grained code changes, change behaviors, dynamic features, bug fixing
Acknowledgements
This work was supported by National Natural Science Foundation of China (Grant Nos. 61472175, 61472178, 61403187), Natural Science Foundation of Jiangsu Province of China (Grant No. BK20140611), and National Key Basic Research and Development Program of China (Grant No. 2014CB340702).