AC 2017: Augmented Cognition. Enhancing Cognition and Behavior in Complex Human Environments, pp. 56–64
Neurophysiological Impact of Software Design Processes on Software Developers
Abstract
Software development often leads to failed implementations, a result of several factors related to how individual developers react to software design. Some design metrics give software developers guidelines and heuristics for use in software design, and many further metrics have been created to measure outcomes in terms of "code quality." However, these guidelines and metrics have been shown to be only weakly related to outcomes and are poorly implemented in practice. This study takes a new approach, using tools from cognitive neuroscience to examine the cognitive load and arousal placed on software engineers while they work with different software designs. Specifically, we use electroencephalography (EEG) and skin conductance responses (SCR) to examine cognitive and emotional reactions to software structure, and we propose to examine whether modular design affects levels of cognitive load and arousal. Our findings open the door for future research that combines software engineering and cognitive neuroscience. The potential implications of this study extend beyond optimal ways to structure software: they point the software engineering field toward studying individual cognition and arousal as central components of successful software development, opening a wide array of potential studies in the field.
Keywords
Cognitive load and performance · Emotion · Electroencephalography · Augmented cognition · Arousal · Software engineering