Empirical Software Engineering

Volume 10, Issue 3, pp 311–341

Studying Software Engineers: Data Collection Techniques for Software Field Studies

  • Timothy C. Lethbridge
  • Susan Elliott Sim
  • Janice Singer

Abstract

Software engineering is an intensively people-oriented activity, yet little is known about how designers, maintainers, requirements analysts, and other types of software engineers actually perform their work. To improve software engineering tools and practice, it is therefore essential to conduct field studies, i.e., to study real practitioners as they solve real problems. To do so effectively, however, requires an understanding of the techniques best suited to each type of field study task. In this paper, we provide a taxonomy of techniques, focusing on those for data collection. The taxonomy is organized according to the degree of human intervention each technique requires. For each technique, we provide examples from the literature, an analysis of some of its advantages and disadvantages, and a discussion of how to use it effectively. We also briefly discuss field study design in general and data analysis.
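The taxonomy ranges from techniques requiring direct contact with practitioners (such as interviews or think-aloud observation) to techniques that need little or no human intervention once deployed, such as instrumenting the tools engineers already use. As a purely illustrative sketch, not taken from the paper, the snippet below shows the kind of lightweight instrumentation a low-intervention technique might rely on; the file name, event fields, and participant identifiers are all hypothetical.

```python
# Hypothetical sketch: automated logging of tool usage by consenting participants,
# an example of a data collection technique needing little human intervention.
import json
import time
from pathlib import Path

LOG_FILE = Path("usage_log.jsonl")  # hypothetical log location


def log_event(participant: str, command: str, context: str = "") -> None:
    """Append one timestamped tool-usage event to a JSON-lines log for later analysis."""
    event = {
        "timestamp": time.time(),
        "participant": participant,
        "command": command,
        "context": context,
    }
    with LOG_FILE.open("a", encoding="utf-8") as f:
        f.write(json.dumps(event) + "\n")


if __name__ == "__main__":
    # Example: record that a participant invoked a search command in their editor.
    log_event("participant-07", "search", context="looking up a function definition")
```

Once such logging is in place, the researcher's effort shifts from observing engineers in real time to analyzing the collected records afterwards, which is precisely the trade-off the taxonomy captures.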

Keywords

Field studies, work practices, empirical software engineering

Copyright information

© Springer Science + Business Media, Inc. 2005

Authors and Affiliations

  • Timothy C. Lethbridge (1)
  • Susan Elliott Sim (2)
  • Janice Singer (3)
  1. School of Information Technology and Engineering, University of Ottawa, Ottawa, Canada
  2. Department of Informatics, University of California, Irvine, Irvine, USA
  3. National Research Council Canada, Institute for Information Technology, Ottawa, Canada
