Evaluation

Chapter in: Action Research in Software Engineering

Abstract

Once we have planned and taken action, we need to understand its impact on the organization. Since we are part of the action, and our actions cause the effects we observe, we need objective data to analyze that impact. In this chapter, we describe a selection of data analysis techniques that are often used in action research studies in software engineering. We cover data visualization methods, statistics, and machine learning to show how to assess the impact of our actions. We also discuss qualitative data analysis methods that can help in analyzing data collected in research logs or through interviews and workshops.

If your experiment needs a statistician, you need a better experiment.

—Ernest Rutherford
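The abstract mentions using statistics to assess the impact of an action on before/after measurements. As a minimal illustrative sketch only (not taken from the chapter; the weekly defect counts and the choice of a permutation test are hypothetical assumptions), such an assessment could look like this in Python:

```python
import random
import statistics

# Hypothetical weekly defect counts before and after an improvement action.
before = [12, 15, 11, 14, 13, 16, 12, 15]
after = [9, 11, 8, 10, 12, 9, 10, 8]

# Observed effect: drop in the mean defect count after the action.
observed = statistics.mean(before) - statistics.mean(after)

# Permutation test: pool all observations, reshuffle them many times,
# and count how often a difference at least as large arises by chance.
random.seed(42)
pooled = before + after
trials = 10_000
count = 0
for _ in range(trials):
    random.shuffle(pooled)
    diff = statistics.mean(pooled[: len(before)]) - statistics.mean(pooled[len(before):])
    if diff >= observed:
        count += 1
p_value = count / trials

print(f"mean difference: {observed:.2f}, p ~ {p_value:.4f}")
```

A permutation test needs no distributional assumptions, which suits the small, non-normal samples typical of action research cycles; a small p-value suggests the observed improvement is unlikely to be a chance fluctuation.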




Copyright information

© 2020 Springer Nature Switzerland AG

About this chapter


Cite this chapter

Staron, M. (2020). Evaluation. In: Action Research in Software Engineering. Springer, Cham. https://doi.org/10.1007/978-3-030-32610-4_6

  • DOI: https://doi.org/10.1007/978-3-030-32610-4_6

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-32609-8

  • Online ISBN: 978-3-030-32610-4

  • eBook Packages: Computer Science, Computer Science (R0)
