Data-Driven Requirements Engineering: A Guided Tour

Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 1375)

Abstract

Data-driven approaches are becoming dominant in almost every software engineering activity, and requirements engineering is no exception. The analysis of data coming from several sources can indeed become an extremely useful input to requirements elicitation and management. However, these benefits do not come for free. Techniques such as natural language processing and machine learning are difficult to master and require high-quality data and competences from several different fields, while their generalization remains a challenge. This paper introduces the main concepts behind data-driven requirements engineering, provides an overview of the state of the art in the field, and identifies the main challenges to be addressed.

Keywords

Requirements engineering · Data-driven requirements engineering · Feedback · Natural language processing · Software analytics · Monitoring · Decision-making · Release planning

Acknowledgment

This work is partially supported by the GENESIS project, funded by the Spanish Ministerio de Ciencia e Innovación under contract TIN2016-79269-R. The author wants to deeply thank Fabiano Dalpiaz, Silverio Martínez-Fernández and Marc Oriol for their comments and suggestions on a first draft of the paper.

Copyright information

© Springer Nature Switzerland AG 2021

Authors and Affiliations

  1. UPC-BarcelonaTech, Universitat Politècnica de Catalunya, Barcelona, Spain