Improving manual reviews in function-centered engineering of embedded systems using a dedicated review model

  • Marian Daun
  • Thorsten Weyer
  • Klaus Pohl
Regular Paper

Abstract

In model-based engineering of embedded systems, manual validation activities such as reviews and inspections are needed to ensure that the system under development satisfies the stakeholder intentions. During the engineering process, changes in stakeholder intentions typically trigger revisions of already developed and documented engineering artifacts, including requirements and design specifications. In practice, changes in stakeholder intentions are often not immediately perceived, not properly documented, and not consistently incorporated into all relevant engineering artifacts. In industry, manual reviews are typically conducted to ensure that the relevant stakeholder intentions are adequately reflected in the engineering artifacts. In this article, we introduce a dedicated review model to aid reviewers in conducting manual reviews of behavioral requirements and functional design specifications, two core artifacts in function-centered engineering of embedded software. To investigate whether the proposed solution is beneficial, we conducted controlled experiments, which show that the use of the dedicated review model can significantly increase the effectiveness and efficiency of manual reviews. In addition, the dedicated review model leads to significantly more confident reviewer decisions and is perceived by reviewers as significantly more supportive than reviews conducted without it.

Keywords

Behavioral requirements · Embedded software · Functional design · Review model · Requirements engineering · Perspective-based review · Model transformations

Acknowledgements

Funding was partially provided by the German Federal Ministry of Education and Research (BMBF) under grant numbers 01IS12005C (SPES_XT) and 01IS16043V (CrESt).


Copyright information

© Springer-Verlag GmbH Germany, part of Springer Nature 2019

Authors and Affiliations

  1. paluno – The Ruhr Institute for Software Technology, University of Duisburg-Essen, Essen, Germany