Requirements Engineering, Volume 8, Issue 4, pp 222–235

Arcade: early dynamic property evaluation of requirements using partitioned software architecture models

  • K. Suzanne Barber
  • Tom Graser
  • Jim Holt
  • Geoff Baker
Original Article

Abstract

A fundamental goal of software engineering research is to develop evaluation techniques that enable analysis early in the software development process, when correcting errors is less costly. The Systems Engineering Process Activities (SEPA) Arcade tool employs a number of techniques to evaluate dynamic properties of requirements, including correctness, performance, and reliability. To mitigate several practical issues associated with dynamic property evaluation, Arcade leverages the SEPA 3D Architecture, a formal requirements representation that partitions requirements types among a set of interrelated architecture models. This paper presents a case study illustrating how Arcade uses the SEPA 3D Architecture to help manage the complexity associated with dynamic property evaluation, to reduce the level of evaluation technique expertise required to perform such evaluations, and to support an iterative, incremental approach that allows early evaluation using partial requirements models.
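To make the flavor of such an evaluation concrete, the following minimal sketch (plain Python, not Arcade's actual API or the SEPA 3D Architecture representation; the component names, delays, and scenario are invented for illustration) replays a message-sequence-style scenario over a toy partitioned model of two components, estimating end-to-end latency as a performance property and checking a simple request/response correctness property on the resulting trace.

    # Toy sketch only: illustrates scenario-based evaluation of dynamic
    # properties (performance and a simple correctness check) over a small,
    # partial architecture model. All names and numbers are hypothetical.

    from dataclasses import dataclass

    @dataclass
    class Component:
        name: str
        delay_ms: float  # assumed average per-message processing delay

    # Hypothetical partial model: two components of a partitioned architecture.
    components = {
        "Sensor": Component("Sensor", delay_ms=2.0),
        "Controller": Component("Controller", delay_ms=5.0),
    }

    # A scenario as ordered (sender, receiver, message) triples, in the spirit
    # of a message sequence chart.
    scenario = [
        ("Sensor", "Controller", "request"),
        ("Controller", "Sensor", "response"),
    ]

    def simulate(scenario, components):
        """Replay the scenario, accumulating latency and recording a trace."""
        clock = 0.0
        trace = []
        for sender, receiver, msg in scenario:
            clock += components[receiver].delay_ms  # receiver processes the message
            trace.append((clock, sender, receiver, msg))
        return clock, trace

    def every_request_answered(trace):
        """Toy correctness check: each 'request' is eventually followed by a 'response'."""
        pending = 0
        for _, _, _, msg in trace:
            if msg == "request":
                pending += 1
            elif msg == "response" and pending > 0:
                pending -= 1
        return pending == 0

    latency, trace = simulate(scenario, components)
    print(f"estimated end-to-end latency: {latency} ms")
    print(f"every request answered: {every_request_answered(trace)}")

In practice, evaluating correctness properties such as safety and liveness typically relies on model checking, and performance and reliability estimates on more elaborate simulation models than this trace replay; the sketch only shows how a partial, partitioned model can already support incremental checks of this kind.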

Keywords

Dynamic properties · Requirements models · Software architecture


Acknowledgements

This research was supported in part by the Schlumberger Foundation Grant and the Texas Higher Education Coordinating Board Advanced Technology Program (ATP #003658-0188-1999).


Copyright information

© Springer-Verlag London Limited 2003

Authors and Affiliations

  • K. Suzanne Barber (1)
  • Tom Graser (1)
  • Jim Holt (2)
  • Geoff Baker (2)

  1. Laboratory for Intelligent Processes and Systems, University of Texas at Austin, Austin, USA
  2. Motorola, Austin, USA
