Embedded Systems Architecture: Evaluation and Analysis

  • Bastian Florentz
  • Michaela Huhn
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4214)


Short innovation cycles in software and hardware make architecture design a key issue in future development processes for embedded systems. Transparent architecture evaluation forms the basis for sound architectural design decisions.

Our model-based approach supports a uniform representation of hierarchies of quality attributes and the integration of different architecture evaluation techniques and methods. We present a metamodel for architecture evaluation as a basis for precisely describing the quality attribute structure and the evaluation methodology. By modelling architecture evaluation, both the relationships between architectural elements and quality attributes and the interdependencies among quality attributes can be represented and investigated. Thereby, the architecture exploration process, with its evaluations, decisions, and optimizations, is made explicit, traceable, and analyzable.
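The quality-attribute hierarchy described above can be pictured as a tree whose leaves apply concrete evaluation techniques and whose inner nodes aggregate the resulting quality rates. The sketch below is illustrative only: the class name, the weighted-average aggregation, and the metric names are assumptions for demonstration, not the paper's actual metamodel.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List, Optional

@dataclass
class QualityAttribute:
    """Node in a quality-attribute hierarchy; leaves carry an evaluation method."""
    name: str
    weight: float = 1.0
    children: List["QualityAttribute"] = field(default_factory=list)
    evaluate_leaf: Optional[Callable[[Dict[str, float]], float]] = None

    def evaluate(self, architecture: Dict[str, float]) -> float:
        # Leaf attributes apply their own evaluation technique to the architecture.
        if not self.children:
            return self.evaluate_leaf(architecture)
        # Inner nodes aggregate weighted child rates (one possible aggregation).
        total = sum(c.weight for c in self.children)
        return sum(c.weight * c.evaluate(architecture) for c in self.children) / total

# Hypothetical architecture description: measured metrics per element.
arch = {"cpu_load": 0.6, "bus_load": 0.3, "unit_cost": 12.0}

timing = QualityAttribute("timing", weight=2.0,
                          evaluate_leaf=lambda a: 1.0 - max(a["cpu_load"], a["bus_load"]))
cost = QualityAttribute("hardware_cost", weight=1.0,
                        evaluate_leaf=lambda a: max(0.0, 1.0 - a["unit_cost"] / 20.0))
overall = QualityAttribute("overall_quality", children=[timing, cost])

print(round(overall.evaluate(arch), 3))  # weighted mean of the leaf quality rates
```

Making the attribute tree and the aggregation rule explicit data structures, rather than an ad hoc spreadsheet, is what allows the dependencies between quality attributes to be traced and analyzed.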


Keywords: Quality Rate, Quality Attribute, Software Architecture, Hardware Architecture, Hardware Cost





Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Bastian Florentz (1)
  • Michaela Huhn (1)
  1. Technical University of Braunschweig, Braunschweig, Germany
