An Automated Architectural Evaluation Approach Based on Metadata and Code Analysis

  • Felipe Pinto
  • Uirá Kulesza
  • Eduardo Guerra
Conference paper
Part of the Lecture Notes in Business Information Processing book series (LNBIP, volume 190)

Abstract

Traditional scenario-based methods of software architecture evaluation rely on manual review and on the advanced skills of architects and developers. They are applied after the system architecture has been specified but before its implementation begins. Once the implementation exists and evolves, code analysis can automate this process and enable the reuse of architectural information. We propose an approach that introduces metadata about use case scenarios and quality attributes into the source code of the system to support automated architectural evaluation through static and dynamic code analysis, producing reports about scenarios, quality attributes, code assets, and potential tradeoff points among quality attributes. We also describe the implementation of a code analysis tool that supports the approach. Finally, the paper presents preliminary results of applying the approach to the architectural analysis of an enterprise web system and an e-commerce web system.
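
The core idea sketched in the abstract is to embed architectural metadata directly in the code so an analysis tool can trace code assets back to use case scenarios and quality attributes. The following Java sketch illustrates one way such metadata could look; the annotation name (@UseCaseScenario), its attributes, and the example class are hypothetical, since the abstract does not specify the concrete annotation model used by the authors' tool.

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// Hypothetical annotation for embedding architectural metadata in code;
// the actual names and attributes used by the authors' tool may differ.
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.TYPE, ElementType.METHOD})
@interface UseCaseScenario {
    String name();                // use case scenario this code asset realizes
    String[] qualityAttributes(); // quality attributes the scenario addresses
}

// Example: tagging a code asset so static and dynamic analysis can trace
// it back to a scenario and the quality attributes it affects.
class OrderService {

    @UseCaseScenario(
        name = "Place order",
        qualityAttributes = {"performance", "security"}
    )
    public void placeOrder(long customerId, long productId) {
        // ... business logic; the analysis tool would collect this method,
        // its callees, and runtime traces as assets of the scenario.
    }
}
```

Under this scheme, a static analysis could collect each annotated method and its call graph as assets of the named scenario, while a code asset reached by scenarios tagged with different quality attributes would surface in the reports as a potential tradeoff point.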

Keywords

Architectural evaluation · Source code analysis

Acknowledgements

This work was partially supported by the National Institute of Science and Technology for Software Engineering (INES) - CNPq under grants 573964/2008-4 and CNPq 560256/2010-8, and the Informatics Superintendence (SINFO) from Federal University of Rio Grande do Norte (UFRN), Brazil.

Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  1. Federal University of Rio Grande do Norte (UFRN), Natal, Brazil
  2. Federal Institute of Education, Science and Technology of Rio Grande do Norte (IFRN), Natal, Brazil
  3. National Institute for Space Research (INPE), São José dos Campos, Brazil
