An Automated Architectural Evaluation Approach Based on Metadata and Code Analysis
Traditional scenario-based software architecture evaluation methods rely on manual review and demand advanced skills from architects and developers. They are applied once the system architecture has been specified, but before its implementation begins. As the system implementation evolves, code analysis can automate this process and enable the reuse of architectural information. We propose an approach that embeds metadata about use case scenarios and quality attributes in the source code of the system to support automated architectural evaluation through static and dynamic code analysis, producing reports about scenarios, quality attributes, code assets, and potential tradeoff points among quality attributes. We also describe the implementation of a code analysis tool that supports the approach. In addition, the paper presents preliminary results of applying the approach to the architectural analysis of an enterprise web system and an e-commerce web system.
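One way to embed such metadata in source code is through annotations that a static or dynamic analysis pass can later collect. The sketch below is a minimal illustration of this idea, not the paper's actual annotation scheme: the `@Scenario` annotation, its fields, and the `CheckoutService` class are hypothetical names chosen for the example, and runtime reflection stands in for a full code analysis tool.

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Method;

public class MetadataSketch {

    // Hypothetical annotation linking a method to a use case scenario
    // and the quality attributes it affects.
    @Retention(RetentionPolicy.RUNTIME)
    @Target(ElementType.METHOD)
    @interface Scenario {
        String name();
        String[] qualityAttributes();
    }

    // Example system code carrying architectural metadata.
    static class CheckoutService {
        @Scenario(name = "Place order",
                  qualityAttributes = {"performance", "security"})
        void placeOrder() { /* business logic */ }
    }

    public static void main(String[] args) {
        // An analysis tool would scan compiled classes; reflection
        // approximates that scan here and reports scenario metadata.
        for (Method m : CheckoutService.class.getDeclaredMethods()) {
            Scenario s = m.getAnnotation(Scenario.class);
            if (s != null) {
                System.out.println(m.getName() + " -> scenario: " + s.name()
                        + ", attributes: " + String.join(", ", s.qualityAttributes()));
            }
        }
    }
}
```

A method annotated with more than one quality attribute, as above, is exactly the kind of location the approach could flag as a potential tradeoff point.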
Keywords: Architectural evaluation · Source code analysis
This work was partially supported by the National Institute of Science and Technology for Software Engineering (INES) - CNPq, under grants 573964/2008-4 and CNPq 560256/2010-8, and by the Informatics Superintendence (SINFO) of the Federal University of Rio Grande do Norte (UFRN), Brazil.