MEMS: A Method for Evaluating Middleware Architectures

  • Yan Liu
  • Ian Gorton
  • Len Bass
  • Cuong Hoang
  • Suhail Abanmi
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4214)


Middleware architectures play a crucial role in determining the overall quality of many distributed applications. Systematic evaluation methods for middleware architectures are therefore important for thoroughly assessing the impact of design decisions on quality goals. This paper presents MEMS, a scenario-based evaluation approach. MEMS provides a principled way of evaluating middleware architectures by leveraging generic qualitative and quantitative evaluation techniques such as prototyping, testing, rating, and analysis. It measures middleware architectures by rating multiple quality attributes, and its outputs help determine how well alternative middleware architectures meet an application’s quality goals. MEMS also benefits middleware development by uncovering potential problems at an early stage, making design flaws cheaper and quicker to fix. The paper describes a case study evaluating the security architecture of grid middleware for managing secure conversations and access control. The results demonstrate the practical utility of MEMS for evaluating middleware architectures across multiple quality attributes.
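The core idea of rating alternative architectures on multiple quality attributes can be illustrated with a small sketch. This is not the authors’ implementation; the attribute names, weights, and scores below are hypothetical examples, assuming a simple weighted-average aggregation of evaluator ratings:

```python
# Illustrative sketch (hypothetical, not from the paper): aggregating
# per-attribute ratings for alternative middleware architectures into a
# single weighted score, so candidates can be compared against an
# application's quality goals.

def weighted_score(ratings, weights):
    """Weighted average of quality-attribute ratings (each on a 1-5 scale)."""
    total_weight = sum(weights.values())
    return sum(ratings[attr] * w for attr, w in weights.items()) / total_weight

# Hypothetical relative importance of each quality attribute.
weights = {"performance": 0.3, "security": 0.5, "modifiability": 0.2}

# Hypothetical ratings produced by evaluators for two candidate architectures.
candidates = {
    "middleware_A": {"performance": 4, "security": 3, "modifiability": 5},
    "middleware_B": {"performance": 3, "security": 5, "modifiability": 3},
}

scores = {name: weighted_score(r, weights) for name, r in candidates.items()}
best = max(scores, key=scores.get)
print(scores)   # prints {'middleware_A': 3.7, 'middleware_B': 4.0}
print(best)     # prints middleware_B
```

With security weighted most heavily, the security-stronger candidate scores higher, mirroring how the method’s outputs can guide the choice of middleware for a given application’s quality goals.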


Keywords: Quality Attribute · Software Architecture · Virtual Organization · Simple Object Access Protocol · Policy Decision Point





Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Yan Liu (1)
  • Ian Gorton (1)
  • Len Bass (2)
  • Cuong Hoang (3)
  • Suhail Abanmi (1)
  1. National ICT Australia (NICTA) & School of Computer Science and Engineering, University of New South Wales, Australia
  2. Software Engineering Institute, USA
  3. Engineering Faculty, University of Technology, Sydney, Australia
