Performance engineering evaluation of object-oriented systems with SPE·ED™

  • Connie U. Smith
  • Lloyd G. Williams
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 1245)

Abstract

Although object-oriented methods have been shown to help construct software systems that are easy to understand and modify, have a high potential for reuse, and are relatively quick and easy to implement, concern over the performance of object-oriented systems remains a significant barrier to their adoption. Our experience has shown that it is possible to design object-oriented systems that have adequate performance and also exhibit the other qualities, such as reusability, maintainability, and modifiability, that have made OOD so successful. However, doing so requires careful attention to performance goals throughout the life cycle. This paper describes the use of SPE·ED, a performance modeling tool that supports the software performance engineering (SPE) process, for early life-cycle performance evaluation of object-oriented systems. The use of SPE·ED for performance engineering of object-oriented software is illustrated with a simple example.

Keywords

Performance Engineering · CASE Tool · Automated Teller Machine · Software Resource · Queueing Network Model
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.

Copyright information

© Springer-Verlag Berlin Heidelberg 1997

Authors and Affiliations

  • Connie U. Smith (1)
  • Lloyd G. Williams (2)
  1. Performance Engineering Services, Santa Fe
  2. Software Engineering Research, Boulder