WOSP 2000, GWPESD 2000: Performance Engineering, pp. 96-118

Origins of Software Performance Engineering: Highlights and Outstanding Problems

  • Connie U. Smith
Chapter in the Lecture Notes in Computer Science book series (LNCS, volume 2047)

Abstract

This chapter first reviews the origins of Software Performance Engineering (SPE), providing an overview and an extensive bibliography of the early research. It then covers the fundamental elements of SPE: the data required, the software performance models, and the SPE process. It concludes with a review of the current status and outstanding problems in the areas of tools, performance models, use of SPE, principles, patterns and antipatterns for building high-performance software, and SPE methods.

Keywords

Software Engineer, Resource Requirement, Performance Engineer, Execution Model, Capacity Planning



Copyright information

© Springer-Verlag Berlin Heidelberg 2001

Authors and Affiliations

  • Connie U. Smith, Performance Engineering Services, Santa Fe