Analysis of the Dynamic Nature of Information Systems Performance Evaluation

  • Vesa Savolainen

Abstract

Graphic models are constructed and analyzed to illustrate the gap between information system performance and user preferences, as well as the changing performance evaluation efforts during the information system life cycle (ISLC). In the early stages of the ISLC, i.e., during the development project life cycle, performance evaluation is carried out on the basis of user requirements and design documents. Later, during the system introduction, utilization, and maintenance stages, the system is evaluated in its usage environment. This chapter, following Savolainen (1995), explains how certain user views and technical views affect the set of performance evaluation criteria in the ISLC, and how interest in evaluating the old system is lost once a project for developing a new system is launched (Savolainen, 1993, 1994, 1996). Practical findings from a field study are incorporated into the models presented in this chapter.

Keywords

User Preference · User Requirement · Performance Evaluation Criterion · Performance Evaluation · Information Resource Management
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.

References

  1. Anderson JG, Aydin CE, Jay SJ (eds.). (1994) Evaluating Health Care Information Systems: Methods and Applications. Thousand Oaks, CA: Sage.
  2. Ang JSK, Conrath DW, Savolainen V. (1991) Analyzing information systems using Petri nets: operations-oriented methodology. In Sol HG, Crosslin RL (eds.), Dynamic Modelling of Information Systems. Amsterdam: Elsevier Science, pp. 329–352.
  3. Bawden D. (1990) User-Oriented Evaluation of Information Systems & Services. Oxon, England: Gower.
  4. Coccia AM. (1985) Human factors in performance evaluation. ESPRIT Project No. 285 Report. R&D Area 4.1, Office Systems Science and Human Factors. Munich: IOT.
  5. Conrath DW, Dumas RI (eds.). (1989) Office support systems analysis and design. Final Report on Office Modelling, Language and OSSAD Methodology. ESPRIT Project No. 285 Report. R&D Area 4.1, Office Systems Science and Human Factors. Munich: IOT.
  6. Di Febbraro A, Minciardi R. (1993) Event graphs in performance evaluation of manufacturing processes. Proceedings of 32nd IEEE Conference on Decision and Control, San Antonio, Texas.
  7. Hussain D, Hussain KM. (1984) Information Resource Management. Homewood, IL: Richard D. Irwin.
  8. Mumford E. (1983) Designing Human Systems. Manchester, England: Manchester Business School.
  9. Nielsen J, Levy J. (1994) Measuring usability: preference vs. performance. Comm. ACM 37(4), pp. 66–75.
  10. Savolainen V. (1991) Definition of favourable atmosphere for effective IT decisions. In Sol HG, Vecsenyi J (eds.), Environments for Supporting Decision Processes. Amsterdam: North-Holland, pp. 129–140.
  11. Savolainen V. (1993) Performance evaluation of office information systems. In Wells WR (ed.), Proceedings of Ninth International Conference on Systems Engineering. University of Nevada Las Vegas, Las Vegas, pp. 340–344.
  12. Savolainen V. (1994) Dynamic performance evaluation of office information systems. In Verbraeck A, Sol H, Bots P (eds.), Proceedings of International Conference on Dynamic Modelling and Information Systems. Noordwijkerhout, pp. 85–100.
  13. Savolainen V. (1995) Analysis of the dynamic nature of information systems performance evaluation. Proceedings of the Information Systems Evaluation Workshop, European Conference of Information Systems, Athens, Greece, pp. 14–21.
  14. Savolainen V. (1996) Performance evaluation of information system in a big company. In Wells WR (ed.), Proceedings of the Eleventh International Conference on Systems Engineering. University of Nevada Las Vegas, Las Vegas, pp. 164–169.
  15. Sherwood-Smith M. (1989) Evaluation of Computer-based Office Systems. Dublin: University College.
  16. Slevin DP, Stieman PA, Boone LW. (1991) Critical success factor analysis for information systems performance measurement and enhancement. Information and Management 21(3), pp. 161–174.

Copyright information

© Springer Science+Business Media New York 1999
