Evaluations of Physiological Monitoring Displays: A Systematic Review

  • Matthias Görges
  • Nancy Staggers



The purpose of this paper is to present the findings of a systematic review of evaluation studies of physiologic monitoring displays, centered on empirical assessments across all available settings and samples. The findings of this review allow readers to examine past work across studies and set the stage for the design and conduct of future evaluations.


A broad search of the literature from 1991 to June 2007 in the PubMed and PsycINFO databases was completed to locate data-based articles on physiologic monitoring device display evaluations. The results of this search, plus several unpublished works, yielded 23 publications and 31 studies.


Participants were faster at detecting an adverse event, making a diagnosis, or reaching a clinical decision in 18 of 31 studies. They showed improved accuracy in a clinical decision or diagnosis in 13 of 19 studies, and they perceived a decreased mental workload in 3 of 8 studies. Eighteen studies used a within-subjects design (mean sample size 16.5), and 9 studies used a between-groups design (mean group size 7.6). Study settings were usability laboratories in 15 studies and patient simulation laboratories in 6 studies. Study participants were anesthesiologists or anesthesiology residents in 19 studies and nurses in 5 studies.


The advent of integrated graphical displays ushered in a new era of physiologic monitoring display design. All but one study reported significant differences between traditional numerical displays and novel displays; yet we know little about which graphical displays are optimal and why particular designs work. Future authors should use a theoretical model or framework to guide study design, include clinical participants other than anesthesiologists, employ additional research methods, and use more realistic and complex tasks and settings to increase external validity.


graphical data displays · human factors · patient monitoring · physiologic monitoring · usability testing · user interface evaluations



The authors would like to thank Dr. Dwayne Westenskow for his thoughtful comments on a previous draft of this manuscript.

Matthias Görges is supported by an anesthesiology fellowship from Drägerwerk AG, Lübeck, Germany.



Copyright information

© Springer Science+Business Media B.V. 2007

Authors and Affiliations

  1. Department of Anesthesiology, University of Utah, Salt Lake City, USA
  2. Informatics Program, College of Nursing, University of Utah, Salt Lake City, USA
