Abstract
The growing volume of clinical data in modern medical practice makes it difficult for clinicians to form a complete picture of each patient's overall health status. Many different approaches to computer-based visualization have been taken in an attempt to alleviate this burden; however, no single approach has been widely adopted. As a step towards optimization and standardization of data visualization in healthcare, this paper presents a diverse set of approaches to visualization for multiple organ systems. To do so, we summarize best practices in design and evaluation while proposing a usability testing methodology. We then review and illustrate the goals of various clinical data visualization techniques.
Introduction
The modern medical practitioner is confronted with a growing volume of clinical data which must be sorted through and understood in order to provide effective medical care. This data can include the patient’s medical history, laboratory values, diagnostic reports, pathology reports and physician consults, as well as financial and administrative information [1, 2]. Visualization tools have been created to integrate and manage this expanding amount of electronic health data. Computer-based solutions for certain aspects of medical care, such as billing and scheduling, have become widespread and sophisticated; however, it is difficult to find similarly advanced support for routine medical care tasks such as reviewing laboratory and physiologic data.
There is a need for optimization and standardization of data visualization in healthcare. As a step towards improving our ability to compare effectiveness of data visualization techniques, this paper puts together in one place a variety of approaches to visualization for multiple organ systems in the acute care setting. We will summarize best practices in design and evaluation while proposing usability testing methodology. We will then review and illustrate the goals of clinical data visualization with selected implementation examples.
Background
Clinicians commonly complain that due to the large volume of electronic health data, it is difficult to understand the “big picture” of each patient [3]. Thus, various clinical visualization tools have been created to monitor and aggregate patient data in order to aid health care providers in interpretation of electronic health records and treatment decisions. These tools track physiologic variables such as heart rate, pulse and biomarkers and could support accelerated risk stratification, leading to timelier and higher quality care [4]. Throughout all medical specialties, well-designed visualizations have been shown to improve the clinicians’ recognition of patient health trends and their understanding of effects of interventions over time [4].
While many different approaches have been taken to computer-based visualization of clinical data, no single approach has found success in broadly influencing the manner in which clinical data is viewed on a daily basis. This is likely due to the absence of a nationally shared user interface common to inpatient medical care, as well as a competitive vendor-driven electronic health record environment with sporadic reliance upon unique home-grown integration systems.
Acknowledging divergent terminology used to describe visualization methodology, for the purpose of this review we will adopt a taxonomy presented by Starren and Johnson specifically for classifying clinical data presentation [5]. The authors propose a system for analyzing graphical user interfaces in which each interface element is considered an object that can be classified as a list, a table, generated text, an icon or a graph (Fig. 1).
Lists are further classified as either simple or, if containing sub-lists, nested. Tables are items arranged in an n-dimensional grid. Generated text is the computer-aided transformation of coded data into text. Icons are stylized pictorial symbols. Graphs are defined as spatial arrangements of points, lines and labels and can be further divided into charts, configural charts, and graph notation. Charts display data with respect to axes, while configural charts display data that creates a specific “shape” and graph notation describes nodes connected by edges with data conveyed by labels and connection topology.
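To make the taxonomy concrete, the sketch below expresses these categories as a small Python data model; the class and attribute names are our own illustrative choices rather than part of Starren and Johnson's formal definition.

```python
# Illustrative data model for the presentation taxonomy described above.
from dataclasses import dataclass, field
from enum import Enum, auto
from typing import List


class GraphSubtype(Enum):
    CHART = auto()             # data plotted against axes
    CONFIGURAL_CHART = auto()  # data arranged to form a meaningful "shape"
    GRAPH_NOTATION = auto()    # nodes and edges with labeled connections


@dataclass
class PresentationObject:
    """Base class: every interface element is classified as one object type."""
    label: str


@dataclass
class ListObject(PresentationObject):
    items: List[str] = field(default_factory=list)
    sublists: List["ListObject"] = field(default_factory=list)  # nested if non-empty


@dataclass
class TableObject(PresentationObject):
    dimensions: int = 2  # items arranged in an n-dimensional grid


@dataclass
class GeneratedText(PresentationObject):
    source_codes: List[str] = field(default_factory=list)  # coded data rendered as text


@dataclass
class IconObject(PresentationObject):
    symbol: str = ""  # stylized pictorial symbol


@dataclass
class GraphObject(PresentationObject):
    subtype: GraphSubtype = GraphSubtype.CHART


# Example: a nested problem list and a creatinine trend chart
problems = ListObject("Problem list", items=["Sepsis"],
                      sublists=[ListObject("Cultures", items=["Blood", "Urine"])])
creatinine_trend = GraphObject("Serum creatinine over time", subtype=GraphSubtype.CHART)
```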
Medical data visualizations best practices
Goals for medical data visualizations
The primary goal of visualizing clinical data is to inform clinical decision making. This can be achieved by providing displays that clarify the temporal relationship of data points and allow related data to be compared in a straightforward manner. One study that compared search times for clinical patient care guidelines found that users spent twice the amount of time searching when using a system with poor document format and organization in the interface [6].
Secondary goals of data visualization include increasing the efficiency and accuracy of data review as well as facilitating a more comprehensive understanding of the organ system or disease process to which the data relates. By intuitively unifying data normally kept separate, a data display can accomplish all of the above. For instance, a display designed for evaluating acute kidney injury could bring together urine output, urine electrolytes, serum creatinine and blood pressure trends. This would allow a medical provider to rapidly assess a change in kidney function without sorting through multiple data sources. Review of these data in a user-friendly format could be utilized both on rounds in discussion with trainees, as well as during a family meeting to explain changes in a patient’s condition.
Designing medical data visualizations
Like other medical innovations, creating a novel data visualization typically starts with a concept motivated by clinical need. This concept is then implemented, evaluated and refined. These processes can occur before, during or after roll-out to the clinical setting. Implementation tends to be time and resource intensive and is required with each refinement. Several authors have described variations of this workflow. Rather than starting with an implementation, an alternative approach begins with a paper sketch reflecting how medical data would be visualized [7]. That sketch is then evaluated by clinicians for its intuitiveness and iteratively revised based on evaluators’ suggestions, either formally or using a think-aloud approach. This methodology allows a new design to be validated before formal implementation in a rapid and cost-effective manner.
The iterative design process requires a concept to initiate the design workflow. When a clinical need is present without a visualization idea, one approach to identifying potential solutions is to evaluate information tools that users create themselves to simplify the cognitive tasks at hand [8]. In practice, this means discovering what notes and crib sheets clinicians are currently using and understanding the information they contain. These tools represent a window into deficiencies in existing systems and provide inspiration for improvements.
Another approach that has been described to aid design is known as presentation discovery [9]. This method is empirical and starts with the identification of concepts of interest within a particular clinical domain, followed by the collection and sorting of potential icons to represent these concepts and the evaluation of these icons by domain experts. A graphical domain vocabulary is thus built up based on agreement between experts on which icons best represent the concepts. From this vocabulary, effective domain-specific visualizations can be created.
Evaluating medical data visualizations
Multiple methodologies have been described in the evaluation of clinical data visualizations. The majority of these consist of creating a cognitive task to be completed by a clinician with the aid of either the existing information display or the new display being evaluated. Performance on that task is then compared to ascertain the benefit, or lack thereof, of the new display. Cognitive effort is typically measured, with the intent of reducing task time, minimizing cognitive burden and increasing user satisfaction with the process. Minor improvements in completing a task are additive when the task is frequently performed.
Different methodologies have led to conflicting conclusions regarding the optimal display type, revealing an underlying dependency on the experimental design. For instance, a study that compared text to text plus charts in the interpretation of neonatal intensive care data found text to provide superior performance on the clinical task [10]. The additional text provided to the first arm of the experiment, however, was a focused summary of the chart data presented, thus giving that group an easier cognitive task. In contrast, other experimenters have found combining a traditional data interface with charts to be superior [11]. Consensus on meaningful experimental design to grade data visualizations is lacking.
Recently a formal methodology has been described for evaluating clinical data visualizations [12]. Borrowing from experience in evaluating display systems in radiology, the authors outline an experimental design strategy in which a pool of cases is combined with a pool of information displays to be evaluated. Each study participant views the cases matched with displays in a random order such that all combinations of cases and information displays are evaluated. Accuracy and speed of task completion are analyzed along with preference data, ultimately identifying the superior display modality.
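The essence of this crossed design can be sketched in a few lines of code: every case is paired with every display under evaluation, and each participant receives the pairs in an independently randomized order. The case and display names below are placeholders.

```python
# Sketch of a fully crossed case-by-display evaluation with per-participant
# randomization of presentation order.
import itertools
import random


def build_presentation_order(cases, displays, participant_seed):
    """Return a randomized list of (case, display) pairs covering all combinations."""
    pairs = list(itertools.product(cases, displays))
    rng = random.Random(participant_seed)  # reproducible per-participant order
    rng.shuffle(pairs)
    return pairs


cases = ["case_01", "case_02", "case_03"]
displays = ["traditional_table", "configural_chart"]

for participant in range(1, 4):
    order = build_presentation_order(cases, displays, participant_seed=participant)
    print(f"participant {participant}: {order}")

# Accuracy and time to completion would then be recorded for each pair and
# compared across displays, alongside preference data.
```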
Examples of medical data visualization tools
Selected implementation examples from the literature
A comprehensive review of the abundant data visualization tools developed for clinical and research use is not the goal of this paper. In the absence of accepted grading systems, potentially useful tools have been diluted into published obscurity, limiting their implementation and dissemination. Instead, we outline a construct with which to select tools for clinical adoption, emphasizing those that were designed or evaluated using best practices. We have sought to include a diverse set of visualizations that are representative of the variety of approaches taken and that cover areas of medicine that are data rich, such as the intensive care unit and operating room, as well as frequently viewed medical data such as laboratory results and medication lists. They are categorized by body system and data type.
Cardiac data
One data domain where rapid detection of changes in patient condition is essential and data can be complex is the cardiac system. A configural chart for displaying cardiac parameters was developed by Blike et al. which utilizes two multi-axis charts to create geometric objects (Fig. 2) [13, 14].
These objects were designed to have emergent properties which facilitate differentiation of shock states. Clinicians evaluated physiologic data in a computer-based simulation in two separate studies. Use of the configural chart in conjunction with a traditional display resulted in improved accuracy and reduced time to diagnosis of shock states compared to the traditional display alone. This configural chart incorporated meaningful shapes into its display, including features that would have physiologic meaning to the physicians using this tool, such as displaying “the heart object as ‘empty’ or ‘full’ when in a low stroke volume state” [13, 14].
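Although the exact geometry of the Blike et al. display is specific to their work, the general configural-chart principle can be sketched as follows: each hemodynamic variable is normalized against its reference range and plotted on a polar axis, so that a roughly regular polygon suggests a normal state and characteristic distortions suggest particular shock states. The variables, reference ranges and patient values below are illustrative assumptions, not the published design.

```python
# Generic configural-chart sketch: distorted polygon shape signals abnormal state.
import math
import matplotlib.pyplot as plt

reference_ranges = {           # (low, high) illustrative adult reference values
    "Heart rate": (60, 100),
    "MAP": (70, 100),
    "CVP": (2, 8),
    "Cardiac output": (4, 8),
    "SvO2 (%)": (65, 75),
}
observed = {"Heart rate": 128, "MAP": 55, "CVP": 2, "Cardiac output": 2.8, "SvO2 (%)": 48}

labels = list(reference_ranges)
# Scale each value so the midpoint of its reference range maps to 1.0.
scaled = [observed[k] / ((reference_ranges[k][0] + reference_ranges[k][1]) / 2) for k in labels]

angles = [2 * math.pi * i / len(labels) for i in range(len(labels))]
# Close the polygon by repeating the first point.
angles.append(angles[0])
scaled.append(scaled[0])

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
ax.plot(angles, [1.0] * len(angles), linestyle="--", label="Reference")  # ideal polygon
ax.plot(angles, scaled, label="Patient")
ax.set_xticks(angles[:-1])
ax.set_xticklabels(labels)
ax.legend(loc="lower right")
plt.show()
```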
Pulmonary data
One important source of clinical data related to the pulmonary system is the ventilator. Interpretation of ventilator data in conjunction with blood gas data is crucial in evaluating a patient for potential extubation as well as in managing disease processes such as pneumonia and acute respiratory distress syndrome. Integrated pulmonary displays have been developed as an alternative to manual review of individual respiratory parameters.
An approach developed by Liu et al. for displaying pulmonary data relies on a configural chart utilizing a circle as the basis for determining clinical changes (Fig. 3) [15]. While providers preferred the configural chart because it made changes easier to detect than traditional displays, its use did not result in improved clinical task performance under simulated conditions. One possible explanation for this display’s failure to improve clinical task performance is its reliance on a polygon-based layout lacking clinically meaningful shapes, in contrast to the chart developed by Blike et al. [13, 14].
Chemistry data
The ability to efficiently review laboratory data and place recent data into context is valuable to all medical providers. As early as 1986, clinicians were publishing on how “laboratory data have become more numerous and more difficult to interpret” [16], a situation which has been exacerbated in the intervening years by the continued increase in the volume of available data. The recommendations for data display made at that time are still applicable today and include filtering out unnecessary aspects of the display, simplifying information where possible, automatically coding data to flag abnormal conditions and grouping data into clinically useful sets. All of these strategies assist the provider in recognizing patient conditions via accurate interpretation of data.
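A minimal sketch of two of these strategies, grouping results into clinically useful panels and flagging values outside their reference ranges, might look like the following; the panel contents and reference ranges shown are illustrative assumptions.

```python
# Group raw lab results into panels and flag abnormal values.
reference_ranges = {
    "Na": (135, 145), "K": (3.5, 5.0), "Cl": (98, 107), "CO2": (22, 29),
    "BUN": (7, 20), "Cr": (0.6, 1.2), "Glu": (70, 110),
    "WBC": (4.0, 11.0), "Hgb": (12.0, 17.5), "Plt": (150, 400),
}
panels = {
    "Basic metabolic panel": ["Na", "K", "Cl", "CO2", "BUN", "Cr", "Glu"],
    "Complete blood count": ["WBC", "Hgb", "Plt"],
}


def flag(name, value):
    """Return 'L'/'H' for values below/above the reference range, else blank."""
    low, high = reference_ranges[name]
    if value < low:
        return "L"
    if value > high:
        return "H"
    return " "


results = {"Na": 129, "K": 5.4, "Cl": 101, "CO2": 18, "BUN": 42, "Cr": 2.3,
           "Glu": 95, "WBC": 14.2, "Hgb": 9.1, "Plt": 88}

for panel, analytes in panels.items():
    print(panel)
    for name in analytes:
        value = results[name]
        print(f"  {name:>4}: {value:6} {flag(name, value)}")
```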
We developed a unique approach for displaying chemistry data which combines graphs with a configural chart (Fig. 4) [17]. The configural chart layout mirrors medical shorthand notation for chemistry data. This is one screen from a locally developed application known as Wandering Data, which has served as a test-bed for data visualization concepts.
No formal user evaluation of this specific display has been performed, but it has been adopted by leading industry electronic health record companies. By incorporating medical shorthand notation, this display leverages shared symbolism from medical training, which may confer a performance advantage similar to the meaningful symbolism utilized in the cardiac example.
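As a rough illustration of the shorthand idea, and not the Wandering Data implementation itself, the following sketch renders a basic metabolic panel in the familiar “fishbone” layout clinicians write by hand; the values are invented.

```python
# Plain-text approximation of the chemistry "fishbone" shorthand layout.
def fishbone(na, k, cl, co2, bun, cr, glu):
    top = f" {na:>5} | {cl:>5} | {bun:>5} \\"   # Na  / Cl  / BUN on top
    mid = f"{'-' * 23}> {glu}"                  # glucose at the tip
    bot = f" {k:>5} | {co2:>5} | {cr:>5} /"     # K / CO2 / Cr on bottom
    return "\n".join((top, mid, bot))


print(fishbone(na=129, k=5.4, cl=101, co2=18, bun=42, cr=2.3, glu=95))
```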
Microbiology data
The review of microbiology data is another area where effective data visualization has the potential both to increase the efficiency of provider review and to improve patient outcomes through the selection of appropriate antibiotics and antibiotic course length. For instance, integration of local antibiotic recommendations into a culture result report based on institutional sensitivity data might improve compliance with infectious disease guidelines. Willard describes a system that provides a list of all cultured organisms followed by a custom chart in which these data are presented along with sensitivity data [18]. Additional information such as the recommended dosing regimen, cost and need for infectious disease approval is also included. This system was compared with a traditional microbiology data system on a clinical task. The reorganized and enhanced microbiology data display decreased task completion time by 45% while reducing task errors.
Another example is provided by Duclos et al. [19]. A conceptual pharmacodynamic model was designed that includes pathogens, susceptibility tests, antibiotics and the prevalence of resistance. The tool was created to aid physicians when prescribing antimicrobial therapy and is presented as an HTML table (Fig. 5). The model takes into account minimum and maximum values for the prevalence of resistance and presents the susceptibility of each bacterium using gray scaling, with darker gray corresponding to higher susceptibility. The extraction algorithm was evaluated for accuracy by comparing automatically extracted spectra with previously reported spectra. No formal user interface testing was done.
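The gray-scale encoding can be sketched as follows: each pathogen-antibiotic cell of an HTML table is shaded so that darker gray corresponds to a higher proportion of susceptible isolates. The organisms, antibiotics and susceptibility figures below are illustrative assumptions rather than data from Duclos et al.

```python
# Map susceptibility fractions to gray shades in a simple HTML table.
susceptibility = {  # fraction of isolates susceptible (illustrative)
    ("E. coli", "Ampicillin"): 0.55,
    ("E. coli", "Ceftriaxone"): 0.92,
    ("S. aureus", "Ampicillin"): 0.10,
    ("S. aureus", "Vancomycin"): 0.99,
}
organisms = ["E. coli", "S. aureus"]
antibiotics = ["Ampicillin", "Ceftriaxone", "Vancomycin"]


def shade(fraction):
    """Return a gray hex color; higher susceptibility -> darker gray."""
    level = int(255 * (1 - fraction))
    return f"#{level:02x}{level:02x}{level:02x}"


rows = []
for organism in organisms:
    cells = []
    for drug in antibiotics:
        frac = susceptibility.get((organism, drug))
        if frac is None:
            cells.append("<td>-</td>")
        else:
            cells.append(f'<td style="background:{shade(frac)}">{frac:.0%}</td>')
    rows.append(f"<tr><th>{organism}</th>{''.join(cells)}</tr>")

header = "".join(f"<th>{d}</th>" for d in antibiotics)
print(f"<table><tr><th></th>{header}</tr>{''.join(rows)}</table>")
```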
Medication data
In addition to reviewing laboratory data, another task common to all medical providers is interpreting patient medication data. Crucial to understanding drug effects and side effects is obtaining an accurate medication history complete with dosage changes and when medications were started and stopped; a non-trivial task for patients on more than a few medications. Medication lists are frequently stored as free text within physician notes, which complicates this process. One approach to simplifying this problem integrates a free text note processor with a graphical timeline display (Fig. 6) [20].
The resulting chart condenses a complex medication history into a set of colored boxes that represent drugs as being “on” or “off” annotated with dosage information. The tool was evaluated for accuracy compared with the manual process and performed well. No formal user interface testing was done.
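The underlying transformation can be sketched as follows: start, stop and dose-change events, such as might be extracted from free-text notes, are collapsed into per-drug “on” intervals carrying their doses, ready to be drawn as colored boxes along a time axis. The event format and data are illustrative assumptions, not the evaluated tool's actual representation.

```python
# Collapse medication events into timeline intervals per drug.
from collections import defaultdict
from datetime import date

events = [  # (date, drug, action, dose) -- illustrative data
    (date(2016, 1, 5), "lisinopril", "start", "10 mg daily"),
    (date(2016, 2, 20), "lisinopril", "dose_change", "20 mg daily"),
    (date(2016, 3, 1), "metformin", "start", "500 mg BID"),
    (date(2016, 4, 15), "lisinopril", "stop", None),
]


def build_intervals(events, today=date(2016, 7, 28)):
    """Collapse per-drug events into (start, end, dose) intervals."""
    intervals = defaultdict(list)
    for when, drug, action, dose in sorted(events):
        if action in ("start", "dose_change"):
            if intervals[drug] and intervals[drug][-1][1] is None:
                last_start, _, last_dose = intervals[drug][-1]
                intervals[drug][-1] = (last_start, when, last_dose)  # close prior dose
            intervals[drug].append((when, None, dose))
        elif action == "stop" and intervals[drug]:
            start, _, last_dose = intervals[drug][-1]
            intervals[drug][-1] = (start, when, last_dose)
    # Any interval still open runs to "today".
    for drug, spans in intervals.items():
        intervals[drug] = [(s, e or today, d) for s, e, d in spans]
    return dict(intervals)


for drug, spans in build_intervals(events).items():
    for start, end, dose in spans:
        print(f"{drug:12} {start} -> {end}  {dose}")
```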
Koch et al. provide another approach: researchers used a user-centered process to create an integrated multi-page display for ICU management [21]. The tool was created in response to identified information gaps in medication management, patient awareness and team communication, with the hypothesis that integrating data in one location would improve clinicians’ situation awareness. The initial prototype was developed using a mixed observation and interview approach and was then revised through rounds of user testing and questionnaires.
The medication management tool presented provides vital signs, the status of current and scheduled medications, and the potential side effects, IV compatibility and drug-drug interactions of a selected medication with the currently administered or scheduled medications (Fig. 7). User interface testing via clinical simulation was conducted. For the medication management portion of the testing, participants were asked about current medications, scheduled medications, side effects, drug-drug interactions and the compatibility of a new medication with those already administered. The study found that clinicians had higher situation awareness and faster task completion with the integrated display: accuracy was 85.3% compared to 61.85% with the traditional display, and median task completion time was 26.0 s compared to 42.1 s.
Integrated specialty-specific data
Integrated displays are being created that aggregate and present data critical to each specialty. Within the previously described Wandering Data application, we have included a module specific to the infectious disease specialty [17]. This display includes temperature trends, culture results, antibiotics administered, and reports on potential sources of infection using icons within a table (Fig. 8). Future work will focus on combining culture-derived antibiotic sensitivity data with current antibiotic coverage, which will allow visual assessment of the appropriateness of antibiotic coverage.
An integrated display was developed by Michels et al. with the intention of simultaneously displaying all variables relevant to administering an anesthetic within an operating room (Fig. 9) [22]. These variables included the cardiac and pulmonary indices mentioned above but also included inhalational and intravenous drug delivery measurements with a total of 30 displayed data points within a full screen configural chart. Performance on clinical tasks was compared between the configural chart and a simulation of a traditional display. Anesthesia providers performed better only on some tasks with the configural chart; task performance was best where emergent features of the configural chart were aligned with the task objective.
Substitutable data visualizations
In an ideal world, end-users would be able to select the data visualizations that best suit their needs. One approach to enabling this functionality is the concept of a substitutable application as promulgated by the SMART Health IT platform. SMART is an open, standards-based technology platform that enables apps to run on top of electronic health records from any vendor that supports those standards [23]. Examples of clinical data visualizations provided by SMART applications include cardiac risk indices, growth charts, and vital signs displays (see Notes).
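As an illustration of how a substitutable visualization might obtain its data, the sketch below queries a FHIR server's Observation endpoint for a patient's vital signs; the server base URL and patient identifier are placeholders, and the OAuth2 authorization step a real SMART app would perform is omitted.

```python
# Fetch vital-signs Observations from a FHIR server and list them for display.
import requests

FHIR_BASE = "https://example-fhir-server.org/fhir"  # placeholder base URL
PATIENT_ID = "example-patient-id"                   # placeholder id


def fetch_vital_signs(base, patient_id):
    resp = requests.get(
        f"{base}/Observation",
        params={"patient": patient_id, "category": "vital-signs", "_sort": "date"},
        headers={"Accept": "application/fhir+json"},
        timeout=10,
    )
    resp.raise_for_status()
    bundle = resp.json()
    readings = []
    for entry in bundle.get("entry", []):
        obs = entry["resource"]
        quantity = obs.get("valueQuantity", {})
        readings.append({
            "code": obs.get("code", {}).get("text"),
            "value": quantity.get("value"),
            "unit": quantity.get("unit"),
            "time": obs.get("effectiveDateTime"),
        })
    return readings


for reading in fetch_vital_signs(FHIR_BASE, PATIENT_ID):
    print(reading)
```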
Discussion
Usability testing for data visualization
Many of the clinical data visualizations reviewed above underwent formal evaluation in order to demonstrate superiority, or inferiority, to existing practice. The methodologies behind these evaluations are not standardized, which makes comparing data visualizations between studies difficult. In addition to proposing a taxonomy for classifying visualizations, Starren and Johnson [5] reviewed four distinct metrics for evaluating them: user preference, accuracy, latency and compactness. Unfortunately, these metrics are sometimes at odds with each other and will vary with the user group.
Recently a formal evaluation methodology has been proposed [12], but it is labor intensive and applies best to the radiology context in which it was developed. We support the standardization of usability testing, in which the ease of use of a specific clinical data visualization tool is tested in a realistic scenario. Usability testing allows for evaluating accuracy, performance, intuitiveness, training needs and user preference. As usability varies with the user group tested, we suggest that the test group be composed of clinicians who will be the targeted end-users of the tools being evaluated.
Recommendations for effective data visualizations
We have reviewed the use of a diverse set of data visualization techniques to support interpretation of clinical data in a variety of contexts. Evaluation of these techniques through simulated clinical tasks demonstrates that some of these techniques hold promise in improving real world clinician efficiency and performance. It also reveals several themes (see Table 1). First, configural charts perform well when the emergent properties of the chart correlate with important clinical states that are not obvious from review of the raw data itself. Second, displays that incorporated symbols with meaning derived from shared medical training tended to have more evidence for successful usability testing. Finally, the addition of related data to visualizations can provide valuable context. In the face of continued growth in the volume of clinical data requiring review, combining these insights will be essential for optimal data display. Together, these studies demonstrate that displaying the right information at the right time in an effective manner can improve the performance of the modern medical practitioner.
Conclusions
The distribution and adoption of various data visualization tools is limited for multiple reasons, not the least of which is the lack of a shared graphical user interface and data platform. In the increasingly cognitively complex clinical environment, the dependence upon data visualization to distill important information in a real-time fashion is likely to increase. To facilitate movement towards improving our ability to compare effectiveness of data visualization techniques, we have summarized best practices in design and evaluation while calling for the standardization of usability testing methodology.
Notes
http://smarthealthit.org/ (accessed July 28, 2016)
References
Jensen, P. B., Jensen, L. J., and Brunak, S., Mining electronic health records: Towards better research applications and clinical care. Nat. Rev. Genet. 13(6):395–405, 2012.
Stead, W., and Lin, H. (Eds.), National Research Council (US) Committee on Engaging the Computer Science Research Community in Health Care Informatics: Computational technology for effective health care: Immediate steps and strategic directions. National Academies Press, Washington, DC, 2009.
Staggers, N., A systematic review on the designs of clinical technology: Findings and recommendations for future research. ANS Adv. Nurs. Sci. 32(3):252, 2009.
Badgeley, M. A., Shameer, K., Glicksberg, B. S., Tomlinson, M. S., Levin, M. A., McCormick, P. J., Kasarskis, A., Reich, D. L., and Dudley, J. T., EHDViz: Clinical dashboard development using open-source technologies. BMJ Open 6(3):e010579, 2016.
Starren, J., and Johnson, S. B., An object-oriented taxonomy of medical data presentations. J. Am. Med. Inform. Assoc. 7(1):1–20, 2000.
Wallace, C. J., Bigelow, S., Xu, X., and Elstein, L., Collaborative practice: Usability of text-based, electronic patient care guidelines. Comput. Inform. Nurs. 25(1):39–44, 2007.
Wachter, S. B., Noah, S., Drews, F., Weinger, M. B., and Westenskow, D., The employment of an iterative design process to develop a pulmonary graphical display. J. Am. Med. Inform. Assoc. 10(4):363–372, 2003.
Gurses, A. P., Xiao, Y., and Hu, P., User-designed information tools to support communication and care coordination in a trauma hospital. J. Biomed. Inform. 42(4):667–677, 2009.
Payne, P. R., and Starren, J. B., Quantifying visual similarity in clinical iconic graphics. J. Am. Med. Inform. Assoc. 12(3):338–345, 2005.
Law, A. S., Freer, Y., Hunter, J., Logie, R. H., McIntosh, N., and Quinn, J., A comparison of graphical and textual presentations of time series data to support medical decision making in the neonatal intensive care unit. J. Clin. Monit. Comput. 19(3):183–194, 2005.
Charabati, S., Bracco, D., Mathieu, P., and Hemmerling, T., Comparison of four different display designs of a novel anaesthetic monitoring system, the ‘integrated monitor of anaesthesia (IMA™)’. Br. J. Anaesth. 103(5):670–677, 2009.
Pieczkiewicz, D. S., and Finkelstein, S. M., Evaluating the decision accuracy and speed of clinical data visualizations. J. Am. Med. Inform. Assoc. 17(2):178–181, 2010.
Blike, G. T., Surgenor, S. D., Whalen, K., and Jensen, J., Specific elements of a new hemodynamics display improves the performance of anesthesiologists. J. Clin. Monit. Comput. 16(7):485–491, 2000.
Blike, G. T., Surgenor, S. D., and Whalen, K., A graphical object display improves anesthesiologists’ performance on a simulated diagnostic task. J. Clin. Monit. Comput. 15(1):37–44, 1999.
Liu, Y., and Osvalder, A.-L., Usability evaluation of a GUI prototype for a ventilator machine. J. Clin. Monit. Comput. 18(5–6):365–372, 2004.
Politser, P., How to make laboratory information more informative. Clin. Chem. 32(8):1510–1516, 1986.
Roederer, A., Soegaard, J., Lee, I., Wanderer, J., and Park, S., Wandering Data: A scalable, durable system for effective visualization of patient health data. In: 2014 IEEE 27th International Symposium on Computer-Based Medical Systems, IEEE, pp. 547–548, 2014.
Willard, K. E., Johnson, J. R., and Connelly, D. P., Radical improvements in the display of clinical microbiology results: A Web-based clinical information system. Am. J. Med. 101(5):541–549, 1996.
Duclos, C., Cartolano, G. L., Ghez, M., and Venot, A., Structured representation of the pharmacodynamics section of the summary of product characteristics for antibiotics: Application for automated extraction and visualization of their antimicrobial activity spectra. J. Am. Med. Inform. Assoc. 11(4):285–293, 2004.
Daye, K., Lashman, D., and Pandiani, J., Medication history display and evaluation. In: Proceedings of the Annual Symposium on Computer Application in Medical Care, American Medical Informatics Association, p. 993, 1994.
Koch, S. H., Weir, C., Westenskow, D., Gondan, M., Agutter, J., Haar, M., Liu, D., Görges, M., and Staggers, N., Evaluation of the effect of information integration in displays for ICU nurses on situation awareness and task completion time: A prospective randomized controlled study. Int. J. Med. Inform. 82(8):665–675, 2013.
Michels, P., Gravenstein, D., and Westenskow, D. R., An integrated graphic data display improves detection and identification of critical events during anesthesia. J. Clin. Monit. 13(4):249–259, 1997.
Mandel, J. C., Kreda, D. A., Mandl, K. D., Kohane, I. S., Ramoni, R. B., SMART on FHIR: a standards-based, interoperable apps platform for electronic health records. J. Am. Med. Inform. Assoc.: ocv189, 2016.
Additional information
This article is part of the Topical Collection on Systems-Level Quality Improvement