This research examines the capabilities and boundaries of a hands-free mobile augmented reality (AR) system for distributed healthcare. We used a developer version of the Google Glass™ head-mounted display (HMD) to build software applications that enable remote connectivity in the healthcare field, and to characterize system usage, data integration, and data visualization capabilities.
In this chapter, we summarize findings from the assessment of the SnapCap System for chronic wound photography. By leveraging the sensor capabilities of Google Glass, SnapCap enables hands-free digital image capture, and the tagging and transfer of images to a patient’s electronic medical record (EMR). In a pilot study with wound care nurses at Stanford Hospital (n = 16), we examined feature preferences for hands-free digital image capture and documentation, and compared SnapCap to the state of the art in digital wound care photography, the iPhone-based Epic Haiku application.
The results of this study (1) illustrate the application of design thinking to healthcare delivery involving mobile wearable computing technology for distributed care, (2) improve our understanding of the benefits of human augmentation through enhanced visualization capabilities, and (3) explore a system’s ability to influence behavior change by equipping clinicians with tools to improve complex problem solving and clinical decision-making in context-dependent medical scenarios. The work contributes to the future implementation of new features aimed at enhancing the documentation and assessment of chronic wounds, and provides insight into the need for future IT systems engineering projects aimed at enhancing healthcare connectivity for distributed care.
- Augmented Reality
- Pressure Ulcer
- Image Capture
- Wound Care
- Word Error Rate