Tuesday, July 16, 2013
The Healthcare IT Applications of Google Glass
Last week I had the opportunity to test Google Glass.
It's basically an Android smartphone (without the cellular radio) capable of running Android apps, built into a pair of glasses. The small prism "screen" displays video at roughly half HD resolution. Audio output uses bone conduction, so only the wearer can hear it. A motion-sensitive accelerometer supports gestural commands, and a microphone supports voice commands. The right temple is a touch pad. It has WiFi and Bluetooth, and the battery lasts about a day per charge.
Of course, there have been parodies of the user experience, but I believe that clinicians can use Google Glass to improve quality, safety, and efficiency in a manner that is less bothersome to patients than a clinician staring at a keyboard.
Here are a few examples:
1. Meaningful Use Stage 2 for Hospitals - Electronic Medication Administration Records must include "assistive technology" to ensure that the right dose of the right medication is given via the right route to the right patient at the right time. Today, many hospitals bar code every unit dose of medication - a painful process. Imagine instead that a nurse puts on a pair of glasses and walks into the room; WiFi geolocation shows the nurse a picture of the patient in that room who should be receiving medications, then pictures of the medications themselves, one at a time. The temple touch pad could be used to scroll through the medication pictures and even to mark each as administered (a rough sketch of this lookup appears after these examples).
2. Clinical documentation - All of us are trying hard to document the clinical encounter using templates, macros, voice recognition, natural language processing, and clinical documentation improvement tools. However, our documentation models may misalign with the ways patients communicate and doctors conceptualize medical information, per Ross Koppel's excellent JAMIA article. Maybe the best clinical documentation is real-time video of the encounter, captured from the vantage point of the clinician's Google Glass, so that every audio and visual cue the clinician sees and hears is faithfully recorded.
3. Emergency Department Dashboards - Emergency physicians work in a high-stress, fast-paced environment and must be able to quickly access information, filter it for relevance, and make evidence-based decisions. Imagine that a clinician enters the room of a patient - instead of reaching for a keyboard or even an iPad, the clinician simply looks at the patient. In "tricorder"-like fashion, vital signs, triage details, and nursing documentation appear in Google Glass. Touching the temple brings up lab and radiology results. An entire ED dashboard is easily reduced to a few visual cues in Google Glass (see the dashboard-card sketch below). At BIDMC, we hope to pilot such an application this year.
4. Decision Support - All clinicians involved in resuscitation know the stress of memorizing the ACLS "code" algorithms. Imagine that a clinician responding to a cardiac arrest uses Google Glass to retrieve the appropriate decision support for the patient in question and sees a decision tree that incorporates optimal medication doses, the patient's EKG, and vital signs (a simple decision-tree sketch appears below).
5. Alerts and Reminders - Clinicians are very busy people. They have to manage email, phone calls, patients on their schedule, patients who need to be seen emergently, and data flowing from numerous clinical systems. The key to surviving the day is to transform data into information, knowledge, and wisdom. Imagine that Google Glass displays the events and issues that are most critical and require action today (alerts) alongside those that are generally good for the wellness of the patient (reminders). Having alerts and reminders at a glance lets a clinician get done what is most important (the last sketch below shows this triage step).
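To make the first example more concrete, here is a minimal sketch of the lookup a Glass app would perform as the nurse walks into a room. The room census, order structure, photo filenames, and one-hour dosing window are all assumptions for illustration; in a real deployment this data would come from the eMAR, pharmacy, and ADT systems.

```python
# Hypothetical sketch: which medication cards should appear when a nurse,
# located by WiFi geolocation, enters a given room? Data below is invented.
from datetime import datetime, timedelta

ROOM_CENSUS = {"West-214": "MRN-1001"}            # room -> patient (geolocation + ADT)

MED_ORDERS = {                                    # active orders keyed by patient
    "MRN-1001": [
        {"drug": "metoprolol 25 mg", "route": "PO",
         "due": datetime(2013, 7, 16, 9, 0), "photo": "metoprolol_25mg.jpg"},
    ],
}

def meds_due(room, now, window_minutes=60):
    """Return the medication cards a nurse should see on entering a room."""
    patient = ROOM_CENSUS.get(room)
    if patient is None:
        return []
    window = timedelta(minutes=window_minutes)
    return [order for order in MED_ORDERS.get(patient, [])
            if abs(order["due"] - now) <= window]

if __name__ == "__main__":
    for card in meds_due("West-214", datetime(2013, 7, 16, 8, 45)):
        print("Show card:", card["drug"], "via", card["route"], "photo:", card["photo"])
```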
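For the ED dashboard example, here is a sketch of how a full dashboard record might be collapsed into the few cues that fit on a Glass card. The field names and the sample record are assumptions; the point is how little needs to be shown at first glance, with deeper detail left one temple-tap away.

```python
# Hypothetical sketch: reduce an ED dashboard row to a single glanceable card.
def ed_card(record):
    """Collapse a full ED dashboard record into the few cues shown in Glass."""
    vitals = record["vitals"]
    return {
        "headline": f'{record["name"]}  ESI {record["triage_level"]}',
        "vitals": f'HR {vitals["hr"]}  BP {vitals["sbp"]}/{vitals["dbp"]}  SpO2 {vitals["spo2"]}%',
        "chief_complaint": record["chief_complaint"],
        "more": "tap temple for labs and radiology",   # second-level detail on demand
    }

if __name__ == "__main__":
    sample = {
        "name": "Doe, J.", "triage_level": 2, "chief_complaint": "chest pain",
        "vitals": {"hr": 104, "sbp": 98, "dbp": 60, "spo2": 94},
    }
    for line in ed_card(sample).values():
        print(line)
```

Keeping the card to a headline, one line of vitals, and a chief complaint mirrors the glanceable constraint of the form factor.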
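For the decision support example, a code-blue aid could be as simple as a small tree keyed by the rhythm the team identifies. The steps listed here are placeholders for illustration, not clinical guidance; real content would come from the institution's approved ACLS algorithms.

```python
# Hypothetical sketch of a resuscitation decision aid. Steps are placeholders.
ACLS_TREE = {
    "VF/pVT": [
        "Defibrillate, then resume CPR immediately",
        "Epinephrine every 3-5 min (per protocol dose)",
        "Consider antiarrhythmic after refractory shocks",
    ],
    "asystole/PEA": [
        "High-quality CPR, no shock indicated",
        "Epinephrine every 3-5 min (per protocol dose)",
        "Search for reversible causes (Hs and Ts)",
    ],
}

def next_steps(rhythm, patient_context):
    """Return the checklist Glass should display for the identified rhythm."""
    steps = ACLS_TREE.get(rhythm, ["Rhythm not recognized - show full algorithm"])
    header = "{rhythm} | {weight_kg} kg | last EKG {last_ekg}".format(**patient_context)
    return [header] + steps

if __name__ == "__main__":
    context = {"rhythm": "VF/pVT", "weight_kg": 82, "last_ekg": "08:41"}
    for line in next_steps("VF/pVT", context):
        print(line)
```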
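Finally, for alerts and reminders, a sketch of the triage step: split the day's events into alerts that require action today and reminders that are good for wellness, and surface only the top few in Glass. The event fields and the cutoff of three items are assumptions.

```python
# Hypothetical sketch: turn a clinician's inbox into a short, prioritized display.
def triage_events(events, max_shown=3):
    """Separate alerts from reminders and keep the Glass display short."""
    alerts = sorted((e for e in events if e["requires_action_today"]),
                    key=lambda e: e["severity"], reverse=True)
    reminders = [e for e in events if not e["requires_action_today"]]
    return alerts[:max_shown], reminders[:max_shown]

if __name__ == "__main__":
    inbox = [
        {"text": "Critical potassium result on Ms. R", "requires_action_today": True,  "severity": 3},
        {"text": "Mr. T overdue for flu vaccine",      "requires_action_today": False, "severity": 1},
        {"text": "New ED consult request",             "requires_action_today": True,  "severity": 2},
    ]
    alerts, reminders = triage_events(inbox)
    print("ALERTS:", [a["text"] for a in alerts])
    print("REMINDERS:", [r["text"] for r in reminders])
```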
Just as the iPad has become the chosen form factor for clinicians today, I can definitely see a day when computing devices are more integrated into the clothing or body of the clinician. My experience with Google Glass helps me understand why Apple just hired the CEO of Yves Saint Laurent to work on special projects.
Ten years ago, no one could imagine a world in which everyone walked around carrying a smartphone. Although Google Glass may make the wearer appear a bit Borg-like, it's highly likely that computing built into the items we wear will seem entirely normal soon.
I will report back on our Google Glass experiments as they unfold.