The practice of healthcare in the U.S. has resulted in patients spending less time speaking with their physicians and other professionals. According to a presentation by Nick van Terheyden, Chief Medical Information Officer with Nuance, about three quarters of patients spend less than 20 minutes with a doctor during an office visit. As a result, patients feel rushed and often leave without adequate time to discuss their situation and get answers to their questions.
van Terheyden sees technology as providing part of the solution. He cited research indicating that patients are actually more willing to disclose personal details to an automated system than to a human. When asked to respond to health-screening questions, many patients were more comfortable revealing information when they believed they were responding to a computer rather than to a person.
This has important implications for Health Tech devices, especially wearables. As van Terheyden points out, small wearable and handheld devices can be difficult to interact with when typing is the only way to enter information. A natural language interface can be much easier to use. With a virtual assistant app on a smartphone or wearable device, the user can log information, respond to requests for data, and ask about past or current behaviors. (“How far have I walked today?” “How many calories have I had so far?”) This accessible way to record and access data can help shape user behavior. Speech recognition can be easier and faster than typing or other touch input, and with a trigger word to “wake” the device, a spoken user interface can save power and extend battery life. Text-to-speech output, in turn, can be easier for users than trying to read a tiny screen.
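The query-and-response pattern described above can be sketched as a minimal intent handler. Everything here is illustrative (the log data, function name, and keyword matching are assumptions, not any vendor's API), and it presumes the speech has already been transcribed to text by a recognition engine:

```python
# Hypothetical sketch of a keyword-based intent handler for a voice
# health assistant. Data values and matching rules are illustrative;
# a real wearable would draw on live sensor data and a trained
# natural-language model rather than simple keyword checks.

DAILY_LOG = {"distance_km": 4.2, "calories": 1350}  # sample logged data

def handle_query(utterance: str) -> str:
    """Map a transcribed spoken query to a spoken-style response."""
    text = utterance.lower()
    if "walked" in text or "steps" in text:
        return f"You have walked {DAILY_LOG['distance_km']} kilometers today."
    if "calories" in text:
        return f"You have had {DAILY_LOG['calories']} calories so far."
    return "Sorry, I didn't understand that."
```

For example, `handle_query("How far have I walked today?")` would answer with the logged distance, and the response string could then be passed to a text-to-speech engine instead of being shown on a tiny screen.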
Speech recognition may end up playing a major role in wearable Health Tech device user interfaces.