Electronic health records (EHRs) promise improved patient care coordination among healthcare professionals. Long-touted benefits of EHRs include faster and more accurate diagnosis, more effective treatments, and lower healthcare costs. Physicians and other clinicians, however, often push back, citing the need to look at device screens to reference and enter data at the cost of time spent engaging with patients.

A team at Vanderbilt University Medical Center (VUMC) is developing an AI-empowered voice assistant to help medical personnel interact with EHRs without harming doctor-patient relationships. EVA, the EHR Voice Assistant, will employ Nuance AI technology to interpret voice requests and work with Epic-based EHR systems. For example, if a clinician asks, “What was the last sodium?” the system will transcribe the request, pull up the most recent test result, and indicate where that result falls relative to the range of possible values. If a physician can simply ask the question rather than navigating a series of screens to find the right button or type in a request, the time saved can be spent with the patient or on other critical tasks.
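To make the pipeline concrete, here is a minimal sketch of the kind of request flow described above: a spoken question is transcribed, mapped to a lab code, and answered from the record with its reference range. This is not EVA's actual implementation; the Nuance and Epic integrations are not shown, the FHIR base URL and patient ID are placeholders, and the `transcribe` stub and `QUERY_CODES` mapping are hypothetical stand-ins for the speech-recognition and language-understanding steps.

```python
import requests

# Hypothetical FHIR endpoint; a real deployment would point at the
# institution's EHR integration layer and handle authentication.
FHIR_BASE = "https://fhir.example.org/r4"

# Stand-in for the natural-language-understanding step: map a spoken
# phrase to a LOINC code (2951-2 = sodium in serum or plasma).
QUERY_CODES = {"sodium": "2951-2"}


def transcribe(audio_bytes: bytes) -> str:
    """Placeholder for a speech-to-text service returning the spoken request as text."""
    raise NotImplementedError("Plug in a real ASR service here.")


def latest_result(patient_id: str, phrase: str) -> str:
    """Fetch the most recent lab Observation matching a spoken phrase."""
    code = next((c for term, c in QUERY_CODES.items() if term in phrase.lower()), None)
    if code is None:
        return "Sorry, I don't recognize that lab test."

    # Standard FHIR Observation search: newest result first, limited to one entry.
    resp = requests.get(
        f"{FHIR_BASE}/Observation",
        params={
            "patient": patient_id,
            "code": f"http://loinc.org|{code}",
            "_sort": "-date",
            "_count": 1,
        },
        timeout=10,
    )
    resp.raise_for_status()
    entries = resp.json().get("entry", [])
    if not entries:
        return "No results on file."

    obs = entries[0]["resource"]
    value = obs["valueQuantity"]
    answer = f"Last result: {value['value']} {value.get('unit', '')}".strip()

    # Report where the value sits relative to the reference range, if provided.
    ref = obs.get("referenceRange", [{}])[0]
    low = ref.get("low", {}).get("value")
    high = ref.get("high", {}).get("value")
    if low is not None and high is not None:
        answer += f" (reference range {low}-{high})"
    return answer


if __name__ == "__main__":
    # Text stands in for the transcription step in this sketch.
    print(latest_result("example-patient-id", "What was the last sodium?"))
```

The point of the sketch is the division of labor: speech recognition turns audio into text, a small understanding layer turns text into a structured query, and the EHR's data interface returns the value along with enough context (units, reference range) to give a clinically meaningful answer.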

Today the EVA project provides answers in text format, but the next step is to add voice reporting: giving EVA the ability to speak as well as listen. The current stage of the project focuses on clinical staff engagement with EVA, but the developers plan to build in patient engagement as well. The result should be easier access to EHR information.