Conversational AI technology is spurring a wave of innovation between tech companies and healthcare organizations.
This article appears in the July/August 2021 edition of HealthLeaders magazine.
Over the last 17 months, a tsunami called telehealth revolutionized the healthcare industry, becoming an "overnight" sensation due to a global pandemic that forever changed the way healthcare is delivered. Meanwhile, in the background, another remarkable development is quietly gaining traction. The whisper many are hearing is the power of voice technology, or to be more accurate, conversational artificial intelligence (AI). While still in its infancy, it holds the potential to deliver the next significant wave of innovation in healthcare.
"Conversational AI technology allows people to use natural voice or text to interact with systems," says Brian Kalis, managing director, health strategy at Accenture. "There's been a growing trend of artificial intelligence moving beyond a back-end tool for the healthcare enterprise to the forefront of the clinician and consumer experience."
This trend has the potential to reduce the administrative burden on clinicians, improve clinician-patient interactions, and reduce financial pressure on healthcare enterprises, Kalis explains. It is also part of an emerging movement toward industry-specific clouds, which provide a set of cloud services, tools, and applications tailored to industries like healthcare.
It's also big business. "The healthcare conversational AI market is growing at a 25%–35% compound annual rate," Kalis says. Companies like Microsoft, Amazon Web Services (AWS), and Google Cloud are investing heavily in technologies specifically designed for healthcare and partnering with health systems to pilot their innovations. There are dozens of other companies gaining traction in this space, carving out specialized niches. The biggest obstacle to overcome is a need for near absolute accuracy in healthcare, which presents tremendous challenges with accents, dialects, and clinical language that varies by specialty.
The solutions being developed involve an array of jargon and technologies. It's important to understand two underlying concepts: "Natural language processing (NLP) is how we interpret human text; natural language generation (NLG) is how we create it," says Jeff Becker, MBA, principal analyst, healthcare at CB Insights.
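The distinction can be made concrete with a toy sketch (production systems use statistical models, not pattern matching, and nothing below reflects any vendor's actual tooling): NLP extracts structure from free text, while NLG renders structure back into text.

```python
import re

# NLP (toy): interpret free text by extracting structured fields.
def parse_vitals(text: str) -> dict:
    """Pull blood pressure and heart rate out of a dictated sentence."""
    bp = re.search(r"(\d{2,3})\s*over\s*(\d{2,3})", text)
    hr = re.search(r"heart rate (?:of |is )?(\d{2,3})", text)
    return {
        "systolic": int(bp.group(1)) if bp else None,
        "diastolic": int(bp.group(2)) if bp else None,
        "heart_rate": int(hr.group(1)) if hr else None,
    }

# NLG (toy): generate readable text from structured data.
def render_vitals(vitals: dict) -> str:
    return (f"BP {vitals['systolic']}/{vitals['diastolic']} mmHg, "
            f"HR {vitals['heart_rate']} bpm.")

parsed = parse_vitals("Blood pressure is 120 over 80, heart rate of 72.")
print(render_vitals(parsed))  # BP 120/80 mmHg, HR 72 bpm.
```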
For the uninitiated, imagine having the conversational functionality of a digital assistant like Amazon Alexa that not only understands medical language, but also can respond, record, transcribe, translate, and interact with the electronic health record (EHR). With patient permission, the technology could be embedded in exam rooms, the OR, call centers, the patient bedside, and the patient's home.
These use cases are not farfetched; many are already being tested, and HealthLeaders talked to the companies and health systems that are piloting these initiatives.
WellSpan and Nuance/Microsoft: Holding the patient's hand instead of a mouse
With roots that extend back more than 20 years, Nuance Communications is the elder statesman of conversational AI for healthcare. The company is recognized as the global market leader in medical transcription software, according to Fortune Business Insights. Nuance's flagship product, Dragon Medical One, is used by about 55% of clinicians to document directly into the EHR through dictation, according to Pete Durlach, Nuance's executive vice president and chief strategy officer.
About five years ago, the company decided "to really disrupt ourselves and create that next-generation technology where the clinician is not always just explicitly dictating what they want; the system's actually listening and then turning that multi-party conversation into a clinical note," Durlach recalls. Thus began the journey to what Nuance calls ambient clinical intelligence (ACI).
The market is ripe for such solutions, says Becker, because of the potential to reduce clinical documentation workloads. "The general aim," he says, is to "shift the amount of time clinicians are spending on their computers and replacing that either with more patient care time or less overall time working."
The company's ACI solution is known as Dragon Ambient eXperience, or DAX. This innovation, coupled with deep experience training its systems on the nuances of medical language in numerous specialties, was part of the allure that led to Microsoft's April announcement that it will acquire Nuance this year, Becker said at the time. The $19.7 billion deal is expected to enhance offerings available through Microsoft Cloud for Healthcare.
R. Hal Baker, MD, senior vice president and chief digital and chief information officer at WellSpan, an eight-hospital system headquartered in York, Pennsylvania, is among the early adopters of DAX. He oversees a pilot of the technology through the health system's Innovation Center, using it himself in his internal medicine practice.
"Very quickly, a lot of us were amazed at how good a note came out of using DAX," Baker says. "Especially in those complicated conversations where you're dealing with a spouse who's got dementia or medical problems and another spouse who's telling the story. Trying to weave that into a cohesive medical narrative that's appropriate for documentation and billing is a pretty tall order. We've all been remarkably impressed with how all that comes off."
The ability to distinguish voices and medical language is essential, says Baker, who explains that the technology can differentiate between cabbage and CABG (coronary artery bypass graft surgery), for example.
Yet there was another unexpected advantage that helped change the dynamics between the physician and his patients. "What I underappreciated was how much of my attention was going into note-writing," says Baker. "We were kind of mentally keeping that note in our head because we had to dictate it at the end of the visit. Just freeing myself up to pay attention to the patient has been a wonderful relief."
While WellSpan is in the preliminary stages of rolling out the technology, early results are promising, says Baker.
• Family practitioners who were high utilizers of DAX saved an estimated 29–144 hours per year compared to the way they formerly documented clinical notes.
• A patient satisfaction survey indicated 97% agreed that physicians using the technology were more focused, personable, and engaged.
• The technology also reduced the length of visits by about 9%, Baker estimates.
"We're hoping it's going to be an effective tool in the battle of burnout," says Baker. "It lets us get back to what we love about medicine, which is being focused on the patient. When you can take your hand off the mouse and hold a patient's hand, that's a nice thing."
Durlach cautions that DAX is still in its infancy, but the vision for the future is immense. Currently, it produces more than a clinical note; it's also driving a lot of the structured data, he says. Down the road, it will do more.
"To reach its peak, [ambient clinical technology] has to be tightly integrated to the EHR," says Durlach.
Baker points out that WellSpan's EHR vendor, Epic Systems, "is really leaning into this. We've got three companies, WellSpan, Epic, and Nuance, all working together to try to change the care environment for patients so that it gets back to where it's just between two people, and the technology kind of fades into the background. This is one of the most exciting things I've ever worked on."
"At the end of the day," says Durlach, "we expect this to be a real-time interaction inside the EHR, so that as you're documenting … it triggers real-time decision support." When the doctor is done with a visit, not only will the clinical note be done, but also orders will be queued up, coding will be complete, and extractions of data will be performed.
The vision extends even further, however, with sights set on the inpatient environment "because we want to blanket all the different workflows in healthcare," says Durlach. In addition, through its virtual agent technology, Nuance plans to explore ways to enhance digital front door and consumer engagement strategies in healthcare, whether someone is calling in over the phone to book an appointment or needs something while lying in a hospital bed.
Another possible development includes opening up its infrastructure for others to conduct research into voice biomarkers, which interpret signals in spoken language that can be used to diagnose or predict medical conditions and diseases. "We process hundreds of thousands of verbal interactions between patients and their physicians," says Durlach. "What else could be done that could not only help the provider document the note, but actually help drive better care by looking at voice biomarkers?"
Houston Methodist and AWS: Hands off the computer in the OR
Executives at Houston Methodist Center for Innovation long suspected "clinical voice technology" might play a key role in disrupting the healthcare industry, says Roberta Schwartz, PhD, MHS, executive vice president and chief innovation officer at Houston Methodist Hospital. Through a partnership with AWS that began with a conversation in 2018, the Houston-based academic medical center is now exploring the potential this technology holds for use in the OR, as well as ambulatory patient exam rooms.
"With over 1.3 million clinical visits and more than 89,000 surgeries per year, Houston Methodist was interested in using automatic speech recognition technology to create contactless solutions to improve patient experiences, while also enabling clinicians to interact seamlessly with clinical applications," says Phoebe Yang, JD, general manager, healthcare, at AWS. "Voice technologies help to convert time-consuming, labor-intensive, and often inefficient tasks and functions into actionable items."
Usage in the OR is akin to using a digital assistant like Amazon's Alexa that enables the surgeon or other staff to use voice commands to interact with the Epic EHR and other clinical applications, explains Josh Sol, MBA, administrative director of ambulatory innovation at Houston Methodist Hospital. A surgeon asks the computer to start the case. Once activated, the system also lets staff set timers for alerts that are vital for tasks such as antibiotic administration or tourniquet thresholds. Rather than clicking commands and entering data with a keyboard and a mouse, the spoken word controls computer interactions.
Audio is captured using microphones in the OR, and that data is routed to the cloud for processing, explains Yang. "The short audio clips are sent to Amazon Lex—a service for building conversational interfaces into any application using voice and text—so that the clinician's commands can be fulfilled by Amazon Lambda, a serverless compute service."
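The command-fulfillment loop Yang describes can be sketched in miniature. The intent names and handlers below are hypothetical stand-ins for what Lex would resolve and Lambda would execute; a real deployment calls the AWS SDKs rather than local dictionaries.

```python
# Toy sketch of a voice-command fulfillment loop: an NLU layer
# (Amazon Lex, in Houston Methodist's case) maps an utterance to an
# intent, and a fulfillment function (Amazon Lambda) carries it out.
# All intents and handlers here are hypothetical.

def start_case() -> str:
    return "Case started; OR record opened."

def set_timer(minutes: int) -> str:
    return f"Timer set for {minutes} minutes."

# Stand-in for Lex intent resolution over a transcribed utterance.
def resolve_intent(utterance: str):
    text = utterance.lower()
    if "start the case" in text:
        return ("StartCase", {})
    if "timer" in text:
        minutes = int("".join(ch for ch in text if ch.isdigit()) or 0)
        return ("SetTimer", {"minutes": minutes})
    return ("Fallback", {})

# Stand-in for Lambda fulfillment, keyed by intent name.
HANDLERS = {
    "StartCase": lambda slots: start_case(),
    "SetTimer": lambda slots: set_timer(slots["minutes"]),
    "Fallback": lambda slots: "Sorry, I didn't catch that.",
}

def fulfill(utterance: str) -> str:
    intent, slots = resolve_intent(utterance)
    return HANDLERS[intent](slots)

print(fulfill("Computer, start the case"))    # Case started; OR record opened.
print(fulfill("Set a timer for 30 minutes"))  # Timer set for 30 minutes.
```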
"Our physicians have been looking for a way not to be staring at a computer or staring at somebody who's staring at a computer and to go back to a conversational mode," says Schwartz. The potential this solution holds is "kind of nirvana," she says.
The technology is still in very early stages, with initial testing occurring in a simulation center.
In addition to the OR system, Houston Methodist is also working with AWS to develop "ambient listening technology" for patient exam rooms. The solution shares many characteristics with the DAX system WellSpan is piloting. Working with IT consulting firm Pariveda, Houston Methodist created an application that captures dialogue between clinicians and patients after receiving patient consent to record, Yang says. The system uses in-room microphones controlled by a clinician's smartphone, tablet, or computer, which securely transfers the conversation to the cloud for processing, where it is transcribed and indexed.
Part of the solution involves using Amazon Comprehend Medical, a HIPAA-eligible NLP service, to "parse the medical terminology," Yang says. "After the visit, a summary note of the interaction is automatically generated in real time and emailed to the patient, as well as inserted into the clinician's EHR inbox as a SOAP (subjective, objective, assessment, and plan) note for review. Furthermore, since clinically relevant data from the exam is now indexed and searchable, Houston Methodist is able to automatically insert relevant data points captured during the visit into discrete fields of its EHR."
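To illustrate how parsed entities could be routed into a SOAP note: Amazon Comprehend Medical returns entities with fields such as `Text` and `Category`, and those categories can be mapped to note sections. The mapping below is a hypothetical simplification for illustration, not Houston Methodist's actual logic.

```python
# Hypothetical mapping from entity categories (in the shape Amazon
# Comprehend Medical returns them) to sections of a SOAP note.
SECTION_FOR_CATEGORY = {
    "MEDICAL_CONDITION": "Assessment",
    "MEDICATION": "Plan",
    "TEST_TREATMENT_PROCEDURE": "Objective",
}

def build_soap_skeleton(entities: list) -> dict:
    """Bucket extracted entities into SOAP sections for clinician review."""
    note = {"Subjective": [], "Objective": [], "Assessment": [], "Plan": []}
    for ent in entities:
        section = SECTION_FOR_CATEGORY.get(ent["Category"], "Subjective")
        note[section].append(ent["Text"])
    return note

# Example entities, in the shape the NLP service would produce.
entities = [
    {"Text": "chest pain", "Category": "MEDICAL_CONDITION", "Score": 0.98},
    {"Text": "EKG", "Category": "TEST_TREATMENT_PROCEDURE", "Score": 0.95},
    {"Text": "aspirin", "Category": "MEDICATION", "Score": 0.99},
]
print(build_soap_skeleton(entities))
```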
Capturing the conversation accurately is essential, says Sol, and live simulations will help build greater accuracy into the model. To gauge effectiveness, the system generates a confidence score for each physician-patient interaction, indicating how certain it is that the conversation was captured correctly.
"Our first focus is going to be family practice [and] internal medicine," says Sol, who explains that the partnership with AWS will help the tech company expand its Comprehend Medical platform.
While the technology is still in very early stages, Schwartz cautions that there are other factors to consider before these solutions will be in general use. "There's [not] a magic box that you open and it's available to you," she says. "It just doesn't work that way. There's always a technological aspect and a cultural aspect of bringing your doctors into the conversation of making a change. Even though they may hate the electronic medical record, they are used to it. Culturally, switching to a new way [of doing things] takes time. You have to go into these changes with a very strong stomach and a huge level of patience."
Mandy Roth is the innovations editor at HealthLeaders.
Photo credit: Illustration by Fabio Consoli
Conversational AI technology has the potential to reduce physicians' administrative demands and create greater patient engagement.
Voice tech uses are being explored in the exam room, OR, and patients' homes.
The ability to accurately capture medical language and produce searchable fields in the EHR is crucial to progress.