Healthcare organizations are using digital health technology to help doctors and nurses communicate with patients who speak different languages, aren't comfortable using English, or have other communication challenges.
With more than 800 languages spoken in the New York City area, communication barriers are an everyday reality. And nowhere are they more dangerous than in a healthcare facility, where an incorrect translation can affect clinical outcomes.
Healthcare organizations are turning to technology to address that challenge, with partnerships and digital health platforms that enable care teams to access interpreters in real time.
"We deal with a melting pot as far as diversity goes," says Kerry Donohue, MSN, RN, manager of patient experience and culture leader at Manhattan Eye, Ear, and Throat Hospital (MEETH), a division of Northwell Health's Lenox Hill Hospital. "Every day, I'd say one out of every five patients [speaks a language other than English}, and it can be challenging."
When confronted with a patient speaking Farsi, Romansh, Mandarin, or any other language, the traditional tactic was to look for a multilingual family member, grab the nearest staff member who happened to speak that language—at least that's how it worked on St. Elsewhere—or pick up a phone, call the hospital's translation service, and hope it had an interpreter on hand who knew that language.
Digital health technology has made that process easier. Care teams can now use a smartphone or tablet to connect through an mHealth app with an interpreter in real time, even by video, on a platform that specializes in translation services. MEETH, for instance, uses LanguageLine services on tablets provided by Equiva Health, a digital health patient engagement company based in New York.
"It's like FaceTime," says Donohue. "You're connected with someone who knows the language."
Making sure patient and provider are speaking the same language is critical in healthcare, and it goes far beyond patient engagement. Doctors and nurses need to know not only exactly what happened and how a patient is feeling, but also that their questions, diagnoses, and care plans are understood. Something lost in translation could mean a missed symptom pointing to a more serious health issue, or a misunderstood prescription or treatment plan whose consequences could be harmful, even fatal.
"It's really not best practice to use a fellow clinician or a family member as a translator," Donohue says, noting that a trained medical interpreter can pick up nuances in both language and clinical terms that others might miss. In addition, this resource means providers don't have to pull in colleagues to help with translation, interrupting other workflows and affecting patient care.
The language barrier isn't just in New York City, either. From Maine to Hawaii, in communities and healthcare sites large and small, the chance of coming across someone who speaks a different language—and who may not speak English at all—has grown. And with the advent of telehealth, more hospitals are engaging in virtual care with patients and other providers in different parts of the world.
In Boston, Brigham and Women's Hospital is testing a device-agnostic website and app called CardMedic, designed to tackle both language and communication barriers, including visual, hearing and cognitive impairment.
"You need as many tools as you can get to help communicate with patients," says Andrew Marshall, MD, an emergency medicine physician. "Clinical questions don’t always fit well into a box, and interpreters aren't always available."
Marshall sees the technology addressing a key social determinant of health that affects care for a wide array of underserved populations. Someone who is uncomfortable talking to a care provider in another language, or who has trouble communicating, might delay a trip to a clinic or hospital or skip the visit altogether. Or that person might leave a hospital or doctor's office visit with lingering questions about what was said.
"Brigham and Women's has a robust interpreter service, but you need to make sure" that every word is understood correctly, he says. That might mean using sign language, or providing visual cues or a vocabulary for someone with cognitive issues.
"God forbid you end up having to use Google Translate" to explain the intricacies of diabetes or a heart condition, he adds.
Equiva and CardMedic are part of a wave of innovative ideas aimed at tackling communication barriers in healthcare. Beyond apps and websites that handle interpretation, there's ongoing research into natural language processing (NLP) and voice-activated technology—imagine Alexa handling these tasks in an ER or doctor's office. Other ideas include robots, avatars, and wearables, including smartglasses and hearing aids, that can handle translation.
"At the end of the day you're making physicians into better physicians," says Marshall.
Eric Wicklund is the associate content manager and senior editor for Innovation at HealthLeaders.
KEY TAKEAWAYS
Healthcare organizations are increasingly facing challenges communicating with patients who speak another language, don't fully understand English, or have cognitive issues that affect communication.
This communication barrier not only hinders providers from understanding a patient's health concern, but could also affect how a patient understands a diagnosis or treatment plan.
A new wave of innovation aims to address this challenge through apps and platforms that link providers in real time to qualified medical interpreters, as well as robots, wearables, and other technology aimed at helping people communicate.