As health systems like Mercy look to use ambient AI in inpatient settings, they’re finding that nurses have a different way of delivering care – and talking about it.
Ambient AI for nurses isn’t the same as ambient AI for doctors.
That’s an important point to factor into ambient listening programs as healthcare organizations look to integrate them into inpatient care. No longer are we talking about just the doctor-patient encounter.
"Nurses document entirely different from how the physicians document," notes Cheryl Denison, BSN, RN, NI-BC, Principal Clinical Business Solution Analyst and Integration Director of Clinical Applications at Mercy.
The St. Louis-based 44-hospital health system is one of a handful piloting Microsoft’s Dragon Copilot AI clinical assistant in the nursing space, targeting an area where nurses spend about a quarter of their time on documentation. And while the benefits to patient care and to reducing nurse stress and burnout may be easy to identify, it’s not as simple as turning on an app and telling nurses to talk.
Understanding How Nurses Speak
"The nurse-patient relationship is intimate," says Stephanie Clements, Mercy’s SVP and Chief Nurse Executive. "That ability to be hands-on is why most of us chose to go into nursing in the first place."

Stephanie Clements, SVP and Chief Nurse Executive at Mercy. Photo courtesy Mercy.
In other words, nurses are focused on patient care and interactions, and their first thought isn’t to speak about what they’re doing so that it can be recorded into the medical record. In many cases that’s an entirely different way of thinking and doing.
With this technology, "we have to speak," says Denison. "So [patients] may feel like we're talking more because we're saying our assessments out loud more than … previously. When we assess, we don't always speak those things out loud. We kind of keep them in our head. Whereas to make this work, you actually have to … talk about them out loud."
The key to success, then, is helping nurses understand how best to work with an ambient AI platform. That means focusing on change management, as well as working with nurses who might not be comfortable at first talking so much about what they’re doing.
"This works the best the more you speak, and it doesn't even have to be directly about your assessment," Denison says. "It can be about the just the interactions that you're having with the patient. Because that's the invisible work that nurses do, besides the assessment, that really helps the patient through their stay in the hospital."
Denison and Clements say it’s taken Mercy about a year to prepare Dragon Copilot for use by nurses, and a lot of that time was spent understanding how nurse communication and workflows can best be captured by an ambient AI tool. While nurses might be able to recite accurately what they’re doing at the bedside, in some cases there’s value in what is said, which words are used and how certain actions are described, so that everything is captured for the medical record.
"The nurse might just say ‘Pulses are palpable,’" Denison says. "OK, well, when the AI listens to that, it doesn't know which pulse you’re describing. So it’s going to put all pulses in because you didn’t define them. … So next time when they speak, they might want to say ‘OK, your feet pulses or your pedal pulses are palpable.’ Because the Ai can’t see what you’re doing."
"Not that we're trying to make you say something differently, just be a little more clear," she adds.

Cheryl Denison, BSN, RN, NI-BC, Principal Clinical Business Solution Analyst and Integration Director of Clinical Applications at Mercy. Photo courtesy Mercy.
As with any AI tool, Mercy requires that all notes transcribed by AI be reviewed by the nurse for accuracy. Both Denison and Clements say the results so far have been very good, and that nurses are learning to speak more clearly and accurately by reviewing those notes.
‘They Feel Like We’re Talking More to Them’
This, in turn, is creating better nurses’ notes in flow sheets, and giving doctors better data and insights on their patients. But more importantly, as with doctors, the AI tool is giving nurses more time in front of patients and less in front of a computer. And patients are noticing.
"The patients really love it because we are spending more time [with them] and they feel like we're talking more to them," Denison says.
"It's very similar to when you go to a restaurant and the waiter or waitress will [recite] your order back to you. You know you've been heard. And so that's really where we're seeing some impact on the patient satisfaction scores."
To gauge the effectiveness of the tool, Denison says they’re tracking incidental overtime, which is usually the extra time spent by nurses at the end of their shift adding data into the flow sheets. They’re also measuring time spent documenting, time spent in flow sheets, and flow sheet latency.
Not to mention nurse stress, as measured through regular nurse surveys. Joe Schmitz, Mercy’s Executive Director of Optimization, says roughly two-thirds of nurses surveyed are reporting the ambient AI tool has helped them and reduced stress and burnout.
"We've actually seen better adoption rates from some of our experienced nurses than some from some of our newer nurses, just from that care out loud perspective," he adds. "They just really dove into the product and have done really well with it."
Eric Wicklund is the senior editor for technology at HealthLeaders.
KEY TAKEAWAYS
Healthcare organizations are expanding ambient AI platforms to help nurses with their workflows – and finding that the nurse-patient encounter is far different from the doctor-patient encounter.
Nurses work a lot more with their hands, and many aren’t used to talking out loud or describing what they’re doing.
Healthcare leaders need to spend time working with nurses to both understand their workflows and make sure they’re comfortable with an ambient AI platform. This means giving nurses the time and space to learn how to describe their work so that it’s entered accurately into the medical record.