Is Your AI Scribe HIPAA-Compliant?

Analysis  |  By Eric Wicklund  
   July 23, 2025

Ambient AI technology is great for capturing the doctor-patient conversation, but are healthcare executives making sure that data is protected?

Healthcare executives who are letting their doctors use AI scribes to capture and code patient encounters need to be careful they aren’t exposing or misusing protected health information.

Ambient AI technology, described as the “digital sidekick of modern healthcare,” is quickly gaining favor among health systems, hospitals and payers looking to reduce the administrative burden on providers and accurately capture the doctor-patient visit. But those tools, which can easily be downloaded onto a smartphone, could be running afoul of HIPAA.

“Technically, it’s a third party listening into the conversation,” says Aaron Maguregui, a partner with the Foley & Lardner law firm who specializes in AI and healthcare technology.

An AI scribe, he says, is essentially a service provider, so the healthcare provider using that app, as a covered entity, would need a Business Associate Agreement (BAA).

The challenge is particularly acute at this point in the AI cycle, when vendors are flooding the market with their own products – some from companies new to the healthcare industry and unfamiliar with or ignorant of the regulatory requirements. In many cases these apps can be downloaded by providers and put to use almost immediately.

That’s a nightmare for healthcare leaders trying to keep track of what their doctors are using. There are plenty of stories of CIOs and CEOs learning that a doctor in one of their hospitals or clinics is using ChatGPT or some other product on their own.

“There might be some (doctors) that are already enjoying the scribe and you just don't know about it,” Maguregui, who also chairs the American Telemedicine Association’s Artificial Intelligence Committee, points out. “Technically you could have an unauthorized disclosure of PHI.”

Maguregui recently authored a blog on the Foley & Lardner website with Jennifer Hennessy, a data privacy and security attorney with the law firm, on the most common mistakes that healthcare executives make in managing AI scribes. Those pitfalls, he says, can be grouped into two basic issues: data use rights and patient consent.

Issues around data use are particularly critical, and point to an intriguing catch-22 in the healthcare space. AI needs access to better data to learn and improve, and vendors often will ask for more data with which to train and improve their products. But healthcare leaders are notoriously stingy in granting access to that data.

“It's always interesting to me that the knee-jerk reaction is we don't want you to train AI on our data. And that to me is backwards thinking,” Maguregui says. “You want the technology to be efficient and accurate, but you don’t want it to use your data. Without that you don’t get the full value of AI.”

That’s especially challenging, he says, with tech companies that are new to healthcare, bringing in ideas from other industries.

“There are some really cool stuff out there, but maybe this is their first foray into healthcare and they don't understand that, yes, they can very much ingest all the data that they believe they're allowed to ingest, and then?” he asks. “The output would somehow not be theirs to be able to use, to train their products. That's a very foreign concept to the tech world. … [Those] data use cases, data use rights, those end up being a pretty sticky subject.”

On the other hand, Maguregui points out, using AI in clinical care means training the technology on the best data available – including protected health information.

“Specificity counts, and specificity is what we're looking to get to with respect to AI,” he says. “We want AI to give specific answers. We want it to be nuanced. Those nuances are going to have to at some point start to take into account identifiable information in order to glean cohorts and cultural differences and social determinants of health, things that we probably want to learn. We want to understand these concepts. But we also need to make sure that we're being cautious with people's privacy rights.”

And that’s where the second hang-up with AI scribes comes into play. By using a third-party app to record their patient encounters, providers need to secure the patient’s permission to be recorded, a requirement included in federal wiretapping laws.

Maguregui says providers need to understand that getting a patient’s permission to record their encounter has to be a part of the workflow. And that may be fine in the doctor’s office, but what happens when AI captures conversations in the Emergency Department, ICU or even the operating room?

Whatever the case, healthcare executives need to make sure their ambient AI tools are HIPAA-compliant – and they need to make sure their AI strategies take into account the potential for using PHI in future programs.

To that end, in their blog, Maguregui and Hennessy offer five steps that healthcare leaders should take when dealing with AI scribes:

  • Vet vendors thoroughly;
  • Build governance into your EHR workflows;
  • Limit secondary use/training without authorization;
  • Update your risk analysis; and
  • Train your providers.

Eric Wicklund is the associate content manager and senior editor for Innovation at HealthLeaders.


KEY TAKEAWAYS

AI scribes are one of the hottest tools on the market, as healthcare executives look for technology that can reduce the administrative burden on providers and give them a complete and codable transcription of the doctor-patient encounter.

Healthcare executives need to treat these tools as third parties in the healthcare setting, rigorously testing for security flaws and ensuring that protected health information captured during the conversation is used properly and safely.

Because AI tools get better by continuously updating and accessing better data, there will come a time when healthcare leaders need to re-evaluate their data use priorities and develop new strategies for using PHI.
