How to Create a Gameplan for Assessing AI Readiness

Analysis | By Eric Wicklund | February 09, 2024

AI NOW panelists from Scripps Health and Providence say a health system needs to be up-front and transparent about how AI will be used, while making sure the resources are in place for educating staff and clinicians

Healthcare organizations need to establish a clear and transparent process for enterprise-wide AI governance. The first step is knowing whether you’re mature enough to use the technology.

That’s the opinion of executives from Scripps Health and Providence who participated in the recent HealthLeaders AI NOW summit. Both agreed that health system leadership needs to look at both culture and infrastructure before moving on to developing programs.

“This is truly a team sport,” said Shane Thielman, FACHE, CHCIO, corporate SVP and chief information officer at San Diego-based Scripps Health, who noted that leadership has to commit to open dialogue and transparency to not only educate staff and clinicians but keep patients and the public aware of how AI will be used.

And that conversation will be ongoing. It may even mean shutting down a program and waiting for the technology to improve if the results just aren’t there yet.

“This is not something that you turn on and then walk away to the next opportunity,” he said.

Sara Vaezy, EVP and chief strategy and digital officer at Seattle-based Providence, said health systems often have to start by surveying their staff and clinicians about what they want to know about AI, then creating specific resources to address those concerns. That might include a resource hub, work groups, and videos.

“Everyone’s getting their hands dirty,” she said, referencing the evolving nature of AI. “It’s a constant undertaking because so much is happening out in the market.”

She also noted that the hype around AI has taken on a life of its own, in some cases obscuring what health system leadership should be focusing on with the technology. Some AI models can drift away from what they were designed to accomplish, and leadership needs to “lean in with your hands on the wheel and make sure you’ve got the right processes and the right technology in place.”

Thielman said the analysis and decision-making need to be multidisciplinary, as AI extends into and affects many departments within the enterprise, from administrative to clinical to IT to security. All departments, he said, need to “understand what the lift will be to do that successfully.”

“There’s a significant element of change management that goes into introducing any AI solution,” he pointed out.

[See also: Are Health Systems Mature Enough to Use AI Properly?]

Aside from creating a culture around AI readiness, Thielman and Vaezy said health systems need to assess their infrastructure. Do they have the technology and capacity in place to support AI programs, including data storage, data quality, and analytics?

“Many organizations don’t necessarily have the capability or the capacity from a capital perspective to make investments,” Vaezy said. “Finding the right partners to build that out is a great way to extend that.”

Providence, for example, has partnered with Microsoft for more than eight years.

“If you don’t have a cloud structure, it’s going to be difficult,” she noted.

Thielman said data quality is an often-overlooked part of AI governance, especially with generative AI programs that require continuing oversight.

“If you have garbage data, it isn’t going to help you do much,” Vaezy added.

[Listen to the podcast: Assessing Healthcare's Fascination With Generative AI.]

A key component of assessing AI maturity is understanding where the technology will be used. Too many organizations jump at what’s being called the “low-hanging fruit,” or programs that involve minimal effort and produce quick results, without planning ahead. Those early wins may be great for establishing a base and building morale, but a forward-thinking organization should be planning several steps ahead from the outset.

Thielman pointed out that early programs tend to be tied to back-office and administrative gains and focus on financial improvement. But clinical outcomes need to be considered as well, even though they usually take longer to prove value. Taking those into consideration at the start enables leadership to map out costs and outcomes over time.

“What’s the return on investment?” he asked. “That can have a financial element and it can also have a value element. As we explore AI further it is not only about a direct financial benefit … particularly if there is an up-front financial investment that is necessary. There are some other really intractable challenges … that we’re all interested in addressing.”

Vaezy noted that many technology projects “are worse before they get better,” and need time to settle in and show value. That’s especially true of AI.

“In some cases … with generative AI, frankly the solutions aren’t really ready for enterprise-grade adoption,” she said. “You don’t want some mission-critical function [to] rely on something that’s a flash in the pan.”

Finally, Thielman noted that health system leadership needs to pay attention to the ongoing debate over who should govern AI. The Biden Administration has unveiled its own strategy, with an emphasis on collaboration, but many within healthcare feel the reins should be in their hands.

“It is important that the autonomy continues to reside with healthcare systems relevant to AI that is not currently regulated today,” he said. “We don’t want to have an unintended consequence [or] a negative clinical outcome. We don’t want to place more burden on our clinical workforce. … We don’t want to introduce more inefficiency in our operations through the introduction of AI. … That level of decision-making should continue to be retained within the individual healthcare system.”

Eric Wicklund is the associate content manager and senior editor for Innovation, Technology, and Pharma for HealthLeaders.


KEY TAKEAWAYS

The fast-paced development of AI is catching many health systems off guard, to the point where they may not be mature enough to use the technology just yet.

Healthcare leadership needs to assess staff and clinicians for their understanding of AI and readiness to use it, and provide resources and ongoing communication as part of an AI strategy.

Leadership also needs to map out how and where AI will be used, and understand that proper planning looks at long-term as well as short-term goals.
