Jim Gilligan, Vice President, Health System and Group Engagement at the American Medical Association (AMA), chats with HealthLeaders Exchange team member Abby Mathis to discuss challenges and trends...
Partnering with international organizations can help influence nursing education and give nurses global experience, says this nurse educator.
On this episode of HL Shorts, we hear from Dr. Yolanda VanRiel, the department chair of nursing at North Carolina Central University, about how CNOs can create pipelines into the nursing industry by partnering with international organizations. Tune in to hear her insights.
Peggy Norton-Rosko, system chief nurse executive at the University of Maryland Medical System, chats with CNO editor G Hatfield about nursing challenges in 2024 and what the industry will look like...
Anger toward health insurers reflects people's 'pent-up pain'
In the aftermath of UnitedHealthcare CEO Brian Thompson's fatal shooting, an outpouring of rage at the U.S. health care system has risen to the surface.
Social media posts have ranged from mournful to apathetic to joyful, including morbid celebrations of Thompson’s death. That deluge has forced people across the country to grapple with two heavy subjects at once: the callousness of a slaying, and an undercurrent of deep-seated anger at a health care industry that makes a lot of money by exploiting Americans.
“People feel there is an inherent unfairness in the way that the system works,” one advocate said. “That someone who has health insurance gets sick and then it’s a company, a business, that can be the barrier to them accessing the care they need to sometimes save their lives.”
Despite the hype, AI has the potential to cause harm as well.
AI may be at the top of the hype cycle in healthcare, but its uncertain governance and potential for misuse are also making it the top technology hazard for 2025.
‘Risks with AI-enabled health technologies’ soared to the top of ECRI’s annual top 10 health technology hazards, after placing fifth last year (when it was called ‘Insufficient governance of AI in medical technologies’).
The rise to the top of the list underscores growing concern over AI. Health systems and hospitals are embracing the technology at a rapid pace, even as industry groups and the federal government try to keep up with governance.
According to the ECRI report, inaccurate or incomplete data fed into AI algorithms can lead to disparate health outcomes or inappropriate responses, as well as hallucinations and data drift. Healthcare leaders who don’t invest in and emphasize continuous monitoring run the risk of overlooking these lapses and threatening their patients.
“Further, AI solutions can yield disappointing results if organizations have unrealistic expectations, fail to define goals, provide insufficient governance and oversight, or don’t adequately prepare their data for use by the AI application,” the report stated.
“The bottom line? Placing too much trust in an AI model—and failing to appropriately scrutinize its output—may lead to inappropriate patient care decisions,” researchers concluded. “AI offers tremendous potential value as an advanced tool to assist clinicians and healthcare staff, but only if human decision-making remains at the core of the care process. Preventing harm requires careful consideration when incorporating any AI solution into healthcare operations or clinical practice.”
Healthcare in the home setting continued to score high on the list, reflecting both the industry’s interest in remote patient monitoring and Hospital at Home strategies and a growing senior population interested in living out their years at home. ‘Unmet technology support needs for home care patients’ placed second on ECRI’s list, while last year’s list was led by ‘Usability challenges with medical devices in the home.’
“For many patients, healthcare at home is an attractive alternative to hospital-based treatment,” the report noted. “But delivering care in the home has unique concerns, particularly when the patient or a family member is responsible for operating a complex medical device. Devices such as ventilators, dialysis machines, and infusion pumps traditionally have been used in acute care settings under clinical supervision but increasingly are being used in the home.”
“Minimizing the risk of harm requires providing home users with the support they need to operate, maintain, and troubleshoot the device successfully,” researchers concluded. “This involves anticipating challenges that the user may face and selecting devices that are well matched to the patient and the environment of use.”
Rounding out the list:
Vulnerable technology vendors and cybersecurity threats
Substandard or fraudulent medical devices and supplies
Fire risk from supplemental oxygen
Dangerously low default alarm limits on anesthesia units
Mishandled temporary holds on medication orders
Poorly managed infusion lines
Harmful medical adhesive products
Incomplete investigations of infusion system incidents
Cybersecurity scored high on the list, coming in third following a year in which data breaches and ransomware attacks often dominated the healthcare headlines. Last year, ‘Ransomware as a critical threat to the healthcare sector’ placed sixth, only a few months before the devastating Change Healthcare attack.
“Measures that can help a healthcare organization mitigate third-party risks include thoroughly vetting vendors at the start of the service acquisition process, building in redundancy, conducting incident response testing, and developing recovery procedures,” the report stated.
Nurses recruited from international communities help increase diversity in the workplace and care delivery in rural communities, says this nurse leader.
The former Banner Health and Intermountain Health CIO will become Chief Digital and Information Officer in early 2025, succeeding Craig Richardville, who departed in July.
Intermountain Health is bringing a familiar face back into the fold to serve as the health system’s new Chief Digital and Information Officer (CDIO).
Ryan Smith, who spent more than 20 years with Intermountain and served as its Chief Information Officer from 2020 to 2022, will take over as CDIO in early 2025.
Ryan Smith, Intermountain Health's new Chief Digital and Information Officer. Photo courtesy Intermountain Health.
“This opportunity is deeply meaningful to me,” Smith said in a press release. “I'm excited for the opportunity to make healthcare an easier, safer experience for patients, members, and caregivers alike.”
“I’m confident that Ryan is the right leader to help Intermountain successfully navigate both opportunities and obstacles as a model health system in the complex world of healthcare that lies ahead,” Rob Allen, Intermountain Health’s president and CEO, said in the release. “He will lead DTS to support our mission, vision, and strategy to simplify, expand proactive care, and improve the healthcare experience for our caregivers, patients, members, and communities.”
Smith served in a number of roles at Intermountain from 1994 to 2013, then was Banner Health’s SVP and CIO from 2013 to 2018 before joining Health Catalyst as an SVP and executive advisor. After his two-year stint as Intermountain’s CIO, he joined the digital health non-profit Graphite Health as its Chief Operating Officer in 2022, then became interim president and CEO this past February.
Smith succeeds Craig Richardville, who was SVP and CDIO of SCL Health from 2019 to 2022, became SVP and CDIO of Intermountain in 2022 when it acquired SCL Health, and left the health system this past July.
Smith will report directly to Intermountain’s Chief Strategy Officer, Dan Liljenquist, and serve as a member of the health system’s Enterprise Leadership Team. He’ll lead Digital Technology Services (DTS), including DTS Operations, Digital Services, Data Services, Clinical Informatics, Cybersecurity, Application Services, and Information Technology.
Jon Handler and Roopa Foulger of OSF HealthCare, participants in the HealthLeaders Mastermind program on AI in clinical care, say the healthcare industry still has a lot to learn about ROI.
The trick to embracing AI for clinical care is managing expectations. That includes understanding what ROI really means with this technology.
“The ROI piece is always interesting,” says Jon Handler, Senior Fellow for Innovation with OSF HealthCare. “There’s this concept of hard costs and hard ROI and soft ROI. … At the end of the day, the real-world impacts on the bottom line are the same regardless of how hard or easy it is to measure it.”
Jon Handler, Senior Fellow for Innovation, OSF HealthCare. Photo courtesy OSF HealthCare.
Handler and Roopa Foulger, Vice President of Digital and Innovation Development for OSF HealthCare, are taking part in the HealthLeaders Mastermind program on the use of AI in clinical operations. They say the Illinois-based health system is taking a measured approach to assessing the value of new tools and programs, with an eye not only on the bottom line but also on long-term clinical value.
“How do we measure it?” Handler asks. “How do we assess it? How do we validate it? And I think that gets harder, not easier, with some of the new large language models and the generative AI that’s out there. Because now, instead of algorithms built on a use case by use case basis, you’ve got this general purpose model – how do you evaluate all the things that it can do?”
“There are so many other ways to measure the value that is created,” he concludes. “Determining the right things to measure, which may not always be the easiest things to measure, is critical.”
Foulger says the health system has been using AI in several areas, including some clinical programs around mortality and risk prediction and imaging reviews. Through OSF Innovation, they’re looking at small startups with unique ideas, in addition to implementing AI tools provided by their EHR vendor.
“We’re keeping an eye on what might be different,” she says. “At the same time we’re asking, ‘Why try to build something already available?’”
Both Handler and Foulger say they’ve been surprised at how fast AI has worked its way into healthcare, even as the industry has been using automation and predictive algorithms for more than a decade. But while they’re seeing adoption in several departments, with success in improving efficiency and reducing administrative stress, they’re also seeing a lot of strong use cases fail to make an impact.
Roopa Foulger, Vice President of Digital and Innovation Development, OSF HealthCare. Photo courtesy OSF HealthCare.
“I’m surprised at what is working and what is not working,” says Foulger, who notes that AI tools have shown value in revenue cycle and finance by handling complex processes that take a lot of time and effort. She wonders if healthcare organizations are embracing new ideas too quickly, and not giving these tools time to prove their efficacy.
Handler says he’s surprised that some promising projects, like using AI to transcribe the doctor-patient encounter or generate draft replies to inbox messages, have seen mixed results in published literature. There may be a disconnect between the outcomes some expect from these new tools and the benefits they might more consistently provide, like reduced stress and burnout.
It may also be, he says, a good indication that healthcare still has a lot to learn about AI.
“It’s hard to know when you’re dealing with something that’s overhyped or not,” Handler says, noting the internet was once a shiny new tool that received mixed predictions of its impact before becoming universal. “So any prediction about the future [of AI] … is treading on dangerous territory because people who make predictions are very often wrong.”
“In addition to unexpected upsides, there may also be downsides that we haven't anticipated or been able to manage because of the speed with which these things are happening,” he adds. “These are really, really important questions to wrestle with.”
Foulger sees a future where AI is part of a smarter healthcare ecosystem, giving patients and providers instant access to decision support, best practices, and health and wellness tips. She notes that the industry has access to vast amounts of data, but until now it hasn’t had the tools to make use of that information.
The key, Handler adds, is to find the right way to use those tools.
“My biggest hope is that we capitalize on it as effectively as possible to help improve the service we can provide to our fellow human beings.”
The HealthLeaders Mastermind program is an exclusive series of calls and events with healthcare executives. This Mastermind series features ideas, solutions, and insights on excelling in your AI programs.
To inquire about participating in an upcoming Mastermind series or attending a HealthLeaders Exchange event, email us at exchange@healthleadersmedia.com.
Nurses should lead innovation so that it happens with them, not to them, says this CNE.
It’s an exciting time for innovation in the healthcare space, as new technologies pop up across the industry that can improve care delivery.
Health systems everywhere are experimenting with several new innovations, all with the goal of streamlining processes and removing unnecessary burdens from nurses and physicians alike.
Gail Vozzella, senior vice president and chief nurse executive at Houston Methodist, says nurses should get involved with innovation and that leaders must use their seat at the table to advocate for nursing technology.
Here are the four reasons nurses should lead innovation, according to Vozzella.
Research finds that a telehealth platform is more effective in helping suicidal patients than similar treatment delivered in person.
Critics of telehealth have long said a virtual visit can’t replicate in-person treatment, especially for serious concerns like treatment of patients considering suicide. But a new study from The Ohio State University’s Wexner Medical Center and College of Medicine finds that virtual care is an effective platform.
In a randomized clinical trial of 96 patients with recent suicidal ideation or suicidal behavior, conducted between 2021 and 2023, counselors delivering brief cognitive behavioral therapy (BCBT) via telehealth cut suicide attempts by 41% compared to present-centered therapy (PCT).
The research lends strength to the argument that effective treatment isn’t based on the mode of delivery, and that virtual care is a suitable platform for those unable or unwilling to access in-person care.
“For those suffering with suicidal thoughts and behaviors, we have good, tested treatments that will lead to significant symptom reduction and improved quality of life,” Craig Bryan, PsyD, professor in Ohio State’s Department of Psychiatry and Behavioral Health, director of its Suicide Prevention Program, and a co-investigator in the study, said in a press release. “Even with lessening restrictions, many therapists are keeping a portion of their telehealth practice post-pandemic. This study has the potential to increase access to needed evidence-based treatments for those in rural and hard-to-reach areas.”
Justin Baker, PhD, a clinical psychologist at Ohio State-Wexner, clinical director of the health system’s Suicide and Trauma Reduction Initiative (STRIVE) and the study’s principal investigator, said high-risk patients are historically excluded from virtual care due to risk and liability concerns. But the pandemic forced health systems and hospitals to shift to an almost all-virtual strategy, leaving many patients with no in-person access.
“We wanted a way to ensure that those who needed care the most were able to receive care during the pandemic,” he said in the press release.
According to the study, published in JAMA, 768 people were asked to participate; 112 were assessed for eligibility, and 98 were eventually selected, with 52 undergoing virtual care and the other 46 seeing a clinician in person.
“A strength of this study is the use of an active, evidence-based treatment as the comparator instead of treatment as usual,” the study reported. “The use of an active comparator in this study provides a higher level of internal validity than previous studies, thereby enabling us to conclude with greater confidence that reductions in suicide attempts are likely attributable to the skills-training focus of BCBT, which prioritizes targeting core underlying vulnerabilities in how patients regulate emotions and cognitively reappraise stressful situations.”