The Many Costs of Imaging Technologies

By gshaw@healthleadersmedia.com | July 27, 2010

Quick: When you think about defensive medicine, what comes to mind? For me, it’s imaging technologies. Try going to your primary care physician’s office on a Friday afternoon and telling her you have a slight pain in your abdomen. You’ll be holding your nose and swigging a barium cocktail in no time as technicians warm up the CT scanner. You—or, more accurately, your health insurer—will spend a lot of money to find out whether your appendix is about to burst or if that burrito with extra jalapeño peppers you ate last night is to blame.

In the July issue of HealthLeaders Magazine, I wrote about the cost-quality conundrum of healthcare imaging technologies.

Advanced imaging technologies add to the high cost of healthcare; the latest model of any given machine is always more costly but not always more effective than the previous version; and access to technology definitely plays a role in overutilization and defensive medicine. It may not be the only problem, but it is part of the picture.

On the other hand—and this is a hard argument with which to quarrel—these technologies lead to earlier detection of conditions because they can see details right down to the molecular level. And early detection can save lives.

Meanwhile, like a snake eating its own tail, earlier detection leads to an increase in utilization and adds to healthcare costs.

Marty Khatib, director of imaging for Mercy San Juan Medical Center in Carmichael, CA, says early detection is the key to finding cures. "That's one of the cornerstones of effective and quality care, and that's what really has led to one of the causes behind this paradigm shift in technology in imaging," he says.

So what’s the solution? One way to fight the rising costs of technology is with, well, technology.

In addition to earlier detection, another transformation in the imaging field is an explosion in the amount of data available and the power of electronic medical records to record, store, transmit, share, and analyze it.

"There's so much emphasis on evidence-based best practice in the industry right now. Those gray areas are becoming much more clear," Khatib says. "Healthcare IT has allowed us to be much more quantitative in our approach and we're able to measure things much more accurately."

IT can help healthcare organizations identify and implement best practices while other technologies—such as teleradiology—might reduce costs and increase efficiency.

Teleradiology taps the technology in appropriate ways, says Khatib. "It's a very good example of how you can truly utilize technology to have best outcomes."

But changing our long-standing reliance on the very best and the very latest technology, regardless of whether evidence shows it to be better, may also be part of the answer.

"This is a societal issue," says Andrew Pecora, MD, chairman and executive administrative director of the John Theurer Cancer Center at Hackensack (NJ) University Medical Center. Every generation expects to get more out of its healthcare system, live longer, have fewer deaths or side effects of medications than the generation before it, he notes. "We have to make a decision as a society what we want out of the healthcare system, and it has to be reality-based. It would be wonderful if everybody could get everything and it didn't cost anything."

That's an absurd extreme, he says, but so is the idea that you can remove all waste and fix economic incentives and every other problem.

"We're going to do all these things, and as a consequence of that, everyone is going to continue to have the relationship they currently have with their physician, be able to pick the hospitals they go to, and have access to any and all new breakthrough technologies," he says. "That's not going to happen either."
