Lead researcher: Robert J. Levy, MD, the William J. Rashkind endowed chair in pediatric cardiology at the Children's Hospital of Philadelphia. Levy's group collaborated with engineers and scientists from Drexel University, Northeastern University, and Duke University.
Purpose: A platform technology that delivers drugs and other agents to specific sites in diseased or injured blood vessels in patients with vascular disease. It builds on an existing medical technology, catheter-deployed stents.
How it works: Uniform magnetic fields drive iron-bearing nanoparticles to metal stents in injured blood vessels, where the particles deliver a drug payload that prevents blockages in those vessels. The nanoparticles are impregnated with magnetite, which responds strongly to a magnetic field.
Potential improvement: Current drug-eluting stents contain a fixed dose of medication, good for just one release. But reobstruction occurs in a significant number of patients. The magnetically guided system could deliver higher doses and additional doses if problems recur.
Evidence: Researchers implanted stainless steel stents into the carotid arteries of live rats. After injecting paclitaxel-loaded nanoparticles into the arteries through a catheter, they produced a uniform magnetic field around each rat for five minutes, magnetizing both the stents and the nanoparticles and driving the particles into the stents and the nearby arterial tissue. Five days later, the magnetically treated rats had four to 10 times as many particles in their stented arteries as rats in the control group.
What's next: Clinical application is at least a few years away. Ultimately, the process could be used to deliver therapy via DNA, cells, and drugs.
For $29.99 you can buy a smartphone app that translates a baby's cries. For $3.99 you can buy an app that claims to stimulate hair growth by issuing inaudible frequencies that increase blood circulation in the scalp. And for 99 cents you can flick a finger and send a cow bouncing around your phone's screen.
So what would patients pay for a mobile app to monitor their health? One recent study suggests the answer lies somewhere between the cost of an implausible baby decoder and that of a questionable hair restoration technique. Look at the numbers a little more closely, however, and the price drops to less than a game of cow-tipping.
Roughly half of patients surveyed said they would buy mobile technology for their health. Of those, 20% would use it to monitor fitness or wellbeing, 18% would want their doctors to monitor their health conditions, and 11% would monitor an existing condition, according to a recent survey report by PricewaterhouseCoopers' Health Research Institute.
Although 40% of respondents said they were willing to pay for a monthly mobile phone service or device that could send information to their doctor, they don't want to pay more than $10 a month for it. And actually they don't even want to pay that: Most patients expect their insurance would cover the cost, according to the report.
But with little or no evidence that mobile health improves quality and reduces costs, insurers aren't opening their wallets.
Other industries have figured out ways to get paid for electronic transactions and services—music downloads or stupid smartphone applications, for example. But healthcare lags in figuring out who pays—and how much.
The current reimbursement model is one of the barriers to more rapid adoption of mobile health: in-person consultation is still the main basis of reimbursement in healthcare, says PricewaterhouseCoopers. "Public payers and private health insurers, who are primarily responsible for paying for healthcare, have generally not pushed for adoption of mobile health."
This is beginning to change, as a small but growing number of health plans pay for remote monitoring devices to help reduce hospital readmission costs. According to the survey, physicians were most often reimbursed for phone consultations for chronic disease management. But wellness and maintenance—for which smartphone apps are well-suited—are still the least reimbursed.
Payers want to see evidence, says Roy Swackhamer, chief information officer of SCAN Health Plan. "Everyone is doing pilots, but it needs to be scaled so a physician with 500 congestive heart failure patients can take advantage of the data. We need predictive algorithms that can be used with data aggregation tools in order to analyze trends and perform predictive analysis."
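The kind of trend analysis Swackhamer describes can be sketched in a few lines. This is purely an illustrative example, not SCAN Health Plan's actual system: it flags congestive heart failure patients whose daily home weight readings show a sustained rise, a common early warning sign of fluid retention. All names, data shapes, and thresholds here are hypothetical.

```python
# Illustrative sketch: flag CHF patients whose remote weight readings
# trend upward. Thresholds and data shapes are hypothetical.

def flag_weight_gain(readings, gain_lbs=3.0, window_days=3):
    """Return True if weight rose by >= gain_lbs over any window_days span."""
    for i in range(len(readings) - window_days + 1):
        window = readings[i:i + window_days]
        if window[-1] - window[0] >= gain_lbs:
            return True
    return False

def triage(patients):
    """patients: dict of patient_id -> list of daily weights (lbs)."""
    return [pid for pid, readings in patients.items()
            if flag_weight_gain(readings)]

# A clinician reviewing 500 telemonitored patients sees only the flagged ones.
cohort = {
    "pt-001": [180.0, 180.4, 180.1, 180.3],   # stable, not flagged
    "pt-002": [192.0, 193.5, 195.2, 196.0],   # rapid gain, flagged
}
print(triage(cohort))  # ['pt-002']
```

Real systems would of course aggregate many vital signs and apply validated predictive models, but the point stands: the physician reviews a short list of exceptions rather than 500 raw data streams.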
Until that data-driven evidence comes in, physicians in the survey listed a number of benefits to mobile healthcare for payers, patients, physicians, hospitals, and other healthcare organizations. Among them:
Of those physicians who are using mobile devices in their practice, 56% said the devices expedite decision making and nearly 40% said the use of mobile devices decreases time spent on administration.
Physicians agreed that the greatest benefit of mobile health would be to help them make decisions faster by accessing more accurate data in real time. One-third of physicians surveyed said they currently make decisions based on incomplete information for seven out of ten patients they see. Only half currently access electronic medical records while visiting and treating their patients.
Forty percent of physicians said mobile health could reduce office visits by 11% to 30%, potentially easing the physician shortage, reducing hospital readmission costs, and increasing access for patients who delay care because they don't want to wait for an appointment.
Forty-five percent of physicians said that Internet visits would expand access for patients and give physicians more time to interact with them.
"The technology of telehealth is well ahead of the socialization of the telehealth idea and we are at a tipping point for utilization to begin taking off," David Jacobson, WellPoint's staff vice president of business development, state sponsored business, says in the report.
All we need now is an app that shows when the cow is done tipping—and has finally fallen over.
PricewaterhouseCoopers' Healthcare Unwired report and survey highlights are available on the firm's website.
Science and technology are at a fortuitous crossroads: As we’re learning more about how variations in human genetics affect health and disease, we’re expanding our use of the electronic medical records systems that make it easier to gather, store, sort, and analyze genetic data. And growing right alongside clinical and technological medical advances: The importance of informed consent and its kissing cousin, re-consent.
So, how many times do you have to get permission from patients before you use their medical data for research? From the patients’ point of view, the answer is “every time,” a new study suggests. It doesn’t matter if the data is de-identified or if they’ve already approved its use for one purpose. If you want to use it again, they want you to ask them again, say investigators at Group Health Research Institute and the University of Washington (UW) in a report called “Glad You Asked,” which was published in the September 2010 Journal of Empirical Research on Human Research Ethics.
Genomic research on large numbers of people can yield insights that aren't possible with smaller numbers. So, since 2008 the National Institutes of Health (NIH) has encouraged researchers to submit genetic information to the federal database of Genotypes and Phenotypes (dbGaP). But researchers must respect their study participants. And that means going back to those who may have signed up for a study on diabetes or heart disease and asking them to share their medical data for genomic research projects, as well.
It’s not that people don’t want to share their data. When researchers asked volunteers who were already enrolled in a joint Group Health-UW longitudinal study on aging if it would be OK to share their de-identified genetic and medical record information in the database, 86% said yes. Then researchers went back to a subset of those who gave permission mostly because of a “desire to help others.” Of those, 90% said they thought being asked for re-consent was important.
Alternatives to re-consent were not popular with the participants, either. Opting out was unacceptable to 40% of respondents and notification was unacceptable to 67%. When asked what they thought about their data being used for a new purpose without individual permission or notification, 70% gave researchers the thumbs down.
"We were surprised that so many people felt it was important for us to ask them, even though they decided to give their consent," said lead author Evette Ludman, a senior research associate at Group Health Research Institute. "This indicates that even if most of a study’s participants would agree to data sharing, it’s still crucial to ask them."
Asking permission is a sign of respect, after all. And that’s particularly important when researchers are asking for genetic samples and related medical data. "Trust is a two-way street, and human research requires lots of trust," Ludman said. "People have an understandable feeling of ownership over their bodies and medical records, including their genetic information. Researchers show we're worthy of trust when we ask research participants for permission to use their information in a way that they haven't already agreed to."
As luck would have it, there are tools to help manage and document the informed consent process. The University of Texas Health Science Center at Houston is using one such solution, called iMedConsent, to manage its research consents. The program can be integrated with an organization’s EMR so that it can automatically populate consent forms with patient and provider information. It can capture signatures from digital pads, send images of signed documents to document management systems, and post comprehensive notes to the patient’s medical record. And automating a traditionally paper-based process saves money by eliminating scanning costs and the need to manage and archive paper documents.
Low-tech solutions such as checklists and calling a time-out before surgery can dramatically improve safety in the operating room, according to the American College of Obstetricians and Gynecologists (ACOG), which recently released guidelines to deter surgical errors. But there are plenty of technologies that can help make the OR a safer place—from systems that keep track of sponges, to robots that may someday be able to deliver anesthesia remotely, to systems that help counteract alert fatigue.
The ACOG guidance supports the Joint Commission's "three-part universal protocol" as a useful tool for healthcare teams to prevent surgical errors. The first protocol calls for the healthcare team to ensure that each patient's relevant documents and all of the surgical equipment are available, correctly identified, and reviewed before surgery. "Using standard checklists, systems, and routines may sound to some like cook-book medicine, but they have been proven to greatly reduce surgery errors," said Richard Waldman, MD, ACOG's president.
The steps sound simple—and they are. And yet surgical errors still occur. The steps endorsed by ACOG rely heavily on humans to perform them, after all. And human behavior, human actions, and human memory are far from perfect.
Here are five surgical pitfalls and the technology that can help humans in the OR avoid them.
1. Retained surgical instruments
The FDA recently approved another surgical tool tracking system that helps keep all manner of surgical implements, including sponges, where they belong—namely, outside of the patient's body. There are a number of these systems, which keep track of every item in the room through RFID chips and alert OR staff when one is not in its proper place. Some systems have a wand that, when waved over the patient's body, detects any items left behind. These systems help cut down on complex and time-consuming counting procedures that are prone to human error.
Instruments account for more than a third of all retained surgical items—the breakdown is 52% radiopaque sponges and 43% instruments—according to a 2007 study in the Journal of Surgical Research. Correcting such errors adds about $2 billion each year to the nation's medical bill.
2. Wrong-site surgeries
Wrong-site surgeries, an alarming surgical error that sometimes inspires patients to draw on their bodies with magic markers (“THIS LEG, PLEASE!”), could be reduced with better access to data. Charts can get mixed up, but a new portable biometric-activated data card that stores patients' personal medical information could be a solution to that problem. It doesn't work unless the patient first scans his or her fingerprint on the card itself. Only after the fingerprint is verified can physicians view the data. It's the size of a credit card, but it can hold gigabytes of data, including full EKGs, complete CT scan images, and digital MRI images.
3. Healthy tissue damage
Technology could prevent complications that aren't caused by human error, as well, including damage to normal, healthy tissue. A new electrosurgical device aids surgeons by selectively targeting diseased cartilage tissue during procedures. Its design, which combines low RF energy delivery with a protected electrode (to avoid electrode-to-tissue contact), allows for localized treatment of damaged/fibrillated cartilage tissue while avoiding harm to healthy cartilage.
4. Lack of access to anesthesiologists
Improper anesthesia administration is another scary medical error—and one that particularly affects patients in rural areas or areas with shortages of anesthesiologists. Experts could administer regional anesthesia from afar with the help of a surgical robot, according to a study in the September issue of Anesthesia & Analgesia. In the study, an operator who was not physically present at the bedside used a surgical robotic system to place ultrasound-guided nerve blocks into an ultrasound phantom, successfully performing both single-injection and perineural catheter techniques. “Similar advances in teleanesthesia will be necessary to bring comparable perioperative care to the geographically remote patient,” the authors note.
5. Alert fatigue
Clinicians and caregivers can become overwhelmed by alarms to the point that they start to tune them out—it's common enough that there's a term for it: alert fatigue. An alarm management system that can differentiate between serious alerts and less pressing matters works on top of pulse oximeters, which monitor patients for oxygen saturation levels after surgery. The system prioritizes the importance of these alerts and only notifies the nurse about the most pressing. It doesn't just alert to drastic drops, but also to the more subtle—and easier for humans to miss—recurring moderate reductions in airflow. The system also provides the clinician with historical and real-time information about a patient's condition.
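The prioritization logic such a system applies can be sketched in miniature. This is an illustrative example only, not any vendor's actual algorithm, and the thresholds are hypothetical rather than clinical guidance: it escalates both a single drastic drop in oxygen saturation and the subtler pattern of repeated moderate dips, while staying silent on one-off minor fluctuations.

```python
# Illustrative sketch of alarm prioritization for post-surgical SpO2
# monitoring. All thresholds are hypothetical, not clinical guidance.

def classify_alarm(spo2_history, critical=85, moderate=92, repeat_limit=3):
    """Return 'urgent', 'watch', or None for a series of SpO2 readings (%)."""
    latest = spo2_history[-1]
    if latest < critical:
        return "urgent"          # drastic drop: notify the nurse immediately
    moderate_dips = sum(1 for r in spo2_history if r < moderate)
    if moderate_dips >= repeat_limit:
        return "urgent"          # recurring moderate dips, easy for humans to miss
    if latest < moderate:
        return "watch"           # log it, but don't page anyone
    return None                  # no alert, reducing alert fatigue

print(classify_alarm([97, 96, 98, 84]))      # urgent (single drastic drop)
print(classify_alarm([97, 91, 96, 91, 91]))  # urgent (three moderate dips)
print(classify_alarm([97, 96, 91]))          # watch
```

The design choice is the same one the article describes: rather than sounding an alarm for every threshold crossing, the system ranks events and surfaces only the ones that merit interruption.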
Steel yourself for some alarming news: Cyberchondriacs are on the rise, up from 50 million in 1998 to 175 million today, according to market research firm Harris Interactive. And they're also getting more active: "Fully 32% of all adults who are online say they look for health information 'often,' compared to 22% last year."
Sounds like bad news, doesn't it? But wait—what exactly is a cyberchondriac? According to Harris, they've used the term since 1998 to describe people (are you sitting down?) who look for healthcare information online.
It turns out cyberchondriac is just a malignant-sounding word for what is in fact a benign—if not beneficial—condition.
The debate over what to call patients who look up information about a condition or treatment, a physician's credentials, or even (gasp!) alternative forms of treatment, has been simmering since at least 2007. It was brought to a head, in part, by a Time magazine column by Scott V. Haig, MD, When the patient is a Googler.
In it, he complains about a patient who was rude, demanding, and who researched him and her condition online. (Not to mention that her three-year-old "little monster" stomped fish crackers and Cheerios into his rug.)
After she asked a "barrage of excruciatingly well-informed questions," Haig wrote, he decided to drop her as a patient.
Patient advocates were, not surprisingly, displeased—and not about the fish crackers and Cheerios on the carpet. A post on the New York Times Well blog, A doctor's disdain for medical Googlers, garnered hundreds of emotional responses.
While some said the doctor was merely complaining about demanding patients, others said that by harping on the fact that she'd done research online, he was suggesting that patients who educate themselves online and ask well-informed questions are by their very nature as annoying as a toddler's snack foods ground into a rug.
Meanwhile, a short time after his ruckus-raising Time column, Haig seemed to have learned his lesson—or at least expanded his lexicon. In 2008 he conducted a very civilized roundtable discussion with a group of orthopedic surgeons about how the Internet had changed their practices. The article was titled: How to deal with the digitally empowered patient.
Digitally empowered. Now that has a nice ring to it.
The authors of a separate research study on Web search and health anxiety, however, use the term differently than Harris does. They grant—and good for them—that online medical information can help laymen better understand health and illness and provide them with feasible explanations for symptoms.
"However," the authors add, "the Web has the potential to increase the anxieties of people who have little or no medical training, especially when Web search is employed as a diagnostic procedure. We use the term cyberchondria to refer to the unfounded escalation of concerns about common symptomatology, based on the review of search results and literature on the Web."
But people searching for health information online might also be looking for diet and exercise programs to help reduce cholesterol based on a warning from their doctor. Or they might want to make sure they are well-prepared for a visit to the doctor in order to make the most of his or her limited time. According to the Harris poll, 51% of all so-called cyberchondriacs say they have searched for information on the Internet based on discussions with their doctors; 53% discuss the information they found online with their doctors.
In other words, not only is the Internet helping patients better communicate with their doctors, there's no evidence in the Harris poll that the patients in question are suffering from increased anxiety or unfounded escalation of concerns (with or without italics).
Now to be fair, it is unclear whether or not Harris' use of the term is pejorative. Maybe they just think it sounds catchy. And there's no doubt their data is valuable proof that patients have increasingly come to rely on the Internet for medical information since 1998.
But we are not living in 1998. So-called cyberchondriacs are now known by a less negative descriptor: well-informed, engaged, and empowered patients. Here in the year 2010, we call them e-patients.
A note to Harris: Regardless of your intent, it's time to update your terminology.
Cancer isn't just a leading cause of death in the U.S., it is also the world's costliest disease, according to a new American Cancer Society report. And big-ticket technologies used to diagnose and treat the disease are part of the reason cancer care costs so much—from stereotactic radiosurgery systems that cost hundreds of thousands of dollars to the massive nuclear particle accelerators used for proton beam therapy and the physical plants needed to house them, which can cost hundreds of millions of dollars.
Meanwhile, these expensive cancer treatments are not always more effective than less expensive options and many patients simply don't need them. So why do hospitals keep spending money on them?
Because calculating the ROI of cancer care is about much more than money.
Hospitals that invest in high-cost cancer technologies do not do so lightly. It is, they argue, not only critical to their mission of offering patients the best possible care, but it is also a business imperative.
Organizations invest in the latest cancer diagnosis and treatment technology in order to show that they are a state-of-the-art facility. They tout the technology in advertisements and annual reports to boost their reputation among patients and referring physicians. They buy machines to keep up with competitors and to recruit the best specialists, who want to practice in a facility that has the equipment they trained on. And patient demand drives purchases too—even if patients don't actually need high-tech care, hospitals that have it still bring them in the door.
Is it possible to build a cancer service line or run an oncology practice in a way that makes both clinical and business sense—to find that balance between serving your patients, keeping your organization competitive in the marketplace, and controlling overtesting and overutilization?
I interviewed leaders at several organizations that have found an answer to that question. You can read about how they negotiated the complex calculations involved in building and running high-tech cancer care programs in this month's issue of HealthLeaders magazine. (See The Complex Calculations of Cancer Care.)
From a tiny eye telescope to microscopic health data on the surface of contact lenses to advances in cancer treatments, medical devices continue to get smaller and smaller. And the smallest of the small fields—nanotechnology—is expected to get bigger (so to speak) over the coming years.
The global market for nanotechnology was worth $11.6 billion in 2007 and could reach $27.0 billion by the end of 2013, according to a report released this month by India-based market research firm Bharatbook. Biomedical applications have the highest projected growth rate (56%) of any application segment over the next five years.
A report published by the same company in June found:
US demand for nanotechnology medical products will rise more than 17 percent per year to $75.1 billion in 2014.
The total market for nanomedicines will command strong growth over the long term, rising to almost $59 billion in 2014 and sustaining a strong upward pace through 2019.
Among nanodiagnostic products, nanosized monoclonal antibody labels and DNA probes are greatly enhancing the speed, accuracy, capabilities, and cost-effectiveness of in vitro diagnostic testing, drug discovery, and medical research procedures.
Within the medical supplies and devices segment, nanomaterials are already gaining significant demand as active ingredients of burn dressings, bone substitutes, and dental repair and restoration products. In the long term, advances in nanotechnology will lead to the introduction of new, improved medical supply and device coatings as well as a new, diverse group of medical implants.
The greatest near-term impact of nanotechnology in health care by indication will be in therapies and diagnostics for cancer and central nervous system disorders.
Here are just five of the many ways smaller medical technology is getting better.
1. Nanomaterials
Engineered nanomaterials—about 100,000 times smaller than a single strand of hair—represent a significant breakthrough in material design and development for industry and consumer products, including diagnosis, imaging, and drug delivery technologies. So much so that the National Institute of Environmental Health Sciences (NIEHS), part of the National Institutes of Health, has put $13 million in grants over a two-year period toward understanding the potential health, safety, and environmental issues related to the tiny particles. The NIEHS awards, funded through the American Recovery and Reinvestment Act, will develop better methods to assess exposure and health effects associated with nanomaterials, along with reliable tools and approaches to determine the impact of engineered materials on biological systems and health outcomes, according to NIEHS.
There’s still plenty to learn about nanomaterials and gauging their safety will also be a priority of the grants. “We currently know very little about nanoscale materials' effect on human health and the environment," said Linda Birnbaum, PhD, director of the NIEHS and the National Toxicology Program (NTP), an interagency program for the U.S. Department of Health and Human Services. "Nanomaterials come in so many shapes and sizes, with each one having different chemical properties and physical and surface characteristics. They are tricky materials to get a handle on. The same properties that make nanomaterials so potentially beneficial in drug delivery and product development are some of the same reasons we need to be cautious about their presence in the environment."
2. Particle toxicology
Can nano-sized particles travel from the nose to the brain? That’s one of the questions that Society of Toxicology (SOT) researchers are exploring. Particle toxicology has come a long way since revealing the prominent role of coal- and silica-induced diseases in the early 20th century, according to SOT. Investigations have moved from asbestos fibers to manmade mineral fibers, ambient particulate matter, and engineered nanoparticles. The focus, too, has grown from the traditional target organ, the respiratory system, to extra-pulmonary organs such as the heart, vascular system, and the brain. The connection between the nose and the brain, and in particular the transport of nanosized particles to the olfactory bulb, was described early on to explain how poliovirus infection progressed. More recent research suggests that man-made nanosized particles can access the same pathway.
Ongoing research seeks to better understand if and how nanosized poorly soluble particles get into the brain, the properties of the particles that accumulate in the brain (e.g. size, solubility, and reactivity), how the particles get cleared from brain tissue, and how particles might induce adverse effects such as neurodegenerative disease.
3. Microscopic sensors
Just this month, University of Washington researchers used nanotechnology to integrate microscopic optical, electronic, and biosensing devices into contact lenses to continuously monitor a patient’s health through the biochemistry of the eye surface—displaying the information through symbols right on the lens. Sound distracting? In the future, the information could also be sent via text message or e-mail.
In July, the Food and Drug Administration approved a new treatment that could help millions of older adults who are nearly blinded by macular degeneration—a miniature telescope implanted directly into the eye that magnifies images to more than twice their size. One problem? Although it can sit comfortably atop a fingertip, the device is still relatively large—and it’s not for everybody.
4/5. Nanotechnology and tumors
In July, at a meeting of the American Association of Physicists in Medicine (AAPM), researchers talked about nano-coated “gold bullets” that help destroy tumors and improve radiation therapy.
Image-guided radiation therapy targets tumors in organs that tend to move during treatment, such as the prostate gland or the lungs, as well as tumors near vital organs. Often, inert markers are implanted into the body to help radiation oncologists pinpoint the cancerous tissue.
Researchers say they want to use these markers to deliver drugs that will combat cancer and make the tumor more sensitive to radiation. The drugs can be tailored to different tumor types, the researchers say.
“Right now, these markers are just passive implants that are inserted into the tumor,” says Srinivas Sridhar, a physics professor at Northeastern University and director of the university’s Electronic Materials Research Institute. “We’re making them active and smart using nanotechnology.”
While researchers are already developing nanotechnology capsules that deliver a cancer drug to tumors with precision, researchers at Baylor College of Medicine in Houston, TX, have developed a targeted nanocapsule system that delivers two cancer therapies simultaneously: the chemotherapy agent doxorubicin and heat therapy (hyperthermia).
The system is based on nanoparticle-assembled capsules (NACs), structures that self-assemble as a result of their chemical properties. The capsules contain the chemotherapy agent doxorubicin. An external magnetic field passed over the nanocapsule releases the doxorubicin and heats the NAC solution, killing the tumor cells with the elevated temperature.
"The great thing about our magnetic, nanoparticle-assembled capsule is that it's a multifunctional device that can be used simultaneously to release the desired drug concentration at the tumor site while heating up the tumor cells," says lead researcher John McGary.
The more effectively patients and physicians can communicate the better healthcare will be. And that includes better electronic communications, such as by e-mail. Yes, I understand doctors’ concerns that it would take up too much of their time—uncompensated time at that—and expose them to liability and leave an electronic trail of typos. But I believe that deep down inside they also know it is the right thing to do.
Almost every non-healthcare business uses e-mail to communicate with its customers (and it’s probably safe to drop the modifier “almost” on that one). And we know that most patients would like the option to e-mail their doctor.
Resistance from physicians is hard to overcome, but one recent study might give them a push in the right direction. In a study of 35,423 people with diabetes, hypertension, or both, the use of secure patient-physician e-mail messaging was associated with a statistically significant improvement in effectiveness of care during a two-month period, according to a Kaiser Permanente study published in the July issue of the journal Health Affairs. Effectiveness was measured by the Healthcare Effectiveness Data and Information Set.
In addition to better care, other findings that might sway physicians include:
An improvement of 2.0 to 6.5 percentage points in performance of other HEDIS measures such as cholesterol and blood pressure screening and control.
The ability to replace some outpatient visits, thus improving the efficiency of care.
Physicians participating in the study said they were not overwhelmed by a large number of e-mails. Earlier studies showed that physicians received two to 12 messages each work day and responding to each took an average of 3.5 minutes.
Message threads were shorter than one might think, too—on average, exchanges contained just slightly more than one patient message and one physician message.
Most patient-generated emails were not frivolous: 63% required clinical assessments or decisions and 24% required clinical action such as ordering a lab test.
Offering patients the ability to communicate via e-mail can improve physician-patient relationships (thus improving continuity of care), support patient self-management, and improve patient satisfaction.
Feedback from physicians was generally positive, according to the study’s authors: “Many physicians perceive that the use of e-mail increases their efficiency and improves the care they provide.” The top five reasons that patients e-mail their physicians, according to Kaiser, are a change in health condition, to check lab results, to report a new condition, to check on drug doses, and to inquire about a new drug.
In addition, physicians looking to get stimulus money under the American Recovery and Reinvestment Act for meaningful use of electronic medical records must incorporate secure patient-physician messaging into their EHRs.
Kaiser is, of course, a leader in electronic communications. Through its online portal, patients can access lab results, pharmacy records, prescription refill information, self-care instructions, and online educational materials. The organization, frankly, has resources that the average physician does not. E-mail communication is still not reimbursed, and face-to-face visits are still the standard of care for quality measures such as HEDIS. Further, even when free online communication resources are available, they are not used by certain populations, including those who are underserved.
Surely, though, the healthcare industry can figure out a way to solve these problems—or simply agree that they are not reason enough to withhold a service that the public clearly wants, that improves quality of care, and that may even save physicians time and reduce healthcare costs. Physicians don’t get paid to return phone calls, either, but most still call their patients when warranted. It’s time for physicians to overcome their resistance to online messaging. And if they don’t? I say we take away their smartphones and see how they like being disconnected from their own worlds.
Data-loving physicians can read the full study, “Improved Quality at Kaiser Permanente Through E-mail Between Physicians and Patients,” in the July issue of Health Affairs (subscription required).
Quick: When you think about defensive medicine, what comes to mind? For me, it’s imaging technologies. Try going to your primary care physician’s office on a Friday afternoon and telling her you have a slight pain in your abdomen. You’ll be holding your nose and swigging a barium cocktail in no time as technicians warm up the CT scan machine. You—or, more accurately, your health insurer—will spend a lot of money to find out whether your appendix is about to burst or if that burrito with extra jalapeño peppers you ate last night is to blame.
In the July issue of HealthLeaders Magazine, I wrote about the cost-quality conundrum of healthcare imaging technologies.
Advanced imaging technologies add to the high cost of healthcare; the latest model of any given machine is always more costly but not always more effective than the previous version; and access to technology definitely plays a role in overutilization and defensive medicine. It may not be the only problem, but it is part of the picture.
On the other hand—and this is a hard argument with which to quarrel—these technologies lead to earlier detection of conditions because they can see details right down to the molecular level. And early detection can save lives.
Meanwhile, like a snake eating its own tail, earlier detection leads to an increase in utilization and adds to healthcare costs.
Marty Khatib, director of imaging for Mercy San Juan Medical Center in Carmichael, CA, says early detection is the key to finding cures. “That's one of the cornerstones of effective and quality care, and that's what really has led to one of the causes behind this paradigm shift in technology in imaging," he says.
So what’s the solution? One way to fight the rising costs of technology is with, well, technology.
In addition to earlier detection, another transformation in the imaging field is an explosion in the amount of data available and the power of electronic medical records to record, store, transmit, share, and analyze it.
"There's so much emphasis on evidence-based best practice in the industry right now. Those gray areas are becoming much more clear," Khatib says. "Healthcare IT has allowed us to be much more quantitative in our approach and we're able to measure things much more accurately."
IT can help healthcare organizations identify and implement best practices while other technologies—such as teleradiology—might reduce costs and increase efficiency.
Teleradiology taps the technology in appropriate ways, says Khatib. “It's a very good example of how you can truly utilize technology to have best outcomes."
But changing our long-standing reliance on the very best and the very latest technology—regardless of whether evidence shows it to be better—may also be part of the answer.
"This is a societal issue," says Andrew Pecora, MD, chairman and executive administrative director of the John Theurer Cancer Center at Hackensack (NJ) University Medical Center. Every generation expects to get more out of its healthcare system, live longer, have fewer deaths or side effects of medications than the generation before it, he notes. "We have to make a decision as a society what we want out of the healthcare system, and it has to be reality-based. It would be wonderful if everybody could get everything and it didn't cost anything."
That's an absurd extreme, he says, but so is the idea that you can remove all waste, fix economic incentives, and solve every other problem at once.
"We're going to do all these things, and as a consequence of that, everyone is going to continue to have the relationship they currently have with their physician, be able to pick the hospitals they go to, and have access to any and all new breakthrough technologies," he says. "That's not going to happen either."
Leaders from healthcare organizations and associations, lawyers, consultants, IT vendors, and a host of other experts are slogging their way through all 800-plus pages of the Centers for Medicare & Medicaid Services final rule for the meaningful use of electronic health records. They have plenty of opinions about what's right and what's wrong with the rules. And that's fine. But it's also a good idea to think about something more important than what could have or should have been: What happens next?
The long-awaited final rules spell out exactly what hospitals and physicians must do to qualify for their share of a pool of roughly $27 billion in bonus Medicare and Medicaid payments over the next 10 years for using electronic health records (EHRs). Eligible professionals can get up to $44,000 under Medicare and $63,750 under Medicaid, and hospitals may receive millions of dollars for implementation and meaningful use of certified EHRs under both Medicare and Medicaid.
Providers and vendors have been playing guessing games about what the final rules would look like and placing bets on when, exactly, they would be announced. Some predicted an unfunded mandate. Others were skeptical the rules would ever come to fruition at all.
"There should no longer be any doubt that this program is real," says Charles W. Jarvis, vice president of healthcare services and government relations for NextGen Healthcare. "Hospitals should be able to march ahead and make some final decisions."
Leigh Burchell, director of government and industry relations for Allscripts, agrees. "People who were in a wait-and-see or even wait-and-understand mode are now trying to understand, because it's real," she says.
A number of organizations have objections to the final rule, ranging from security and privacy concerns to worries that smaller hospitals, or individual hospitals in multi-campus settings, will face barriers to achieving widespread IT adoption.
"We continue to be concerned that, given limited vendor capacity and workforce shortages, many hospitals will not have timely access to certified products, since no certified EHR systems are available today," said Rich Umbdenstock, president and CEO of the American Hospital Association (AHA).
AHA also frets about the timeline for Computerized Provider Order Entry (CPOE), says the certification process penalizes early adopters by requiring them to upgrade or replace already functional systems, and adds that the rules limit how quickly hospitals can adopt a certified EHR that can benefit patient care.
"The challenge now is to extend its use and integrate it into the routine care processes in all hospitals, big and small, in both rural and urban areas," Umbdenstock said.
But concerns, quibbles, and complaints aside, it's time for healthcare organizations to get moving.
Get out the roadmap
Whether they've been sitting on the fence waiting for the final rules to come out or planning for their eventual release, providers will begin taking more concrete action, such as hiring IT staff and investing in hardware and software, says Mitch Morris, MD, national leader, health information technology for Deloitte Consulting.
"As health care IT leaders move forward with their plans they will want to develop a clear roadmap," Morris says, adding that healthcare leaders should consider the following questions:
What are your competencies?
What is your capacity?
What external resources will you need?
"A meaningful use roadmap should include not only goals and expected outcomes, but also timelines, staffing requirements, and a projection of expected capital and operating costs. Factor in the risks involved and the needed controls," Morris says.
Mark Segal, vice president of government and industry affairs for GE Healthcare IT, says healthcare leaders should also scrutinize potential vendors' roadmaps, as well as where those vendors stand in relation to the meaningful use timeline, which is broken into three stages over the next few years: stage two starts in 2013, and CMS has not yet set a date for stage three.
"It's not just where we are for stage one, it's pretty clear that you've got an accelerating trajectory," Segal says, noting that while CMS softened some stage one requirements, they didn't eliminate them. Rather, they've been deferred to stage two. "It really is going to be important that people look at stage two," he says. And "2013 is around the corner."
The meaningful use rule is a turning point for the industry, says Glen Tullman, Allscripts CEO. "The time for waiting is past and we're seeing people already acting in a very dramatic way," he says.
So what's his advice to healthcare providers? "The advice is get going. You don't want to be the one that, by the time you decide, no money is available," he says.