Predictive modeling offers the key to understanding which healthcare services most affect utilization, readmissions, and payment, and how to tackle the outliers. These analytics are within the grasp of any healthcare system.
Somewhere out there, a hospital near you may be figuring out the technological secret to significantly lowering readmissions.
It isn't a secret easily uncovered; it takes hard work, and it takes working smart. But it can be done.
"Our admits and readmits have dropped like a rock," says Pamela Peele, PhD, chief analytics officer of the UPMC Insurance Services Division. UPMC is the short name for the University of Pittsburgh Medical Center, and Peele is one of two presenters in my October 28 HealthLeaders webcast.
UPMC uses a variety of modeling tools to identify patients who are high utilizers of its services and—significantly—are likely to continue to be high utilizers in the next year. "Most people just regress to the mean," Peele says. "There's a whole industry of disease management that has been built on regression to the mean. They say they're managing the patient, but the patient would have gotten better, utilization would have gone down if nothing had happened, because most people actually get better."
The secret, Peele says, is finding the 20% of patients who won't get better on their own but who could respond to intensive, coordinated care.
When a Medicare patient new to UPMC walks through their doors, "We use a questionnaire that CMS requires us to get anyway on our patients," Peele says. Out of 24 questions on that survey, eight combinations of answers on five of those questions are "absolutely the signal that these people are going to run about 300% more expensive than people who don't hit those rules," she says. "Once you've discovered the questions and rules, you don't need to run a model anymore. The call center can deploy that on the phone."
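Once such rules are discovered, they can run as plain conditional checks rather than a live statistical model, which is why a call center can apply them over the phone. A minimal sketch of that idea, with entirely invented question IDs and answer combinations (UPMC has not published its actual rules):

```python
# Hypothetical question IDs and answer combinations, invented for illustration.
# Each rule maps survey question -> required answer; matching any one rule
# flags the member as a likely persistent high utilizer.
HIGH_COST_RULES = [
    {"q3": "yes", "q7": "poor", "q12": "yes"},
    {"q3": "yes", "q9": "3+", "q18": "no"},
    # ...the real system reportedly uses eight combinations across five questions
]

def flag_high_cost(answers: dict) -> bool:
    """Return True if the member's survey answers match any discovered rule."""
    return any(
        all(answers.get(q) == want for q, want in rule.items())
        for rule in HIGH_COST_RULES
    )

member = {"q3": "yes", "q7": "poor", "q12": "yes", "q20": "no"}
print(flag_high_cost(member))  # True: matches the first rule
```

The point of the sketch is the design choice Peele describes: the expensive part is discovering the rules, not applying them.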
Later, when a UPMC Health Plan member presents for hospitalization, the upper right-hand corner of UPMC's authorization screen prominently displays that member's readmission risk. But Peele shares an unexpected insight: Sometimes, the risk level is just so high that no added intervention is going to reduce it.
In this struggle to reduce readmissions, the main tool in UPMC's toolbox is the home visit. The trick of UPMC's success is in identifying the "sweet spot" of patients: those sick enough that a home visit will make a difference, but not so sick that no visit can change the outcome.
"There's a predictive range where you should put your resources, and the resource is a home visit," Peele says. "That's what actually matters. So we changed the discharge plan for people in that sweet spot, so they get a follow-up home visit."
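The sweet-spot logic amounts to a simple banding rule applied to the predicted risk score. A hedged sketch, with placeholder thresholds chosen purely for illustration (UPMC's actual cutoffs are not public):

```python
def schedule_home_visit(readmit_risk: float,
                        low: float = 0.25, high: float = 0.70) -> bool:
    """Flag a discharge for a follow-up home visit only inside the band
    where the visit can still change the outcome. Thresholds are
    illustrative placeholders, not UPMC's real cutoffs."""
    return low <= readmit_risk < high

for risk in (0.10, 0.45, 0.90):
    print(risk, schedule_home_visit(risk))
# 0.10 -> False (likely fine without intervention)
# 0.45 -> True  (the sweet spot: a visit can matter)
# 0.90 -> False (risk too high for a visit to reduce)
```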
The ins and outs of predictive modeling
Unlike traditional IT projects, predictive modeling is never truly finished, but instead aspires to turn healthcare into a continuously learning system. "This stuff is all iterative," Peele says. "You get started. You discover something. You try something. It doesn't work. You figure something else out. It's sort of like groping around in the dark in a room until you find the light switch."
On October 28, my other presenter with Peele will be Christine Vanzandbergen, the clinical decision support officer at Penn Medicine, at the opposite end of Pennsylvania from UPMC.
While not as far along with its predictive modeling efforts as UPMC, Penn Medicine is also leveraging a variety of different tools to make progress. "Our approach has been to take our data internally, understand it, validate it in a predictive model, evaluate our results, and look for partners that we think can improve those results," Vanzandbergen says.
Both Peele and Vanzandbergen have been unimpressed with the offerings of the many technology vendors hawking products specifically for predictive modeling for healthcare. Both provider organizations continue to focus on internal efforts while they try to identify partners that can truly add value. "Our most recent partnership turned in some really disappointing results when compared to the performance of our internal readmission risk model," Vanzandbergen says.
"What the vendors are selling in predictive modeling land is so generic and so basic that it's not useful," Peele says. "That's not the vendors' fault. It's just [that] healthcare is local." But that hasn't stopped lots of health systems from spending millions on these vendors' products.
Healthcare analytics vendor MEDai produces an external financial model that UPMC's insurance division has found useful, Peele says. "The validity of having an outside predictive model smooths our bond rating, and we're very highly rated, so there's a business reason that we buy that particular predictive model," she says.
Otherwise, UPMC builds its own predictive models, using statistics packages such as Statistica, R, Stata, SAS, MATLAB, Vensim, and Tableau.
Don't let the potentially endless parade of tools fool you. Predictive modeling is within the grasp of any healthcare system. At Penn Medicine, a single variable—previous hospital utilization—is the key to predicting readmissions. While the results aren't as impressive as UPMC's, they're still significant.
Penn is trying to understand what services are affecting admissions and payment. "We've lowered our readmissions by 2% or 3%, which has us on the right track, so we're encouraged by that," Vanzandbergen says.
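A single-predictor readmission model of the kind Penn describes can be as simple as a logistic function of prior-year admissions. The coefficients below are invented for illustration; a real model would be fit to the organization's own historical claims data:

```python
import math

# Invented coefficients for illustration only; a real model would be fit
# to the health system's own data (e.g., via logistic regression).
INTERCEPT, SLOPE = -2.2, 0.55

def readmission_risk(prior_year_admits: int) -> float:
    """Predicted readmission probability from a single predictor:
    the patient's number of admissions in the previous year."""
    return 1.0 / (1.0 + math.exp(-(INTERCEPT + SLOPE * prior_year_admits)))

# Risk rises monotonically with prior utilization.
print(round(readmission_risk(0), 3))  # 0.1
print(round(readmission_risk(4), 3))  # 0.5
```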
It takes an entire team to build predictive modeling excellence in your organization: mathematicians, epidemiologists, and other highly trained specialists, and, according to Peele, even a journalist. "If the C-suite can't consume it, it doesn't make any difference, and the C-suite is accustomed to getting their information filtered through a professional journalist."
(Talk about validating one's role in life.)
Just listening to Peele—a PhD economist by training—and Vanzandbergen—a physician assistant by training who fell into clinical IT work about six years ago—makes me confident that healthcare organizations can use predictive analytics to tame readmissions, and that's just for starters.
Join me on a webcast with clinical decision support experts from Penn Medicine and University of Pittsburgh Medical Center. They'll reveal how they harness predictive analytics to reduce readmissions, improve care, and lower financial risk. October 28, 2013, 1:00–2:30 p.m. ET. Register today.
More stringent privacy regulations now in effect mean hospitals and medical practices can expect random audits, higher fines, and a surge in formal complaints from patients who ask for, but do not receive, their medical records in a timely fashion.
This article appears in the September issue of HealthLeaders magazine.
Spurred by stricter regulation and closer enforcement, healthcare providers spent the summer scrambling to update their ability to abide by the federal privacy, security, and breach notification rules of the Health Insurance Portability and Accountability Act.
The new rules kick in on September 26, 2013. Providers can expect random audits, fines that now rise based on the number of records compromised, more frequent and sterner communications from HHS' Office for Civil Rights, and a surge in formal complaints from patients who ask for, but do not receive, their medical records in a timely fashion.
"Before, it said, when you have a breach, you can use your judgment to decide if there was risk of harm to the patients," says Pamela McNutt, CIO at the six-hospital Methodist Health System in Dallas. "Under the new omnibus rule, they actually gave some very specific criteria that you have to consider."
For instance, suppose someone left a box of records containing protected health information somewhere. Before the rule change, if the box turned up on the provider's doorstep or a third party handed it back, a breach notification normally did not have to be issued. Now, such breach notifications are mandatory.
Investigators remain lenient for first-time breaches if the breach is addressed properly. "If you haven't done your due diligence, then that's where you open yourself up to the fines," McNutt says. The new omnibus rules "just really put very solidly in writing exactly what you need to do to determine risk. It does turn it into 'assume you're guilty unless you can prove you're innocent.' "
OCR's promise of random HIPAA audits, even without a breach notification, is putting even more focus on compliance, McNutt says. "The privacy of patient records is not where it needs to be. We're having too many breaches.
"Most of the American public can understand somebody's laptop was stolen and it had some data on it, versus when you hear some of these other stories like some company found a hole in their Internet system and found out that people for years have been able to peruse patient records through their Internet. But I think the public's forgiveness is going to be based on how grievous they perceive the error was."
For providers without in-house expertise to train employees about security and patient privacy, training materials are available for sale, she adds.
Providers must do all this while at the same time expanding authorized access and exchanging protected health information with patients and other providers.
"The more we're pushing for transparency and interchange of records and patients being able to have a lot of access to their own records online, the more you have to think about security and privacy," McNutt says. "We want to give patients portals, but how can we make sure that we've made it secure enough that someone can't hack in and get that patient's records? This raises the bar on the need for security."
As with all corporate security, that can be a tricky balance. Easy-to-remember passwords may be less secure than more difficult-to-remember ones, for instance.
Two more factors arriving at the same time as the new HIPAA omnibus rule are the provider movement toward storing PHI in the cloud and the bring-your-own-device phenomenon among healthcare employees.
"You need to have cloud storage vendors agree to a business associate agreement to store company data," McNutt says. "One thing that's keeping a lot of CIOs up at night is the explosion of mobile devices and people's desire to do cloud sharing."
Some cloud providers are refusing to enter into business associate agreements with healthcare providers and, therefore, should not be considered for storing the provider's PHI-based data, McNutt says.
As providers enter into health information exchange agreements, they also can expect to spend considerable time discussing and crafting documents assuring that the appropriate risk assessments and HIPAA compliance steps are being taken in connection with PHI flowing to and from those HIEs, McNutt says.
"It took us over a year to go through contracts in regard to data sharing with the HIE," McNutt says. "Business associate agreements are important to legally protect an organization should a breach occur within the HIE. However, a breach by a provider's business associate could reflect back on the provider, causing reputational harm."
Providers must consider another challenge the HIPAA omnibus rule poses: If a patient pays in full and requests that the provider not bill his insurance company for the services, the provider has to honor that request.
"Most organizations are going to have to implement process and procedural changes to ensure that the patient's request is honored," McNutt says. "That includes tweaking your billing systems to make sure the patient is flagged in such a manner that all employees know that the patient's insurance should not be billed."
Even more important is to establish a culture of privacy in each organization. "When I've seen security firms come in and do security audits, generally the weaknesses are cultural and social, not so much the technology," says Brian Ahier, health IT evangelist at the 49-bed Mid-Columbia Medical Center and president of Gorge Health Connect, Inc., a health information exchange, both located in The Dalles, Ore., about 85 miles east of Portland.
Ahier notes the coming surge in patient complaints about being denied access to their electronic medical record. "The HIPAA omnibus rule expands that right now into the digital realm," Ahier says. "I'd be willing to bet that the first penalty that gets applied after September is going to be one not for a breach, but from a patient complaining about being denied their PHI. People from advocacy groups have been plastering letters around from the OCR explaining patients' access rights, with information on how and where to complain."
Ahier also contends that patients can request their electronic PHI be provided in an unencrypted format, even if they wish it to be emailed to a Yahoo or Gmail account—although sending such a transmission unencrypted is itself a breach of HIPAA.
Despite this possibility, other providers intend either to deny such requests or to make patients sign consent forms so that they understand the risks of receiving PHI in an unencrypted format.
"It's a substantial contradiction," says Ron Strachan, CIO at Community Health Network in Indianapolis, Ind. "That was an oversight in the rule development, something that's going to have to be corrected. Certainly sending it unencrypted to a public email provider like a Yahoo or a Google is the absolute wrong way to do it."
Strachan says if patients won't sign a release that holds the provider harmless for sending the EHR unencrypted, the provider should not be obligated to send the EHR that way.
Community Health, with annual revenues topping $2 billion, is a network of seven hospitals whose total bed count is approximately 1,500 and includes more than 200 ambulatory sites throughout central Indiana. CIOs such as Strachan aren't going it alone in their enterprises on such HIPAA decisions. Corporate privacy officers, compliance officers, and attorneys are part of the decision-making process, Strachan says.
"The way the people who I know in the business look at it, it's not a question of if you're going to have a breach," Strachan says. "The question is really when and how it is going to occur, and then how you react to your notification and the cleanup."
HIPAA's chief enforcement officer said as much at a June appearance at a patient privacy conference in Washington, D.C.
"Our rules do not prescribe a specific security approach or a specific kind of security, but they do require an actual process to evaluate whether in fact the things you are using are providing you an adequate level of security," said OCR Director Leon Rodriguez.
At the conference, Rodriguez was asked about the tension providers feel to provide healthcare data interoperability and data privacy simultaneously.
"I'm actually a person who thinks that tension is sometimes useful," Rodriguez said. "Tension helps you sometimes balance priorities, balance competing issues. To me, the patient always needs to be the fulcrum of the discussion. A lot of these questions ultimately can be resolved thoughtfully and correctly if both the interest and the dignity and autonomy of the patient are the fulcrum of the discussion and I think generally you'll end up in the right place on these issues."
Technology to oversee HIPAA compliance will play a role in achieving that balance. At CaroMont Health, "we've done what lots of other organizations have done, which is listen to every webinar, printed the omnibus rule and read it a bazillion times, and put together a to-do list of the things that we have to get accomplished in order to be in compliance before the enforcement date," says Donnetta Horseman, vice president of corporate responsibility at the system, which features a 435-bed hospital and 43 primary and specialty physician offices headquartered in Gastonia, N.C.
When she arrived in 2010, she had her work cut out for her: "We found some of our applications didn't even have audit logs turned on." She set about "sending the message [of security and privacy] and saying it in different ways so that while [staff is] hearing the same thing, you're making it interesting. We did carnivals and had games and gave away prizes. In our newsletter we'll do crossword puzzles and different things just to get people engaged."
But in addition, CaroMont has FairWarning software that analyzes its network and examines audit logs and presents at-a-glance summaries of this information.
"It's also a huge deterrent to employees who in the past were used to looking at their own records or records of their family members, even though we've always had a policy that that was not allowed," Horseman says. "There was never anybody watching, and so nobody was ever getting in trouble. Nobody was ever getting caught. So they just continued to do it.
"Then we put FairWarning on. Within the first month that we had it in, we had hundreds of alerts that were popping up. We sent all that information out to department managers and directors and said, 'Look, these are all the alerts for the people in your department. You need to be reinforcing the policy and doing the education,' and within the first two weeks after we started enforcing it, this inappropriate access just fell off the face of the earth."
Protecting healthcare privacy will never be simple. In a few short years, providers have evolved from unencrypted laptops being stolen or lost to more sophisticated threats, sometimes inside jobs. But as the HIPAA omnibus rule and the random audits kick in, regulations and enforcement will be harder for healthcare providers to ignore, as digital privacy and associated safe practices rise to their proper place alongside other healthcare safety practices.
In the second part of a two-part interview, departing National Coordinator for Health IT Farzad Mostashari, MD, discusses EHRs and quality measures, Regional Extension Centers, and VDT requirements.
In the second half of my exit interview with outgoing National Coordinator Farzad Mostashari, MD, of the federal Office of the National Coordinator, we discussed the evolution of electronic health record systems and concerns over quality measures. Part I of the interview is here.
Farzad Mostashari, MD, ScM
National Coordinator for Health Information Technology
HLM: Some argue that HITECH has funded the purchase of EHR software which is really not suited to the emerging value-based care system, that today's EHRs are overwhelmingly designed for fee-for-service and maximization of revenue.
Mostashari: I would say that their views are exactly correct, but a little outdated. When I testified before Congress seven years ago, that's what I said, and I said that EHRs today don't let you make a list. They don't let you measure quality. They don't collect smoking status, or blood pressure, in standardized ways. They don't offer decision support.
One EHR vendor's chief technology officer, during their acquisition process, said, 'we don't believe in decision support.' That's what they were focused on, and that was the whole point of Congress saying, we're not going to just pay for systems. We're going to pay for systems that have what it takes to improve care, and their use of it that way.
So when we see outpatient providers' ability to look at evidence-based guidelines, increasing within a two-year time period from 20-some percent to 50-some percent… that tells you something... That's data.
That's evidence that says that EHRs are changing, and their use is changing, because of the Meaningful Use incentive program. So I guess I would say EHRs are critical tools… and we can't afford to not use them as tools for population health management and consistent application of guidelines and care coordination and patient engagement.
It would be a tremendous loss if we just view EHRs as data-harvesting machines sitting on top of an army of data slaves, tapping away. That's not the vision that I have for electronic health records.
HLM: Jeff Immelt, the CEO of GE, was quoted recently as saying he had never seen an industry with so many measures that didn't matter. That gets to my question of quality measures. People are complaining, even in Congress, that there are too many of them. How do you respond to those concerns?
Mostashari: Patrick Conway and Carolyn Clancy and I wrote an article in JAMA which talked about, [and] I hope, laid out that vision for where we need to go in quality measurement. I would agree that there are a lot of legacy measures that don't really matter that much.
Part of what we laid out in the paper is [that] we need to move towards measures that matter, towards measures that are more for accountability purposes, more outcomes-oriented, and more parsimonious, more broad-based and parsimonious.
So if we have a thousand measures, [and] they're all about clinical processes for each different specialty, a chicken in every pot, well that's okay. But we should also have measures that apply to anyone who writes a prescription. A measure around medication safety and safe prescribing would apply to them, and we don't have a very good measure for that.
Anyone who gets a referral or sends a referral should be judged on the basis of how well they close that loop. And there were no measures for closing the referral loop until we worked with CMS to develop them for Stage 2 of Meaningful Use.
I mentioned medication safety. There was exactly one measure. I wouldn't say we had too many measures for medication safety. There was exactly one measure for safe prescribing. It was safe prescribing in the elderly, and there was a list of 100-plus medications that quote unquote elderly patients shouldn't be prescribed.
Unfortunately, it only accounted for three percent of medication errors that led elderly patients to the emergency room. What accounted for 40 percent was Coumadin (warfarin), and keeping people in the right therapeutic range for Coumadin, so they don't bleed out.
Was there a measure for that? Not until Stage 2 Meaningful Use, where we worked with CMS to develop one for that. So I guess I would say it's a little more complicated than that. We need measures that matter. We need to make measures in which the data for it from… routine care provisions, so we reduce the burden of it. And [we need to make measures] that are meaningful, that are longitudinal outcome-based measures that make use of the strength of electronic health records, and we are very much making progress on doing that.
What I think we need now is for big employers like GE to demand of their health plans that they all use the same damn measures, instead of providers getting 12 different signals from 12 different health plans, so that I think is another area we highlight in the paper as needed focus for quality measurements.
HLM: Touché. Is any of the pause in healthcare spending growth attributable to the role of technology?
Mostashari: We don't know, and I don't think that the role of technology in reducing the growth in healthcare spending is going to be something neatly attributable. It's going to be part of a broader system of changes in how we pay for and deliver care, for which health IT is an essential component, but not by itself able to be attributed to some portion of it.
So some portion of this is due to readmission adjustments, and the decline that we're seeing in readmissions. Has health IT contributed to that? Probably. I think it probably has contributed to that.
Certainly if you talk to hospitals that are reducing their readmissions, some part of it has to do with better communications and better adherence to standardized protocols. If you talk to any accountable care organization, in particular the folks who are successful or trending toward success on reducing costs, and you ask them, do you need health IT to do this? They'll say obviously we need health IT to do this.
It starts with being able to make a list of patients, and you can't do that on paper. So I guess I would say it's a little more complicated than that. But I do think that if we want to get to better care at lower cost, not just less care at lower cost. Less care at lower cost, that we can do.
In fact, we tried that, right? People didn't like it very much. What we're trying to do now is better care, better health at lower cost, and that means that we have to think differently, and we have to use every tool at our disposal, in particular information tools, and in particular the patients, and empower the patients. So those all, I believe, are going to be fundamentally enabled by the technology that we're laying the foundations for.
HLM: As part of your HIE roadmap, you urged that funding be extended for the Regional Extension Centers. Where is that funding going to come from?
Mostashari: We didn't propose an existing vehicle for that, but I think the issue is, are the extension centers adding value? Are they adding value to states? In which case I would hope that states would support them.
There's actually [a] 90/10 federal match available for states that wish, like Kentucky, to support the extension centers. I would ask [whether there is] value being provided to specialists and others who would get consulting for their practices for something like $5,000.
I don't know where you can get a consultant to come to your practice for $5,000, but if you think that's valuable, then I would hope that they would provide funding to the Regional Extension Centers. And I would hope that if the federal government is able to, and this brings us back to a little bit of our budget discussions, then I believe that it's been a very sound investment of public funds, and it would be, I think, terrific if there was continued support for extension centers beyond Stage 1.
HLM: I've heard many providers are scrambling to meet the "view, download, and transmit" requirement in Stage 2, some going so far as planning to put PCs in the lobbies of their hospitals, so patients will be encouraged to log in there. Other people complain they're in rural districts which have very poor broadband support.
Mostashari: I was just at a roundtable yesterday, and one of the providers was saying, "I can't imagine how we're going to do VDT." The other provider said, "Oh yeah, we met that, no problem. That's easy." And I said, well what do you mean it's easy? What did you do? And she said, "Oh, it's just workflows. We just tell patients, for example, that if they want to talk to us, the best way to do it instead of trying to leave a message is to message us on the portals. We said, if you want to get your lab values, go to the portal. We say, if you want to get your summary from this visit, go to the portal. And we make sure they have their user ID and their password when they leave. And that works." So it's all a question of implementation and workflows.
The departing National Coordinator for Health IT reflects on the effect of the sequester, says meaningful use Stage 2 is a done deal, and lauds the free market for EHRs.
Last week I held a final conversation with Farzad Mostashari, MD, before his tenure as National Coordinator for Health Information Technology in the federal Office of the National Coordinator (ONC) ends. In two years as the nation's health IT czar, Mostashari has become the face of meaningful use and an advocate for health information exchange. Here is part one of our conversation.
HealthLeaders: Everybody says you're a hard act to follow. Is there any news on that front?
Mostashari: National search…as has been said, we've had the right national coordinator for the right time, and I have a lot of faith in that.
HealthLeaders: What do you think has been your greatest achievement during your time as the head of ONC?
Mostashari: I guess I would demur a little bit at the idea that there's really any individual achievement in this whole thing, because the national coordinator doesn't do anything by themselves. It's 170 people here, and it's tens of thousands of people in the community to whom any achievements really belong. For myself, I think one thing I did, just from my perspective coming into this, was to focus on the outcomes in terms of population health. I think I brought a little more of an appreciation of that to the community, and also thinking about what we're going to need to do around chronic disease management and prevention for populations.
Then the other thing that maybe was a little surprise to myself was the pivot that we did to expand what we focus on to include consumer e-health and personally controlled health records. Those are two things that I may have nudged the direction a little bit more than where we were already going.
HealthLeaders: How about the biggest disappointment in those four years?
Mostashari: In the movement as a whole, what gives me a little pang every time…is when people say, We don't understand how the stuff you're doing on health IT fits into everything else that's going on around accountable care and new payment and delivery models, and how do these things fit together. I feel like, gosh, I failed, in that this is on me as kind of the communicator for ONC, in terms of having failed to make that connection more clear, because it is absolutely what we're trying to do.
All the discussions have been working backwards from those goals around care coordination, population health, and making sure that people have the tools to do that, and they really do fit together pretty well, but if there's something that kind of pains me is when people still don't see the connections.
HealthLeaders: A lot of people are asking for your advice for the next coordinator. I want to focus on two particular pain points. One is the possibility of a government shutdown and how that would impact the office. The other is just the continuing heartburn that the sequester has caused.
Mostashari: What can I say? I took over as national coordinator on April 8, 2011. It was the day the government was supposed to shut down. So my first act was to assemble all the folks at ONC and talk about the fact that they may need to go home, leave your Blackberrys, and you're not going to get paid until we don't know when. That did not come to pass. We have limped through with continuing resolutions and then sequester cuts without really an ability for the department to rebalance how we budget. Things are frozen at the same relative proportions between initiatives for years. That's like passing a household budget where your kid's in school now, but you still have to keep your budget for diapers. You can't increase your budget for school supplies. It's crazy. But despite that, what I would say to the next national coordinator on that is to lean on the community, and to tap into the desire that everybody has to help us succeed.
Beyond anything else, I think that's been such a gift for us at ONC—the willingness of people in the community to pitch in, to do pilots, to do implementation, to contribute their comments, to make it smarter, to pitch in and participate. But this is hard. Leading a federal agency like this is hard, and that's something that I'm sure the next national coordinator is going to be well aware of.
HealthLeaders: ONC and CMS recently put forth their roadmap for making health information exchange the norm rather than the exception. What additional regulation or legislation do you think will be required to get to that point, beyond what you've already announced?
Mostashari: I don't know. It's going to be something where we continue to have the roadmap. We know where we're going. We know what the leaders are. We know what our immediate steps are. And we know what the destination is, and we have such a rapidly changing and dynamic field, both in terms of the technology, in terms of policy, and in terms of payments and market forces, that we have to remain agile, and have the ability to assess what's happening and react. So I think where we are right now is the best balance that we've been able to craft in terms of permitting for innovation and flexibility, while providing guidance where there's low regret that that's going to freeze things. But that may change, and if it does, we have to be prepared to act.
HealthLeaders: There's been a chorus of calls for longer timeframes for meaningful use Stage 2, but you have hung tough. What are your thoughts on that decision and all of the outcry about it?
Mostashari: I think we've gotten credit for really, seriously, actively soliciting and actively listening and engaging with the field. So yeah, have we heard providers and hospitals and vendors say…and the associations, 'Yeah, it would be good to have more time'? Yeah, we've heard that, and we understand about the pressures that folks are under, and the speed with which change is happening, and the competing priorities. On the other hand, we also appreciate the calls of the other side, from vendors who say 'No, we're ready. We worked hard to be ready. We see that as our competitive advantage. Don't slow down.' From providers who say, 'We're ready to go,' and from groups like payers and health plans and purchasers and consumers and others who say, 'We can't wait for the benefits.' But ultimately, it's not about stakeholders finding a metric mean.
Good policymaking is about finding what's in the public interest, as you incorporate all of the information you get. It continues to be my belief that we can talk about Stage 3 but Stage 2 rulemaking is done. We already extended Stage 2 by a year, and then by another nine months, and we have a final rule. It is my belief that people need to get with the program on that, and we'll have another round of rulemaking about the timing for Stage 3.…[For] those people who need most to get moving and not hope for delays, hope is not a strategy, and they need to act now. It's getting late, and they need to get a move-on.
HealthLeaders: Do you have any sense of how many vendors who qualified for Stage 1 may not make the cut for the Stage 2 certification?
Mostashari: What I do know is that two-thirds of the market has already certified for Stage 2, by which I mean if you look at the market share, those vendors that account for two-thirds of the existing attestations have already certified, and there are more in the pipeline, so that's going to be something that we continue to watch. We continue to expect that the market pressures will have vendors put the necessary resources into it; there's a lot of revenue coming into vendors, and if they need to prioritize resources into getting the development dollars and the implementation dollars into place, then I would expect the market to reward those who do that, and to punish those who don't do that. That's the way markets work, or at least are supposed to work.
The other issue is, why exactly does it take so long for some vendors, not others, in terms of how long their development and implementation cycles are? Again, we've looked at some of our processes, and we've reduced the timelines for quality measure development, for our regulatory processes. We've reduced the timeline and cycle times for standards acceleration and endorsement, and I would expect that industry should also look to see what they can do to reduce their cycle time, which includes some of the legacy vendors.
HealthLeaders: Have you been concerned about reports of lots of rip-and-replace of EHRs? Some of it has been attributed to a provider that bought a product maybe for Stage 1, but found that product was inadequate to cover what Stage 2 demands.
Mostashari: Stage 2 is more demanding. It's supposed to be more demanding. I guess I'm of two minds. If there is replacement happening because providers are changing their affiliations and there's been mergers or acquisitions, that's understandable, and there's a lot of that going on, and there are reasons for people to want to be on a single system. There may also be good reasons why someone might say, 'I like that better, and I'm not locked in to a single vendor.' Markets work well when you get to choose one, but then you're not stuck with a vendor for the rest of your life. Good markets exhibit switching behavior, and if there are vendors who are more agile, more responsive, better customer service, better usability, that people want to switch to, then the overall quality of the marketplace will improve, and innovation will be rewarded.
That's a good thing. If, on the other hand, people are left high and dry because there are poor business practices, if someone is a perfectly good vendor that people like that is bought out and that product discontinued to force people to switch over to another market, then that is less good for the overall benefit. I would say this is another one of those where we have to really observe and understand what's happening, not only [with] the only remedy being to slow down, but think about how the market participants, including importantly the customers, can exert their influence on creating a more perfect market.
Next week, the second half of my conversation with Mostashari: too many quality measures, how much credit IT should get for lowering costs, and whether today's EHR software is outdated.
If providers keep up with the tools available to patients, and turn yesterday's one-page brochure on a disease into tomorrow's Web site or mobile app, they will continue to be at the center of patient care.
What's the value of health information technology?
That's the question being asked during this week, National Health IT Week, organized by the Healthcare Information and Management Systems Society (HIMSS).
Certainly health IT has been very good to the members of HIMSS. The billions in Medicare incentive money paid out to providers in the past several years have enriched participating health IT vendors in a way that few portions of the IT industry have been able to enjoy, even considering the dot-com boom and the Y2K scare.
In Verona, WI this week, attendees of Epic's annual user conference are marveling at the company's brand new conference center, which holds 13,000 people. That's one big corporate conference center.
Out here on the West Coast, unionized nurses at Sutter Health targeted Epic in a press release titled "Sutter's $1 billion boondoggle," which described a scheduled eight-hour EHR outage followed by an unscheduled Monday morning outage that the nurses claimed exposed patients to risk.
Officials at Sutter sent me this statement:
"Sutter Health undertook a long-planned, routine upgrade of its electronic health record over the weekend. There's a certain amount of scheduled downtime associated with these upgrades, and the process was successfully completed.
"On Monday morning, we experienced an issue with the software that manages user access to the EHR. This caused intermittent access challenges in some locations. Our team applied a software patch last night to resolve the issue and restore access.
"Our caregivers and office staff have established and comprehensive processes that they follow when the EHR is offline. They followed these procedures. Patient records were always secure and intact.
"We appreciate the hard work of our caregivers and support staff to follow our routine back-up processes, and we regret any inconvenience this may have caused patients."
I tried to learn more about the risks to which the nurses said patients had been exposed, but a spokesman for the California Nurses Association has yet to provide details. One press report stated that patients were not getting their medicine throughout the day, and that the glitch originated not with Epic's software but with virtualization software from Citrix, which increasingly controls how desktop and tablet users access applications.
Meanwhile, on the complete opposite end of the health IT spectrum, physicians and patients are fascinated with the latest innovation to hit the operating room, Google Glass.
Let's hope those glasses aren't rose-colored. Somewhere between the devilish details playing out at Sutter and the innovations of Google and others, all while U.S. healthcare is going through an unprecedented transformation from volume to value, is the reality of health IT's value.
It's not a panacea, but without health IT, we probably wouldn't even dream of healthcare's creative destruction or innovative ways of engaging patients.
Geeta Nayyar, MD, is a practicing rheumatologist at Florida International University in Miami whose other job is CMIO of PatientPoint, one of the largest providers of TV-based education in practice waiting rooms and exam rooms – and, more broadly, of any technology required to engage patients in managing their own care.
"You want to make sure the content that people are reading is accurate, which is a large part of what we're doing at PatientPoint," Nayyar says.
But how much difference can this make in an age when the vast majority of the population consults Google or a smartphone's app store for medical advice before ever contacting a doctor or nurse?
"It's a great question," Nayyar says. "Part of being an empowered patient is knowing where to look. When I am thinking of starting [patients] on a new regimen, I tell them, please do go out there and find information, but I want you to find the information from these trusted sources that I think are good – the Arthritis Foundation, the American College of Rheumatology, the Lupus Foundation. Absolutely go online, talk to your friends, but these are the top five sources that I trust as your physician."
To me that defines not only the value of health IT, but also the value of the healthcare provider in this Google age – as curator, guide, Sherpa, coach and counselor.
Not quite the authority figure of old, but if providers keep up with the tools available to patients, and turn yesterday's one-page brochure on a disease into tomorrow's Web site or Pinterest page or mobile app, they will continue to be at the center of patient care.
As Nayyar notes, "anybody who reads the side effects on something like Tylenol would be terrified of taking it." It's still the clinician's role to put the plethora of information out there in context.
I am struck by the number of practices, however, that have a long way to go beyond just installing an EHR. Scheduling technology, or the lack thereof, continues to be a barrier to coordinated care. Recently I heard someone remark how much better an experience it would be if an elderly patient with five chronic diseases could go to a single exam room, then have visits with all her specialists in that same room, rather than have to get from office to office – an exhausting experience for many elderly patients.
Even better, how about being able to have multiple specialists at the same appointment, as appropriate?
To me, it is those kinds of patient expectations that are defining the new value of health IT.
That requires some scheduling technology. Sounds simple, until you realize that EHRs may or may not be able to help, that each practice may have its own silo of scheduling software, that each provider carries a mobile device – probably his or her own – that may or may not be in sync with the office's schedule, and that the patient has his or her own schedule, which ought to be able to coordinate with the providers' schedules.
But too often, we spend too much time and too many steps trying to sort all this out. The value of health IT is multiplied when it adheres to standards in everything from medical records to scheduling.
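The underlying computation isn't exotic. Here is a minimal sketch – in Python, with entirely made-up calendars – of the kind of free/busy merge that standardized scheduling data would make possible across practices:

```python
from datetime import datetime, timedelta

def first_common_slot(busy_calendars, window_start, window_end, duration):
    """Return the earliest start time free in every calendar, or None."""
    # Pool every provider's busy intervals, then merge the overlaps.
    busy = sorted(iv for cal in busy_calendars for iv in cal)
    merged = []
    for start, end in busy:
        if merged and start <= merged[-1][1]:
            merged[-1] = (merged[-1][0], max(merged[-1][1], end))
        else:
            merged.append((start, end))
    # Walk the gaps between merged busy blocks, looking for one big enough.
    cursor = window_start
    for start, end in merged:
        if start - cursor >= duration:
            return cursor
        cursor = max(cursor, end)
    return cursor if window_end - cursor >= duration else None

# Two hypothetical specialists, one day, one shared exam room.
day = datetime(2013, 9, 16)
cardiology = [(day.replace(hour=9), day.replace(hour=11))]
rheumatology = [(day.replace(hour=10), day.replace(hour=12))]
slot = first_common_slot([cardiology, rheumatology],
                         day.replace(hour=8), day.replace(hour=17),
                         timedelta(hours=2))
print(slot)  # the first two-hour window both specialists are free: noon
```

The hard part, of course, isn't the merge; it's getting every silo to expose its busy times in a common format in the first place.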
Savvy providers will always find ways to innovate on top of basic services and standards, to stomp out waste and inefficiency and delight employees and patients. True, the landscape has changed forever, and there will come a day when most patients will hardly ever visit a doctor in person. But we're in the business of health, not healthcare.
"We all wish we had five hours with each patient, but it's not realistic," Nayyar says. "The average office visit is like 8 to 10 minutes and typically docs are dealing with several disease states, but that's why I think health technology is so great, because you're able to extend that conversation beyond the office walls."
So, Happy National Health IT Week. We may not be a nation of e-patients yet, but somewhere between the Google search and the office encounter, we're learning.
If CMS could be sued for HIPAA violations, it would be. But behind tales of government inefficiency and inertia is a tremendous debate. Summed up, the very technology that could solve our identity and fraud problems could open up tremendous privacy concerns.
Depending on who you talk to, Medicare fraud is estimated to be a $48- to $120-billion-a-year problem in the United States. Yet, for all the technology this country cranks out, surprisingly little so far has been applied to combating this problem. Could it take another act of Congress?
On August 15, Rep. Jim Gerlach, a Republican from Pennsylvania, introduced H.R. 3024, the Medicare Common Access Card Act of 2013.
Under the proposal, within 18 months of passage, the HHS secretary would conduct a pilot program utilizing smart card technology for Medicare beneficiaries.
Smart cards are devices that contain an embedded integrated circuit chip that can be either a secure microcontroller or equivalent memory, or a memory chip alone. That's the definition put forth by the Smart Card Alliance, a trade association that supports H.R. 3024. Other supporters include the AARP, the ACPE (American College of Physician Executives) and the AAOS (American Association of Orthopaedic Surgeons).
Smart card technology is already commonplace in employee key cards, transit cards, credit cards (outside the U.S., and in the U.S. starting by 2015), and more. I even have a smart card that allows me to easily rent bicycle locker space at transit stations in the San Francisco Bay Area, at the big-ticket rate of 1 to 3 cents per hour.
Meanwhile, today's Medicare card is a piece of paper with no intelligence. Dare I say it, it's downright stupid. That's because the Medicare member's Social Security number is printed right on the card.
Yes, that means everyone who comes in contact with that card, from clerks on their first day on the job to EMTs making a midnight run, has access to that Social Security number.
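There is nothing technically hard about fixing this. One common approach – sketched below in Python, with a made-up format that is not any actual CMS scheme – is to print an opaque identifier derived from, but not revealing, the Social Security number, while the issuer alone keeps the key and the mapping:

```python
import hmac, hashlib

def issue_card_id(ssn, issuer_key):
    """Derive an opaque card number so the SSN never appears on the card.

    The issuer keeps the key and the SSN-to-ID mapping; anyone who handles
    the card sees only the token. (Hypothetical format, not CMS's scheme.)
    """
    digest = hmac.new(issuer_key, ssn.encode(), hashlib.sha256).hexdigest()
    # Three readable four-character groups; the SSN is not recoverable from them.
    return "-".join(digest[i:i + 4].upper() for i in range(0, 12, 4))
```

A clerk or EMT who copies the card number down gets something useful only to Medicare itself.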
If CMS could be sued for HIPAA violations, it would be.
But since it can't, I am left wondering why Medicare is so far behind the rest of society, and facing its own share of responsibility for the fraud and inefficiency so often ascribed to it. The truth is that behind tales of the same old government inefficiency and inertia is a tremendous debate about the role that digital identity plays in our modern world.
Summed up, the very technology that could solve our identity and fraud problems could open up tremendous privacy concerns, due to the very powerful effect that digitizing all our personal details has on the ability to aggregate and, unfortunately, abuse that information.
5 Data Problems
1. Consumers have no easy way to read the information stored on the smart cards they carry. So they can't verify the accuracy of that information without a lot more help.
2. Smart cards may help verify a patient's identity at the clinic, but they provide no benefit for the consumer at home trying to log into a patient portal or other online health service, again because there are no home readers, nor standards for them.
3. Government is really good at building silos of information, one act of Congress at a time. Government is really bad at integrating these silos of information quickly or inexpensively. At a HIMSS analytics conference this June, one speaker said that CMS alone has multiple data warehouses, built over the years, which have great difficulty sharing information with each other.
4. HIPAA currently has a prohibition against the federal government planning or even researching a national patient identifier system. H.R. 3024 claims the Medicare smart card pilot will be compliant with HIPAA. That should be interesting to watch, particularly after the HHS lawyers get through with it.
5. Should we have a smart card for every possible use in society? That's the direction modern society is going. Library cards, brand loyalty cards, insurance cards, keys as cards – they're all getting smarter. But no one is making wallets any bigger. Or if they are, they shouldn't be. We will need flexibility, so at the consumer's discretion, they can use one smart card in multiple ways.
With all these problems, it's no wonder that "analysis paralysis" seems to be the order of the day. Now let me suggest what can be done about it.
5 Proposals
1. We should let Rep. Gerlach and his Medicare smart card allies make their case. Similar legislation was introduced in the last Congress but didn't get anywhere. This time, let's hold a hearing. Capitol Hill holds lots of hearings about what's wrong with healthcare in this country. It's time for (another) hearing or two about the role that technology can play to solve the identity problem, the fraud problem, and the problem of Congress building one information silo after another.
2. Let's look around the world to see if anyone else has solved this problem, and see what we can learn from them. Taiwan has the lowest administrative cost of healthcare in the world – two percent, according to Kelli Emerick, executive director of the SecureID Coalition. One reason: They use smart cards. And I am told that Canada may have some clever ways to roll out a national patient identifier.
3. Let's put some effort into the public/private partnership that is NSTIC, the National Strategy for Trusted Identities in Cyberspace. It is the umbrella group established by executive order in 2011. NSTIC describes an "identity ecosystem" that allows individuals and organizations to trust each other through a set of agreed-upon standards and practices.
NSTIC has convened a healthcare committee which has regular conference calls, and could benefit from greater participation by providers. Already, a number of major stakeholders are participating. It is also conducting its own pilot, with the help of five awardees.
4. Engage with a group that's just been announced, the Medical Identity Fraud Alliance. Supporters include AARP, the Blue Cross and Blue Shield Association, the Consumer Federation of America, ID Experts, the Identity Theft Resource Center, and the National Health Care Anti-Fraud Association. Get some providers involved in that effort, for a less piecemeal approach.
Adrian Gropper, MD, is chief technology officer of Patient Privacy Rights (PPR). He has a deep understanding of NSTIC's concept of the identity ecosystem. We spoke last week, and I noted the irony that NSTIC faces challenges receiving further funding to solve the identity problem the right way, while at the same time H.R. 3024 proposes allocating $29 million for the Medicare smart card pilot.
Maybe these two government initiatives should get together and share expertise and funds, I suggested to Gropper.
"That's a very nice idea," Gropper told me.
Now before you write me: yes, there are many ways to fight Medicare fraud with technology besides figuring out the identity and smart card problems. Algorithms are already at work, and getting smarter, at detecting patterns of abuse. The Medicare regulations themselves probably still contain an encyclopedia's worth of loopholes that permit waste and fraud, loopholes that should be closed.
But in an age when libraries do a better job of protecting our privacy than healthcare does, and when the average wallet has an impressive array of security-powered smart cards, surely Medicare, and the rest of the healthcare system, can be doing better than it is.
The personal computer has done a lot of good for clinicians in hospital settings, but its days are numbered. Zero-client terminals are quieter, safer, and cheaper to buy and to maintain.
Look around the modern U.S. hospital, and you'll see fewer and fewer traditional personal computers.
That's a good thing. PCs helped bring technology to the masses and powered electronic medical records in a way that previous computing had barely dented. But that was then, and this is now, and as aged PCs get refreshed, they are being moved out for "zero-client" terminals that IT organizations simply plug in.
Behind the scenes, though, is a complex set of technologies that allows users to keep using the same desktop and apps as before. But things are different:
Zero clients have no hard disks, and in fact, usually run only a small kernel of Linux software in flash memory, and thus require almost no updating.
With no spinning hard disks, cooling requirements are minimal, so the hardware is longer-lasting and quieter.
Virtualization technology, coupled with proximity technology that senses a user's name badge not unlike modern door access systems, allows a user's desktop to follow him or her around a facility while requiring the user to enter a user ID and password only once during a shift.
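To make that last point concrete, here is a toy sketch in Python of badge-based session roaming. The names and logic are my own illustration, not Imprivata's actual API:

```python
class SessionBroker:
    """Toy model of badge-tap desktop roaming (an illustration, not a real API)."""

    def __init__(self, password_db):
        self.password_db = password_db  # user -> password, checked once per shift
        self.sessions = {}              # user -> terminal currently showing the desktop

    def badge_tap(self, user, terminal, password=None):
        if user not in self.sessions:
            # First tap of the shift: full authentication required.
            if self.password_db.get(user) != password:
                raise PermissionError("password required on first tap of a shift")
            self.sessions[user] = terminal
            return f"new desktop for {user} on {terminal}"
        # Later taps anywhere in the facility: the session follows the badge.
        self.sessions[user] = terminal
        return f"desktop for {user} moved to {terminal}"

    def end_shift(self, user):
        self.sessions.pop(user, None)  # next tap will require the password again
```

The point of the sketch: the state lives in the data center's broker, not on any terminal, which is exactly what makes the desktop able to follow the badge.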
The most recent annual survey by Imprivata, which makes the technology that authenticates those name badges to start sessions with virtualization software from Citrix, VMware, and Microsoft, found for the first time that a majority of hospitals are using thin or zero clients instead of traditional PCs.
That same survey found that two years from now, 98 percent of those surveyed expect to be using thin or zero clients as part of their IT strategy.
Thin clients, a previous incarnation of this same idea for reducing the complexity and increasing the manageability of a PC, still required the kind of occasional software updates and patches that drive CIOs to distraction.
The latest zero client hardware won't be found at your local Wal-Mart, but CIOs know the channels where they can find these increasingly commodity-like terminals able to serve up a Windows desktop.
Virtualization is an idea almost as old as computing itself, having been popularized by IBM on its 360 mainframe in the 1960s. Even running virtualization on a PC is not novel anymore. But the move toward a totally virtualized desktop as a mass phenomenon, particularly in healthcare, is just now pulling into the station.
I first witnessed virtualization in healthcare firsthand watching a clinician log in with her name badge at a thin client when I was a Kaiser member eight years ago. But Kaiser wasn't the only trailblazer. Another provider who implemented virtualization on PCs starting in 2002 is Memorial Healthcare, a 150-bed hospital in Owosso, Michigan.
Memorial Healthcare moved strategically from thin clients to zero clients a year ago.
"As healthcare changes over the next four or five years with the Affordable Care Act and with industry pressures to reduce cost, virtualization is going to become a key component to creating efficiencies that right now we just don't have," says Frank Fear, vice president of information services.
Fear says one of the biggest advantages – and an occasional drawback – to zero clients at Memorial is this: every morning, or whenever staff members arrive for a 12-hour shift, they create a brand-new virtual desktop. The previous desktop's state is never saved. The data stays safe in the data center, never stored locally on the zero client.
The drawback – really just a different way of doing things – comes when a software patch must be installed. Under the traditional PC model, that meant enlisting the help desk to update each machine. Under the virtualization model, only the master image pushed out to the zero clients needs to be patched. That requires a different skill set in the data center, but it saves considerable time, because the patch essentially has to be applied only once.
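The patch-once economics can be illustrated with a toy model. This is a sketch of the concept, not any vendor's implementation, and the package names are invented:

```python
class GoldenImage:
    """Toy model: every zero client boots a fresh clone of one master image."""

    def __init__(self, version, packages):
        self.version = version
        self.packages = dict(packages)

    def patch(self, package, new_version):
        # One change, made once, in the data center...
        self.packages[package] = new_version
        self.version += 1

    def boot_desktop(self):
        # ...and every desktop built afterward is already patched.
        return {"image_version": self.version, "packages": dict(self.packages)}

image = GoldenImage(1, {"ehr-client": "7.0"})
image.patch("ehr-client", "7.1")          # patched once, not once per machine
print(image.boot_desktop()["packages"])   # every new session sees 7.1
```

Contrast that with the old model, where the same patch had to be repeated on every physical PC in the building.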
As the HIPAA Omnibus Rule breathes down healthcare IT's neck, with hair-raising tales of breaches that start with rogue USB drives and missing hard disks, the kind of centralized management that virtualization and zero-client technology provide is a siren song.
And right behind that is next April's retirement of Windows XP, still in use on too many desktops in healthcare. Virtualization is the natural replacement for XP, although it requires the master image to be at least based on Windows 7 if the same Windows apps will be used, Fear says.
Laptops and tablets and phones aren't immediately touched by this virtualization wave the way the desktop is, although my cover story on tablets back in January found a number of healthcare IT shops allowing access to desktops through virtual sessions implemented on tablets.
But healthcare IT executives such as Fear are sleeping better at night thinking about the control, security, and management options that the new virtual desktops provide. Plus, the zero client hardware will outlast the older PC hardware and is even a bit cheaper to buy.
Deploying virtualization enables a lot of other intriguing possibilities. Users with heavy computing needs, such as radiologists, have traditionally dictated the specifications of a hospital's standard PC. But in a virtual world, those who need more computing power on their desktops can have it served to them over the network, according to officials at Imprivata.
Zero-client hardware nevertheless benefits from commodity graphics capabilities that would have required a high-end gaming PC just a few years ago.
When I hear the rising chorus of calls for more effective return on investment in healthcare IT, it's clear to me that job one for IT right now is hauling inefficient, buggy, costly-to-maintain traditional PCs off to the scrap heap and moving in the direction that Kaiser, Memorial Healthcare, and others have cleared. The writing is on the wall.
A provider who went through a Meaningful Use audit explains how she got through it, relying on training received at a user group conference and the help of her state's Regional Extension Center.
Don't be surprised if every third word I write for the next year or so is about Meaningful Use. It's a mountain we all must climb, but we're now well up on the slopes and have already encountered some challenges of late. No wonder that the calls for delay keep piling up.
But I have a snippet of good news on the Meaningful Use front this week. According to industry consultant Frank Poggio, the list of EHR vendors certified for the 2014 Edition of Meaningful Use, including Stage 2, now includes Epic, McKesson (Paragon only), Allscripts, Meditech, HMS, and CPSI.
According to Poggio, that still leaves Cerner, GE, Siemens, Healthland, QuadraMed, NTT-Data (Keane) and a host of others not yet certified. But hey, it's a start.
And more good news: for the first time, I've found a provider who went through a Meaningful Use audit, and survived.
South Arkansas Eye Clinic, founded in 2000, is the largest eye clinic in Union County, Arkansas, based in the county seat of El Dorado. Eighteen months ago, it implemented Allscripts PM practice management software and Medflow, eye-care-specific EHR software certified for Meaningful Use.
Shortly thereafter, Jeannie Atkinson joined the clinic as practice manager, which is how she came to be the lucky recipient of a CMS notice: the clinic had been selected for a pre-payment Meaningful Use audit.
"Lucky you," I congratulated Atkinson.
"I know," she replied with a laugh.
At least Atkinson was prepared. At a Medflow user group conference, she took a class on how to survive a Meaningful Use audit. "I was glad that I did, because it started me thinking about, okay, so what if I did get audited?" she said. "Would I be ready for it?"
Atkinson had another valuable resource to turn to: HITArkansas, the state's Regional Extension Center.
"They had created a binder for us that walked us through the core measures we needed to attest to, what measurements we needed to meet, what documents we needed to keep, what security risks [existed], the way we documented."
Receiving the notice of audit triggered a brief feeling of panic, but then Atkinson called Valerie Moring, implementation specialist at HITArkansas.
"When Jeannie had originally called me, I sent an email out to all of the other specialists that we worked together with, to see if anyone else had received requests like this yet," Moring says.
They had not. It was also one of the first pre-payment Meaningful Use audits. "Initially when Meaningful Use came on, a lot of the providers who jumped on board really fast, they attested, and CMS basically just paid them right and left," Atkinson said.
In fact, according to CMS, approximately 5 to 10 percent of providers will be selected for pre-payment audits, chosen both randomly and through protocols that identify suspicious or anomalous attestation data. Post-payment audits will also affect approximately 5 to 10 percent of providers who submit attestations through the program. The scope of both the pre- and post-payment audits is consistent with CMS's audit strategies for similar programs.
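To see how a mix of random and rules-based selection might work, consider this sketch. The anomaly rule below (every measure reported at exactly 100 percent) is an invented illustration, not CMS's actual protocol:

```python
import random

def select_for_prepayment_audit(attestations, rate=0.075, seed=2013):
    """Flag roughly 5-10% of attestations: a random draw plus an anomaly rule.

    `attestations` maps provider -> reported measure ratios. The anomaly
    rule here (all measures exactly 100%) is a made-up example.
    """
    rng = random.Random(seed)
    selected = set()
    for provider, measures in attestations.items():
        too_perfect = all(v == 1.0 for v in measures.values())
        if too_perfect or rng.random() < rate:
            selected.add(provider)
    return selected
```

Either path into the selected set triggers the same documentation request, which is why keeping that audit binder current matters even for providers who attest honestly.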
In effect, CMS said, "Wait, hang on just a minute. We shouldn't be auditing after the fact. Maybe we should stop, slow down, and before we give the money out and have to take it back, let's start doing pre-payment audits." And that's what it started doing this year.
CMS speeds up the audit process by providing a portal, operated by the accounting firm Figliozzi & Company, allowing providers such as South Arkansas Eye Clinic to scan and upload required documents, rather than trusting them to the U.S. Postal Service.
With recent talk about the forthcoming ending of funding for HHS' national network of Regional Extension Centers, it's worth noting that HITArkansas is part of a larger concern, the Arkansas Foundation for Medical Care, a 40-year-old state quality improvement organization.
According to HITArkansas director Nathan Ray, his REC has helped more than 1,400 primary care providers and specialists around the state. "We've got over 52 percent of our target to Meaningful Use," he says. "We're out there helping clinics such as Jeannie's with actually making sure that they are compliant with the Meaningful Use criteria, helping them where there are gaps."
I was curious about what is in HITArkansas's helpful binder. According to Ray, it covers assessment and planning; selecting an EHR vendor; implementation tools, checklists and strategy guides; a post-go-live EHR evaluation guide; Meaningful Use criteria; security risk analysis; and details of registration and attestation.
Ray says different institutions have different ways of saving paperwork in case of audit. Some use Microsoft Sharepoint or similar document management technology. Others upload it to a secure area of their network.
"We had actually backed ours up on a shared hard drive," Atkinson said. "It was a really easy, simple process just to go there and like Valerie says, just zip all that up that they were requesting and asking for, and submitting it to them."
The clinic received two requests from CMS. The first, in March, asked for supporting documentation on all of the Meaningful Use quality measures. Then, at the end of May, CMS requested more documentation on Core Measure 5, the Active Medication List.
Atkinson once again called Moring, who helped her gather the final round of documentation and draft a cover letter. Finally, in July, the final letter from CMS arrived: South Arkansas Eye Clinic had passed the audit.
It must have been a great feeling.
"It was," Atkinson said with a laugh. And it didn't hurt that the Meaningful Use Stage 1 incentive check arrived shortly thereafter.
One other note for specialists reading this: Atkinson checked with the American Association for Ophthalmology Practices and the regional Arkansas Eye Association to determine and confirm which exclusions to Meaningful Use quality measures applied to them. "A good example would be, when you go into a doctor's office, they're going to take your basic vitals and record your height and weight and your blood pressure," Atkinson said. "We don't do that. It's not in our scope of practice, and so we took that as an exclusion." Other specialists are well-advised to check with their own professional associations for similar information.
Want more good news? Atkinson says future CMS Meaningful Use guidelines will change some criteria to be more specific for specialists, meaning faster attestation and simpler audits. Hooray for that.
Hoarding medical imaging data is one of the most highly profitable, and strategic, tactics of hospitals competing in a fee-for-service market. It also represents a huge opportunity to reduce the high cost of healthcare as reform comes online.
One of the less-discussed menu objectives in Meaningful Use Stage 2 is a requirement to use EHRs to receive more than 10 percent of imaging results. Given the current crunch surrounding the core objectives of Meaningful Use Stage 2, it is understandable that not much is being said about this requirement.
Nevertheless, sharing images goes to the heart of what is possible with healthcare IT. The generation of medical images costs a fortune. Under a fee-for-service model, generating the maximum possible images out of the various departments of a hospital is a huge source of revenue.
As we know, the fee-for-service system is hanging on for the foreseeable future. My guess is that if you stripped away the generation of duplicative, unneeded medical images, you would be looking at the difference between a profitable hospital and that same hospital running at a loss.
From the technology standpoint, medical imaging systems have been big-ticket items. It's probably true that hospitals in the 1980s, 1990s and 2000s engaged in an imaging arms race, making massive investments in systems that pushed the imaging state of the art to where it is today. Their installation drove many a press release and marketing campaign.
A July 25 op-ed in The Wall Street Journal faults the annual U.S. News and World Report "Best Hospitals" survey for focusing mainly on the degree to which hospitals use certain cutting-edge technologies. The main example offered is robotic surgery, but imaging is probably a close runner-up.
Once those hugely expensive systems get installed, however, the hospital's imperative is to keep those imaging systems humming. If those systems aren't generating maximum possible revenue, they become cost centers, not revenue generators.
Now factor in standalone imaging centers that jump into markets, install newer systems, and steal some imaging business away from the hospitals in that market. That just adds to the pressure hospitals feel to maximize utilization.
It's a high-stakes game that won't be changed tomorrow by the Meaningful Use Stage 2 imaging menu option. Quoting Farzad Mostashari as I did last week, "We cannot have it be profitable to hoard patient information and unprofitable to share it."
Arguably, hoarding imaging data is one of the most profitable, and strategic, tactics of hospitals competing in a fee-for-service market. It also represents a huge opportunity to bend the cost curve of healthcare. But because image sharing has been introduced only as a menu item for Stage 2, we can expect it to have little real impact until Stage 3 is implemented and it becomes a core requirement. Don't look for that before 2017 at this rate.
Six months ago, I wrote at length about enterprise imaging, the effort to bring together a common architecture for all medical images. The more I learn about the dominance of the radiology-oriented DICOM format, the more I realize that dominance is impeding innovation in enterprise imaging and image sharing between enterprises.
"Whenever I criticize DICOM in any way, generally pitchforks and torches greet me at my doorstep," said John Halamka, CIO of Beth Israel Deaconess Medical Center, during a discussion of standardization of image sharing at the July 19 meeting of the HIT Standards Committee Clinical Operations Workgroup.
"DICOM is a wonderful format for radiologists who have dedicated workstations inside an institution, but does have challenges in an Internet-enabled, mobile-enabled, Android and iOS kind of world," Halamka added.
"DICOM to me has been a wonderful standard," replied Hamid Tabatabie, founder and CEO of Life Image, Inc. "It has made radiology be years ahead of all the other 'ologies' in ability to transfer and share files. But [here] we are years after; we have run out of lipstick to put on the pig and we can use a new thing."
Now, DICOM isn't dead yet. One recent innovation in sharing DICOM images, the RSNA Image Share, is going strong and growing. But the various specialties each have their own spin on imaging. "An EKG isn't an image," Halamka noted at the July 26 meeting of the HIT Standards Committee Clinical Operations Workgroup. "An EKG is a time series. It's a waveform. And it has absolutely nothing to do with a picture… conceivably, it could be represented as text."
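As a small illustration of why DICOM objects need format-aware tooling even to be recognized, never mind rendered on a phone or in a browser: a DICOM Part 10 file begins with a 128-byte preamble followed by the four magic bytes "DICM". The sketch below is only a demonstration of that signature check (the file name and helper are my own; it writes a stand-in header, not a real image):

```python
def looks_like_dicom(path):
    """Check the DICOM Part 10 magic: a 128-byte preamble followed by b'DICM'."""
    with open(path, "rb") as f:
        header = f.read(132)
    return len(header) == 132 and header[128:132] == b"DICM"

# Build a tiny stand-in file with a valid DICOM header. Real files carry
# binary data elements after the prefix; this is just the magic bytes.
with open("sample.dcm", "wb") as f:
    f.write(b"\x00" * 128 + b"DICM")

print(looks_like_dicom("sample.dcm"))  # True
```

Everything after those magic bytes is binary tag-value pairs in one of several transfer syntaxes, which is exactly why a generic web or mobile client can't simply open the file the way it would a JPEG.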
At the opposite end of the spectrum is pathology, where it's all about the image. The challenge is that each of these specialties has its own related non-visual multimedia, ranging from text to waveforms to numerically analyzed specimens.
Bringing pathology into the digital age presents a particular challenge to healthcare. At one extreme are "frozen sections" whose display and analysis may need to be provided remotely while a patient is awaiting surgery for removal of a tumor. There, time is of the essence.
At the other extreme are untapped research resources such as the Joint Pathology Center, which holds a repository of 60 million glass slides. "It's a tremendous data store that no one has access to," says Mark J. Newburger, CEO and president of Apollo, which provides enterprise patient multimedia PACS to hospitals ranging from Henry Ford Health System to the University of Illinois.
Newburger, an industry veteran, says the future lies in turning today's proprietary data stores and imaging systems into a set of device drivers, much like those that platforms such as Windows and OS X provide. Apollo built such an imaging platform in collaboration with The Hospital for Sick Children in Toronto, and even made it possible for clinicians there to build their own drivers to connect with other image stores.
Meanwhile, hospitals have just now reached the 1990s in one regard. Patients are now bringing in hoards of images on compact discs. Massachusetts General Hospital performs 750,000 imaging exams a year, but also takes in more than 100,000 exams per year from patients carrying CDs into the facility, according to Keith Dreyer, MD, vice chairman of radiology at Mass General.
According to Life Image, 22 percent of consumer-provided CDs will not open when they arrive at a hospital. More than half result in additional re-radiating tests being performed. And even when a CD does open, copying it takes four minutes.
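To put those numbers in perspective, here is a back-of-the-envelope calculation using only the figures quoted above (the annual volume is Mass General's; the failure rate and copy time are Life Image's). It is a rough sketch, not a cost study:

```python
# Rough annual burden of patient-carried CDs at a Mass General-scale hospital,
# using the figures quoted in the column.
cds_per_year = 100_000     # outside exams arriving on CD each year
failure_rate = 0.22        # share of CDs that will not open
minutes_per_copy = 4       # time to copy each CD that does open

readable = cds_per_year * (1 - failure_rate)
copy_hours = readable * minutes_per_copy / 60

print(f"Readable CDs per year: {readable:,.0f}")       # 78,000
print(f"Staff hours spent copying: {copy_hours:,.0f}") # 5,200
```

Thousands of staff hours a year spent feeding discs into drives, before counting the repeat scans for the discs that never open, is a vivid argument for network-based image exchange.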
Newburger and others are also concerned that the Blue Button initiative underway at ONC remains oriented too heavily toward text, and not enough toward image download and exchange.
The opportunity of image sharing outweighs all these challenges. In this column, I've tried to illustrate the potential cost savings and benefits to care. I encourage the HIT Standards Committee Clinical Operations Workgroup to continue its work, and providers and the industry to join it in breaking through some of these barriers, sharing best practices, and making real headway, despite the continuing market forces that promote high costs, difficult sharing, and opportunities lost.
In seven weeks, providers are supposed to be implementing Stage 2 of Meaningful Use. The government's interoperability plans are lacking. And a key Washington player says he's leaving the scene. It's starting to look like a calamity.
As if the turbulence of July 2013 in healthcare IT weren't bad enough, last week things arguably got worse.
First, Farzad Mostashari, director of the Office of the National Coordinator (ONC) for Health IT at the Department of Health and Human Services, announced he is resigning, staying on just long enough for a replacement to be found.
Then, one of ONC's major projects of 2013, a strategy by CMS and ONC to promote interoperability in an industry that desperately needs it, made its underwhelming debut, overshadowed as it was by Mostashari's resignation, which hit during the same 24-hour news cycle.
How underwhelming was the ONC/CMS plan, itself a response to comments on an earlier request for information? Highlights of the initiative related to health information exchange tell the story:
Accelerating Interoperability and Electronic HIE through Payment Models
Require electronic HIE in all advanced payment models and Medicaid waivers
Extend Center for Medicare & Medicaid Innovation (CMMI) efforts
Include Long-term care and post-acute care (LTPAC) and Behavioral Health (BH) in State Innovation Models (SIM) grants
Direct incentives for LTPAC and BH providers
Explore additional reimbursement codes for care coordination via telehealth, e-visits, radiology queries, and Evaluation & Management
Require electronic HIE standards as regulatory requirements for quality measurement and conditions of participation
Extend Regional Extension Center (REC) support
Extend Stark and Anti-kickback exceptions for donations of EHR software
"The RECs are doing fine work, and it's all well and good for CMS and ONC to want to extend support for their work in the hopes of moving health IT interoperability forward and providing necessary training, but that will require Congress to act to extend that funding."
— Russell Branzell, president and CEO of CHIME
You had to be pretty technical, and dig down deep into the 14-page document, to find anything that really impressed. I did find this:
ONC, through the HHS Entrepreneurs Program, is developing targeted, open source toolkits ("Health Information Service Provider [HISP]-in-a-box and Admission, Discharge, and Transfer [ADT]-Alerts-in-a-box") that can be rapidly and cost-effectively deployed by a wide range of health care entities including those that are not eligible for the EHR Incentive Programs (e.g., SNFs, surgery centers, and home health agencies).
Still, it was hard not to feel a sense of dashed expectations from this, the major work product of ONC's year of health information exchange.
Searching for insight or perhaps some encouraging words, I spoke last week with Russell Branzell, president and CEO of CHIME, the College of Healthcare Information Management Executives.
"I'll give kudos that they're addressing these concerns," Branzell told me. "The right people are listening. We went into meaningful use pretty quickly during a period of economic turmoil in the country, probably without the homework we should have done on the front end of what this might look like.
"Now we've got some chances to fix stuff, so I think ONC and CMS are both very strongly interested in getting the right things moving in the right direction."
Continuing the direction analogy, Branzell says "we're probably heading in the correct north direction overall. The difference is we're not heading true north yet, which is fine early in a journey, when you're only 50 miles away from your original starting point, if you're only a few degrees off.
"But if you're a thousand miles down your journey, and you start off a few degrees off, you're a long way from your target line. We're still early enough in the journey that we've got to make some of these corrections, and really a lot of this does have to do with the way the law was written, [and] the way the interpretation of standards and certification requirements come out."
Branzell astounded me at one point, noting that some providers are on not just their second, but their third electronic health record implementation.
It's actually not that surprising. In the initial rush for the ONC incentive money, too many organizations picked more than one EHR, one for inpatient records and one for outpatient records. That was bound to be trouble.
Not only that, but large organizations have been gobbling up practices and smaller organizations all this time. Numerous acquired facilities have been ripping and replacing their first choice of EHR to match the new parent company's preference.
Moreover, Branzell said, "a lot of people jumped in because they knew that the sand in the hourglass was running out, so they [were] slamming systems in, [and] probably didn't do the right transformation or process change, and now are going, 'okay, we got those first payments, but now we realize we should have done this a little bit differently.'"
Noting that some large organizations are acquiring multiple smaller organizations, Branzell characterizes the work underway as "not multiplicative work, but rather exponential increases in work, because each one of those small systems are pretty tough to convert. I've talked to CIOs all over the country, and that's probably one of their major pain points right now… the ambulatory conversion factor that they're having to address."
While we try to determine just how much money might have been wasted by these multiple EHR implementations, Branzell said most organizations have not yet begun their work on Meaningful Use Stage 2. How could they? Only a handful of certified EHRs for Stage 2 are yet available. In just seven short weeks, providers are supposed to be implementing. It's starting to look like a calamity.
But perhaps we should keep our eyes on the horizon, no matter how rough the trip. One encouraging sign is that CMS is starting to consider making health information exchange part of the criteria for Medicare payment.
"We cannot have it be profitable to hoard patient information and unprofitable to share it," Mostashari said last week. I can't imagine market forces doing it by themselves.
CMS must take more action, and Branzell thinks it's inevitable. States cannot solve this problem by themselves. "My previous organization, a third of our patients came from two other states, because we were on a border," he says. In such a scenario, connecting to the HIE in one state does not solve the state-to-state interoperability problem.
"What we really need is a framework," Branzell says. "I don't think it will adapt as fast as the needs of the industry, whether the [ONC] Standards and Interoperability Framework will really give us those things that we need."
The current approach, built around flexibility and modularity, has let the healthcare IT industry "kind of adapt to itself," Branzell says. "The general approach, I think, is flawed. I'm not personally a big government person, but I think there's a really strong place here for the government to much tighter rein in the guardrails of where the variation can occur, to give us a smooth path to where we need to get to."
There you have it – the dog days of summer 2013, a good time for healthcare CIOs to take a break if they can, if they believe that things will turn around when they return in September. Meanwhile, I do wish ONC and HHS leadership all the luck in the world. Losing Mostashari now is the last thing they need.