
Should the Feds Fix Interoperability?

By Scott Mace  |  July 21, 2015

Population health leaders are divided over government actions to create interoperability—but agree that lack of data standardization hampers good analytics.

If the angst over meaningful use has taught us anything, it's that federal policy and federal money are blunt instruments when it comes to directing the future of healthcare IT. That's why it's perilous for Congress to act as it is now to tackle health IT's current big bugaboo, data interoperability. Leaders of the efforts to drive population health initiatives are divided on the notion.


Michael D. Robertson, MD

When I brought up this topic at the recent HealthLeaders Media Population Health Exchange, there was no shortage of opinions. The very stimulus money driving health IT's winds of change is also magnifying the flaws in meaningful use that allow new information silos to arise and lead to accusations of information blocking.

Those accusations permeate the Office of the National Coordinator's April report to Congress on information blocking—a report that clearly led to language in the 21st Century Cures bill, which passed the U.S. House of Representatives this month. That bill would penalize providers who hoard information and give ONC new powers to investigate and take both punitive and corrective action against the hoarders.

But one healthcare executive's perceived hoarding may at the very same time be another executive's prudent effort to protect patient data and their own customer list from being poached by larger rivals. A third executive may simply be frustrated with the differing technology approaches and the Tower of Babel that healthcare IT solutions sometimes resemble.

But listening to the assembled executives at the Population Health Exchange, I didn't hear a lot about the hoarding that has Congress and the ONC so up in arms. Instead, I heard plenty of grief about the lack of data standardization between the range of EHRs, analytical tools, and other health IT systems these executives have been purchasing.

"You've got to figure out a way to build a bridge to map those [systems] out and bring that into your data warehouse," says Michael D. Robertson, MD, chief medical officer of Covenant Health Partners in Lubbock, TX. "We've got tons of data, but [we] can't do anything with it. It's just like a room full of paper."

Robertson, like several others at the Exchange, wishes that someone with authority would intervene to enforce interoperability between different vendors' products. "There's too much money in it" for the vendors themselves to solve the problem without some prodding, he adds.

Even data originating from payers, which is essential to population health initiatives, still shows an unhelpful variability, says Richard Vaughn, MD, chief medical information officer of SSM Health Care in St. Louis, where he also serves as system vice president at SSM's Center for Clinical Excellence.


Richard Vaughn, MD

"In order for aggregated claims data to be useful, the information in a claims file must be extracted and properly loaded into our vendor's analytics data warehouse," Vaughn says. "Unfortunately, due to variation on the payer side, each claims source has to go through a mapping process to make sure the data lands in the right place in the warehouse, followed by testing and validation to prove the data is reliable.

"It is only after mapping, loading, and validation are complete that we can then use the risk models from the vendor to develop insights and actionable reports. The mapping, loading, and validation can take months, is labor-intensive, and generates significant cost to the customer."
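The per-payer mapping Vaughn describes can be sketched in code. This is a minimal illustration, not SSM's or any vendor's actual pipeline: the payer names, column names, and canonical schema below are all hypothetical, chosen only to show why each new claims source needs its own mapping plus a validation pass before loading.

```python
# Hypothetical canonical schema the warehouse expects every claim row to match.
CANONICAL_FIELDS = {"member_id", "service_date", "cpt_code", "paid_amount"}

# Hypothetical per-payer mappings: each payer's column name -> canonical field.
# In practice a new mapping like these must be built, tested, and validated
# for every claims source before its data can land in the warehouse.
PAYER_MAPPINGS = {
    "payer_a": {"MbrID": "member_id", "DOS": "service_date",
                "ProcCd": "cpt_code", "PaidAmt": "paid_amount"},
    "payer_b": {"member": "member_id", "svc_date": "service_date",
                "procedure": "cpt_code", "amount_paid": "paid_amount"},
}

def normalize_claim(payer: str, raw_row: dict) -> dict:
    """Map one raw claim row into the canonical schema, then validate it."""
    mapping = PAYER_MAPPINGS[payer]
    row = {mapping[col]: value for col, value in raw_row.items()
           if col in mapping}
    missing = CANONICAL_FIELDS - row.keys()
    if missing:  # validation step: fail loudly rather than load bad data
        raise ValueError(f"{payer} claim missing fields: {missing}")
    return row

# Two differently shaped source rows normalize to the same schema.
row = normalize_claim("payer_a", {"MbrID": "12345", "DOS": "2015-06-01",
                                  "ProcCd": "99213", "PaidAmt": "84.00"})
```

The point of the sketch is the cost structure Vaughn laments: the `normalize_claim` logic is trivial, but every entry in `PAYER_MAPPINGS` represents months of mapping, loading, and validation work that a shared standard would make unnecessary.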

Vaughn points hopefully to newer analytic tools that combine both claims data and clinical (EHR) data, which allows for a more robust risk model than the traditional approach using only claims data. "Again, due to EHR vendor variation, the analytics vendor must map, load, and validate each EHR source before the full power of the analytic tool can be applied, which becomes another source of cost, delay, and frustration," he says. "Imagine the efficiency we could obtain if these information sources, or at least the critical data that they contain, were all using the same approach to data structure and reliably used the same standard in the same way for every transaction."

Vaughn believes government has a role to play in sorting all this out.

"As for the proper mix of public/private involvement—as insurers continue to grow in size, healthcare organizations may struggle to impose a standard on them," he says. "This is where government may serve an important role by driving carefully designed health information standards that will improve information flow by decreasing the costly and inefficient processes currently needed to transform and merge the data."

Another Population Health Exchange participant was Scott Joslyn, CIO and senior vice president of MemorialCare Health System in Orange County, CA. "With regard to interoperability, the government has set expectations and the pace of progress through meaningful use," he says. "Still, the results have been less than satisfactory in some cases. It can be awkward. Also, the data exchanged are only a subset of the record, and it typically lacks the semantic meaning to associate like data. Still, it's a major step forward."

Back in 2008, when MemorialCare pioneered the use of Epic's Care Everywhere interoperability platform, "the ability to connect to the Epic system of another organization without intervening technology was a true advance," Joslyn says. "In their native Epic workflow, clinicians are able to examine, select, and incorporate patient data from another Epic system into the patient's record. This has proven helpful time and again for us and for many others as the Epic community has grown."

Joslyn acknowledges the concerns of Congress and others that data does not flow well among different EHR vendors. "That is true, though it is mostly a matter of being sufficiently cumbersome to impede clinician use and the requirement of having some intervening technology in place, i.e., a community HIE."

But more regulation is not the way to go, he says. "It is best that the industry address these problems, rather than through some regulatory means. The government should do more to encourage exchange through consistent, clear policy on opt-in/opt-out, for example. And, we need the data to be more complete and compatible among systems."

HL7's evolving FHIR standard shows great promise to increase the richness and compatibility of exchanged data, Joslyn adds. "While very early in the Gartner hype cycle, [FHIR] might just deliver all that is required if strongly embraced by all concerned."
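What makes FHIR promising for the compatibility problem Joslyn describes is that it defines resources with a standard structure that any conforming system can parse the same way. As a rough illustration, here is a Patient resource in the JSON shape used by the DSTU2-era specification (where `name.family` is a list); the example values are taken from the style of the public FHIR examples, not from any real record.

```python
import json

# A Patient resource as a FHIR (DSTU2-style) JSON document. Because the
# structure is standardized, a receiving system needs no per-sender mapping
# to locate the name or birth date.
patient_json = json.dumps({
    "resourceType": "Patient",
    "id": "example",
    "name": [{"family": ["Chalmers"], "given": ["Peter", "James"]}],
    "birthDate": "1974-12-25",
})

def display_name(resource: dict) -> str:
    """Render the first HumanName on a Patient as 'Given... Family'."""
    name = resource["name"][0]
    return " ".join(name.get("given", []) + name.get("family", []))

patient = json.loads(patient_json)
assert patient["resourceType"] == "Patient"
print(display_name(patient))  # prints "Peter James Chalmers"
```

The contrast with the claims-mapping burden above is the draw: when every sender uses the same resource structure in the same way, the mapping, loading, and validation step shrinks to parsing a known format.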

Scott Mace is the former senior technology editor for HealthLeaders Media. He is now the senior editor, custom content at H3.Group.
