Let’s Get Serious About Data Interoperability

News | By Jack Cox, MD
August 16, 2016

The U.S. Department of Health and Human Services has set a target of attaining data interoperability by 2024, and we need to get serious about it. There are many benefits to having our nation's electronic health information systems seamlessly share data, including improved quality of care, greater health care efficiencies and more convenience for patients as they navigate among various health care networks. Yet this goal of having systems truly "speak the same language" may be difficult to achieve, especially given where we are today.

Significant gaps remain in health care before our data can truly be shared. Why has progress been so frustratingly slow? Let's go back several years, to when health systems first began purchasing and implementing their electronic health record (EHR) systems. In those early days, our big concern was replacing paper records and getting clinical staff to adopt the new electronic systems, not interoperability across health systems. Most of the EHRs we purchased were not designed as open systems that could pull in information from multiple providers. These systems were created to coordinate care within a distinct ecosystem, and at the time, that in itself seemed a good thing.

Additionally, health care providers weren't focused on setting shared standards of measurement. The systems we built were designed to meet our own data needs. As a result, we now have literally hundreds of EHR products with different technology architectures storing all kinds of data. Whether the data is collected in a manner that allows us to compare it across systems is hit or miss. Some systems handle it well; others are notoriously difficult when it comes to sharing.

To be truthful, part of the situation is that there simply hasn't been enough pressure to focus on the issue. Take, for example, the Centers for Medicare and Medicaid Services' (CMS) Hospital Value-Based Purchasing (VBP) program, an effort to link Medicare's payment system to national value-based metrics, many of which measure quality of care in the inpatient hospital setting. The problem, however, is the incredible lag in the data we are given: the data we need to act on is 18 months old. That's not exactly a compelling incentive for health systems to move quickly, and it's another example of data with minimal value. Eighteen-month-old data is not representative of current performance, so although the data is shared, it's not especially helpful. I compare it to attempting to fly an airplane with a 30-minute delay on the altimeter. The data is interesting, but it's just not useful for promoting change.

How do we collectively get better at data sharing? Step one is getting our vendors serious about interoperability. Every television set in the United States uses the same connectors. The same cannot be said of our myriad EHR systems, which represent a far bigger investment than most television sets. It should not require multiple interfaces or great expense simply to get systems to connect. That's especially true of the large vendors that tout their many features yet treat interoperability as an especially low priority.

We also all have to agree on which data are most important to collect and share. Once we agree on the metrics that matter to all of us, it will be far simpler to design processes for interoperability. Timeliness is crucial as well: we have to commit to sharing the data faster. Real-time data sharing isn't a pipe dream; we do it all the time within our own health systems. We can do it together as well.

I would also suggest re-examining and clarifying privacy protections. While CMS states that the Health Insurance Portability and Accountability Act (HIPAA) does not deter clinicians from sharing data, many providers remain concerned about the risk. HIPAA came about in the paper-based era of 1996, when its enactors could not have anticipated the complexity of today's technology, the necessity of aggregating very large amounts of data, or the concerning and costly consequences when a breach occurs.

Jack Cox, MD, is Senior Vice President and Chief Medical Officer, St. Joseph Health Division, at Providence St. Joseph Health.
