
Pandemic is a Huge 'Forcing Function' on Interoperability

Analysis  |  By Scott Mace  
   December 28, 2020

Data now streaming in around vaccines, their administration, and efficacy will require the industry to move past previous practices, consultant says.

New federal rules on interoperability between healthcare information systems are poised to make a substantial impact in 2021. On October 30, the U.S. Department of Health and Human Services published the final 2020-2025 Federal Health IT Strategic Plan. Patient-centered aspects of the plan describe the goal of having patient care information follow patients from any provider to any other provider, and to be accessible from their smartphones.

HealthLeaders spoke with Seth Hirsch, chief operating officer of SES Corp., a consulting firm to commercial and government organizations, about interoperability and other major trends impacting healthcare and healthcare IT in the coming year.

HealthLeaders: Healthcare data interoperability has been promised for several years. Why do you think 2021 will be the inflection point for widespread adoption of enabling technologies?

Seth Hirsch: The pandemic has served as a huge forcing function—essentially the old saying about necessity being the mother of invention, writ large by the scope and urgency of the problem. This coming year is going to be even more pivotal for data interoperability because—after a year of getting to know the disease and the health metrics that are most relevant—we’ve got more health data and context to work with. On top of that, we’ve now got data streams flooding in around newly approved vaccines, their administration, and efficacy.

HL: Disease surveillance, including clinical trials, has trailed the rest of healthcare by not being automated, repeatable, and scalable. How does the industry move past previous practices to more state-of-the-art tools and results?

Hirsch: Complexity has been a stumbling block, since we’re dealing with huge volumes of COVID-19 data of a diverse nature from diverse sources, often with strict privacy limitations. That can make it hard to know where to begin. Part of the answer is to look at the nature of clinical trials and see what parts lend themselves most readily to innovation, and then focus on that as a starting point. For instance, some longitudinal studies last for years, during which time you have strict limits about changing participants or drawing conclusions too soon. So you can see how it may be hard to automate and scale your management of data in that case.

But repeatability is something you can definitely innovate on right away, in that the principles of reproducible research in data science dovetail fairly well with the scientific method of clinical trials. If you standardize management of clinical trial data, including how you categorize, cleanse, and analyze the data, then even the most advanced algorithms or AI analysis will have the data lineage to ensure your methods are reproducible. And that’s a foundation for being able to scale over time.
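As a hypothetical sketch of the standardized, lineage-aware data handling Hirsch describes (the column names, cleansing rules, and log format below are illustrative only, not drawn from any actual trial), a pipeline step can record exactly what it did to the data so the analysis can be reproduced later:

```python
import hashlib
import json
from datetime import datetime, timezone

import pandas as pd

def cleanse_with_lineage(df: pd.DataFrame, drop_missing: list, lineage_log: list) -> pd.DataFrame:
    """Apply a standardized cleansing step and append a lineage record describing it."""
    before = hashlib.sha256(pd.util.hash_pandas_object(df, index=True).values.tobytes()).hexdigest()
    cleaned = df.dropna(subset=drop_missing).drop_duplicates()
    after = hashlib.sha256(pd.util.hash_pandas_object(cleaned, index=True).values.tobytes()).hexdigest()
    lineage_log.append({
        "step": "cleanse",
        "params": {"drop_missing": drop_missing},
        "input_sha256": before,
        "output_sha256": after,
        "rows_in": len(df),
        "rows_out": len(cleaned),
        "timestamp": datetime.now(timezone.utc).isoformat(),
    })
    return cleaned

# Illustrative trial data; the fields are hypothetical
trial = pd.DataFrame({"participant_id": [1, 2, 2, 3], "outcome": [0.8, None, 0.5, 0.9]})
lineage = []
clean = cleanse_with_lineage(trial, drop_missing=["outcome"], lineage_log=lineage)
print(json.dumps(lineage, indent=2))
```

The point is not the specific cleansing rules but the habit: every transformation leaves behind the parameters it used and hashes of its inputs and outputs, which is the lineage that lets a downstream analysis be rerun and verified.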

HL: Specifically, how can analysis of COVID-19 data be supercharged to answer an ever-changing short list of questions about the disease and its treatments?

Hirsch: We need to enhance adoption of common health IT data standards—such as the ANSI-accredited Fast Healthcare Interoperability Resources (FHIR) framework—to break down silos between health-related data sources like mobile phone apps, cloud communications, EHR-based data sharing, and data stored on institutional servers. There’s also a cultural component in that we need to break down the silos between technologists and the business users, in this case medical professionals, in managing data around the disease and its treatments. The more each side understands the other’s world, the more seamless, secure, and proactive the analysis will be.
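As a rough illustration of the standards-based exchange Hirsch is pointing to, the sketch below uses Python's requests library to search for Patient resources over the standard FHIR REST API against the public HAPI FHIR R4 test sandbox; the endpoint, search parameters, and family name are purely illustrative, not tied to any production system.

```python
import requests

# Public HAPI FHIR R4 test sandbox (illustrative endpoint, not a production system)
FHIR_BASE = "https://hapi.fhir.org/baseR4"

def search_patients(family_name: str) -> list:
    """Search for Patient resources by family name via the standard FHIR REST API."""
    resp = requests.get(
        f"{FHIR_BASE}/Patient",
        params={"family": family_name, "_count": 5},
        headers={"Accept": "application/fhir+json"},
        timeout=30,
    )
    resp.raise_for_status()
    bundle = resp.json()  # FHIR "searchset" Bundle
    return [entry["resource"] for entry in bundle.get("entry", [])]

if __name__ == "__main__":
    for patient in search_patients("Smith"):  # example family name
        name = patient.get("name", [{}])[0].get("family", "")
        print(patient["id"], name)
```

Because FHIR servers expose the same resource shapes and search parameters, the same call works whether the data lives behind an EHR, a cloud service, or a mobile app back end, which is the silo-breaking point Hirsch is making.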

HL: To what degree is Big Tech (Apple, Google, Microsoft) able to answer some of these questions to an extent unachievable by other health IT? If so, where does health IT fit into their emerging big data initiatives in healthcare?

Hirsch: I'm not sure how fully I can answer this question. But one observation is that COVID-19 is a global challenge that involves many public systems and datasets in the U.S. and internationally. That means even the most advanced and well-funded innovations by big tech need to ultimately deal with the regulatory realities, such as HIPAA restrictions on health data, or ATOs for anything that interacts with government data or systems. And by the same token, big tech companies should look to balance their proprietary interests with the need to collaborate on the bigger mission where it makes sense—not unlike how we’re seeing FedEx and UPS put their competitive instincts aside temporarily when it comes to coordinating the massive task of distributing COVID-19 vaccines.

“Big tech companies should look to balance their proprietary interests with the need to collaborate on the bigger mission where it makes sense.”

Scott Mace is a contributing writer for HealthLeaders.
