The federal government, which is the biggest source of big data, is looking for ways to help the industry use data to improve healthcare. Agencies and offices ranging from the White House to the National Institutes of Health, the National Science Foundation, the Department of Health and Human Services, and the Office of the National Coordinator for Health IT are partnering with researchers and private IT companies to develop tools to harness big data sets.
NSF has funded several projects focusing on cloud computing to help researchers store, index, search, visualize, and analyze data, "allowing them to discover new patterns and connections," Tom Kalil, Deputy Director for Policy at the Office of Science and Technology Policy, wrote in a recent White House blog post on big data. HHS has spearheaded many projects, including efforts to ease data-sharing among rural healthcare providers.
And ONC, in its recently released Federal Health IT Strategic Plan, reaffirmed its pledge to create what it calls a "learning health system" that uses information to continuously improve health and healthcare; the plan also places renewed emphasis on patient access to data.
The healthcare industry is still struggling to get its arms around big data, and analyzing large data sets is not easy. But health leaders can learn from organizations that are already doing it well and implement their best practices, the McKinsey report's authors write.
The report points to a few healthcare organizations that are doing a good job with big data, citing in particular the Department of Veterans Affairs' health information technology and remote patient monitoring programs. "The VA health system generally outperforms the private sector in following recommended processes for patient care, adhering to clinical guidelines, and achieving greater rates of evidence-based drug therapy," McKinsey says. These achievements are largely possible because of the VA's performance-based accountability framework and disease-management practices enabled by electronic medical records and health IT.