Innovators are blending technology with new care models while targeting high-risk patients in a patient-centered strategy.
This article appears in the November issue of HealthLeaders magazine.
Without robust analytics technology, the goals of accountable care and population health cannot fully be achieved, good intentions notwithstanding. ACOs must correlate clinical data and claims data and use analytics technology to produce the actions needed to manage the health of a population. The data is there, but the healthcare industry does not have an evenly distributed knowledge of how to use it effectively.
With potential savings of up to $300 billion a year, according to the consulting firm McKinsey & Company, the upside of industrywide analytics to manage a population is considerable.
And, increasingly, providers have the raw data they need to feed an analytics system. But it is not as simple or quick as installing electronic health record technology—no small feat in itself for many organizations—and must be accompanied by solid governance and education, according to leading providers.
These providers are using analytics to bring a more intense focus on gaps in care, to discover cost outliers, and to put a magnifying glass on efficiency. But the use of such healthcare analytics has yet to reach maturity.
Early in the process
"Our organization is facing what most of the industry is facing, and that is the need to build a bridge to the future through analytics; so unlike some other industries that may be high users of data and very sophisticated, the healthcare industry is just in a different point," says Aric Sharp, vice president of the accountable care organization at UnityPoint Health, a West Des Moines, Iowa–based integrated health system with 3,026 licensed beds across 15 hospitals and total operating revenue of $2.7 billion.
"We're still in the process, as an industry, of going through implementing electronic health records and achieving meaningful use and those types of things. At the same time, with a lot of the new efforts around accountable care organizations, for one of the first times many providers have an opportunity to collect claims data by working with payers," Sharp says. "We felt it necessary to build a platform where we can mesh together both claims data and data out of our electronic health records, because there's a lot more that's able to be learned in that type of an environment. The type of intelligence that we can glean is at a much more informed level than if we're just dealing with one of those data sets in isolation."
UnityPoint Health typifies numerous providers, having initiated analytics for its population health initiative only a couple of years ago. "The primary lesson is, this is really difficult, and there's a lot to learn along the way," Sharp says. "And yet, we can certainly see that as we continue to enhance the work, there's more and more benefit with every step. The big learning is that there's just a lot to be learned, and it's exciting, because with every step of the process, we are better able to identify opportunities to improve care, and we're able to become more efficient at this type of work."
At the heart of population health analytics is the concept of risk stratification: understanding, through various inputs such as claims data, surveys, and EHRs, which members of a given healthcare organization's customer base represent a level of risk for which intervention offers the greatest possibility of preventing future hospital admissions, reducing readmissions, improving overall health, and lowering costs.
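To make the idea concrete, here is a minimal sketch, with invented weights and cutoffs, of how signals from claims, EHR, and survey data might be combined into a score and used to sort a population into risk pools. It is illustrative only, not any vendor's actual model.

```python
# Illustrative sketch only -- not any vendor's actual algorithm. It shows the
# general shape of risk stratification: combine simple signals from claims,
# EHR, and survey data into a score, then bucket patients into risk pools.
from dataclasses import dataclass

@dataclass
class Patient:
    patient_id: str
    chronic_conditions: int      # count of chronic diagnoses from the EHR
    admissions_last_year: int    # from claims data
    ed_visits_last_year: int     # from claims data
    self_reported_health: int    # survey score, 1 (poor) to 5 (excellent)

def risk_score(p: Patient) -> float:
    """Weighted sum of risk signals; the weights here are made up for illustration."""
    return (2.0 * p.chronic_conditions
            + 3.0 * p.admissions_last_year
            + 1.5 * p.ed_visits_last_year
            + 1.0 * (5 - p.self_reported_health))

def stratify(patients, high_cutoff=10.0, moderate_cutoff=5.0):
    """Assign each patient to a risk pool based on score cutoffs."""
    pools = {"high": [], "moderate": [], "low": []}
    for p in patients:
        score = risk_score(p)
        if score >= high_cutoff:
            pools["high"].append(p.patient_id)
        elif score >= moderate_cutoff:
            pools["moderate"].append(p.patient_id)
        else:
            pools["low"].append(p.patient_id)
    return pools

if __name__ == "__main__":
    cohort = [
        Patient("A123", chronic_conditions=4, admissions_last_year=2,
                ed_visits_last_year=3, self_reported_health=2),
        Patient("B456", chronic_conditions=1, admissions_last_year=0,
                ed_visits_last_year=1, self_reported_health=4),
    ]
    print(stratify(cohort))  # {'high': ['A123'], 'moderate': [], 'low': ['B456']}
```

The patients flagged "high" are the ones care managers would reach out to first; everything downstream in these programs depends on getting that short list right.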
UnityPoint Health selected analytics technology from Explorys, a data spinoff of Cleveland Clinic founded in 2009.
"Explorys is able to pull data from a variety of sources—multiple electronic health records, our own billing systems, claims data from CMS or other payers—and assimilate that all together," Sharp says. "Explorys is really what sits on top of that and gives us an ability to slice and dice and analyze it and probe it and report quality metrics, identify gaps in care, and in the future even use that to do outreach to patients and do registry-type functions."
UnityPoint Health still counts the time until the big payoff in years. "We're not yet ready to say that it has an impact on our global per-member per-month spent," says Vice President of Operations Kathleen Cunningham. "It will, but we are so early in our innovation that some of our results are really based on the pilot type of innovation programs that we're working on."
Starting with employee populations
In many healthcare systems, population health analytics success stories are just beginning to emerge, but some providers have used their own employee populations as a proof of concept for the effectiveness of the effort.
For the past 11 years, employees of Adventist HealthCare—a nonprofit network based in Gaithersburg, Md., with three acute care and three specialty hospitals, 6,263 employees, and 2012 revenue of $726 million—have been managed for risk by the self-insured provider.
"It got started with the idea that a healthier population is going to be a more effective employee population, and it's going to also be a lower-cost population," says Bill Robertson, president and CEO of Adventist HealthCare.
A decade ago, Adventist started working with InforMed Healthcare Solutions, since acquired by Conifer Health Solutions, to use InforMed's set of data warehouse tools to improve its health plan design and determine where interventions were needed, Robertson says. Adventist and InforMed worked collaboratively to develop those tools and restructure the Adventist workflow to ramp up the effectiveness of the population health program.
As a result of population analytics, as well as other measures such as discouraging tobacco use and encouraging use of generic drugs, the inflation rate of Adventist's employee health plan cost over the past nine years was half the national average, Robertson says.
A key development in the population health initiative came in 2005, when Adventist created personal health nurses as part of a pilot patient-centered medical home to work with the approximately 360 high-risk members of Adventist's 6,600 employee-based covered lives identified by the InforMed data tools, Robertson says.
In a pilot involving 50 high-risk patients, Adventist was able to move 27 (54%) out of the high-risk pool into moderate- or low-risk pools, and it achieved a 35% reduction in the cost of care for that population, he says.
According to Adventist, the pilot reduced health plan costs by $381,000 among the 27 patients who moved out of the high-risk pool. The amount expended to achieve that reduction was only $31,000, so every dollar spent returned approximately $12 in savings.
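The back-of-the-envelope math behind that figure is simple:

```python
# Back-of-the-envelope check of the figures quoted above.
savings = 381_000       # health plan cost reduction across the 27 patients
program_cost = 31_000   # amount spent on the intervention
print(round(savings / program_cost, 1))  # ~12.3 -> roughly $12 saved per $1 spent
```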
"It was actually so dramatic that it brought the inflation rate on our health plan to zero in that year," Robertson says. "We were pretty pleased with that." Overall, Adventist has saved "tens of millions of dollars" due to employee population health analytics to reshape the program and services for employees, he says.
Adventist then expanded this pilot PCMH to 5% of its employees (roughly 360 people), and continues to see the same kind of positive outcomes, Robertson says. Nurses make up the majority of InforMed users.
Three years ago, Adventist created ACES, which stands for Ambulatory Care EHR Support, an initiative to move its ambulatory physicians to use electronic medical records to expand its capacity to do population-focused care. By the end of 2013, more than 400 physicians will be using the ACES system. "So much of the job is how you integrate care across physicians and across the delivery system," Robertson says. "When you have one person who's seeing 15 physicians, but each physician thinks they're the only one, you end up with different challenges than when you can see everything."
All physicians who are participating providers in the Adventist HealthCare employee health benefit plan have access to the InforMed tools and analytics. Only a limited number directly access the information because the personal health nurses provide most of the ongoing care management, with the physicians serving more as the team captains, Robertson says.
The next step for Adventist IT is to tie analytics with the employee EHR. "What we're morphing toward is linking all of this together with HIE infrastructure so that the information that is in the InforMed platform will be available in your EHR platform and vice versa through the information exchange," Robertson says.
Adventist also created financial incentives that help its physicians spend "all the time it takes" to manage high-risk patients, Robertson says. "With an ACO, you don't really get paid an incentive until you've been successful—at least after the first year you've demonstrated that things are working and that they're [generating] shared savings," he says. "So we're still in the process of sorting out how we'll make sure this infrastructure is utilized actively."
Detailing the financial incentives, Robertson says the primary care physicians who participate in the patient-centered medical homes receive additional compensation, such as a monthly retainer or hourly incentive to compensate them for the additional time that is necessary to care for the high-risk patients in the PCMH.
Recent headlines have highlighted some fallout from the Pioneer ACO program. Nine of the program's 32 charter members dropped out after finding inadequate return on investment or improvement from their ACO initiatives. To Robertson, this just highlights the importance of population health analytics in achieving ACO success. Had Adventist focused on no-risk or low-risk populations, it might not have achieved nearly the cost savings it did in its own proof of concept, which targeted the high-risk pool of its self-insured employee-based covered lives, he says.
Now Adventist is forming an ACO for Medicare populations based on this same set of tools to track high-risk members of those populations. As time goes on, commercial-payer populations are also in Adventist's sights. "We have a couple of pilots, like an apartment building that has a very large population of higher-risk individuals that we're providing those types of services to, and it's interesting to see when you focus on it what you achieve in terms of reduced consumption of healthcare services and increased health status," Robertson says.
Leading the way to better patient care
At Virtua Health, population health analytics from Alere Analytics is being implemented to determine the highest-risk patients from a cohort of 12,000 attributed Medicare lives, says James Gamble, MD, chief medical information officer of the four-hospital, 885-staffed-bed integrated delivery network headquartered in Marlton, N.J.
Virtua became an ACO on January 1 and is preparing to add another 14,000 covered lives with a commercial insurer, says Alfred Campanella, Virtua's executive vice president of strategic business growth and analytics.
"There are lots of different scenarios where action is needed to prevent an admission or to prevent a condition from getting worse," Campanella says. Virtua is working with Alere to publish its alert lists via a Microsoft Dynamics customer relationship management platform. "That allows care nurses to take advantage of our Microsoft products like email and word processing," he adds.
Virtua uses RNs to provide close case management of the high-risk population. Meanwhile, 80 Virtua-employed primary care doctors are kept updated via the workflow into the system's electronic health record software. "That way that doctor doesn't have to leave their EMR or jump around to see where things are going," Campanella says.
"Our initial focus," Gamble explains, "will be on these high-risk patients, so as we see it, these case managers' day-to-day job will be: They'll have a patient load, they will have care plans, they will have activities assigned to them for these patients."
But the physician does not need to be the primary manager.
"As long as patients are following care plans, which are developed and approved by the providers, then the nurses will be managing them," Gamble says. "Their communication will be more as updates. When an alert arises that the patient is at risk or in trouble, then obviously the nurse would directly communicate with the physician to try to intervene at any early stage before the patient's health deteriorates or the patient ends up in the emergency room of the hospital."
"What we're seeing now is a more intense focus to try to fix those gaps in care and to identify patients who are at high risk for hospitalization or readmission or who need special attention," Campanella says. "Technology gives you a greater magnifying glass in many respects for seeing the barriers to care and for creating efficiencies in care delivery. While all the analysis is not complete, early results for clinical and financial savings are promising."
Support from top leadership has been crucial to Virtua's transformational pivot toward analytics. "This whole idea of care coordination was approved at the board of trustees level," Campanella says. "We've had tremendous support from our CEO, Richard Miller. One of our senior vice presidents, Stephen Kolesk, MD, doubles as the president of this subsidiary that is the ACO. He has a title of senior vice president for clinical integration, so it's very tightly integrated with the physicians."
Technical design of the Virtua analytics solution is close to completion. Parts of it will deploy before the end of 2013, and other parts will roll out in the first quarter of 2014, says Campanella. Also part of the project are an existing health information exchange and a new patient portal built on top of the HIE, he adds.
"Innovation does require some experimentation and risk," Campanella says. "The ones who are leaders are taking on some risk and putting some investment in without fully understanding the full picture, but that's what makes them leaders.
"It's now the right way to care for patients, to have this high touch, high visibility into all the different domains of their care and the handoffs between those domains, and so even if the ACO concept from a regulatory standpoint goes away, it's still the right way to care for patients,"
Campanella says.
Outside the hospital walls
Organizations beyond the hospital, including postacute and senior living providers, are also engaging with analytics in ways that have broad implications for how it will be deployed in healthcare across the United States.
Brentwood, Tenn.–based Brookdale Senior Living owns and operates about 650 senior living communities in 36 states. In 2012, Brookdale, through a partnership with the University of North Texas Health Science Center and Florida Atlantic University, received $2.8 million of a $7.3 million Centers for Medicare & Medicaid Services Health Innovations Challenge grant for population health management. The program expects to save more than $9 million over a three-year period.
Initially, Brookdale is focusing on population health at 27 communities in Texas and Florida, but by the end of the three-year grant, it will involve 67 communities, says Kevin O'Neil, MD, chief medical officer of the organization.
The CMS grant sets a goal for Brookdale of reducing avoidable hospital readmissions by 11%, O'Neil says. "We know we're going to be focusing on certain quality metrics in addition to readmissions," he says. "We'll focus on dehydration rates, as well as new incidents of pressure ulcers, some of the major problem areas in geriatric care, and then, based on the data that we receive from the analytics tool, it'll help guide our quality improvement teams in terms of the type of improvement efforts that need to be initiated."
A variety of tools exist to help stratify risk. Some tools place members of a population on a scatter plot to make the identification of outliers easier. Other tools organize a population into patient registries to track various diseases and treatments. Still other tools use input gathered from patient surveys. A recent study, however, reported that many of those tools had not performed very well.
At St. David's Health System in Austin, which is working with Brookdale on the challenge grant, 60% of readmissions recently were measured as coming from low-risk groups. "To me [this] means either that people hadn't been stratified properly, or that they were being sent home when they probably did need some kind of service or follow-up," O'Neil says.
The biggest hurdle in O'Neil's experience with population health analytics has been engaging with the hospital C-suite to craft the business associate agreements necessary to manage populations. "Once we've developed a relationship with one entity and had success, it's much easier to engage other entities within that system."
In dealing with the two universities, O'Neil says, "We had to resolve some issues related to intellectual property to incorporate INTERACT into electronic information systems," he says. INTERACT is an acronym for Interventions to Reduce Acute Care Transfers, a free quality improvement program for which FAU holds the trademark and copyright. "This has been resolved through a licensing agreement—Loopback [a Dallas-based analytics platform vendor] also has a licensing agreement with FAU to bake INTERACT tools into software programs."
Both Brookdale and its hospital partners are using a common population health analysis dashboard and software provided by Loopback Analytics. "As a geriatrician, this is the most exciting time in my career, because I've always felt that fee-for-service medicine was the bane of good geriatric care because it rewarded volume rather than quality," O'Neil says. "Having that near-real-time data is really going to be extremely helpful to us."
Analytics and meaningful use
Analytics tools produce the patient registries that identify gaps in care, not just to meet ACO objectives, but also to meet the requirements of meaningful use stage 2, which takes effect in 2014, says Gregory Spencer, MD, a practicing general internist and chief medical officer at Crystal Run Healthcare, a multispecialty practice with more than 300 physicians based in Middletown, N.Y.
"There are frequently registry functions within EHRs, but the EHR is set up at the patient level," Spencer says. "It's not optimized for reporting groups of patients, so to kind of get that rollup, you have to have another layer on top of that to gather it up."
Thus, some sort of aggregator function is needed. "Usually that is not something that many EMRs do well," Spencer says. "Registries are mostly condition- or disease-specific lists of patients who satisfy a certain criteria: diabetics, patients with vascular disease, kids with asthma. Care gaps look at all patients who have not had a certain recommended service. There is overlap with the registries, since a list of patients due for their colonoscopy is a kind of registry that needs to be 'worked' to get those patients compliant."
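Spencer's distinction maps onto two simple queries over patient-level data. Here is a toy sketch, with made-up fields and thresholds, of the kind of rollup layer he describes: a condition registry and a care-gap list that then needs to be "worked."

```python
# Illustrative only: a toy "rollup" layer over patient-level records, showing
# a condition registry and a care-gap list of the kind described above.
from datetime import date

patients = [
    {"id": "P1", "conditions": {"diabetes"}, "last_colonoscopy": date(2001, 6, 1), "age": 62},
    {"id": "P2", "conditions": {"asthma"}, "last_colonoscopy": None, "age": 55},
    {"id": "P3", "conditions": {"diabetes", "vascular disease"}, "last_colonoscopy": date(2012, 3, 15), "age": 68},
]

def registry(patients, condition):
    """Registry: all patients carrying a given condition."""
    return [p["id"] for p in patients if condition in p["conditions"]]

def colonoscopy_care_gap(patients, as_of, interval_years=10, min_age=50):
    """Care gap: patients age 50+ with no colonoscopy in the past 10 years."""
    gap = []
    for p in patients:
        if p["age"] < min_age:
            continue
        last = p["last_colonoscopy"]
        if last is None or (as_of - last).days > interval_years * 365:
            gap.append(p["id"])
    return gap

print(registry(patients, "diabetes"))                     # ['P1', 'P3']
print(colonoscopy_care_gap(patients, date(2013, 11, 1)))  # ['P1', 'P2']
```

The EHR already holds every field used here; the missing piece Spencer describes is the layer that runs these population-level questions across all charts at once.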
Like numerous other healthcare organizations, Crystal Run's first foray into population health analytics employed Microsoft Excel spreadsheets.
"The basics can be done with available tools," Spencer says. "People shouldn't wait for the killer app that's out there that's fancy and has a slick user interface. You can really do a lot with what you have, probably immediately."
Since 1999, however, Crystal Run has incrementally left Excel behind and built population health analytics reporting tools on top of its NextGen electronic health record software, Spencer says. Crystal Run also adopted the Crimson Population Risk Management service from the Advisory Board Company, which incorporates technology from Milliman Inc. on the back end, he says.
Like other providers, Crystal Run saw the shift coming from fee-for-service to accountable care and took early opportunities to get its hands on claims data and learn how to work with it, Spencer says.
Other resources offering insight into accountable care analytics were the Group Practice Improvement Network and the American Medical Group Association, where Spencer has been able to network with peers who have been pursuing population health analytics longer than Crystal Run has.
The Crystal Run practice, formed in 1996, grew out of a single-specialty oncology practice and today has 1,700 employees. It is designated by the NCQA as a level 3 patient-centered medical home, and in 2012, Crystal Run became one of the first 27 Medicare Shared Savings ACOs.
Analytics have revealed "a lot of surprises at who you think has been getting most of their care from you," Spencer says. Snowbirds—typically patients from the Northeast, Midwest, or Pacific Northwest who spend substantial time in warmer states during the winter—are receiving significant amounts of care that had been outside of Crystal Run's knowledge.
But with Medicare claims data examined through its analytics services, Crystal Run has had its eyes opened to previously unobserved cost centers. For instance, the No. 1 biller of pathology services for a 10,000-patient Crystal Run cohort was discovered to be a local dermatologist.
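Surfacing that kind of cost outlier is essentially a group-and-sum over claims. A hypothetical sketch, with invented records and field names:

```python
# Illustrative only: spotting cost outliers by totaling claim charges per
# billing provider, the kind of rollup that surfaced the pathology example above.
from collections import defaultdict

claims = [
    {"provider": "Dermatology Assoc.", "service": "pathology", "allowed": 180.0},
    {"provider": "Dermatology Assoc.", "service": "pathology", "allowed": 210.0},
    {"provider": "Hospital Lab",       "service": "pathology", "allowed": 95.0},
]

totals = defaultdict(float)
for c in claims:
    if c["service"] == "pathology":
        totals[c["provider"]] += c["allowed"]

for provider, total in sorted(totals.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{provider}: ${total:,.2f}")
# Dermatology Assoc.: $390.00
# Hospital Lab: $95.00
```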
"What it's all about is improving quality and eliminating waste," Spencer says. "That waste is [in] tests that aren't really required [and even some] visits that are [being required]. It's your habit and custom to see people back at a certain frequency, but when you really start thinking about it, do you really need to see somebody back every three months who has stable blood pressure and has been rock solid? Well, probably not. And so you start doing things like that, and it adds up incrementally."
Crystal Run is able to incorporate patients' outside visits to providers, Spencer says, "but it's not easy. We require source documentation to satisfy measures. For example, we scan outside mammogram results into a directory that we can then report against. We don't take people's word for dates. We need to have the document."
Getting the initial claims data from CMS took three months, and then it takes another three to six months' worth of that data for it to become actionable, Spencer says.
Claims data on any one patient is also plagued by incurred but not reported claims. Until IBNR claims get processed through Medicare or other payers, a true picture of a patient's treatment is incomplete.
In light of this, it's important for all concerned to have realistic expectations of what population health analytics can achieve and when, Spencer says.
"Cost is a practical concern we all face in our day-to-day lives," he says. "You get more for more money, but as in all things, you have to be prudent. I don't know how you will be able to do business in the very near future without using some form of analytics. How will your quality measures be good enough to meet the 'gates' required for contracts? How will you know where you are or if you can grow and how? It has cost a lot of money—money that's been spent over a long period of time. The cost is into the low millions.
"That said," Spencer adds, "we are able to take advantage of newer payment models that reward us not just for healthcare, but outcomes. We can potentially get paid for not doing anything—the PMPM that can be negotiated when you show you are doing a good job managing a population of patients."
Analytics in the ambulatory practice
Gastroenterologist Tom M. Deas Jr., MD, practices as part of North Texas Specialty Physicians based in Fort Worth, an independent physician association comprising nearly 600 family and specialty doctors. NTSP has its own health plan and has been managing Medicare patients at risk for several years.
NTSP provided initial funding for a population health analytics firm, Sandlot Solutions, which has now been spun out as a separate company, although NTSP remains a part owner and Deas also serves as Sandlot's chief medical officer. NTSP uses Sandlot's analytics software to manage 80,000 at-risk lives, Deas says.
"Without some of the information technology to identify those patients based on their illnesses, comorbid illnesses, their severity of illness, who their physicians are, where they've been going to get their care, and being able to manage the whole spectrum of the care, you're at a serious disadvantage," Deas says.
Sandlot's technology combines claims and clinical data into a robust patient data warehouse that helps meet some of the quality measures required to be an ACO, says Deas. "With the ACO, no matter how much money you save, you don't get a dime of it if you haven't met all the quality measures, so if we fall short in that area, it's economically not good and it's not good for the patients."
By default, all Pioneer ACOs received three years of Medicare claims data. Getting the data into the warehouse requires overcoming some well-known healthcare IT issues, such as reconciling that claims data with an enterprise master-patient index, eliminating duplicates, and general patient-matching issues, Deas notes.
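Patient matching is the unglamorous first step Deas describes. Below is a deliberately naive sketch of linking incoming claims to an enterprise master patient index on normalized name and birth date; real EMPI matching is probabilistic and far more careful, so treat this only as a shape of the problem.

```python
# Deliberately naive illustration of patient matching: link incoming claims to
# an enterprise master patient index (EMPI) on normalized name and birth date,
# and flag records that match no one or more than one person.

def normalize(name: str) -> str:
    return " ".join(name.lower().split())

empi = [
    {"empi_id": "E001", "name": "Maria Gonzalez", "dob": "1948-02-11"},
    {"empi_id": "E002", "name": "maria  gonzalez", "dob": "1948-02-11"},  # duplicate to resolve
    {"empi_id": "E003", "name": "John Smith", "dob": "1975-07-30"},
]

def match_claim(claim_name: str, claim_dob: str):
    hits = [r["empi_id"] for r in empi
            if normalize(r["name"]) == normalize(claim_name) and r["dob"] == claim_dob]
    if len(hits) == 1:
        return ("matched", hits[0])
    if not hits:
        return ("unmatched", None)   # needs manual review or a new EMPI record
    return ("ambiguous", hits)       # likely duplicates in the index itself

print(match_claim("Maria Gonzalez", "1948-02-11"))  # ('ambiguous', ['E001', 'E002'])
print(match_claim("John  Smith", "1975-07-30"))     # ('matched', 'E003')
```

Every "ambiguous" or "unmatched" result is work for somebody, which is why deduplication and patient matching eat so much of the setup time before any analytics can run.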
Once that was done, NTSP could concentrate on using Sandlot's analytics to spot and eliminate wasteful services, such as home visits for patients lacking a medical necessity for such visits, Deas says. Analytics-driven interventions can manage a few hundred overutilizers of services as outpatients, focusing care management on them, he adds.
After a year's effort, NTSP has bent its cost curve through these efforts to the tune of $50 per member per month, Deas says. "Now we're not completely there," he cautions. "It's an incremental process, because you're not only doing management, but you're changing behaviors also. You're trying to get patients aligned with the primary care physician, trying to move them from one source of care that was maybe excessive utilization to another."
Deas says measuring the ROI of analytics technology remains elusive.
"A lot of people think they just buy an analytics tool and a data warehouse and an HIE and it'll sit there and solve their problems," he says. "That is not the case. You have to have human folks using that tool to manage the care of patients, to lower the cost and improve the quality. It's like me asking you how much more efficient are you with a smartphone than you were five years ago with whatever version of phone you had then. You can't answer that question. All you know, it's just one part of what's happened in the past five years to make you more efficient."
It no doubt helps that NTSP's executive director, Karen van Wagner, has a PhD in statistics, giving the organization added expertise to quantify results as they emerge.
Analytics technology is just beginning to make its impact felt in population health management. Careful consideration of products, objectives, workflows, and business conditions will steer providers through potential pitfalls, but the effort is considerable and the challenge to healthcare leadership is ongoing.
"Among the things that made these changes successful is an IT infrastructure that supports population health management and care management," Deas says. "We still have to throw a fair amount of resources—human resources—at it to make it work."
Reprint HLR1113-2
This article appears in the November issue of HealthLeaders magazine.
Cheap, ubiquitous teleconferencing technology can turn any visit to a primary care provider into a patient-centered care team huddle, cutting weeks off the referral run-around and reining in costs. But it only works if the right team of providers, specialists, and the patient are available at an agreed-upon time.
Every one of us carries in our pocket or bag one of the untapped technological saviors of healthcare.
No, it's not Twitter. It's the calendar on your phone.
It's one of those things that generally goes unused, but not because it wouldn't be extremely useful. It's because schedule-sharing for years has had a "last mile" problem, an interoperability chasm.
Because of this gap, patients still receive phone calls or emails from providers to remind them of upcoming appointments. Phone calls may be ignored or dumped into voice mail with a dozen other messages which people are too busy to check. Emails can wind up in spam folders. Appointments entered onto some online patient portal may as well be listed on the far side of the moon.
Is it any wonder that despite the best efforts of providers, the healthcare system is continually burdened by missed appointments that leave exam rooms and equipment tied up with no-shows?
This is a problem that must be addressed soon and in a big way. Care may be decentralized, but it must be coordinated.
The waste and inefficiencies of missed appointments is bad enough in private practices. But now we are entering the age of the e-consultation, where large systems such as Intermountain Healthcare plan to turn most every encounter into a telemedicine encounter. Cheap, ubiquitous teleconferencing technology can now turn any visit to a primary care provider into a patient-centered care team huddle, cutting weeks off the referral run-around.
But it only works if the right team of specialists, the PCP, and the patient are available at an agreed-upon time. All the broadband bandwidth in the U.S. won't matter if a specialist or a physician can't see her free and busy times at a glance and quickly coordinate with others on the team. Patients, too, need to see all their appointments at a glance, instead of scrolling through email or listening to recorded calls ad nauseam.
Big, proprietary solution providers such as Epic are offering their own solutions, but woe to the patient who has a provider or six outside one of those systems. And woe to large physician practices with a hodgepodge of EHR vendors to support.
And yet, I am optimistic that solutions are at hand for the diverse IT needs of healthcare.
The most recent evidence I have found comes from the U.S. Department of Veterans Affairs, which just completed its Medical Appointment Scheduling System (MASS) Contest.
All the winning entries work with the VistA open-source EHR software used by the VA, and are themselves open source, which will perk up the ears of foreign countries whose entire EHR infrastructures are built upon VistA.
The winning team's entry, Health eTime, is working code that can set individual, group, and patient appointments, with a ton of features that form the basis for all sorts of intriguing resource utilization analytics. I saw a brief demo, and marveled at Health eTime's capabilities. Care coordinators are truly project managers with the ability to schedule and re-schedule sequenced appointments and appointment-dependent tests, assisted by Health eTime.
As to providing physicians and patients with unified glances at their schedules, Health eTime supports the CalDAV standard*, a way for calendars on smartphones and tablets to automatically receive "pushed" updates reflecting moved, added, and changed appointments. Those subscriptions, unlike ordinary emails, are securely delivered. And they don't just wind up in someone's voice mail or e-mail inbox. (I note with dismay that CalDAV support is spotty on Windows phones and some Android phones; iPhones and iPads, though, support CalDAV beautifully.)
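The payload a CalDAV server pushes is ordinary iCalendar text that any subscribed phone or tablet calendar can render. Health eTime's internals aren't public, so the following is only a format illustration, built with nothing beyond the Python standard library and invented appointment details:

```python
# Minimal illustration of the iCalendar (RFC 5545) event text that a CalDAV
# server pushes to subscribed calendars; the appointment details are invented.
from datetime import datetime, timedelta
from uuid import uuid4

def make_appointment_ics(summary, location, start_utc, duration_minutes):
    fmt = "%Y%m%dT%H%M%SZ"
    end_utc = start_utc + timedelta(minutes=duration_minutes)
    return "\r\n".join([
        "BEGIN:VCALENDAR",
        "VERSION:2.0",
        "PRODID:-//Example Clinic//Scheduling Sketch//EN",
        "BEGIN:VEVENT",
        f"UID:{uuid4()}@example.org",
        f"DTSTAMP:{datetime.utcnow().strftime(fmt)}",
        f"DTSTART:{start_utc.strftime(fmt)}",
        f"DTEND:{end_utc.strftime(fmt)}",
        f"SUMMARY:{summary}",
        f"LOCATION:{location}",
        "SEQUENCE:0",   # bumped each time the appointment is rescheduled
        "END:VEVENT",
        "END:VCALENDAR",
    ])

print(make_appointment_ics("Cardiology e-consult with care team",
                           "Telehealth suite 2",
                           datetime(2013, 12, 3, 15, 30),
                           duration_minutes=30))
```

When an appointment moves, the server pushes the same event with a higher SEQUENCE number, and every subscribed calendar quietly updates itself, which is exactly the step that phone calls and reminder emails fail to accomplish.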
Health eTime's patient portal also can be populated from VistA to serve up directories of providers. Once a patient chooses a provider, Health eTime can display that provider's free and busy time slots.
Just think how many phone calls and emails that could save the VA. Dare I say patients would be happier? Staff no longer needed to schedule all those appointments by phone could be put to far better use in delivering care.
By structuring this as a contest, the VA avoided the usual soul-crushing process of issuing a request for proposal (RFP), usually the start of a mind-numbing blizzard of bureaucracy that, too often, leads to Web sites such as healthcare.gov.
The price to the U.S. taxpayer for all this scheduling goodness came out to a bit more than $3 million, which also includes three other top winning entries. In return for the cash, the teams delivered their source code to OSEHRA, the Open Source Electronic Health Record Agent, which publishes the VistA code under an open source license.
The VA hopes to implement an updated scheduling system based on this winning code within 18 months, starting with a pilot at a smaller hospital, says Michael L. Davies MD, national director of systems redesign at the VA.
"Integrating with VistA is not a chip shot," Davies says. "It's hard, in part, because industry has to understand what the [scheduling] problem is. The other piece of it is that not all of our VistA code is documented in a way that allows industry to come in and just plug into it."
With 70,000 VA personnel currently scheduling more than 85 million appointments a year, the potential cost savings are huge. I don't know how many appointments the rest of healthcare schedules annually, but I can't help thinking it's a number that dwarfs 85 million.
In this column, I've only scratched the surface of the possible efficiencies of scheduling technologies. It's a topic I intend to return to. As the Internet of Things emerges, we will see more and more ways for technology to insert itself appropriately into the ultimate workflow – that between providers and patients. The possibilities appear endless.
* Full disclosure: I am so enamored of scheduling's potential, I currently serve on the Board of Directors of CalConnect, the Calendaring and Scheduling Consortium, a non-profit organization that champions the CalDAV standard, and which serves as the nexus for the IT industry to move this work forward. I invite you to check them out.
Hospitals and health systems just now getting around to meaningful use have clearer guidance from CMS, a better selection of off-the-shelf EHR software, and the cautionary lessons learned from HMA.
Judging by last week's readership on HealthLeadersMedia.com, more than a few of you were keenly interested in HMA's $31 million giveback to the Centers for Medicare & Medicaid Services for failing to tell the truth about its meaningful use attestation.
While we ponder the fallout at HMA, how can you avoid being next?
The good news is, CMS now has a web page to help you navigate your way around (or through) a meaningful use audit.
Scroll down to the heading "Audit Information and Guidance" to find the relevant resources.
That's just where El Centro (CA) Regional Medical Center went at the start of its meaningful use journey. It's poetic justice, or something, that those just now getting around to meaningful use have a better selection of off-the-shelf electronic health record software, resources such as those from CMS, and the lessons learned from HMA—and, no doubt, other cautionary tales to come.
Tura Morice is CIO of ECRMC, a municipally owned hospital located less than 15 miles from the Mexican border in California. With 162 beds, two general outpatient centers, and two specialty outpatient clinics, ECRMC successfully attested for Stage 1/Year 1 of meaningful use on October 25 for the reporting period of June 4 to Sept. 3. As a Year 1 site, ECRMC's reporting period was 90 days.
"I tease my coworkers that there's real value in being a late adopter," Morice says. "Our physicians have been using CPOE in their practices and our hospitalists at other sites, so they were way ahead of the game compared to us, and were more than ready to adopt [our] CPOE pilot." A full rollout of inpatient CPOE is scheduled for next month.
CMS "outlined specifically what they're going to look for in an audit so that you can have all the screenshots and information ready," Morice says. "The early adopters didn't really have clear instructions on what an audit would look like, so I think unfortunate for them, now that they can't go back and redo their attestation screenshots a couple of years down the road.
"For us, we followed the formulary so if we ever get audited, we'll be ready."
In its clinics, ECRMC is using eClinicalWorks, and on the inpatient side, Siemens Soarian Clinicals. Morice is all too aware that Stage 1 of meaningful use is the data collection stage, and that true benefits of being a meaningful user probably won't kick in until Stage 3, when people may experience "the absolute change in the industry that we're all hoping for: removing a lot of these manual processes, automating healthcare information, and involving the patient more."
This year, ECRMC started with required core measures on the inpatient side: stroke, ED, and VTE. Next year, like all meaningful users, ECRMC will start reporting its clinical quality measures through its EHRs. "It's part of what we're all groaning under, trying to get our EMRs ready for that," she says. "I do see that as a huge leap forward in making the EMR more sophisticated with workflows, order sets, alerts, and all the benefits that are supposed to improve patient care and core measure compliance."
Speaking with Morice, I was reminded that CPOE itself isn't a one-and-done act, but more of a process. "You can use your CPOE order sets to help you meet those core measures," she says. "This year we're still abstracting charts manually, where human eyes have to go through the electronic chart and determine whether or not we've met our quality measures. And next year, that abstraction process has to be completely automated. That will involve a huge amount of EMR build effort between now and then to ensure that all required information is in the chart in codified electronic form in order for it to be counted."
And it's not all about quality: In 2014, 16 out of 29 core measures required will be value-based purchasing core measures, Morice notes.
While some larger systems have been able to banish paper all at once, ECRMC takes an approach that makes more sense for a system of its size. "There's unbelievable amounts of paper that flows through a healthcare system, and sometimes you just have to start in one place," Morice says.
"For some of us, that started with the medical record. Our charts are all completely electronic. Our physicians sign them all electronically. And then there are parts of the chart that we are still scanning, still paper-bound, but as time goes on and as opportunities present, our goal is to make that all paperless. But again, it's an ongoing process, and you have to fit those kinds of activities in where you can with all of the other regulatory activities. There's only so many resources, so it's not nearly going as fast as we'd like, but that's where we are."
Morice also gives credit to an outside consultancy that offered integrated system testing and a clinical help desk service, which helped ECRMC through the EHR deployment process.
"We didn't have the internal resources to man a 24-by-7 clincal help desk so we were able to outsource that to them, and now my system builders, instead of being distracted by maintenance and support, can continue building," Morice says."We brought Stoltenberg [Consulting] back in to help us with the CPOE build and on demand to support some of our other clinical systems that we don't currently have internal resources for."
So, take heart, all you CIOs of smaller hospitals just struggling through the meaningful use Stage 1 process. At the end of the process, you don't have to end up being the next HMA. The resources are there to make you just as successful.
Meaningful Use audit-related givebacks and penalties can pack a much harder punch than other Medicare audits, says the CEO of CHIME, because failure in one area of MU attestation is perceived by auditors as "failure in aggregate."
Concerns that Meaningful Use auditors have unclear expectations of their targets were heightened by reports this week that the 70-hospital Health Management Associates (HMA) is giving back approximately $31 million of Medicare and Medicaid payments to the federal government.
Last month, based on the results of an internal review, HMA determined that it had made an error in applying the requirements for certifying its EHR technology under these programs and, as a result, that 11 of the hospitals it had enrolled in the HIT programs did not meet the Meaningful Use criteria necessary to qualify for payments.
Russ Branzell, CEO of the College of Healthcare Information Management Executives (CHIME), expressed concern that Meaningful Use audit-related givebacks and penalties can pack a much harder punch than other Medicare audits.
"If you mess up one bill for Medicare, and you're in a RAC audit, and they ask you to correct that one bill, they don't tell you to give back all your Medicare money," Branzell says.
A "Failure in Aggregate" Meaningful Use auditors, on the other hand, possess a mindset that failure in one area of Meaningful Use attestation is "failure in aggregate," leading to much larger potential givebacks such as HMA's.
"Now if they're appropriately finding things that are of concern, like you would with any audit, then we welcome that scrutiny," Branzell says. "If they're auditing without a standard, rigid process they're following for consistency across all their audits, with some clear expectations, then we will be concerned, which we were the first time, and we made sure that was very well-expressed to the powers that be in Washington."
The latest round of Meaningful Use audit notices, which bore response dates of November 7, first appeared as emails from designated auditing firm Figliozzi and Company to providers during CHIME's most recent conference in October.
"More Pervasive" Audits Coming Branzell says based on the responses Figliozzi receives, the firm will decide how many on-site followup audits would then occur. These on-site audits "will eventually be much more pervasive" than they have been so far, Branzell says.
CMS is required to audit a certain number of organizations or individuals that have attested for Meaningful Use, Branzell says. "I do think there's an appropriateness to audit," he says. "We just want to make sure it's done correctly."
So far, CMS is unwilling to consider granting "partial credit" for Meaningful Use attestations that fall just short of meeting minimum criteria, Branzell says. "I don't think they're going to bend on this, just the way it was set up," he says. "At least at this point they won't."
Branzell says he was not aware of which electronic health record system or systems were in use at HMA. "I would expect that they have a pretty robust IT staff," he says. He did note, however, that the corporate leadership roster on HMA's Web site does not include a chief information officer.
"It does make you wonder if there was one, and that person is gone, or if they didn't have one, and they were sourcing their IT somewhere, and that might have been part of the problem," Branzell says. "At this point, we really don't know."
Destructive APIs, a lack of interoperability standards—and their glacial pace—and Twitter are a few of the tech irritants on Scott Mace's mind this week.
This week, I've worked up a mini-rant about some of the most maddening things about information technology, in a healthcare context, of course.
1. APIs work great, until they break
Epic recently announced an application program interface (API) for its EHR software. Details remain sketchy, but I can guarantee one thing: Somewhere down the road, for some good reason, Epic will change its API and break a whole bunch of things built on top of the first API.
Don't blame Epic. It's the nature of APIs to change, particularly if the business model of the company publishing them depends on not allowing too much openness with competitors or potential competitors.
That's why we hunger for standards from groups such as HL7 to set the APIs in concrete. But standards are usually the product of vendors jockeying to deny each other any kind of competitive advantage, so they always end up being some kind of least common denominator.
Sorry, HL7, but that's why your work, though important, is only the foundation for much of the interoperability standards effort (hopefully) underway at CommonWell Health Alliance and elsewhere.
2. Interoperability is coming along great, especially if you like really incremental progress
Maybe I'm not being fair, but let me tell you what crossed my desk yesterday, and you decide.
The announcement reads: "Michigan Health Information Network Completes Onboarding to eHealth Exchange; MiHIN becomes one of the first health information networks to onboard under new testing process working in collaboration with PCE Systems."
In plain English, MiHIN is inviting providers to join its service and thereby enabling them to exchange electronic medical records with federal organizations and other participants in the eHealth Exchange community.
That doesn't mean providers have agreed to exchange data. That announcement comes later, provided those providers find it in their economic interest to do so. I think ONC is still working on that part.
The announcement continues that MiHIN is the first organization to "onboard" the Consolidated Clinical Document Architecture (C-CDA), a data format required under Meaningful Use Stage 2 to support health information exchange.
PCE Systems turns out to be a behavioral health data sharing organization, and MiHIN officials note that this means HIE is moving from its focus on physical health to embrace behavioral health data as well.
It says something that this is probably going to be the biggest HIT interoperability story of the week. And it's safe to say the impact is pretty marginal on most providers. I do applaud all the participants, including Healtheway, Inc., which supports the eHealth Exchange community; and the Certification Commission for Health Information Technology (CCHIT), master of healthcare interoperability testing efforts.
I won't mind if you retweet this item, but good luck getting your summary into 140 characters.
3. Twitter is great. Twitter is maddening.
You might be surprised to learn I was a relative latecomer to Twitter. I couldn't seem to understand the value of hashtags, and often got my pound sign and my at sign confused. Around the time I joined HealthLeaders, I really got the hang of it. There's great benefit in forcing everyone's thoughts into 140 characters, even though there's no hope that it can completely cure one of FOMO (fear of missing out).
But it's far more useful than Facebook for seriously keeping up with what's going on. And don't get me started on how LinkedIn could be improved.
But Twitter is also frustrating. Just like LinkedIn, it needs some power data mining tools. TweetDeck and Twitter-based apps of its ilk appear to be great for following several hashtags on a single screen, but so far I haven't seen anything that really adds a sophisticated level of filtering to the service. And there is always the danger that if someone builds such an app, Twitter will change its API and break all those apps.
A slew of Meaningful Use audit notices have suddenly materialized, aimed not only at Medicare, but at Medicaid recipients as well. The deadlines are tight and the documentation requirements exacting, making a most unwelcome October surprise for healthcare CIOs.
As the CHIME conference wound down on the evening of October 10, CIOs were abuzz: A new wave of Meaningful Use audit notices was making its way into their email boxes with November 7 due dates for responses.
The government might have been shut down, but the federal contractor conducting the audits, Figliozzi & Company, was still on the job. The new fiscal year was unfolding before CIOs with a fright worthy of Halloween.
In response, CHIME leadership sent out an urgent survey to its members. The results were sobering. The rolling, random audits were indeed going out in force, and they weren't just aimed at Medicare, but at Medicaid recipients as well. The survey found that out of 1,400 member organizations, close to 100 received audit notices this month.
I had led myself to believe that the Meaningful Use audit process was more cut-and-dried than it is. In fact, that may be more true for small practices, where the provider's own bureaucracy is at a minimum. When audit notices go to the largest organizations, however, they can really test the governance mechanisms and responsiveness of providers.
For one thing, there appears to be great variability in who receives the audit notice emails at the larger organizations. Some emails are going to general inboxes. So the first challenge is to filter and find the audit-notice emails, wherever they're landing.
Rigid Documentation Requirements
I am also struck by how much documentation the auditors are asking for. They are demanding proof that risk assessments are being conducted during the MU attestation period in question, rather than before those periods begin.
And auditors are demanding screen shots showing various aspects of compliance. Submitting ancillary proof of compliance, such as checked-off lists of tasks performed, is insufficient.
Furthermore, healthcare systems with multiple hospitals or multiple physicians are also being required to provide that documentation for each hospital and for each physician. "There are folks across the country, especially in physician offices, that are going to end up tripping over [their] security risk assessment," says Pamela McNutt, senior vice president and CIO at Methodist Health System in Dallas.
Tips from Methodist Health System
McNutt is a CHIME leader whose organization received an audit notice for each of its four hospitals. In a CHIME Webinar held Oct. 22, McNutt said there have even been debates within Methodist's physician entities about what actually constitutes a risk assessment.
"It's not something like where you hire a hacker to try and break into your networks to find your vulnerabilities," she says. Instead, it's a matrix of considerations provided through HIPAA regulations – and includes listing the organization's certified EHR plus any individually certified modules of that EHR, plus how the organization has mitigated risk "for each and every component."
These risk assessments must also show that any deficiencies found were completely remediated before the reporting period ended, McNutt says.
Providers even have to watch their words carefully lest they invite extra scrutiny. "Avoid using words like 'deficiencies' and 'remediations'" if the organization is simply contemplating a set of best practices, McNutt says.
Another lesson Methodist learned was to provide proof that it was on a Meaningful Use-certified version of its EHR software during the entire reporting period. This can be tricky if software upgrades happen anywhere near that period of time, McNutt says.
"A letter from your vendor would do, or if you have screen shots you can take from your EHR that say what exact day certain releases were moved into production, you could use that for your defense, but that's the first thing they ask for."
Medicaid Surprise and a CMS Challenge
Then there was McNutt's Medicaid surprise, which reinforces the fact that these audits are as much about proving actual use of EHR systems as they are about proper installation and risk assessments. "This audit from CMS is as much an audit of your state Medicaid agency as it is of you, and so they come to you to reprove everything that the state already has," she told the CHIME Webinar audience.
This particular audit process dragged on. After four or five go-rounds with the auditor hired to do Texas-specific audits, "they finally just said, 'you just need to send us every single claim that you produced,' and we could de-identify it, but they wanted to know who the payer was and how much we were paid, and whether we were denied, for all payments, not just Medicaid.
"That was a surprise to us, and I challenged it all the way up to CMS, and I was told that that was a valid request. So be prepared for that."
Some providers, including McNutt, have even received phone calls as part of HHS's Office of Inspector General's effort to audit the auditors in each state.
At CHIME, I happened to mention this to former National Coordinator Farzad Mostashari, whose response was a shrug, signifying that this is the way things go with audits at times.
McNutt's co-presenter during the CHIME Webinar was Liz Johnson, vice president of applied clinical informatics at Tenet Healthcare, which has received nineteen audit notices so far. Tenet has the added headache of operating in 22 states, making its challenge and learning experience exponentially greater than Methodist's.
In some cases, the audit notice got to Johnson with only two days left to respond. "We did call and get a few extra days, but it is one of those things where you want to stay on top of it," she said.
Once an audit notice is received, Tenet has a policy that its audit response team decides a course of action within 36 hours. Among other things, Tenet has had to demonstrate to auditors that clinical decision support rules are firing correctly. OIG staff even visited Tenet facilities in person to see some of these rules in action.
A Necessary Burden
The twists and turns of these audits seem to go on and on. Not every provider tuned into this CHIME Webinar, so I hope raising the issue to a higher profile here is useful to all providers. Like too many things in healthcare, it seems that larger organizations, with more clinical, financial and legal resources, might be better able to respond to these audit requests.
Probably those at greatest risk, as usual, are smaller community hospitals, while the very smallest of practices might benefit from being more simply organized than larger providers.
As with audits in so many areas of healthcare, audits of Meaningful Use are a necessary burden of leadership, and the continuing scrutiny of the value of technologies purchased with Federal and state funds shows no sign of easing.
Software that can create structured tables of data from clinicians' notes and then incorporate them into any standard electronic medical record is easing concerns that structured EHRs are killing the clinical narrative.
At CHIME earlier this month, I heard many CIOs complain that electronic health record systems do a poor job of summarizing clinicians' notes and integrating them with the structured data that forms the backbone of the population health analytics that can bend the cost curve of care.
I've been writing for a long time about concerns that structured EHRs are abandoning the clinical narrative. I've even written about the potential for natural language processing (NLP) technology to extract actionable information from that narrative.
Now there is evidence that NLP is starting to make a difference, and more importantly, may not require providers to be locked into a new set of such technologies. Instead, providers might be able to shop around for best-of-breed tools to get the job done.
The reason for my optimism is IBM's LanguageWare Content Analytics software, now in use at the University of North Carolina Health Center.
IBM's software can actually create structured tables of data from free text, which can then be incorporated into any standard medical record, according to IBM officials.
At UNCHC, the software is digging into written mammography reports and finding abnormal results, then presenting them for followup examinations, says Carlton Moore, associate professor of medicine at UNCHC.
"We looked at a random sample of mammography reports taken from our electronic medical records done over the past five years," Moore says. Two physicians reviewed the reports; then IBM's software went through the same reports, and the team compared the two findings.
The software found 98 percent of the abnormalities that the physicians found. "It was actually very effective" and could be tweaked to be 100 percent effective, Moore says. UNCHC's results have been written up and submitted to a research journal for possible publication.
At present, UNCHC has a home-grown electronic medical record, but is in the process of switching over to Epic by next May, and is looking to integrate IBM's NLP software with Epic after that.
Moore says physicians' hectic day-to-day workflow creates an opening for NLP to flag abnormalities for followup that otherwise would be overlooked.
"There's a lot of information coming physicians' way and they have to process it," Moore says. "They have a lot of interruptions. They're writing a note and might get interrupted, because a patient just called or a nurse wants you to see a patient right away, so it's very easy for things to kind of fall through the cracks."
Physicians should not have to rely upon memory to go back and make sure that an abnormality is followed up on, Moore says.
Providers will also be given options on how to follow up. In some cases, this analysis can trigger an alert in an EHR. In others, a daily report could be routed to nurse care managers, who review it and make sure all patients have had proper follow-up care, Moore says.
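A minimal sketch of what that routing choice might look like, with the thresholds and channel names entirely my own invention:

```python
# Hypothetical routing of NLP-flagged abnormalities: the most urgent findings
# raise an EHR alert immediately; everything else accumulates into a daily
# digest for nurse care managers. Thresholds and field names are invented here.
from collections import defaultdict

daily_digest = defaultdict(list)   # care-manager queue, keyed by clinic

def route_finding(finding: dict) -> str:
    if finding.get("birads") == 5:  # highly suggestive of malignancy: alert now
        return f"EHR alert sent for patient {finding['patient_id']}"
    daily_digest[finding.get("clinic", "default")].append(finding)
    return f"queued for daily care-manager report ({finding['patient_id']})"

print(route_finding({"patient_id": "patient-001", "birads": 5}))
print(route_finding({"patient_id": "patient-002", "birads": 4, "clinic": "east"}))
```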
The same NLP techniques UNCHC is using to scrutinize mammography reports and pathology reports could also be used to scan other kinds of radiology reports, or any other type of free text reports.
By taking the next step and converting its results into structured tables that can be incorporated into patient EHRs, UNCHC and other IBM customers will avoid being locked into some proprietary NLP system that would sit alongside the traditional EHR.
That's not to say IBM doesn't want customers to stay with its solution. But by getting into that structured format, it does tilt the balance of power back toward the customer, because structured data is inherently more interoperable and portable than unstructured data. As IBM's Ed Macko, worldwide CTO for healthcare and life sciences, puts it, the newly structured data, extracted by NLP, becomes part of the patient's longitudinal record.
You might not think it takes a rocket scientist to figure this out, but you might be wrong. Prior to IBM, Macko, an engineer by training, built computer systems for the space shuttle.
Seriously, NLP can uncover hidden truths in the patient record. Take the elusive data point known as smoking status. A clinician could check the box indicating the patient does not smoke, but the clinical narrative may reveal a more nuanced reality: Macko notes that a physician's note may say the same patient is "down to two packs a day." Only technology such as NLP, or an army of people scanning physicians' notes, could possibly get to the single version of that truth.
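As a thought experiment, and not a description of IBM's implementation, the reconciliation Macko describes might look something like this; the phrase list and field names are invented for illustration.

```python
# Hypothetical reconciliation of a structured smoking-status checkbox with an
# NLP-derived observation from the narrative. Field names and the phrase list
# are illustrative only, not drawn from any particular EHR or NLP product.
def reconcile_smoking_status(structured_value: str, narrative_text: str) -> dict:
    narrative = narrative_text.lower()
    nlp_says_smoker = any(
        phrase in narrative
        for phrase in ("packs a day", "pack per day", "current smoker", "smokes daily")
    )
    return {
        "structured": structured_value,
        "nlp_inference": "current smoker" if nlp_says_smoker else "no evidence of smoking",
        # Flag conflicts for human review rather than silently overwriting either source.
        "conflict": nlp_says_smoker and structured_value == "never smoker",
    }

print(reconcile_smoking_status(
    structured_value="never smoker",
    narrative_text="Patient reports she is down to two packs a day.",
))
```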
There may even be new business opportunities for larger systems such as UNCHC to align with smaller healthcare systems and hospitals that can't afford the resources to implement IBM's LanguageWare Content Analytics.
Here's how it might work. Small providers could connect to systems such as UNCHC via health information exchanges. UNC's own HIE could provide NLP services to such providers, who would send their free-text reports and receive structured data and actionable reports in return, for a fee.
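Here's one way that service boundary might be sketched, as a hypothetical HTTP endpoint; the route, payload fields, and the crude extraction logic are my own assumptions, not a description of UNC's HIE or any vendor's product.

```python
# Hypothetical "NLP as a service" endpoint a hub such as an HIE might expose:
# small providers POST free-text reports and receive structured findings back.
# The route, payload fields, and extraction logic are invented for illustration.
import re
from flask import Flask, request, jsonify

app = Flask(__name__)
BIRADS = re.compile(r"BI-?RADS[^0-6]*([0-6])", re.IGNORECASE)

@app.route("/nlp/mammography", methods=["POST"])
def extract_mammography_findings():
    payload = request.get_json(force=True)
    findings = []
    for report in payload.get("reports", []):
        match = BIRADS.search(report.get("text", ""))
        birads = int(match.group(1)) if match else None
        findings.append({
            "patient_id": report.get("patient_id"),
            "birads": birads,
            "needs_followup": birads is not None and (birads >= 4 or birads == 0),
        })
    # In practice, authentication, auditing, HIPAA-compliant transport, and
    # per-report billing would wrap this exchange.
    return jsonify({"findings": findings})

if __name__ == "__main__":
    app.run(port=5000)
```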
"That's where I see the future of this is going," Moore says. "For a small practice to be able to independently take the software and have a programmer do this themselves is probably not going to happen."
There's also a role for this kind of service in research. Many inclusion and exclusion criteria important to clinical trials are buried in physicians' notes. NLP could be used to identify these patients for use in study cohorts, Moore notes.
One other note: Although you might hear IBM describe NLP as an element of Watson, it is only one piece of that system. While Watson does employ NLP, it also contains a machine-learning component that helps Watson understand the entire ontology of a particular medical practice.
Over time, Watson "learns" more about a given specialty, such as oncology. The NLP technology in use at UNCHC must be specifically programmed to find unstructured data and extract it into structured form, and as such falls far short of Watson's ability to understand medicine.
Still, NLP is probably our best tool for mining unstructured data, and arrives none too soon given the explosion of electronically-stored medical data. And as Macko notes, "in the end, physicians are going to do what physicians do. They like to write things down, right?"
Not only is health information technology helping to control costs, it's also creating new opportunities for revenue.
This article appears in the October issue of HealthLeaders magazine.
Healthcare systems find that telemedicine can help grow their volume and drive out inefficiencies, but new methods of care delivery require thoughtful planning to avoid hiccups.
The UC Davis Health System now offers access to 30 specialty care services ranging from behavioral health and dermatology to audiology and ophthalmology for both children and adults.
Recently, the system reported it was able to grow its pediatric medicine practice through telemedicine. In a study published in the July 2013 issue of Telemedicine and e-Health, authors from the UC Davis Health System reported that 2,029 children were transferred to the hospital from 16 surrounding hospitals connected via telemedicine between July 2003 and December 2010.
Revenue attributable to these patients grew as well: average hospital revenue increased from $2.4 million to $4.0 million per year, and average professional billing revenue increased from $314,000 to $688,000 per year.
The Sacramento, Calif.–based system, which has 619 licensed beds and reported total operating revenue of $1.3 billion in 2012, has the benefit of being in a state that has long recognized the practice of telemedicine.
"Our experience has been positive with private payers in California," says Shelley A. Palumbo, chief administrative officer for UC Davis Health System's Center for Health and Technology and the Center for Virtual Care. "In fact, several payer organizations have attended the UC Davis Telehealth Education Program to better inform themselves about telehealth and the benefits for their membership."
Since starting its telemedicine program in the early 1990s, UC Davis Health System has provided nearly 37,000 consultations that way. This represents synchronous (real-time communication) and asynchronous (recorded and stored for another time) telehealth consultations to more than 100 sites spanning 44 of California's 58 counties, Palumbo says.
"Patients and their physicians can connect with UC Davis doctors from many different locations, including rural hospital emergency rooms, community clinics, and Native American healthcare sites," she says.
Integrated delivery networks have been able to sidestep reimbursement issues because they own their own insurance companies.
In Utah, Salt Lake City–based Intermountain Healthcare enrolled about 77,000 Medicaid participants under full capitation in January, says Wesley Valdes, DO, telehealth services medical director at the 22-hospital system.
"Once you're capitated, you're freed up from thinking like fee-for-service, and now you can look at what we jokingly say is how we really want to take care of patients without having to worry about justifying everything as a single encounter," Valdes says.
"We have recently been discussing how to best obtain information from this population regarding the technology that they have access to and what obstacles, if any, they face. This effort is still in the formative discussion phase, but we felt it to be necessary to accurately design an approach to this group in an effective and sensitive fashion," he says.
Intermountain's alternative to the ACO model, known as shared accountability, is driving cost reductions across the care episode (admission to discharge) and is allowing the system to invest in telemedicine infrastructure in every inpatient room and ambulatory exam room, which will allow consultations among any physicians in the system.
"We decided to take those cameras and point them inside first," Valdes says. "By doing that, and looking at the infrastructure and process that would require that, we started to realize additional cost savings and additional efficiencies that we previously couldn't have realized without that infrastructure. For example, if I were to run a monitor and microphone and speakers in a room, sure I can bring a consultant into that room virtually, but I can also bring an interpreter into that room, a case manager, a pharmacist, a nutritionist, a chaplain, and I could leverage these resources across my system very efficiently."
Intermountain also realized that when the infrastructure was not in use by staff, patients could use it. "Grandpa comes out of hip surgery and wants to chat with the grandkids, say, 'Hey, doing okay.' We'll let him do that. We can make that available."
The interactive patient room design now in progress at Intermountain includes a touchscreen-enabled device that allows the patient to request interpretation services or other video services, Valdes says. "Patients have a better idea of when they need an interpreter than we do.
"This infrastructure is the exact same infrastructure that the organization was looking at to roll out patient entertainment to all the rooms," Valdes says. "That was about a $12 million to $14 million investment. It's also the same infrastructure that they were looking at to roll out patient education to all the rooms. That was about an $8 million investment. So by committing to this infrastructure, we were able to roll all those projects into one and save the organization about $22 million of proposals right off the bat."
Intermountain will roll out the new telemedicine infrastructure, covering all 2,800 hospital beds in its system, during the next 12 to 18 months, "faster if we figure out how to do it," Valdes says. "We're prioritizing things like the ED and the intensive care units first."
At this year's American Telemedicine Association meeting, Intermountain demonstrated its latest telemedicine technology for neonatal intensive care as well. A prior implementation at Intermountain's McKay-Dee Hospital Center was supposed to allow parents to view infants in neonatal bassinets, but the cameras were embedded in fixed ceiling positions: bassinets were too often covered by blankets, and nurses had to be mindful of where the bassinets were placed. Placing the new cameras within the bassinets themselves makes those problems go away.
The newer cameras also reflect the plummeting cost of such technology for telemedicine. "We're actually leveraging consumer-grade Web cams, $70 Web cams you can go pick up at any local electronics store, which again reduced the cost of the implementation, and it was a wild success at the ATA conference," Valdes says.
Another healthcare system that continues to see its telemedicine business grow is nonprofit HealthPartners, a Bloomington, Minn.–based integrated system that reported more than $1.3 billion in total revenue in 2012. Earlier this year, the system reported savings of $88 per visit when care is delivered online rather than in an office or emergency department; a typical office or urgent care visit can fall in the $120–$140 range. Part of the strategy of this offering, which has produced 68,000 treatment plans since it launched in 2010, is attracting visits for certain straightforward conditions, says Kevin Palattao, vice president of patient care systems at virtuwell, the online front end for HealthPartners' telemedicine services.
He describes the straightforward conditions as those that "behave really well according to rules and protocols, [for example,] bladder infection, sinus infection, pinkeye." "We treat the flu, allergies, and even minor skin rashes," Palattao says. "We have a pretty simple picture upload capability, so consumers can share images with us that help increase the accuracy of the diagnosis. Any time the standard of care calls for a physical exam or a lab test, those are the types of things that we stop in the interview process and redirect you to in-person care."
At Mercy Health in metropolitan St. Louis, clinicians are monitoring 450 beds spread across Arkansas, Oklahoma, Kansas, and Missouri, with plans to expand to a fifth state. All told, Mercy Health has more than 70 telemedicine projects in development, "anything from doing e-consults from any specialty to inpatient to outpatient and then in the remote home monitoring field and then even consulting in the home via video," says Wendy Diebert, vice president of telemedicine services at the 32-hospital system, which includes 300 outpatient facilities.
"Some of it is direct fee-for-service," Diebert says. "Some of it, we're paid a service fee to provide the service, so then the hospital can bill for that service. But once the physician's paid a service fee, they cannot bill for it."
Still other telemedicine services are funded on a population management model, where Mercy is paid a set fee to keep a population of patients healthy, Diebert says.
With telemedicine reimbursement issues still unsettled in various states, Diebert says the system has spent the past year trying to come to grips with the issue.
"Each model of service that you deliver, you have to develop a template for reimbursement around that, then it all ties into scheduling and where you draft the fees at," Diebert says. "It seems so simple, because everybody just says just bill it. Absolutely you can just bill it, but you also have to have certain things in place to do that."
One of those things is that the healthcare system has to have a license in the state where the patient receives the telemedicine services, Diebert says.
"The second thing is you have to have privileges at the hospital" where services are delivered, she adds.
"The third thing, which everybody underestimates, and we clearly underestimated, is that you have to have those credentials with every plan—whether it's managed care, government plans, and that could be anywhere from five to seven applications per physician after you've already completed all the other applications."
At this point, Mercy Health has created "a centralized hospital privileging agreement across our health system, and now we're working on a centralized managed care contract so that if I get them in this plan and this community, that they're accepted in all plans in all communities," Diebert says.
For those just beginning their telemedicine efforts, it is essential to convene a multidisciplinary team from the start, Diebert says. "We ended up pulling in our Epic [EHR] builders. We pulled in our finance team. We pulled in the managed care team. We pulled in the government plan team. We pulled in the clinical team. Just because it hit every aspect and has a ripple-down effect into everything you do.
"It comes down to even how I schedule the patient, how I refer the patient, [how we] work to follow each transaction to see if you're going to get paid and if you're not getting paid by that managed care company."
In the process, Mercy Health officials would discover existing contracts where a particular company covered telemedicine but not in particular communities, Diebert says. "It was just so unknown that nobody really did a good job around negotiating contracts around telemedicine, not from a managed care standpoint," she says. "Medicaid, Medicare, if you're in a rural community, that becomes a whole different thing, but we have a mix of metropolitan, urban, and rural communities, so we're hit with all the different kinds of scenarios."
The scheduling issue also sneaks up on telemedicine programs. "You have to schedule the patient, you have to schedule the physician, and their schedule is their office area, because you're taking a slot of their time away from their regular schedule," Diebert says. "You have to make sure that they're in a telemedicine room, and that the telemedicine room has the right type of equipment," which these days includes a growing variety of scopes attached to computers and the Internet. "So it is much trickier than just saying I'm going to schedule this patient in this office room."
The telemedicine boom also puts pressure on hospitals to standardize their technology platforms from clinic to clinic. "I have a physician coming in, and he's going to do telehealth in four different hospitals or four different clinics. I don't want him going to five different applications to get to all of those things. It needs to be streamlined, because the other side of it is, from a scheduling perspective, you cannot impact their schedule, because it impacts their productivity if you don't have that room and that patient in that room ready to go. You can't be saying, 'Wait, the camera's not working or this isn't working.' You have got to be ready to go so that 20-minute visit goes on like clockwork."
It would be easy to focus on the price of the federal government's health insurance marketplace and expect it to work better than some Web startup launched by a couple of heavily-caffeinated kids. But it wouldn't be fair.
No one ever said information technology was foolproof. Even the most advanced systems today have their glitches. Now we can add healthcare.gov to the list.
Last week, at the CHIME conference, my informal poll of healthcare CIOs found broad agreement that it's no surprise the federal health insurance exchange Web site, healthcare.gov, has been overwhelmed by an unexpected number of visitors and plagued by other basic flaws.
Here's one example of how bad it is: A New York Times researcher successfully registered, but despite "more than 40 attempts over the next 11 days," was reportedly unable to log into healthcare.gov.
"I know the government spends a lot of money on their contractors and Web sites, and I also know that it's very difficult for them to make changes because of the process the government may go through," said Pam McNutt, senior vice president and CIO of Methodist Health System in Dallas, TX.
McNutt thought back to when CMS rolled out software that providers use to achieve Meaningful Use attestation, and how long it took the government to remediate problems in that software. "It does not surprise me," she says, that an IT project of healthcare.gov's scope and scale started out on the wrong foot.
A System Almost Predisposed to Failure
It's easy to focus on the price tag, $643 million or so, and expect it to work better than some Web startup here in Silicon Valley that, say, started on a shoestring a few weeks ago and is serving millions today. While tech's most darling sites have all had their outages, healthcare.gov's performance last week is one for the books, resembling one long, running outage.
But remember that the users of those Silicon Valley sites are usually tech-savvy types who have low expectations of a new site's capabilities, and are often attracted by a few glitzy features. In contrast, the nation's health insurance exchanges were defined by Congress—a body not known for designing things of beauty—and implemented by a raft of government contractors who got the job of plodding through an incomprehensible set of rules and regulations while purporting to be the lowest bidder.
It's a system almost predisposed to failure, and I've already received more than one email urging me to point out that the mere fact that the system is working at all is something of a coding triumph, or, as I put it back in July, something of a miracle.
But returning to CHIME, CIOs in attendance also pointed out that the public, so annoyed by healthcare.gov's shaky start, requires a whole lot of educating about the kaleidoscope of choices the Web site presents. And consumers are aware of the ticking clock: they must sign up for coverage by the end of December to avoid a penalty.
Remember that healthcare CIOs aren't exactly sitting around waiting for things to do. They are up to their armpits in Meaningful Use, ICD-10, HIPAA Omnibus, accountable care organizations and a raft of other mandates and transformations that won't wait.
One CIO, whom I won't name, even confided to me that he had received an email about the new health insurance exchanges from the CEO of his hospital's health plan, but he had not bothered to open it yet. "If I see anything exciting, I'll send it to you," he said.
As my colleague John Commins pointed out recently, hospitals are becoming default advisors on health insurance exchanges. CIOs I talked to at CHIME confirmed that this is happening, to lesser or greater degrees depending on the hospital.
It's Not Like Amazon
A hospital cannot and will not be expected to provide the same analysis that the federal data hub will, processing a prospective health plan member's income and producing expected tax credits. That will still require the federal and state HIX systems, which have performed poorly so far.
A few pundits last week took issue with the way healthcare.gov requires all visitors to create accounts before they can receive such information, but I don't see any way around it. It's true that you can anonymously price-compare on Amazon and lots of other popular Web sites, but those sites do not require your tax information to calculate a price. By statute, healthcare.gov has to work this way.
So while everyone is throwing bricks at the Beltway Bandits who now scramble madly to fix and scale up the federal and state sites, let's not forget that the master architect of this information system was Congress, and the people who elected its members.
Technology is usually best designed by some megalomaniac who gets a good idea and pushes it through over the objections of investors and even the public because of some overarching vision of what it can be. Technology is not usually designed well by committees of any kind.
Give it Time
But we are not designing an iPad here; this is more like a set of incredibly boring plumbing that, like the plumbing in our houses, simply has to work, all the while adhering to an ocean of FIPS (Federal Information Processing Standards) designed by other committees and acts of Congress.
Could the Federal and state procurement processes that produced healthcare.gov be reformed? Sure, there is always room for improvement. But at the federal level, agencies have to deal with 435 cranky investors, each of whom has way too much authority over the process without the requisite qualifications to design systems. I don't see a procurement process that can get around that.
Let's all take a breath or two and give this some time. That was my takeaway from CHIME on this whole health insurance exchange mess. The CIOs gathered in Scottsdale, Arizona, were indeed far from idle, serving up passionate ideas about how to fix electronic health records, how to move to value-based care, and how to align IT goals with business goals.
I don't expect them, or healthcare systems, or even Silicon Valley, to fix the health insurance exchange rollout, and neither should anyone else.
At the annual conference of the College of Healthcare Information Management Executives, former ONC chief Farzad Mostashari stopped short of supporting alterations to the 2014 final rules or deadlines but suggested that providers determine whether hardship exemptions can postpone Meaningful Use Stage 2 penalties.
In his first appearance since leaving his post as National Coordinator for Health IT at HHS earlier this month, Farzad Mostashari suggested that providers find out whether hardship exemptions can postpone Meaningful Use Stage 2 penalties.
"CMS could clarify what constitutes a hardship exemption. You wouldn't get the incentive payment, but you wouldn't get the penalty, one percent of Medicare payment cut either," said Mostashari, speaking at the annual conference of the College of Healthcare Information Management Executives (CHIME) in Scottsdale, Arizona.
The final rule governing Meaningful Use 2014, including Stage 2 requirements for providers entering their third year of attestation, specifies compliance by October 1, 2014 for a 90-day reporting period during that year. The final rule does allow for delayed attestations due to hardship, but CMS clarification "could provide additional flexibility… so that's where I would advise CHIME to look."
CHIME has strongly supported the current timetable for Meaningful Use 2014, but at the same time has asked ONC to provide additional flexibility by delaying the penalty phase.
Mostashari stopped short of supporting alterations to the Meaningful Use 2014 requirements or to the deadlines laid out in the final rules published in 2012.
"There's no legal way to change a final rule without doing a pretty involved process," Mostashari told the CHIME audience. "It takes about nine to 12 months of proposing the new rule, getting that cleared through OMB, getting public comment on it, and all the stuff you guys have been through with Stage 1 and Stage 2. It would be total chaos."
Delaying implementation of the Meaningful Use 2014 and Stage 2 rules would also negatively impact programs such as the IQR, PQRS, and ACO initiatives at CMS. "We've worked really hard to align them, and if you take this out, that alignment falls apart," Mostashari said.
"You can't then use the Meaningful Use quality reporting to count towards PQRS and IQR and ACO… I think folks should assume that the timelines stick."
Before Mostashari left ONC, the office announced that, effective October 4, the management team would be led by Jacob Reider as acting national coordinator, with Lisa Lewis serving as acting principal deputy and Joy Pritts continuing as chief privacy officer.
Reider was scheduled to speak at the CHIME conference, but the October 1 government shutdown and strict government rules led to that presentation being cancelled. Reider was even prohibited from traveling at his own expense to the event, CHIME officials said.
Mostashari took the occasion to outline a series of concerns he has about the Meaningful Use program moving forward:
On Business Practice Challenges "We may have the right incentives and we may have the right information, but too many people, [and] too many institutions don't know how to do this new form of delivery. We can talk about population health management all day long. We can even buy population health management software. But changing, flipping practice, flipping the hospital, changing so that everything doesn't have to take place in an eight-minute doctor's visit that is what we could get reimbursed under fee-for-service, creating standing orders and protocols, that's a cultural challenge. That's not an IT challenge. That's a business practice challenge."
On Product Usability "I do worry about the usability of the products. Not that the usability isn't getting better. I think it is getting better. But the expectations are getting better even faster. Maybe new hardware forms are going to actually help with that. But this was one where I didn't think…there was a clear government role as much as there was a market role, and I don't know if the market is incentivizing usability as much as maybe it could."
On Time Constraints "The other worry is of course time, the limits of time, and all of you are under incredible pressures this year. ICD-10, Meaningful Use Stage 2, and other transformation, ACO enablement, population health management. [What] I don't think is going to happen is the pace of change slackening off."
On Collaboration "How much of your time is spent doing things that you think other hospitals have already done? A lot of the time of our committees is figuring out order sets. That development time is unnecessary. Everyone rediscovering that same thing is not necessary… helping others is the only way we can get through this. These are fantastic opportunities for us to come together and find those ways of information sharing and those relationships that will sustain that information sharing once you leave here. But I wonder if there's a way to make that even more systematic."
On Vendors "Vendor user groups are good. Vendor-provided platforms where you can share modules or content, that's good. It just doesn't feel like we're having enough of that happening to get us collectively through this next difficult period."
On Innovation "I'm seeing a problem in terms of the matching up between the supply and the demand of new products. I'm seeing a lot of people coming up with ideas, not necessarily businesses yet, who are pitching furiously. They need some partners. They need some clinical understanding. They need a place to try out their new innovations. Someone said to me, it was the vice president for innovation at Dignity, said I've had a thousand people come to me with readmission algorithms… I don't know that I need a thousand of them, but it would be nice to know which one's best. It's really hard to evaluate which products, which vendors, those additional vendors you want to work with. And it's really hard for them without making that connection. So again, a lot of what we're going to need is going to be about matching up with each other, matching up the learning, matching up with innovation…a lot of it isn't government policy."