Claims Data Underreports Pressure Ulcers

By John Commins  |  October 17, 2013

Rates of hospital-acquired pressure ulcers vary widely in hospitals based on how the data is collected, researchers say. In one study, surveillance data identified 10 times more pressure ulcers than billing data.

Hospitals that rely on claims data to measure hospital-acquired pressure ulcers are likely under-reporting the problem and giving the public an inaccurate comparison with competing hospitals. A far more accurate measure uses surveillance reports by trained clinicians inside the hospital, a new study shows.

A University of Michigan School of Medicine study, which appears this week in the Annals of Internal Medicine, found that HAPU rates varied wildly in hospitals based on how the data was collected, inaccurately making some hospitals appear to be better or worse than peer institutions.

The UM study examined two million all-payer administrative records from 448 California hospitals and quarterly hospital surveillance data from 213 hospitals in that state that were publicly reported on CalHospitalCompare in 2009. The researchers winnowed the sample to 196 acute care hospitals with at least six months of claims data and surveillance data.

Jennifer A. Meddings, MD, an internist and pediatrician at UM and the lead author of the study, says researchers "looked at the same hospitals in the state of California in the same year and mapped criteria looking at the same patients as much as possible and the same severity of ulcers."

"The surveillance data found 10 times more pressure ulcers than the billing data. That is a big problem. We expected the surveillance data to be a little bit higher but not dramatically higher," she says. "The one thing we had hope for was 'are we still able to track the same hospitals that are bad performers. Are the same hospitals with high rates in surveillance data the same hospitals we would have identified by the billing data?' Unfortunately this was not the case."

Meddings says the process by which claims data is collected is not conducive to accurate reporting of HAPUs.

"Hospital coders are not clinicians and they are basically restricted by federal rules as far as what types of papers they are even allowed to look at to get the pressure ulcer diagnosis," Meddings says. "The surveillance data is quite different and generated by a team of nurses and other specialists who are trained specifically on how to examine a patient head to toe, a full skin exam to look for pressure ulcers, how to stage them correctly, how to figure out if they were hospital-acquired or not, and how to differentiate bed sores from other types of skin problems."

Specialized clinical surveillance teams in California examined every patient in the hospital at least once per quarter and detailed their findings through a standardized public reporting process with the state.

"Not only did they examine the patients, they were allowed to look at the entire medical record, the doctors' notes, the wound care specialists' notes, the nurses' notes," Meddings says. "They had all the data and they were looking at the patients with their own eyes, which is quite different as opposed to the billing data."

The study's HAPU findings are consistent with a similar review Meddings' team conducted last year of catheter-associated urinary tract infections at California hospitals.

"We know the use of the catheters and skin conditions is much better documented in nurses' notes rather than doctors' notes because nurses are providing much of the skin care and they routinely do the skin exams and they are also the providers who place and monitor the urinary catheters," she says. "Both of these complications are essentially easy to find in nurses' notes, but very hard to find in the notes that coders are allowed to look at."

Continued use of claims data as a measure of hospital HAIs also misinforms the public. "One of our biggest concerns is that there seems to be a trend of quickly using data to publicly report because the data is available and only asking questions years later of whether or not their data is accurate. These types of assessments should have been done before the data was publicly released," she says.

"Frankly, the way the data is collected, if a physician in a hospital is actually very diligent about documenting these issues, then it is more likely to end up in the billing data, so you are actually having an unintended consequences of penalizing the hospitals that document better and reward hospitals with incomplete documentation. If you do an incomplete job of documenting these complications they will never show up in your billing data."

"The data we actually used to do this assessment was available when this policy was being implemented. These types of assessments could have been done before they chose the policy, but then it's a different approach for using the data and we prefer validating the data before using it for this purpose, rather than using it until somebody proves that it is wrong."

Meddings says Medicare and other payers need to adopt a more systematic reporting process that relies less on claims data. "We like the use of the surveillance data in California. We recognize it can be an expensive process, but probably something like a modification of what they're doing would standardize validated measures."

John Commins is a content specialist and online news editor for HealthLeaders, a Simplify Compliance brand.
