Veterans: VA medical system isn’t as big a success as officials have asserted

McClatchy Newspapers

VCS Note: This story has two parts; the second part appears below.

WASHINGTON – The Department of Veterans Affairs has habitually exaggerated the record of its medical system, inflating its achievements in ways that make it appear more successful than it is, a McClatchy Newspapers study shows.

While the VA’s health system has earned high marks for a transformation it has undertaken over the past decade, the department also has a habit of overselling its progress in ways that assure Congress and others that the agency has enough resources to care for the nation’s veterans.

The assurances have come at a difficult time for the agency: A surge in mental health ailments among returning veterans over the last few years has strained the system, and a spate of high-profile problems with the care of veterans at the VA and at the Department of Defense’s Walter Reed Army Medical Center has drawn heightened public scrutiny.

A review by McClatchy of the quality measures the VA itself commonly cites found that:

_The agency has touted how quickly veterans get in for appointments, but its own inspector general found that scheduling records have been manipulated repeatedly.

_The VA boasted that its customer service ratings are 10 points higher than those of private-sector hospitals, but the survey it cited shows a far smaller gap.

_Top officials repeatedly have said that a pivotal health-quality study ranked the agency’s health care “higher than any other health-care system in this country.” However, the study they cited wasn’t designed to do that.

In general, the VA has highlighted what it says are superior conditions in its health system. Over the last 10 years, the agency has remade itself, boosting outpatient and preventive care in a growing network of outpatient clinics. It’s received glowing news coverage for the transformation.

“Today we’re positioned as an internationally respected force in health-care delivery, leading private and government providers across every measure,” Secretary James Nicholson said in a 2005 speech. “And we can prove it.”

On key issues of access, satisfaction and quality of care, however, other data contradict the agency’s statements.

Consider how returning soldiers with post-traumatic stress disorder – a major ailment to emerge from the war in Iraq – are cared for. The VA’s top health official, Dr. Michael Kussman, was asked in March about the agency’s resources for PTSD. He said the VA had boosted PTSD treatment teams in its facilities.

“There are over 200 of them,” he told a congressional subcommittee. He indicated that they were in all of the agency’s roughly 155 hospitals.

When McClatchy asked for more detail, the VA said that about 40 hospitals didn’t have the specialized units known as “PTSD clinical teams.” Committees in the House of Representatives and the Senate and experts within the VA have encouraged the agency to put those teams into every hospital.

Even counting other available PTSD programs, about 30 hospitals have neither PTSD teams nor any other kind of specialized PTSD program, although every hospital has at least one person who specializes in the ailment, VA records show.

The VA stood by Kussman’s statement. He wasn’t referring to a specific type of team, officials at the agency said, but to the fact that a collection of medical professionals will tend to every veteran, whether or not his or her hospital has a PTSD clinical team.

Experts inside and outside the VA point to studies showing the agency does a good job, particularly with preventive care, and that it compares favorably with the private sector. While that may be true, McClatchy also found top VA officials buffing up those respectable results in ways that the evidence doesn’t support.

ACCESS

Secretary Nicholson told Congress in February about the VA’s “exceptional performance” in getting veterans in to see doctors. In 2006, the VA said 95 percent of its appointments “occurred within 30 days of the patient’s desired date.” Nicholson and other VA officials touted the department’s record on this issue in previous years as well.

Evidence from the VA itself indicates the record might be inflated.

According to a 2005 report from the agency’s inspector general, VA schedulers routinely put the wrong requested appointment dates into the system, often making waiting times appear to evaporate. In many cases, the scheduler checked for the next available time slot and declared it the patient’s “desired date.”

On Oct. 2, 2003, a veteran was referred to an ophthalmology clinic. On May 3, 2004, a scheduler created an appointment, saying the “desired date” was June 21. The appointment was scheduled for June 23, the inspector general said.

Actual waiting time: 264 days. Reported waiting time: two days.

Some schedulers even kept “informal waiting lists” to consult when they were ready to make formal appointments.

Investigators found that 41 percent of appointments contained errors in the desired dates, and that only 65 percent of one key type of appointment they analyzed fell within 30 days of the desired dates.

After the report, the VA promised several fixes. But by December 2006, it had yet to complete them, the inspector general reported. The inspector general’s office continues to find scheduling problems similar to those it discovered in 2005, according to the VA’s Odette Levesque, who’s been briefed on an ongoing follow-up study.

One of those promised fixes was a new training program. As of the end of April, fewer than half the employees who need the training had completed it, the VA said.

The agency has made several changes since 2005, and it told McClatchy it “believes we have met the intent” of the inspector general’s recommendations; it also said that scheduling was “dependent on schedulers who do make errors.”

SATISFACTION

When Nicholson touted the VA’s quality before a Senate committee in February, his first piece of evidence was a customer satisfaction survey.

VA satisfaction ratings, he testified, were “10 points higher than the rating for inpatient care provided by the private-sector health-care industry. VA’s rating of 82 for outpatient care was 8 points better than the private sector.”

Kussman testified the same thing to a House committee.

But a review of data from the University of Michigan’s American Customer Satisfaction Index shows that Nicholson and Kussman compared the VA’s inpatient and outpatient scores to private hospitals’ TOTAL score. The total score combines three surveys: inpatient, outpatient and emergency room. The VA doesn’t do an emergency room survey, and that’s what drags the private-sector numbers down.

“The ER is a far less satisfying experience compared to inpatient and outpatient services,” said David VanAmburg, who directs the annual survey. Comparing the VA with the private sector’s overall score “is not necessarily an apples-to-apples comparison,” he said.

If the VA were to use the correct comparisons, it still would score higher, although the gap would be half as much – 4 points, in the case of outpatient care. If the survey were adjusted for age and gender differences between the VA and private samples, the gap would tighten even more, although it still would be about 3 points and therefore significant, VanAmburg said.

Asked about the survey, the VA at first said there were some ER patients in its outpatient survey (although not the inpatient survey), which could make the comparison with the overall hospital score valid. The VA said it didn’t know how many.

McClatchy then asked about a separate report indicating that about 1 percent of outpatient visits are coded “emergency unit.” The VA conceded that any ER survey responses would be a “small fraction” of the overall sample.

It added: “To ensure we’re providing the most accurate information possible” about the ACSI survey “we will consider improving how we articulate its results.”

QUALITY

To prove the quality of the VA’s medical care, Nicholson and others – often in identical words, whether speaking to Congress or the news media – repeatedly have cited a study by the nonprofit RAND Corp.

In the last two years:

_Nicholson said RAND “ranked the overall quality of VA medical care as significantly higher than any other health-care system in this country.”

_Dr. Jonathan Perlin, then the top VA health official, said in a radio interview that RAND “compared VA care to 12 other health-care organizations, some of the best in the country,” and found the VA superior. Studies such as RAND’s showed the agency’s care to be “the best that you can get in the country,” he said.

_Kussman wrote in a statement to McClatchy earlier this year that RAND “recently” reported that veterans “receive better health care than any other patients in America.”

_The VA’s public affairs department wrote in a magazine that the study “was conducted by the RAND Corporation, an independent think tank,” as well as researchers from two universities.

As it turns out, the RAND study was neither fully independent nor all that recent. A VA grant helped pay for it. Two of its main authors had received VA career-development awards, and four of its nine listed authors were affiliated with the agency, according to the study’s documentation.

It was published in 2004 but used data from 1997 to 1999, when the system treated far fewer patients than it does now.

The study does show that VA patients are more likely than non-VA patients to receive a range of needed tests and procedures. In the eyes of health experts, that’s a real achievement; other studies have found similar results.

But Nicholson’s claim that the agency performed better than “any other health-care system in this country” and Perlin’s assertion that RAND compared the VA with 12 other health-care systems are wrong.

The study didn’t compare the agency with other systems; it compared patients who used the VA with patients who didn’t.

The non-VA patients were drawn from 12 large metropolitan areas across the country, while VA patients were drawn from two of its 21 regions. The two groups were surveyed with very different methods, and the non-VA sample had a far lower response rate than the VA sample.

Asked about the study, the VA said the agency had partially funded it but that the preponderance of money came from other sources; RAND has a long “reputation for independent evaluation,” it said.

The VA did say that Perlin’s quote was “partially inaccurate in describing the study,” which it chalked up to confusion. It stood by the other statements.

PART TWO: Record on important health outcomes is mixed

By Chris Adams, McClatchy Newspapers

May 10, 2007 – WASHINGTON – Many studies tout the quality of the Department of Veterans Affairs health system based on what researchers call the “process” of health care; for example, how regularly important tests are performed. The VA does well on those measures.

There are fewer studies documenting the outcomes of the agency’s health care. That is: Do patients in the VA system do better or live longer? There, the record is mixed.

-A 2004 study, covering patient care from the same period as the RAND Corp. study the VA often cites to prove its quality, examined survival rates after heart attacks among patients treated in either VA or Medicare-funded hospitals. It found that patients in VA hospitals had a higher death rate than those in the Medicare system, even after accounting for the severity of each patient’s illness. Mary Beth Landrum, an associate professor at Harvard Medical School who led the study, said it had prompted the VA to change its cardiac care procedures.

The VA said its own analysis of heart attack mortality, which used slightly different methods, found no difference between the VA and Medicare systems. That analysis hasn’t been published in a peer-reviewed journal, although the agency said it planned to submit it to a journal soon.

Landrum said the different results showed how difficult it was to research outcomes in health-care systems that record information in different ways.

-A separate study, from 2004, compared diabetes care among VA patients with that of patients in private managed-care programs. It determined that the VA was superior on process measures, but the record was mixed on outcomes. For example, patients in the VA system had better control over their cholesterol and hemoglobin A1c, a standard marker of blood-sugar control, than those in managed-care systems, but the VA and managed care performed the same at controlling blood pressure.

-A 2005 study compared veterans who got coronary artery-bypass surgery at VA hospitals with those in private hospitals; the death rates were about the same.

-A 2006 study compared older patients in the VA system with those in the Medicare managed-care system. It started with two groups – one from the VA, one from the Medicare system – and determined how many of them died from 1998 to 2004.

After adjusting for health status, the VA patients had a lower death rate overall than those in Medicare. Donald Miller, a VA researcher and Boston University professor who co-authored the study, said the differences were big enough to be meaningful for the average veteran using the system.

But there was a caveat in the study: Once patients were pegged as either VA or Medicare users, those labels stuck; researchers didn’t track where the patients received their care. Another study, out in March, found substantial overlap between the systems: Forty-six percent of veterans in the study who were older than 65 used the VA and Medicare for outpatient care, and only 18 percent used just the VA.

-In a statement to McClatchy Newspapers, the VA made the case that a measuring system from the Joint Commission, which accredits health-care organizations, demonstrates the VA’s superior care.

For about two dozen measures – mostly process measures, but some outcomes measures – the VA compared its performance with the average performance of other accredited hospitals. In its statement, the VA said its hospitals equaled or beat other hospitals 91 percent of the time.

What the agency didn’t factor in was all the times there were too few cases to make a comparison. Of 2,738 comparisons, the VA hospital was better than the average accredited hospital about 21 percent of the time, it was the same about 36 percent of the time and it was worse about 6 percent of the time. In the final 38 percent, there were too few patients to make a comparison. Counting only the comparisons that could actually be made, the VA’s equal-or-better share works out to roughly 91 percent; counted against all 2,738 comparisons, it is about 57 percent.

On the specific issue of heart-attack care, VA hospitals and other hospitals were essentially even. On the crucial measure of death after a heart attack, the results found VA hospitals better 0 percent of the time, the same 59 percent of the time and worse 1 percent of the time. In the remaining 40 percent, a comparison couldn’t be made.

-Miller, the Boston researcher, participated in another study that detailed a possible problem with any attempt to assess the care at VA facilities.

That study, published in 2004 in the American Journal of Managed Care, looked at more than 200,000 veterans with diabetes who were enrolled in the VA and Medicare.

Researchers found that the systems’ medical records differed. VA records alone, for example, indicated that 2,300 veterans had had amputations. But combining VA and Medicare records showed that 3,900 veterans had had amputations.

The bottom line: VA records gave only a partial picture of a veteran’s total health care. Given such record-keeping gaps, the study concluded that “comparisons among national systems may not be possible until there is a single electronic medical record.”
