November 16, 2004
A new measurement tool called the computerized clinical vignette can help clinicians and policymakers assess and improve the quality of physician practice while potentially reducing costs, according to a study led by a researcher at the San Francisco VA Medical Center (SFVAMC).
The vignette -- which presents a simulated patient visit to the doctor via computer -- is an accurate, inexpensive, and efficient way to measure how well physicians handle their clinical practice, the study concludes. Research findings are published in the November 16 issue of Annals of Internal Medicine.
"As we investigate the quality of clinical practice, we know there's a tremendous need for improvement," says study leader John W. Peabody, MD, PhD, a staff physician at SFVAMC and an associate professor of epidemiology, biostatistics, and medicine at the University of California, San Francisco.
In order to improve care and control costs, he adds, "We need to give physicians a very precise estimate of where their strengths and weaknesses are. But in medicine, we don't get a lot of feedback. And in order to change that, we need a new measurement method."
At present, the most common measurement system is medical record abstraction, a summary of a doctor's patient care records. However, in another study, also led by Peabody and just published in the November 2004 issue of the journal Medical Care, such records failed to accurately reflect the doctor's diagnosis 43 percent of the time. Based on those findings, Peabody notes that "the paper trail is probably not a very representative way to assess what happens during a doctor visit."
The gold standard for measuring physician performance is through the standardized patient -- a trained actor with a simulated illness who visits a doctor unannounced and records the doctor's actions immediately after the visit. This method is relatively accurate, but expensive and complicated to use, according to Peabody.
In the new study, computer vignettes proved significantly more accurate than medical records at assessing physician performance when both were compared directly against standardized patients.
A total of 116 physicians participated in the study, conducted at two VA hospitals and two large private medical centers. Physicians were visited by standardized patients, completed computerized vignettes, or did both. Physicians were informed that standardized patients might be introduced -- unannounced -- into their clinics during the coming year.
The standardized patients presented with one of four diseases: chronic obstructive pulmonary disease, diabetes, vascular disease, or depression. For each disease, one patient presented a simple version and another a more complex version complicated by a secondary diagnosis of either hypertension or high cholesterol.
Each "patient" completed a questionnaire after the visit, and each generated a medical record. The same patient cases were use for physicians responding to the computerized vignettes, in which the physician would "see the patient" on a computer and follow the normal sequence of an actual visit.
The research team scored the three measurement methods for quality of patient care in five domains: taking a patient history, performing the physical examination, ordering tests, making the diagnosis, and administering a treatment plan. Measured by the gold standard of standardized patient visits, the physicians as a group scored 73 percent (on a scale of 100) in quality of care. When measured using identical vignettes, they scored 68 percent. But when the same cases were assessed using medical records, the physicians scored only 63 percent.
Study findings showed that vignettes were more accurate than medical records -- and almost as accurate as standardized patients -- at all four sites, for all eight types of cases, in all five domains of patient care, and at all levels of physician training (second-year resident, third-year resident, and attending physician).
"What makes this study so unusual is that we rarely have the opportunity to define the case when we're evaluating the quality of care," notes Peabody. "By having the standardized patients, we could say with complete certainty that we were making identical comparisons between these three measurement methods."
Peabody and his team developed the vignette method over the course of eight years of research. He says this method would be easy to implement nationwide for several reasons: physicians are familiar with the format because it looks like a patient case they might see in their clinic; it is inexpensive because physicians can sit down, complete the steps on a computer screen, and generate a score very quickly; computerization allows doctors to complete vignettes anywhere, over the web or at home; and the time requirement is minimal, such as one hour twice a year.
Vignettes turn out to be a very precise way of measuring an individual doctor's strengths and weaknesses, according to Peabody. A doctor "may have strength in taking a history, but order too many unnecessary tests. Similarly, because we give different cases, it allows us to say that you're particularly good at taking care of a patient with depression, but you're struggling a bit with a patient who has congestive heart failure."
Peabody says this level of accuracy led researchers to a striking finding: the differences in quality of care between institutions were not nearly as great as the differences within institutions.
Within a particular clinic, "it's usually not the case that there are a couple of bad doctors driving down the overall quality of care, but more that some doctors are having trouble with certain cases or with certain aspects of clinical care," he says. Vignettes could be used as training tools to improve those doctors' performances and thus patient outcomes, which in turn could lower costs, he adds.
Vignettes are also far better than paper records at determining whether a physician has identified a co-morbidity -- a secondary condition that complicates the primary disease. "We're all living longer, and we don't typically just have one medical problem," notes Peabody. In the earlier Medical Care study, his team found that medical records correctly noted secondary diagnoses in only 27 percent of cases, which he calls an "abysmal" rate. By contrast, vignettes reliably indicated co-morbidities for all diseases studied and in all institutions, he says.
http://www.ucsf.edu/