MADRID—Determining what constitutes a best practice in rheumatology, and then implementing improvements based on what you find, can be fraught with complexity, an expert said during the 2017 Annual European Congress of Rheumatology (EULAR). Examples are emerging of benchmarking projects in which electronic registries are used to improve patient care, said William Dixon, MD, chair of digital epidemiology at the University of Manchester, UK.
Dr. Dixon outlined how successful benchmarking—which Xerox pioneered in the 1990s to regain ground lost to competitors in the photocopying business—involves asking the right questions and coming up with good answers to those questions. Benchmarking, as he described it, is a seven-step process:
- Identify what to benchmark;
- Determine what to measure to assess that benchmark;
- Identify whom to benchmark;
- Collect the data;
- Analyze the data;
- Set goals and develop a plan of action; and
- Monitor the process.
Whatever is measured should be relevant, important, reliable, unambiguous, feasible to measure and standardize, valid, and actionable, Dr. Dixon said. But what to measure may not be as clear-cut as it seems. Example: In an intensive care unit, mortality is an obvious data point to compare from institution to institution, but what if one unit treats older patients with a higher comorbidity burden?
“If we think about rheumatology, what is it that we should be benchmarking that meets all of these criteria?” Dr. Dixon asked.
He said the ACR’s RISE Registry deserves praise for having collected data from about a quarter-million patients and more than 300 doctors in at least 55 practices, according to data presented last year. [Editor’s note: As of June 2017, 719 U.S. providers participate in the RISE Registry, representing more than 1 million patients and roughly 20% of practicing rheumatologists in the U.S. RISE has amassed data on approximately 6 million patient encounters.] “[After] you’ve done the plumbing [i.e., set up the EHR software], you don’t have to do any more,” he said. “Those data just flow out to the RISE Registry.”
In Portugal, the Rheumatic Diseases Portuguese Register (Reuma.pt) began by defining a core data set reflecting good clinical practice before the registry itself was built. There, the mere process of measuring these data led to improved care, Dr. Dixon said.
In Denmark, the Danbio registry could be a model to follow. One of its features is that it allows patients to enter their data in real time before a consultation. Dr. Dixon said it’s a “fantastic example of how we can collect structured data for quality improvement.”
He noted that it’s not just a matter of collecting the data in the clinic; the data must also be organized into a repository in a usable format.
Options for analyzing the data and giving feedback range from static reports to face-to-face meetings to interactive reports and graphics. Much remains unknown about the best way to deliver feedback, but it’s fairly well established that prompt feedback works best and that simply telling people what they’ve done is not very effective, he said.
With Danbio, a public meeting is held for each annual report the registry issues. “In the first year, in 2005, this public meeting caused substantial anxiety,” Dr. Dixon said. “But since then, everyone has viewed it favorably, and the meeting is always seen as very fruitful, with exchange of experience across the country.”
Building better systems and tools for benchmarking is a worthwhile endeavor. “We need to build and invest in infrastructure that facilitates real-time data extraction,” he said.
It will also be important to keep the patient perspective in mind. “We should also think about feedback to patients and what should be presented there,” he said. “What’s benchmarking from their point of view?”
Improving Care
In another talk on improving care in a systematic way, Tore K. Kvien, MD, professor of rheumatology at the University of Oslo, said that guidelines and recommendations seem to be leading to better care, at least in some areas, but that clinicians clearly don’t adjust their practice for the better automatically just because official recommendations exist.
“We can know about recommendations … but whether we are really applying them in clinical practice, that is something different—and that is the most important thing,” Dr. Kvien said.
A study on physician compliance with EULAR recommendations on RA management—such as early start on disease-modifying drugs and monitoring of treat-to-target goals—found that doctors tend to say they comply more often than they actually do.1 Example: 98% of physicians said they were committed to the recommendation on early DMARD use, but this treatment was actually done only 67% of the time, according to a review of records. And 83% of physicians said they were committed to monitoring treat-to-target progress, but this approach was actually done only 27% of the time.
Nonetheless, practice patterns have improved in some areas after recommendations were issued, including the timely use of methotrexate, Dr. Kvien said. Other areas, such as the management of gout, still need much improvement.
At Dr. Kvien’s center, the rheumatology department at Diakonhjemmet Hospital in Norway, a preventive cardio-rheuma clinic was established in response to recommendations to evaluate cardiac risk in patients with rheumatic diseases. The clinic has led to reduced risk, he said.
“I think this is a good example of how recommendations have been introduced into clinical practice, and I am quite certain that this is really benefiting patients regarding cardiac comorbidity and, perhaps, mortality,” Dr. Kvien said. “I think management recommendations will contribute to improved quality of care—if implemented into clinical practice.”
Thomas R. Collins is a freelance writer living in South Florida.
Reference
- Gvozdenovic E, Allaart CF, van der Heijde D, et al. When rheumatologists report that they agree with a guideline, does this mean that they practise the guideline in clinical practice? Results of the International Recommendation Implementation Study (IRIS). RMD Open. 2016 Apr 28;2(1):e000221. doi: 10.1136/rmdopen-2015-000221. eCollection 2016.