Is some aspect of your practice getting a failing grade? Here’s how to turn it around and embrace a spirit of quality improvement.
Fam Pract Manag. 2001;8(7):45-46
On my first day as medical director of an academic department of family medicine, I found out what a negative quality assurance audit feels like. In short, it isn’t fun, but it can be a tremendous opportunity to embark on quality improvement.
In our situation, reviewers for Healthy Kids (HK), our state-funded health care program for indigent children, had given our department a failing grade. In the preceding years, the department had managed to meet the reviewers’ criteria for the documentation of well-child care, but with little wiggle room. Now, we were found to be deficient in three of the five areas studied. If our department did not improve, we risked losing this important contract, which accounted for over 70 percent of our child patients.
FAILING GRADE
A 1998 audit of the author’s group found deficiencies (i.e., scores of less than 70 percent) in three of the five areas studied. After embarking on a quality improvement initiative to address these areas, the group significantly improved its performance in all but one area (laboratory tests), as shown in the 1999 review of the three deficient areas. In 2000, the group initiated another review to see whether its performance was still on track. Again, all areas had passing scores of greater than 70 percent except laboratory tests, which had dropped because of insufficient lead-test tracking. The group has since developed a plan to do lead testing in-house and is confident its scores will improve on future audits.
| Review component | 1998 | 1999 | 2000 |
| --- | --- | --- | --- |
| Health and developmental history | 69% | 98% | 84% |
| Comprehensive physical exam | 80% | n/a | 89% |
| Laboratory tests | 68% | 67% | 50% |
| Immunizations | 63% | 87% | 73% |
| Health education/anticipatory guidance | 76% | n/a | 79% |
Note: The 1999 audit was a review of the three deficient areas and examined only a sample of charts from the initial audit. The 2000 audit was broader, examining a random sample of all current charts.
Impetus for change
Being told by an outside entity that your care is deficient is difficult. On the one hand, our department felt defensive, believing our failure was due in part to unrealistic expectations, decreasing remuneration and increasing demands from the organization mandating the review. We also believed that the quality of the care we actually provided was much higher than what was documented in our charts and shown to the reviewers.
On the other hand, we knew our department could do better – if not for the auditors, then for the sake of our patients. With that in mind, we embarked on a quality improvement project:
1. Identify the problem. Our first step was to explore the areas in which we were found to be deficient and identify the underlying problems, of which there were many:
Charts were hard to locate because our filing system was imploding under our huge volume. Many charts lacked face sheets, immunization flow sheets or dividers, making the search for information difficult. When reviewers could not readily find data in a chart, they marked the data as missing.
Our documentation of past medical history, family history and preventive screening data, although usually present, did not follow the format of, and was not as comprehensive as, the newest requirements of the HK program.
The preprinted well-child visit sheets we were using had not been updated and were inconsistent with the parameters by which we were being judged.
Residents and attendings were unfamiliar with how they were being reviewed and were not well versed in the guidelines they were supposed to be following.
When patients’ immunization records from previous doctors were missing from the chart, auditors counted them as missing even when we had documented urging the parent to forward the records to our office.
2. Focus on what can be changed. We started by redesigning many of the forms we use for well-child visits to mirror those suggested by the HK program, including our immunization flow sheets, medical chart face sheets and patient questionnaires. We also reorganized our charting system, replacing our battered charts with multi-tabbed ones that made information easier to find. We instituted concurrent chart review as well: if a deficiency was found at the time of the visit, the resident was asked to correct the problem before the patient left the office. In addition, all physicians in our practice underwent training and obtained certification as approved HK providers.
We made it easier for parents to obtain releases of their children’s immunization records from previous doctors by providing a one-stop, sign-here service overseen by a patient service coordinator. Each Friday, the medical director’s secretary would collect the releases, mail them to the appropriate offices and log whether the information was returned to us. If it wasn’t, we mailed a follow-up letter and documented our efforts in the chart.
Finally, to promote some friendly competition, we held a contest. The names of all residents and attendings who achieved a 100-percent score on their HK audit slips were entered in a random drawing for dinner for two at a local upscale restaurant. Another contest is under way offering the same reward to medical assistants who identify children who are behind on their immunizations or other patients in need of the pneumococcal vaccine.
3. Check for improvement. Three months prior to the anticipated reevaluation of our performance, we asked the reviewers to return to our office to make sure we were on track. This review, which included only a sample of records from the initial audit, showed we had significantly brought up our percentages in all but one area: laboratory testing (specifically, lead-test tracking, an area we felt was beyond our control because we referred these tests out of our department). The reviewers were so impressed with the turnaround we were able to accomplish that they decided to suspend their plan to revisit our site for one to two years.
Several months later, to assess how our current charts would fare under the same scrutiny, we asked that a random sample of all charts from children seen within the past three months be reviewed. The results were encouraging. Although several of our scores turned out to be lower than they were on the previous review – something we half expected due to the broader sampling involved in this review – all of our scores were passing except, again, in the area of lead-test tracking. To address this problem, our medical assistants came up with the idea of doing the lead testing themselves on-site at the time of the well-child visits, instead of referring the children who needed lead testing out of the department. Our local health department has agreed to supply the testing kits free of charge, and we believe these new measures will improve our performance at the next audit.
Beyond fixing what is broken
Many quality improvement projects are prompted by the need to fix an area that has gotten out of control, often brought to your attention by an outside reviewer. From that experience, it is a natural step to true quality improvement, which requires that you go beyond “fixing what is broken” and look proactively for other areas within your practice to improve. By identifying an area for improvement, focusing on the specific changes you can make and then checking to see whether your efforts worked, you can improve the quality of the documentation and care rendered by your office. The reward for a job well done is the privilege of continuing to do what we do best: caring for our patients.