The Plan, Do, Study, Act (PDSA) cycle is a quality improvement methodology that was developed in the 1920s and 1930s by Walter Shewhart, one of the pioneers of modern quality control, and subsequently championed by W. Edwards Deming.1 This methodology has been used in a wide array of industries, including health care, as a pragmatic way in which new processes are designed and implemented (Plan and Do); results are analyzed (Study); and improvements are carried out (Act). A robust quality improvement program using the PDSA system is marked by repeated PDSA cycles in rapid succession and represents a process for continuous organizational change and evolution.
In this issue, McConnell and colleagues2 report their findings (the “Study” part of their PDSA cycle) on a quality improvement initiative designed to increase the completeness and timeliness of essential postintubation care actions at a single 24-bed academic medical intensive care unit (MICU). Using what is described as a retrospective, controlled, before-and-after study design, they report that their intervention, a “Postintubation Time Out process and checklist,” was associated with a 19% absolute increase in the proportion of subjects who received an arterial blood gas (ABG) within 60 min, from 37% of subjects before the intervention to 56% after it.
Quality improvement studies are often difficult to assess because, by design, the processes constantly change and are tailored to the local situation. The PDSA method also allows for, and in fact often encourages, smaller PDSA cycles within a larger PDSA cycle.3 Indeed, this was the case in this study, because the authors intervened 3 months into the postintervention period by attaching a checklist to all mechanical ventilators, noting that this change contributed to a consistent improvement in achieving the primary outcome. However, such additional changes to the intervention may introduce sources of confounding and bias that affect the results if the intervention is studied and reported in the same fashion as traditional research.
What is the best way to evaluate a PDSA cycle study? A few questions common to any quality improvement study should be answered.4,5 Was the intervention well designed? Was the adherence to the intervention optimal? Are the study results valid? Are the study results important and generalizable? In this case, the centerpiece of the quality improvement initiative was a 30-point Postintubation Time Out checklist to be used for all MICU patients intubated in or outside the unit. The authors sought buy-in and adoption from physicians, nurses, and respiratory therapists through a multipronged approach, including conferences, emails, orientations, daily shift huddles, and monthly unit-wide meetings.
Unfortunately, as is often the case when trying to effect change, there was only moderate adoption of the checklist, with use in slightly less than half of the subjects studied (49%, n = 58), and the 30-point checklist was fully completed for a mere 2 subjects. This lack of adherence to the checklist casts doubt on the results reported. The changes made to the intervention along the way also make it difficult to attribute causation to the quality improvement checklist.
To their credit, the authors discuss the difficulties they faced in gaining better adherence to their checklist: “Even with multifaceted educational sessions and targeted emails, altering the behavior of the large number of revolving stakeholders was difficult.” The authors further write, “Adherence to a complex checklist, which required 3 providers to perform multiple tasks at discrete times, suffered without a defined process leader.”
More than anything, this study highlights many of the challenges of quality improvement projects and of the efforts to measure change with these interventions. It also demonstrates the mismatch between traditional research reports and descriptions of quality improvement interventions. Because many of us have grown up in a world where randomized, controlled trials are the ultimate form of evidence-based medicine, it is tempting to make all reports fit a similar model. In fact, this does not seem to be the best approach to sharing quality improvement projects. For exactly this reason, the Standards for Quality Improvement Reporting Excellence (SQUIRE) guidelines were published in 2008.6
The SQUIRE guidelines propose a novel structure for reporting PDSA cycle work. They allow the authors to discuss the problem and the rationale for their intervention. The guidelines include a description of the context, the intervention, and the study of the intervention. Metrics are reported but with greater emphasis on describing how interventions were modified, what the observed associations were between interventions and outcomes, and what the unintended consequences were. This approach allows greater learning opportunities for the reader who may want to implement a similar quality improvement project.
Although this paper is not presented in SQUIRE format, we are still able to glean many of these pearls from it. The authors report that better protocol adherence was associated with more timely ABG acquisition: the more items completed on the checklist, the more likely an ABG was drawn within 60 min. A timely ABG also appeared more likely when a specific person was assigned to the task. These are aspects worth considering when developing a similar intervention.
The authors highlight that the naming of a checklist is also a potentially important step, one that was initially overlooked. They suggest that the checklist title of “Postintubation Time Out” might have misled the health care team to use the checklist only for patients who had witnessed intubations. They subsequently changed the name of the checklist to “Mechanical Ventilation Time Out.” This is another lesson that could inform future efforts.
Based on their results, multiple educational sessions, conferences, emails, daily shift huddles, and monthly unit-wide meetings were not enough to ensure high adherence to the checklist process. It remains unknown what educational, administrative, or process interventions might reliably ensure high adherence to complex checklists and processes. The authors write that real-time performance feedback may be an important part of ensuring adherence to a checklist, and they intend to publish a biweekly outcomes chart (presumably in a place where people may read it). They also suggest that computerized data monitoring (so-called statistical process control charts) may serve as an early warning of decreased adherence. It would be interesting to learn from their next steps.
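As a rough illustration of the kind of computerized monitoring the authors suggest, the sketch below builds a simple statistical process control p-chart for weekly checklist adherence. The weekly counts are hypothetical and are not taken from the study; the point is only to show how a week whose adherence falls below the lower control limit would be flagged as an early warning.

```python
# Minimal sketch of a p-chart (statistical process control) for checklist adherence.
# The weekly counts are hypothetical and only illustrate the method.
import math

# (checklists used, intubations) per week -- hypothetical data
weeks = [(12, 20), (13, 22), (11, 19), (12, 20), (13, 21), (2, 20)]

used = sum(u for u, _ in weeks)
total = sum(n for _, n in weeks)
p_bar = used / total  # center line: overall adherence proportion

for i, (u, n) in enumerate(weeks, start=1):
    p = u / n
    sigma = math.sqrt(p_bar * (1 - p_bar) / n)
    lcl = max(0.0, p_bar - 3 * sigma)  # lower 3-sigma control limit
    ucl = min(1.0, p_bar + 3 * sigma)  # upper 3-sigma control limit
    flag = "below lower limit -- investigate" if p < lcl else ""
    print(f"week {i}: adherence {p:.2f} (limits {lcl:.2f}-{ucl:.2f}) {flag}")
```

In practice, the center line and control limits would be established from a baseline period of stable performance and then applied prospectively; the sketch above simply pools all of the hypothetical weeks for brevity.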
This study also highlights how challenging it is to maintain engagement. Although strategies such as the morbidity and mortality conference or the initial orientation to the checklist garnered increases in timely ABG acquisition, these effects faded. Checklist adherence reached an all-time low during July, presumably due to new interns working in the MICU. Perhaps the most important message about engagement is the value of “just in time” education, as demonstrated by the sustained improvement seen after attaching a blank checklist to all clean ventilators that were ready to be used.
The authors acknowledge the challenges and difficulties of implementing a checklist and a process change in the usual workflow of their intensive care unit. More work needs to be done, and further PDSA cycles are needed. Perhaps a simpler checklist needs to be developed. Maybe clinical champions of the checklist need to be identified. Although the exact mechanisms are still to be elucidated, further work on culture change needs to happen. The answer to creating and sustaining a program that assures timely completion of routine processes after intubation remains to be found, but we can learn much from the experiences of this group. We also need to evolve in how we report quality improvement projects, maximizing the lessons and insights gained even when the results are not exactly what we hoped.
Footnotes
- Correspondence: Patricia A Kritek MD MEd, University of Washington Medical Center, 1959 NE Pacific Street, Box 356522, Seattle, WA 98195. E-mail: pkritek@uw.edu.
The authors have disclosed no conflicts of interest.
See the Original Study on Page 902
- Copyright © 2016 by Daedalus Enterprises