Our previous blog article about making a financial case for automated hand hygiene compliance covered the costs of healthcare-acquired infections. In this one, we’ll explain why the money spent on direct observation often leads to misleading compliance data.
Evidence from five studies shows that direct observation inflates hand hygiene compliance rates (Jeanes, et al., 2019; Vaisman, et al., 2020; Gould, et al., 2020; Bruchez, et al., 2020; Leis, et al., 2020; summarized in Nour-Omid, 2021). These inflated compliance rates may give organizations a false sense of security about the interventions they’re currently taking to reduce infections. Spending money to produce an inaccurate baseline and misleading improvement data is not money well spent.
Direct observation results are inaccurate for two main reasons: the Hawthorne effect – the tendency of a person to modify their behavior while being observed – and the small, unrepresentative samples of hand hygiene opportunities captured by most direct observation programs.
Most hospitals don’t do enough observations to gain reliable data
A study of direct observation practices at 10 acute care hospitals found that they completed only 10 to 30 observations per unit per month (Livorsi, et al., 2018). Contrast this low number of observations with the 200 observations per unit per month required by the Leapfrog Group’s latest hand hygiene standards for its hospital and ambulatory surgery center survey. The standards also require observations to occur across all days of the week and all shifts in proportion to care providers who interact with patients.
Ramping up a direct observation program to 200 observations per unit per month would require more budget to hire and manage more observers. Assuming a rate of $40/hour for direct observation and six observations per hour (per the industry average of 10 minutes per observation), increasing from 24 to 200 observations a month raises costs from $160 to $1,360 per unit per month. Multiply that by the number of units and costs quickly become unsustainable. And these figures count only the cost of observers recording hand hygiene compliance results, not the cost of recruiting and training the observers and other expenses.
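The arithmetic behind these figures can be sketched as follows. The $40/hour rate and six observations per hour come from the article; billing observer time in whole hours is an assumption needed to reproduce the $1,360 figure (200 ÷ 6 ≈ 33.3 hours, rounded up to 34):

```python
import math

def monthly_observation_cost(observations_per_month,
                             rate_per_hour=40.0,
                             observations_per_hour=6):
    """Cost per unit per month, assuming observer time is billed in whole hours."""
    hours = math.ceil(observations_per_month / observations_per_hour)
    return hours * rate_per_hour

# Figures from the article: 24 observations/month (typical) vs. Leapfrog's 200
print(monthly_observation_cost(24))   # 160.0
print(monthly_observation_cost(200))  # 1360.0
```

Scaling this across dozens of units makes the budget pressure plain, and it still excludes recruiting and training costs.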
The four reasons direct observation is “problematic”
Aside from the costs, the Livorsi study stated that direct observation is “problematic” for reasons falling under four themes: 1) a lack of time and personnel for direct observation, 2) skepticism among clinical staff about accuracy, 3) a tendency to create tension between staff and observers, and 4) observers’ feedback not consistently reaching staff or motivating improvement efforts.
As part of the study, researchers gathered quotations from interviews with observers and from focus groups with frontline staff. Regarding a lack of time and personnel, an infection preventionist stated, “We’ve been so busy with these other things for the past month. I haven’t done observations in probably a month, which is terrible to say but it’s the truth. So we have horrible data.”
Speaking to the skepticism about the data, a hospital epidemiologist said, “Our observations are not reliable. And we believe that the compliance is far less than what the statistics may show.” A prevention control specialist commented, “We still have the Hawthorne effect in play. Whenever the staff does see us, they automatically head towards the nearest gel dispenser or sink.”
Others spoke about the tension between observers and staff. “We just call ’em spies (laughter),” said a clinical nurse. A clinician stated, “Yeah, big brother’s watching (chuckling).” An infection preventionist said, “(Nurses) didn’t want to do observations for their unit. They didn’t want to quote unquote ‘narc’ on their coworker type of thing. They really felt that it wasn’t their job.”
In regard to the feedback and improvement process, an infection preventionist said, “You send it up, but are they sending it back down? You’re the intermediary. You’re giving it up. How is it getting back down to the people? . . . Our responsibility really is to give it to the leaders and for them to share it with their staff and develop their own action plans at the grassroots. . . . I should be able to go to the unit and say, ‘Hey, your hand hygiene rate was 64%. What are you – the nurse – gonna do about hand hygiene?’”
Better outcomes: Automated systems have improved hand hygiene compliance and reduced infections
The Livorsi study cited several hospitals that have “used automated surveillance systems to provide immediate and individualized feedback that, in turn, has helped to improve hand hygiene compliance.” (Ellison, et al., 2015; Kelly, et al., 2016; Michael, et al., 2017; Pong, et al., 2018)
Vitalacy’s automated hand hygiene solution has demonstrated accurate data capture by individual, unit and organization, with features that help care providers comply in a positive, non-punitive way focused on making care safer for the patient. The solution requires no observers and little staff time beyond the initial training.
Working with Vitalacy, St. Mary’s Healthcare System for Children in Queens, N.Y., decreased infections significantly as hand-wash duration increased from less than five seconds in 2019 to 5.2 seconds in February 2020, 8.0 seconds in March, and more than 10 seconds in April 2020 and beyond. Unlike most automated systems, the Vitalacy system measures hand-wash duration, which correlates with the infection rate.
St. Mary’s averaged 11.8 healthcare-associated infections per month during 2018-2019. In January and February 2020, St. Mary’s averaged 9.0 infections per month, but in March infections dropped to only one, coinciding with longer hand-wash duration and with the turning on of hand-wash reminder notices received through Vitalacy SmartBands worn on care providers’ wrists. There were no infections from April through October 2020, and November and December averaged 1.5 infections.
Contact Vitalacy today for a demo!
Bruchez SA, et al. Assessing the Hawthorne effect on hand hygiene compliance in an intensive care unit. Infection Prevention in Practice, June 2020;2(2).
Ellison RT, et al. A prospective controlled trial of an electronic hand hygiene reminder system. Open Forum Infectious Diseases, 2015;2(4):ofv121.
Gould D, et al. Electronic hand hygiene monitoring: accuracy, impact on the Hawthorne effect and efficiency. Journal of Infection Prevention, May 28, 2020;21(4).
Jeanes A, et al. Validity of hand hygiene compliance measurement by observation: a systematic review. American Journal of Infection Control, March 2019;47(3):313-322.
Kelly JW, et al. Electronic hand hygiene monitoring as a tool for reducing health care–associated methicillin-resistant Staphylococcus aureus infection. American Journal of Infection Control, 2016;44(8):956-957.
Leapfrog Group. Leapfrog Hospital Survey. Factsheet: Hand Hygiene. Last revision: 4/1/2020.
Leis JA, et al. Introduction of group electronic monitoring of hand hygiene on inpatient units: a multicenter cluster randomized quality improvement study. Clinical Infectious Diseases, November 2020;71(10):e680-e685.
Livorsi DJ, et al. Evaluation of barriers to audit-and-feedback programs that used direct observation of hand hygiene compliance: a qualitative study. JAMA Network Open, 2018;1(6):e183344.
Michael H, et al. Durable improvement in hand hygiene compliance following implementation of an automated observation system with visual feedback. American Journal of Infection Control, 2017;45 (3):311-313.
Nour-Omid J. 5 studies show how direct observation inflates hand hygiene compliance rates due to Hawthorne effect. Vitalacy Blog, May 17, 2021.
Pong S, et al. Effect of electronic real-time prompting on hand hygiene behaviors in healthcare workers. American Journal of Infection Control, 2018;46(7):768-774.
Vaisman A, et al. Out of sight, out of mind: a prospective observational study to estimate the duration of the Hawthorne effect on hand hygiene events. BMJ Quality & Safety, 2020;29(11).