164 Electronic Surveillance for Clostridium difficile Infection (CDI)

Saturday, April 2, 2011
Trinity Ballroom (Hilton Anatole)
Diane Hopkins-Broyles, RN, MSN, CIC , BJC Healthcare, St. Louis, MO
Joshua A. Doherty, BS , BJC Healthcare, St. Louis, MO
Angela Recktenwald, MPH , BJC Healthcare, St. Louis, MO
Kelly Faulkner , BJC Healthcare, St. Louis, MO
Erik R. Dubberke, MD, MSPH , Washington University School of Medicine, St. Louis, MO
Hilary Babcock, MD , Washington University School of Medicine, St. Louis, MO
Keith F. Woeltje, MD, PhD , BJC Healthcare, St. Louis, MO
Background: Conventional manual surveillance for CDI is time-consuming and subject to surveyor bias. In our 12-hospital health system, CDI surveillance definitions and processes varied, resulting in data that were not comparable across facilities.

Objective: To develop a standard electronic method for CDI surveillance as a quality measure, using the Centers for Disease Control and Prevention’s (CDC) Ad Hoc CDI Working Group classifications.

Methods: The surveillance algorithm is based on positive laboratory tests. The date of specimen collection is compared with the admission date and with the dates of any prior positive tests at any of our hospitals during the preceding 8 weeks. As a measure of incidence, we used BJC Healthcare healthcare facility-onset, healthcare facility-associated (HCFO-HCFA) cases: cases with a positive test collected > 3 calendar days after admission and no prior positive test within the past 8 weeks. Validity of the algorithm was assessed by comparing electronic results with manual surveillance performed routinely by the hospital infection preventionists (IPs).
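The classification rules above can be sketched in code. This is a hypothetical illustration, not the production algorithm: the function name and the use of simple day differences to approximate "calendar days after admission" are assumptions; only the > 3-day and 8-week thresholds come from the abstract.

```python
from datetime import date, timedelta

def classify_cdi_case(collect_date, admit_date, prior_positive_dates):
    """Classify a positive C. difficile test per the abstract's rules
    (hypothetical sketch). HCFO-HCFA: collected > 3 calendar days after
    admission, with no prior positive test at any system hospital in the
    preceding 8 weeks."""
    # A prior positive within the 8-week (56-day) lookback window means
    # this is not counted as an incident case.
    window = timedelta(weeks=8)
    if any(timedelta(0) < collect_date - d <= window
           for d in prior_positive_dates):
        return "prior-positive (not incident)"
    # Day difference used here as an approximation of calendar days.
    if (collect_date - admit_date).days > 3:
        return "HCFO-HCFA"
    return "community-onset"
```

For example, a positive specimen collected on hospital day 9 with no prior positives would classify as HCFO-HCFA, while the same specimen with a positive test three weeks earlier would be excluded as a prior positive.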

Results: During the study period, January to July 2009, 277 incident cases were identified. In 5 hospitals, manual surveillance identified more cases than the algorithm; in 4 hospitals, the algorithm identified more. Compared with manual collection, the algorithm's sensitivity was 95% and its specificity was 76%. A review of a subset of discordant cases at 3 hospitals where the algorithm identified more cases was completed. In 91% (32/35) of these cases, the disagreement was related to symptoms or a diagnosis present on admission that was found by manual surveillance but not captured by the lab-based electronic process. One patient identified as HCFO-HCFA by the algorithm had a negative test the day after admission and then a positive test 2 days later; this patient had symptoms present on admission. In 6% (2/35) of the cases, the third-party reviewer disagreed with the IP manual surveillance results.
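The reported sensitivity and specificity can be recomputed directly from the counts in the 2×2 results table, treating the manual hospital-acquired classification as the reference standard. A minimal check:

```python
# Counts from the abstract's table: algorithm classification (rows)
# versus manual surveillance classification (columns).
tp = 62   # algorithm HCFO-HCFA, manual hospital-acquired (HA)
fp = 51   # algorithm HCFO-HCFA, manual community-onset (CO)
fn = 3    # algorithm "other",   manual hospital-acquired (HA)
tn = 161  # algorithm "other",   manual community-onset (CO)

sensitivity = tp / (tp + fn)   # 62/65  ~ 0.95
specificity = tn / (tn + fp)   # 161/212 ~ 0.76
print(f"Sensitivity: {sensitivity:.0%}, Specificity: {specificity:.0%}")
```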

 

Algorithm versus manual surveillance classification:

                  Manual Surveillance
Algorithm         Hospital Acquired (HA)   Community Onset (CO)   Total
HCFO-HCFA                   62                      51              113
Other                        3                     161              164
Total                       65                     212              277

Sensitivity: 95%   Specificity: 76%

Conclusions: Despite its limitations, lab-based electronic monitoring for CDI is an acceptable alternative to manual surveillance. It provides data for trending and inter-hospital comparisons. Some discrepancies between manual and electronic surveillance may be resolved with education regarding more timely diagnostic testing on admission. Clinician education and practices are important components in improving an algorithm's performance. BJC Healthcare has implemented electronic surveillance as our standard CDI measure.