157 Implementing Automated Surveillance to Track Clostridium difficile Infection (CDI) at Multiple Healthcare Facilities

Saturday, April 2, 2011
Trinity Ballroom (Hilton Anatole)
Erik R. Dubberke, MD, MSPH, Washington University School of Medicine, St. Louis, MO
Humaa A. Nyazee, MPH, Washington University School of Medicine, St. Louis, MO
Deborah S. Yokoe, MD, MPH, Brigham and Women's Hospital and Harvard Medical School, Boston, MA
Jeanmarie Mayer, MD, University Hospital, Salt Lake City, UT
Victoria J. Fraser, MD, Washington University School of Medicine, St. Louis, MO

Background: Due to the increasing incidence and severity of C. difficile infection (CDI), it is now recommended that all US hospitals track CDI. To enhance surveillance and improve efficiency, we developed an automated electronic CDI surveillance algorithm and evaluated its effectiveness at three CDC Prevention Epicenters hospitals.

Objective: To develop and validate an automated electronic algorithm for CDI surveillance.

Methods: Data on patients with toxin-positive CDI admitted from July 1, 2005 through June 30, 2006 were collected from electronic hospital records at three US tertiary-care centers. Each center worked with its medical informatics department to design and apply the algorithm to the one-year sample of electronically available hospital data. CDI case classifications assigned by the electronic algorithm according to recommended surveillance definitions were compared with classifications previously determined by chart review. When results were discordant, a second chart review was performed to determine the most likely source of CDI; the chart-review classification served as the gold standard. Kappa (κ) statistics were calculated to measure agreement between the algorithm and the gold standard, and the sensitivity and specificity of the algorithm were calculated.

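As an illustration of how an electronic algorithm can assign these surveillance categories from routinely available dates, below is a minimal Python sketch. The function name, inputs, and time cutoffs (48 hours; 4, 8, and 12 weeks) are assumptions reflecting commonly cited CDI surveillance conventions, not necessarily the exact definitions or data elements used in this study.

```python
# Minimal sketch (illustrative only, not the study's algorithm): classify a
# toxin-positive CDI case by onset and association using admission, discharge,
# and prior-episode dates. All cutoffs are assumed, commonly cited conventions.
from datetime import datetime, timedelta
from typing import Optional


def classify_cdi_case(
    specimen_date: datetime,
    admission_date: datetime,
    last_study_facility_discharge: Optional[datetime] = None,
    last_other_facility_discharge: Optional[datetime] = None,
    prior_episode_date: Optional[datetime] = None,
) -> str:
    """Return a surveillance category for a single toxin-positive result."""
    # Recurrent: new positive within 8 weeks of a prior episode (assumed cutoff).
    if prior_episode_date and specimen_date - prior_episode_date <= timedelta(weeks=8):
        return "recurrent"

    # Healthcare facility-onset: specimen collected >48 hours after admission
    # (assumed cutoff).
    if specimen_date - admission_date > timedelta(hours=48):
        return "healthcare facility-onset"

    # Otherwise community-onset: attribute by the most recent prior discharge.
    discharges = [d for d in (last_study_facility_discharge,
                              last_other_facility_discharge) if d is not None]
    last_discharge = max(discharges) if discharges else None

    if last_discharge is None or specimen_date - last_discharge > timedelta(weeks=12):
        return "community-onset, community-associated"
    if specimen_date - last_discharge > timedelta(weeks=4):
        return "indeterminate"
    if last_discharge == last_study_facility_discharge:
        return "community-onset, study facility-associated"
    return "community-onset, other healthcare facility-associated"
```

In practice, such inputs would presumably be drawn from each facility's admission-discharge-transfer and laboratory result feeds, i.e., the electronically available hospital data described above.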

Results: A total of 1,410 toxin-positive CDI cases were identified. The overall sensitivity, specificity, and kappa of the algorithm by CDI case definition were as follows: healthcare facility-onset: 90%, 99%, and 0.87; community-onset, study facility-associated: 94%, 97%, and 0.84; community-onset, other healthcare facility-associated: 37%, 99%, and 0.46; community-onset, community-associated: 96%, 93%, and 0.65; indeterminate: 80%, 98%, and 0.73; and recurrent: 93%, 99%, and 0.93. Hospital-level results are shown in Table 1.

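For reference, here is a minimal sketch of how sensitivity, specificity, and Cohen's kappa for one case category can be computed from 2x2 counts comparing the algorithm with the chart-review gold standard; the function name and example counts are hypothetical, not data from the study.

```python
# Minimal sketch: sensitivity, specificity, and Cohen's kappa from 2x2 counts
# (algorithm classification vs. chart-review gold standard, one case category).

def agreement_stats(tp: int, fp: int, fn: int, tn: int) -> tuple[float, float, float]:
    """Counts compare the algorithm with the gold standard: tp = both positive,
    fp = algorithm positive only, fn = gold standard positive only,
    tn = both negative."""
    n = tp + fp + fn + tn
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)

    # Cohen's kappa: observed agreement corrected for chance-expected agreement.
    p_observed = (tp + tn) / n
    p_expected = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
    kappa = (p_observed - p_expected) / (1 - p_expected)
    return sensitivity, specificity, kappa


# Hypothetical counts for illustration only (not data from this study):
sens, spec, kappa = agreement_stats(tp=90, fp=9, fn=10, tn=1301)
print(f"sensitivity={sens:.0%}, specificity={spec:.0%}, kappa={kappa:.2f}")
```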

Conclusions: Compared with chart review, electronic surveillance was highly sensitive and specific and showed good to excellent agreement for healthcare facility-onset, community-onset study facility-associated, indeterminate, and recurrent CDI cases. Electronic surveillance is reliable for tracking CDI within a healthcare facility.

Table 1: Sensitivity (%), specificity (%), and kappa values by CDI case definition and facility. Each cell shows sensitivity, specificity (kappa).

Case definition                                       | Facility A    | Facility B     | Facility C     | Total
Healthcare facility-onset                             | 99, 98 (0.97) | 75, 99 (0.66)  | 93, 100 (0.93) | 90, 99 (0.87)
Community-onset, study facility-associated            | 93, 96 (0.83) | 100, 97 (0.86) | 84, 99 (0.83)  | 94, 97 (0.84)
Community-onset, other healthcare facility-associated | 16, 99 (0.25) | 82, 98 (0.61)  | 53, 98 (0.59)  | 37, 99 (0.46)
Community-onset, community-associated                 | 91, 95 (0.71) | 100, 87 (0.63) | 100, 92 (0.44) | 96, 93 (0.65)
Indeterminate                                         | 83, 98 (0.80) | 73, 98 (0.63)  | 63, 97 (0.48)  | 80, 98 (0.73)
Recurrent                                             | 99, 99 (0.97) | 88, 99 (0.85)  | 64, 100 (0.77) | 93, 99 (0.93)