842 A Practical Method to Validate the Accuracy of State-Wide Hospital Infection Surveillance

Sunday, March 21, 2010
Grand Hall (Hyatt Regency Atlanta)
Roxie Zarate, MPH, Washington State Dept. of Health, Olympia, WA
M. Jeanne Cummings, RN, BSN, CIC, Washington State Dept. of Health, Olympia, WA
David Birnbaum, PhD, MPH, Washington State Dept. of Health, Olympia, WA
Background: Healthcare-associated infection (HAI) rates are being posted on a growing number of public information websites. However, the accuracy of those rates has rarely been confirmed. The few state health agencies that have attempted validation report their approach as too costly to sustain, and they often discovered noteworthy problems in accuracy. Hospitals themselves also do not routinely assess the sensitivity and specificity of their own surveillance programs. Although not traditionally applied to this area, the industrial concept of acceptance sampling provides a sound basis for designing a valid and sustainable validation method. We used this concept to develop and implement a practical method with shared responsibility between hospitals and a state health department. This paper describes our pilot project, in which internal validation performed by volunteer hospitals was used to assess the workload impact and usability of our approach.

Objective: (1) Develop an epidemiologically and statistically sound validation method; (2) apply the method to determine whether each hospital is able to accurately identify which patients meet NHSN criteria for bloodstream infection (BSI) in its surveillance program.

Methods: A method and toolkit were developed in consultation with the Washington State HAI Program’s Advisory Committee. Based on the literature, 85% sensitivity and 98% specificity were determined to be achievable for BSI. An acceptance region of ±15 percentage points was judged to provide satisfactory precision, leading to a calculated sample size of 22 cases to estimate sensitivity and 22 to estimate specificity. The toolkit was distributed, along with an evaluation form, to 9 volunteer hospitals. The form addressed sample selection, time needed for completion, ease of use, difficulties encountered, and other comments.
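A minimal sketch (Python), assuming a 95% confidence level (z = 1.96) and the standard normal-approximation formula n = z^2 * p * (1 - p) / E^2, reproduces the 22-case sample size for the sensitivity target; the abstract does not state the confidence level used, so that value is an assumption.

    from math import ceil

    def sample_size(p, margin, z=1.96):
        # Cases needed to estimate a proportion p within +/- margin,
        # using the normal-approximation formula n = z^2 * p * (1 - p) / margin^2.
        return ceil(z**2 * p * (1 - p) / margin**2)

    # Sensitivity target of 85% with a +/- 15 percentage-point acceptance region
    print(sample_size(0.85, 0.15))  # 22 cases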

Results: To date, 7 of 9 volunteer hospitals have provided feedback. None suggested revising language in the toolkit. Six said they could complete the process without assistance; one asked for a site visit. Time to complete the process was consistent with our estimate of less than 6 hours (two hospitals reported 3-4 hours). Some were surprised to find their sensitivity was lower than expected.
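For context, the sensitivity and specificity a hospital computes in this internal validation are the standard two-by-two proportions from its reviewed cases. The sketch below uses hypothetical counts, not data from any participating hospital, to show how a result can fall short of the 85% sensitivity target.

    def sensitivity_specificity(tp, fn, tn, fp):
        # Standard definitions: sensitivity = TP / (TP + FN), specificity = TN / (TN + FP)
        return tp / (tp + fn), tn / (tn + fp)

    # Hypothetical counts from a 22-case sensitivity sample and a 22-case specificity sample
    sens, spec = sensitivity_specificity(tp=18, fn=4, tn=21, fp=1)
    print(f"sensitivity = {sens:.0%}, specificity = {spec:.0%}")  # 82%, 95%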

Conclusions: As a first step toward annual internal validation, our volunteers found this method to be feasible, practical, and valuable.