919 Data Validation of State-Based Data in the National Healthcare Safety Network – the Pennsylvania Model

Sunday, March 21, 2010
Grand Hall (Hyatt Regency Atlanta)
Zeenat S. Rahman, MBBS, MPH, Pennsylvania Department of Health, Bureau of Epidemiology, Division of Infectious Disease Epidemiology, Harrisburg, PA
Kevin Nelson, PhD, MPH, Pennsylvania Department of Health, Bureau of Epidemiology, Division of Infectious Disease Epidemiology, Harrisburg, PA
William Cramer, Pennsylvania Department of Health, Bureau of Quality Assurance, Harrisburg, PA
Stephen Ostroff, MD, Pennsylvania Department of Health, Bureau of Epidemiology, Harrisburg, PA
Background: Data accuracy is a critical attribute of any surveillance system. Inaccurate data can lead to faulty analyses, erroneous interpretations, and ineffective public health actions. Data accuracy is especially important for mandatory reporting of healthcare-associated infections (HAIs), since the data are publicly disclosed. In 2007, Pennsylvania enacted legislation requiring facility-wide reporting of all HAIs through the National Healthcare Safety Network (NHSN). Reporting began in early 2008. To assure the quality of the reported data, the Pennsylvania Department of Health (PADOH) instituted a process to validate the data being submitted to NHSN by the state's 255 acute care facilities. This facility-specific system is known as the “Data Integrity Validation (DIV) Report.”

Objective: To describe the DIV report and how it is implemented in Pennsylvania to assure the accuracy of the information Pennsylvania hospitals submit to NHSN.

Methods: PADOH runs a DIV report monthly for each acute care facility reporting HAIs to NHSN. The report analyzes the data each facility submitted two months earlier and contains two sections: a programmatic portion and an epidemiological analysis portion. PADOH-developed data analysis programs determine the usual pattern of monthly submissions from each facility and automatically flag significant deviations from that pattern, including significant increases or decreases in the total number and types of infections reported, the number of procedures performed at the facility, or the number of device days reported. Each reported HAI is also scrutinized for data omissions and possible errors; besides missing data elements, a report could flag formatting mistakes or invalid or questionable responses, such as a device-associated infection with no associated device days, or zero patient days reported for an infection. An overall report of all questionable findings is provided to the facility, which has 30 days to investigate them and make the necessary corrections in NHSN. At the end of the 30-day period, the data are locked down for analysis.
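
To illustrate the two classes of automated checks described above, the Python sketch below flags deviations from a facility's usual monthly pattern and logically invalid individual records. It is a minimal illustration, not PADOH's actual implementation: the record layouts, field names, and the two-standard-deviation flagging rule are assumptions made for the example.

from dataclasses import dataclass
from statistics import mean, stdev

# Hypothetical record layouts; field names do not correspond to
# actual NHSN export variables.
@dataclass
class MonthlySummary:
    month: str         # e.g., "2008-09"
    infections: int    # total HAIs reported for the month
    procedures: int    # procedures performed at the facility
    device_days: int   # device days reported
    patient_days: int  # patient days reported

@dataclass
class HaiEvent:
    event_id: str
    infection_type: str  # e.g., "CLABSI"
    device_associated: bool
    device_days: int     # device days reported for the event's month
    patient_days: int    # patient days reported for the event's month

def flag_pattern_deviations(history, current, threshold=2.0):
    """Flag counts that deviate from the facility's usual monthly
    pattern by more than `threshold` standard deviations."""
    flags = []
    for attr in ("infections", "procedures", "device_days", "patient_days"):
        values = [getattr(m, attr) for m in history]
        if len(values) < 2:
            continue  # not enough history to establish a pattern
        mu, sigma = mean(values), stdev(values)
        observed = getattr(current, attr)
        if sigma > 0 and abs(observed - mu) > threshold * sigma:
            flags.append(f"{current.month}: {attr}={observed} deviates from "
                         f"usual pattern (mean {mu:.1f}, sd {sigma:.1f})")
    return flags

def flag_record_errors(events):
    """Scrutinize each reported HAI for omissions or invalid values."""
    flags = []
    for e in events:
        if not e.infection_type:
            flags.append(f"{e.event_id}: missing infection type")
        if e.device_associated and e.device_days == 0:
            flags.append(f"{e.event_id}: device-associated infection "
                         f"but no associated device days")
        if e.patient_days == 0:
            flags.append(f"{e.event_id}: infection reported with "
                         f"zero patient days")
    return flags

# Example: six months of stable history, then a month with a sharp
# rise in infections and a device-associated event lacking device days.
history = [MonthlySummary(f"2008-0{m}", 12 + m % 3, 50, 900, 3000)
           for m in range(1, 7)]
current = MonthlySummary("2008-07", 40, 50, 900, 3000)
events = [HaiEvent("E1", "CLABSI", True, 0, 3000)]
for finding in flag_pattern_deviations(history, current) + flag_record_errors(events):
    print(finding)

Run on this sample input, the sketch would flag both the atypical infection count for July and the device-associated event reported with no device days.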

Results: The first DIV report covered the period July–October 2008 and was distributed in December 2008. Monthly DIV reports have been generated since then. Early DIV reports contained large numbers of questionable results per institution; over time, the number of questionable findings per institution has declined substantially.

Conclusions: The DIV process has improved the accuracy of Pennsylvania's data in NHSN. The continuing decline in the number of entries in DIV reports per institution suggests that either (1) the report itself has improved reporting quality by identifying problems or (2) NHSN users are becoming more adept at data collection and entry.