
Measuring the Quality of Delivery of Prevention

Award Information

Department of Health and Human Services
Award ID:
Program Year/Program:
2010 / SBIR
Agency Tracking Number:
Solicitation Year:
Solicitation Topic Code:
Solicitation Number:
Small Business Information
Woman-Owned: No
Minority-Owned: No
HUBZone-Owned: No
Phase 2
Fiscal Year: 2010
Title: Measuring the Quality of Delivery of Prevention
Agency: HHS
Contract: 2R44DA020954-02
Award Amount: $925,418.00


DESCRIPTION (provided by applicant): Quality of delivery is known to moderate the effectiveness of drug abuse prevention programs. As research-based programs become widely disseminated, having tools for assessing quality of delivery becomes important for documenting and understanding the conditions under which programs succeed and fail. There are currently no standard methods for measuring the quality with which prevention programs are delivered. The goal of this project is to create a standardized tool for assessing quality of delivery for research-based drug abuse prevention programs. Commercially, adding this product to an existing product (Evaluation Lizard) that provides pretest and posttest surveys tailored to the evaluation of evidence-based programs will significantly strengthen our niche in the prevention evaluation marketplace. The broad aim of this SBIR project is to develop a tool that will allow the collection of standardized measures of quality of delivery across the wide range of programs being disseminated as a result of their inclusion on the NREPP model programs list, as well as newly developed programs that wish to qualify for future inclusion. During Phase I, we interviewed prevention researchers, program administrators, program developers, and policy makers to define and elaborate the essential criteria for defining dimensions of quality of implementation. Based on their input, we designed a prototype system that allowed us to accomplish the following tasks: (1) create session and program rating form templates for selected NREPP facilitator-delivered prevention programs, (2) link templates to a database to track form information (program, session, observer, teacher, and date of creation), and (3) create PDF forms, printed remotely, that can be linked by barcode back to the database. The prototype system measures dosage, adherence, engagement, and adaptation. We also developed a manual for using the system.
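The form-to-database linkage described in task (2) and (3) above can be sketched as follows. This is a minimal illustration only: the class names, field names, and in-memory store are hypothetical stand-ins, not the actual Evaluation Lizard schema or database.

```python
from dataclasses import dataclass, field
from datetime import date
import uuid


@dataclass
class RatingFormRecord:
    """One printed rating form's tracked metadata (fields named in the abstract)."""
    program: str
    session: str
    observer: str
    teacher: str
    created: date
    # Hypothetical: a unique ID that would be encoded as the printed form's barcode.
    barcode: str = field(default_factory=lambda: uuid.uuid4().hex[:12])


class FormDatabase:
    """Toy in-memory store keyed by barcode, standing in for the real database."""

    def __init__(self):
        self._records = {}

    def register(self, record: RatingFormRecord) -> str:
        # Called when a form template is generated and printed remotely.
        self._records[record.barcode] = record
        return record.barcode

    def lookup(self, barcode: str) -> RatingFormRecord:
        # A barcode scanned from a returned paper form resolves back to its metadata.
        return self._records[barcode]


# Usage: register a form at print time, then resolve a scanned barcode back to it.
db = FormDatabase()
code = db.register(RatingFormRecord("LifeSkills", "Session 3", "Obs-17",
                                    "T. Rivera", date(2010, 9, 1)))
assert db.lookup(code).teacher == "T. Rivera"
```

The key design point the abstract implies is that the barcode is the join key: observation data entered from scanned forms can be matched to the program, session, observer, and teacher recorded when the form was created.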
While functional for the pilot test, this system is not yet fully automated and lacks numerous elements. During Phase II, our first task will be to complete the system for collecting quality of delivery data, including tools for assessing adherence, adaptation, dosage, engagement, and measures of overall quality. We will develop online applications that will allow data collected about quality of program delivery to be linked and jointly analyzed in various ways with pretest-posttest outcome data from students. To ensure that the application works as intended, we will conduct alpha and beta tests of all component applications. We will develop an online system for conducting basic analysis and preparing reports that meets the needs of five separate professional groups: program implementers, program administrators, program developers, prevention researchers, and policy makers. We will conduct a field trial with 10 prevention agencies and school districts to demonstrate that the quality of delivery data collection and reporting system provides program implementers, prevention agency administrators and supervisors, and government agency prevention administrators with information useful for documenting and understanding the process of prevention program delivery. Established prevention agencies will adopt the Evaluation Lizard system, including both pretest-posttest surveys and quality of delivery report forms. We hypothesize that participating groups will report satisfaction with the functioning of the system and will report increased awareness of how prevention is being implemented in their area of purview. Further, we will conduct a reliability and validity sub-study in which participating agencies will be asked to select one teacher to video record the lessons they teach, so that observation data can be included in the analysis to demonstrate the reliability and validity of the measures.
PUBLIC HEALTH RELEVANCE: The field of drug abuse prevention research has identified numerous interventions that can be effective; however, unless programs are implemented well in practice, they are unlikely to achieve their desired effect, and the effort and expense of these programs may be wasted. The need to assess quality of implementation is widely recognized, but, to date, there is no unified strategy for gathering or reporting such data. The value of the product we are proposing is that it would provide an easy-to-use system for documenting quality of program implementation that could be linked to outcome data.

Principal Investigator:

William B. Hansen

Business Contact:

Linda J. Petty
Small Business Information at Submission:


EIN/Tax ID: 156180926
Number of Employees: N/A
Woman-Owned: No
Minority-Owned: No
HUBZone-Owned: No