Conducting a Program Inventory
A Program Inventory (PI) is a list of programs or services provided by an agency or jurisdiction. PIs can be organized in a number of ways, depending on the needs of the user, but should include basic information such as the population served, intended outcomes and evidence of effectiveness. In the context of opioid settlement spending, a PI documents what opioid abatement strategies are currently funded, whether those programs have been evaluated and, if so, how effective they are. By using a PI, counties can ensure that opioid settlement funds are used to supplement, rather than supplant, investments in substance use disorder prevention and treatment.
This guide provides detail on key steps and considerations for developing and using a PI to support program planning.
I. Define Programs
The term “program” can mean different things to different people. For the purpose of a PI, “programs” are defined as “systematic activities that engage participants in order to achieve desired outcomes.”
For example:
| Desired Outcomes | Programs |
|---|---|
| Reduce recidivism | |
| Increase parenting skills | |
| Prevent youth alcohol use | |
| Reduce post-traumatic stress disorder (PTSD) symptoms | |
You may also see programs referred to as “interventions” or “services.”
For the purpose of a PI, programs do not include wraparound services, resource centers or places that offer multiple services or treatment options. Instead, programs are defined as the specific components offered at such packaged service locations, such as individual counseling, substance use treatment and job readiness training. As such, these specific components should be listed in their own individual rows in the PI.
Additionally, there are often administrative functions that support programs, such as training, issue-specific collaboratives or implementation monitoring activities. These are also not considered to be programs for purposes of the PI.
It is important to clearly define programs for two main reasons. First, the evidence base — a collection of rigorous evaluations that have determined a program’s level of effectiveness — generally focuses on “programs” as defined in this resource. Knowing what the evidence says about each of your programs is a significant part of the PI. Second, a specific and consistent definition will help those charged with collecting data about programs better communicate what information they are seeking.
Nevertheless, jurisdictions may still want to include information on services they fund in their PI that do not meet this definition of “programs.” These services are often valuable parts of the program continuum, available to everyone but not aimed at impacting a specific outcome (e.g., psychiatric evaluations or case management). However, jurisdictions should know that the inclusion of such services can make the inventory process take longer and potentially render the final PI difficult to manage, analyze and present.
Thus, it is advisable to seriously consider how this information will be gathered and used before including it in the PI.
II. Choose an Inventory Scope
The first step of the PI process is to define its scope. Some jurisdictions may choose to inventory programs across an entire agency or department (e.g., all adult criminal justice programs), whereas others may decide to focus on a specific outcome of interest, setting type or program type (e.g., substance use disorder treatment programs for offenders, prison-based programs or contracted programs).
The scope is entirely up to the jurisdiction; however, it should reflect the ultimate goals for conducting and using the inventory. Other key factors to consider are the availability of information and the resources that can be dedicated to gathering it.
III. Determine Method for Collecting Programmatic Data
Depending on the entity leading the PI process, there are different approaches to collecting the key pieces of programmatic data included in the PI, such as program description, setting and target population. For instance, when an agency is leading the effort, leaders can go directly to their program staff for this information.
Alternatively, when this work is being coordinated by a central entity or a legislative body, the process may start by populating the PI with publicly available information found via an online search of agency websites. The lead PI entity can then reach out to agency program staff to fill in any gaps and thoroughly vet the data. This includes ensuring the details of a program reflect its current implementation and not simply the program’s intended design.
In jurisdictions where programs are provided at the county level or by community-based organizations, a survey tool with clear instructions may be used to collect the data.
Regardless of the approach, the lead PI entity should clearly articulate to all the parties providing the data what information is needed, why it is being requested and how the jurisdiction plans to use it. This educational step is crucial for a successful PI process.
Even when such outreach has been conducted, there is likely to be significant back and forth between the different parties since the requested data is typically not readily available or in the proper format for the PI. To minimize this back and forth, which can cause frustration or delay, first send or walk through an example that clearly shows the information you hope to receive, including how it should be presented and the level of detail requested. In addition, the lead PI entity may want to ask that the relevant parties provide the requested information for only a few programs at first to make sure there is clarity on the type of data needed before proceeding.
IV. Conduct the Program Inventory
The Program Inventory template, available below, helps jurisdictions begin collecting data for a PI. The PI template is a spreadsheet with columns for collecting programmatic data based on what is needed to match to the known evidence base and what has been found to be useful for other jurisdictions across the country. The template contains an instructions page with a brief explanation for each data element.
Download Program Inventory Template
This template is meant to serve as a starting point, but it should be customized to fit your jurisdiction’s needs and goals. You may decide to add columns to document the level of investment in each program and the funding source, or other details that inform specific research and policy questions.
The following sections describe each data element and discuss how that data can be used.
A. Program Information
It is highly recommended that jurisdictions gather the following pieces of information for each program because they are needed to accurately match each program to the evidence base. As a reminder, this information should reflect the program as it is currently being delivered and not how it was designed:
- Name
- Description: A short description that accurately captures the program as provided (and not as designed). For instance, “A manualized intervention where a therapist directly observes a parent and child through a one-way mirror while providing direct coaching to the parent through a radio earphone. The focus is on building the skills of the parent to more positively interact with the child and manage the child’s behavior.”
- Average duration: The average length of time a person is considered “active” in the program (e.g., 6 months, 3 years).
- Frequency/intensity: The number and duration of sessions or classes (e.g., three 1-hour sessions per week).
- Delivery setting: Where the program takes place (e.g., in a facility, at school, at home).
- Target population: A description of the program participants (e.g., young adults ages 18-25 referred for alcohol treatment).
- Individual or group-based: If applicable, note whether the program is delivered on an individual basis, in a group format or a combination of the two.
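The program-information elements above can be captured as structured records before they are entered into a spreadsheet. The sketch below is illustrative only: the field names mirror the list above but are assumptions, not the official template's column names, and the example values are invented.

```python
from dataclasses import dataclass, asdict

@dataclass
class ProgramRecord:
    """One program's row in the inventory. Field names are illustrative,
    drawn from the data elements listed above (not the official template)."""
    name: str
    description: str          # the program as provided, not as designed
    average_duration: str     # e.g., "6 months"
    frequency_intensity: str  # e.g., "three 1-hour sessions per week"
    delivery_setting: str     # e.g., "in a facility", "at school", "at home"
    target_population: str    # e.g., "young adults ages 18-25"
    delivery_format: str      # "individual", "group" or "combination"

# A hypothetical entry; all values are invented for illustration.
record = ProgramRecord(
    name="Example Program",
    description="Therapist coaches a parent during observed play sessions",
    average_duration="14 weeks",
    frequency_intensity="one 1-hour session per week",
    delivery_setting="clinic",
    target_population="parents of children ages 2-7",
    delivery_format="individual",
)
print(asdict(record)["name"])  # prints "Example Program"
```

Keeping each program as one record of consistent fields makes it straightforward to spot gaps (e.g., a missing target population) before the inventory is compiled.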
B. Evidence of Effectiveness
The last several columns are used to document the evidence of effectiveness for each program. This is an essential part of the PI because it shows jurisdictions whether the programs they fund (or programs very similar to the ones they fund) have been rigorously evaluated and, if so, their level of effectiveness. This information includes:
- Program evaluated in jurisdiction: This is to note whether the program has been evaluated within your jurisdiction (jurisdictions should cite the evaluation, if possible).
- Impact evaluation or performance monitoring: This column is to indicate if the program has been locally evaluated and what type of evaluation was conducted (e.g., impact evaluation or performance monitoring).
- Clearinghouse program name: What the clearinghouse calls the program, which may differ from the name used in your jurisdiction.
- Clearinghouse: The clearinghouse in which you found your program. In cases where your program appears in more than one clearinghouse, we advise choosing the clearinghouse whose description best reflects your program as implemented. You can then use the notes/comments column to indicate which other clearinghouses have reviewed the program.
- Link to clearinghouse program page: Copy and paste the URL of the clearinghouse's review of the program in your spreadsheet. If you are using the Results First Clearinghouse Database, you can access this URL by clicking the “Learn more” button on the bottom of the program page.
- Rating: The rating indicates a program’s demonstrated level of effectiveness. This rating can be the Results First Clearinghouse Database rating color (e.g., green/highest rated) or the actual rating given by the clearinghouse (e.g., effective). It is recommended to choose one rating type and use it consistently throughout the inventory. If no program match is found, you can leave this column blank or mark it as “not rated.”
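Taken together, the program-information and evidence-of-effectiveness elements can be laid out as a simple spreadsheet. The snippet below sketches one possible CSV layout; the column names are assumptions drawn from the lists above, not the official downloadable template, so adapt them to your jurisdiction’s customized version.

```python
import csv
import io

# Illustrative column names based on the data elements described above.
COLUMNS = [
    "Name", "Description", "Average duration", "Frequency/intensity",
    "Delivery setting", "Target population", "Individual or group-based",
    "Evaluated in jurisdiction", "Evaluation type",
    "Clearinghouse program name", "Clearinghouse",
    "Clearinghouse program page", "Rating", "Notes/comments",
]

buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=COLUMNS)
writer.writeheader()
# Missing columns are filled with empty strings, matching a partially
# completed inventory row awaiting vetting by program staff.
writer.writerow({"Name": "Example Program", "Rating": "not rated"})
print(buffer.getvalue().splitlines()[0])
```

Writing the header once and filling rows incrementally mirrors the recommended process: populate what is publicly available first, then follow up with program staff to fill the gaps.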
The Results First™ Clearinghouse Database
If there has not been a rigorous impact evaluation conducted locally on a program, jurisdictions can match their programs to the wider evidence base using research clearinghouses. Clearinghouses are a useful resource because they compile and summarize program evaluations and translate those findings into more user-friendly program ratings. There are nine nationally recognized clearinghouse databases that summarize program evaluation information across a variety of program areas. The Results First™ Clearinghouse Database is essentially a clearinghouse of clearinghouses – an online resource that brings together information on the effectiveness of social policy programs from those national clearinghouses.
Because each clearinghouse uses its own ratings terminology, the Results First™ Clearinghouse Database uses a color-coding system, which creates a common language that enables users to quickly see where each program falls on a spectrum from negative impact to positive impact.
Jurisdictions can look for their programs in the Results First™ Clearinghouse Database by taking the following steps:
Search for your program
Begin by typing in the program’s name or key words in the search bar of the Results First™ Clearinghouse Database. A dropdown menu will appear that auto-populates with up to 10 suggested programs that include your search term(s) in their titles. You can then refine your search using additional filters, such as Categories or Settings.
Review the program’s description and key details to determine whether it matches your program
Once you have found a program in the clearinghouse database that matches your program, determine whether the match is valid. To do this, first review the program description, including any mentions of average duration and frequency, as well as information about outcomes, target populations, settings and ages displayed in the database. Next click “Learn more.” This will take you to the clearinghouse’s specific program page, which contains more detailed information about the program and the evaluations reviewed by the clearinghouse that informed the program’s rating. Reviewing this information should help you determine whether the program in the database is indeed the same – or comparable – to your program.
Add clearinghouse and rating to the PI
Once you have matched a program to the database, populate the PI template with the appropriate clearinghouse-related details described above.
If a program is labeled as “insufficient evidence” (denoted by a black triangle), it should be considered “not rated” for the purposes of the PI.
In most cases, the nine clearinghouses in the Results First™ Clearinghouse Database rate programs similarly. However, in some cases, clearinghouses will assign different ratings to the same program. This is generally because the clearinghouses’ methodologies differ slightly from one another, including the research base used to arrive at the rating. For instance, CrimeSolutions.gov reviews a maximum of three studies, while Blueprints for Healthy Youth Development reviews all available studies.
In cases where your program has multiple and dissimilar clearinghouse ratings, we recommend that you compare the detailed information from each clearinghouse’s program page to find the best match (e.g., the clearinghouse whose description most accurately reflects your program as provided, including target population and setting). If this information doesn’t resolve the discrepancy, you will need to decide which rating you feel most comfortable assigning to your jurisdiction’s program.
You will likely be unable to match all programs to the database. This is expected because not every program has a rigorous research base, in part because your jurisdiction may fund innovative, homegrown programs that are addressing its unique needs and challenges. Programs that are not matched should be categorized as “not rated” on the PI unless you are able to cite high-quality evidence that shows that program’s effectiveness.
While the Results First™ Clearinghouse Database is a useful tool, it is not the only source for finding evidence. In fact, there are many high-quality clearinghouses and similar resources that are not part of the database. For example, the Community Guide from the Centers for Disease Control and Prevention focuses on programs that improve health and prevent disease; the Child Trends’ What Works database includes programs that aim to impact child or youth outcomes related to education; life skills; and social/emotional, mental, physical, behavioral or reproductive health; and the Campbell Collaboration promotes programs that focus on positive social and economic change. The Pew Results First initiative published “Where to Search for Evidence of Effective Programs,” a collection of such additional resources. However, before using any of these or other resources, please review their inclusion criteria to ensure they meet your standards for evidence.
V. Analyze and Share Your Findings
Now that you have gathered a wealth of information about your programs in one place, it is time to analyze and share your findings, as well as revisit your initial questions and scope of work to see what answers you have uncovered.
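A first-pass analysis can be as simple as tallying programs by their evidence rating to see how much of your portfolio is backed by rigorous evaluation. The sketch below assumes the inventory includes a rating per program; the data is invented for illustration.

```python
from collections import Counter

# Hypothetical inventory rows: (program name, rating). Real data would be
# read from the completed PI spreadsheet.
inventory = [
    ("Program A", "highest rated"),
    ("Program B", "second-highest rated"),
    ("Program C", "not rated"),
    ("Program D", "highest rated"),
    ("Program E", "not rated"),
]

# Count how many funded programs fall under each rating.
rating_counts = Counter(rating for _, rating in inventory)
for rating, count in sorted(rating_counts.items()):
    print(f"{rating}: {count}")
```

A summary like this can anchor a stakeholder presentation, for example by showing what share of funded programs are evidence-based versus unrated.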
We strongly encourage sharing the results of your PI with a wide variety of stakeholders, including legislators who allocate programmatic dollars, agencies that select programs for their constituents and service providers that work on-the-ground to deliver those programs.
The method for communicating the results of your PI to these different stakeholders will likely vary according to their interests and needs. For instance, jurisdictions have given presentations, produced internal reports with extensive data, released public reports with only select information, and/or provided customized briefings. Below are examples of ways jurisdictions have chosen to publicly share information from their PI:
- The Montgomery County Department of Correction and Rehabilitation in Maryland issued a press release and accompanying report detailing their findings.
- Santa Barbara County’s Probation Department posts PI reports on its county website.
- Minnesota Management & Budget publishes its inventories online as Excel documents and has also combined all of its inventories into one large, searchable online database.
- In Mississippi, the PEER Committee’s Performance Accountability Office publishes Issue Briefs, such as Opportunities for Improving the Outcomes of Juvenile Justice Intervention Programs at the Oakley Youth Development Center.
VI. Apply Your Findings
The ultimate goal of the PI is to apply the information gathered when making programmatic, policy and/or funding decisions. For example, jurisdictions have used findings from the PI to:
- Establish baseline information on program investments to enable comparisons and track progress;
- Set goals for increasing investments in evidence-based programs;
- Identify and invest in effective evidence-based programs that address population needs and close service gaps;
- Streamline program offerings to reduce duplication and improve implementation;
- Shift funds away from programs proved not to work or with limited evidence of effectiveness; and
- Identify key programs that would benefit from rigorous evaluation.
Some of the above applications can be executed within a single agency’s administrative control while others may require collaboration and knowledge sharing with other central groups, like a budget office or a legislative body.
The Pew Results First initiative has documented some of the uses in “How to Use the Results First Program Inventory,” as well as in the “Strategy: Review funded programs to identify what works” section of their Evidence-Based Policymaking: Program Assessment page.