Guidelines for Youth Outcome and Program Quality Data Collection: Pandemic and Virtual Learning
October 26, 2020
We are all working and learning through a challenging time! Many of the ways we actively and regularly participated in youth outcome and program quality data collection may not be achievable or realistic at this time. Yet, we know that understanding and measuring the impact of our work on youth outcomes and progress on continuous program improvement goals is still important. As with other tasks and priorities, data collection may need to shift or be altered during this unprecedented time. We’ve prepared some guidelines for data collection that take modifications and shifts in program design, including hybrid and virtual learning, into consideration.
Key Considerations for using Program Quality and Youth Outcome Assessments
It is always essential to ensure a close and direct connection between program activities delivered, staff practices, and the outcomes measured. Virtual programming may take place for shorter periods of time and cover fewer learning goals than traditional programming for your organization, so it is important to reflect that change in your measurement approach with regard to program quality observation and youth outcome data collection. Any data collection process should have an explicit purpose, and it should be clear to all how data will be used to inform program practice and quality.
Questions to consider:
- What are the outcomes you can directly impact in your current program delivery mode and content (in-person/virtual/hybrid)? Is there a clear connection between program goals, staff practice, programming delivered, and youth outcome goals?
- Will there be sufficient opportunities, time, and interaction with staff to perceive youth growth in targeted areas? For example, are there products, events, or activities that will allow for the demonstration of youth skill growth and staff practice change?
- Will attendance patterns allow for what you expect to be a sufficient number of youth participating in data collection? For example, data collection with five youth or only ten total days of programming will have limited value.
- Will observation of virtual program delivery allow sufficient time and opportunity to understand and assess the experiences that youth are having without causing disruption or stress for youth and staff?
- How can you make adjustments in data collection protocols to continue to ensure completion of assessments and privacy for participants?
Data Collection Approaches
Pre- and post-test: Collect data at the beginning and end of the program (e.g., the beginning and end of the school year, or the beginning and end of a session or module). During analysis, pre/post data are matched for each participant. This is the most common approach in typical years.
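The matching step above can be sketched as follows. This is a minimal illustration with made-up participant IDs and scores, not a prescribed workflow: it keeps only youth who completed both assessments and computes each youth's change on a target outcome.

```python
# Minimal sketch (hypothetical IDs and scores): match pre/post survey
# results by participant ID and compute each youth's change.

pre_scores = {"Y01": 3.2, "Y02": 2.8, "Y03": 4.0}   # pre-test outcome scores
post_scores = {"Y01": 3.6, "Y02": 3.1, "Y04": 3.9}  # post-test outcome scores

# Keep only participants who completed both assessments.
matched_ids = sorted(pre_scores.keys() & post_scores.keys())
changes = {pid: post_scores[pid] - pre_scores[pid] for pid in matched_ids}

mean_change = sum(changes.values()) / len(changes)
print(matched_ids)               # participants with both pre and post data
print(round(mean_change, 2))     # average growth across matched youth
```

Note that youth who miss either administration (Y03 and Y04 here) drop out of the matched analysis, which is one reason attendance patterns matter for this design.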
Minimum number of program hours: The shift to virtual/hybrid programming may have reduced program hours. Historically, youth program outcomes using a pre/post model have been investigated in program experiences of 50 or more hours. It may be informative to connect outcome data to youth attendance records to contextualize your findings.
Post-test only/Retrospective: This method is highly recommended as a data collection strategy that minimizes interruption and "survey or assessment stress" for leaders and participants. Administer only at the end of a module/supplement/curriculum or at the end of the school year. A post-test-only assessment still provides valuable insight into where youth are relative to targeted learning outcomes, and it can inform which staff and program practices to build on to support skill growth going forward.
Snapshot: This method is recommended as a means to get a sense of what is happening in your program at a single point in time. You would administer assessments just once, at whatever time during the session makes sense and is convenient for all participants. Like pre/post and post-test only data, snapshot data can also be used for continuous quality improvement (CQI).
Descriptive: Select a random sample of youth to assess or participate in surveys at the start and end point of the program and compare group means on the target outcomes.
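The descriptive approach can be sketched like this. The sample draw and the scores below are illustrative assumptions only; the point is that the comparison is between group means at the start and end points, with no individual-level matching required.

```python
import random
import statistics

# Minimal sketch (illustrative data): draw a random sample of enrolled
# youth, then compare the group mean on a target outcome at the start
# and end of the program.

enrolled = [f"Y{n:02d}" for n in range(1, 21)]  # 20 enrolled youth
random.seed(42)                                 # reproducible sample draw
sample = random.sample(enrolled, k=8)           # youth selected to survey

# Hypothetical outcome scores (1-5 scale) for the sampled youth.
start_scores = [3.1, 2.9, 3.4, 3.0, 2.8, 3.2, 3.3, 2.7]
end_scores = [3.5, 3.2, 3.6, 3.4, 3.1, 3.3, 3.7, 3.0]

start_mean = statistics.mean(start_scores)
end_mean = statistics.mean(end_scores)
print(f"group mean change: {end_mean - start_mean:+.2f}")
```

Because the comparison is at the group level, this design tolerates some turnover between the two administrations better than a matched pre/post design does.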
When observing virtual programs, consider the various ways in which youth and staff can interact with each other. Just because a youth's camera is off doesn't mean they are not engaged in an activity. Tools that assess quality program practices can guide programming changes/redesign for virtual and hybrid settings.
Youth Outcome Tools
For virtual learning experiences, focus only on youth outcome domains that are directly addressed by your current program delivery mode and content. Outcomes that fit previous program experiences are not necessarily the best fit to investigate during this pandemic programming time.
We highly recommend using youth surveys in a post-test only, retrospective, or snapshot design (one time or multiple times throughout the year, without pre/post comparison). This approach can reduce disruption while still providing helpful information to guide and adjust programming.
Most importantly, it is essential that data are collected in meaningful and authentic ways. Staff who are part of the data collection process need to be ready and able to engage in this work. Only high-quality data can be used to evaluate the efficacy and effectiveness of an OST program.
If you are an APAS tool user please