Guidelines for Youth Outcome and Program Quality Data Collection: Pandemic and Virtual Learning

October 26, 2020

We are all working and learning through a challenging time! Many of the ways we actively and regularly participated in youth outcome and program quality data collection may not be achievable or realistic at this time. Yet we know that understanding and measuring the impact of our work on youth outcomes and progress on continuous program improvement goals is still important. As with other tasks and priorities, data collection may need to shift or be altered during this unprecedented time. We’ve prepared some guidelines for data collection that take modifications and shifts in program design, including hybrid and virtual learning, into consideration.

Key Considerations for Using Program Quality and Youth Outcome Assessments

It is always essential to ensure a close and direct connection between program activities delivered, staff practices, and the outcomes measured. Virtual programming may take place for shorter periods of time and cover fewer learning goals than traditional programming for your organization, so it is important to reflect that change in your measurement approach with regard to program quality observation and youth outcome data collection. Any data collection process should have an explicit purpose, and it should be clear to all how data will be used to inform program practice and quality.

Questions to consider:

  • What are the outcomes you can directly impact in your current program delivery mode and content (in-person/virtual/hybrid)? Is there a clear connection between program goals, staff practice, programming delivered, and youth outcome goals?
  • Will there be sufficient opportunities, time, and interaction with staff to perceive youth growth in targeted areas? For example, are there products, events, or activities that will allow for the demonstration of youth skill growth and staff practice change?
  • Will attendance patterns allow for what you expect to be a sufficient number of youth participating in data collection? For example, data collection with five youth or only ten total days of programming will have limited value.
  • Will observation of virtual program delivery allow sufficient time and opportunity to understand and assess the experiences that youth are having without causing disruption or stress for youth and staff? 
  • How can you make adjustments in data collection protocols to continue to ensure completion of assessments and privacy for participants?

 

Data Collection Approaches 

  • Pre- and post-test: Collect data at the beginning and end of the program (e.g., the beginning and end of the school year, or the beginning and end of a session or module). During analysis, pre- and post-test data are matched for each participant (see the sketch after this list). This is the most common approach used in typical years.

  • Minimum number of program hours: The shift to virtual/hybrid programming may have reduced program hours. Historically, youth program outcomes using a pre/post model have been investigated in program experiences of 50 or more hours. It may be informative to connect data to youth attendance records to contextualize your findings.

  • Post-test only/Retrospective: This method is highly recommended as a data collection strategy that minimizes interruption and “survey or assessment stress” for leaders and participants. Administer the assessment only at the end of a module, supplement, or curriculum, or at the end of the school year. A post-test-only assessment still provides valuable insight into “where youth are” relative to youth learning outcomes, and it can help identify which staff and program practices to build on to support skill growth in programming going forward.

  • Snapshot: This method is recommended as a means to get a sense of what is happening in your program at any single point in time. You would administer assessments just once, at whatever time during the session makes sense and is convenient for all participants. As with pre/post and post-test-only data, snapshot data can also be used for continuous quality improvement (CQI).

  • Descriptive: Select a random sample of youth to assess or survey at the start and end points of the program, and compare group means on the target outcomes.
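
For teams that keep survey results in spreadsheets, the sketch below shows one way the matched pre/post comparison, the attendance linkage, and the descriptive group-mean comparison described above could look in practice. It is a minimal Python/pandas sketch only; the file names, the column names (participant_id, score, hours_attended), and the 50-hour grouping are illustrative assumptions, not part of these guidelines.

```python
# Minimal sketch of matched pre/post analysis, attendance linkage, and
# group-mean comparison. File names, column names, and the 50-hour
# threshold are illustrative assumptions.
import pandas as pd

# Hypothetical exports: one row per participant per administration.
pre = pd.read_csv("pre_survey.csv")         # columns: participant_id, score
post = pd.read_csv("post_survey.csv")       # columns: participant_id, score
attendance = pd.read_csv("attendance.csv")  # columns: participant_id, hours_attended

# Pre/post approach: match each participant's pre and post scores,
# then look at the average change for matched participants only.
matched = pre.merge(post, on="participant_id", suffixes=("_pre", "_post"))
matched["change"] = matched["score_post"] - matched["score_pre"]
print("Matched participants:", len(matched))
print("Mean change:", round(matched["change"].mean(), 2))

# Contextualize with attendance: compare change for participants who
# received more vs. fewer program hours (50 hours used as an example;
# participants with no attendance record are treated as 0 hours).
matched = matched.merge(attendance, on="participant_id", how="left")
matched["dosage"] = matched["hours_attended"].fillna(0).apply(
    lambda h: "50+ hours" if h >= 50 else "under 50 hours"
)
print(matched.groupby("dosage")["change"].agg(["count", "mean"]).round(2))

# Descriptive approach: compare group means at the start and end points
# without matching individual participants.
print("Start-of-program group mean:", round(pre["score"].mean(), 2))
print("End-of-program group mean:", round(post["score"].mean(), 2))
```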

 

Other Recommendations

Observational Tools

When observing virtual programs, consider the various ways in which youth and staff can interact with each other. Just because a youth’s camera is off doesn’t mean they are not engaged in an activity. Tools that assess quality program practices can guide programming changes and redesign for virtual and hybrid settings.

Youth Outcome Tools

For virtual learning experiences, focus only on youth outcome domains that are directly manifested in your current program delivery mode and content. Outcomes that were historically tied to previous program experiences may not be the best fit to investigate during this period of pandemic programming.

Youth Surveys

We highly recommend using youth surveys in a post-test only/retrospective or snapshot design (administered once or several times throughout the year without pre/post comparison). This approach may help cut down on disruption while still providing helpful information to guide and adjust programming.

Most importantly, it is essential that data are collected in meaningful and authentic ways. Staff who are part of the data collection process need to be ready and able to engage in this work. Only high-quality data can be used to evaluate the efficacy and effectiveness of an out-of-school time (OST) program.

 

If you are an APAS tool user, please contact us for tool-specific recommendations.

