MANY Hosts Discussion on Building Your Evidence Base

By
Katy White

In February of this year, MANY convened several of its members to host the first in a series of conversations around research, evaluation, and data-driven programming. These conversations are a response to a growing demand from funders for more evidence-based programming and a growing desire among practitioners to understand the data they collect on youth and how it can inform their practice. To start the conversation, MANY posed two broad questions, “What is working?” and “Where are the challenges?”, to understand where agencies are right now and what MANY can do to support its members and the youth services field moving forward. The following is a summary of that conversation.

WHAT IS WORKING?

At the Ozone House in Ann Arbor, Michigan, program leaders adopted a new method for sharing data with their staff as a way of incorporating data, and the value of that information, into daily practice. By posting a dashboard in the staff common area that tracks key data points and outcomes, all staff feel a sense of ownership over the information and gain a better appreciation of why they collect it. Through this process, staff began to identify metrics that went beyond those required by funders to ones that would truly inform program practice. As a center that serves runaway and homeless youth and also staffs a crisis hotline, Ozone House came to realize that tracking the number and types of calls to the hotline gave them a better picture of the youth and needs they could expect across the rest of their programming, so they could plan accordingly.

The YMCA of Greater Twin Cities (Minnesota) is looking at the broader picture of data and asking, “How do we tie all the work that is done together through common outcomes?” That is to say, they are looking beyond how to respond to funder requirements and exploring how to communicate program outcomes and impact back to the public. This idea is powerful on two levels. First, it prompts youth organizations to coordinate with each other and ask for data and outcomes that are valuable to the youth development field, not just to funders. Beyond that, it gives programs the resources and data to tell a broader story of how various programs and interventions impact youth and the community. This is a message that can (and should) be shared with agency staff, board members, funding stakeholders, policy makers, and the general public.

WHERE ARE THE CHALLENGES?

Across the board, agencies shared that a recurring challenge was finding the right balance between tracking funder-mandated metrics and identifying additional metrics that may better inform program practice and youth outcomes. In some cases, such as with runaway and homeless youth providers, this challenge is exacerbated when the data tracking systems for federal and private grants don’t line up, or ask for slightly different metrics, leaving staff doing double the work to track the same outcomes. Agencies agreed that if they could get funders to adopt a common set of metrics and data collection language, they would be able to free up more time for direct service and deeper evaluation.

Another challenge is finding funding for rigorous evaluation projects. Many funders request these types of evaluation plans in their proposals but don’t necessarily allocate funds to support the evaluation. Others require that a certain percentage of awarded funding be dedicated to evaluation, which takes away from direct service support. At the federal level, the Office of Juvenile Justice and Delinquency Prevention (OJJDP) is exploring a new option for funding direct service and research by bringing together two separate pots of funding to support the same overall project.

Over the next four years, MANY is partnering with the Center for Evidence Based Mentoring at the University of Massachusetts, directed by Dr. Jean Rhodes, and Innovation Research and Training (iRT), a social sciences research consulting firm based in Durham, North Carolina, that specializes in program development, critical evaluation, and outcomes, to conduct rigorous research on MANY’s mentoring model. MANY has been funded to provide mentoring enhancements for programs serving youth with incarcerated parents, while the Center for Evidence Based Mentoring and iRT run a corresponding formal research project on the enhancements. Should this type of partnership prove successful, OJJDP and other federal agencies may explore this method of partnered funding further in the future.

In terms of implementing evaluation projects, the most common challenge for MANY members pertains to the population of youth who are the subjects of the evaluation. Numerous programs serve youth for only a finite period of time, some as short as 30 to 60 days. While there are methods for tracking outcomes during the program, getting follow-up data on these youth a year or more later can be a serious challenge. These vulnerable youth are often transient and don’t tend to have reliable means of communication for staying in touch and conducting follow-up evaluations. More often than not, long-term outcomes come from an individual youth who reaches out to the program to share anecdotal results. These stories tend to be powerful and can be a great addition to an annual report, but they don’t hold the same rigor as a long-term evaluation. When there are opportunities for rigorous evaluation, many agencies (particularly those that serve runaway and homeless youth) find themselves questioning how to conduct an evaluation with a control group. Or rather, “How can you deny services to a population with such specific and urgent needs?”

Recently, the Family &amp; Youth Services Bureau (FYSB) shared a Research Roundup that looked at different ways researchers are trying to respond to these challenges when evaluating programs for runaway and homeless youth. The article gives an overview of three research projects that address how to use a control group, how to keep track of youth participants, and how to engage youth as partners in the evaluation. While these projects were done with runaway and homeless youth, the ideas and approaches could easily translate to other vulnerable youth populations.

WHAT'S NEXT?

MANY will continue to host small group discussions and public webinars that go deeper into data-driven decision making for program staff, and will bring together research partners to learn more from their perspective on research and evaluation with vulnerable youth populations. Stay tuned for more information on upcoming events and opportunities to join the conversation!