By: Adam Luecking
December 19th, 2022
Social Sector Hero Spotlights tell stories of exemplary social and public sector organizations that are making measurable differences in their communities. The following Brighter Futures and Child Youth and Families Services spotlight is an excerpt from “Social Sector Hero – How Government and Philanthropy Can Fund for Impact” by Adam Luecking. You can download the book for free here and read all 16 Social Sector Hero Spotlights.
Table of Contents
1. Inaccurate data, irrelevant data, inconsistent data
2. Uncovering data inconsistency pitfalls
3. Beth’s tips to avoid data pitfalls
4. The impacts of consistent language and autonomy
5. A bit more on consistency and flexibility
1. Inaccurate Data, Irrelevant Data, Inconsistent Data
Beth Stockton is an evaluator, community builder, and facilitator with The Jeder Institute and Collaboration for Impact. She has decades of experience advocating for child wellbeing in cities around the world, including Sydney, London, and San Mateo. In 2016, Beth realized that many child- and youth-focused funders have trouble measuring outcomes across their funding streams and programs. So, she reflected on her experiences in an article for the Clear Impact blog. In that piece, she recounted helping a regional family services agency in Australia create consistency in its performance measurement.
In the early 2010s in the state of New South Wales (NSW), NSW Family Services Incorporated (Fams) worked with 16 Brighter Futures Lead Agencies across NSW and 10 Child Youth and Family Services (CYFS) in the Nepean Blue Mountains District of the state. The goal? According to Beth, the funders wanted to see if it was possible to meaningfully measure shared outcomes across funding streams.
Prior to Fams’ involvement with Brighter Futures (an early intervention program for children who are at high risk of entering or escalating within the child protection system), the agencies were already collecting significant amounts of required data. However, they reported that the data was not telling them whether they were actually improving outcomes for their clients. The funders also reported that the data they were getting was inaccurate and, therefore, not useful for programming decisions and reporting to the State Treasury.
2. Uncovering Data Inconsistency Pitfalls
Fams led an outcomes-focused overhaul to help CYFS and Brighter Futures get on the same page. Beth had previously served as a Policy Consultant for Fams and was at that time serving as an Outcomes Measurement and Learning Development Specialist, delivering Results-Based Accountability (RBA) training and facilitation. Beth and her fellow Social Sector Heroes at Fams uncovered two primary reasons the data collection wasn’t working. These were issues common to the field in general.
Throughout her work, Beth found that measurement is most effective and meaningful when funders are consistent and flexible with grantees when measuring shared outcomes. To help funders get better at measurement, Beth outlines two main data inconsistency pitfalls, a process for developing effective Performance Measures, and methods for collecting and evaluating shared measures. I will summarize her findings below (you can read the full article here).
Data Inconsistency Pitfalls
According to Beth, funders should avoid the following big mistakes:
- Measures That are Too Specific – Even for agencies and programs focused on children and families, the nature of the work the organizations do is incredibly diverse. Usually, success comes from meeting a family at their point of need. When funders are too prescriptive about all measures that a grantee should collect, the measures do not always capture the primary area of need for the family.
- Unclear Language – When an organization is collecting data for the purpose of internal quality improvement, it is critical that it is clear on what data it is collecting and why. This requires clear language and communication across the organization, particularly with those involved in collecting the data. For data to be meaningful, it’s important that each organization interprets the measure the same way and collects exactly the same data.
3. Beth’s Tips to Avoid Data Pitfalls
Funders should embed processes into the work that support organizations in clarifying their purpose, identifying outcomes, and designing Performance Measures. A useful set of steps to guide this process includes:
- Train organizations and funders in Results-Based Accountability and build buy-in
- Collaborate and agree to a set of Performance Measures that will best represent the outcomes that you are trying to achieve
- Discuss the language of each measure to ensure that every organization understands what data they are collecting and why
- Recognize the limitations caused by averaging data across multiple organizations using different data collection tools/systems
- Focus on each organization’s individual data for quality improvement planning
- Use the Results-Based Accountability framework for ongoing quality improvement
- Support each organization to capture data regularly and enter the data into a web-based performance management program
- Continue to have conversations about data collection to ensure that it remains useful
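The caution above about averaging data across organizations can be made concrete with a short sketch. The numbers below are hypothetical, and the example assumes each organization reports a comparable "families with improved outcomes" count, which, as Beth's article suggests, is rarely the case when organizations use different data collection tools and definitions:

```python
# Hypothetical illustration: why averaging outcome rates across
# organizations of different sizes can mislead a funder.

# (organization, families with improved outcomes, families served)
orgs = [("Agency A", 90, 100), ("Agency B", 5, 10)]

# A simple average of each organization's rate treats both equally,
# even though Agency A served ten times as many families.
rates = [improved / served for _, improved, served in orgs]
simple_average = sum(rates) / len(rates)  # (0.90 + 0.50) / 2 = 0.70

# The pooled rate weights each organization by the families it served.
pooled_rate = sum(i for _, i, _ in orgs) / sum(s for _, _, s in orgs)

print(f"Simple average of rates: {simple_average:.2f}")
print(f"Pooled rate:             {pooled_rate:.2f}")
```

The two figures diverge (0.70 versus roughly 0.86 here), which is one reason the steps above emphasize focusing on each organization's individual data for quality improvement rather than on a blended number.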
Beth believes that funders can also meaningfully measure shared outcomes by limiting the number of mandatory shared measures and giving grantees the autonomy to develop additional measures meaningful to their own quality improvement. She noticed that when partner organizations are allowed a degree of autonomy to measure, analyze and report back on outcomes specific to their programs, they are more invested in using data for genuine learning and quality improvement, as opposed to data for reporting purposes only.
4. The Impacts of Consistent Language and Autonomy
What were the results of utilizing consistent language and making space for autonomy in measures? Beth shares that organizational leaders with Brighter Futures and CYFS are now driven to use outcomes measurement as a tool to:
- Clearly redefine why their organization exists and why they do what they do
- Support the organization to ‘show up differently’ and work innovatively, with sights set on outcomes as opposed to counting activities
- Guide their organization to think outwardly, striving to better understand how their work contributes to the broader aspirational goals of the community they serve
5. A Bit More on Consistency and Flexibility
This blog is an excerpt from chapter six of Social Sector Hero. To summarize the chapter and Beth’s advice above, consistency, particularly in Performance Measures and Performance Reporting systems, can help you compare the effectiveness of different investments. This is difficult, but one way to accomplish consistency is to create a few standardized measures for similar programs.
Every organization, community, and program is unique, so it’s important that partners have the flexibility to design appropriate strategies that reflect their unique circumstances, skills, and resources. They should have the opportunity to report on optional unique measures and tell their detailed data story. Consistency and flexibility allow you to realize the maximum potential of your investments, tell your fellowship’s story of aggregate impact, and highlight the differences, roles, and accomplishments that make you all indispensable to your stakeholders.
You can learn more about the concepts of consistency and flexibility in grantmaking in my blog here.