Funders deserve to know that the work that they are funding is making a difference to the people they serve. But is it possible to measure shared outcomes effectively and meaningfully across funding streams?

When collected and used effectively, data can help funders work collaboratively with organisations to:

  1. Measure and clearly articulate whether they are making the intended difference for the people they serve;
  2. Implement ongoing quality improvement at the programmatic level; and
  3. Inform stakeholders and the community about where and why programming is or is not working, and how it is being improved.

But in order for funders to collect meaningful data across numerous programs, a significant amount of preliminary work must go into developing appropriate and consistent performance measures with service providers. This groundwork leads to more meaningful outcomes measurement, quality improvement, and impact for clients and the community.

NSW Family Services has been working with 16 Brighter Futures Lead Agencies across NSW and 10 Child Youth and Family Services (CYFS) in the Nepean Blue Mountains District of NSW to see whether it is possible to measure shared outcomes across funding streams meaningfully. These organisations offer a case study in how the Results-Based Accountability framework can lead to more useful data and better outcomes for clients. This work has demonstrated how the framework can give organisations the data they need to assess, analyse, inform and continuously improve their individual programs.

Prior to our work with Brighter Futures and CYFS, these organisations were already collecting significant amounts of data as required by their funder. However, they reported that the data was not giving them what they needed to meaningfully capture whether outcomes were being achieved for their clients. The funders also reported that the data they were receiving was inaccurate and therefore not useful enough to inform programming decisions and report effectively to Treasury.

Through the outcomes-focused work we’ve been doing with these services, we’ve been able to uncover two primary reasons the data is proving ineffectual for both parties. Keep in mind: these are not issues specific to these organisations alone, but are common to the field in general. It is worth keeping these data “pitfalls” in mind when developing performance measures in your own organisation.

1. Measures that are too specific

These programs work directly with children and families. However, the nature of the work they do is incredibly diverse, and their success comes from meeting the family at their point of need. Both programs are asked to collect very specific outcome measures, and these measures do not always capture the primary area of need for the family.

For example, CYFS programs are required to ask all families, ‘Have your parenting skills improved as a result of the program?’ While these services do support parents on their parenting skills, there are often a number of presenting factors for the family (often at the crisis level) that must be addressed first: domestic violence, drug and alcohol abuse, inadequate housing, and financial issues. Arguably, by addressing these issues and demonstrating that outcomes are being achieved in each specific area of need, the program leaves children better off. However, asking parents whether their parenting skills have improved as a result of the program does not accurately capture how the program has helped in any of these other areas of need.

2. Unclear language

When an organisation is collecting data for the purpose of internal quality improvement, it is critical that they are clear on what data they are collecting and why. This involves clear language and communication across the organisation, particularly with those involved in collecting the data. This can be challenging. In the case of CYFS, the challenge is magnified, with the funder collecting data across hundreds of organisations throughout NSW. The funder wants to use this data to demonstrate the difference that the program is making for children and families accessing the service across the state. However, in order for this data to be meaningful, each organisation must interpret the measure the same way and must collect the EXACT same thing.

As an example, Brighter Futures organisations are collecting “% of families who have achieved case plan goals at exit.” This is difficult to do, as this measure is largely subjective. What constitutes case plan goals being achieved? Does this mean all case plan goals? Most case plan goals? If they are working with families on numerous complex issues such as mental health, child safety or drugs and alcohol, can we measure the quality of the program based on whether these difficult issues have been fully addressed in the time frame that they are with the program? This is very different to the families who need a referral for housing or support in child behaviour management.
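To make the ambiguity concrete, here is a minimal sketch (with entirely made-up family records) showing how two defensible readings of “achieved case plan goals at exit” produce two different headline percentages from the same caseload:

```python
# Hypothetical illustration only: the same exit records scored under two
# readings of "% of families who have achieved case plan goals at exit".
families = [
    {"goals_set": 5, "goals_achieved": 5},  # all goals met
    {"goals_set": 4, "goals_achieved": 3},  # most goals met
    {"goals_set": 6, "goals_achieved": 2},  # partial progress on complex issues
    {"goals_set": 1, "goals_achieved": 1},  # single referral, met
]

# Reading A: "achieved" means every goal in the case plan was met.
strict = sum(f["goals_achieved"] == f["goals_set"] for f in families)

# Reading B: "achieved" means at least half of the goals were met.
lenient = sum(f["goals_achieved"] >= f["goals_set"] / 2 for f in families)

print(f"Strict reading:  {strict / len(families):.0%}")   # 50%
print(f"Lenient reading: {lenient / len(families):.0%}")  # 75%
```

Unless the funder's definition pins down which reading applies, two organisations with identical caseloads can legitimately report very different results.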

How do we resolve these issues?

Taking the time to define performance expectations and implement consistent and appropriate measures across funding streams is a difficult, but necessary, part of preliminary work. Our work with Brighter Futures and CYFS has helped us articulate a useful set of steps to guide this process and lead to the development of meaningful and useful performance measures:

  1. Train organisations and funders in Results-Based Accountability™ so that they understand it and recognise the value of outcomes measurement to clients, organisations and funding bodies.
  2. Collaborate on and agree to a set of performance measures that best represent the outcomes your program is trying to achieve.
  3. Discuss the language in each measure to ensure that every organisation is on the same page in terms of what is being collected.
  4. Recognise the limitations of averaging data across multiple organisations using different data collection tools.
  5. Focus on each organisation’s individual data for quality improvement planning.
  6. Use the Results-Based Accountability framework internally for ongoing quality improvement of programs.
  7. Make sure each organisation agrees to capture data regularly and enter it into a web-based performance management program (such as the Clear Impact Scorecard). Use this tool to analyse data and develop actions to improve programming.
  8. As you analyse the data, continue to have conversations about data collection to ensure that it remains useful.
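The caution in step 4 is worth illustrating. Using made-up figures, this sketch shows one common trap: a simple average of each organisation’s rate can differ sharply from the pooled rate across all families served, because it ignores how many families each organisation works with.

```python
# Hypothetical illustration only: figures are invented for this sketch.
orgs = [
    {"name": "Org A", "families": 10,  "achieved": 8},    # 80% rate
    {"name": "Org B", "families": 200, "achieved": 100},  # 50% rate
]

# Simple average of the two organisations' rates:
avg_of_rates = sum(o["achieved"] / o["families"] for o in orgs) / len(orgs)

# Pooled rate across every family served:
pooled = sum(o["achieved"] for o in orgs) / sum(o["families"] for o in orgs)

print(f"Average of rates: {avg_of_rates:.1%}")  # 65.0%
print(f"Pooled rate:      {pooled:.1%}")        # 51.4%
```

When organisations also use different collection tools and definitions, the combined figure becomes even less reliable, which is why individual organisational data (step 5) is the safer basis for quality improvement.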

We also believe that it may be possible for funders to collect shared measures meaningfully by:

  1. Limiting the number of mandatory shared measures, giving service providers room and autonomy to develop additional measures meaningful to their own quality improvement;
  2. Consulting with service providers on what data will be collected;
  3. Communicating the purpose of the data clearly;
  4. Making sure that everyone is clear on what they are collecting;
  5. Providing clear definitions to ensure all organisations are collecting the same thing; and, most importantly,
  6. Capturing from service providers the story behind the data: their partners, what works to do better, and planned actions.

When these steps are implemented effectively, meaningful and useful performance measures are just one of the benefits. As a result of this work, organisational leaders with Brighter Futures and CYFS are now also driven to use outcomes measurement as a tool to:

  1. Clearly redefine WHY their organisation exists and why they do what they do;
  2. Support the organisation to ‘show up differently’ and work innovatively, with sights set on outcomes as opposed to counting activities; and
  3. Guide their organisation to think outwardly, striving to better understand how their work contributes to the broader aspirational goals of the community they serve.

A note about autonomy:

There are countless successful programs out there achieving valuable outcomes for the communities they serve. Unless organisations are allowed a degree of autonomy to measure, analyse and report back on the outcomes of their programs, we will never have a clear understanding of what is working and what isn’t. What works will vary by program, region and organisation, and CYFS and Brighter Futures funded organisations are no exception. If we continue to force a one-size-fits-all approach to outcomes measurement, we will minimise the benefits of the work and never truly reflect the impact that the funding is having on the communities being served.


About the author:

Kate Tye

Kate Tye is a consultant with Clear Impact Australia. She works closely with a wide range of non-government organisations, providing them with training and coaching around the concepts of RBA™ and how to embed the principles into their overall strategy and service delivery.