5 ways to make measurement of youth provision more meaningful

2021-06-10

For many, measurement in youth provision is a deeply contentious issue. However, I’d argue that, when done well and in the right circumstances, measurement can become a meaningful part of working with young people that ‘goes with the grain’ of practice.

Over the past few years I’ve worked on two projects that are ambitious in the way they try to tackle the known issues in evaluating youth provision.

These projects have stretched and challenged my thinking about what good measurement looks like. Based on what I’ve learnt, these are the five ways I think we could make measurement of youth provision more meaningful:

  1. Prioritise the things that lead to change over the change itself

This isn’t a new idea, but it bears repeating: if we have a clear sense of what good practice looks and feels like, focusing on doing this consistently well should lead to positive changes for young people. Focusing on quality has multiple benefits:

  • It provides actionable insights that can lead to immediate improvements in provision.
  • It doesn’t require the time-consuming and logistically challenging process of tracking young people over time – it’s focused on the ‘here and now’.
  • It is absolutely aligned with the values and ethics of youth work.

In the YIF, we used the Programme Quality Assessment (PQA) to measure quality. This is an evidence-informed tool based on a framework of high-quality practices that are likely to lead to the development of young people’s social and emotional skills. We found that, in our YIF sample, young people in higher quality provision (as measured by the PQA) made greater progress on outcomes related to social and emotional skills, social connectedness and well-being than those in lower quality provision. Young people’s feedback was also more positive in higher quality settings.

So, focus on doing things well and good things will follow.

  2. Rethink outcomes measurement

While a focus on quality practice remains fundamental, there are times when understanding outcomes is important, and the relationship between quality and outcomes requires further testing. However, we need new approaches to collecting this data. Firstly, we need to focus on the right conditions in which to collect outcomes data. Our work suggests the following are optimal:

  • Provision is intentionally designed to improve specific outcomes (e.g. SEL skills). If activities are designed to be purely recreational, for example, it is neither useful nor appropriate to attempt to measure SEL outcomes for young people. A key indicator here is whether youth organisations feel they need to pause or adapt regular activities to ‘get the SEL bit in’.
  • The young people attending provision are experiencing sufficient exposure to provision to improve intended outcomes. For SEL skills, this is likely to be at least regular weekly attendance over a minimum of two months. Attending one-off activities will not lead to SEL development.
  • It is possible to practically track young people’s outcomes over time. Working with smaller subsets of young people (rather than an entire group or cohort) may make this more feasible.
  • Relationships of trust and honesty exist with the young people, which includes explaining why gathering this data really matters and what will be done with it.
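As a rough sketch, the conditions above could be encoded as a simple eligibility check. The exposure threshold (weekly attendance for at least two months, i.e. roughly eight weeks) comes from the text; the function and parameter names are my own assumptions, not a YIF rule:

```python
# Illustrative check of whether the conditions support outcomes
# measurement. Thresholds and names are assumptions for the sketch.

def outcomes_measurement_appropriate(intentional_design: bool,
                                     weeks_attended: int,
                                     sessions_per_week: float,
                                     tracking_feasible: bool,
                                     trusting_relationship: bool) -> bool:
    """Return True only when all four conditions are plausibly met."""
    # Exposure rule of thumb: weekly attendance for at least two months.
    sufficient_exposure = weeks_attended >= 8 and sessions_per_week >= 1
    return (intentional_design and sufficient_exposure
            and tracking_feasible and trusting_relationship)

# A one-off activity fails the exposure condition, however well designed:
print(outcomes_measurement_appropriate(True, 1, 1, True, True))  # False
```

The point of writing it down this way is that every condition is a hard gate: if any one fails, outcomes measurement is unlikely to be useful.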

Secondly, we need a more feasible and sensitive measurement process. YIF grant holders told us that young people don’t like filling in multiple questionnaires, will tell them ‘what they want to hear’, and that responses can be affected by mood or by what has happened just before completing the questionnaire. To tackle these issues, the Centre is piloting a new approach called the Adult Rating of Youth Behaviour (ARYB). This is an observational rating of SEL outcomes that reduces the burden on young people to fill out questionnaires and leans into the youth work process of building relationships with young people, including understanding their behaviours and relationships. Observation of behaviour is also more sensitive to change, and because the ARYB is based on observations of a pattern of behaviour over approximately two weeks, it is less affected by mood or situation.

  3. Involve young people

Youth organisations care deeply about working in partnership with young people, rather than making decisions on their behalf. This is a fundamental feature of youth work. However, our data suggests that there is room for improvement in listening to, and acting upon, feedback from young people. One way to do this is through the systematic collection of feedback data using a consistent set of feedback questions. This was a really successful part of the YIF approach, with organisations reporting that they found it straightforward and useful. Using a common set of questions enables benchmarking and comparison across settings, building on the YIF data dashboard (see ‘Feedback’ tab).
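As an illustration of what a common question set enables, here is a minimal benchmarking sketch: each setting’s mean score on one shared feedback question is compared with the mean across all settings. The setting names and scores are invented for the example:

```python
from statistics import mean

# Invented scores on one shared 1-5 feedback question, per setting.
responses = {
    "Setting A": [4, 5, 4, 3],
    "Setting B": [3, 3, 4, 2],
}

# Benchmark: mean score across every response in every setting.
benchmark = mean(s for scores in responses.values() for s in scores)
setting_means = {name: mean(scores) for name, scores in responses.items()}

for name, m in setting_means.items():
    print(f"{name}: {m:.2f} ({m - benchmark:+.2f} vs benchmark {benchmark:.2f})")
```

Because every setting asks the same question on the same scale, the comparison is meaningful; with bespoke questions per setting it would not be.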

This is not intended to replace the qualitative approaches that many organisations currently use to learn about young people’s experiences; rather, it should sit alongside them, complementing the rich detail that qualitative data provides.

An updated set of feedback questions is available in 'Measuring the quality and impact of open access youth provision' (see Appendix B).

  4. Build a holistic picture using different types of data

‘What works?’ might be an important question, but ‘what works, for whom, in what conditions and how?’ is much more useful. The only way we can answer these questions is by combining different data sets. In the YIF evaluation, we focused on five types of data:

  • Beneficiary – who was attending provision (e.g. age, gender, ethnicity)
  • Engagement – what activities were young people attending and how were they engaging with them (e.g. how often and for how long)
  • Feedback – systematic feedback from young people about their experiences
  • Quality – organisations’ observational self-assessment data on their provision using the PQA
  • Outcomes – the difference that YIF-funded provision made to young people across outcomes related to social and emotional learning, social connectedness, and wellbeing
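To make the idea of linked data concrete, here is one way the five data types could be combined into a single record per young person per setting. This is a hypothetical sketch; the field names are illustrative, not the YIF data schema:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ProvisionRecord:
    # Beneficiary: who is attending
    participant_id: str
    age: int
    gender: str
    # Engagement: how often and for how long
    sessions_attended: int
    weeks_engaged: int
    # Feedback: mean score on a shared question set (1-5), if collected
    feedback_score: Optional[float]
    # Quality: the setting's PQA self-assessment score, if available
    pqa_score: Optional[float]
    # Outcomes: change on an SEL-related measure between two time points
    sel_change: Optional[float]

record = ProvisionRecord("yp-001", 15, "female", 12, 10, 4.2, 3.8, 0.3)
print(record)
```

Holding all five types against the same identifiers is what makes it possible to ask questions across them, rather than about each in isolation.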

Whilst these datasets were all useful individually (see points 1 and 3), the real strength of the approach was that we could look at the relationships between these data sets to understand what conditions are more likely to contribute to impact for young people.

This approach allows us to ‘disaggregate’ data (i.e. break it down into component parts), which means we can really pinpoint where change is happening (or not). For example, in the YIF evaluation, one of the outcomes we looked at was ‘My own efforts are what will determine my future’ (related to the outcomes domain of ‘self-confidence and personal locus of control’). Overall, we found no difference between the progress made by the YIF cohort and a comparison group on this outcome. However, when we dug deeper, we found that those in higher quality provision (as measured by the PQA) made greater progress on this outcome than those in lower quality provision. Essentially, this suggests that not all youth provision supports this outcome, but provision focused on the high-quality staff practices set out in the PQA does.
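The disaggregation step described above can be sketched like this. The records, scores and the quality cut-off are all invented for illustration; the point is only the mechanics of splitting an outcome by a quality measure:

```python
from statistics import mean

# Invented outcome-change scores with the quality rating of each
# participant's setting.
records = [
    {"pqa_score": 4.2, "outcome_change": 0.30},
    {"pqa_score": 4.5, "outcome_change": 0.25},
    {"pqa_score": 2.8, "outcome_change": 0.05},
    {"pqa_score": 3.0, "outcome_change": -0.02},
]

PQA_CUT_OFF = 3.5  # assumed threshold for 'higher quality'

higher = [r["outcome_change"] for r in records if r["pqa_score"] >= PQA_CUT_OFF]
lower = [r["outcome_change"] for r in records if r["pqa_score"] < PQA_CUT_OFF]

# Aggregated, the overall mean can hide the difference; disaggregated,
# the quality effect shows up.
print(f"overall mean change: {mean(r['outcome_change'] for r in records):.3f}")
print(f"higher quality:      {mean(higher):.3f}")
print(f"lower quality:       {mean(lower):.3f}")
```

In this toy data the overall average looks unremarkable, while the split by quality reveals that nearly all the progress sits in the higher-quality group.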

  5. Collaborate

Finally, a shared approach to measurement, where organisations use common ways of defining and measuring quality and impact, has many benefits, including:

  • Generating a shared dataset that allows organisations to compare their scores with those of their peers
  • Improving understanding of collective quality and impact by building a sector-wide picture. This can be used to inform and shape funding and policy decisions, as well as the development of provision for young people
  • Supporting better learning across the sector.

What next?

You can read more about what we learnt from the YIF about measuring youth provision in ‘Measuring the quality and impact of youth provision’ and about challenges in implementing this approach in ‘YIF Insight Paper 6: Looking back, looking forward: Lessons learnt from conducting a shared evaluation of open access youth provision’.

The Centre for Youth Impact is piloting a suite of measures, including the PQA and ARYB, that are aligned with practice and build on learning from the YIF and other projects. Get in touch if you’d like to find out more.