How Can We Reconcile Evaluation with Politics?
by Ben Rickey, Agence nouvelle des solidarités actives (ANSA), France
As a social impact analyst, when was the last time you talked to a politician? Last week? Last month? Last year? Never? “Why bother?” you may ask. We all know that politicians and their advisors are focused on proposing shiny new solutions to social problems. They wouldn’t be interested in the messy world of evidence and data.
In fact, politicians have never needed evaluators more. With public resources under intense pressure, they need to ensure that money is being directed to programmes and services with the best chance of achieving impact. Many are also aware that certain traditional approaches to addressing social exclusion are now largely ineffective. And a large amount of evidence exists – indeed, significant amounts of money have been invested in evaluating the effectiveness of social programmes. However, the evidence produced is rarely acted on, as a number of organisations in the UK, France and beyond have argued. As a result, social problems are not addressed as effectively as they could be. To put it bluntly, people suffer.
Even when evidence is available, and acted upon, the process of scale up is far from guaranteed. Take the impact of mental health problems on employment in the UK. One person in six in the UK suffers from a mental health problem at any one time – most from anxiety or mild depression. The rate is much higher amongst those out of work, and 724,000 people claim employment and support allowance (ESA) because of mental and behavioural disorders. They need specialised support if they are to get back to work. One approach to employment support – Individual Placement and Support (IPS) – has been shown to be by far the most effective. Numerous evaluations have been carried out in the US over more than a decade. These show that IPS gets 40-60% of beneficiaries back to work, compared with around 15% for traditional support services. Years of lobbying by charities, in particular by the Centre for Mental Health, helped convince UK policy makers that IPS would be the best approach to helping people with mental health problems back to work.
However, years later “the spread of IPS services is still patchy… almost half of England’s secondary mental health services still have no IPS workers or teams in place.” In short, “the option of IPS is still not available for large numbers of people who would like help and support in finding suitable work.” In theory, a scaling up of IPS across the UK could help tens of thousands more people with mental health problems back to work. Instead, they remain inactive. This case study shows that it is difficult not only to build an evidence base for a programme, but also to scale it up.
This case study and discussions at the SIAA conference suggest there are fairly distinct stages involved in identifying and then scaling up effective programmes, each requiring real attention and investment:
Research & Development. This is the stage where promising innovations are identified, then evaluated using robust methods – like Randomised Controlled Trials. This stage takes time, so it doesn’t lend itself to the involvement of national politicians, who are largely beholden to four- or five-year electoral cycles. This means independent funding (from foundations, research institutes, local authorities) is often essential. These programmes often emerge out of a long-term partnership between a delivery partner and an evaluator, who together test it at small scale.
Promotion. Once an intervention is clearly proven, a real effort is needed to promote it, and then scale it up. To do so, its champions need to get the attention of politicians – as one conference participant put it, the programme has to “solve a politician’s problems.” In practice, this means three things:
1. Politicians need to identify the problem as one deserving attention. This often requires years of lobbying, as Richard Layard’s work to turn Mental Well-Being into a public policy issue shows.
2. They need to have access to data on what works. One way of doing that is through ‘clearinghouses’ of evidence-based programmes. These are essentially online lists of programmes, with details of how each one works, how it has been evaluated and with what results. Around 30 exist, including Blueprints in the US and the Education Endowment Foundation in the UK.
3. Simply providing data on effectiveness often isn’t enough. Politicians often need an extra push from the key actors in the sector to adopt the most effective programme(s). A consensus conference – an approach used in France on issues ranging from homelessness to prisoner rehabilitation – is one approach. Long-term lobbying by charities, academics or think tanks is another.
Scale up. Finally, programmes need to be scaled up in a planned way, with training and capacity building to ensure they maintain their efficacy at scale. This process also requires continuous monitoring – and sometimes lobbying – to ensure they stay on track. The Centre for Mental Health’s work on the scale up of IPS in the UK is a particularly good example of this.
In short, as social impact analysts, the onus is on us to show politicians which programmes work and then help them scale these up. There are many ways to do this – evaluate a promising innovation, start a new clearinghouse, or campaign for funding for an effective programme. But this isn’t just about practical actions; it’s about a fundamentally new definition of our role. We’re talking about a shift away from the relative isolation and independence of analysis, and towards the messy world of politics. For some this will feel uncomfortable, even compromising. But in the long run it is the only way to get our voices heard in the hubbub of policy debate.
Ben Rickey is project manager at the Agence nouvelle des solidarités actives (Ansa) in France. Since 2006, Ansa has overseen the implementation of numerous social experiments, including the Active Solidarity Income (RSA).
This article was first published in Beyond Measurement: Reflections on SIAA annual conference 2013. This publication provides reflections on the workshops, hotspots and interactive activities that took place at the SIAA annual conference on 10th December 2013, and highlights the discussions and ideas that came out of the day.