A Clash of Mindsets: Finding a middle way between different impact methodologies. Part 1
by Joy MacKeith, Director, Triangle
In the new discipline of social impact analysis there is a range of methodologies, and there can be tensions between the different approaches. This article sets out three key tensions and offers some ideas on what the middle ground between these competing approaches might be.
1. There can be a tension between objective measures (things that can be directly observed, like housing or offending) and subjective measures (things that cannot be directly observed but have an impact on directly observable behaviour, such as confidence or the quality of relationships).
Middle ground: Whilst objective measures are attractive to many, there is evidence that subjective measures link more strongly to sustainability of outcomes (1) so the most robust approaches are those that use both. It is possible to bring objectivity into indirect outcomes through tools for standardising professional judgement or subjective report, for example the Outcomes Star. (2)
2. There can be a tension between standardised systems, which provide a shared framework across many different projects or organisations and can be aggregated, and case-by-case approaches, which respond to the uniqueness of each project or person and reflect the values and language of those being measured.
Middle ground: Frameworks such as Big Society Capital’s Outcomes Matrix allow case-by-case approaches to be analysed within a standardised system. Tools like the Outcomes Star, which is adapted for different client groups, also strike a balance between standardisation and tailoring.
3. There can be a tension between experimental approaches, which simplify complex systems by isolating one variable and identifying its impact (e.g. RCTs), and exploratory approaches, which investigate the relationships between variables within the complex system.
Middle ground: Experimental approaches can produce very persuasive evidence, but they are expensive and can produce misleading results if the relationships between the different variables in the complex system are not fully understood before the experiment takes place. These approaches are appropriate when sufficient exploration has already been carried out and in very stable environments. When this is not the case, exploratory approaches are cheaper and can produce richer insights into cause and effect.
Underlying these tensions may be different paradigms that shape how the analyst sees the world and values different forms of data.
| Positivist Paradigm: There is an absolute reality | ‘Interpretivist’ Paradigm: Reality is subjective |
| --- | --- |
| Change in external, observable circumstances and behaviour is the most important and valid | Change in subjective experience is most important, and subjective measures are equally valid |
| Human systems can be broken down into linear cause-and-effect relationships. RCTs are rigorous and the best way of getting at objective truth | Human systems are too complex to be meaningfully broken down in that way – better to try to understand the dynamics within the system |
(1) B. McNeil, N. Reeder & J. Rich (2012) A Framework of Outcomes for Young People. Young Foundation.
(2) J. MacKeith (2011) The Development of the Outcomes Star. Housing, Care and Support, 14(3).
G. Dickens et al. (2012) Recovery Star: validating user recovery. The Psychiatrist Online, 36.
L. Harris & S. Andrews (2013) Implementing the Outcomes Star well in a multi-disciplinary environment. Australia: RMIT University.
York Consulting (2013) Family Star Evaluation. Family Action.
Joy MacKeith is a founding director of Triangle, a social enterprise supporting organisations to take an outcomes approach to their work. She is co-author of the family of Outcomes Star tools and leads on the development of the Outcomes Star methodology including validation, quality assurance and working with researchers.
This article was first published in Beyond Measurement: Reflections on SIAA annual conference 2013. This publication provides reflections on the workshops, hotspots and interactive activities that took place at the SIAA annual conference on 10th December 2013, and highlights the discussions and ideas that came out of the day.