The Dead Hand

The document discusses the evolution of evaluation practices, emphasizing the need to move beyond low-trust accountability models towards more humanistic and democratic approaches. It highlights the importance of understanding the context and experiences of individuals involved in social programs, particularly in the Swindon initiative, which seeks to align professional services with family realities. The proposed evaluation methodology aims to capture diverse perspectives and support critical reflection while honoring the values and language of the program.


‘Livening the dead hand of evaluation’


Evaluation and the Swindon project
Professor Saville Kushner

Recovering evaluation: No wonder it feels like this. In recent years, evaluation has been
largely confined to the disciplinarian, low-trust accountability end of an otherwise rich spectrum of
possibilities. For 30 years after its invention in the 1960s, programme evaluation spawned a broad
palette of humanistic options for understanding society’s striving to improve citizen welfare. So we
have Democratic Evaluation, Affirmative Evaluation, Responsive, Utilisation-Focused, Participatory
and Personalised Evaluation, Dialogic, Real-World and many other models that have too easily been
displaced by state regulation and inspection. In fact, today we have little programme evaluation as
such – what we mostly have is impact assessment and quality control – often useful, but insufficient.
What was evaluation, and what might it be?

We need to think more expansively about what it means to generate information on social
programmes in a context of democracy and rights. Lee Cronbach, one of the towering figures who
invented the discipline of evaluation, opened his seminal 1980 book with a manifesto for evaluation
whose first clause was:

“Evaluation is a process by which society learns about itself.”

He argued that the only thing worth evaluating is human context – the determining factor.

Democratic and qualitative evaluators were commissioned by governments in the UK and the US
that were more anxious to understand change than to exert control. They took over the instruments
of social science to fashion an approach to programme analysis that was collaborative, that recast
theory as pragmatic (i.e. ‘practical theory’), and that insisted on case study and narrative as proper
and accessible portrayals of complexity. Evaluation might be an external or an internal function – the
important thing was that it be impartial and independent. And they sought to submit their evaluative
enquiries to the judgement of the people being evaluated. “People own the data over their own
lives,” they wrote. It was unfair for people to have evaluation done to them; but it was natural to
capture people’s own volunteered judgements of quality. Evaluators worked with the grain of how
people ascribe value. Robert Stake, another of those founders, even called for evaluators to capture
“the mood and the mystery” of social programmes, and wrote an influential (one-page) paper
entitled ‘On Looking and Seeing’ – measurement instruments are good at seeking out, but too often
fail to ‘see’ quality.

A third of these founders, Barry MacDonald, set out the terms for Democratic Evaluation –
evaluation as an information service to the citizenry and its representatives. He followed this up with
a contribution entitled ‘The Portrayal of Persons as Evaluation Data’ – arguing that understanding

innovation and change required, sine qua non, understanding people in terms of their life histories
and understanding the interplay between their values and interests. “No adequate portrayal of a
programme is possible which does not portray the key personalities involved.”

None of these evaluation theorists denied outcomes; but all of them asserted fairness and critical
understanding as the keystones of evaluation. Central to fairness were the evaluator’s impartiality
and a default scepticism of any claim to truth or effectiveness. Independent scepticism is what gives
the evaluator credibility in portraying the programme in its own terms – i.e. in serving the
programme. The balance between scepticism and affirmation is struck through negotiation.

Evaluation today asks important questions about the productivity of programmes, and in a healthy
democracy we need to know this. It has a tendency to focus on measurable outcomes against pre-
specified indicators, and many of those are essential to know, too. But it has an equal tendency to
avoid those programme accomplishments and qualities that lie beyond measurement and fall within
the realm of judgement. We have become so focused on outcomes and impact that we have
forgotten how to see programme quality. It goes, crudely, like this:

Evaluation for low-trust accountability

Retrospective viewpoint:
 What happened?
 Why did you do this?
 Who was responsible and did they discharge their assigned duty?
 How much did it cost?
 Did you make a difference?

Language – it promotes the vocabulary of:
 justification (‘we did this because...’)
 defence

 Encourages risk aversion and promotes certainty
 Assumes singular values in the programme (usually given by policy)
 Persuades people to improve...
 Imports external criteria for measuring quality
 The evaluator is external to the programme being evaluated and internal to the policy
system – though the programme may be required to internalise its function

Responsive evaluation (in a context of high-trust accountability)

Prospective viewpoint:
 What could happen?
 What would you need to do that?
 Do we need to reconfigure roles and responsibilities?
 What resource implications are there?
 What would make a difference?

...the vocabulary of:
 speculation (‘we might think of doing this’)
 ambition

 Protects risk-taking and promotes curiosity
 Documents values pluralism and sees policy as just another voice
 ...supports people to change
 Derives criteria from within the programme community
 The evaluator may be external or internal – or the evaluation might function as an
evaluative culture – but it is always independent of policy

Set aside the ‘dead hand’: think, instead, of evaluation as a glove, given shape by the insertion of
humanity and purpose – these are its roots. Perhaps we have, in Swindon, an opportunity to recover
evaluation that is simultaneously useful and humanistic, and responsive to its social contexts.

Finally, we return to Lee Cronbach to remind ourselves of what evaluation can aspire to be.

“The special task of [the evaluator] in each generation is to pin down the contemporary facts. Beyond
that, he shares with the humanistic scholar and the artist in the effort to gain insight into
contemporary relationships, and to realign the culture’s view of man with present realities. To know
man (sic) as he is, is no mean aspiration.”

The Swindon programme: The Swindon initiative has the characteristics of a programme. It
is innovatory in its intent; it is expressed in action on multiple sites; it coheres around a common set
of principles; it has an organisational framework; and it is developing its own language. The common
set of principles runs like this:

 Coherence and proper organisation in public provision is given by its correspondence with
how people lead their lives, not by the historical configuration of professional services;
 The question is not how families fit into professional services, but how professional services
fit into family realities;
 Families and community should be sources of resource as well as need;
 Service provision and professional practice are sites in which providers learn about
themselves and their obligations;
 The vocabulary of service needs to be extended to describe relationships and sentiments
between practitioners and citizens;
 Citizen, family and practitioner fall within the same narrative of aspiration for wellbeing.

Swindon Authority has discovered these principles through its work with families in distress, but it is
seeking to extend them to other projects.

Evaluation and the Swindon programme: Few prominent initiatives in the
contemporary world are free of evaluation. The question, often, is not ‘whether’, but ‘what kind?’
What UWE can provide is an approach to evaluation that shares and promotes the values of the
Swindon programme while retaining a (constructively) critical independence. It would accomplish
this by negotiating a methodology that is responsive and humanistic, as described above. People,
their values and interests, would lie at the heart of the methodology, which would seek out multiple
perspectives, acknowledging difference but fashioning an ‘overlapping consensus’. The aims would be
to ‘pin down the contemporary facts’, but also to support Swindon (official and civil society) in
reflecting critically on its ambitions, its experience and its relationships. The evaluation would portray
the programme in its own terms – honouring, for example, the language of the programme (we can
derive qualitative indicators for moral and spiritual development and its corresponding service
values); giving equal weight to family interpretations and service aims; supporting the critical self-
reflection of the practitioner; and conducting direct observation of interactions between practitioner
and citizen.

The questions the evaluation would pursue include these:

 How do people experience the programme?

 How do people value the programme? Whose criteria count for putting value on the
programme?
 What is its quality? Who says so?
 What is it that makes the programme coherent?
 How do we know whether a measured outcome is worthwhile – for whom? Under what
circumstances is 53% (say) a lot or a little?
 What are the mechanisms through which programmes work on behalf of people?
 Who has the right to know what about a programme – from whom should information be
withheld?
 For how long is it reasonable to protect a programme from external scrutiny until it is
confident enough to confront its sceptics?
 All social programmes imply ‘winners’ and ‘losers’. Who are they, and how do we make
judgements of overall benefit? How do we honour the ‘losers’, and ensure that the
‘winners’ are duly reflective?
 How do we guarantee that the programme’s philosophy and values are properly
represented in accountability measures?
 What learning is available from social programmes and where do we look for that learning?
 Who tells the multiple stories of the programme in critical and impartial ways and
documents the multiple logics of its operation?
 How do we help a programme articulate and reflect on its theory of change?
