

The Politics of Evidence

Sharpening the poverty focus of donor development efforts – or dumbing it down?

June 2013 / Jane Carter, Gender and Social Equity Coordinator, HELVETAS Swiss Intercooperation

In late May, Dr Rosalind Eyben of the Institute of Development Studies (IDS) was invited to speak at SDC Bern on the politics of monitoring and evaluation. Noting the growing insistence of donor agencies on facts and figures demonstrating the results of development funding, she argued that this is having significant consequences (unintended or otherwise) in the shaping of development programmes and their underlying theory of change. This of course has significant implications for poverty impacts. Her talk drew from a recent conference hosted by the Big Push Forward (‘Politics of Evidence’, 23-24 April in Brighton, UK) and various papers that contributed towards it, all of which are linked to the Big Push Forward initiative. Although inspired primarily by experiences of the British development sector, her arguments are relevant for many donor governments, including Switzerland.

Risks surrounding an “evidence-based” mindset
In a financial climate of austerity, it is not surprising that emphasis is laid on efficiency and effectiveness, on value for money, and on demonstrating attribution rather than contribution. An “evidence-based” mindset can of course contribute to a tightening of project goals and objectives and a clearer link between inputs and outputs – thus facilitating cost-benefit analysis and greater overall accountability. However, Dr Eyben pointed out the following risks:

  • A growing tendency to fund projects that are readily quantifiable. For example, an anti-malaria programme distributing insecticide-treated bed-nets and recording the reduction in mortality rates produces highly “countable” results, but the results of a programme supporting good local governance through training and publicity campaigns are far more difficult to quantify.
  • Perverse incentives for projects to produce numbers. The most obvious example is for projects to focus on working with people who have some assets and education rather than people living in deep poverty, as the former are likely to take up activities and produce results more quickly and in greater number than the latter.
  • A simplification or dumbing down of our understanding of complex processes in order to fit the “numbers game”. One way in which this can be manifested is through the choice of global indicators that can be applied across all countries; in being applicable everywhere, they tend also to be simplistic.
  • The reporting of spurious numbers in response to demand from higher levels.


Two subsequent focus group discussions produced the following reflections:

  • The political implications of the use of the tools promoted within the organisation have to be borne in mind at all times. Linked to this, there is a need for a clear understanding of the way in which M&E tools are understood and used along the chain of partners. Safeguarding SDC’s organisational values is crucial; the tools should support bottom-up empowerment and a needs-oriented approach.
  • Staff members can be overly zealous in reporting numbers, even when this is neither necessary nor really informative; a “number culture” should therefore not be overly promoted. “Planned opportunism” – grasping opportunities as they arise in the project management cycle – needs to be encouraged, not dismissed because it disrupts reporting. Similarly, the challenge of managing budgets according to a budget cycle must not squeeze out the time needed for reflection.
  • Evidence-based learning is important – but the emphasis needs to be on learning rather than accountability, using impact hypotheses as a base and remaining open to change.
  • Numbers are used to communicate results – but are not the only way to do so; here SDC needs to be proactive in communicating its message in a frank and convincing manner.


During her visit, Dr Eyben shared a range of background material, available on the Big Push Forward website:

  • A one-page case study on the politics of evidence from the Accountability in Tanzania (AcT) project, managed by the audit and financial management company KPMG. Interestingly, this argues in favour of the collection of qualitative rather than purely quantitative evidence (1).
  • A paper outlining the findings of a crowd-sourcing study conducted before the Brighton workshop, in which the majority of participants were engaged directly in monitoring and evaluation (M&E). This indicated that most found the increased focus on quantitative data collection to be a positive trend. Possibly they would have answered differently after the workshop (2).
  • A framing paper on the ‘Politics of Evidence’ (3).


References

1) AcT Case Study for Politics of Evidence (2013). Accountability in Tanzania. http://www.accountability.or.tz/
2) Brendan Whitty (2013). Experiences of the Results Agenda. Draft Findings for discussion from the crowd-sourcing survey. http://bigpushforward.net/wp-content/uploads/2011/01/Experiences-of-the-Results-Agenda-by-Brendan-Whitty.pdf
3) Rosalind Eyben (2013). Uncovering the Politics of ‘Evidence’ and ‘Results’. A framing paper for development practitioners. http://bigpushforward.net/wp-content/uploads/2011/01/Uncovering-the-Politics-of-Evidence-and-Results-by-Rosalind-Eyben.pdf