Humanitarian Principles in Evaluation: how can we move forward?
Margie Buchanan-Smith, on behalf of Tony Beck (team leader), Belen Diaz and Lara Ressler Horst
UNEG EPE, 25 April 2016

How did we go about this review? Our approach and methodology
• Aim: to provide the HEIG with a better understanding of how the 4 core HPs are evaluated, highlighting best practices, challenges and opportunities
• Our methodology:
  • Literature review
  • Analysis of the humanitarian strategies, evaluation policies & guidelines of 10 agencies
  • Screening of 142 evaluations of HA for coverage of HPs, focusing on 7 emergencies: Afghanistan, DRC, Haiti, Somalia, Sudan, South Sudan & Syria
  • Analysis of a purposive sub-sample of 20 evaluations that made greater reference to HPs
  • Interviews with 12 key stakeholders for reflections on HPs in EHA

What did we find? The evidence
• Commitment to HPs in agencies’ humanitarian policies is not well reflected in their evaluation policies and guidelines
• Review of evaluations:
  • Widespread discussion of ‘access’ & ‘security’, but the link to HPs is tenuous and implicit
  • Explicit mention of HPs in about one-third of evaluations, but often lacking in-depth analysis
  • ‘Impartiality’ is the most frequently referenced principle, usually addressed under the evaluation criterion of coverage
  • Comprehensive evaluation of the HPs in combination is not taking place; the most frequent combinations are Independence & Neutrality and Independence & Impartiality
  • Concentration of terms in 20 out of 142 evaluations

Usage of terms in a sample of 142 evaluation reports

What did we learn from this review of evaluations?
• No significant difference between different types of agencies in their treatment of HPs in EHA
• Strategic & thematic evaluations are more likely to reference HPs than operational, RTE or impact evaluations
• Extremely limited ‘good practice’ in evaluating HPs: 6 out of 142 evaluations
• Overall: HPs are not systematically assessed in EHA

What emerged as the challenges to evaluating humanitarian principles?
• Lack of a common understanding of HPs
• How to evaluate when there are contradictory sets of principles
• Sensitivity to evaluation of HPs in the public domain: discussions about HPs take place ‘behind closed doors’
• Methodological challenges: evaluating HPs requires a more ‘political’ lens, while EHA tends to be more ‘technical’
• Lack of expertise to evaluate against HPs
• Lack of guidance

What did we learn from the good practice examples?
• TOR: include specific questions related to HPs
• Political context analysis: carried out in relation to HPs
• Methodology: mostly standard methodologies, with one innovative example
• Inclusion in evaluation analysis: 2 ECHO thematic evaluations included a comprehensive analysis of the implementation of HPs, bringing out tensions between principles
• HPs reflected in recommendations: often quite general, some more specific, e.g. related to access and coverage
• Just one evaluation (ECHO, 2012) explores whether adhering to HPs leads to more successful interventions

What else did we learn or deduce about facilitating factors?
• Requires a clear commitment by the agency to HPs
• Requires expertise:
  • Evaluation managers with a good understanding of HPs
  • Expertise on HPs in the evaluation team
• More likely in strategic evaluations (and research studies…)
• Could be supported with guidance on how to evaluate against HPs
• Requires a different approach to EHA: less mechanical, with analysis of the political context linked to analysis of HPs
• Agencies must be willing to evaluate against HPs

Suggested follow-up for the HEIG
1) Update IAHE guidance on system-wide emergencies with greater attention to the evaluation of HPs
2) Carry out a follow-up review of NGO evaluations and how they reflect HPs
3) Pilot an evaluation of HPs (where there is a lesser degree of political conflict), drawing on available methodologies
4) Pilot a confidential HP annex, supported by peer review
5) Commission single-agency evaluations focusing specifically on HPs
6) Use existing Communities of Practice to disseminate these findings and move the discussion and practice forward
7) Carry out meta-evaluations to see whether evaluation practice on HPs improves