Results management in DFID The DFID story
Objectives
• The DFID Story – including what went right and what went wrong
• How results-based management is being used in DFID
• Lessons for us all
The DFID Story – timeline
• May 2010 – Conservative/Liberal Democrat coalition government elected
• Summer 2011 – "Changing Lives" published, with "We will" commitments – A COMMUNICATIONS TOOL
• February 2012 – DFID Results Framework published, covering the We will commitments plus other indicators
• June 2012 – First complete DFID Results report, covering the DRF but also other results in Country Operational Plans
THE TIMELINE WAS TIGHT!
The DFID Story – what were we measuring?
The DFID Results Framework – a corporate results framework, but inconsistent measurement?
• Number of people reached with emergency food assistance through DFID support
• Number of people with sustainable access to clean drinking water sources with DFID support
• Number of women and girls with improved access to security and justice services through DFID support
The DFID Story – how were targets set?
• Number of unique people reached with one or more water, sanitation or hygiene promotion interventions – target increased to 60 million
• Number of children supported by DFID in primary and lower secondary education – originally two separate targets for primary school children
UNREALISTIC TARGETS MEANT SUDDEN REALLOCATION OF RESOURCES
The DFID Story – how were results collated?
• All projects – ongoing as well as ended
• A standard Excel spreadsheet for each delegation
• Statisticians covering all countries to assist in the collation of results
• Results reported twice a year
• An attribution-based approach
• All results, not just those aligned to the DRF
The DFID Story – how are results being used?
• At project level: all projects reviewed annually and at project completion; logframe used to assess performance; signed off at director level
• At country level: Country Operational Plans, with results reported on an annual basis
• At organisational level: results reported against the DRF
Actions taken, based on results
• Develop an understanding of under- and over-performance
• Projects could be stopped
• Extra resources could be devoted to meeting off-track targets
• Project/programme performance used for evaluations (not all projects evaluated)
• Part of staff performance – recognition if you took part in external reviews
BUT THIS WAS NOT PERFECT!
Lessons for us all
• Overall, results collation is a GOOD thing
• It's not about the tools!
• OPSYS can be the future!