
Managing for Results: USAID Nigeria & IPs
Day 3: Performance Information
Integrated Managing for Results

Task 5: Selecting indicators ("X" rejection criteria)
Criteria: Direct, Objective, Useful, Practical, Attributable, Timely, Adequate

Results & Indicators
1. Growth of Agribusiness
   - Annual sales of assisted enterprises
2. Growth of Agribusiness
   - Annual sales attributed to USAID assistance
3. Growth of Agribusiness
   - # customers for 1 year or more
   - Annual customer sales attributed to USAID assistance
1. Effective Community-Based Resource Management Implemented
   - # hectares of target area using collaborative management model
2. Effective Community-Based Resource Management Implemented
   - # protected areas using collaborative management model
1. Improved Food Security of Vulnerable Groups
   - % target group population with access to emergency relief supplies
2. Improved Food Security of Vulnerable Groups
   - % target group population consuming minimum daily food requirement
3. Improved Food Security of Vulnerable Groups
   - # person-days of employment created

Task 5: Selecting indicators ("X" rejection criteria), continued

Results & Indicators
1. Family health improved
   - Under-5 mortality rate in target areas
2. Family health improved
   - Ratio of under-5 mortality rate in targeted areas to national average
3. Family health improved
   - National under-5 mortality rate
1. Increased contraceptive security
   - Average % of warehouses with stock-outs of 1 or more contraceptives
2. Increased contraceptive security
   - Average % of points-of-sale with stock-outs of 1 or more contraceptives
1. Increased private sector-led economic growth
   - Cumulative dollar volume of private investment in target area
2. Increased private sector-led economic growth
   - Present value of cumulative dollar volume of private investment in target area
3. Increased private sector-led economic growth
   - Value of investment in target area

A Few Words About Baselines and Targets

Baseline: The condition or level of performance that exists prior to implementation of the program or intervention.

Target: The expected level of achievement of the result, stated in the terms of the performance indicator, within a given period of time. Targets convey an understanding of the anticipated magnitude of change vis-à-vis USAID's investment.

Performance Baseline
The value of the performance indicator at the beginning of the planning period.

Baselines can/should be:
- Set just prior to the implementation of USAID-supported activities that contribute to the achievement of the relevant SO or IR
- Measured using the same data collection method that the SO team will use to assess progress
- Changed if the data collection method changes (document the change!)

Target and Baseline: An Illustration
Strategic Objective: A competitive, private financial sector that is more responsive to the needs of a market-oriented economy
Key Indicator: Value of credit/equity provided to small and medium enterprises by private financial institutions
- Baseline (2001): $10 million
- Target (2007): $50 million
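The illustration above implies a required pace of change. As a quick sketch (using the slide's illustrative figures, and assuming growth is spread evenly over the six years between baseline and target):

```python
# Implied average annual growth needed to move the indicator from
# the 2001 baseline to the 2007 target (illustrative figures from the slide).
baseline = 10_000_000   # USD, 2001
target = 50_000_000     # USD, 2007
years = 2007 - 2001     # six annual steps

# Compound annual growth rate that connects baseline to target.
cagr = (target / baseline) ** (1 / years) - 1
print(f"Required compound annual growth: {cagr:.1%}")
```

A back-of-the-envelope check like this helps judge whether a target is realistic before committing to it.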

Performance Target
Commitments made by the SO team about the level and timing of results to be achieved in a specified time period.

Targets:
- Can be expressed in quantity, quality, or efficiency terms
- May be determined by setting the final target first, then interim targets
- May need to be set after activities or sites are selected
- Can be adjusted over time
- Should be realistic!
- Should be outside the margin of error of the historical trend

"If you don't know where you're going, you'll end up somewhere else." - Yogi Berra

Baseline and Target Setting: Best Practices
- Look at historical trends
- Consider partner and customer expectations of performance
- Think about social norms and cultural factors
- Consult experts and research findings
- Benchmark accomplishments elsewhere
- Disaggregate where relevant and possible

Assessing Data Quality
USAID Performance Management Workshop

"There are three kinds of lies: lies, damned lies, and statistics." --Mark Twain

(Contract AEP-C-00-99-00034-00)

Issues

Management:
- Can you make decisions based on this data?
- Better-quality data leads to better-informed management and planning.

Reporting:
- Is this data believable?
- Audiences want to know how credible your data is so they can trust your analysis and conclusions.

Five Standards for Data Quality
- Validity
- Reliability
- Timeliness
- Precision
- Integrity

Validity
Key question: Do the data clearly and directly measure what we intend?

Issue: Directness
- Result: Poverty of vulnerable communities in conflict region reduced
- Indicator: Number of people living in poverty
- Source: Government statistics office
- Problem: The government doesn't include internally displaced people (IDPs) in the poverty statistics.

Issue: Bias
- Result: Modern sanitation practices improved
- Indicator: Number of residents in targeted villages who report using "clean household" practices
- Source: Door-to-door survey conducted three times a year
- Problem: Most of the people in the targeted region work long hours in the fields during the harvest season.

Reliability
Key question: If you repeated the same measurement or collection process, would you get the same data?

Issue: Consistency or Repeatability
- Result: Employment opportunities for targeted sectors expanded
- Indicator: Number of people employed by USAID-assisted enterprises
- Source: Structured interviews with USAID-assisted enterprises, as reported by implementing partners AAA, BBB, and CCC
- Problem: The SO Team found that the implementing partners were using different definitions of "employee":
  - AAA: receives wages from the enterprise
  - BBB: receives full-time wages from the enterprise
  - CCC: works at least 25 hours a week
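The fix for this kind of reliability problem is a single agreed definition that every partner applies to its raw records. A minimal sketch (the record fields and threshold are hypothetical, for illustration only):

```python
# Sketch: one shared, documented definition of "employee" applied
# uniformly to every partner's raw records.
def count_employees(records, min_hours_per_week=25):
    """Count workers who receive wages AND meet the agreed hours threshold."""
    return sum(
        1 for r in records
        if r["receives_wages"] and r["hours_per_week"] >= min_hours_per_week
    )

partner_aaa = [
    {"receives_wages": True, "hours_per_week": 40},   # counted
    {"receives_wages": True, "hours_per_week": 10},   # excluded: under threshold
    {"receives_wages": False, "hours_per_week": 40},  # excluded: unpaid
]
total = count_employees(partner_aaa)
```

With one definition in code (or in the indicator reference sheet), repeating the collection yields the same count.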

Timeliness
Key question: Are the data available soon enough, and often enough, to inform management decisions?

Issue: Frequency
- Result: Use of modern contraceptives by targeted population increased
- Indicator: Number of married women of reproductive age reporting use of modern contraceptives (CPR)
- Source: DHS survey
- Problem: The DHS survey is conducted approximately every 5 years.

Issue: Currency
- Result: Primary school attrition in targeted region reduced
- Indicator: Rate of student attrition for years 1 and 2 at targeted schools
- Source: Enrollment analysis report from the Ministry of Education
- Problem: In July 2002 the MOE published the full enrollment analysis for school year August 2000 - June 2001.

Precision
Key question: Are the data precise enough to inform management decisions?

Issue: Enough detail
- Result: CSO representation of citizen interests at national level increased
- Indicator: Average score of USAID-assisted CSOs on the CSO Advocacy Index
- Source: Ratings made by Partner XXX after interviews with each CSO
- Problem: The SO team reported this data to the Mission Director with inconsistent precision: 1999 = 2.42, 2000 = 3, 2001 = 3.000

Issue: Margin of error
- Result: Primary school attrition in targeted region reduced
- Indicator: Rate of student attrition for years 1 and 2 at targeted schools
- Source: Survey conducted by partner. The survey is informal and has a margin of error of +/- 10%.
- Problem: The USAID intervention is expected to cause only 5 more students (for every 100) to stay in school longer, a change smaller than the margin of error.
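Both precision issues above lend themselves to quick mechanical checks. A sketch using the slide's illustrative figures:

```python
# 1. Report scores at a consistent precision (two decimals here),
#    instead of mixing 2.42, 3, and 3.000.
scores = {1999: 2.42, 2000: 3, 2001: 3.000}
report = {year: f"{value:.2f}" for year, value in scores.items()}

# 2. Flag an indicator whose expected change is smaller than the
#    instrument's margin of error -- such a change is undetectable.
expected_change = 5      # percentage points expected from the intervention
margin_of_error = 10     # survey margin of error, +/- percentage points
detectable = expected_change > margin_of_error
```

Here `detectable` is `False`: the informal survey cannot distinguish the expected 5-point improvement from noise, so either the instrument or the indicator needs to change.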

Integrity
Key question: Are there mechanisms in place to reduce the possibility that data are manipulated for political or personal gain?

Issue: Intentional manipulation
- Result: Financial sustainability of targeted CSOs improved
- Indicator: Dollars of funding raised from local sources per year
- Source: Structured interviews with targeted CSOs
- Problem: When an SO Team member conducted spot checks with the CSOs, she found that organizations CCC and GGG counted funds from other donors as part of the "locally raised" funds.

Techniques to Assess Data Quality

Why? The goal is to ensure the SO team is aware of:
- Data strengths and weaknesses
- The extent to which data can be trusted when making management decisions and reporting

All data reported to Washington must have had a data quality assessment at some time in the three years before submission. (ADS 203.3.5.2)

How? Steps to Conduct an Assessment
1. Review performance data
   - Examine data collection, maintenance, and processing procedures and controls
2. Verify performance data against Agency data quality standards
   - Validity, reliability, timeliness, precision, integrity
3. If data quality limitations are identified, take actions to address them
   - Triangulate; supplement with data from multiple sources
   - Report the limitations
   - Revise the indicator
4. Document the assessment and the limitations in the Performance Indicator Reference Sheet
5. Retain supporting documentation in files
   - Decisions and actions concerning data quality problems
   - Approach for conducting the data quality assessment
6. If data will be included in the annual report, disclose the DQA findings in the "data quality limitations" section of the Annual Report
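The assessment record produced by these steps can be sketched as a small data structure (the field names and ratings are hypothetical, loosely following the steps above, not an official DQA format):

```python
# Sketch: a minimal record of a data quality assessment for one indicator.
STANDARDS = ("validity", "reliability", "timeliness", "precision", "integrity")

dqa = {
    "indicator": "Rate of student attrition, years 1-2, targeted schools",
    "assessed_on": "2002-07-15",
    "ratings": {"validity": "yes", "reliability": "yes", "timeliness": "no",
                "precision": "no", "integrity": "yes"},
    "limitations": ["Data lag of about one school year",
                    "Margin of error +/- 10%"],
}

# Flag any standard not met, so its limitation must be documented
# and disclosed in the annual report.
unmet = [s for s in STANDARDS if dqa["ratings"][s] != "yes"]
```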

Task 6 (1 hour): DQA on Middleland Data
1. Collect the rating sheet from the back of the MEMS DQA form.
2. Each person selects one indicator from the Middleland PMP in tab 1.
3. Each person reviews the Indicator Reference Sheet for the indicator selected.
4. Check off the yes/no column for the indicator and note any issues/recommendations in the appropriate column.
5. List those indicators with problems on a flip chart.

Analyze Data for a Single Indicator
- Compare actual performance against target(s)
- Compare current performance against the prior year
- Compare current performance to baseline(s)
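These three comparisons are simple differences once the four values are in hand. A sketch with hypothetical figures for one indicator:

```python
# Hypothetical values for a single indicator (illustration only).
baseline, prior_year, actual, target = 10.0, 28.0, 32.0, 30.0

vs_target = actual - target          # positive: target met or exceeded
vs_prior_year = actual - prior_year  # year-on-year change
vs_baseline = actual - baseline      # cumulative change since baseline

met_target = actual >= target
```

Reporting all three comparisons together guards against cherry-picking: an indicator can exceed its target while still trailing last year, or vice versa.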

Analyze Trends in Performance
- Analyze target trend(s) against actual trend(s)
- Examine performance (met, exceeded, or short of target) of lower results in relation to higher results
- Examine trend data from critical-assumption monitoring to help interpret results

Assess USAID Contribution
- Examine the timing of results in relation to the timing of USAID efforts
- Compare trends in results to trends in changes in level of effort
- Compare performance to a control group or to benchmarks in similar environments

What Does the Data Tell You?
- Do you trust your PMP data?
- Is the hypothesis working?
- Why and/or why not?
- Do you need more information?

Performance Monitoring and Evaluation

Performance Monitoring:
- Focuses on whether results are being achieved or not
- Ongoing, routine
- Often quantitative
- A process that involves identifying indicators, baselines, and targets; collecting actual results data; and comparing performance against targets
- Contributes to management decision making

Evaluation:
- Focuses on why/how results are achieved or not
- Occasional, selective
- Quantitative and qualitative
- A structured, analytical effort to answer managers' questions about the validity of the hypothesis, unexpected progress, customer needs, sustainability, unintended impacts, and lessons learned
- Makes management recommendations

An Evaluation Statement of Work
- States the purpose, audience, and use of the evaluation
- Clarifies the evaluation question(s)
- Identifies the activity, program, or approach to be evaluated
- Provides a brief background on implementation
- Identifies existing performance information sources

An Evaluation Aims to Produce...
- Recommendations: proposed actions for management, based on the conclusions
- Conclusions: interpretations and judgments, based on the findings
- Findings: facts and evidence collected during the evaluation

Data Analysis and Use
What is happening with respect to each individual result?
- Were performance targets met?
- How did performance compare with last year's?
- What is the trend from the baseline year?
- How does the trend compare with expectations?

Rules of Thumb...
- Program was implemented as planned, and results were achieved as intended: implementation success, strategy success.
- Program was implemented as planned, but results were not achieved as intended: implementation success, strategy failure.
- Program was not implemented as planned, so it is not clear whether results would have been achieved: implementation failure, strategy uncertainty.
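These rules of thumb amount to a small decision table. As a sketch (the encoding is mine, not the slide's; `None` stands for "unclear whether results would have been achieved"):

```python
# Keys: (program implemented as planned?, results achieved as intended?)
# None in the second slot means the outcome cannot be judged.
RULES = {
    (True, True): "Implementation success, strategy success",
    (True, False): "Implementation success, strategy failure",
    (False, None): "Implementation failure, strategy uncertainty",
}

# Example: the program ran as planned but missed its results.
verdict = RULES[(True, False)]
```

The point of the table: only when implementation succeeded can missed results be attributed to the strategy itself.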

Implementation Problems?
- Are the activities on track, and are outputs being produced on time?
- Are outputs being produced in sufficient quantity?
- Is the quality of the activities and outputs adequate?
- Are our partners achieving the critical results and outputs for which they are responsible?

Data Analysis and Use
What are we going to do about it? Do we change the...
- Program?
- Results framework?
- Activities?
- Indicators?
- Targets?

Task 7 (1 hour): Performance Information and Use
1. Review the SOT analysis of performance information for Bangladesh.
2. Consider these questions: Do you trust the data? Does the development hypothesis work?
3. Record on a flip chart what decisions you might make, including decisions to evaluate questions that arise.