Page 1 Kabul, 15-17 December 2015 DRC Afghanistan Monitoring and Evaluation
Page 2 Introductions 1. A little bit about me… 2. A little bit about you…group warm-up exercise
Page 3 Aims This M&E course aims to underpin and build on existing Monitoring, Evaluation and Learning (MEL) skills in the organisation by offering capacity building for senior programme, project and M&E staff in DRC Afghanistan.
Page 4 Objectives • Increase the awareness and understanding of key staff on how monitoring and evaluation relates to DRC’s project cycle management framework • Increase the awareness and understanding of key staff on DRC’s Monitoring, Evaluation and Learning Minimum Operating Procedures and the guidelines and tools available • Strengthen the skills of DRC Afghanistan’s senior programme, project, and M&E staff in conducting project and programme level M&E in line with the standards of DRC, particularly in relation to: – How to improve logframes – How to design and use appropriate indicators – How to select and use effective monitoring tools
Page 5 Agenda
Morning, Tuesday 15: SESSION 1 - Purpose of M&E, MELMOPs, Your role in M&E, Challenges to M&E
Morning, Wednesday 16: SESSION 3 - Logframes, Developing objectives; SESSION 4 - Introduction to indicators
Morning, Thursday 17: SESSION 5 - Data collection tools: selecting and using them
Afternoon, Tuesday 15: SESSION 2 - Project Cycle, M&E Systems
Afternoon, Wednesday 16: Developing Indicators
Afternoon, Thursday 17: Data collection tools cont., Evaluation
Page 6 Training Resources • Training presentation • DRC INTRAC MELMOP Training Course Guide • DRC INTRAC M&E Training Workbook • MELMOP website: http://melmop.drc.dk
Page 7 Home groups Reflect on the day: content, methodology, mood of the group, are there any practical problems? Provide feedback to the trainer: • What went well today and what could have been better? • Suggestions for the rest of the workshop Other tasks: • Time-keeping for both participants and trainer • Monitoring energy levels, energisers • Participatory review of previous day
Page 8 Session 1: The purpose of PME
Page 9 Session 1: Objectives • To be introduced to MELMOPs • To have explored your own role in M&E • To be clear on purpose of M&E for learning and accountability at different levels of an organisation • To have explored the challenges of M&E • To have reviewed the key questions which need to be explored when planning M&E
Page 10 What is M&E (Monitoring and Evaluation)? • What does M&E mean for you? • How do you think that it is understood in your team/programme?
Page 11 DRC Definitions I Monitoring Systematic, regular collection of data on specified indicators during the lifetime of an intervention. It provides project managers and the main stakeholders with information about progress against planned activities and budgets and towards achieving objectives. Monitoring can also be used, if designed appropriately, above and beyond reporting to reflect the quality of an intervention, project or programme.
Page 12 DRC Definitions II Evaluation is a structured and objectively initiated exercise which facilitates organisational and programmatic learning and generates and documents information on DRC’s performance. Evaluations are also a means of documenting learning and informing strategic decisions and direction.
Page 13
Page 14 MELMOPs Monitoring, Evaluation and Learning Minimum Operational Procedures http://melmop.drc.dk A set of standard procedures to be followed throughout DRC and across all country missions to ensure (together with the DRC Evaluation Policy) that basic project monitoring and periodic evaluations take place
Page 15 MELMOPS overview
Page 16 Why monitor and evaluate? • Are we doing what we said we would do? • Are we making a difference? • Are we doing the right things?
Page 17 Upward and downward accountability Upward (to funders/donors, senior staff, trustees/board, partners): to show that we are doing what we said we would do (e.g. budget spent, outputs delivered, compliance with standards). Downward (to the implementing team and the people that we serve): to show that we are making a difference.
Page 18 Improving performance and learning Improvement and learning • Identifying and solving problems as they occur and modifying a plan, approach, or strategy in order to improve the quality, efficiency, and effectiveness of our operations - Are we doing the right things? - Are we doing things in the right way? • Reviewing and evaluating our projects, programmes and strategies to understand - Are we making a difference? • Transferring the lessons learned to other projects, programmes and the whole organisation
Page 19 Key Points • Accountability and learning are both essential. • In effect they are ‘two sides of a coin’ • The challenge is how to put both into practice in our work
Page 20 Your role Think about your role in M&E in your Project and Country Programme What is your role in M&E? • Is it clear what your responsibilities are? • List all the purposes that you use M&E information for • Is accountability or learning the priority? • What processes do you use to be accountable? • What processes do you use to learn? • What are the challenges to good and purposeful M&E?
Page 21 What does DRC focus on in M&E? • What conclusions can we draw regarding 1. The purpose/s 2. The priorities of M&E in DRC Afghanistan? • How does this relate to DRC at the organisational level? • How is your work and M&E in DRC Afghanistan integrated into learning and accountability beyond the country programme?
Page 22 DRC MEL and Planning and Reporting Frameworks The DRC’s MEL Framework defines the purpose of M&E as: • To ensure timely and relevant data collection at all stages of the project management cycle, linked to the Planning and Reporting Framework • To ensure that organisational and donor requirements with regard to monitoring and evaluation are fulfilled: this includes complying with the MELMOPs • To ensure that, whenever possible, the monitoring and evaluation processes contribute to cross-organisational learning
Page 23 M&E and Programme Planning and Reporting DRC’s main processes for programme management and quality assurance are framed under the Planning & Reporting Framework. Rolled out across DRC operations, it consists of four elements: • Strategic Programme Documents (SPD) • Annual Reviews • Results Contracts • Quarterly Reports
Page 24 Challenges to purposeful M&E • Multiple competing demands • No clear purpose or system • Unclear objectives and indicators • Poor quality of logframes • Focus on short-term activities and outputs • Reporting to different donors with different reports • Lack of clarity on a project’s, programme’s or organisation’s contribution to change
Page 25 Challenges to M&E • Often the focus is on accountability • Resources are often limited • Lack of clarity on what we should monitor, the questions we ask and what will be done with the data • Lack of clear tools, procedures and systems • Complexity (e.g. advocacy work: different interventions, actors, years for changes to occur)
Page 26 Exercise Discuss Exercise A, Question 1.5 in the Training Workbook in your programme or work group and answer the following questions: • What elements do you see as M&E? • What are you contributing to in your operation?
Page 27 Always ask… • ‘Are we doing what we said we would do?’ (Internal validity) • ‘Are we making any difference?’ (Outcome & Impact Assessment) • ‘Are these the right things to do?’ (Strategic relevance)
Page 28 Understanding the jargon Over the past few years organisations have used different acronyms to describe their approach to monitoring and evaluation: • M&E • PM&E • PME&L • PMEAL • MEAL • MEL • PAL Do you have any more? What do they mean? Which ones are used in your organisation? What are some of the debates around using the different terms?
Page 29 Where’s your focus? • M&E – focus on linking monitoring to evaluation • PM&E – recognition of the need to effectively link planning with monitoring and evaluation; planning is often left out or, once done, ignored • MEL – recognition that learning was talked about but often problematic to put into practice: lost, left out, too late! • PMEL (PME&L) – as with PM&E, explicitly ensuring that planning is linked to monitoring, evaluation and learning • MEAL – aimed to highlight and put a clear focus on the main elements and purposes of M&E, accountability and learning; for humanitarian organisations, accountability is articulated in the HAP Principles of Accountability • PMEAL, PMEL – participation as an essential element of M&E
Page 30 Session 2: Understanding Effective PME
Page 31 Session 2: Objectives • To understand the purpose of M&E and its application at different points in the Project Cycle • To understand M&E from an organisational perspective, the use of M&E systems and their link to work at project and programme levels
Page 32 Link to MELMOPS
Page 33 Planning, Monitoring and Evaluation
Page 34 The Project Cycle
Page 35 The Project Cycle
Page 36 Group discussion 1. How can using the management tool of the ‘Project Cycle’ assist in monitoring and evaluating your project/ programme work? 2. Are there any specific challenges or advantages to using the project cycle in your context? What is your experience of using the project cycle?
Page 37 Project Cycle Question 1 > The main challenge is that once the project has started the project plan is not used. The project cycle becomes either: > A straight line (with no reflection or learning) > A closed circle (where any learning stays within the project and is not shared)
Page 38 Project Cycle Question 2 However, it can help to look backwards and forwards. Looking forwards: helping to see if the outcomes are likely to be achieved within the time period and whether any adjustments need to be made. Looking back: reflecting on the project and questioning your assumptions and theory of change can help you to make necessary changes or adjustments.
Page 39 Organisational Purpose and Strategy > A central value of the Project Cycle approach is that aspects of the project are reconsidered throughout to ensure that any changes which have occurred are included in the project design. > As a result, projects are more likely to be successful.
Page 40 How the Project Cycle links to MELMOPS
Page 41 M&E Systems An M&E system helps… • To provide a clear purpose to M&E • To clearly link work at project and programme level up through different levels of the organisation to its purpose and strategic objectives
Page 42 What is an M&E System M&E systems mean different things to different people, and there are no standard definitions. One definition is as follows: An M&E system (or framework or approach) is understood as ‘a series of policies, practices and processes that enable the systematic and effective collection, analysis and use of monitoring and evaluation information’.
Page 43 DRC Afghanistan M&E System • Do you think that your Country Programme has an M&E system? Rich picture exercise • What does it look like? • How closely does it link to the guidance given in the MELMOPs standards? • Are all staff aware of it and able to apply it appropriately?
Page 44 Challenges with M&E Systems • M&E is often oriented towards many different donors rather than the organisation’s strategic objectives • The size of a project/programme dwarfs the normal size of the Country Programme • There is no clear connection showing how projects and programmes add value and contribute to higher-level objectives • Systems often lack a clearly defined purpose – this needs to be defined at project, programme and organisational level • Weak links between project reporting and the achievement of organisational objectives • Complexity of planning, monitoring and reporting systems • Reporting is mainly on outputs (what has been done), rather than on what has been achieved
Page 45 Case study of purpose-oriented PME System Swedish Committee for Afghanistan Mission: To empower individuals, communities, and local organisations, primarily in rural areas and with particular focus on women, girls, boys, and vulnerable groups, such as persons with disabilities, so that they may participate fully in society and influence their own development
Page 46 Case study – Swedish Committee for Afghanistan Initial challenges: • Lack of standard tools/guidelines • Focus on outputs (annual planning) • Delivery over effectiveness • Diverse donor & government reporting requirements • Poor internal coordination – field offices, programmes, Kabul & Sweden • Low stakeholder participation • No effective outcome (performance) indicators • Too many indicators • Low M&E capacity – data collection, time, finance • Uneven access to the field Organisational change: approach and organisational objectives
Page 47 Key steps in Developing a PME System for SCA 1. Define the scope and purpose of the system Projects and programmes within a strategic (multi-year) perspective. Results and performance. 2. Perform a situational analysis Reviewed plans and reports, surveyed staff, held face-to-face interviews with senior project, programme and management staff 3. Consult with relevant stakeholders Consulted with key staff in planning, M&E, and management regarding needs, ideas, organisational objectives and approach 4. Identify the key levels Move from activities and outputs to outcomes and impact (RBM)
Page 48 Key steps in Developing a PME System for SCA 5. Select key focus areas These were strategic objectives (rights-based, relating to women, children and rural communities) selected during participatory strategic planning 6. Identify minimum standards and expectations Developed as the new strategy emerged, refined through establishment of an MEP for the strategic plan and then detailed programme (multi-year) planning 7. Integrate the PME system horizontally and vertically Rolled out gradually with adoption of the new strategy, which itself was a significant methodological change 8. Work out the details (templates etc.) and train staff Key programme/project staff trained early on. Recruited new staff to an expanded PME unit, with focal points in field offices. Programmes continue to provide training to relevant staff (total staff of over 6,000) 9. Roll out the system
Page 49 Developing a PME System First stage: • Develop a basic framework: o Programmatic and longer-term (multi-year) focus o Essential information o Linking planning and reporting o Learning and accountability o Ownership through training of key staff and participatory strategic planning Second stage: • Integrate with other organisational systems and Sweden HQ strategy; harmonise performance indicators across the organisation Third stage: • Roll out across the organisation • Develop (refine) evaluation protocols • Keep going – review, revise, re-train, refresh – coordinated with strategic revision
Page 50 Key principles of M&E Systems • The system should be minimal but cost-effective • Develop reflective and analytical capacities • Give quality information on outputs and outcomes • Emphasise decision-making and analysis • Recognise that changes may well be unpredictable • Base the system on participation of key stakeholders • Recognise the critical role of monitoring • Acknowledge the value of different sources of information and perspectives
Page 51 End of Day – Home Groups • What worked well today? • What did not work today? Why? • Any other comments on the day’s work, e.g.: o Timing o Training methods and approach o Language and communication o Participation o Energy levels and your mood o Training room and arrangements
Page 52 DAY 2 Session 3: Building firm foundations Setting good objectives
Page 53 Session 3: Objectives • Have a clear understanding of what makes a good objective and how to develop objectives, specifically at outcome level. • Be able to check the relevance, achievability and fit of objectives using a logical checking tool
Page 54 Reminder on terminology Overall Objective / Impact Intended impact contributing to physical, financial, institutional, social, environmental, or other benefits to a society, community, or group of people via one or more interventions. Immediate Objective / Outcome The likely or achieved short-term and medium-term effects of an intervention’s outputs. Outputs The products, capital goods and services which result from an intervention; may also include changes resulting from the intervention which are relevant to the achievement of outcomes.
Page 55 Logical Framework Approach Intervention Logic Overall objective / Impact Immediate Objective / Outcomes Outputs Activities Input Indicators Means of verification Assumptions
Page 56 Logical Framework Approach (LFA) An LFA is a tool that: • provides a systematic structure for planning and managing projects o It helps describe what your project is trying to achieve o How it aims to do this o What is needed to ensure success o Ways of measuring progress • enables the main elements of a project to be concisely summarised • brings structure and logic to the relationship between project purpose and intended inputs, planned activities, and expected results – it identifies the logical links
Page 57 The Logic
Page 58 Logical Framework Approach Intervention Logic Overall objective / Impact Immediate Objective / Outcomes Outputs Activities Input Indicators Means of verification Assumptions
Page 59 Assess your project’s logic Working in small groups, review the project logframe you brought with you to this training. • Does the project seem logical? • Is it realistic to achieve the objectives with the activities that are planned? To do this, apply the if…and…then logic we have just looked at. Make a note of your conclusions
Page 60 Developing objectives Objectives (outcomes) need to be linked to the problem we want to address and what we need to do to achieve the change. Ask yourself… • What is the problem? • What do we intend to do? • What do we want to change?
Page 61 Focusing our thinking • What is the problem? This is the problem or opportunity that lies behind the project. It is what leads us to believe that ‘something needs to be done’. • What do we intend to do to address the problem? We need to have clarity as to the boundaries of what will be done in terms of time, money and people. • What is it that we want to change? This is the outcome and benefit of addressing the problem.
Page 62 An objective (outcome) • Is something you achieve, not something you do • Should state what will change and who will benefit • Needs to be: Specific, Measurable, Achievable, Relevant, Time-bound
Page 63 Good objectives Which of the below are good objectives and why? a) Provision of water to ‘x’ villages b) People in ‘x’ villages have access to water c) People in ‘x’ villages have sustainable access to clean water d) By 2016, ‘x’ marginalised and vulnerable groups in ‘x’ villages have safe and sustainable access to clean water
Page 64 Good objectives • By the end of the project, all school-aged girls in project communities will have 100% attendance in school OR • By the end of the project, 50% of school-aged girls in project communities will have improved attendance in school OR • …?
Page 65 Always ask… • Is what we intend to do capable of achieving the outcome? • If the outcome is achieved will it solve (or largely solve) the problem? If the answer is no to either question then logically (in terms of the project’s own internal validity) the project will fail!
Page 66 Issues to consider Objectives often state what is to be done and not what is to be achieved. • There should be a clear link between the problem, what we plan to do and what we want to change • Objectives do not always clearly reflect which specific problem or part of the problem the project will address.
Page 67 Group work Protection of Refugees Case Study or the Emergency Community Protection Case Study Exercises E (p. 7) and F, case study B (p. 9) in the training workbook. Choose one of the case studies and read through it. Work in small groups to work out: • What specific problem does this project aim to address? • What does the project intend to do? • What outcome(s) does it aim to achieve? Attempt to formulate SMART objectives/outcomes
Page 68 Feedback discussion • Were you able to identify the problem and the main outcome(s)? • What should this project aim to do to address the problem and achieve the outcomes? Getting clarity on the problem we aim to address, what we plan to do and what we want to achieve at outcome level provides a base from which to develop a logframe
Page 69 Reflection and group work on DRC Afghanistan Look at the project logframe you have brought with you and the plan/description of the programme to which it contributes. Are the objectives clearly stated outcome-level objectives? If not, why do you think that is? Reformulate the objectives so that they satisfy the criteria we have established in this session
Page 70 Monitoring and Evaluation Plan DRC projects and programmes develop a Monitoring and Evaluation Plan (MEP) using the DRC MEP Guideline. For the country office to be compliant with the MELMOPs, you need to have prepared MEPs for all relevant projects. An MEP is a key tool for project and programme management to ensure that they are able to track progress towards the objectives set in the project proposal. Find the DRC MEP template at http://melmop.drc.dk. Have a look at it and compare it with the project logframe. Notice the similarities and also the differences and the additional information it requires.
Page 71 DRC Monitoring and Evaluation Plan
Page 72 Session 4: Developing Indicators
Page 73 Session 4: Objectives 1. Be able to develop and define indicators at outcome level for projects and programmes 2. Be able to cross-check the strength of indicators and their link to a project’s objectives (outcomes)
Page 74 Defining an Indicator How would you define an indicator?
Page 75 Indicators – definitions Indicators are the observable changes or events that provide evidence that something has happened: • An output has been delivered • An immediate outcome has been achieved • A long-term impact has occurred Indicators generally do not provide proof of an achievement or outcome, but are reliable signs ‘indicating’ the probability that this happened
Page 76 Developing indicators Outcome* level indicators are usually the most challenging to develop, but often the most important. Output level indicators are simpler as they aim to show that an output (activity) has been completed/achieved.
Page 77 Outcome indicators - definitions • Quantitative or qualitative variables, related to the objectives of a development intervention that provide reliable ways of assessing (indicating) whether progress has been made or change has taken place. • Quantitative and qualitative criteria that provide a simple and reliable means to measure achievement, to reflect the changes connected to an intervention or to help assess the performance of a development actor. • A piece of information that provides evidence of a change.
Page 78 Outcome indicators? 1. An increase in the number of people with access to clean water 2. Twenty five facilitators trained in advocacy work 3. A reduction by 40% in the level of diarrhoea in district ’x’ 4. Number of journalists/ lawyers/ public officials trained 5. Amount of press coverage, advocacy and other relevant international attention on freedom of expression 6. An increase in the % of young people with disabilities completing primary school 7. Ability of the media to avoid both court proceedings and self-censorship 8. Ability of citizens to demand their right to water from local government
Page 79 Clarifying indicators Evidence of a change (the indicator) is not the change itself. An indicator should always be aligned with an outcome (or impact statement) that defines what sort of change is being sought.
Page 80 Quantitative indicators
Page 81 Qualitative indicators
Page 82 Quantitative vs Qualitative indicators
Page 83 Neutral Indicators Outcome: Meet basic NFI/food and protection needs of vulnerable non-camp Syrian refugees • % reduction in income-expenditure gap for 2000 extremely vulnerable HH • % of course participants demonstrate increase in skills acquired relevant to the course subject • % of beneficiaries who attended livelihood counselling, courses or have received a business grant can generate additional income by the end of the project
Page 84 Examples of Formatting Indicators • Girls and boys in programme areas who report living free from violence, abuse and exploitation • Proportion of children completing one year of basic primary education • Perception of supported partners that they can develop their own proposals • Policy on grazing land exists
Page 85 Baselines, Milestones and Targets An obvious challenge with having a neutral indicator is that it will require relevant baseline, target and often ‘milestone’ information (if a multi-year project or programme). In the case of quantitative indicators these will appear as numbers. For qualitative indicators these will need to be written as statements describing the change.
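As an illustration of how baseline, milestone and target figures work together for a quantitative indicator, the sketch below (all names and figures are hypothetical, not a DRC tool) computes how much of the planned baseline-to-target change has been achieved at a given measurement point:

```python
# Hypothetical example: tracking a quantitative indicator against
# its baseline, annual milestones and end-of-project target.

def percent_of_target(baseline, current, target):
    """Share of the planned baseline-to-target change achieved so far."""
    planned_change = target - baseline
    if planned_change == 0:
        raise ValueError("Target must differ from baseline")
    return 100 * (current - baseline) / planned_change

# Indicator: % of school-aged girls attending school (illustrative figures)
baseline = 40                        # measured at project start
milestones = {2014: 50, 2015: 60}    # interim expectations per year
target = 70                          # end-of-project target

progress = percent_of_target(baseline, current=55, target=target)
print(f"{progress:.0f}% of planned change achieved")  # 50% of planned change achieved
```

For a qualitative indicator the milestone and target columns would instead hold short statements describing the expected change at each point.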
Page 86 Disaggregating data Indicators, especially quantitative ones, should be disaggregated where relevant. This is clearly critical when taking a Human Rights-Based Approach. To disaggregate data means to show more specific differences. Common target groups for disaggregation include: • Gender • Disability • Marginalised groups • People living with HIV/AIDS When should you consider whether you need to collect disaggregated information?
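To make disaggregation concrete, here is a minimal sketch (hypothetical respondents and field names, not a DRC dataset or tool) showing how one indicator, access to clean water, can be broken down by gender or disability status:

```python
# Hypothetical survey records: each respondent carries the attributes
# used for disaggregation plus whether they meet the indicator.
respondents = [
    {"gender": "female", "disability": True,  "has_access": True},
    {"gender": "female", "disability": False, "has_access": True},
    {"gender": "male",   "disability": False, "has_access": True},
    {"gender": "male",   "disability": True,  "has_access": False},
]

def disaggregate(records, attribute):
    """% of respondents meeting the indicator, broken down by one attribute."""
    groups = {}
    for r in records:
        groups.setdefault(r[attribute], []).append(r["has_access"])
    return {k: 100 * sum(v) / len(v) for k, v in groups.items()}

print(disaggregate(respondents, "gender"))      # {'female': 100.0, 'male': 50.0}
print(disaggregate(respondents, "disability"))  # {True: 50.0, False: 100.0}
```

The overall figure (75% with access) would hide the gap the disaggregated view reveals, which is exactly why disaggregation matters for a rights-based approach.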
Page 87 Discussion: Indicators • What type of indicators do you tend to use? • Do you find it easier to develop quantitative indicators than qualitative ones? • What are the challenges that you face?
Page 88 A Process for Developing Indicators 1. Decide on a format for your indicators (qualitative or quantitative) 2. Select your indicators 3. Refine your indicators (don’t have too many!) 4. Make your indicators as specific as you require (indicators, like objectives, need to be SMART)* 5. Operationalise your indicators – you need to use your indicators (they are not just decoration!) * Note: this is an issue if developing neutral indicators
Page 89 Developing and refining indicators What are the factors and practical considerations we need to consider when developing and refining indicators? Discuss
Page 90 Refining Your Indicators • Will you be able to collect information on your indicator? • If so, where will you get the information from? • Is it likely to be accurate (credible)? • How often will you have to collect it? • Does it require baseline information? If so, can you get it? • How much will it cost to get the information? • Do your staff have the capacity (or desire) to collect the information? • How far can you attribute the indicator to your efforts?
Page 91 Developing Indicators • Outcomes/objectives need to be clear • Brainstorm indicators (what evidence can be accessed?) • Refine indicators to select a few key ones • Be specific! • State: - who is responsible for collection - frequency of collection - where that information will come from
Page 92 Group practice Take the logframe you brought with you, or Exercise H, case study C in the training workbook: Emergency assistance to Cyclone Mahasen affected population of Barguna district of Bangladesh • Identify the problem the project aims to address, what it plans to do and what it wants to change (outcomes). • Develop a number of indicators for the project. Aim for three to four outcome level indicators (with at least two qualitative indicators) and if you have time suggest three to four output level indicators. • Don’t spend too much time refining the indicators.
Page 93 Feedback • How difficult did you find writing outcome indicators? Could you develop good outcome indicators? • Were you able to clearly differentiate between output and outcome indicators? • Were you able to develop at least two qualitative indicators for the project? • Did you struggle to keep the indicators at outcome level down to four?
Page 94 Sphere Monitoring Key Activities • Systematic, simple, timely and participatory mechanisms to monitor progress against standards, outputs and activities • Monitor overall agency performance • Monitor outcomes and, where feasible, early impact of the response • Systematic mechanisms to adapt programme response to monitoring data, changing need and context • Periodic reflection and learning exercises • Final evaluation • Joint, interagency and collaborative learning initiatives • Share key monitoring findings with the affected population, authorities and coordination groups
Page 95 Using standards in monitoring • Check against government standards and come to agreement – especially over indicators • Use as a basis for negotiating/advocating on assessments and monitoring • Sphere indicators are typically at activity, not outcome, level
Page 96 The Humanitarian Indicator Registry https://www.humanitarianresponse.info/applications/ir
Page 97 Monitoring and standards • How do you use Sphere in monitoring in emergencies? • What are the challenges?
Page 98 End of Day – Home Groups • What worked well today? • What did not work today? Why? • Any other comments on the day’s work, e.g.: o Timing o Training methods and approach o Language and communication o Participation o Energy levels and your mood o Training room and arrangements
Page 99 Session 5: Selecting and using PME tools
Page 100 Choosing Tools for MELMOPs
Page 101 Session 5: Objectives 1. Understand how to choose different PME tools to gather data, particularly for collecting outcome data 2. Understand and be able to use the principle of triangulation to plan appropriate data collection approaches
Page 102 Choosing tools to collect evidence When monitoring your project you will need to gather evidence that is more than anecdotal to support your case for change. • Which tools do you choose? • Which tools are most suited to gathering sensitive data? • Should you use more than one tool? If so, how do you do it? • How can you be sure that the data you collect is credible and reliable?
Page 103 Your experience with choosing tools Take the project whose Logframe you have brought to the training. List the tools you have used / are using to collect monitoring data. • How and why did you choose each of them? • Do you feel that they are the most appropriate? • What are the challenges in using these tools? • What do you feel are the main challenges in choosing tools for monitoring?
Page 104 Broad types of M&E tools
Page 105 When choosing tools consider… • Nature of the information needed (quantitative or qualitative; breadth or depth?) • Existing data • Skills of the people who will be collecting the information • Participants and stakeholders that will be involved (culture, sensitivity, access, capacities) • Resources (time and money) • Extent to which ‘proof’ is needed (type of evidence needed) • How to ensure triangulation • Which tools will best advance the project • Other…?
Page 106 Types of Data Qualitative data: Aims to convey ideas, opinions, perspectives, experiences, feelings and insights by those with a stake in the problem or issue. Quantitative data: Evidence that can be captured in the form of numbers. Aims to be objective, verifiable and measurable
Page 107 How qualitative & quantitative data complement each other
Qualitative to quantitative (from the specific to the general): your focus group discussion suggests that WASH committee members feel that more community members are now satisfied with WASH facilities in the village. This information can be used to formulate quantitative survey questions to test the FGD finding and find out how many community members are now satisfied/more satisfied with WASH facilities.
Quantitative to qualitative (from the general to the specific): you can see from the survey data that more parents now say (as compared to the baseline survey) that they are sending their girls to school. Qualitative questions asked in an FGD or a series of interviews can be used to explore why this is.
Page 108 Sources of evidence Primary sources give you original, first-hand information that has not been analysed or interpreted by anyone else. E.g. if you conduct interviews or a survey with those affected by a problem, your interview notes or the questionnaire responses are your primary evidence. Secondary sources give you information that has been analysed, edited, or commented on by someone else, e.g. a media article, an academic publication, a report on government performance.
Page 109 Observation is an important data collection tool and has been commonly used in long-term studies. There are broadly two main approaches to observation: Participatory: The observer interacts and engages with the action and actors they are observing. Non-participatory: The observer remains separate from the action/actors they are observing. What are the strengths and weaknesses of this tool?
Page 110 Observation tools Have you used any of the following tools? Transect walks Community observation (e.g. ‘Reality Checks’) What has been your experience?
Page 111 Interviews
Page 112 Types of interview
Page 113 Tips for interviews (esp. semi-structured / unstructured) • Leave plenty of time for each interview • Ask open-ended questions using the 6 helpers: WHAT, WHY, WHO, HOW, WHEN, WHERE? • Take time to prepare each interview • Explain the interview process to interviewees • Be sensitive to cultural considerations • Assure interviewees that their responses will be kept confidential • Don't accept the first answer: probe for more information and ask for explanations • Don’t ask questions that require two answers at the same time • Take detailed notes (with the help of a rapporteur if possible) • Training and practice are needed to yield ‘rich’ information See also M&E Training Course Guide
Page 114 Semi-structured interviews Practice and role plays: 1. Probing interviews a) To understand the effect of risk education delivered in communities OR b) To understand better the immediate needs of IDPs newly arrived in Kabul 2. Good and bad interview technique
Page 115 Surveys & Questionnaires • Often used with a random sampling approach to gather specific data on a number of questions. • Can be used to gain information from a large number of people. • Easier to apply and analyse when used for simple questions • Surveyors will have specific questions to ask respondents • Face-to-face, by telephone, or electronic
Page 116 What is your experience? In small groups discuss your experience of designing and carrying out questionnaires and surveys (and structured interviews). What are the challenges with questionnaires and surveys? What are things to remember or factors to consider when designing a questionnaire?
Page 117 Questionnaires and Surveys REMEMBER • It takes time to develop good questionnaires • Often the response rate is low • You may need to sample – how will you do this? • Surveyors need training to avoid bias TIPS • Use mainly closed questions with a list of possible responses • Keep the questionnaire short and simple • Avoid asking double-barrelled questions that ask about two concepts at the same time • Ask basic questions first; put difficult or contentious questions nearer the end • Always test your questionnaire
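The sampling question on the slide above (“You may need to sample – how will you do this?”) is often answered with Cochran’s sample-size formula plus a finite population correction. A minimal sketch in Python; the 95% confidence level, ±5% margin of error and p = 0.5 are illustrative defaults, not DRC standards:

```python
import math

def sample_size(population, margin_of_error=0.05, z=1.96, p=0.5):
    """Cochran's sample-size formula with finite population correction.

    z -- z-score for the confidence level (1.96 is roughly 95%)
    p -- expected proportion; 0.5 is the most conservative choice
    margin_of_error -- acceptable sampling error (0.05 means +/-5%)
    """
    n0 = (z ** 2) * p * (1 - p) / margin_of_error ** 2  # infinite-population size
    n = n0 / (1 + (n0 - 1) / population)                # finite population correction
    return math.ceil(n)

# e.g. a household survey in a village of 1,000 households,
# at 95% confidence with a +/-5% margin of error:
print(sample_size(1000))  # 278 households
```

In practice the result is usually inflated further to allow for non-response, and the sampling frame (household list, beneficiary register) matters as much as the arithmetic.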
Page 118 Interviews vs Questionnaires
Page 119 Focus Group Discussions • Help to gain knowledge about a particular topic from those directly affected • Can be used to collect information for many purposes • Provide in-depth analysis of an issue (or issues), in particular by posing ‘Why’ questions • Useful for gathering feedback from target groups and beneficiaries on DRC’s performance
Page 120 Focus group discussions REMEMBER • The findings cannot be generalised • Time and resources are needed to analyse data from FGDs • Requires experienced facilitators to ensure that everyone is included and nobody dominates the discussion • Think about dividing into sub-groups (male/female; children/adults; etc.) to ensure a full range of opinions and perspectives TIPS • Prepare a small number of questions in advance which seek explanations of processes or monitoring data you do not understand • Take extensive notes; use a trained rapporteur
Page 121 Focus group discussion Group practice: facilitation, recording data, and observation You will now have the opportunity to practice facilitating and participating in a FGD: To explore and analyse the challenges to M&E in DRC Afghanistan
Page 122 Participatory tools Mapping: Used to identify the location and types of resources used by a community, from the perspective of its inhabitants. Venn diagrams: Drawn to show the relationships between a community or household and the institutions that affect it. Pictures and stories: Useful when initiating discussions with people on complex issues such as perceptions of rights.
Page 123 Templates and reports A useful tool is the Citizens’ Report Card. This is used around the world for collecting feedback from service users on the quantity and quality of government services, for example water, police, health and education. It is used by WaterAid in Ethiopia with colour coding to denote satisfaction with the service. Have any of you used this type of tool before? What do you think would be the advantages and disadvantages of this tool in Afghanistan?
Page 124 After Action Reviews
Page 125 Challenges: Using different tools Skills are required to use tools properly and effectively. Choose tools that produce information which is useful, reliable, valid and of sufficient rigour. Participatory tools aim to be compatible with the way respondents think; they can help to capture the perspectives of marginalised groups (women, lower castes, people with disabilities, those living in remote areas). Tools need to be selected on the basis of their appropriateness in assessing progress towards objectives.
Page 126 Remote monitoring Monitoring in areas which cannot be regularly visited by staff in Kabul or field offices. Can result in: • Monitoring being carried out by staff with little training in collecting data • Resorting to Yes/No checklists and loss of qualitative data • Reliance on local shuras – low literacy levels and exclusion of women and vulnerable groups from monitoring • Lower quality and relevance of a humanitarian project • Less accountability • Lack of duty of care (increased risk) for local staff/partners What is your experience of remote monitoring? What strategies have you tried to overcome the challenges?
Page 127 Remote monitoring strategies • Use of pictorial tools (Community-Based Education by SMCs) • Regular telephone calls to heads of facilities and shuras to ask about delivery and quality (BPHS community health) • Use of mobile phone applications with simple quantitative and qualitative questions (Min. of Education, EMIS) • Training of locally recruited (district-level) staff in M&E and data collection techniques • Community-based remote monitoring teams – simple checklists and basic PRA (Save the Children, Uruzgan) • Local hiring and training of ‘3rd Party Monitors’ who undertake periodic (annual) monitoring exercises – checklists based on observation and structured interviews/questionnaires (SCA) • Photographs with GPS by locally based volunteers (many NGOs)
Page 128 Remote Monitoring – Questions to Ask • Is there an access problem? • Does the proposed action include acceptance-building measures? • Is it a direct ‘life-saving’ action? • Can the action be implemented without risking the lives of those implementing the work on the ground? • What is the source of the needs assessment in the remotely managed situation? • Are the staff adequately qualified? • Are monitoring arrangements adapted for remote management? • Does the action reach women, children and other vulnerable groups?
Page 129 Key points when deciding which tools to use The 5 Ws • What information are you looking for? Is it quantitative/qualitative? Is it sensitive? Is it about changes in knowledge, attitudes or practice? • Who has that information? Women, children, excluded or vulnerable groups? • Where is it? Is it easy to collect? • When – can it only be collected or gained at certain times? What timeframes/deadlines are you working towards? • Why are you using a certain tool? Is it the most appropriate and the most cost- and time-effective?
Page 130 From a Rights-Based Perspective ‘Enabling people to realise their rights to participate in and access information relating to the decision-making processes which affect their lives…’ (DFID) Participation throughout the project cycle should enable people to analyse and enhance their understanding of their rights and empower them to plan, act, monitor, evaluate and improve their situation What does this mean in practice?
Page 131 Example
Page 132 Example
Page 133 Case Study: Syrian Refugee Livelihood Programme In small groups read through the case study, Exercise I, p. 14 Questions: • Which tools would you propose to use for monitoring the effectiveness of the skills development courses? • Which tools would you propose to assess how the gender aspects of the project are addressed? • Which tools would you propose to assess the effectiveness of the work on intercommunal dialogue?
Page 134 Feedback and discussion • What tools did you choose and why? • How much support do you think would be needed to use the tools that you have chosen? • What do you think would be the challenges of using the tools you have proposed? • How do you think the data gathered using the tools you have proposed would be analysed, aggregated, compared and presented?
Page 135 Validity and Credibility Stakeholders may ask: • Is the data representative of the different stakeholders – especially marginalised groups? • Is it biased (either by the respondents or by the field staff)? • Are the conclusions it draws credible, or are there other explanations?
Page 136 Triangulation is the cross checking of information. It can be done in a number of ways: • Getting information from different sources – people and places • Using different techniques (tools) • Having different perspectives (team approach)
Page 137 Addressing bias and objectivity The way in which tools are applied and used is of critical importance. • Consider particular sources of potential error or bias that might exist • Look for specific ways to deal with them through the questions asked, seeking other potential sources of information • Seek out potential counterfactual explanations for change
Page 138 Discussion: Using triangulation Do you use the principle of triangulation in your work? Questions to consider: 1. How is triangulation used/applied in the project? 2. How does triangulation help to address the challenge of rigour and credibility of the data? 3. What challenges do you see in the way in which triangulation has been applied? 4. How could these challenges be addressed?
Page 139 Training wrap-up • Outstanding questions, clarifications • Further capacity-building needs • How will you use your new knowledge? • Training evaluation