Evaluation Helpdesk 2014-2020: Quality assessment of Evaluation Plans
Evaluation Network Meeting, 5-6 November 2015

Evaluation Helpdesk 2014-2020: Introduction
• Quality assessment of evaluation plans (EPs) is one of the tasks of the Evaluation Helpdesk; the other tasks are:
  - to collect and summarise evaluations carried out over the 2014-20 period and to maintain a database of evaluations and evaluation plans (Task 1)
  - to manage peer reviews of selected evaluations (Task 2)
  - to organise training for Managing Authorities (MAs) on evaluation (Task 3)
  - to provide methodological support to MAs on evaluation and related issues (Task 4)
• Quality assessments are based on a structured approach developed by high-level evaluation experts during last year's 'Pilot' Evaluation Helpdesk
• The structured approach covers all key requirements set out in the Commission's Guidance Document on Evaluation Plans

Evaluation Helpdesk 2014-2020: Structured approach for quality assessment of evaluation plans
Six focus areas for assessing the quality of evaluation plans:
1. Management and planning: evaluation function, use of available evidence, time planning, quality management
2. Responsibility and coordination: partnership involvement, MA coordination, cross-MA coordination, budget
3. Design and methods: evaluation design, selection of designs and methods, results orientation, contribution to results
4. Data availability and data systems: data requirements, data availability, comprehensive data sets
5. Skills and expertise: evaluation independence, internal expertise, evaluation networks and providers, training and development
6. Use and communication: evaluation users, evaluation communication, analysis and comparison at EU level

Evaluation Helpdesk 2014-2020: Implementation of the structured approach and work carried out so far
• Development of a template with questions covering the six focus areas
• Assessments carried out by Applica, Ismeri Europa and a network of national experts
• Since the start in September 2015, over 50 evaluation plans have been reviewed (i.e. just over one a day on average)
• Breakdown by country: 14 French OPs; 8 German; 4 Slovak; 3 Romanian; 2 each for Croatia, the Czech Republic, Finland, Poland, Spain and the UK; 1 each in Italy, Malta, the Netherlands, Portugal, Slovenia and Sweden; plus 4 EPs for ETC programmes
• Breakdown by funding: 27 plans of OPs funded by the ERDF and the Cohesion Fund, 8 of ESF OPs and 16 of OPs with mixed funding

Evaluation Helpdesk 2014-2020: Main findings from the assessments carried out so far
The evaluation plans are relatively complete and coherent with regard to:
• The set-up of an operating evaluation function to plan, procure, coordinate and manage evaluations
• The division of responsibilities and coordination of MAs in terms of stated powers and obligations
• The involvement of relevant partners and stakeholders in defining the evaluation plan and implementing it
• The independence of evaluators and the units responsible for evaluations
• The identification of users of evaluation and the communication and dissemination of evaluation findings
• The budget set aside for the evaluations planned

Evaluation Helpdesk 2014-2020: Main findings from the assessments carried out so far (continued)
The evaluation plans are less complete and coherent with regard to:
• Quality management and the application of quality criteria to review the deliverables of the evaluation process
• The internal expertise available for evaluations and the involvement of external sources of expertise
• The training and development of the internal staff of the evaluation unit and the identification of training needs and sources of training
• The scheduling of evaluations and the feeding of evaluation findings into decision-making
• Coordination between MAs in terms of arrangements for exchanges on cross-cutting aspects of evaluation
• The planning of the operational requirements of evaluations to enable analysis and comparison at EU level

Evaluation Helpdesk 2014-2020: Main findings from the assessments carried out so far (continued)
But the main weaknesses common to many plans relate to:
• The limited use made of existing evidence from past evaluations and research to identify the main gaps in knowledge about the effects of the programmes and the measures supported
• Evaluation design, especially the failure to set out the key evaluation questions to be investigated and why
• The approaches or methods selected to address these evaluation questions and the rationale for their choice
• The identification of the data required to answer the evaluation questions in enough detail to be able to define data sources and check availability
• The assessment of the data available and the identification of possible gaps and deficiencies, including in data for non-recipients of funding where counterfactual analysis is planned
• The formulation of a plan to fill gaps in the data and correct deficiencies