Evaluation design and implementation
Puja Myles
Puja.myles@nottingham.ac.uk

Session outline
• Evaluation frameworks
• CDC framework for evaluation
• Theory of change and logic models
• RE-AIM framework
• Maxwell's quality assessment framework
• Practical exercise: Using a logframe matrix and decision models for evaluation planning/design

What is an evaluation framework?
[Diagram: a framework shown as a sequence of steps (Steps 1-4) leading to deciding and measuring health outcomes]

CDC framework for evaluation
Step 1: Engage stakeholders
Step 2: Describe the program
Step 3: Focus the evaluation design
Step 4: Gather credible evidence
Step 5: Justify conclusions
Step 6: Ensure use and share lessons learned

Step 1: Engage stakeholders
Key stakeholders:
• People involved in programme operations (funders, managers, administrators)
• People served or affected by the programme (clients, family members, elected officials, sceptics)
• Primary users of the evaluation (a subset of all the stakeholders identified; these are the people who can act on findings and bring about change)

Role of stakeholders
• Clarify the programme objectives
• Help you elucidate the underpinning theory of change
• Help design and carry out the evaluation
• Help frame recommendations for practice based on findings
• Initiate change/act on recommendations, i.e. ensure that the evaluation is meaningful

Step 2: Describing the programme (1)
• Mission and objectives of the programme
• The problems addressed by the programme (nature and magnitude of the problem; populations affected)
• How the programme intends to address the problem (theory of change)
• Expected effects of the programme

Step 2: Describing the programme (2)
• Activities
• Resources
• Context (setting and environmental influences, e.g. political/historical/social)
• Logic model

Theory of change
• This approach sets out the series of outcomes expected to unfold as a result of the various components of the intervention, as a basis for planning the evaluation strategy.
• Can be visualised as a sequential 'if-then' process: if the intervention component is delivered, then the first outcome follows; if that outcome occurs, then the next one follows, and so on.
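As a rough illustration of the 'if-then' idea, such a chain can simply be written down as ordered links. A minimal sketch follows; the intervention (brief smoking-cessation advice) and all outcomes are hypothetical examples, not taken from the slides:

```python
# Illustrative sketch only: a theory-of-change chain as ordered "if-then"
# links. The intervention and outcomes below are hypothetical examples.
chain = [
    ("clinic staff are trained in brief quit advice",
     "more patients receive quit advice"),
    ("more patients receive quit advice",
     "more patients attempt to quit smoking"),
    ("more patients attempt to quit smoking",
     "smoking prevalence falls"),
]

# Print the chain as a sequence of if-then statements.
for condition, outcome in chain:
    print(f"IF {condition} THEN {outcome}")
```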

Logic model/logframe matrix
• A practical approach to understanding the theory of change for a given intervention
• Can be used with stakeholders

An example logframe matrix

| Narrative summary | Verifiable indicators | Means of verification | Assumptions |
|---|---|---|---|
| Goal (Why are we doing this?) | | | |
| Purpose (What will we achieve?) | | | |
| Outputs (What immediate outcomes will we achieve?) | | | |
| Activities (What will we do?) | | | |
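For the practical exercise, it can help to see the matrix filled in. A minimal sketch of one way to capture a logframe as a data structure follows; the programme (a hypothetical childhood immunisation campaign) and every entry are invented for illustration:

```python
# Illustrative sketch only: a logframe matrix as a dictionary of rows.
# The programme and all cell entries are hypothetical examples.
logframe = {
    "Goal": {
        "narrative_summary": "Reduce measles incidence in the district",
        "verifiable_indicators": "Measles cases per 100,000 per year",
        "means_of_verification": "Routine surveillance reports",
        "assumptions": "Surveillance coverage remains stable",
    },
    "Purpose": {
        "narrative_summary": "Raise MMR coverage among under-2s to 95%",
        "verifiable_indicators": "% of children fully vaccinated by age 2",
        "means_of_verification": "Immunisation registers",
        "assumptions": "Vaccine supply is uninterrupted",
    },
    "Outputs": {
        "narrative_summary": "Outreach clinics running in all wards",
        "verifiable_indicators": "Number of clinics held per month",
        "means_of_verification": "Clinic activity logs",
        "assumptions": "Staff remain in post",
    },
    "Activities": {
        "narrative_summary": "Train staff; schedule and publicise clinics",
        "verifiable_indicators": "Staff trained; sessions advertised",
        "means_of_verification": "Training and publicity records",
        "assumptions": "Funding is released on time",
    },
}

# Walk the matrix row by row, e.g. to check no cell was left blank
# when planning the evaluation with stakeholders.
for row, cells in logframe.items():
    missing = [column for column, value in cells.items() if not value]
    print(row, "- complete" if not missing else f"- missing: {missing}")
```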

Step 3: Focusing the evaluation design
Things to consider:
• Purpose of evaluation (feasibility, effectiveness, change, empowerment, sponsor requirement)
• Evaluation questions (merit, cost-effectiveness, equity, quality)
• Feasibility
• Ethics

Study designs
Ovretveit (1998) outlined six basic evaluation designs:
• Descriptive
• Audit
• Outcome (the before-after comparison; quasi-experimental design)
• Comparative experimental
• Randomised controlled experimental
• Intervention to a service (impact on providers and patients)

CDC framework: Steps 4-6
Step 4: Gather credible evidence (what outcomes will you measure, and how?)
Step 5: Justify conclusions (attribution versus contribution; alternative explanations such as bias, chance and confounding)
Step 6: Ensure use and share lessons learned (stakeholder involvement; participatory approaches)

RE-AIM framework for measuring public health impact
Glasgow et al. (1999):
• Reach (uptake; who benefits; who is left out)
• Efficacy (include behaviour outcomes and participant-centred quality-of-life measures; consider both positive and negative outcomes)
• Adoption (proportion and representativeness of settings): use direct observation, interviews, surveys
• Implementation (the extent to which a programme is delivered as intended); audit
• Maintenance: long-term maintenance of behaviour change (both clients and service providers)
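Several RE-AIM dimensions reduce to simple proportions. A minimal sketch follows, computed from invented counts; a real evaluation would draw these numbers from routine data, surveys or audit:

```python
# Illustrative sketch only: RE-AIM-style summary proportions from
# hypothetical counts. All figures below are invented examples.
eligible_individuals = 2000         # hypothetical target population
participants = 640                  # hypothetical programme uptake
settings_approached = 25            # hypothetical clinics invited to take part
settings_adopting = 15              # hypothetical clinics delivering it
sessions_planned = 8                # per protocol
sessions_delivered_as_intended = 6  # from audit

reach = participants / eligible_individuals
adoption = settings_adopting / settings_approached
implementation = sessions_delivered_as_intended / sessions_planned

print(f"Reach: {reach:.0%}")                    # uptake among those eligible
print(f"Adoption: {adoption:.0%}")              # proportion of settings on board
print(f"Implementation: {implementation:.0%}")  # fidelity to protocol
```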

Assessing quality
Maxwell's dimensions of health care quality:
• Access to services
• Relevance to need (for the whole community)
• Effectiveness (for individual patients)
• Equity (fairness)
• Social acceptability
• Efficiency and economy