Review: Alternative Approaches II
- What three approaches did we last cover? Describe one benefit of each approach.
- Which approach focuses on the marginalized?
- What were the five cautions the authors shared about the alternative approaches to evaluation?

Guidelines for Planning Evaluations: Clarifying the Evaluation Request and Responsibilities
Dr. Suzan Ayers, Western Michigan University (courtesy of Dr. Mary Schutten)

Individuals Who Affect or Are Affected by an Evaluation Study
- Sponsor: authorizes the evaluation and provides resources for its conduct
- Client: requests the evaluation
- Stakeholders: those who have a stake in the program or in the evaluation's results
- Audiences: individuals, groups, and agencies who have an interest in the evaluation and receive its results

Understanding Reasons for Initiating Evaluation
Understanding the purpose of the evaluation is an important first step:
- Did a problem prompt the evaluation?
- Did some stakeholder demand it?
- Who has the need to know?
- What does s/he want to know? Why?
- How will s/he use the results?

- It is not uncommon for clients to be uninformed about evaluation procedures and not to have given deep thought to the ramifications.
- Frequently, the purpose is not clear until the evaluator has carefully read the relevant materials, observed the evaluation object, and interviewed stakeholders.

Practical Application to YOUR Plan: Questions to Begin
1) Why is this evaluation being requested? What questions will it answer?
2) To what use will the evaluation findings be put? By whom? What others should receive the information?
3) What is to be evaluated? What does it include? Exclude? During what time period? In what settings? Who will participate?

4) What are the essential program activities? How do they link with the goals and objectives? What is the program theory?
5) How much time and money are available for the evaluation? Who can help with it? Is any information needed immediately?
6) What is the political climate and context surrounding the evaluation? Will any political factors and forces interfere with gaining meaningful and fair information?

Informational Uses of Evaluation
- Needs Assessment
  – Determine whether sufficient need exists to initiate a program and describe the target audience
  – Assist in program planning by identifying potential program models
- Monitoring/Process Study
  – Describe program implementation and whether changes from the initial model have occurred
- Outcomes Study
  – Examine whether certain goals are being achieved at desired levels
- Cost-Effectiveness Study
  – Judge overall program value and its cost-to-value ratio relative to competing programs

Noninformational Uses
- Postponement of a decision
- Ducking responsibility (the decision is already known, but the evaluation makes it look good)
- Public relations (justifying the program)
- Fulfilling grant requirements
- Covert, nefarious, or political uses of information
  – Typically more common in federal/national evaluations

Conditions Under Which Evaluation Studies Are Inappropriate
- Evaluation would produce trivial information
  – Low-impact program, one-time effort
- Evaluation results will not be used
  – Regardless of outcome, political appeal/public support…
- Evaluation cannot yield useful, valid information (bad information is worse than none)
  – Well-intentioned efforts, "mission impossible" evaluations
- Evaluation is premature for the stage of the program
  – A fitness program evaluation in its first 6 weeks will not yield meaningful information
  – Premature summative evaluations are the most insidious misuse of evaluation
- Motives of the evaluation are improper
  – Ethical considerations, "hatchet jobs" (propriety: an evaluation respects the rights and dignity of data sources and helps organizations address all clients' needs)

Determining Appropriateness
Use a tool called evaluability assessment:
- Clarify the intended program model or theory
- Examine program implementation to determine whether it matches the program model and could achieve the program goals
- Explore different evaluation approaches to match the needs of stakeholders
- Agree on evaluation priorities and intended uses of the study

Methods
- Create a working group to clarify the program model or theory, define information needs, and set evaluation expectations
  – Personal interviews with stakeholders
  – Reviews of existing program documentation
  – Site visits
- Figure 10.1 (p. 186): checklist to determine when to conduct an evaluation

Who Will Evaluate?
- External
  – Impartial, credible, expertise, fresh look
  – Participants may be more willing to reveal sensitive information to outsiders
  – More comfort presenting unpopular information, advocating changes, etc.
- Internal
  – Knowledge of the program, its history, context, etc.
  – Familiarity with stakeholders
  – Can serve as advocates for using the findings
  – Quick start-up
  – Known quantity

- Combination
  – Internal evaluator collects contextual information
  – Internal evaluator collects data
  – External evaluator directs data collection and organizes the report
  – Internal evaluator remains to advocate and support after the external evaluator is gone

Evaluator Qualifications/Skills
Does the evaluator…
- have the ability to use the methodologies and techniques needed in the study?
- have the ability to help articulate the appropriate focus for the study?
- have the management skills to carry out the study?
- maintain proper ethical standards?
- communicate results to audiences so that they will be used?