STRATEGIC EVALUATION PLANNING: GUIDING FEDERAL GRANTEES THROUGH UNCHARTED WATERS
Carlyn Orians, Sheri Disler, Maureen Wilce, Leslie Fierro, Sarah Gill, Shyanika Rose, Joanne Abed, and Linda Winges
November 4, 2011

What I will discuss here…
• General overview of the asthma program
• Model of CDC support
• Evolution of evaluation requirements
• Current model of CDC support
The findings and conclusions in this presentation are the authors’ opinions and do not represent the views of the CDC or the Air Pollution and Respiratory Health Branch.

Overview of CDC’s National Asthma Control Program (NACP)
• Initially funded via Congressional mandate in 1999, with 3 grantees
• Gradually increased the number of grantees over the next decade to 36
• Funding three main program components:
  • Partnerships
  • Surveillance
  • Interventions
But what about evaluation?

But… what about evaluation?
Evolution of our funding requirements with regard to evaluation:
• 1999–2005: evaluate everything!
• 2006–2008: evaluate at least 1–2 activities in each of the 3 component areas
• Meanwhile… evaluation indicator development and stakeholder engagement

Preskill H and Boyle S (2008). A multidisciplinary model of evaluation capacity. American Journal of Evaluation, 29, 443–459.

New Ideas
• New funding announcement
• Opportunity to adjust evaluation expectations
• Who’s going to do it?

NACP Model for Technical Assistance
Pre-2009:
• Project officer (program manager or consultant)
• Epidemiologist

NACP Model for Technical Assistance
2009–present:
• Project officer (program manager or consultant)
• Epidemiologist
• Evaluation Technical Advisor (ETA)

State Staffing Requirements
Pre-2009, two staff:
• Full-time program manager
• Full-time epidemiologist
• Principal investigator (may be in-kind; % time not specified)

State Staffing Requirements
2009–present:
• Full-time program manager
• Full-time epidemiologist
• 50%-time evaluator
• Principal investigator (may be in-kind; % time not specified)

Evaluator Staffing Requirement
Employed how?
• Employed in the state health department
• Shared between programs
• Consultant
Experience/background:
• Researcher
• Epidemiologist
• Evaluator (less common)

Strategic Evaluation Planning
The burning questions I’ll address today…
1. What is a strategic evaluation plan?
2. Why is it a good idea?
3. What guidance is available?
4. What is the process to create one?
5. At the end, what do we hope to get?

What is a strategic evaluation plan?
• The “grand plan” or vision for evaluation over the life of a program or funding cycle
• A living document that provides an informed and logical sequence of evaluations
• A “portfolio” of evaluations that collectively provide the information you need to meet program improvement and accountability goals

Why is it a good idea to have one?
A strategic evaluation plan can help you make sure that…
• Evaluation resources are used effectively and wisely
• All core program components are covered
• Evaluation is proactive rather than reactive
• Evaluations yield high-quality findings

Strategic planning is central to sustaining evaluation capacity
Preskill H and Boyle S (2008). A multidisciplinary model of evaluation capacity. American Journal of Evaluation, 29, 443–459.

What guidance is available?
• A “how to” manual was developed by the CDC National Asthma Control Program
• The Guide leads states through the process of developing a strategic evaluation plan
• The CDC–Battelle working group contributing to this effort received the 2009 CDC/ATSDR Honor Award for “Excellence in Program or Policy Evaluation”
Available at: http://www.cdc.gov/asthma/program_eval/guide.htm

What is the process to create one?
• So you’ve asked us to create a strategic plan
• How do we get there?

CDC Evaluation Framework
• The CDC Evaluation Framework is designed to support a single evaluation
• We have adapted the steps to support strategic evaluation planning

An adaptation of the 6-step framework
To support strategic evaluation planning…
Step A: Establish a strategic evaluation planning team
Step B: Describe the program
Step C: Prioritize program activities for evaluation
Step D: Consider evaluation design elements
Step E: Develop a cross-evaluation strategy
Step F: Promote use through communication
Step G: Write and revise your strategic evaluation plan

PROCESS AND PRODUCT
Process (each step feeding the next):
• Establish evaluation planning team
• Develop a description of the program → list of activities/initiatives
• Prioritize program activities for evaluation (generate prioritization criteria; apply prioritization criteria/process) → initial list of priority evaluation candidates
• Consider evaluation design elements → priority evaluation candidates with preliminary designs
• Develop a cross-evaluation strategy → final list of evaluation candidates reviewed for data collection efficiencies, cross-evaluation timeline, resources, and capacity
• Develop a communications plan
Product: the Strategic Evaluation Plan
• Background & Purpose
• Methods Used to Develop and Update the Plan
• Proposed Priority Evaluations
• Communication Plan

Key points on Step A: stakeholders
• Your evaluation planning team should include stakeholders who have the big picture of your program
• They should be champions for evaluation and be committed to annual reviews and revisions

Key points on Step B: description
• Your program description should be comprehensive
• For complex programs, program activity profiles are a method for capturing the full range of activities that make up your program and that could be the focus of evaluation

State Asthma Program Impact Model
State Asthma Program Activities:
• Maintain & enhance statewide asthma surveillance activities
• Build, maintain, & enhance statewide asthma partnerships
• Identify, prioritize, & implement interventions to decrease disparities & reduce state and national asthma burden
• Coordinate statewide asthma activities through creation, implementation, & revision of a statewide asthma plan
• Evaluate state asthma program & modify statewide plans & activities based upon findings
• Share findings from surveillance & evaluation efforts
Short-Term Outcomes:
• Increased awareness of asthma burden, disparities, statewide asthma efforts, & ability to manage asthma
• Improved knowledge & understanding of appropriate asthma management practices & effective public health strategies related to asthma management
• Improved attitudes toward asthma management practices & statewide asthma efforts
• Improved skills in asthma management and partnership functioning
• Increased coordination of asthma-related efforts across the state
• New or strengthened relationships & networks
Intermediate Outcomes:
• Improved asthma management behaviors of individuals who have asthma & their families, and of those who provide services in settings where persons with asthma live, work, & receive medical care
• Reduced production of & exposure to triggers
• Improved medical management of asthma
• Public & organizational policies supportive of asthma management practices proposed and adopted
• Increased funding to support asthma activities
• Improved use of available resources
• Improved infrastructure & public health practice
Long-Term Outcomes:
• Asthma mortality decreased
• Asthma disparities decreased
• Asthma symptoms & morbidity decreased
• Improved productivity & quality of life
• Statewide asthma efforts sustained & improved

Program Activity Profile
• Program Component: (choose one – Surveillance, Partnerships, Interventions)
• Title of Activity: (title of activity)
• Description of Activity: (describe the activity)
• Duration of Activity: (start and end date, or ongoing)
• Partner Involvement: (describe whether partners are involved in the activity and, if so, specify major partners and their roles)
• Cost of Activity: (provide a rough or “ballpark” estimate of what the activity costs overall or annually, including funds from all sources; specify what portion, if any, comes from partner contributions)
• Contribution to Intended Program Outcomes: (describe what results or “outcomes” you expect to see based on conducting this activity)
• Known Challenges in Conducting the Activity: (list any known challenges in conducting the activity)
• Prior Evaluation: (list any prior evaluations conducted of this activity)
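For programs juggling dozens of activities, the profile above maps naturally onto a simple record type, so profiles can be filtered and ranked later (Step C). The sketch below is not from the CDC guide; it is a minimal illustration in Python, with hypothetical field names and an invented example activity.

```python
from dataclasses import dataclass, field

# Hypothetical record mirroring the Program Activity Profile template above.
# Field names are illustrative, not taken from the CDC guide. (Python 3.9+)
@dataclass
class ActivityProfile:
    component: str            # "Surveillance", "Partnerships", or "Interventions"
    title: str
    description: str
    duration: str             # start/end dates, or "ongoing"
    partner_involvement: str  # major partners and their roles, if any
    est_annual_cost: float    # rough "ballpark" estimate, all funding sources
    intended_outcomes: str    # results expected from conducting this activity
    known_challenges: list[str] = field(default_factory=list)
    prior_evaluations: list[str] = field(default_factory=list)

# Invented example, purely for illustration.
profiles = [
    ActivityProfile(
        component="Interventions",
        title="Tools for Schools rollout",
        description="Implement indoor air quality kits in participating schools",
        duration="ongoing",
        partner_involvement="EPA regional office provides training",
        est_annual_cost=45_000.0,
        intended_outcomes="Reduced exposure to asthma triggers in schools",
        known_challenges=["School staff turnover"],
    ),
]
```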

Key points on Step C: prioritization
• Develop clear prioritization criteria
• Apply the criteria to your program activity profiles
• Generate a rank-ordered list of evaluation candidates

Potential Criteria for Evaluation Prioritization (criterion: information required for prioritization)
• Cost: What financial resources have we invested in this activity?
• Labor/time intensive: How much staff time have we invested in this activity?
• Newness: How new is this activity?
• Prior evaluation: Have we evaluated this activity before?
• Maturity: What is the stage of development or implementation for this activity?
• Stakeholder interest: How interested are our stakeholders in this activity?
• Sustainability: How much does this activity contribute to the sustainability of the state asthma program?
• Centrality: How connected is this activity to our asthma partners across the state?
• Plan alignment: How closely aligned is this activity with our state asthma plan?
• Plausible outcomes: Can this activity reasonably be expected to lead to relevant outcomes?
• Disparities: Will this activity reduce asthma disparities?
• Focus: Does this activity affect those most burdened by asthma?
• Reach: How many people in our state are (or could be) affected by this activity?
• Challenges: Are we (or do we anticipate) struggling with this activity?
• Pilot: Do we plan to expand this activity?
• Information need: How critical is the evaluation information for making near-term decisions?
• Quality improvement: Would evaluating this activity likely result in recommendations for quality improvement?
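One concrete, purely illustrative way to apply criteria like these in Step C: rate each activity against a team-chosen subset of criteria, weight the ratings, and sort. Nothing below is prescribed by the guide; the criteria subset, weights, 1–5 ratings, and activity names are all hypothetical.

```python
# Hypothetical weighted-scoring pass over activity profiles.
# A real team would set criteria and weights with stakeholders (Step A)
# and document the choices in the plan.
CRITERIA_WEIGHTS = {
    "cost": 2,                 # larger investment -> higher evaluation priority
    "stakeholder_interest": 3,
    "disparities": 3,
    "information_need": 2,
}

# Each activity gets a 1-5 rating per criterion (from team discussion).
ratings = {
    "Tools for Schools rollout": {"cost": 4, "stakeholder_interest": 5,
                                  "disparities": 4, "information_need": 3},
    "Provider guideline training": {"cost": 3, "stakeholder_interest": 4,
                                    "disparities": 2, "information_need": 5},
    "Annual burden report": {"cost": 2, "stakeholder_interest": 3,
                             "disparities": 3, "information_need": 2},
}

def priority_score(activity_ratings: dict) -> int:
    """Weighted sum of criterion ratings for one activity."""
    return sum(CRITERIA_WEIGHTS[c] * r for c, r in activity_ratings.items())

# Rank-ordered evaluation candidates (the Step C output).
ranked = sorted(ratings, key=lambda a: priority_score(ratings[a]), reverse=True)
for rank, activity in enumerate(ranked, start=1):
    print(f"{rank}. {activity} (score {priority_score(ratings[activity])})")
```

Simple additive scoring keeps the team discussion, not the arithmetic, at the center; the numbers only formalize a ranking the team can still override.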

Key points on Step D: design elements
• Generate evaluation questions for each priority evaluation candidate
• Sketch out design options
• Estimate resource requirements

Example Evaluation Design and Data Collection Template (partially completed)
Surveillance
• Question: What measures have we taken to identify gaps in our asthma surveillance data over the past 2 years? Are these activities sufficient?
• Possible evaluation design(s): Case study (using multiple methods of data collection)
• Potential data sources: Surveillance workplans; asthma epidemiologists; surveillance data users
• Possible data collection methods: Document review
• Data collection begins: Year 3; final results due: middle of Year 4
• Resources required: Modest
Partnerships
• Question: To what extent are resources leveraged between state agencies or CDC-funded programs to support the asthma program or to accomplish the state asthma plan goals?
• Possible evaluation design(s): Case study
• Potential data sources: State asthma program budgets; partners
• Possible data collection methods: Document review (budgets from grants); partner survey; key informant interviews
• Data collection begins: Year 2
• Resources required: Modest
Interventions
• Question: How well does the electronic system function?
• Possible evaluation design(s): Case study (multiple data collection methods)
• Potential data sources: Purposive sample of users for interviews; all users for online survey
• Possible data collection methods: Semi-structured and open-ended interviews; online survey; on-site observations
• Data collection begins: Year 2
• Resources required: Low to modest

Key points on Step E: cross-evaluation strategy
• Look for areas of potential overlap or leverage
• Consider resources and skills required for each activity
• Consider sequence and timing of evaluations

Issues to Consider When Looking Across Proposed Evaluation Strategies
• Evaluation design (what evaluation designs are proposed?): Is the same evaluation design proposed to answer multiple evaluation questions?
• Data collection, target audience (from whom is information being collected?): If several data collection strategies have the same target audience, can you collect information for more than one purpose using a single data collection tool? Are data collection activities concentrated too heavily on one target audience? Can burden be shared more equitably?
• Data collection, timeline (when is information being collected?): How can evaluation data collection needs be integrated into the program timeline? For example, if baseline data need to be collected, program activities may need to be delayed. If information on different evaluation activities needs to be collected at the same time, do you have the resources to conduct multiple evaluation activities simultaneously?
• Data collection, source (from where is information being collected?): Can the same data source be used for multiple evaluation activities? Can a single source be modified or enhanced to support your strategies for the future?
• Who (who will conduct the evaluation activity?): Do you have the personnel and resources to conduct the evaluation strategies you prioritized? Do they have the necessary skills and expertise, or how could they get these skills? Can you leverage additional evaluation assistance from partners?
• How, analysis (how will the information from the evaluation be analyzed?): Who will do the analysis? Do they have the necessary skills and expertise, or how could they get these skills? Can you leverage additional analytic capability from partners?
• How, use (how will the information from the evaluation likely be used?): Will the information be provided in a timely manner? Is there information from another evaluation strategy area that would be helpful to have in order to interpret the findings? Are there capacity-building activities that need to be conducted with intended users to increase the likelihood that results will be used?
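For the "same target audience, single data collection tool" and burden questions above, a planner could mechanically group proposed data collections by audience and timing to flag candidates for a combined instrument. The sketch below is a hypothetical aid, not part of the guide; the evaluations, audiences, and methods are invented.

```python
from collections import defaultdict

# Hypothetical planned data collections drawn from Step D designs:
# (evaluation, target_audience, method, year)
collections = [
    ("Surveillance gaps", "asthma epidemiologists", "interviews", 3),
    ("Resource leveraging", "partners", "survey", 2),
    ("Electronic system", "system users", "online survey", 2),
    ("Resource leveraging", "partners", "key informant interviews", 2),
]

# Group by (audience, year): two collections in the same cell may be
# candidates for a single combined instrument, or a burden concern.
by_audience_year = defaultdict(list)
for evaluation, audience, method, year in collections:
    by_audience_year[(audience, year)].append((evaluation, method))

for (audience, year), items in sorted(by_audience_year.items()):
    if len(items) > 1:
        print(f"Year {year}, audience '{audience}': consider combining ->")
        for evaluation, method in items:
            print(f"  - {evaluation}: {method}")
```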

Key points on Step F: promote use
• Consider communication of evaluation plans, progress, and results across the program life span
• Develop a comprehensive communication plan

Communication Template for Audience 1 (e.g., Evaluation Planning Team): purpose (format, timing; notes)
• Inform about specific upcoming evaluation planning activities (email, bi-weekly)
• Keep informed about progress of developing the strategic evaluation plan (email, monthly; consider receiving general formative feedback on the process to date)
• Present complete/final strategic evaluation plan (PowerPoint presentation, end-of-year meeting)
• Notify of need to update strategic evaluation plan (email, as need arises)
• Share revisions made to strategic evaluation plan (email, quarterly; team will already be aware of this)
• Provide general update on status of evaluations as proposed in the strategic evaluation plan (informal presentations, bi-monthly meetings)
• Document & share synthesis of findings & lessons learned during the cooperative agreement lifecycle (final report, formal presentation, and working sessions at end of cooperative agreement; use working sessions to generate ideas for specific use of findings in future plans focused on asthma)
Adapted from Russ-Eft D and Preskill H, Evaluation in Organizations: A Systematic Approach to Enhancing Learning, Performance, and Change. New York, NY: Basic Books, 2001; pp. 354–357.

Key points on Step G: living document
• Develop a comprehensive evaluation strategy document
• Revisit the plan with stakeholders at least annually

At the end, what do we hope to get?
• A high-level strategy document rather than a detailed work plan
• A proposal for multiple related evaluations rather than a single evaluation
• A comprehensive strategic approach to evaluation that will support program improvement and accountability

Arriving at a better place
Our goal is to help program personnel navigate the waters and arrive safely on shore, with increased evaluation capacity and better programs!

SUCCESSES AND STRUGGLES IN CREATING QUALITY STRATEGIC EVALUATION PLANS

Ideas flow together

Gathering data
• 36 states submitted strategic evaluation plans
• Evaluation Technical Advisors reviewed them using a standard form
• An iterative review process ensued
• A subset of the review forms was entered into SurveyMonkey for this analysis

Criteria for “strategic”
• Purposeful
• Sequenced
• Comprehensive
• Inclusive
• Logical
• Coherent
• Flexible
• Designed to build capacity

Purposeful
92% of plans included a purpose statement. Stated purposes (chart: percent of plans citing each): improve program, assess outcomes, gain insight, accountability, document success, build capacity.

Sequenced
• 92% of plans showed a sequence
• Most used a timeline format
• Many included annual surveys
• Some allowed for program growth

Comprehensive
Plans were required to include at least 3 evaluations (partnership, surveillance, and intervention). Plans considered an average of 13 possible evaluations (range 4–140) and prioritized an average of 9 evaluations (range 3–17).

Inclusive
Stakeholder engagement across the 36 plans (chart): extensive, 14; some, 12; token, 8; none, 2.

Logical
92% of the plans included at least one logic model. Example: Overarching Model of Asthma Program, 5-year Strategic Plan (Alabama Asthma Program).
Inputs:
• Alabama Asthma Program & CDC Cooperative Agreement
• Asthma Coalition: individual members & strategic partners
• Partners: Medicaid, EPA, Cooperative Extension Agency, Hospital Assn., School Nurses Assn., Medical Assn., Academy of Ped., Pharm. Assn., and other partners
Activities:
• Establish surveillance system, collect data, establish baselines, analyze data
• Survey healthcare providers; offer EPR-3 training and asthma educator certification classes; pharmacy education with instructional stickers on medications
• Implement Tools for Schools in 15–30 schools per year; offer mini-grants to schools; education programs through 4-H & FFA
• Work with EPA and IAQ to assess air quality in schools; education programs in schools and with 4-H & FFA; Flag program in schools
• Implement AQ Flag program in 10–30 schools per year in areas where AQ data are available
Outputs (participation):
• Comprehensive annual burden reports (2008 Burden Report); disparate populations mapped
• Number of providers using current guidelines; additional trained providers; number of certified asthma educators; number of pharmacies participating; number of stickers given out
• Expand programs to additional schools each year
Outcomes:
• Short-term: Data reported to programs & decision-makers to inform programs & policies; on-line Resource Directory for asthma care continually updated; increased education on asthma burden, air quality, and triggers
• Medium-term: Permanent asthma surveillance within ADPH with data available on website; quality of asthma care and self-management increased; medication adherence increased; statewide adoption of Tools for Schools and the AQ Flag Program
• Long-term: Healthcare utilization due to asthma reduced; asthma disparities reduced; quality of life with asthma improved
Assumptions: EPR-3 guidelines are the gold standard; data sources can and will provide valid data; people with asthma want the intervention; environmental assessments will show air quality problems.
External factors: Political factors and state/community policies can have an impact; current economic factors have unknown effects on air quality and asthma.

Flexible
83% of plans articulated a schedule for review and revision of the plan: annually, 64%; twice a year, 19%; as needed, 14%; quarterly, 3%.

Coherent
All but one plan explained how and why they prioritized evaluations. Common criteria included:
• Sustainability
• Resource investment
• Stakeholder interest
• Need for information
• Likely use of findings
• Need to address disparities

Designed to build capacity
78% of plans included some strategies to increase evaluation capacity (chart: percent of plans citing each): stakeholder training, evaluator training, specific training, coaching partners, acquiring expertise, acquiring technology.

What worked?
• Provided time for reflective thought
• Enhanced overall strategic planning efforts
• Integrated evaluator into program structure
• Changed standard operating procedures
• Increased transparency
• Engaged partners effectively

Lessons learned
• Template was important, but not too prescriptive
• Strategic evaluation planning process takes time
• Defining and assessing “strategy” is challenging
• Hard to articulate evaluation capacity-building strategies
• Volunteering can lead you in interesting directions!

Thank you!

GOING WITH THE FLOW: NACP STATE PARTNER EXPERIENCES WITH STRATEGIC EVALUATION PLANNING

Gathering State Feedback
• NACP evaluation meeting
  • One-day follow-on meeting to the 2011 AEA/CDC Summer Institute
  • Facilitated roundtable discussions of successes and challenges
  • 33 participants from states (evaluators and program managers); 6 CDC staff (evaluation technical advisors and project officers)
• Quarterly conference call
  • 29 participants from states, 7 from CDC

Challenges
• Stakeholder burnout
• Overly ambitious plans
• Concern about findings being used
• Difficulty in using existing data
• Tailoring evaluator role to program needs
• Too great a demand for evaluator time and skills

Challenges in ETA Role
• Nature of a cooperative agreement
• Difficulty in providing guidance on crafting good evaluation questions
• Need to orient to a stakeholder-driven utilization focus, away from a monitoring orientation
• Difficulty in supporting the development of good facilitation skills
• Problem of “planning fatigue”

Additional Uses for the SEP
• Designed as a promotional document, both internal and external to the state health department (SHD)
• Used the process to educate higher-ups about the program
• Used the process to reinvigorate partnerships or coalitions

Words of Wisdom from States
• “Give up your dream for theirs.”
• Take small steps: start slow, set reasonable or low expectations to build confidence.
• Get evaluation on everyone’s mind: present at every opportunity.
• Logic models really do help.
• Monitor the strategic evaluation plan as you would a budget.

Words of Wisdom from States
“Ultimately, being persistent and having a healthy dose of patience also has been helpful in fostering success. Along these lines, recognizing and embracing coincidence, serendipity, and unexpected opportunities has also been beneficial.”

Questions?
Orians@Battelle.org
Sdisler@CDC.gov
Mwilce@CDC.gov
Sgill@CDC.gov