Improving the Toxic Substance Risk Assessment / Risk Management Process
CEPA ICG Meeting
Peter Forristal
May 17, 2006
Recommendation
Use a performance measurement tool to identify the priority areas for improvement of the Toxic Substance Risk Assessment / Risk Management (RA/RM) process.
Vision
The quality of all toxic substance risk assessments meets or exceeds expectations.
Goals and Objectives
• Increase the effectiveness of CEPA in addressing significant risks to health and the environment.
• Improve the application of Federal Government frameworks and policies.
• Restore stakeholder confidence in the toxic substance RA/RM process.
• Provide focus for future Industry advocacy work.
Today’s Situation
• Industry is concerned that the RA/RM process is inconsistent due to systemic problems:
  – RA: inconsistent information gathering (hazard and risk), e.g. PFOS, PBDE, PFOA, and the window allowed for data gathering
  – RA/RM linkage: the context of the risk needs to be clearer
  – Varying application of precaution:
    • uncertainty factors for persistence (P) and bioaccumulation (B) (PFOS, PBDE)
    • use of hazard and exposure data with significant limitations (CP)
  – Variations in peer review
• There will be more Screening Level Risk Assessments requiring specific review and advocacy.
• Government resources are stretched.
• Different industry stakeholders are involved in each risk assessment.
RA/RM Performance Measures Initiative Update
• September 2005: Industry proposed broad stakeholder involvement to identify performance measures for the RA/RM process. A common set of performance measures would:
  – provide a clear set of expectations for all stakeholders;
  – identify common issues facing all risk assessments;
  – provide a compelling argument for improvement.
• December 2005: EC/HC identified Q2 2006 plans to develop a Quality Management System for the RA/RM process.
• May 2006: Industry has prepared an initial list of performance measures for development.
Performance Measurement
• Performance requirements have been drawn from existing Cabinet-approved documents:
  – Framework for Science and Technology Advice
  – Framework for the Application of Precaution in Risk-based Decision-making
• Proposed performance measures:
  – Gap Identification
  – Inclusiveness
  – Peer Review
  – Quality Assurance
  – Use of Science
  – Information Accuracy
  – Uncertainty Identification
  – Stakeholder Consultation
  – Risk Characterization
  – Reconsideration
• These measures will be quantified, where possible, according to the degree to which they meet the expectations of the government frameworks, policies and procedures (illustrated in the sketch below).
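To make the quantification concrete, here is a minimal sketch (hypothetical; Python is used purely for illustration, and all names are the editor's own, not part of the initiative) of how the three-level ratings used throughout this deck could be recorded and combined for a single risk assessment:

```python
from enum import IntEnum

class Rating(IntEnum):
    """The three-level scale applied to each performance measure."""
    BELOW_EXPECTATIONS = 0
    MEETS_EXPECTATIONS = 1
    EXCEEDS_EXPECTATIONS = 2

# The ten proposed measures, grouped by the framework principle
# they are drawn from (see the background slides).
MEASURES = [
    "Gap Identification",          # Principle I   - Early Issue Identification
    "Inclusiveness",               # Principle II  - Inclusiveness
    "Peer Review",                 # Principle III - Sound Science
    "Quality Assurance",           # Principle III
    "Use of Science",              # Principle III
    "Information Accuracy",        # Principle III
    "Uncertainty Identification",  # Principle IV  - Uncertainty and Risk
    "Stakeholder Consultation",    # Principle V   - Transparency and Openness
    "Risk Characterization",       # Principle V
    "Reconsideration",             # Principle VI  - Review
]

def score_assessment(ratings: dict[str, Rating]) -> float:
    """Average rating across the ten measures (0 = below, 2 = exceeds)."""
    return sum(ratings[m] for m in MEASURES) / len(MEASURES)
```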
A Sample Evaluation of the Risk Assessment Process
[Chart: the proposed performance measures (“Measure” axis) rated against a sample risk assessment (“Risk Assessment” axis)]
Next Steps
• Develop a clear set of expectations for each proposed performance measure.
  – Proposing a workshop before summer.
  – Encourage broad stakeholder participation (Industry, Government, ENGOs?).
• Use the measures and expectations to evaluate and provide feedback on current screening level risk assessments.
• Modify the measures based on feedback and changes in expectations.
• After the measures have been used for more than 10 risk assessments, analyze the results for priority improvement opportunities (see the sketch after this list).
• Share improvement areas with the Substance Management Group.
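As an illustration of that analysis step, a minimal sketch (again hypothetical, reusing the Rating scale from the earlier sketch) that tallies ratings across completed assessments and ranks the measures by how often they fall below expectations:

```python
from collections import Counter
from enum import IntEnum

class Rating(IntEnum):
    """Same three-level scale as the earlier sketch."""
    BELOW_EXPECTATIONS = 0
    MEETS_EXPECTATIONS = 1
    EXCEEDS_EXPECTATIONS = 2

def priority_improvements(history: list[dict[str, Rating]]) -> list[tuple[str, int]]:
    """Rank measures by how often they were rated Below Expectations
    across the completed risk assessments in `history`."""
    below: Counter[str] = Counter()
    for ratings in history:
        for measure, rating in ratings.items():
            if rating is Rating.BELOW_EXPECTATIONS:
                below[measure] += 1
    # The most frequent shortfalls are the priority improvement
    # areas to share with the Substance Management Group.
    return below.most_common()
```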
Background Material
Risk Assessment / Risk Management Performance Measures Initiative
What’s required to improve any work process?
• Three essential elements for process improvement:
  – A clearly defined process owner
  – A systematic approach
  – Performance measurement
Work Process Improvement Roadmap
1. Define products and services
2. Identify customers
3. Identify the work process
4. Understand requirements
5. Measure performance
6. Identify gaps
7. Understand why
8. Innovate and test
9. Improve the process
10. Evaluate and do it again
Risk Assessment Work Process Roadmap
• Work process: Substance Risk Assessment
• Customers: Government, NGOs, Industry
• Requirements: Framework for Science & Technology Advice; Framework for the Application of Precaution
• Product: Screening Level Risk Assessment
• Remaining steps: measure performance, identify gaps, understand why, innovate and test, improve the process, evaluate and do it again
You can’t progress without performance measurement.
What is the Framework for Science and Technology Advice?
• Adopted by all science departments in the federal government in 2001–2002.
• Acknowledges sound science as a key input to policy formulation.
• Leads to sound government decisions, minimizes crises and capitalizes on opportunities.
• Ensures Ministers can be confident that advice is based on a rigorous and objective assessment of all available science.
Framework for Science and Technology Advice
Principle I – Early Issue Identification
• Anticipate issues for which science advice will be required, drawing on interdisciplinary, interdepartmental and international cooperation.
• Performance Measure – Gap Identification
  – Exceeds Expectations: strong effort to fill gaps and incorporate the results into the risk assessment.
  – Meets Expectations: effort made to fill gaps to provide direction for risk management options.
  – Below Expectations: no effort to fill identified gaps; they are left to risk management.
Framework for Science and Technology Advice
Principle II – Inclusiveness
• Draw advice from a variety of scientific sources and from experts in relevant disciplines to capture the full diversity of scientific thought and opinion.
• Performance Measure – Inclusiveness
  – Exceeds Expectations: an unbiased external advisory panel was used.
  – Meets Expectations: many references and key evidence come from internationally acclaimed journals.
  – Below Expectations: missing references to key science identified by stakeholders.
Framework for Science and Technology Advice
Principle III – Sound Science and Science Advice
• Adopt due diligence procedures for assuring the quality, reliability, integrity and objectivity (including scientific peer review) of science and science advice.
• Performance Measures:
  – Peer Review
  – Quality Assurance
  – Use of Science
  – Information Accuracy
Framework for Science and Technology Advice
Principle III – Sound Science and Science Advice
• Performance Measure – Peer Review
  – Exceeds Expectations: opinions of reviewers are acknowledged and areas of disagreement identified.
  – Meets Expectations: qualified reviewers are used from government, NGOs, academia and industry.
  – Below Expectations: missing reviews from key stakeholders.
Framework for Science and Technology Advice
Principle III – Sound Science and Science Advice
• Performance Measure – Quality Assurance
  – Exceeds Expectations: QA done on the hazard/effects and exposure characterizations, plus a review of the environmental or health sections.
  – Meets Expectations: QA done on the hazard/effects and exposure characterizations.
  – Below Expectations: QA missing on the hazard/effects or exposure characterization.
Framework for Science and Technology Advice
Principle III – Sound Science and Science Advice
• Performance Measure – Use of Science
  – Exceeds Expectations: rationale provided for any new risk assessment methodologies.
  – Meets Expectations: scientific information is openly published and generally accepted risk assessment methodologies are used.
  – Below Expectations: key science is not publicly available, or the scientific logic is faulty or unsound.
Framework for Science and Technology Advice
Principle III – Sound Science and Science Advice
• Performance Measure – Information Accuracy
  – Exceeds Expectations: current data used for both the exposure and effects assessments.
  – Meets Expectations: current data used for the exposure assessment.
  – Below Expectations: materially significant exposure or effects data not used.
Framework for Science and Technology Advice
Principle IV – Uncertainty and Risk
• Use a risk management approach (with the goal of scientifically sound, cost-effective, integrated actions that reduce risks while taking into account social, cultural, ethical, political and legal considerations) to assess, manage and communicate the high degree of uncertainty inherent in the science on which policy advice is based.
• Performance Measure – Uncertainty Identification
Framework for Science and Technology Advice
Principle IV – Uncertainty and Risk
• Performance Measure – Uncertainty Identification
  – Exceeds Expectations: comprehensive discussion of uncertainty and confidence level for all sections of the assessment.
  – Meets Expectations: full discussion of uncertainty in the effects and exposure sections.
  – Below Expectations: poor explanation of uncertainty.
Framework for Science and Technology Advice
Principle V – Transparency and Openness
• Provide a clear articulation, to those who are affected, of how policy decisions are arrived at, including providing access to the underlying science as soon as possible.
• Performance Measures:
  – Stakeholder Consultation
  – Risk Characterization
Framework for Science and Technology Advice
Principle V – Transparency and Openness
• Performance Measure – Stakeholder Consultation
  – Exceeds Expectations: many comments received and incorporated into the final risk assessment.
  – Meets Expectations: key stakeholders were consulted and received feedback.
  – Below Expectations: no evidence of stakeholder consultation.
Framework for Science and Technology Advice
Principle V – Transparency and Openness
• Performance Measure – Risk Characterization
  – Exceeds Expectations
  – Meets Expectations: exposure and effects evidence is solid and the risk management direction is clear.
  – Below Expectations: the assessment did not give clear direction for risk management.
Framework for Science and Technology Advice
Principle VI – Review
• Subsequently review decisions to determine whether recent advances in scientific knowledge have an impact on the advice and decision.
• Potential Measure – Reconsideration
  – Exceeds Expectations
  – Meets Expectations: precautionary measures are implemented on a provisional basis and consistent with measures taken in similar circumstances.
  – Below Expectations: precaution has been used and no timing is set for reconsideration of the risk assessment conclusions.