Criticality Analysis & Risk Assessment: Determining High Risk Requirements
CARA Process Methodology

Getting Started
- Introduction
- Objectives
- Testing Intake Process
- CARA Scoring
- Risk Analysis and Testing Scope Report
- Challenges

Introduction
Paul Shovlin
Checkpoint Technologies, Director of Professional Services
IV&V Functional Test Manager, Department of Veterans Affairs

Objectives
- Understand the Testing Intake Assessment (TIA) process.
- Provide knowledge and understanding of Criticality Analysis and Risk Assessment (CARA) principles and practices.
- Understand the purpose and content of the Risk Analysis and Testing Scope Report (RATSR).
- Share our challenges.

Why VA IV&V Adopted CARA
- A September 2010 mandate required all application release cycles to complete from development to Initial Operating Capability (IOC) within 6 months, leaving little if any time for IV&V testing.
- 150+ major application project releases each calendar year.
- A new approach to IV&V was needed.
- CARA was already being used successfully at NASA.

Intake Assessment Process

Intake Assessment Process
[Process flow diagram. Steps: Submit Testing Intake Assessment; Assign CARA Resources; Read and review documents and formulate questions; SMEs answer questions; Prepare CARA Worksheet; Perform CARA; Determine services; Create and review RATSR; Update Testing Workload Forecast. Artifacts: Testing Intake Assessment Form, CONOPS, Requirements, System Design, CARA Worksheet, Updated CARA Worksheet, RATSR, Project Management Plan, Project Schedule, Testing Workload Form.]

Definitions for CARA
- Risk is the likelihood of failure.
- Criticality is the measure of the potential impact of failure.
- Risk-Based Testing is a type of software testing that prioritizes the testing of features and functions based on their criticality and likelihood of failure. Risk-based testing determines which test activities will be completed for the iteration. It is a transparent testing methodology that tells the customer exactly which test activities are executed for every feature.

What is CARA?
- Criticality Analysis and Risk Assessment: a standardized risk assessment methodology.
  - Its features are consistent with PMI's Risk Management assessment philosophy (PMBOK® Guide): risk is plotted on a Probability (Low/Moderate/High) by Impact (Low/Moderate/Critical/Catastrophic) matrix.
  - The uniqueness is in the process of determining the impact and probability.
  - Standardized criteria are used for determining risk values.
- CARA does not tell us how to test.

CARA Scoring Methodology

CARA Criticality Scoring
- Criticality is broken down into 3 categories:
  - Performance & Operations
  - Safety
  - Cost of Failure/Impact to Schedule
- Requirements for each category are scored on a scale of 1-4:
  - 1 = Lowest
  - 2 = Moderate
  - 3 = Critical
  - 4 = Catastrophic

CARA Risk Scoring
- Risk is broken down into 5 categories:
  - Complexity
  - Technology Maturity
  - Requirements Definition and Stability
  - Testability
  - System Characterization
- Requirements for each category are scored on a scale of 1-3:
  - 1 = Low
  - 2 = Moderate
  - 3 = High

CARA Scoring and Thresholds
- Scores for the Risk and Criticality categories are separately averaged to determine a weighted value for each.
- The two weighted values are then multiplied to determine the IV&V Analysis Level (IAL).
- IALs are broken down into 4 categories:
  - Minimal:           1 <= CARA < 2
  - Limited (L):       2 <= CARA < 5
  - Focused (F):       5 <= CARA < 8
  - Comprehensive (C): 8 <= CARA <= 12

  IAL matrix (Probability x Impact):
                          Low      Moderate  Critical       Catastrophic
  Probability High        Limited  Focused   Comprehensive  Comprehensive
  Probability Moderate    Limited  Limited   Focused        Comprehensive
  Probability Low         Minimal  Limited   Limited        Limited
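
To make the threshold arithmetic concrete, here is a minimal Python sketch of the scoring just described. It is a sketch under assumptions: the slides call the per-dimension result a "weighted value" but give no weights, so a simple average is used, and the function and constant names are illustrative rather than part of any official CARA tooling.

    # Minimal CARA scoring sketch (assumes unweighted averages per dimension).
    CRITICALITY_CATEGORIES = (
        "Performance & Operations",
        "Safety",
        "Cost of Failure/Impact to Schedule",
    )
    RISK_CATEGORIES = (
        "Complexity",
        "Technology Maturity",
        "Requirements Definition and Stability",
        "Testability",
        "System Characterization",
    )

    def cara_score(criticality: dict[str, int], risk: dict[str, int]) -> float:
        """Average each dimension's category scores, then multiply the averages."""
        crit_avg = sum(criticality[c] for c in CRITICALITY_CATEGORIES) / len(CRITICALITY_CATEGORIES)
        risk_avg = sum(risk[c] for c in RISK_CATEGORIES) / len(RISK_CATEGORIES)
        return crit_avg * risk_avg  # ranges from 1*1 = 1 to 4*3 = 12

    def ial(score: float) -> str:
        """Map a CARA score onto the IV&V Analysis Level thresholds above."""
        if score < 2:
            return "Minimal"
        if score < 5:
            return "Limited"
        if score < 8:
            return "Focused"
        return "Comprehensive"

For example, criticality scores of 3, 4, and 2 average to 3.0; risk scores of all 2s average to 2.0; their product, 6.0, falls in the Focused band.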

CARA Criticality Evaluation Criteria

Performance and Operation – How would the failure of this requirement affect the performance and operation of the system/application(s)?
  - Catastrophic (4): Failure could cause loss of use of the system for an extended time, or loss of the capability to perform all project requirements. Failure is not easily resolvable.
  - Critical (3): Failure could cause loss of a critical function not resulting in loss of system/application use, lengthy maintenance downtime, or loss of multiple objectives. Failure is partially resolvable.
  - Moderate (2): Failure could cause loss of a single application/objective or reduction in operational capability. Failure is fully resolvable.
  - Low (1): Failure could cause inconvenience (e.g., rerun of programs, computer reset, manual intervention).

Safety – Can be patient safety or the safety of the system, for example: "Do no harm to VistA."
  - Catastrophic (4): Failure could result in loss of life or severe personal injury, or in severe harm to system (or data) integrity.
  - Critical (3): Failure could result in non-disabling personal injury, serious occupational illness, or loss of emergency procedures.
  - Moderate (2): Failure could result in minor physical or mental harm.
  - Low (1): No safety implications.

Cost of Failure/Impact to Schedule – If a defect were found in validating the requirement, how much time/cost would it take to fix successfully? This includes developer, SQA, IV&V, documentation, etc.
  - Catastrophic (4): Failure could result in cost and schedule overruns large enough to result in unachievable operational capability. Alternate means to implement the function are not available.
  - Critical (3): Failure results in significant schedule delay or large cost and schedule overruns. Alternate means to implement the function are available, but at reduced operational capability; full operational capability is delayed.
  - Moderate (2): Failure results in minor impact to cost and schedule.
  - Low (1): Problems are easily corrected with insignificant impact to cost and schedule.

CARA Risk Driver Criteria

Complexity – Complexity of this requirement.
  - High (3): Highly complex control/logic operations; unique devices/complex interfaces.
  - Moderate (2): Moderately complex control/logic; may be device dependent.
  - Low (1): Simple control/logic; not device dependent.

Maturity of Technology – How good and stable the technology or product is within and outside of VA; if within VA, how well it has proven to work with VA systems.
  - High (3): New/unproven algorithms, languages, and support environments.
  - Moderate (2): Proven on other systems with a different application.
  - Low (1): Proven on other systems with the same application.

Requirements Definition & Stability – Is the requirement likely to change?
  - High (3): Rapidly changing, baselines not established; many organizations required to define requirements; much integration required.
  - Moderate (2): Potential for some changes; some integration required.
  - Low (1): Solid requirements with little potential for change; little to no integration required.

Testability
  - High (3): Difficult to test; requires much data analysis to determine acceptability of results.
  - Moderate (2): Requires some test data analysis to determine acceptability of results.
  - Low (1): Acceptability of test results is easily determined.

System Characterization
  - High (3): Large number of systems; many components; transmitted messages contain highly sensitive data; large volume of messages transmitted.
  - Moderate (2): Medium number of systems and components; transmitted messages contain critical data; medium volume of messages transmitted.
  - Low (1): Few systems; few components; transmitted messages contain low-criticality data; small volume of messages transmitted.

Ideal Skills for CARA Participants
- Domain architecture knowledge
- System Analysts with critical thinking skills
- System Integration and Performance Engineers
- Knowledge of core business components
- System Engineers, DBAs, System Architects
- Moderator

CARA Rules – Enforced by Moderator
1) Evaluation scores must be slotted in a column. A score is not accepted unless it is a bullet within the column.
2) A maximum of 3 minutes of discussion per requirement.
3) In disagreements where the variance equals 1, the final score is the more conservative one (e.g., if scores of "2" and "3" are in debate, the more conservative score of "3" is recorded).
4) When extreme scoring exists (e.g., "1" and "4"), both scores must have proper slotting positioning (rule #1). The analyst with the higher score must explain the justification, and a re-vote is held immediately after the explanation.
5) Group "like requirements" together; grouping speeds up the analysis.
6) Do not repeat scoring callouts. If your value is called, state "Agreed." If not, call out "Object"; the moderator will ask for your score and slotting, and your explanation of the slotting will follow. Others may agree and change their score. This technique is critical in remote sessions.
7) Silence is acceptance. It is better to "Agree" for consensus.
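
As a hypothetical illustration of how rules 3 and 4 interact, the resolution logic between two analysts' scores could be sketched as below; the function name and the None re-vote signal are assumptions of the sketch, not part of the CARA rules themselves.

    def resolve_scores(a: int, b: int) -> int | None:
        """Resolve two analysts' scores per moderator rules 3 and 4.

        Returns the agreed score, or None when the spread exceeds 1 and a
        re-vote is required after the higher scorer explains their slotting.
        """
        if a == b:
            return a          # consensus; silence is acceptance (rule 7)
        if abs(a - b) == 1:
            return max(a, b)  # rule 3: record the more conservative (higher) score
        return None           # rule 4: extreme spread (e.g., 1 vs 4) forces a re-vote

Here resolve_scores(2, 3) returns 3, while resolve_scores(1, 4) returns None to signal the justification-and-re-vote step.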

Resources for CARA Scoring
- Requirements are extracted and logically grouped from the following required documents:
  - Requirements Specification Document (RSD)
  - Software/System Design Document (SDD)
- Other artifacts used for reference:
  - Testing Intake Assessment (TIA): identifies developer/PM experience with the technology
  - Concept of Operations (ConOps): identifies the architecture and the overall concept of the project

Spreadsheet
- Lists the requirements
- Captures the scoring methodology

[Screenshot: TS CARA Worksheet – NwHIN. Criticality and risk areas: Performance and Operation; Safety; Cost of Failure/Impact to Schedule. Impact values: Catastrophic = 4, Critical = 3, Moderate = 2, Low = 1. Worksheet shortcuts: CTL+o puts "1" into the blank columns of one or more rows; CTL+h moves to the "Performance and Operation" column of the current row; CTL+R copies a row if "Performance and Operation" has a value, or pastes the copied row into one or more rows if it is blank. Sample requirements: RSD 2.8 – Support addition of new exchange partners; BN 9.1.1 – Provide a new institution number for each new partner; BN 9.3 – Provide the ability to authorize a new partner; BN 9.3.1 – Provide confidence testing prior to authorization of a new partner.]
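
Hypothetically, a worksheet row could also be modeled outside of Excel and fed through the scoring sketch from the thresholds slide; the dataclass, its field names, and the sample scores below are illustrative assumptions, while the requirement ID and description come from the slide's sample rows.

    from dataclasses import dataclass, field

    @dataclass
    class WorksheetRow:
        """One requirement row of the CARA worksheet (illustrative model)."""
        req_id: str        # e.g., "BN 9.3" from the logically grouped requirements
        description: str
        criticality: dict[str, int] = field(default_factory=dict)  # category -> 1..4
        risk: dict[str, int] = field(default_factory=dict)         # category -> 1..3

        def analysis_level(self) -> str:
            """IAL for this row, reusing cara_score() and ial() from the earlier sketch."""
            return ial(cara_score(self.criticality, self.risk))

    row = WorksheetRow(
        "BN 9.3", "Provide the ability to authorize a new partner",
        criticality={"Performance & Operations": 3, "Safety": 2,
                     "Cost of Failure/Impact to Schedule": 2},
        risk={"Complexity": 2, "Technology Maturity": 2,
              "Requirements Definition and Stability": 1,
              "Testability": 2, "System Characterization": 3},
    )
    print(row.analysis_level())  # (3+2+2)/3 * (2+2+1+2+3)/5 ≈ 4.67 -> "Limited"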

RATSR – Risk Analysis and Test Scope Report
- Understanding of the project application
- Key Observations and Findings regarding TIA inputs
- Risk Analysis Summary (see Table 1)
- Testing Requirements and Duration per Service, with recommendations for testing (see Table 2)

Table 1: Requirements with Notable Risks
  Columns: Req # | Requirement Description | Risk

Table 2: Risk-Based Testing Requirements
  Columns: Feature/Function/Other Requirements | Service | Estimated Effort | Required Documentation
  Sample row: RSD, Sec III, Para 1.20 | Requirements Verification | 10 working days | Updated SDD, Test Cases

CARA Process Challenges
- Documentation
  - Incomplete documentation
  - Agile
  - Identifying increment scope
- Metrics – is the process working?
- Following up
  - Obtaining Workload Forecast dates
  - Resource availability

Conclusions
- The CARA analysis serves the following critical risk mitigation functions:
  - Helps identify potential patient safety issues.
  - Helps assure reliable system operation.
  - Assists in diagnosing problems with critical functionality.
  - Coding to requirements.
  - Quality assurance for the development process.
- Based on our success, the DoD/VA Interagency Program Office (IPO) has adopted CARA for all integrated Electronic Health Record (iEHR) projects.

Questions?

Thank You!
If you have any questions, please contact me.
Paul Shovlin, Checkpoint Technologies
pshovlin@checkpointech.com
813-818-8324