Headquarters U.S. Air Force
Integrity - Service - Excellence
RI3: Risk Identification, Integration, & “illities” -- Synergy with Cost Estimating, Earned Value Management and Auditing
John Cargill, Air Force Cost Analysis Agency/FMAE
ASMC RPDI, 4 March 2010, Version 1.0

No One Likes to Recognize Risk
• System Engineers -- Life is a lot easier (initially) when you are not forced to deal with risk and can seek the comfort of “group think”
• Contractors -- Win source selections by minimizing statements of risk. Nothing can go wrong until it does.
• Program Managers -- Behave as if nothing will go wrong with their programs (until it does). Good managers have low risk…
• User Community -- Force changes to the system that invariably result in higher risks and costs (requirements creep because of revised statements of “I want”)
• Budgeteers/POM’ers -- Request budgets less than what they will realistically need to cover uncertain future events. Management reserve and contingency funding are verboten/taboo.
• Cost Estimators -- Never quite independent, and often not skilled enough in communicating their operations research methods to decision makers. Don’t enjoy bucking the “low risk” engineers.
• EVM’ers -- Almost never crank assessments of risk into their EACs. The answer is wInsight; what was the question?

For Your Consideration
• The trouble with forecasting is that it's right too often for us to ignore it, and wrong too often for us to rely on it completely
• Don’t rule out risk, even if it is difficult to characterize or quantify
• “It is far better to foresee even without certainty than not to foresee at all.” --Henri Poincaré, The Foundations of Science, p. 129. A pragmatic remark from one of the foundation builders of chaos theory.

Precision
• “It is the mark of an educated mind to rest satisfied with the degree of precision which the nature of the subject admits, and not to seek exactness where only an approximation is possible.” --Aristotle
• What our boy Aristotle really meant to say was:
  • “Do not pretend to know more than you do.”
  • “Do not carry extra decimal places past the noise or uncertainty around your inputs.”
  • “You got risk and uncertainty -- let your COV reflect it.”
  • “Close enough for Government work.”
  • “And above all, don’t ‘verticate’ the S-curve (CDF)!” (attributed to Lt. Gen. Hamel, USAF)

Sort’a Begs the Question
With all this reluctance to recognize risk, what has been the outcome?

The Score Keeper: March 2006
• GAO-06-391, DEFENSE ACQUISITIONS: Assessments of Selected Major Weapon Programs
• GAO assessed 52 systems that represent an investment of over $850 billion (comment: the scope of the report)
• “Programs that began with immature technologies have experienced average research and development cost growth of 34.9 percent” (comment: started with low TRLs, which translate into technical problems and cost growth)
• “Programs consistently move forward with unrealistic cost and schedule estimates, use immature technologies in launching product development, and fail to solidify design and manufacturing processes at appropriate points in development.” (comment: overall summary of the problem, i.e., what Congress, OSD and SAF look for when evaluating a program)

The Score Keeper: March 2007
• GAO-07-406SP, DEFENSE ACQUISITIONS: Assessments of Selected Weapon Programs
• GAO assessed 62 weapon systems with a total investment of over $950 billion, some two-thirds of the $1.5 trillion DOD plans for weapons acquisition
• “Fully mature technologies were present in 16 percent of the systems at development start -- the point at which best practices indicate mature levels should be present.” (This means 84% of programs had immature technology.)
• “Programs that began development with immature technologies experienced a 32.3 percent cost increase,” whereas “those that began with mature technologies increased 2.6 percent.” (This means roughly four-fifths of the programs grew by about a third.)

The Score Keeper: March 2008
• GAO-08-467SP, DEFENSE ACQUISITIONS: Assessments of Selected Weapon Programs
• “Of the 72 programs GAO assessed this year, none of them had proceeded through system development meeting the best practices standards for mature technologies, stable design, or mature production processes by critical junctures of the program…”

HASC Authorization Language: As Dr. Phil says, “How’s that working for you?”
• “In addition, the committee believes that over the past decade the acquisition of space systems has been plagued by cost overruns and schedule delays. The lack of enforcement of internal DoD procurement rules results in systemic problems leading to multiple space acquisition failures. These problems include reliance on immature technology, overdependence on contractors for program management, and a lack of government systems engineering and cost analysis expertise.” (p. 13)
• “The committee is alarmed by the number of space acquisition programs experiencing unexpected cost growth over the past decade. Virtually every major space acquisition program has experienced or sits dangerously close to a Nunn-McCurdy breach.” (p. 13)

Historic Problem
• Full disclosure of system risk does not reach decision makers so that they might make informed decisions
• Obvious risks frequently do not make the risk cubes
• Risk is downplayed at every level; “filters” are applied at every level in the acquisition chain
• Decision makers such as Service Acquisition Executives, Milestone Decision Authorities and OSD can seldom accept the acquisition chain’s statements of risk and technical maturity
• GAO reports suggest roughly four-fifths of programs certified as mature go forward with immature technology

Fearlessly Going Where No Man Has Gone Before
• Risk and uncertainty + known outcomes (a historical database of somewhat analogous programs) = decision-quality, actionable risk assessments

AFSO 21/D&SWS is Part of the Answer
• Develop and Sustain Warfighting Systems (D&SWS)

Project Description and Background
Three initiatives with the goal of institutionalizing one AF-level process to manage investments in technologies, to ensure they are mature for AF systems:
• TD 1-12 Tech Maturity “Yardstick”
  • Comprehensive qualitative criteria: tech performance, manufacturability, integrability, other ‘ilities
  • Better assessment of risk
  • Improved tech forecasting
• TD 1-13 High Confidence Tech Transitions
  • Early & complete lifecycle transition planning
  • IPT approach -- maximize coordination
  • “Stage-gated” transition of technology
  • Clearly defined entrance/exit criteria
• TD 1-14 Identify & Prioritize Tech Needs
  • Enterprise process to gather & prioritize tech needs
  • Focus S&T on highest-priority needs
  • Game-changing “tech push” influencing capability planning

What is “Effective” Risk Management?
[Chart: level of mitigation vs. significance of risk, with three regions -- excessive risk management, effective risk management, and negligent risk management]
• Essentially a Pareto-economics approach to risk
• Asks the question: where should I focus my limited resources during risk mitigation -- what risk(s) can I not tolerate? Decision support is needed in order to make an informed decision.
• Seldom possible to “mitigate to zero”

Characteristics of Decision Support Systems
• Large number of decision alternatives
• Outcomes or consequences of the decision alternatives are variable
• Alternatives are evaluated on the basis of multiple criteria
• Criteria may be qualitative or quantitative
• Typically more than one decision maker (or interest group) is involved in the decision-making process
• Decision makers have different preferences with respect to the relative importance of evaluation criteria and decision consequences
• Decisions made are often surrounded by uncertainty

GAO-09-3SP, Cost Estimating and Assessment Guide, March 2009

Risk vs Notional Technology Readiness Levels
[Chart: range of program cost outcomes over time, from AoA through SDD (components and prototypes in a relevant environment) to LRIP and FRP (systems in an operational environment). The range of outcomes is reduced as we eliminate risk and some of the previously probable cost outcomes. Cost risk levels map to TRLs: Very High (TRL 1-2), High (TRL 3-4), Med-High (TRL 5-6), Med-Low (TRL 7-8), Low (TRL 9).]
• Risk is retired as a program progresses and technology matures

COMPONENTS OF RISK ANALYSIS
• Risk assessment
  • Identify the risk at hand that could materialize as an issue
  • Bound or quantify the level of potential harm
• Risk communication
  • Inform others in a way that helps them make optimal decisions
  • Requires that risk analysis be viewed from the beginning as a decision support tool
  • Risk assessment must be communicated in such a way that it is actionable
• Risk management (SE by any other name)
  • Action to take once risk is identified
  • Risks can be managed on many different levels

Need for RI3 Process: What is Missing? (1)
• Technology Readiness Assessments (TRAs) are necessary but not sufficient to capture risk
• Natural tendency for advocates to downplay risk
• “RI3 sheds additional light on areas that seem to have traditionally been underrated in terms of risk -- interestingly, literature from the field of cognitive psychology generally suggests that people often have difficulty in characterizing the relative risks of various activities appropriately (possibly due to ‘group-think’), thus resulting in underestimation of their effects.”¹
¹ Risk Identification, Integration & Illities Guidebook, ver. 1.2, 15 December 2008, p. 5

Need for RI3 Process: What is Missing? (2)
• A common approach to identifying program risk
  • Consistent
  • Validated against historical programs
  • Visible to all levels in the acquisition chain
• Automated hand-offs (export) to systems engineering and program information reporting systems
  • Active Risk Manager (ARM)
  • Probability of Program Success (PoPS)
• Of use to the program management, cost estimating, earned value management, logistics and auditing communities

Surveyed the Globe for Good Ideas
• Efforts surveyed across DoD, other agencies, internationally, universities, and the corporate world
• NASA-originated AD² methodology viewed favorably by members in OSD AT&L and SAF/AQR
• British Ministry of Defence provided good input
  • British System Readiness Levels (SRLs) are used in conjunction with TRLs
  • Also in conjunction with a full-blown risk analysis assessment

Relationship to OSD Checklists
• Various OSD checklists are available on the DAU website
  • TRA -- deals primarily with setting up for a TRA, not how to conduct one
  • PRR -- in discussion with the MRA team
  • PDR, CDR
  • Only Navy NAVAIR appears to use the checklists
• Observations on checklists
  • Checklists are excellent sets of questions
  • Checklists are much broader in scope than the RI3 questions, but lack their depth

‘Bout Those Dreaded Checklists
• Many checklists are too long for day-to-day usage by a program office
  • The NAVAIR checklist has 800+ questions: if everything is important, then nothing is important
• Traditional checklists are one-dimensional
  • Many checklists are like IG inspection “clipboard” checklists: Do you have a budget? Check. End of subject!!
• Example of a multi-dimensional question (RI3-style approach; see the sketch after this list):
  • Do you have a budget? Is it adequate?
  • Is the total amount sufficient?
  • What is the confidence level of your budget?
  • Is the total amount phased correctly?
  • Is the total amount sufficient by appropriation?
  • Are there pending changes that would impact the budget?
  • Are there risks not captured in your budget?
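To make the contrast concrete, here is a toy sketch of the two shapes of checklist item. The data structure and names are illustrative only, not the RI3 tool's internals (the real worksheet is an Excel-like tool); the question text comes from the slide above.

```python
# One-dimensional "clipboard" item: yes/no, end of subject.
flat_item = "Do you have a budget?"

# RI3-style multi-dimensional item: one topic probed along several dimensions.
ri3_style_item = {
    "question": "Do you have a budget? Is it adequate?",
    "follow_ups": [
        "Is the total amount sufficient?",
        "What is the confidence level of your budget?",
        "Is the total amount phased correctly?",
        "Is the total amount sufficient by appropriation?",
        "Are there pending changes that would impact the budget?",
        "Are there risks not captured in your budget?",
    ],
}
```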

‘ilities Threads
• The RI3 Development Team down-selected to the following list:*
  • Design maturity and stability (stability of requirements)
  • Scalability & complexity
  • Software
  • Integrability
  • Testability
  • Reliability
  • Maintainability
  • Human factors
  • People, organization, & skills
* The list was driven by observations of past program problems and is consistent with International Council on Systems Engineering (INCOSE) standards

Development Philosophy
• Standard AF risk cube -- leverages standard AF risk processes
• Likelihood definitions:
  1 Not Likely (probability of occurrence 1%-20%)
  2 Low Likelihood (21%-40%)
  3 Likely (41%-60%)
  4 Highly Likely (61%-80%)
  5 Near Certainty (81%-99%)
• Consequence definitions (proposed AF definition / DoD guide):
  Level 1 -- Minimal or no consequence to technical performance. / Minimal consequence to technical performance but no overall impact to program success. A successful outcome is not dependent on this issue; the technical performance goals will still be met.
  Level 2 -- Minor reduction in technical performance or supportability; can be tolerated with little or no impact on the program. / Minor reduction in technical performance or supportability; can be tolerated with little impact on program success. Technical performance will be below the goal but within acceptable limits.
  Level 3 -- Moderate reduction in technical performance or supportability with limited impact on program objectives. / Moderate shortfall in technical performance or supportability with limited impact on program success. Technical performance will be below the goal, but approaching unacceptable limits.
  Level 4 -- Significant degradation in technical performance or major shortfall in supportability; may jeopardize program success. / Significant degradation in technical performance or major shortfall in supportability with a moderate impact on program success. Technical performance is unacceptably below the goal.
  Level 5 -- Severe degradation in technical performance; cannot meet KPP or key technical/supportability threshold; will jeopardize program success. / Severe degradation in technical/supportability performance; will jeopardize program success; or will cause one of the triggers listed below.
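As a quick illustration of the likelihood bands above, a minimal sketch (a hypothetical helper, not part of any AF tool) that maps a probability of occurrence onto the 1-5 rating:

```python
def likelihood_rating(prob_pct: float) -> int:
    """Map a probability of occurrence (in percent) to the AF 1-5 likelihood rating."""
    # Bands from the slide: 1-20%, 21-40%, 41-60%, 61-80%, 81-99%
    for upper, rating in [(20, 1), (40, 2), (60, 3), (80, 4), (99, 5)]:
        if prob_pct <= upper:
            return rating
    return 5  # treat anything above 99% as Near Certainty

assert likelihood_rating(35) == 2   # Low Likelihood
assert likelihood_rating(75) == 4   # Highly Likely
```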

Why the RI3 New Colors and Ratings Make Sense
[Figure: the current 5x5 risk cube, likelihood vs. consequence]
• “I am a good program manager running a good program -- I am not ‘red’”
• Bad things happen to program managers who show “red”
• Bad things happen to programs that don’t quickly move out of the “red”
• Tendencies:
  • Move all “red” except for the (5,5) outside of program control down to yellow
  • When you are “green” you do not have to worry about risk
• “There is often a tendency for persons outside a program to believe that if the CTEs of a system are all at TRL 6, then the technical risks related to those technologies have been paid down. Unfortunately, this is not the case.” (RI3 Guidebook, p. 2)

Ratings versus Colors: Proposed But Not Yet Accepted
• These ratings could be thought of as creating two intermediate colors
  • Red/Yellow (4 ratings): reduces the tendency to avoid high numbers because they’re red
  • Green/Yellow (2 ratings): reduces the items that get ignored because they’re green
[Figure: side-by-side 5x5 cubes showing the RI3 ratings (1-5) against the original colors]

Some Sample Questions
• Integrability
  • Are there interactions/integration issues that could be affected by proprietary or trust issues between/among suppliers?
  • Have key subsystems, at whatever level of readiness (breadboard, brassboard, prototype), been tested together in an integrated test environment, and have they met test objectives?
• Software
  • Are personnel with development-level knowledge of the existing, reused software part of the new software development team?
• Maintainability
  • Is modeling and simulation used to simulate and validate maintenance procedures for the unit under test and higher levels of integration?
• Explanatory discussion, with potential best practices for each question, is included in the RI3 guidebook and Excel-like worksheet/tool
• Questions are technical and shy away from the programmatic
• Total of 101 questions covering the 9 “illities” -- answer only those that are appropriate for the Unit Under Evaluation (UUE)

“illity” Preamble: 2.0 Scalability & Complexity
The issues of scalability and complexity are subtle and often overlooked (or aspects of them are overlooked), with disastrous consequences. Both are complicated issues that have many dimensions. In the case of scalability, the issues are often associated with size and weight -- in both directions, e.g., going from meter scale to tens-of-meters scale; going from micron-scale features to nano-scale features; extrapolating to super-lightweight systems or to super-high-density systems. In the case of software/communications, the scale may require going from regional to global. All of these scale changes have tremendous impacts on manufacturing, integration, testing and operation. In the case of complexity, the primary problem is the lack of clarity in the definition of complexity itself. What makes a system complex? For instance, with respect to the program itself:
• Are multiple contractors involved?
• Are multiple uniformed services involved?
• Are multiple government agencies involved?
• Are multiple non-US government agencies involved?
• Are multiple funding sources involved?
• Are there critical dependencies on other programs (e.g., cryptographic equipment)?

Question Amplifying Instructions
1.0 Design Maturity and Stability
1.01 Are hardware and software design requirements stable (have they been finalized)?
Changing requirements are a significant factor in cost and schedule overruns. Continually changing requirements indicate that the design is in a state of flux. This in turn leads to a high probability that critical aspects will be overlooked, or that designs will become obsolete before they are implemented. Requirements creep is also a cause of scrap and rework, further increasing costs. It is critical to freeze requirements (or put them under strong configuration control) and to obtain the necessary buy-in from the customers before continuing to advanced stages of design. New requirements can be included as part of a future, preplanned improvement.

RI3 Questions: Perception by Various Communities
• Engineers feel the RI3 questions are systems engineering questions
• Program managers feel the RI3 questions are questions they want to ask their staff
• Cost estimators feel the RI3 questions are risk questions for their cost models -- in widespread use at Eglin AFB
• Auditors have responded that the RI3 questions are good questions to ask to ascertain whether risk has been addressed directly
  • Note: AFAA auditors at Eglin AFB have already incorporated some of the RI3 questions into their audit questions list
• EVM practitioners in general have not yet responded with any enthusiasm to the concept of risk, but NDIA has included risk in its ANSI standards for EVM
• Reality: the RI3 questions are appropriate for anyone who deals with risk

Usage of RI3 to Feed Risk Management Processes
[Flow diagram: the RI3 guidebook and tool pose integration and ‘ilities questions; the risks identified populate a 5x5 likelihood/consequence cube (Step 2, Risk Identification); the tool then exports an Active Risk Manager (ARM)-compatible file, additional summary displays, and a feed to PoPS for risk management. Similar output for cost estimation is being investigated.]

How to Begin the RI3 Methodology
[Diagram: Project XYZ decomposes into Systems A, B and C; systems into CTE and non-CTE subsystems a, b and c; subsystems into components α, β, … n]
• Start with a system-level gross evaluation (top-down)
• Break down into subsystems, note Critical Technology Elements (CTEs), and evaluate TRL at the appropriate level
• To assess integration and ‘ilities, evaluate the CTEs plus the units that interface with CTEs, even if they are not CTEs themselves (see the sketch below)
• Then proceed back up the tree as appropriate
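A minimal sketch of that top-down walk, assuming a simple tree structure. The node type and the `interfaces_with_cte` flag are illustrative assumptions, not the RI3 tool's internals:

```python
from dataclasses import dataclass, field

@dataclass
class Unit:
    name: str
    is_cte: bool = False
    interfaces_with_cte: bool = False  # flagged during decomposition
    children: list = field(default_factory=list)

def units_to_assess(unit: Unit) -> list:
    """Select CTEs plus units that interface with CTEs, per the slide's rule."""
    selected = []
    if unit.is_cte or unit.interfaces_with_cte:
        selected.append(unit.name)
    for child in unit.children:  # descend subsystem by subsystem
        selected.extend(units_to_assess(child))
    return selected
```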

RI3 “Signature” Evolves over Time
• RI3 can be used to support risk identification both in support of milestones and in pre-MS A activity
• Input to PoPS
• Risks could actually increase as more knowledge is obtained; however, one expects risk to decrease as a program matures
[Timeline: TRL 1 Basic Principles Observed; TRL 2 Concept Formulation; TRL 3 Proof of Concept; TRL 4 Breadboard in Lab; TRL 5 Breadboard in Representative Environment; TRL 6 Subsystem Prototype in Representative Environment; TRL 7 System Prototype in Operational Environment; TRL 8 System Qualification; TRL 9 Mission Proven]

A Glimpse at a Potential RI3 Tool Instantiation

Description of RI3 Historical “Test”
[Diagram: a stayback team (“The Mushrooms”) receives only extracted partial information -- the historical program’s documentation up to PDR, plus the RI3 guidebook and tool -- and predicts risks. An omniscient team (“The Know-It-Alls”) reviews the full historical documentation. The full team then interviews the program office and compares the predictions to the truth, producing test metrics and RI3 revisions.]

Results: RI3 Historical “Test” (completed Nov 21, 2008)
  Realized risks/issues: 22
  Correctly predicted by team: 13
  Could be predicted by program office: 6
  Escaped prediction (Type 1 error): 3
• The RI3 v1.0 tool could have predicted 86% of the issues ((13 + 6)/22); the tool was modified to be more perceptive
• Correctly predicted by team: team members predicted a risk, which in fact became an issue
• Could be predicted by program office: if the program personnel had had the RI3 tool available, the issue that arose would likely have been predicted as a risk by RI3; the team did not predict the risk in the exercise due to lack of information
• Escaped prediction: the questions did not yet capture an issue that arose

SECAF Pathfinder Programs
• SECAF established “High Confidence Criteria” Pathfinder programs
• Initiative led by Col Kevin Keck and Col(s) Fred Gregory
• Eglin AFB Pathfinders:
  • Small Diameter Bomb II
  • Hard Target Void Sensing Fuze
• Usage of the RI3 methodology is part of the effort
  • Risk identification
  • Recommend usage of assistance from TEAS and/or AAC/EN personnel

Notional Example Application
• Problem: determine the engineering risks and cost impacts of two competing approaches
• Using the RI3 Calculator, provide risk metrics for input to cost and EVM models
• Develop an estimate/EAC for the riskier approach

Notional RI3 Summary of Two Alternatives
• Notional micro munition
• Approach 1: likely to make changes
  • Driven by dispersion patterns
  • May drive guidance design
  • Possible change to seeker geometry
• Approach 2: considering cost changes
  • Switching to explosive for warhead
  • Simplifying heritage ad hoc net
• Primary risk/integration drivers
  • Changes to CDD
  • Predator weapons
  • Thermal -- actuals beyond tested values; MIL SPEC exceeded
  • Vibration -- not characterized
  • Prime-developed models (no IV&V)
  • Limited testing over the threat set

RI3 Results Feed and Document the Cost Risk Model
[Worksheet: WBS elements -- Ordnance System, Complete Round, Structure, Payload, Guidance & Control -- are scored (or marked Not Rated) against the nine ‘ilities (design maturity and stability; scalability & complexity; software; integrability; testability; reliability; maintainability; human factors; people, organization and skills), e.g., 9 scores for one element and 3 for another. The RI3 risk assessments and scores, together with expert risk assessments with comments (NONE, LM, MH, M), feed future cost risk, future schedule risk, future technical risk and future cost growth potential, rolling up to one overall risk rating.]

Risk Ratings Around the Point Estimate of Work Remaining
[Worksheet: for each WBS element, the analyst enters a point estimate of the work remaining and expert ratings for future cost risk, future schedule risk, future technical risk and future cost growth potential -- here NONE for the Ordnance System and Complete Round roll-ups, LM for Structure ($100), MH for Payload ($200) and M for Guidance & Control ($300), $600 in total. Each rating maps to a triangular-distribution input: best case, most likely case or point estimate, and worst case.]
• Boundary interpretation for expert input -- focus on the uncertainty captured
  • Experts never provide the total range of outcomes
  • The default is 70% of the range (input: Uncertainty Captured = 0.70)
  • It is reasonable to use 90% for “catalog” or “commodity” items
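A minimal sketch of the triangular-distribution input for one element. The numbers are the Structure element from this example (0.90x and 1.30x around the $100M point estimate); `random.triangular` is the Python standard-library draw:

```python
import random

best, most_likely, worst = 90.0, 100.0, 130.0  # $M, expert input for Structure

# The mean of a triangular distribution is the simple average of its parameters.
expected_value = (best + most_likely + worst) / 3.0  # ~$106.7M; the slides round to $107M

# Monte Carlo draws for a simulation-based roll-up (signature: low, high, mode).
draws = [random.triangular(best, worst, most_likely) for _ in range(100_000)]
```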

Subjective Uncertainty Bound Interpretation (continued)
• Adjusted for skewness (AF CRUH, p. vii)

Risk Adjusted Estimate and Determination of Skewness
WBS element (risk) | best/worst-case multipliers | low | high | mean (mapped to most likely) | skew | uncertainty not captured | simulation lower/higher bound interpretation | low/high tail area
• Structure (LM) | 0.90 / 1.30 | $90 | $130 | $107 (1.07) | 0.25 | 0.30 | 0.08 / 0.78 | 0.08 / 0.23
• Payload (MH) | 0.90 / 1.75 | $180 | $350 | $243 (1.22) | 0.12 | 0.30 | 0.04 / 0.74 | 0.04 / 0.26
• Guidance & Control (M) | 0.90 / 1.50 | $270 | $450 | $340 (1.13) | 0.17 | 0.30 | 0.05 / 0.75 | 0.05 / 0.25
• Total | | $540 | $930 | $690
The uncertainty not captured (30%) is split between the tails of the triangular distribution in proportion to the skew.
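One way to back absolute bounds out of the expert inputs is to treat the expert low/high as percentiles of a wider triangle whose mode stays at the most likely value, with the uncaptured 30% split between the tails in proportion to the skew. A rough fixed-point sketch under that assumption -- the AF CRUH's tabulated adjustment may differ in detail, so this approximates, rather than exactly reproduces, the slide's values:

```python
from math import sqrt

def widen_bounds(low, ml, high, captured=0.70, iters=200):
    """Solve for triangle endpoints (a, b) such that the expert low/high fall
    at the tail percentiles implied by the captured fraction and the skew."""
    skew = (ml - low) / (high - low)      # mode position, e.g. 0.25 for Structure
    p_lo = (1.0 - captured) * skew        # low-tail area, e.g. 0.075
    p_hi = (1.0 - captured) * (1 - skew)  # high-tail area, e.g. 0.225
    a, b = low, high
    for _ in range(iters):
        # Triangular CDF tails: F(low) = p_lo and 1 - F(high) = p_hi
        a = low - sqrt(p_lo * (b - a) * (ml - a))
        b = high + sqrt(p_hi * (b - a) * (b - ml))
    return a, b

print(widen_bounds(90, 100, 130))  # ~ (78, 166); the slides' tabulated values are $80/$161
```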

Risk, Boundary and Skewness Adjusted Expected Value
WBS element | absolute low (a) / most likely / absolute high (b) | mapped to most likely | absolute mean (expected value) | std. dev. | % of total std. dev.
• Ordnance System | $0 / $0 / $0 | 0.00 | $0 | $0 | 0.00%
• Structure | $80 / $100 / $161 | 0.80 / 1.14 / 1.61 | $114 | $17 | 9.61%
• Payload | $157 / $200 / $519 | 0.79 / 1.46 / 2.60 | $292 | $81 | 44.70%
• Guidance & Control | $237 / $300 / $614 | 0.79 / 1.28 / 2.05 | $384 | $82 | 45.69%
• Total | $474 / $600 / $1,295 | | $790 | $181 | 100%
Resulting distribution: best case $474, most likely case $600, expected value $790, worst case $1,295.
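The table's means and standard deviations follow from standard triangular-distribution formulas. A short check, assuming the slide's absolute low/most likely/high values and a fully correlated roll-up (which matches the $181 total, to rounding):

```python
rows = {  # element: (low, most_likely, high) in $M, from the slide
    "Structure":          (80,  100, 161),
    "Payload":            (157, 200, 519),
    "Guidance & Control": (237, 300, 614),
}
total_sd = 0.0
for name, (a, m, b) in rows.items():
    mean = (a + m + b) / 3
    var = (a*a + m*m + b*b - a*m - a*b - m*b) / 18  # triangular variance
    sd = var ** 0.5
    total_sd += sd  # fully correlated elements add standard deviations directly
    print(f"{name}: mean ${mean:.0f}M, std dev ${sd:.0f}M")
print(f"Total std dev: ${total_sd:.0f}M")  # ~$180M vs the slide's $181M
```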

RI3 Risk Adjusted Cost Confidence Curve

So, Where Does EVM Come In?
• If you have “actuals,” why estimate what you already know? Focus on what you know and estimate what you don’t.
• Necessary tenets of EVM:
  • Tenet 1: The Acquisition Program Baseline (APB) must reflect program/product reality -- doesn’t always happen
  • Tenet 2: Categorization of costs must be accurate -- can’t remember the last time NR, REC and SEIT/PM were accurate and consistent across programs
  • Tenet 3: Program schedules must be executable -- this will not be known until late in the program, as many schedules are not stable until after CDR

Traditional EVMS Performance Factors: Invalid for Predicting the Future
• Reality check
  • We do know something about the past
  • We do know something about the program
  • Risk is retired as work packages are completed -- real knowledge, not assumption
• We do know something about the future
  • The future will not be like the past, except by coincidence -- you can bet on it!
  • We can focus on the future with some degree of certainty
• The “new reality”: the best available information concerning the future lies in the collective judgments of those closest to the actual program

Change of Focus and Leap of Faith: Contract Performance vs. Future Risk Factor
• “Sunk cost”: “How much have I gotten for my dollar?” (CPI)
• “Future cost”: “How many times the budgeted cost for my remaining work will I have to pay?” (risk and uncertainty)

Performance Factor
• Traditionally we use CPI- or SCI-based performance factors that get nowhere close to the eventual cost of the program
• In the EVM literature, D. S. Christensen took EAC(SCI) from worst case, to most likely, to best case -- where is EAC(reality) in that?
• The real interest is in the risk associated with the work remaining: apply the performance factor to the remaining work (“future cost”), not to the “actuals” (“sunk cost”)
• PMI’s Project Management Body of Knowledge (PMBOK) uses “Planned Value” and “Earned Value” for BCWS and BCWP
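For reference, the traditional index-based EACs under discussion look like this. These are textbook EVM formulas; the dollar values are made up for illustration:

```python
acwp, bcwp, bcws, bac = 120.0, 100.0, 110.0, 400.0  # hypothetical $M to date

cpi = bcwp / acwp          # cost performance index ("bang per dollar")
spi = bcwp / bcws          # schedule performance index
sci = cpi * spi            # schedule-cost index

# Remaining budgeted work is deflated by the chosen performance factor:
eac_cpi = acwp + (bac - bcwp) / cpi  # $480M here
eac_sci = acwp + (bac - bcwp) / sci  # $516M here -- the more pessimistic factor
```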

Performance Factor (continued)
• It is not difficult to go back into the CPRs or DCARC 1921s of historical programs and compute “risk factors”
• The Air Force Cost Risk and Uncertainty Analysis Handbook lists default most-likely-estimate multipliers by risk rating
• Big-time caveat:
  1. In the example that follows we assume high schedule/technical risk, with a multiplier of 2.0
  2. Always check any multiplier and risk-rating system against the growth of historical programs in the commodity area of interest
  3. Un-calibrated multipliers can be dangerous -- kids, don’t try this at home!

Performance Factor (continued)
• Example: high risk assessed on the budgeted $200M of work remaining
• Basic triangle:
  • 0.9 × $200M = $180M best case
  • 1.0 × $200M = $200M most likely case
  • 2.0 × $200M = $400M worst case (high risk)
  • ($180M + $200M + $400M)/3 = expected-value EAC = $260M
• Bound interpretation with 70% uncertainty captured and corrected for skewness (3% to the left tail, 27% to the right tail):
  • 0.785 × $200M = $157M best case
  • 1.0 × $200M = $200M most likely case
  • 3.145 × $200M = $629M worst case
  • ($157M + $200M + $629M)/3 = expected-value EAC = $329M
  • $329M / $200M = 1.645 = performance factor
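The same arithmetic as a runnable sketch, using the slide's numbers (the 0.785 and 3.145 multipliers come from the skew-corrected bound interpretation above):

```python
remaining = 200.0  # $M of budgeted work remaining

# Basic triangle (high risk: worst case at 2.0x the remaining budget)
eac_basic = (0.9 + 1.0 + 2.0) / 3 * remaining          # $260M

# Bound interpretation: 70% captured, skew-corrected (3% / 27% tail split)
eac_adjusted = (0.785 + 1.0 + 3.145) / 3 * remaining   # ~$329M

performance_factor = eac_adjusted / remaining          # ~1.64
```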

Acquisition End Game: “Low Risk” for Full Rate Production
• We are not at “low risk”
  • once we are through our pre-EMD risk reduction phase,
  • once we enter EMD,
  • once we start testing prototypes,
  • or once we start LRIP.
• We are at “low risk” only after
  • we have a proven design supported by OT&E findings, and
  • we have a proven production line,
  • ready to enter FRP.
• Until then, there are a lot of bad things that can happen, that probably will happen, and that will have cost and schedule impacts.

Now that you have been drug through the mud, time for some initial conclusions.

RI3 Conclusions
• The Air Force RI3 risk identification process is now an Air Force best practice
  • Covered in AF pamphlets and guides
  • SECAF-directed for all Air Force “High Confidence” programs
  • Being required for all Air Armament Center programs
• RI3 is being reported by OSD to Congress as the “best of the best” among risk identification practices
• RI3 is being looked at by GAO
• The implication: “You may see this again”

TD-1-14 Stagegating
September 2006 GAO report: “GAO recommends that DOD strengthen its technology transition processes by developing a gated process with criteria to support funding decisions; expanding the use of transition agreements, …” (GAO Report to Congressional Committees, “BEST PRACTICES: Stronger Practices Needed to Improve DOD Technology Transition Processes,” September 2006, GAO-06-883)
• Initiative focuses on the technology transition process
  • Ensure early and complete life-cycle transition planning
  • Create a common understanding of the technology transition processes to be applied at all life-cycle stages
• Initiative goal is improved transition success
  • Improved planning using exit criteria enhances the probability and speed of the transition, increasing the confidence of acquisition programs -- REDUCE PROGRAMMING RISK!
  • A key aspect is ensuring the right people are involved earlier, for increased collaboration among the researcher, the acquisition organization, and stakeholders

Headbone Connected to the Neckbone
• Moving from the “Ruler” to the “Gate Guard”: RI3 + stagegating
• Historical situation (only slightly tongue-in-cheek):
  • The lab matures a technology to TRL 4 and throws it over the fence as a TRL 6
  • The program manager matures the TRL 4 to TRL 5 and sells it to the MDA as a TRL 6
  • The MDA concurs on milestone passage
  • The program manager runs into technology problems, finds it difficult to “invent on schedule,” and declares a Nunn-McCurdy breach
  • GAO finds one more time that we went ahead with immature technology and excessive risk

Scope the Initiative
September 2006 GAO report: “GAO recommends that DOD strengthen its technology transition processes by developing a gated process with criteria to support funding decisions; expanding the use of transition agreements, …” (GAO Report to Congressional Committees, “BEST PRACTICES: Stronger Practices Needed to Improve DOD Technology Transition Processes,” September 2006, GAO-06-883)
• Initiative focuses on the technology transition process
  • Ensure early and complete life-cycle transition planning
  • Create a common understanding of the technology transition processes to be applied at all life-cycle stages
• Initiative goal is improved transition success
• Improved planning using exit criteria enhances the probability and speed of the transition, increasing the confidence of acquisition programs -- REDUCE PROGRAMMING RISK!
• A key aspect is ensuring the right people are involved earlier, for increased collaboration among the researcher, the acquisition organization, and stakeholders

What is “New”?
• A formalized process to develop the strategy to mature and transition a new technology
• A list of detailed activities needed for technology maturation
• A mechanism to ensure robust execution of the strategy: the stage-gate process (see the sketch below)
  • A Stage is where the activities occur -- the team completes key activities (technology and programmatic) to advance the project to the next gate, with focus on the changing roles and responsibilities
  • A Gate is a decision point -- whether a project is a go, no-go, redirected or put on hold (TRL based/driven)
  • The decision is based on EXIT CRITERIA for each gate
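In code terms, a gate boils down to something like the toy check below. This is illustrative only -- real exit criteria are program-specific and go beyond a TRL threshold:

```python
def gate_decision(trl: int, required_trl: int, exit_criteria_met: bool) -> str:
    """Go/hold at a gate: the TRL threshold and the exit criteria must both hold."""
    if trl >= required_trl and exit_criteria_met:
        return "go"
    return "hold"  # a real process could also re-direct or kill the project

assert gate_decision(trl=6, required_trl=6, exit_criteria_met=True) == "go"
assert gate_decision(trl=5, required_trl=6, exit_criteria_met=True) == "hold"
```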

What is “New”? (continued)
• A formalized process, the mechanism (stage-gate criteria), and the detailed activities and milestones necessary to transition from phase to phase

To-Be Process
• Tech Development & Transition Strategy (TDTS) replaces the TTP
  • A gated approach defining the depth required at each phase
  • An integrated strategy (technology development and acquisition)
  • The TDS is the subset of the TDTS required at Milestone A
  • As the program progresses, the TDTS “morphs” into the LCMP
  • Owner: the acquisition PM
• Example -- as the team approaches:
  • MS A (TRL 4): gates/checklists ensure the TDS is complete
  • MS B (TRL 6): gates/checklists ensure the LCMP is complete
• Tech Development Strategy (TDS) (Public Law 107-314, Sec. 803): acquisition approach, supporting rationale, R&D strategy, performance goals, CSP and spirals, describe tech demo, CSP and exit criteria, develop test plan, goal/exit criteria, ensure maturity level
• LCMP: executive summary, mission/requirements, program summary, program management, business strategy, risk management, cost and performance management, test approach, product support concept

Tool Description -- TurboTPMM
• Facilitates development of the “transition strategy” for tech maturation and transition
• USAF added a graphic user interface to the model
• The TurboTPMM software tool:
  • Automates the stage-gate process
  • Is easy to use and walks the user through the process
  • TurboTax©-like graphic user interface
  • Questions aligned with the acquisition framework
  • Ensures application of systems engineering
  • Follows project management fundamentals
• DAU is also collaborating on TurboTPMM
• Designed to ask the “right” question at the “right” time

TurboTPMM – Baseline Planning

Where Do You Get These Newfangled Deals?
• RI3 Calculator and Guide: available online at the Air Force Institute of Technology, http://www.afit.edu/cse/page.cfm?page=164&sub=110
• TDTS Guidebook: available on the DAU Acquisition Community Connection (ACC), https://acc.dau.mil/CommunityBrowser.aspx?id=314696&lang=en-US
• TurboTPMM: Gunter AFB is hosting the “alpha” version
  • URL: https://www.tdr.gunter.af.mil/GCSS-SBX031
  • Username: TPMM.testuser
  • Password: P@$$word1234!@#$

Challenge to the Various Communities
• No more “group think”
  • Recognize that enthusiasm and hope can become blinders
• Anticipate risk -- it is real, it is there
• Communicate the risk to the program managers, PEOs and MDAs
• Work with the engineers, cost estimators and auditors
  • They will help you communicate risk status
  • Better yet, they can help document that your risk management plans are working
• Work with the EVM’ers
  • Don’t gotta like ‘em, but they have a direct line to the PM
  • Communicate the risk posture
  • They, too, can help document that your risk management plans are working

Challenge to the EVM Community
• The fundamental challenge to the EVM community is whether it will embrace risk assessments as a general EVM practice
• Will the EVM community join with the systems engineering and cost estimating communities in utilizing risk assessments to generate performance factors?
• NDIA introduced the concept of risk to EVM in 2007, but the EVM community as a whole has not embraced it
• PREDICTION: RISK IS COMING TO EVM

Challenge to the Cost Estimating Community
• Work with the systems engineers -- they need your help
  • Take an engineer to lunch; have an engineer take you to lunch
  • Translation: work jointly -- look for synergy -- do risk together
  • Attend engineering sufficiency reviews; have engineers attend cost sufficiency reviews
  • Participate in each other’s fact-finding trips
• Use the RI3 questions as “cost” questions
• Ideally, SEs, PMs, costers, EVM’ers and auditors will embrace the RI3 questions as their own
• Did I say you might see RI3 again?
  • Karen Richey, GAO, is having a team look at the RI3 questions to see which would make good audit questions

EVM Risk Metric
• I propose a new risk metric based on the concept that:
  • estimates from the program office are less than those from component (service)/agency headquarters;
  • estimates from service/agency headquarters are less than those from OSD CAPE; and
  • estimates from CAPE are less than the actual costs found by audit agencies upon completion of the program
• Essentially a comparison of traditional EVM with the Independent Cost Estimate
• May give too much credence to independent estimates

  Risk metric = (Program Office Estimate - Technical Baseline Estimate) / (Independent Cost Estimate - Independent Cost Estimate TBE)

Challenge to the Acquisition Community
• Change your culture
• Follow a formal risk identification process
  • Don’t just assume risk away
  • Stop sanitizing risk
  • Try telling the truth to the decision makers
• Follow a formal technology development stagegating process
  • Don’t go forward to Milestone B with immature technology

The End

Questions and Point of Contact
John Cargill, YD-1515-03
Air Force Cost Analysis Agency, Operating Location Eglin AFB, FL
850-883-3460 (w) / 703-371-0891 (c)
john.cargill@eglin.af.mil

BACKUP SLIDES

Final Note About NON-LINEAR TRLs
• Going from TRL 6 (system/subsystem model or prototype demonstration in a relevant environment) to TRL 7 (system prototype demonstration in an operational environment) is much harder than going from TRL 5 (component and/or breadboard validation in a relevant environment) to TRL 6. (AFCAA)