2020 Additive Manufacturing Workshop Final Outbrief
AM Metrics
Co-Leads: Anne Flannigan, Stephen Kuhn-Hendricks, William Peterson, Matthew Sloane, Ernesto Ureta, Timothy Vorakoumane
AM Metrics
Objectives:
• Conceptually identify the inputs and process for each metric
Ø How would the metrics be calculated?
• Identify the required pieces of data
Ø Does the required data already exist in a database?
Ø Where would the data be collected?
• Comment on the feasibility of each metric
Ø How difficult would it be to collect the required data?
• Comment on the value of each metric
Ø How useful is the metric to decision makers?
AM Metrics
Sub-Working Groups:
• Measures of Effectiveness
Ø Co-Leads: Stephen Kuhn-Hendricks; Matthew Sloane
• Measures of Capability Maturity
Ø Co-Leads: Anne Flannigan; Ernesto Ureta
• Measures of Performance
Ø Co-Leads: William Peterson; Timothy Vorakoumane
AM Metrics
Steps / Actions Completed:
• Divided into subgroups and identified critical inputs for AM metrics: cost, time, yield on investment (YOI), system maturity, and performance
• Presented metrics from academia, commercial industry, and government, and demonstrated tools to capture metrics (UDRI, GE, Navy)
• Critically reviewed the Day 1 metrics discussions with the full working group
• Provided inputs to the AM Guidebook
Ø Developing common metrics
Ø Developing methods for tracking metrics progress
AM Metrics
Issues / Challenges / Gaps Identified:
• The full potential of AM will not be realized without an appropriate policy for AM contracting
• Establish consistent and standardized guidance/policy for qualification/certification of AM facilities, machines, materials, equipment, and personnel
• Establish a uniform criticality hierarchy (risk matrix vs. green/blue/red box) across all potential AM platforms
• Platform readiness is not an AM metric, but our metrics on cost and schedule MUST consider the readiness of the platform
• The threshold for a “mature AM system” needs to be defined; alternatively, define the steps in the progression toward a fully mature system
• Subjectivity of decisions to pursue AM solutions/investment in AM
AM Metrics
Key Takeaways (Effectiveness):
Definition: Effectiveness refers to the value of AM production to system readiness with respect to cost, time, and yield
• Configuration management will be key to seeing the greatest benefits of AM
• Part use is critical in determining cost and time
Ø A “good enough” solution (stop-gap) can lower cost and shorten schedule (series of stop-gap parts?)
• Cost and schedule assessments will change dynamically with increased learning in parallel paths
Ø Same part, similar part, equipment, feedstock experience, process experience
AM Metrics
Key Takeaways (Capability Maturity):
Definition: Capability maturity refers to the “readiness” of your AM system
• Create tiered criteria for ensuring AM systems are mature enough to handle a predefined criticality, complexity, and output
Ø Requires definitions/thresholds of a mature system, which could differ by organization
Ø Potential checklist for defining maturity
• Maturity is not unidirectional; once mature, a system can also regress
• The state of maturity will continue to evolve, and our strict definition of maturity may be akin to a moving target
Ø Based upon technology advancement
AM Metrics
Key Takeaways (Performance):
Definition: Performance refers to the quality and quantity of the output/status of the AM system
• AM parts produced are impacted by complexity, criticality, material, and the level of the decision maker
Ø The number of parts produced is based on the level of the organization (field/organic, intermediate/depot, or the industry equivalent)
• AM parts produced are dependent on replacement/redesign versus novel designs
Ø Prototyping, obsolescence, tooling, and sustainment
• Qualification and certification are dependent upon the academic, commercial, or government context (DAWIA, apprenticeship, licensing)
Ø Differing considerations for operators and support personnel (engineers, technicians, administrative staff)
AM Metrics
Open Actions / Post AM Workshop Actions:
• Continual collaboration of co-leads to refine metrics definitions and begin defining calculations (white paper)
• Identify existing databases that currently capture the metrics data, and identify the pieces of information that still need to be captured
Ø Where necessary, build or append databases to capture additional information
• Develop a methodology for weighting metrics based upon impact to system readiness
• Collaboration of metrics co-leads with data management and policy groups
• Make Metrics a working group at next year's AM workshop
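The open action on weighting metrics by their impact to system readiness could take a shape like the following minimal sketch. All metric names, normalized scores, and weights here are hypothetical illustrations, not values from the workshop; the three categories simply mirror the sub-working groups.

```python
# Hypothetical sketch of a metric-weighting methodology.
# Scores are assumed to be normalized to [0, 1]; weights express each
# metric's assumed impact on system readiness (illustrative values only).

def weighted_readiness_score(scores, weights):
    """Combine normalized metric scores into one weighted readiness score.

    Weights are normalized so they sum to 1 before being applied.
    """
    total_weight = sum(weights.values())
    return sum(scores[m] * (w / total_weight) for m, w in weights.items())

# Made-up example using the three sub-working-group areas:
scores = {"effectiveness": 0.8, "capability_maturity": 0.5, "performance": 0.7}
weights = {"effectiveness": 3, "capability_maturity": 2, "performance": 1}
print(round(weighted_readiness_score(scores, weights), 3))  # → 0.683
```

A real methodology would also need agreed definitions of each metric and a defensible basis for the weights, which is exactly what the white-paper action above is meant to produce.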
AM Metrics
Questions?