Activity Metrics
SE 450 Software Processes & Product Metrics

Overview
• Metrics that indicate how well we are performing various activities:
  - Requirements, design, coding, testing, maintenance, configuration management, quality engineering.
• Most of these are relatively crude indicators:
  - Outlier values indicate possible problems.
  - Good values are not conclusive indicators of goodness.
  - Most do not measure the actual quality of the output; they just detect some kinds of problems.
  - Rather like MS Word's green underlines that flag possible grammar problems.
• Many of these metrics can be generated by tools and don't require additional effort or process changes:
  - Cheap ways to get some additional useful feedback.

Requirements
• Requirements volatility:
  - Average number of changes per requirement.
  - Requirements changes grouped by source.
• Requirements density:
  - Number of requirements per function point or KLOC.
  - An indicator of the granularity of requirements capture.
• Number of variations per use case:
  - An indicator of coverage of exception situations.
• Requirements defects classification.
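The two ratio metrics above are straightforward to compute from a change log and a requirements count. A minimal sketch, with hypothetical record layouts (each change is a `(requirement_id, change_source)` pair):

```python
def requirements_volatility(change_log, num_requirements):
    """Average number of recorded changes per requirement."""
    return len(change_log) / num_requirements

def requirements_density(num_requirements, size_kloc):
    """Requirements per KLOC; an unusually low value may indicate
    coarse-grained requirements capture."""
    return num_requirements / size_kloc

# Hypothetical project data.
changes = [("R1", "customer"), ("R1", "team"), ("R3", "customer")]
print(requirements_volatility(changes, num_requirements=10))   # 0.3
print(requirements_density(num_requirements=10, size_kloc=4.0))  # 2.5
```

Grouping the same `changes` list by its source field then gives the "changes grouped by source" view.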

Requirements Defects Classification
• Requirements defects can be classified:
  - Requirements discovery: missed requirements, misunderstood requirements.
    Indicators of elicitation effectiveness.
  - Requirements errors: consistency, completeness, and specification errors.
    Indicators of the effectiveness of requirements analysis and specification.
  - Team-originated updates and enhancements:
    Indicators of the effectiveness of architecture and solution design practices, and of specification (cases not considered).
  - Customer-originated updates:
    Can't be controlled, but are opportunities for improving elicitation.
• The same classification can be done for any of the activities.
  - Same concept as DRE (defect removal efficiency).
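In practice this classification is used by tallying defects per category and examining the proportions. A thin sketch, with hypothetical category labels:

```python
from collections import Counter

def classification_shares(defects):
    """Fraction of defects falling in each classification category."""
    counts = Counter(category for _, category in defects)
    total = len(defects)
    return {cat: n / total for cat, n in counts.items()}

# Hypothetical defect records: (defect_id, category).
defects = [
    (1, "discovery"), (2, "discovery"),
    (3, "requirements-error"),
    (4, "customer-update"),
]
print(classification_shares(defects))
# A high 'discovery' share points at elicitation effectiveness.
```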

Design Metrics
• Cohesion.
• Coupling.
• Fan-in / fan-out:
  - Fan-out: number of methods called by each method; fan-in: number of methods that call it.
  - Keep within control limits.
  - Low fan-out may indicate too much hierarchy.
  - High fan-out may indicate too many dependencies.
  - These are not absolute rules at all!
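Given a call graph, computing fan-out and flagging methods outside chosen control limits takes only a few lines. A sketch assuming a hypothetical dictionary representation of the call graph:

```python
def fan_out(call_graph):
    """Fan-out: number of distinct methods each method calls."""
    return {m: len(set(callees)) for m, callees in call_graph.items()}

def outside_limits(metric, low, high):
    """Methods whose metric value falls outside the control limits."""
    return [m for m, v in metric.items() if v < low or v > high]

# Hypothetical call graph: method -> methods it calls.
graph = {"a": ["b", "c", "d", "e", "f", "g"], "b": ["c"], "c": []}
fo = fan_out(graph)
print(outside_limits(fo, low=1, high=5))  # ['a', 'c']
```

The limits (1 and 5 here) are arbitrary for illustration; per the slide, they should be treated as indicators, not rules.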

Object-Oriented Design Metrics
• Average method size: less is good.
• Number of methods per class: keep within control limits.
• Number of instance variables per class: keep within limits.
• Class hierarchy nesting level: < 7 (a guideline).
• Number of subsystem-to-subsystem relationships.
  - Less is good? Control limits?
• Number of class-to-class relationships within a subsystem.
  - High is good: indicates higher cohesion.
• Instance variable grouping among methods.
  - May indicate the possibility of splitting the class.
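The last bullet can be made concrete: map each method to the instance variables it touches, then merge methods that (transitively) share variables. Disjoint groups suggest a candidate class split. A minimal sketch over a hypothetical usage map:

```python
def variable_groups(method_vars):
    """Group methods that transitively share instance variables.
    More than one disjoint group hints the class could be split."""
    groups = []  # each entry: (set of methods, set of variables)
    for method, vars_used in method_vars.items():
        merged_methods, merged_vars = {method}, set(vars_used)
        remaining = []
        for methods, vs in groups:
            if vs & merged_vars:            # shares a variable: merge
                merged_methods |= methods
                merged_vars |= vs
            else:
                remaining.append((methods, vs))
        groups = remaining + [(merged_methods, merged_vars)]
    return groups

# Hypothetical class: two method clusters touching disjoint fields.
usage = {"load": ["path", "cache"], "save": ["path"],
         "draw": ["color"], "resize": ["color", "size"]}
print(len(variable_groups(usage)))  # 2
```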

Code Complexity Metrics
• Comment density.
  - Does not tell you the quality of the comments!
• Cyclomatic complexity:
  - Number of linearly independent paths through a procedure (decision points + 1).
  - Tells you more about code-writing style.
  - Extreme values might indicate problems.
  - Supposedly useful for estimating the complexity of software and expected error rates.
  - Less applicable to object-oriented code than to procedural code.
• Software science (Halstead):
  - A set of equations that try to derive parametric relationships among different software parameters and produce estimates of "difficulty", expected effort, faults, etc.
  - Not really proven empirically, and of unclear value.
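A crude approximation of McCabe's cyclomatic complexity for a Python function can be computed by counting decision points with the standard `ast` module; real tools handle many more constructs, so treat this as a sketch:

```python
import ast

def cyclomatic_complexity(source):
    """Crude McCabe estimate: 1 + number of decision points
    (if/for/while/ternary/except handlers, plus boolean operators)."""
    decisions = 0
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, (ast.If, ast.For, ast.While,
                             ast.IfExp, ast.ExceptHandler)):
            decisions += 1
        elif isinstance(node, ast.BoolOp):
            decisions += len(node.values) - 1  # each and/or adds a branch
    return 1 + decisions

code = """
def classify(x):
    if x < 0 or x > 100:
        return "out of range"
    for _ in range(3):
        if x % 2:
            return "odd"
    return "small even-ish"
"""
print(cyclomatic_complexity(code))  # 1 + if + or + for + if = 5
```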

Historical Perspective
• Much of the early work in metrics was on code complexity and design complexity.
  - Of rather limited value, since it quickly becomes prescriptive about coding practices, and its outputs are indicators at best.
  - Easily runs into various religious arguments.
  - Even now, this is what people think of when you mention metrics.
• Metrics work has since moved on to measuring:
  - The customer's view of the product.
  - Aspects that give you clearer insight into improving development.
• Most practitioners have not caught up with this yet.

Test Metrics: Coverage
• Black box:
  - Requirements coverage: test cases per requirement.
    Works with use cases, user stories, or numbered requirements.
  - Equivalence class coverage: extent of coverage of the equivalence classes of the input parameters.
  - Coverage of combinations of equivalence classes: this is the real challenge.
• Glass box:
  - Function coverage.
  - Statement coverage.
  - Path coverage.
• Tools exist that automatically generate coverage statistics.
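Requirements coverage falls out directly from traceability data linking test cases to requirement ids. A sketch with hypothetical identifiers:

```python
def requirements_coverage(requirement_ids, test_cases):
    """Fraction of requirements referenced by at least one test case."""
    covered = {req for _, req in test_cases}
    return len(covered & set(requirement_ids)) / len(requirement_ids)

# Hypothetical traceability: (test_id, requirement_id) pairs.
reqs = ["R1", "R2", "R3", "R4"]
tests = [("T1", "R1"), ("T2", "R1"), ("T3", "R3")]
print(requirements_coverage(reqs, tests))  # 0.5
```

The same shape works for use cases or user stories; only the ids change.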

Test Progress
• S-curve:
  - Histogram of the number of test cases attempted / successful over time.
• Test defect arrival rate.
  - Similar to reliability growth curves.
• Test defect backlog curve:
  - Cumulative defects not yet fixed.
  - Shows the effectiveness of resolving bugs.
• Number of crashes over time.
  - Similar to a reliability curve, but less formal.
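The defect backlog curve is just a running sum of arrivals minus closures per period. A minimal sketch over hypothetical weekly counts:

```python
def backlog_curve(arrivals, closures):
    """Cumulative open-defect backlog per period:
    running sum of (defects arrived - defects closed)."""
    backlog, curve = 0, []
    for arrived, closed in zip(arrivals, closures):
        backlog += arrived - closed
        curve.append(backlog)
    return curve

# Hypothetical weekly counts during a test cycle.
arrived = [5, 8, 6, 3, 1]
closed  = [2, 4, 7, 5, 3]
print(backlog_curve(arrived, closed))  # [3, 7, 6, 4, 2]
```

A curve that keeps rising late in the cycle signals that fixing is not keeping up with discovery.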

Maintenance Metrics
• Fix backlog:
  - Age of open and closed problems.
  - Backlog management index: closure rate / arrival rate.
• Fix response time: mean time from open to closed.
• Fixing effectiveness: 1 - (% of bad fixes).
• Fixing delinquency: % of fixes not closed within the acceptable response time.
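These maintenance metrics are simple ratios over fix records. A sketch assuming hypothetical `(open_day, close_day)` pairs:

```python
def backlog_management_index(closed, arrived):
    """BMI: closure rate over arrival rate; > 1 means the backlog shrinks."""
    return closed / arrived

def mean_response_time(fixes):
    """Mean days from problem open to problem close."""
    return sum(close - open_ for open_, close in fixes) / len(fixes)

def delinquency(fixes, acceptable_days):
    """Fraction of fixes NOT closed within the acceptable response time."""
    late = sum(1 for open_, close in fixes if close - open_ > acceptable_days)
    return late / len(fixes)

# Hypothetical fix records for one month.
fixes = [(0, 2), (1, 9), (3, 4), (5, 20)]
print(backlog_management_index(closed=4, arrived=5))  # 0.8
print(mean_response_time(fixes))                      # 6.5
print(delinquency(fixes, acceptable_days=7))          # 0.5
```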

Configuration Management
• Defect classification can provide insight into the sources of CM problems.
• Also, "Configuration Status Accounting" (CSA):
  - A tool-based cross-check of expected progress.
  - As the project moves through different phases, we expect different documents to be generated or modified.
  - CSA reports which files are being modified.
  - If the expected modifications are pre-configured, discrepancies can be flagged.
  - Can go deeper and look at the extent of the modifications.
  - Also useful for monitoring which files are modified during bug fixes, and hence which regression tests need to be run.
  - A powerful, advanced technique.
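The discrepancy-flagging idea reduces to a set comparison between the files expected to change in a phase and the files actually touched. A sketch with hypothetical file names:

```python
def csa_discrepancies(phase_expected, modified_files):
    """Flag files modified outside the set expected for this phase,
    and expected files that were never touched."""
    expected, touched = set(phase_expected), set(modified_files)
    return {"unexpected": sorted(touched - expected),
            "untouched": sorted(expected - touched)}

# Hypothetical expectation for a 'design' phase.
expected = ["design.md", "interfaces.md"]
touched = ["design.md", "parser.c"]
print(csa_discrepancies(expected, touched))
# {'unexpected': ['parser.c'], 'untouched': ['interfaces.md']}
```

Source code changing during the design phase (`parser.c` here) is exactly the kind of discrepancy the slide describes.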

Quality Engineering
• Assessment results:
  - Red/yellow/green ratings on practices in each area, e.g. requirements, planning, CM, etc.
• Classifying defects: defects related to "not following process".
• Shape of various curves:
  - E.g. wide variations in estimation accuracy or defect injection rates might show non-uniform practices.
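"Wide variation" can be quantified with the coefficient of variation (standard deviation relative to the mean), using the standard `statistics` module. A sketch over hypothetical estimation-accuracy ratios (actual/estimated effort):

```python
from statistics import mean, pstdev

def coefficient_of_variation(values):
    """Spread relative to the mean; a high value suggests the
    underlying practice is not applied uniformly."""
    return pstdev(values) / mean(values)

# Hypothetical accuracy ratios: one consistent team, one erratic team.
team_a = [1.0, 1.1, 0.9, 1.0]
team_b = [0.4, 2.1, 1.0, 3.0]
print(coefficient_of_variation(team_a) < coefficient_of_variation(team_b))  # True
```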

In-Process Metrics
• Metrics can help us determine whether projects went well and where the problems are.
• Some metrics are only meaningful after the project is done, e.g. productivity and cycle time.
• Other metrics can be used to diagnose problems while the project is in progress, or to ensure that activities are done right:
  - Most activity metrics are used that way.
  - Defect density, and even DRE and defect removal patterns, can be used that way, but with care.
  - Many metrics are not fully available until the end of the project, but we can monitor how the metric evolves as the project proceeds.
• Most in-process metrics are like dashboard gauges: out-of-range values indicate problems, but "good" values do not guarantee health.

Summary
• Activity metrics help us gauge the quality of activities:
  - Most are useful as indicators, but crude and inconclusive.
  - Cheap to generate, so good benefit/cost.
  - Don't "work to the metrics"!
• People are constantly coming up with new ones.