Taxonomy of Metrics:
Metrics for certain aspects of the project include:
- Progress in terms of size and complexity.
- Stability in terms of rate of change in the requirements or implementation, size, or complexity.
- Modularity in terms of the scope of change.
- Quality in terms of the number and type of errors.
- Maturity in terms of the frequency of errors.
- Resources in terms of project expenditure versus planned expenditure.
Metric | Purpose | Sample measures/perspectives |
---|---|---|
Progress | Iteration planning; completeness | Number of classes; SLOC; function points; scenarios; test cases (these may also be collected by class and by package); amount of rework per iteration (number of classes) |
Stability | Convergence | Number and type of changes (bug versus enhancement; interface versus implementation), which may also be collected by iteration and by package; amount of rework per iteration |
Adaptability | Convergence; software "rework" | Average person-hours per change (may also be collected by iteration and by package) |
Modularity | Convergence; software "scrap" | Number of classes/categories modified per change (may also be collected by iteration) |
Quality | Iteration planning; rework indicator; release criterion | Number of errors; defect discovery rate; defect density; depth of inheritance; class coupling; size of interface (number of operations); number of methods overridden; method size (these may also be collected by class and by package) |
Maturity | Test coverage/adequacy; robustness for use | Test hours per failure, and type of failure (may also be collected by iteration and by package) |
Expenditure profile | Financial insight; planned versus actual | Person-days per class; full-time staff per month; % of budget expended |
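Most of the progress and stability measures above are simple counts and ratios collected per iteration. As a hedged illustration (the snapshot fields and all sample numbers below are assumptions for this sketch, not a standard toolset), a minimal Python version:

```python
# Minimal sketch of per-iteration progress/stability tracking.
# IterationSnapshot fields and sample numbers are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class IterationSnapshot:
    classes_total: int     # classes in the design model at iteration end
    classes_reworked: int  # classes changed to repair defects ("rework")
    change_requests: int   # change requests opened during the iteration

def rework_ratio(snap: IterationSnapshot) -> float:
    """Rework indicator: fraction of the model reworked this iteration."""
    return snap.classes_reworked / snap.classes_total if snap.classes_total else 0.0

def is_converging(history: list[IterationSnapshot]) -> bool:
    """Stability indicator: change traffic should fall from iteration to iteration."""
    changes = [s.change_requests for s in history]
    return all(later <= earlier for earlier, later in zip(changes, changes[1:]))

history = [IterationSnapshot(40, 12, 30),
           IterationSnapshot(55, 8, 18),
           IterationSnapshot(60, 5, 9)]
print(f"latest rework ratio: {rework_ratio(history[-1]):.0%}")  # 8%
print("change traffic converging:", is_converging(history))     # True
```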
Metrics may be collected for four aspects of development:
· The Process: the sequence of activities invoked to produce the software product (and other artifacts).
· The Product: the artifacts of the process, including software, documents and models.
· The Project: the totality of project resources, activities and artifacts.
· The Resources: the people, methods and tools, time, effort and budget available to the project.
Process Metrics: short-term metrics that measure the effectiveness of the product development process and that can be used to predict program and product performance. Examples include:
- Staffing (hours) vs. plan
- Turnover rate
- Errors per 1,000 lines of code (KSLOC)
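The last of these is a plain normalization of the error count to code size; a minimal sketch (the counts are illustrative):

```python
# Minimal sketch of the "errors per KSLOC" process metric; numbers are illustrative.
def errors_per_ksloc(error_count: int, sloc: int) -> float:
    """Defect density normalized to thousands of source lines of code."""
    return error_count / (sloc / 1000)

print(errors_per_ksloc(42, 28_000))  # 1.5 errors per KSLOC
```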
Metrics | Comments |
---|---|
Duration | Elapsed time for the activity |
Effort | Staff effort units (staff-hours, staff-days, ...) |
Output | Artifacts and their size and quantity (note this will include defects as an output of test activities) |
Software development environment usage | CPU, storage, software tools, equipment (workstations, PCs), disposables. Note that these may be collected for a project by the Software Engineering Environment Authority (SEEA). |
Defects: discovery rate, correction rate | Total repair time/effort and total scrap/rework (where this can be measured) also need to be collected; this will probably come from information collected against the defects (considered as artifacts). |
Change requests: imposition rate, disposal rate | Comments as above on time/effort. |
Other incidents that may have a bearing on these metrics (freeform text) | This is a metric in that it is a record of an event that affected the process. |
Staff numbers, profile (over time) and characteristics | |
Staff turnover | A useful metric which may explain at a post-mortem review why a process went particularly well, or badly. |
Effort application | How effort is spent during the planned activities (against which time is formally recorded for cost-account management) may help explain variations in productivity. Useful subclasses of effort application include: training; familiarization; management (by a team lead, for example); administration; research; productive work (helpfully recorded by artifact, separating 'think' time from capture time, particularly for documents, which tells the project manager how much of an imposition the documentation process is on the engineer's time); lost time; meetings; inspections, walkthroughs and reviews (preparation and meeting effort; some of these will be separate activities with their time and effort recorded against a specific review activity). |
Inspections, walkthroughs, reviews (during an activity - not separately scheduled reviews) | Record the number of these and their duration, and the number of issues raised. |
Process deviations (raised as non-compliances, requiring project change) | Record the number of these and their severity. This is an indicator that more education may be required, that the process is being misapplied, or that the way the process was configured was incorrect. |
Process problems (raised as process defects, requiring process change) | Record the number of these and their severity. This will be useful information at the post-mortem reviews and is essential feedback for the Software Engineering Process Authority (SEPA). |
Product Development Metrics:
Artifacts:
· Size — a measure of the number of things in a model, the length of something, the extent or mass of something
· Quality
§ Defects — indications that an artifact does not perform as specified or is not compliant with its specification, or has other undesirable characteristics
§ Complexity — a measure of the intricacy of a structure or algorithm: the greater the complexity, the more difficult a structure is to understand and modify, and there is evidence that complex structures are more likely to fail
§ Coupling — a measure of how extensively elements of a system are interconnected (see the structural-metrics sketch after this list)
§ Cohesion — a measure of how well an element or component meets the requirement of having a single, well-defined, purpose
§ Primitiveness — the degree to which operations or methods of a class can be composed from others offered by the class
· Completeness — a measure of the extent to which an artifact meets all requirements (stated and implied—the Project Manager should strive to make explicit as much as possible, to limit the risk of unfulfilled expectations). We have not chosen here to distinguish between sufficient and complete.
· Traceability — an indication that the requirements at one level are being satisfied by artifacts at a lower level, and, looking the other way, that an artifact at any level has a reason to exist
· Volatility — the degree of change in an artifact because of defects or changing requirements
· Effort — a measure of the work (staff-time units) that is required to produce an artifact
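Several of these characteristics reduce to simple structural counts over the class model. As a hedged illustration only (using Python introspection on live classes; real metric suites analyze source or model repositories instead), here is how two counts that recur in the model tables below, depth of inheritance tree (DIT) and number of children (NOC), can be computed:

```python
# Hedged sketch: two structural metrics computed by runtime introspection.
def depth_of_inheritance(cls: type) -> int:
    """DIT: number of ancestors of cls, counting object as the root.
    Uses the MRO, so under multiple inheritance this is an approximation."""
    return len(cls.__mro__) - 1

def number_of_children(cls: type) -> int:
    """NOC: count of immediate subclasses."""
    return len(cls.__subclasses__())

class Shape: ...
class Polygon(Shape): ...
class Triangle(Polygon): ...
class Circle(Shape): ...

print(depth_of_inheritance(Triangle))  # 3 (Polygon, Shape, object above it)
print(number_of_children(Shape))       # 2 (Polygon and Circle)
```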
Documents:

Characteristic | Metrics |
---|---|
Size | Page count |
Effort | Staff-time units for production, change and repair |
Volatility | Numbers of changes, defects, opened, closed; change pages |
Quality | Measured directly through defect count |
Completeness | Not measured directly: judgment made through review |
Traceability | Not measured directly: judgment made through review |
Models:
§ Use-Case Model
Characteristic | Metrics |
---|---|
Size | Number of use cases; number of use-case packages; reported level of use case (see the white paper "The Estimation of Effort and Size based on Use Cases" from the Resource Center); number of scenarios, total and per use case; number of actors; length of use case (pages of event flow, for example) |
Effort | Staff-time units (with production, change and repair separated) |
Volatility | Number of defects and change requests (open, closed) |
Quality | Reported complexity (0-5, by analogy with COCOMO [BOE81], at class level; the complexity range is narrower at higher levels of abstraction; see the white paper cited above); defects: number, by severity, open and closed |
Completeness | Use cases completed (reviewed and under configuration management with no defects outstanding)/use cases identified (or the estimated number of use cases) |
Traceability | Scenarios realized in analysis model/total scenarios; scenarios realized in design model/total scenarios; scenarios realized in implementation model/total scenarios; scenarios realized in test model (test cases)/total scenarios |
§ Design Model
Characteristic | Metrics |
---|---|
Size | Number of classes; number of design subsystems; number of subsystems nested within subsystems; number of packages; methods per class (internal, external); attributes per class (internal, external); depth of inheritance tree; number of children |
Effort | Staff-time units (with production, change and repair separated) |
Volatility | Number of defects and change requests (open, closed) |
Quality: complexity | Response for a Class (RFC); this may be difficult to calculate because a complete set of interaction diagrams is needed |
Quality: coupling | Number of children; coupling between objects (class fan-out) |
Quality: cohesion | Number of children |
Quality: defects | Number of defects, by severity (open, closed) |
Completeness | Number of classes completed/number of classes estimated (identified); design traceability (in use-case model) |
Traceability | Number of classes in implementation model/number of classes |
§ Implementation Model

Characteristic | Metrics |
---|---|
Size | Number of classes; number of components; number of implementation subsystems; number of subsystems nested within subsystems; number of packages; methods per class (internal, external); attributes per class (internal, external); size of methods*; size of attributes*; depth of inheritance tree; number of children; estimated size* at completion |
Effort | Staff-time units (with production, change and repair separated) |
Volatility | Number of defects and change requests (open, closed); breakage* for each corrective or perfective change, estimated (prior to fix) and actual (upon closure) |
Quality: complexity | Response for a Class (RFC); cyclomatic complexity of methods** (see the sketch after this table) |
Quality: coupling | Number of children; coupling between objects (class fan-out); message-passing coupling (MPC)*** |
Quality: cohesion | Number of children; lack of cohesion in methods (LCOM) |
Quality: defects | Number of defects, by severity, open and closed |
Completeness | Number of classes unit tested/number of classes in design model; number of classes integrated/number of classes in design model; implementation traceability (in use-case model); test-model traceability multiplied by test completeness; active integration and system test time (accumulated from the test process), that is, time with the system operating (used for the maturity calculation) |
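The cyclomatic-complexity entry in the quality rows above can be automated. A minimal sketch using Python's standard-library ast module; the branch-node list is a simplification of the usual counting rules (production tools such as radon refine them):

```python
# Minimal sketch of cyclomatic complexity: 1 plus the number of branch points.
import ast

BRANCH_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler, ast.BoolOp, ast.IfExp)

def cyclomatic_complexity(source: str) -> int:
    tree = ast.parse(source)
    return 1 + sum(isinstance(node, BRANCH_NODES) for node in ast.walk(tree))

sample = '''
def classify(n):
    if n < 0:
        return "negative"
    for d in range(2, n):
        if n % d == 0:
            return "composite"
    return "prime"
'''
print(cyclomatic_complexity(sample))  # 4: the straight path plus if, for, if
```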
§ Test Model
Characteristic | Metrics |
---|---|
Size | Number of test cases, test procedures and test scripts |
Effort | Staff-time units (with production, change and repair separated) for production of test cases, and so on |
Volatility | Number of defects and change requests (open, closed) against the test model |
Quality | Defects: number of defects by severity, open and closed (these are defects raised against the test model itself, not defects raised by the test team against other software) |
Completeness | Number of test cases written/number of test cases estimated; test traceability (in use-case model); code coverage |
Traceability | Number of test cases reported as successful in the Test Evaluation Summary/number of test cases |
§ Management
Change Model: this is a notional model, used here for consistent presentation; the metrics will be collected from whatever system is used to manage change requests.
Characteristic | Metrics |
---|---|
Size | Number of defects and change requests, by severity and status; also categorized as perfective, adaptive or corrective changes |
Effort | Defect repair effort and change implementation effort, in staff-time units |
Volatility | Breakage (estimated, actual) for the implementation model subset |
Completeness | Number of defects discovered/number of defects predicted (if a reliability model is used; see the sketch below) |
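The completeness row assumes a defect-prediction curve. A hedged sketch using the Goel-Okumoto reliability growth model, m(t) = a(1 - e^(-bt)); the parameters a and b below are illustrative assumptions and would normally be fitted to the project's own defect-discovery data:

```python
# Hedged sketch: defects discovered versus defects predicted by a
# Goel-Okumoto reliability growth model. Parameters are illustrative.
import math

def predicted_defects(t_hours: float, a: float, b: float) -> float:
    """Expected cumulative defects discovered by test time t: a * (1 - e^(-b*t))."""
    return a * (1.0 - math.exp(-b * t_hours))

a, b = 120.0, 0.05   # assumed: 120 latent defects, discovery rate 0.05 per test hour
discovered = 78      # defects actually logged after 100 test hours
ratio = discovered / predicted_defects(100.0, a, b)
print(f"{ratio:.0%} of predicted defects discovered")  # 65%
```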
Expenditure Metrics (earned value):
· BCWS, Budgeted Cost for Work Scheduled
· BCWP, Budgeted Cost for Work Performed
· ACWP, Actual Cost of Work Performed
· BAC, Budget at Completion
· EAC, Estimate at Completion
· CBB, Contract Budget Base
· LRE, Latest Revised Estimate (EAC)
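These quantities combine into the standard earned-value indicators; a worked sketch with illustrative numbers:

```python
# Standard earned-value calculations from the quantities above; numbers illustrative.
bcws = 100_000.0  # Budgeted Cost for Work Scheduled
bcwp = 90_000.0   # Budgeted Cost for Work Performed (earned value)
acwp = 95_000.0   # Actual Cost of Work Performed
bac = 400_000.0   # Budget at Completion

cv = bcwp - acwp   # cost variance: negative means over budget
sv = bcwp - bcws   # schedule variance: negative means behind schedule
cpi = bcwp / acwp  # cost performance index
spi = bcwp / bcws  # schedule performance index
eac = bac / cpi    # one common Estimate at Completion formula

print(f"CV={cv:,.0f} SV={sv:,.0f} CPI={cpi:.2f} SPI={spi:.2f} EAC={eac:,.0f}")
# CV=-5,000 SV=-10,000 CPI=0.95 SPI=0.90 EAC=422,222
```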
Resources Metrics:
· People (experience, skills, cost, performance)
· Methods and tools (in terms of effect on productivity and quality, and cost)
· Time, effort and budget (resources consumed, resources remaining)