Taxonomy of Metrics:
Metrics for certain aspects of the project include:
- Progress, in terms of size and complexity.
- Stability, in terms of the rate of change in the requirements or implementation, size, or complexity.
- Modularity, in terms of the scope of change.
- Quality, in terms of the number and type of errors.
- Maturity, in terms of the frequency of errors.
- Resources, in terms of project expenditure versus planned expenditure.
 
| Metric | Purpose | Sample measures/perspectives |
| --- | --- | --- |
| Progress | Iteration planning; completeness | Number of classes; SLOC; function points; scenarios; test cases (these may also be collected by class and by package); amount of rework per iteration (number of classes) |
| Stability | Convergence | Number and type of changes (bug versus enhancement, interface versus implementation), which may also be collected by iteration and by package; amount of rework per iteration |
| Adaptability | Convergence; software "rework" | Average person-hours per change, which may also be collected by iteration and by package |
| Modularity | Convergence; software "scrap" | Number of classes/categories modified per change, which may also be collected by iteration |
| Quality | Iteration planning; rework indicator; release criterion | Number of errors; defect discovery rate; defect density; depth of inheritance; class coupling; size of interface (number of operations); number of methods overridden; method size (these may also be collected by class and by package) |
| Maturity | Test coverage/adequacy; robustness for use | Test hours per failure and type of failure, which may also be collected by iteration and by package |
| Expenditure profile | Financial insight; planned versus actual | Person-days per class; full-time staff per month; % of budget expended |
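To make a few of these measures concrete, the sketch below derives progress, stability, and rework figures from per-iteration data. It is only an illustrative sketch: the IterationRecord fields and the numbers are assumptions for the example, not something prescribed by the taxonomy.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical per-iteration raw data; field names are assumptions for illustration.
@dataclass
class IterationRecord:
    classes_completed: int      # progress: size in classes
    sloc: int                   # progress: size in source lines of code
    bug_changes: int            # stability: changes driven by defects
    enhancement_changes: int    # stability: changes driven by new/changed requirements
    classes_reworked: int       # rework: classes modified after initial completion

def iteration_metrics(current: IterationRecord, previous: Optional[IterationRecord] = None) -> dict:
    """Derive progress, stability, and rework measures for one iteration."""
    total_changes = current.bug_changes + current.enhancement_changes
    return {
        "progress_classes": current.classes_completed,
        "progress_sloc": current.sloc,
        "stability_change_count": total_changes,
        "stability_bug_ratio": (current.bug_changes / total_changes) if total_changes else 0.0,
        "rework_classes": current.classes_reworked,
        # Growth relative to the previous iteration, if one exists.
        "sloc_growth": (current.sloc - previous.sloc) if previous else current.sloc,
    }

# Example: two consecutive iterations with made-up figures.
it1 = IterationRecord(classes_completed=40, sloc=12000, bug_changes=8, enhancement_changes=5, classes_reworked=3)
it2 = IterationRecord(classes_completed=55, sloc=16500, bug_changes=12, enhancement_changes=4, classes_reworked=7)
print(iteration_metrics(it2, it1))
```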
- The Process: the sequence of activities invoked to produce the software product (and other artifacts)
- The Product: the artifacts of the process, including software, documents and models
- The Project: the totality of project resources, activities and artifacts
- The Resources: the people, methods and tools, time, effort and budget available to the project
Process Metrics: short-term metrics that measure the effectiveness of the product development process and can be used to predict program and product performance. Typical examples, with a small calculation sketch after the list:
- Staffing (hours) vs. plan
- Turnover rate
- Errors per 1,000 lines of code (KSLOC)
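The sketch below is a minimal illustration of two of these process metrics, errors per KSLOC and staffing variance against plan; the figures are made up.

```python
# A minimal sketch of two of the process metrics above; all numbers are illustrative.

def defect_density_per_ksloc(error_count: int, sloc: int) -> float:
    """Errors per 1,000 lines of code (KSLOC)."""
    return error_count / (sloc / 1000.0)

def staffing_variance(actual_hours: float, planned_hours: float) -> float:
    """Staffing (hours) versus plan, as a fraction of plan (positive means over plan)."""
    return (actual_hours - planned_hours) / planned_hours

print(defect_density_per_ksloc(error_count=92, sloc=46000))      # 2.0 errors/KSLOC
print(staffing_variance(actual_hours=1320, planned_hours=1200))  # 0.10 -> 10% over plan
```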
| Metrics | Comments |
| --- | --- |
| Duration | Elapsed time for the activity |
| Effort | Staff effort units (staff-hours, staff-days, ...) |
| Output | Artifacts and their size and quantity (note that this includes defects as an output of test activities) |
| Software development environment usage | CPU, storage, software tools, equipment (workstations, PCs), disposables. Note that these may be collected for a project by the Software Engineering Environment Authority (SEEA). |
| Defects: discovery rate, correction rate | Total repair time/effort and total scrap/rework (where this can be measured) also need to be collected; this will probably come from information collected against the defects (considered as artifacts). |
| Change requests: imposition rate, disposal rate | Comments as above on time/effort. |
| Other incidents that may have a bearing on these metrics (freeform text) | This is a metric in that it is a record of an event that affected the process. |
| Staff numbers, profile (over time) and characteristics | |
| Staff turnover | A useful metric that may explain at a post-mortem review why a process went particularly well, or badly. |
| Effort application | The way effort is spent during the planned activities (against which time is formally recorded for cost account management) may help explain variations in productivity. Example subclasses: training; familiarization; management (by the team lead, for example); administration; research; productive work (helpful to record by artifact, separating "think" time from capture time, particularly for documents, which tells the project manager how much of an imposition the documentation process is on the engineer's time); lost time; meetings; inspections, walkthroughs and reviews (preparation and meeting effort; some of these will be separate activities, with time and effort recorded against a specific review activity). |
| Inspections, walkthroughs, reviews (during an activity, not separately scheduled reviews) | Record the number of these, their duration, and the number of issues raised. |
| Process deviations (raised as non-compliances, requiring project change) | Record the number of these and their severity. This is an indicator that more education may be required, that the process is being misapplied, or that the process was configured incorrectly. |
| Process problems (raised as process defects, requiring process change) | Record the number of these and their severity. This is useful information at post-mortem reviews and essential feedback for the Software Engineering Process Authority (SEPA). |
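Defect discovery and correction rates from the table above can be derived directly from the defect records. The following is a minimal sketch, assuming a hypothetical defect log of opened/closed dates, grouped by ISO week:

```python
from datetime import date
from collections import Counter

# Hypothetical defect log: (date opened, date closed or None if still open).
defects = [
    (date(2024, 3, 4),  date(2024, 3, 11)),
    (date(2024, 3, 5),  date(2024, 3, 6)),
    (date(2024, 3, 12), None),
    (date(2024, 3, 13), date(2024, 3, 20)),
]

def week_of(d: date) -> str:
    """ISO year-week label, used here as the reporting period."""
    year, week, _ = d.isocalendar()
    return f"{year}-W{week:02d}"

discovery_rate  = Counter(week_of(opened) for opened, _ in defects)
correction_rate = Counter(week_of(closed) for _, closed in defects if closed)

print(dict(discovery_rate))   # defects discovered per week
print(dict(correction_rate))  # defects corrected per week
```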
Product Development Metrics:
Artifacts:
- Size: a measure of the number of things in a model, the length of something, or the extent or mass of something.
- Quality:
  - Defects: indications that an artifact does not perform as specified, is not compliant with its specification, or has other undesirable characteristics.
  - Complexity: a measure of the intricacy of a structure or algorithm. The greater the complexity, the more difficult a structure is to understand and modify, and there is evidence that complex structures are more likely to fail.
  - Coupling: a measure of how extensively the elements of a system are interconnected (a small coupling sketch follows this list).
  - Cohesion: a measure of how well an element or component meets the requirement of having a single, well-defined purpose.
  - Primitiveness: the degree to which the operations or methods of a class can be composed from others offered by the class.
- Completeness: a measure of the extent to which an artifact meets all requirements, stated and implied (the Project Manager should strive to make as much as possible explicit, to limit the risk of unfulfilled expectations). We have not chosen here to distinguish between sufficient and complete.
- Traceability: an indication that the requirements at one level are being satisfied by artifacts at a lower level and, looking the other way, that an artifact at any level has a reason to exist.
- Volatility: the degree of change in an artifact because of defects or changing requirements.
- Effort: a measure of the work (in staff-time units) required to produce an artifact.
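Coupling, for instance, can be approximated directly from a class dependency graph. The sketch below computes efferent coupling (outgoing dependencies) and afferent coupling (incoming dependencies) per class; the class names and dependency data are hypothetical and purely illustrative, not a definitive metric suite.

```python
from collections import defaultdict

# Hypothetical dependency data: each class mapped to the classes it references.
dependencies = {
    "OrderService":    ["OrderRepository", "PaymentGateway", "Notifier"],
    "OrderRepository": ["Database"],
    "PaymentGateway":  ["HttpClient", "Notifier"],
    "Notifier":        [],
}

def efferent_coupling(deps: dict) -> dict:
    """Number of distinct classes each class depends on (higher = more tightly coupled)."""
    return {cls: len(set(targets)) for cls, targets in deps.items()}

def afferent_coupling(deps: dict) -> dict:
    """Number of distinct classes that depend on each class."""
    incoming = defaultdict(set)
    for cls, targets in deps.items():
        for target in set(targets):
            incoming[target].add(cls)
    return {cls: len(incoming[cls]) for cls in deps}

print(efferent_coupling(dependencies))  # e.g. OrderService -> 3
print(afferent_coupling(dependencies))  # e.g. Notifier -> 2
```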
Documents:

| Characteristic | Metrics |
| --- | --- |
| Size | Page count |
| Effort | Staff-time units for production, change and repair |
| Volatility | Numbers of changes and defects, opened and closed; change pages |
| Quality | Measured directly through defect count |
| Completeness | Not measured directly; a judgment made through review |
| Traceability | Not measured directly; a judgment made through review |
Models:
- Use-Case Model

| Characteristic | Metrics |
| --- | --- |
| Size | |
| Effort | |
| Volatility | |
| Quality | |
| Completeness | |
| Traceability | Scenarios realized in analysis model / total scenarios; scenarios realized in design model / total scenarios; scenarios realized in implementation model / total scenarios; scenarios realized in test model (test cases) / total scenarios |
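The traceability measures above are simple coverage fractions. The following is a minimal sketch, assuming hypothetical scenario identifiers and per-model realization records:

```python
# A minimal sketch of the traceability ratios above, assuming hypothetical
# scenario identifiers and per-model realization records.
total_scenarios = {"S1", "S2", "S3", "S4", "S5"}

realized = {
    "analysis model":       {"S1", "S2", "S3", "S4", "S5"},
    "design model":         {"S1", "S2", "S3", "S4"},
    "implementation model": {"S1", "S2", "S3"},
    "test model":           {"S1", "S2"},
}

for model, scenarios in realized.items():
    ratio = len(scenarios & total_scenarios) / len(total_scenarios)
    print(f"Scenarios realized in {model}: {ratio:.0%}")  # e.g. design model: 80%
```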
- Design Model

| Characteristic | Metrics |
| --- | --- |
| Size | |
| Effort | |
| Volatility | |
| Quality | Complexity; coupling; cohesion; defects |
| Completeness | |
| Traceability | Number of classes in Implementation Model / number of classes |
- Implementation Model

| Characteristic | Metrics |
| --- | --- |
| Size | |
| Effort | |
| Volatility | |
| Quality | Complexity; coupling; cohesion; defects |
| Completeness | |
- Test Model

| Characteristic | Metrics |
| --- | --- |
| Size | |
| Effort | |
| Volatility | |
| Quality | |
| Completeness | |
| Traceability | |
- Management

Change Model: this is a notional model for consistent presentation; the metrics will be collected from whatever system is used to manage Change Requests.

| Characteristic | Metrics |
| --- | --- |
| Size | |
| Effort | |
| Volatility | |
| Completeness | |
- BCWS: Budgeted Cost for Work Scheduled
- BCWP: Budgeted Cost for Work Performed
- ACWP: Actual Cost of Work Performed
- BAC: Budget at Completion
- EAC: Estimate at Completion
- CBB: Contract Budget Base
- LRE: Latest Revised Estimate (equivalent to EAC)
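These earned-value quantities combine into the usual variance and performance indices (cost variance, schedule variance, CPI, SPI, and a CPI-based estimate at completion). A minimal sketch with illustrative figures:

```python
# Standard earned-value calculations built from the quantities above.
# All monetary figures are illustrative.
BCWS = 100_000   # budgeted cost for work scheduled to date
BCWP = 90_000    # budgeted cost for work actually performed (earned value)
ACWP = 110_000   # actual cost of work performed
BAC  = 400_000   # budget at completion

cost_variance     = BCWP - ACWP          # negative -> over cost
schedule_variance = BCWP - BCWS          # negative -> behind schedule
cpi = BCWP / ACWP                        # cost performance index
spi = BCWP / BCWS                        # schedule performance index
eac = BAC / cpi                          # one common estimate-at-completion formula

print(cost_variance, schedule_variance)  # -20000, -10000
print(round(cpi, 2), round(spi, 2))      # 0.82, 0.9
print(round(eac))                        # ~488889
```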
Resources Metrics:
- People (experience, skills, cost, performance)
- Methods and tools (in terms of effect on productivity and quality, and cost)
- Time, effort, and budget (resources consumed, resources remaining)