84 PART TWO MANAGING SOFTWARE PROJECTS
have been developed by a number of practitioners), errors found during formal technical reviews, and lines of code or function points per module and function.² These data are reviewed by the team to uncover indicators that can improve team performance.
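Per-module indicators like these are simple ratios. As a minimal sketch, defect density can be computed as review defects per thousand lines of code (KLOC); the module names and counts below are hypothetical examples, not data from the text:

```python
# Illustrative private metrics: defects found in formal technical review
# per KLOC, computed per module. All names and numbers are invented.

def defect_density(review_defects: int, loc: int) -> float:
    """Defects found in review per thousand lines of code (KLOC)."""
    return review_defects / (loc / 1000)

modules = {
    # module: (defects found in formal technical review, lines of code)
    "parser": (12, 4_800),
    "scheduler": (3, 2_100),
    "report_gen": (9, 1_500),
}

for name, (defects, loc) in modules.items():
    print(f"{name:10s} {defect_density(defects, loc):6.2f} defects/KLOC")
```

An individual might track such numbers privately across projects to spot modules that consistently attract defects.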
Public metrics generally assimilate information that originally was private to individuals and teams. Project-level defect rates (absolutely not attributed to an individual), effort, calendar times, and related data are collected and evaluated in an attempt to uncover indicators that can improve organizational process performance. Public metrics enable an organization to make strategic changes that improve the software process and tactical changes during a software project.

Software process metrics can provide significant benefit as an organization works to improve its overall level of process maturity. However, like all metrics, these can be misused, creating more problems than they solve. Grady [GRA92] suggests a "software metrics etiquette" that is appropriate for both managers and practitioners as they institute a process metrics program:
• Use common sense and organizational sensitivity when interpreting metrics
data.
• Provide regular feedback to the individuals and teams who collect measures
and metrics.
What guidelines should be applied when we collect software metrics?

• Don't use metrics to appraise individuals.

• Work with practitioners and teams to set clear goals and metrics that will be used to achieve them.
• Never use metrics to threaten individuals or teams.
• Metrics data that indicate a problem area should not be considered “nega-
tive.” These data are merely an indicator for process improvement.
• Don’t obsess on a single metric to the exclusion of other important metrics.
WebRef: SSPI and other quality-related information is available through the American Society for Quality at www.asq.org

As an organization becomes more comfortable with the collection and use of process metrics, the derivation of simple indicators gives way to a more rigorous approach called statistical software process improvement (SSPI). In essence, SSPI uses software failure analysis to collect information about all errors and defects³ encountered as an application, system, or product is developed and used. Failure analysis works in the following manner:

1. All errors and defects are categorized by origin (e.g., flaw in specification, flaw in logic, nonconformance to standards).

2. The cost to correct each error and defect is recorded.
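These two steps amount to simple bookkeeping. A minimal sketch follows; the origin categories echo the examples in the text, while the logged defects and repair costs are invented purely for illustration:

```python
from collections import defaultdict

# Illustrative failure-analysis bookkeeping: categorize each defect by
# origin (step 1) and accumulate its correction cost (step 2).
# The defect log and cost figures below are hypothetical.

ORIGINS = {"specification", "logic", "standards"}

count_by_origin: dict[str, int] = defaultdict(int)
cost_by_origin: dict[str, float] = defaultdict(float)

def record_defect(origin: str, repair_cost: float) -> None:
    """Categorize a defect by origin and record its correction cost."""
    if origin not in ORIGINS:
        raise ValueError(f"unknown origin: {origin}")
    count_by_origin[origin] += 1
    cost_by_origin[origin] += repair_cost

# Hypothetical defect log
record_defect("logic", 1200.0)
record_defect("specification", 800.0)
record_defect("logic", 450.0)

# Origins ranked by total correction cost, most expensive first
for origin in sorted(cost_by_origin, key=cost_by_origin.get, reverse=True):
    print(origin, count_by_origin[origin], cost_by_origin[origin])
```

Ranking origins by accumulated cost is what lets later analysis steps focus improvement effort on the most expensive defect sources.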
² See Sections 4.3.1 and 4.3.2 for detailed discussions of LOC and function point metrics.

³ As we discuss in Chapter 8, an error is some flaw in a software engineering work product or deliverable that is uncovered by software engineers before the software is delivered to the end-user. A defect is a flaw that is uncovered after delivery to the end-user.