Fig. 7.1: Bath tub curve of hardware reliability (failure rate versus time)
Fig. 7.2: Software reliability curve (failure rate versus time)
Software may be retired only if it becomes obsolete. Some of the contributing factors are given below:
• change in environment
• change in infrastructure/technology
• major change in requirements
• increase in complexity
• extremely difficult to maintain
• deterioration in the structure of the code
• slow execution speed
• poor graphical user interfaces.
7.1.1 What is Software Reliability?
According to Bev Littlewood [LITT 79]: "Software reliability means operational reliability. Who cares how many bugs are in the program? We should be concerned with their effect on its operations".
As per IEEE standard [IEEE 90]: "Software reliability is defined as the ability of a system or component to perform its required functions under stated conditions for a specified period of time".
Software reliability is also defined as the probability that a software system fulfills its assigned task in a given environment for a predefined number of input cases, assuming that the hardware and the inputs are free of error [KOPE 79]. Hence it is the probability that the software will work without failure for a specified period of time in a given environment. Here the environment and the time are fixed. Reliability is for a fixed time under a given environment or stated conditions. So, a reliability value is always for a well defined domain.
The most acceptable definition of software reliability is: "It is the probability of a failure-free operation of a program for a specified time in a specified environment" [MUSA 87]. For example, a time-sharing system may have a reliability of 0.95 for 10 hr when employed by the average user.
This system, when executed for 10 hr, would operate without failure for 95 of these periods out of 100. As a result of the general way in which failure is defined, the concept of software reliability incorporates the notion of performance being satisfactory. For example, excessive response time at a given load level may be considered unsatisfactory, requiring that a routine be recoded in a more efficient form.
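The arithmetic behind this example can be made concrete with a short sketch. The following Python fragment is purely illustrative (the function names and the constant failure-rate model are assumptions made for the example, not something prescribed by the definitions above): it computes reliability first as the observed fraction of failure-free 10-hour periods, and then from an assumed constant failure rate using R(t) = e^(-lambda*t).

# Illustrative sketch (assumed names and model), not taken from the text.
import math

def reliability_from_runs(failure_free_runs, total_runs):
    # Fraction of fixed-length operating periods completed without failure.
    return failure_free_runs / total_runs

def reliability_exponential(failure_rate_per_hour, hours):
    # R(t) = exp(-lambda * t), assuming a constant failure rate lambda.
    return math.exp(-failure_rate_per_hour * hours)

# The time-sharing example: 95 failure-free periods out of 100 ten-hour runs.
print(reliability_from_runs(95, 100))                    # 0.95
# A failure rate of about 0.00513 per hour gives the same 10-hour reliability.
print(round(reliability_exponential(0.00513, 10), 2))    # 0.95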
[Figure: software quality attributes such as accuracy, clarity and accuracy of documentation, conformity of operational environment, usability, completeness, efficiency, testability, modifiability, adaptability, expandability and portability]
Quality has many characteristics and some are related to each other. In software, quality is commonly recognised as lack of bugs in the program. If a software has too many functional defects, then it is not meeting its basic requirement of functionality. This is usually expressed in two ways:
(i) Defect rate: number of defects per million lines of source code, per function point or any other unit (a small computation is sketched below).
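As a hedged illustration of the defect-rate idea in (i), the following Python sketch (the function and parameter names are invented for the example) computes defects per million lines of source code or per KLOC from a defect count and a size measure.

# Illustrative sketch (assumed names): computing a defect rate.
def defect_rate(defects_found, size, per_unit=1_000_000):
    # Defects per `per_unit` of size, e.g. per million lines of source code.
    return defects_found * per_unit / size

# 120 defects found in 400,000 lines of code:
print(defect_rate(120, 400_000))           # 300.0 defects per million lines
print(defect_rate(120, 400_000, 1_000))    # 0.3 defects per KLOC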
[Figure: software quality factors (correctness, reliability, efficiency, integrity and usability under product operation; maintainability, flexibility, testability; portability, reusability, interoperability) mapped to quality criteria such as traceability, completeness, consistency, accuracy, error tolerance, execution efficiency, storage efficiency, access control, access audit, operability, training, communicativeness, I/O volume, I/O rate, simplicity, conciseness, instrumentation, expandability, generality, self-descriptiveness, modularity, machine independence, software system independence, communication commonality and data commonality]
[Table: quality criteria (operability, training, communicativeness, I/O volume, I/O rate, access control, access audit, storage efficiency, execution efficiency, traceability, completeness, accuracy, error tolerance, consistency, simplicity, conciseness, instrumentation, expandability, generality, self-descriptiveness, modularity) cross-referenced against the quality factors they contribute to]
17. Instrumentation: The degree to which the software provides for measurements of its use or identification of errors.
18. Expandability: The degree to which storage requirements or software functions can be expanded.
19. Generality: The breadth of the potential application of software components.
20. Self-descriptiveness: The degree to which the documents are self-explanatory.
21. Modularity: The provision of highly independent modules.
22. Machine independence: The degree to which the software is independent of its associated hardware.
23. Software system independence: The degree to which the software is independent of its environment.
24. Communication commonality: The degree to which standard protocols and interfaces are used.
25. Data commonality: The use of standard data representations.
It is not easy to measure many quality factors. We may have to apply software metrics, if possible, to measure such factors. The quality factors are not independent, but may overlap. Some factors may impact others in a positive sense, while others may do so negatively.
We may have to study and understand such relationships very carefully. A subjective assessment of some criteria can be obtained by giving a rating on a scale from, say, 0 (extremely bad) to 10 (extremely good). Such a subjective metric is difficult to use. Different people assessing the same criterion are likely to give different ratings. This renders a proper quality assessment almost impossible.
There are other methods of assessing quality, such as decomposing a criterion into objectively measurable properties of the system. It is also difficult to measure every property objectively, but the aim is to define correctly and measure effectively.
7.2.2 Boehm Software Quality Model
7.4 CAPABILITY MATURITY MODEL
The capability maturity model (CMM) is not a software life cycle model. Instead, it is a strategy for improving the software process, irrespective of the actual life cycle model used. The CMM was developed by the Software Engineering Institute (SEI) of Carnegie-Mellon University in 1986. The CMM is used to judge the maturity of the software processes of an organization and to identify the key practices that are required to increase the maturity of these processes. The CMM is organized into five maturity levels as shown in Fig. 7.23 [PAUL 94, SCHA 96].
[Fig. 7.23: The five maturity levels of the CMM (1 Initial, 2 Repeatable, 3 Defined, 4 Managed, 5 Optimizing)]
measurements at a consistent level. These measurements establish the quantitative foundation for evaluating the project's software processes and products. Projects achieve control over their products and processes by narrowing the variation in their process performance to fall within acceptable quantitative boundaries. Meaningful variations in process performance can be distinguished from random variation (noise), particularly within established product lines. The risks involved in moving up the learning curve of a new application domain are known and carefully managed. One measure could be the number of faults detected per 1000 lines of code. A corresponding objective is to reduce this quantity (number of faults) over time.
The software process capability of level 4 organizations can be summarized as "predictable" because the process is measured and operates within measurable limits. This level of process capability allows an organization to predict trends of process and product quality within the quantitative bounds of these limits. When these limits are exceeded, action is taken to correct the situation. Software products are of predictably high quality.
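To make the level 4 idea of quantitative control concrete, the following Python sketch is an illustration only (the release data, names and the acceptable bound are assumptions, not part of the CMM): it computes faults per 1000 lines of code for successive releases and flags any value that falls outside an agreed quantitative boundary, mirroring the corrective action described above.

# Illustrative sketch (assumed data and threshold): tracking faults per KLOC
# across releases and flagging values outside an agreed quantitative bound.
def faults_per_kloc(faults, lines_of_code):
    return faults * 1000 / lines_of_code

releases = [
    # (release, faults detected, lines of code)
    ("R1", 90, 120_000),
    ("R2", 70, 135_000),
    ("R3", 80, 150_000),
]

upper_bound = 0.6  # acceptable upper limit, chosen by the organization

for name, faults, loc in releases:
    rate = faults_per_kloc(faults, loc)
    status = "within bound" if rate <= upper_bound else "exceeds bound, corrective action needed"
    print(f"{name}: {rate:.2f} faults/KLOC ({status})")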
Optimizing (Maturity Level 5)
At this level, the entire organization is focused on continuous process improvement. The organizations have the means to identify weaknesses and strengthen the process proactively, with the goal of preventing the occurrence of defects. Data on the effectiveness of the software process is used to perform cost-benefit analysis of new technologies and proposed changes to the organization's software process. Innovations that exploit the best software engineering practices are identified and transferred throughout the organization.
Software project teams in level 5 organizations "analyze defects to determine their causes". Software processes are evaluated to prevent known types of defects from recurring, and lessons learned are disseminated to other projects.
The software process capability of level 5 organizations can be characterized as "continuously improving" because level 5 organizations are continuously striving to improve the range of their process capability, thereby improving the process performance of their projects. Improvement occurs both by incremental advancements in the existing process and by innovations using new technologies and methods.
These five maturity levels are summarized in Fig. 7.24 [SCHA 96].

Maturity level    Characterization
Initial           Ad hoc process
Repeatable        Basic project management
Defined           Process definition
Managed           Process measurement
Optimizing        Process control

Fig. 7.24: The five levels of CMM
Experience with the capability maturity model has shown that advancing a complete maturity level usually takes from 18 months to 3 years, but moving from level 1 to level 2 sometimes takes 3 or even 5 years. This is a reflection of how difficult it is to instill a methodical approach in an organization that, up to now, has functioned on a purely ad hoc and reactive basis.