
7 Software Reliability

Software reliability. Does it exist? Computer program instructions cannot break or wear out. Hence, predicting software reliability is a different concept as compared to predicting hardware reliability. Software becomes more reliable over time, instead of wearing out. It becomes obsolete as the environment for which it was developed changes. Hardware redundancy allows us to make a system as reliable as we desire, if we use a large number of components with a given reliability. We do not have such techniques in software, and we may not get such techniques in the foreseeable future.
Initially, problems in programming were blamed on the severe constraints imposed by the hardware. However, hardware has now become more reliable, flexible and versatile, yet the problems with programming have not decreased.

7.1 BASIC CONCEPTS


The term reliability is often misunderstood in the software field since software does not break or wear out in the physical sense. It either works in a given environment or it does not. Hence, the traditional 'bath tub' curve of hardware reliability is not applicable here. The 'bath tub' curve is given in Fig. 7.1.
As indicated, there are three phases in the life of any hardware component, i.e., burn-in, useful life and wear-out. In the burn-in phase, the failure rate is quite high initially, and it starts decreasing gradually as time progresses. This may be due to initial testing on the premises of the organisation. During the useful life period, the failure rate is approximately constant. The failure rate increases in the wear-out phase due to wearing out/aging of components. The best period is the useful life period. The shape of this curve is like a bathtub, and that is why it is known as the 'bath tub' curve.

Fig. 7.1: Bath tub curve of hardware reliability (failure rate versus time, showing the burn-in, useful life and wear-out phases)
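During the useful life period the failure rate is approximately constant. Under that standard constant-rate assumption (the exponential formula itself is not derived in the text, and the function name below is ours), the reliability of a hardware component over a mission of t hours can be sketched as follows:

```python
import math

def reliability(failure_rate_per_hour: float, mission_hours: float) -> float:
    """R(t) = exp(-lambda * t): probability of operating for mission_hours without
    failure, assuming a constant failure rate (the useful-life phase of the curve)."""
    return math.exp(-failure_rate_per_hour * mission_hours)

# e.g. a constant rate of 0.001 failures/hour over a 100-hour mission
print(reliability(failure_rate_per_hour=0.001, mission_hours=100))   # ~0.905
```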
We do not have a wear-out phase in software. The expected curve for software is given in Fig. 7.2.
Fig. 7.2: Software reliability curve (failure rate versus time, showing the testing phase, useful life and obsolescence)
Software may be retired only if it becomes obsolete. Some of the contributing factors are given below:
• change in environment
• change in infrastructure/technology
• major change in requirements
• increase in complexity
• extremely difficult to maintain
• deterioration in structure of the code
• slow execution speed
• poor graphical user interfaces.
7.1.1 What is Software Reliability?
According to Bev Littlewood [LITT 79]: "Software reliability means operational reliability. Who cares how many bugs are in the program? We should be concerned with their effect on its operations".
As per the IEEE standard [IEEE 90]: "Software reliability is defined as the ability of a system or component to perform its required functions under stated conditions for a specified period of time".
Software reliability is also defined as the probability that a software system fulfils its assigned task in a given environment for a predefined number of input cases, assuming that the hardware and the inputs are free of error [KOPE 79]. Hence it is the probability that the software will work without failure for a specified period of time in a given environment. Here the environment and time are fixed. Reliability is for a fixed time under a given environment or stated conditions. So, a reliability value is always for a well defined domain.
The most acceptable definition of software reliability is: "It is the probability of a failure-free operation of a program for a specified time in a specified environment" [MUSA 87].

For example, a time-sharing system may have a reliability of 0.95 for 10 hr when employed by the average user. This system, when executed for 10 hr, would operate without failure for 95 of these periods out of 100. As a result of the general way in which we defined failure, note that the concept of software reliability incorporates the notion of performance being satisfactory. For example, excessive response time at a given load level may be considered unsatisfactory, so that a routine must be recoded in a more efficient form.
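Read as a relative frequency, the 0.95 figure is simply the fraction of fixed-length runs that complete without failure. A minimal sketch of that reading (the function name is ours, not from the text):

```python
def estimated_reliability(failure_free_runs: int, total_runs: int) -> float:
    """Fraction of fixed-length runs (e.g. 10-hour periods) completed without failure."""
    if total_runs <= 0:
        raise ValueError("total_runs must be positive")
    return failure_free_runs / total_runs

# 95 out of 100 ten-hour periods ran without failure, i.e. R(10 hr) = 0.95
print(estimated_reliability(95, 100))
```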

7.1.2 Software Reliability and Hardware Reliability


The field of hardware reliability has been established for some time. Hence, one might ask how software reliability relates to it. In reality, the division between hardware and software reliability is somewhat artificial. Both may be defined in the same way. Therefore, one may combine hardware and software reliabilities to obtain system reliability.
7.2 SOFTWARE QUALITY
Our objective in software engineering is to produce good quality maintainable software in time and within budget. Here quality is very important. What do we understand by the term "quality"? It is not easy to define quality. People understand quality and appreciate quality, but may not be able to clearly express it. It is like beauty, which is very much in the eyes of the beholder.
Different people attach different meanings to quality, such as:
• conformance to requirements
• fitness for purpose
• level of satisfaction
If a product meets its requirements, we may say it is a good quality product. We expect that requirements are clearly stated and cannot be misunderstood. Everything is measured with respect to requirements, and if the product matches them, it is a quality product. If a car is designed with a maximum speed of 150 km/hour and it fails to achieve this in the field, then it is not meeting its requirements. If two cars are designed with different style, performance and economy, and both are up to the standards set for them, then both are quality cars.
When a user uses the product and finds it fit for its purpose, he/she feels that the product is of good quality. If the product meets user requirements, a feeling of satisfaction may emerge, and this satisfaction is nothing but the satisfaction with quality.
In a broad sense, the user's view of quality must deal with the product's ease of installation, operational efficiency, and convenience. If the product is easy to handle and we comfortably remember how to use it, then the customer satisfaction level may increase.
Fig. 7.8: Software quality attributes (the four attribute domains of reliability, usability, maintainability and adaptability, each broken into the attributes detailed in Table 7.4)

Quality has many characteristics and some are related to each other. In software, quality is commonly recognised as lack of bugs in the program. If a software has too many functional defects, then it is not meeting its basic requirement of functionality. This is usually expressed in two ways:
(i) Defect rate: the number of defects per million lines of source code, per function point or any other unit.
(ii) Reliability: generally measured as the number of failures per n hours of operation, mean time to failure, or the probability of failure-free operation for a specified time under a specified environment.
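As an illustration only (the function names and sample figures below are ours; the text allows any unit of size), the two measures might be computed like this:

```python
def defect_rate(defects: int, size: float) -> float:
    """Defects per unit of size (per KLOC, per function point, or any other unit)."""
    return defects / size

def mean_time_to_failure(hours_of_operation: float, failures: int) -> float:
    """Average hours of operation between observed failures."""
    if failures == 0:
        raise ValueError("no failures observed; MTTF must be estimated another way")
    return hours_of_operation / failures

print(defect_rate(defects=30, size=60.0))                            # 0.5 defects per KLOC for a 60 KLOC program
print(mean_time_to_failure(hours_of_operation=1000.0, failures=4))   # 250.0 hours
```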
When we deal with software quality, a list of attributes that are appropriate for software is required to be defined. There are four attribute domains [DUNN 90], which should be defined. These are usually the ones most entrusted by the customer:
• Reliability
• Usability
• Maintainability
• Adaptability
The four attribute domains can be divided into attributes that are more commonly understood by the software community and are given in Fig. 7.8. The details of the software quality attributes are given in Table 7.4.

Table 7.4: Software quality attributes
1. Reliability: The extent to which a software performs its intended functions without failure.
2. Correctness: The extent to which a software meets its specifications.
3. Consistency and precision: The extent to which a software is consistent and gives results with precision.
4. Robustness: The extent to which a software tolerates unexpected problems.
5. Simplicity: The extent to which a software is simple in its operations.
6. Traceability: The extent to which an error is traceable in order to fix it.
7. Usability: The extent of effort required to learn, operate and understand the functions of the software.
8. Accuracy: Meeting specifications with precision.
9. Clarity and accuracy of documentation: The extent to which documents are clearly and accurately written.
10. Conformity of operational environment: The extent to which a software is in conformity with its operational environment.
11. Completeness: The extent to which a software has its specified functions.
12. Efficiency: The amount of computing resources and code required by a software to perform a function.
13. Testability: The effort required to test a software to ensure that it performs its intended functions.
14. Maintainability: The effort required to locate and fix an error during the maintenance phase.
15. Modularity: The extent of ease to implement, test, debug and maintain the software.
16. Readability: The extent to which a software is readable in order to understand it.
17. Adaptability: The extent to which a software is adaptable to new platforms and technologies.
18. Modifiability: The effort required to modify a software during the maintenance phase.
19. Expandability: The extent to which a software is expandable without undesirable side effects.
20. Portability: The effort required to transfer a program from one platform to another platform.
The attribute domain and attributes are also named as the factor and criteria. There are many models of software quality and some are discussed here.
7.2.1 McCall Software Quality Model
The McCall et al. [MCCA 77] model of software quality was introduced in 1977 and many quality factors were incorporated. The model distinguishes between two levels of quality attributes. Higher level quality attributes are known as quality factors. These are external attributes and cannot be measured directly. The second level of quality attributes is named quality criteria. Quality criteria can be measured either subjectively or objectively. The software quality factors are organised into three groups, as shown in Fig. 7.9.

Fig. 7.9: Software quality factors (organised into product operation, product revision and product transition)

(i) Product Operation: Here, the factors which are related to the operation of a product are combined. The factors are:
• Correctness
• Efficiency
• Integrity
• Reliability
• Usability
These five factors are related to operational performance, convenience, ease of usage and correctness. These factors play a very significant role in building customer satisfaction.
(ii) Product Revision: The factors which are required for testing and maintenance are combined and are given below:
• Maintainability
• Flexibility
• Testability
These factors pertain to the testing and maintainability of software. They give us an idea about the ease of maintenance, flexibility and testing effort. Hence, they are combined under the umbrella of product revision.
(iii) Product Transition: We may have to transfer a product from one platform to another platform or from one technology to another technology. The factors related to such a transfer are combined and are given below (the three groups are also summarised in the sketch that follows this list):
• Portability
• Reusability
• Interoperability
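The three groups above can be captured as a small lookup structure. A minimal sketch (the grouping comes from the text; the dictionary and identifier names are ours):

```python
# McCall's three product quality groups and their factors, as listed above.
MCCALL_FACTOR_GROUPS: dict[str, list[str]] = {
    "product operation":  ["correctness", "efficiency", "integrity", "reliability", "usability"],
    "product revision":   ["maintainability", "flexibility", "testability"],
    "product transition": ["portability", "reusability", "interoperability"],
}

# Look up which group a given factor belongs to.
factor = "testability"
group = next(g for g, factors in MCCALL_FACTOR_GROUPS.items() if factor in factors)
print(f"{factor} belongs to {group}")   # testability belongs to product revision
```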
Most of the quality factors are explained in Table 7.4. The remaining factors are given
in Table 7.5.

Table 7.5: Remaining quality factors (others are in Table 7.4)
1. Integrity: The extent to which access to software or data by unauthorised persons can be controlled.
2. Flexibility: The effort required to modify an operational program.
3. Reusability: The extent to which a program can be reused in other applications.
4. Interoperability: The effort required to couple one system with another.

Quality criteria
The second level of quality attributes are termed quality criteria. We have eleven quality factors and each quality factor has many second level quality attributes, which are shown in Fig. 7.10. Second level attributes are internal attributes. However, users and managers are interested in the higher level external quality attributes. For example, we may not directly measure the reliability of a software system. We may, however, directly measure the number of defects encountered so far. This direct measure can be used to obtain insight into the reliability of the system. This involves a theory of how the number of defects encountered relates to reliability, which can be ascertained on good grounds. For most other aspects of quality though, the relation between the attributes that can be measured directly and the external attributes we are interested in is less obvious, to say the least [VLIE 02]. The relationship between quality factors and quality criteria is given in Table 7.5(a). The definitions and details of these quality criteria are discussed in Table 7.5(b).
Fig. 7.10: McCall's quality model (the eleven quality factors mapped to the twenty-five quality criteria listed in Tables 7.5(a) and 7.5(b))


Table 7.5(a): Relation between quality factors and quality criteria (a cross-reference matrix indicating, for each of the twenty-five quality criteria below, which of the eleven quality factors it contributes to)
Table 7.5(b): Software quality criteria
1. Operability: The ease of operation of the software.
2. Training: The ease with which new users can use the system.
3. Communicativeness: The ease with which inputs and outputs can be assimilated.
4. I/O volume: It is related to the I/O volume.
5. I/O rate: It is the indication of the I/O rate.
6. Access control: The provisions for control and protection of the software and data.
7. Access audit: The ease with which software and data can be checked for compliance with standards or other requirements.
8. Storage efficiency: The run-time storage requirements of the software.
9. Execution efficiency: The run-time efficiency of the software.
10. Traceability: The ability to link software components to requirements.
11. Completeness: The degree to which a full implementation of the required functionality has been achieved.
12. Accuracy: The precision of computations and output.
13. Error tolerance: The degree to which continuity of operation is ensured under adverse conditions.
14. Consistency: The use of uniform design and implementation techniques and notations throughout a project.
15. Simplicity: The ease with which the software can be understood.
16. Conciseness: The compactness of the source code, in terms of lines of code.
17. Instrumentation: The degree to which the software provides for measurement of its use or identification of errors.
18. Expandability: The degree to which storage requirements or software functions can be expanded.
19. Generality: The breadth of the potential application of software components.
20. Self-descriptiveness: The degree to which the documents are self-explanatory.
21. Modularity: The provision of highly independent modules.
22. Machine independence: The degree to which the software is independent of its associated hardware.
23. Software system independence: The degree to which the software is independent of its environment.
24. Communication commonality: The degree to which standard protocols and interfaces are used.
25. Data commonality: The use of standard data representations.
It is not easy to measure many quality factors. We may have to apply software metrics, if possible, to measure such factors. The quality factors are not independent, but may overlap. Some factors may impact others in a positive sense, while others may do so negatively. We may have to study and understand such relationships very carefully. A subjective assessment of some criteria can be obtained by giving a rating on a scale from, say, 0 (extremely bad) to 10 (extremely good). Such a subjective metric is difficult to use. Different people assessing the same criterion are likely to give different ratings. This renders a proper quality assessment almost impossible.
There are other methods of assessing quality, such as decomposing a criterion into objectively measurable properties of the system. It is also difficult to measure every property objectively, but the aim is to define correctly and measure effectively.
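As an illustration of how such subjective 0-to-10 ratings might be pooled (the aggregation scheme and names below are ours, not from the text):

```python
from statistics import mean, stdev

def summarise_ratings(ratings: list[int]) -> tuple[float, float]:
    """Average and spread of 0-10 ratings given by several assessors to one criterion."""
    if any(r < 0 or r > 10 for r in ratings):
        raise ValueError("ratings must be on the 0 (extremely bad) to 10 (extremely good) scale")
    return mean(ratings), stdev(ratings)

# Three assessors rate the same criterion quite differently; the large spread
# illustrates why such subjective metrics are hard to use.
average, spread = summarise_ratings([3, 8, 6])
print(f"average = {average:.1f}, spread = {spread:.1f}")
```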
7.2.2 Boehm Software Quality Model
7.4 CAPABILITY MATURITY MODEL
The capability maturity model (CMM) is not a software life cycle model. Instead, it is a strategy for improving the software process, irrespective of the actual life cycle model used. The CMM was developed by the Software Engineering Institute (SEI) of Carnegie-Mellon University in 1986. The CMM is used to judge the maturity of the software processes of an organization and to identify the key practices that are required to increase the maturity of these processes. The CMM is organized into five maturity levels as shown in Fig. 7.23 [PAUL 94, SCHA 96].
Fig. 7.23: Maturity levels of CMM (level 1: Initial, level 2: Repeatable, level 3: Defined, level 4: Managed, level 5: Optimizing)

7.4.1 Maturity Levels


1. Initial (Maturity Level 1)
At this, the lowest level, there are essentially no sound software engineering management
practices in place in the organization. Instead, everything is done on an ad hoc basis. If one specific project happens to be staffed by a competent manager and a good software development team, then that project may be successful. However, the usual pattern is time and cost overruns caused by a lack of sound management in general, and planning in particular. As a result, most activities are responses to crises, rather than preplanned tasks. In maturity-level 1 organizations, the software process is unpredictable, because it depends totally on the current staff; as the staff changes, so does the process. As a consequence, it is impossible to predict, with any accuracy, important items such as the time it will take to develop a product or the cost of that product. It is an unfortunate fact that the vast majority of software organizations all over the world are level 1 organizations.
2. Repeatable (Maturity Level 2)
At this level, policies for managing a software project and procedures to implement those policies are established. Planning and managing new projects is based on experience with similar projects. An objective in achieving level 2 is to institutionalize effective management processes for software projects, which allow organizations to repeat successful practices developed on earlier projects, although the specific processes implemented by the projects may differ. An effective process can be characterized as practiced, documented, enforced, trained, measured, and amenable to improvement.
Instead of functioning in crisis mode as in level 1, managers identify problems as they arise and take immediate corrective action to prevent them from becoming crises. The key point is that, without measurement, it is impossible to detect problems before they get out of hand. Measurements may include the careful tracking of costs, schedules and functionality. Also, measurements taken during one project can be used to draw up realistic duration and cost schedules for future projects.
Projects in level 2 organizations have installed basic software management controls. Realistic project commitments are based on the results observed on previous projects and on the requirements of the current projects. Software project standards are defined, and the organization ensures they are faithfully followed. The software project team works with the subcontractors, if any, to establish a strong customer-supplier relationship.
The software process capability of level 2 organizations can be summarized as disciplined, because planning and tracking of the software project is stable and earlier successes can be repeated. The project's process is under the effective control of a project management system, following realistic plans based on the performance of previous projects.
3. Defined (Maturity Level 3)
At this level, the standard process for developing and maintaining software across the
organization is documented, including both software engineering and management processes.
This standard process is referred to throughout the CMM as the organization's standard software
process. Processes established at level 3 are used (and changed, as appropriate) to help the
software managers and technical staff to perform more effectively. The organization exploits
effective software engineering practices while standardizing its processes. An organization
wide training program is implemented to ensure that the staff and managers have the knowledge
and skills required to fulfill their assigned roles.
Projects tailor the organization's standard software process to develop their own defined software process, which accounts for the unique characteristics of the project. This tailored process is referred to in the CMM as the project's defined software process. A defined software process contains a coherent, integrated set of well-defined software engineering and management processes. Because the software process is well defined, management has good control over the technical progress of all projects.
The software process capability of level 3 organizations can be summarized as "standard" and "consistent" because both software engineering and management activities are stable and repeatable. Within established product lines, cost, schedule, and functionality are under control, and software quality is tracked. This process capability is based on a common, organization-wide understanding of the activities, roles, and responsibilities in a defined software process.
4. Managed (Maturity Level 4)
At this level, the organization sets quantitative quality goals for both software products and processes. Productivity and quality are measured for important software process activities across all projects as part of an organizational measurement program. An organization-wide software process database is used to collect and analyze the data available from the projects' defined software processes. Software processes are instrumented with well-defined and consistent measurements at level 4. These measurements establish the quantitative foundation for evaluating the project's software processes and products. Projects achieve control over their products and processes by narrowing the variation in their process performance to fall within acceptable quantitative boundaries. Meaningful variations in process performance can be distinguished from random variation (noise), particularly within established product lines. The risks involved in moving up the learning curve of a new application domain are known and carefully managed. One measure could be the number of faults detected per 1000 lines of code. A corresponding objective is to reduce this quantity (number of faults) over time.
this oofware process capability at level4
organizations summarizedcan be
as predict
able" because the process is measured and operates within measurable limits. This level of
process capability.allows an organization to predict trends of process and product quality within
quantitative bounds of these limits. When these limits are exceeded, action is taken to correct
thesituation. Software products are of predictably high quality.
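As an illustration only (the limit, release data and function names below are ours, not from the CMM), a level-4 style quantitative check on the faults-per-KLOC measure mentioned above might look like this:

```python
def fault_density(faults: int, lines_of_code: int) -> float:
    """Faults detected per 1000 lines of code."""
    return faults / (lines_of_code / 1000)

# Successive releases of one product line, checked against an assumed limit.
releases = [("R1", 120, 60_000), ("R2", 95, 64_000), ("R3", 110, 66_000)]
limit = 1.6   # assumed acceptable faults/KLOC for this product line

for name, faults, loc in releases:
    density = fault_density(faults, loc)
    status = "within limit" if density <= limit else "exceeds limit: take corrective action"
    print(f"{name}: {density:.2f} faults/KLOC ({status})")
```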
5. Optimizing (Maturity Level 5)
At this level, the entire organization is focused on continuous process improvement. These organizations have the means to identify weaknesses and strengthen the process proactively, with the goal of preventing the occurrence of defects. Data on the effectiveness of the software process is used to perform cost-benefit analysis of new technologies and proposed changes to the organization's software process. Innovations that exploit the best software engineering practices are identified and transferred throughout the organization.
Software project teams in level 5 organizations "analyze defects to determine their causes". Software processes are evaluated to prevent known types of defects from recurring, and lessons learned are disseminated to other projects.
The software process capability of level 5 organizations can be characterized as "continuously improving" because level 5 organizations are continuously striving to improve the range of their process capability, thereby improving the process performance of their projects. Improvement occurs both by incremental advancements in the existing process and by innovations using new technologies and methods.
These five maturity levels are summarized in Fig. 7.24 [SCHA 96].

Fig. 7.24: The five levels of CMM
Initial: Ad hoc process
Repeatable: Basic project management
Defined: Process definition
Managed: Process measurement
Optimizing: Process control
Experience with the capability maturity model has shown that advancing a complete maturity level usually takes from 18 months to 3 years, but moving from level 1 to level 2 sometimes takes 3 or even 5 years. This is a reflection of how difficult it is to instill a methodical approach in an organization that up to now has functioned on a purely ad hoc and reactive basis.

7.4.2 Key Process Areas


Except for level 1, each maturity level is decomposed into several key process areas that indicate
