Manual On Quality Assurance For Computer Software Related To The Safety of Nuclear Power Plants
[Figure: software life-cycle diagram showing, against project time, the stages feasibility study, software functional specification, software system design, detailed module specification, module design, software integration and testing, system testing, commissioning and handover, operation and maintenance, and decommissioning, grouped into the design, design specification, verification, and operation and maintenance phases.]
© IAEA, 1988
(1) Codes of Practice for thermal neutron nuclear power plants that establish the
objectives and minimum requirements which must be fulfilled to provide
adequate safety for these plants
(2) Safety Guides that provide additional requirements and recommend procedures
that should be followed to implement the Codes of Practice
(3) User's Manuals, directed primarily to nuclear power plant operators, that
normally present possible methods and techniques for solving specific
problems.
Work on Codes and Guides was initiated in 1975 in five main fields: govern-
ment organization, siting, design, operation and quality assurance.
In the field of quality assurance the Code of Practice and ten Safety Guides had
been developed by 1986 and published in English, French, Spanish and Russian, as
well as some in Chinese. These documents are now used in a number of Member
States as quality assurance requirements for nuclear power plants. To facilitate the
use of these publications, the IAEA Technical Review Committee on Quality
Assurance has stressed on a number of occasions the need for and importance of
proceeding with the development of User's Manuals. These documents should
provide Member States implementing the Code and the Safety Guides with practical
examples of procedures, practices and documents illustrating the quality assurance
methods and techniques used in those organizations in Member States which have
broad experience in quality assurance. The same opinion was expressed in the
discussions during the International Symposium on Quality Assurance for Nuclear
Power Plants organized by the IAEA and held in Paris in May 1981. A number of
topics have been identified for which User's Manuals could provide additional
information and facilitate correct implementation of the Code and Guides in nuclear
power plant project activities.
To implement these recommendations, work has been initiated in the
Secretariat to develop those User's Manuals which are most needed in Member
States embarking on nuclear power programmes and starting quality assurance
activities. Keeping in mind the difference between User's Manuals and Codes and
Safety Guides, work on the development of these documents is undertaken outside
the NUSS Programme and the established procedures for development, review and
approval of the documents used in this Programme. For User's Manuals it was
decided to follow the standard practices used in the development of other Agency
publications such as Guidebooks and Technical Reports. This procedure will reduce
the time and cost of preparation of User's Manuals, which are the lower levels in
the hierarchy of NUSS Programme documents and do not contain requirements for
whose formulation a broad consensus of quality assurance experts would be needed.
The present Manual on Quality Assurance for Computer Software Related to
the Safety of Nuclear Power Plants provides guidance in the assurance of quality of
specification, design, implementation, maintenance and use of computer software
related to items and activities important to safety in nuclear power plants. This
guidance is consistent with, and supplements, the requirements and recommenda-
tions of Quality Assurance for Safety in Nuclear Power Plants: A Code of Practice,
50-C-QA, and related Safety Guides on quality assurance for nuclear power plants.
The Manual is intended to be of use to all those who, in any way, are involved
with software for safety related applications for nuclear power plants, including
auditors who may be called upon to audit management systems and product software.
CONTENTS
1. INTRODUCTION 1
2. SOFTWARE LIFE-CYCLE 3
3. QUALITY ASSURANCE PROGRAMME 9
3.1. General 9
3.2. Scope of the quality assurance programme 9
3.3. Responsibility 9
3.4. Quality assurance programme establishment 10
3.4.1. General 10
3.4.2. Quality assurance activities 10
3.5. Quality assurance programme documentation 10
3.5.1. Software quality plan 10
3.5.2. Work procedures and instructions 13
4. ORGANIZATION 13
4.1. Structure 13
4.2. Interfaces 14
4.2.1. Transfer of responsibility 14
4.3. Duties, authorities and responsibilities 14
4.4. Staffing and training 15
5. SOFTWARE DOCUMENT, CONFIGURATION, MEDIA AND
SERVICES CONTROL 16
6. DESIGN CONTROL 22
7. PROCUREMENT CONTROL 28
8. TESTING 28
11. RECORDS 31
12. AUDITING 31
ANNEXES
BIBLIOGRAPHY 87
(1) Systems — includes software which provides the services enabling application
software to function
(2) Applications — includes software for:
(a) Operating the plant (for example, software for plant monitoring and
control systems, plant protection systems, process diagnostic and opera-
tor support systems, etc.)
(b) Recording and storing data, and producing reports, data banks and
statistics
(c) Modelling and calculating for design, safety and reliability analyses
(d) Planning maintenance, fault diagnosis, operational use and verification
(e) Training plant operators with computer based simulators.
(3) Support — includes all software used in the development, testing and main-
tenance of the above applications as computer programs.
The Manual covers the development of new software, and the maintenance and
use of existing software.
The terms used in the Manual are explained in the Glossary (see Annex B).
Note: Although the principles contained in the Manual are applicable in all cases,
some of the details may not be required.
Software quality may be defined as the degree to which the software conforms
to the specified requirements and expectations of the prospective user. A list of
attributes which may affect software quality is given in Annex C as an indication of
some of the many characteristics which need to be considered when software quality
is being evaluated during its life-cycle.
Many of these characteristics may lead to conflicting requirements, the solu-
tion of which is not always clear or easy. For example, added efficiency is often
purchased at the price of simplicity, portability, accuracy, understandability and
maintainability; added accuracy often conflicts with portability; conciseness can
conflict with readability. Users of software generally find it difficult to quantify their
preferences in such conflicting situations, and in these cases software suppliers may
produce a code to their standards. Every effort should be made to assess the relative
importance of all the attributes when specifying software for a particular application.
This assessment may be achieved more easily through the use of appropriate check-
lists and associated priorities.
To summarize these considerations, the measurement of quality of software
products will vary with the needs and priorities of the prospective user. It is therefore
necessary to specify the precise requirements, including quality, at the outset of a
software project, and to develop the requirements formally in a structured and
controlled manner through the ensuing project phases.
1.5. REFERENCES
2. SOFTWARE LIFE-CYCLE
The whole life, from concept to cessation of use, of the software has been
described as the software life-cycle and Figs 1-3 delineate a systematic approach
to the development, commissioning, use and formal withdrawal from use (decom-
missioning). The software life-cycle diagrams are examples showing the basic stages
through which software grows and develops from the conceptual stage through to its
operation and withdrawal from use. At each of these stages, appropriate controls
need to be applied to the associated activities in order that the status of the software
is known and controlled and, where necessary, verified and approved. However, for
any particular project the whole of the software life-cycle as shown may not apply,
or it may be necessary to identify further stages.
Figure 1 is a very basic diagram, whereas Figs 2 and 3 are examples showing
life-cycle diagrams, together with related software documentation, and management
control activities. It is seen, therefore, that there are many different ways of describ-
ing the software life-cycle and it is a matter of project preference which is used. In
all cases it is necessary to ensure that the software life-cycle is mutually compatible
with the hardware life-cycle; together, they constitute the combined system life-
cycle. Provision of a detailed life-cycle diagram will help to ensure that software
development progresses in a planned and orderly manner.
Each of the software life-cycle activities should be formally defined and should
have a formal conclusion, usually with a document that provides tangible evidence
that the activity is completed. The following activities of the software life-cycle, as
shown in Fig. 2, have been chosen as examples for the Manual.
[Figures 2 and 3: example software life-cycle diagrams showing the stages (functional specification, architectural design, detailed design, coding and implementation, integration, testing and commissioning, use and maintenance, decommissioning), the principal outputs of each stage (subsystem structure, program and database design, source and object programs, test results, fault/modification records, fault reports, modification requests) and the project management controls applied throughout (programme of work/progress/resources, standards/methods, development facilities, configuration and document control, user acceptance/formal handover).]
The requirements that the computer program must satisfy are identified from
the overall system requirements and ultimately documented in the requirements
specification. The requirements specification activity is the most significant part of
the overall project in terms of its influence on the quality of the final product; it
should be as complete, unambiguous and verifiable as possible.
The requirements specification may take many forms. However, it should
contain enough information to totally identify and define the required functions so
that the top level of software design can address all these functions in a traceable
manner. To ensure that the requirements specification is read and understood, its
contents should be simply and concisely presented. It should, however, be
sufficiently detailed to show how all the relevant system requirements are being
satisfied. The requirements specification should follow a recognized standard (see
Section 3(d) of the Bibliography), should deal with what the software is intended
to do, and should also address both quality and general testing requirements.
Compatibility between software and hardware should be addressed at this and
appropriate subsequent phases of the life-cycle.
The software design activities specify the software architecture and continue
the breakdown of the functions identified in the requirements into the necessary
detail. The detailed design will include definition of the actual algorithms and equa-
tions as well as the detailed control logic and data operations that are to be
performed. They provide the basis for the coding and implementation activity that
will follow and take into consideration all the elements which will make up that
activity. The primary output of this activity is a design specification which is usually
designated the detailed functional specification, consisting of, for example, texts,
program specifications or decision tables. The detailed functional specification and
the software specification can be issued together as the program design document.
During this activity, the detailed software design is translated into an appropri-
ate level programming language. Compilation and assembly errors are corrected,
after which preliminary program check-out begins by executing the individual
program modules to remove the more obvious errors. Although some testing is done
at this point in the life-cycle by the developer, this does not formally constitute the testing
phase of the software life-cycle. There are a number of guides that have been
prepared for this activity. These include text books as well as formal standards, many
of which are listed in Sections 3(d) and 6 of the Bibliography. The product of this
activity is usually a source program, which is available in computer readable and
computer processable form. A printout of the program listing should be produced.
The final activities in the software life-cycle (see Fig. 2) are operation, main-
tenance and enhancement. During these activities the software is being operated
according to user documentation; further activity consists of applying modifications
to remove latent errors, to respond to new or revised requirements, or
to improve its operation (i.e. enhancement). Such enhancements may be due to hard-
ware changes. Any modifications resulting from these activities should be retested
and approved.
3.1. GENERAL
The quality assurance programme should cover the relevant activities of the
software life-cycle and the associated management controls, e.g. quality planning,
configuration control, organization and training, and should provide auditing of all
the functions. Organizational responsibilities and authorities for the conduct and
approval of activities affecting quality should also be defined. For the full scope of
the programme, reference should be made to Sections 2.1 and 2.2 of Code of
Practice 50-C-QA.
3.3. RESPONSIBILITY
3.4. QUALITY ASSURANCE PROGRAMME ESTABLISHMENT
3.4.1. General
Note: REQ'S = REQUIREMENTS; SPEC. = SPECIFICATION
FIG. 4. Simple quality plan showing system hardware interfaces.
FIG. 5. Typical software quality plan showing all modules and documents to be produced.
3.5.2. Work procedures and instructions
4. ORGANIZATION
4.1. STRUCTURE
and communication with other organizational units. The typical responsibilities of
the quality assurance function are outlined in Annex F.
Project management ultimately has control of the quality of the software as it
determines the resources the project will use in its development. If the resources are
inadequate, it may be difficult to ensure product quality.
The functions of project management should be clearly defined and the quality
assurance related responsibilities of the project manager outlined (see Annex G).
4.2. INTERFACES
4.4. STAFFING AND TRAINING
of the mathematical models used to approximate physical processes, the limits of
applicability of empirical correlations, etc. The recommendations for training of the
developer's staff apply equally to the user's staff.
Annex J shows a typical training record form.
5.2. CONFIGURATION MANAGEMENT
5.2.1. General
A project should establish controls and procedures to ensure that all versions
of the computer software are accurately identified. Controls should also be estab-
lished to record the changing of the configuration status of the software. Mechanisms
should be provided in the software library to assign and track the identification of
computer programs and documentation, including their revisions. Proper release
authorization documentation should be provided within the library media. An autho-
rized signature list should be in place for the released documentation. The software
library function should assist with the arrangements for marking, labelling and pack-
ing for shipment of all deliverable material. The software library should maintain
logs and records relating to the distribution, inventory, configuration control and
status accounting for all deliverable items. The typical information to be recorded
is shown in Annexes L-1 to L-5.
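As a purely illustrative aid (not part of the Manual's requirements), the following Python sketch shows one way a software library register entry could record the identification, revision, release authorization and distribution data described above; all names and fields are assumptions chosen for the example.

```python
# Minimal sketch of a software library register entry; field names are illustrative.
from dataclasses import dataclass, field
from typing import List

@dataclass
class LibraryItem:
    identifier: str              # unique program or document identifier
    title: str
    version: str                 # current released revision, e.g. "2.1"
    release_authorized_by: str   # must appear on the authorized signature list
    release_date: str
    distribution: List[str] = field(default_factory=list)  # recipients of copies

    def new_revision(self, version: str, authorized_by: str, date: str) -> None:
        """Record a newly released revision; the caller keeps earlier data as history."""
        self.version = version
        self.release_authorized_by = authorized_by
        self.release_date = date

# Example: registering a released module and logging its distribution.
item = LibraryItem("PUMP-CTRL-001", "Pump control module", "1.0", "J. Smith", "1988-03-01")
item.distribution.append("Integration team")
```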
5.2.3. Documentation
[FIG. 6. Configuration reference points shown against the software life-cycle phases.]
(1) Establishment of configuration reference points. This can vary from project to
project, but it is essential that item and system configuration should be estab-
lished during the software life-cycle, as shown for example in Fig. 6. After
handover, the configuration should be maintained and it is necessary for a soft-
ware user to define the configuration reference points which are usually estab-
lished upon upgrading of the software or hardware system.
(2) Establishment of configuration baseline. A configuration baseline is a datum
established at a reference point in order that all subsequent changes be recorded
and incorporated into the configuration at the next reference point. Baselines
should be frozen.
(3) Introduction of new items. Before a configured item is made available for use
by someone other than its originator, it should be brought under configuration
control.
(4) Introduction of changes. All changes to configured items should be
documented and controlled, and all related configuration documentation should
be reviewed and amended as appropriate. The nature of the change and effect
on the performance of other configured items and systems should be assessed
and recorded. Any retest requirement should be assessed and recorded. All
changes should be approved by the appropriate authority and communicated to
all those persons involved. Patching of safety related software should be
strongly discouraged.
(5) Establishment of a master configuration index. This index should contain a list
of all the configured items, together with their current configuration status and
identification of all the associated documentation. To provide backward trace-
ability, copies of indexes for previous baselines, together with the changes
applied, should be kept.
(6) Configuration status accounting. This activity consists of recording a detailed
history of each configured item and baseline to enable the dates of configura-
tion and change to be reported, together with the current issue and change
status. Any known deficiencies, e.g. from error reports or configuration
audits, should also be recorded against a configured item.
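The following sketch is illustrative only and assumes a simple in-memory representation; it shows how a master configuration index with status accounting records, in the spirit of activities (5) and (6) above, might be organized. The class and field names are not prescribed by the Manual.

```python
# Illustrative master configuration index with simple status accounting records.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class ChangeRecord:
    date: str
    description: str
    approved_by: str
    retest_required: bool

@dataclass
class ConfiguredItem:
    identifier: str
    baseline: str                                              # baseline at which the item was frozen
    issue: str                                                 # current issue/change status
    documents: List[str] = field(default_factory=list)         # associated documentation
    changes: List[ChangeRecord] = field(default_factory=list)  # status accounting history
    deficiencies: List[str] = field(default_factory=list)      # e.g. open error reports

class MasterConfigurationIndex:
    """Lists all configured items with their current status and documentation."""
    def __init__(self) -> None:
        self.items: Dict[str, ConfiguredItem] = {}

    def add_item(self, item: ConfiguredItem) -> None:
        self.items[item.identifier] = item

    def record_change(self, identifier: str, change: ChangeRecord) -> None:
        item = self.items[identifier]
        item.changes.append(change)
        item.issue = f"{item.issue}+chg{len(item.changes)}"

    def report(self) -> List[str]:
        # Backward traceability: every item with its baseline, issue and documents.
        return [f"{i.identifier}: baseline {i.baseline}, issue {i.issue}, docs {i.documents}"
                for i in self.items.values()]
```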
(1) The preparation, review, approval and issue of software and configuration
management reports should be controlled.
(2) The parties responsible for item (1) should be identified.
(3) A software release system should be established and measures provided to
ensure that users are aware of, and use, appropriate and correct software and
information.
(4) Software changes should be documented, reviewed and approved. Review and
approval should be made by the same authorities as the original item.
(5) All software and documentation revisions should be reported in a timely
manner to all those persons involved.
(6) Configuration records should provide objective evidence that all configuration
items have been identified, that configuration baselines have been established
and that software changes are reviewed, approved, controlled and
implemented. Management operating procedures for all the activities are
necessary, and software tools for configuration management are available
commercially. Further information on tools is given in Section 5.2 of the
Bibliography.
5.2.7. Software library
The typical storage media currently in use are magnetic disks, magnetic tapes,
large scale integrated circuits, punched paper tape, program cards, diskettes, or
computer listings. As technology changes, the media will probably also include such
articles as video cassette tapes, laser disks, compact disks, and any other future
media which will be used in the industry.
central machine so that the user can access it readily if it becomes evident that the
primary copy has somehow not given a correct result. Periodic comparisons should
be made between the two copies to ensure that no degradation has taken place, and
magnetic media should be rewritten since magnetic domains on tapes and disks tend
to deteriorate.
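As an illustration of the periodic comparison described above, the following sketch (assuming ordinary files rather than any particular storage medium) computes and compares checksums of the primary and secondary copies; a mismatch would prompt investigation and rewriting of the degraded medium. The file paths are hypothetical.

```python
# Hedged sketch: compare checksums of the primary and backup copies of a stored item.
import hashlib
from pathlib import Path

def file_digest(path: Path) -> str:
    """Return the SHA-256 digest of a file, read in blocks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for block in iter(lambda: f.read(65536), b""):
            h.update(block)
    return h.hexdigest()

def compare_copies(primary: Path, secondary: Path) -> bool:
    """True if both copies are still identical; False indicates degradation or error."""
    return file_digest(primary) == file_digest(secondary)

if __name__ == "__main__":
    if not compare_copies(Path("primary/pump_ctrl.obj"), Path("backup/pump_ctrl.obj")):
        print("WARNING: copies differ - investigate and refresh the backup medium")
```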
5.3.3. Security
A means should be provided for controlling the physical media to ensure that
the stored software is accessible only to those authorized persons having need of
access. Several methods are available which will provide adequate protection from
unauthorized access to computer program media. Section 5.3 of the Bibliography
contains references on standards which have been developed for the physical security
of computer media. This area of knowledge has increased as many computer systems
have been subjected to violation by 'unauthorized' individuals. The primary method
is by effective password control or hardware access protection. Other methods
include limited access program libraries and computer rooms, encryption, external
markings and proprietary statements identifying the controlled programs. Modern
computer operating systems are being designed to include extensive security
considerations, especially when access by means of telephone lines and modems is
permitted.
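The sketch below is illustrative only and is not a recommended security design; it shows the flavour of password control combined with an authorized-access list, using salted password hashes. The user identifiers, program identifiers and parameters are assumptions made for the example.

```python
# Illustrative only: a minimal authenticated access check against an authorized-user list.
import hashlib
import hmac
import os

AUTHORIZED = {"PUMP-CTRL-001": {"jsmith", "akumar"}}   # program -> authorized user ids
CREDENTIALS = {}                                        # user id -> (salt, hash)

def register_user(user: str, password: str) -> None:
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    CREDENTIALS[user] = (salt, digest)

def may_access(user: str, password: str, program: str) -> bool:
    """Grant access only to authenticated users on the program's authorized list."""
    if user not in CREDENTIALS or user not in AUTHORIZED.get(program, set()):
        return False
    salt, stored = CREDENTIALS[user]
    attempt = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(stored, attempt)

register_user("jsmith", "example-passphrase")
assert may_access("jsmith", "example-passphrase", "PUMP-CTRL-001")
```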
6. DESIGN CONTROL
languages and testing. For example, the following information should be provided:
documentation standards; logic structure standards; coding standards; and com-
mentary standards.
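By way of example only, the following fragment suggests the kind of module header and commentary that documentation, coding and commentary standards might call for; the header fields and the function itself are hypothetical and not a format prescribed by the Manual.

```python
# Purely illustrative example of a header and commentary style a coding standard might require.
"""
Module      : flow_limit_check
Purpose     : Compare a measured flow against its safety limit.
Requirement : Traceable to the relevant requirements specification item
              (identifier to be assigned by the project).
Author/Date : <author>, <date>
Revision    : 1.0
"""

def flow_within_limit(measured_flow: float, limit: float) -> bool:
    # Coding standard example: single entry/exit, explicit comparison, no side effects.
    return measured_flow <= limit
```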
These specifications detail the design information for each elemental program
in sufficient depth to enable programmers not having a knowledge of the overall
system to produce these programs. The detailed design document should typically
contain the following information: a detailed logic structure; internal data structures;
cross-references to the library routines to be used; the routines and conditions
specific to the program; functions; memory allocation requirements; a program iden-
tifier; priority requirements; interface requirements; and performance requirements.
6.2.1. Verification
Design reviews are one form of verification. They make use of the same
principles, irrespective of whether they are performed on software or hardware
systems. A typical design review check-list is shown in Annex S.
In reporting the review, the reviewers are to identify the configuration items
and status. Actions resulting from the review should be agreed upon and recorded
in software non-conformance reports (see Annex T-1), deposited in the software
library and recorded in the configuration status reports (see Annex M-4) for manage-
ment monitoring. As a design review should be a project hold point, all actions are
to be cleared before further work proceeds.
Documents subject to design review are shown in Fig. 7, which also shows
how the reviews are phased into the design process.
6.3. VALIDATION
(1) Real time testing of the integrated software implemented on a target computer
system under simulated conditions
(2) Testing to demonstrate that the software meets the computer software require-
ments under operational conditions.
Software tools are computer programs used by the programmer for develop-
ment of software products as aids to productivity and quality. These tools include
design aids, testing tools, program analysers, translators and editors. It is possible
for a programmer to work entirely within a tool system; therefore it is important that
a tool which could directly affect the product quality be subject to the same kind of
controls that apply to product software and that the functions of the tool are known
to have been verified and recorded. Tool documentation should detail information
on what constitutes an acceptable input to the tool, how the information is accepted,
manipulated and analysed, and also what form of output is produced. If uncontrolled
tools are used, additional verifications shall show that the resultant output code is
correct.
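As one hedged illustration of such additional verification, the sketch below compares output generated by an uncontrolled tool against a digest recorded when that output was independently reviewed; the file names are hypothetical.

```python
# Illustrative check: reject tool output that no longer matches its reviewed version.
import hashlib
from pathlib import Path

def verify_tool_output(generated: Path, reference_digest_file: Path) -> bool:
    """True if the tool output matches the digest recorded when it was reviewed."""
    digest = hashlib.sha256(generated.read_bytes()).hexdigest()
    expected = reference_digest_file.read_text().strip()
    return digest == expected

# Usage (hypothetical paths): stop the build if the generated code has diverged.
# if not verify_tool_output(Path("out/protection_logic.c"), Path("review/protection_logic.sha256")):
#     raise SystemExit("Tool output differs from the verified version - re-review required")
```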
For information on tools, techniques and methods, reference should be made
to Section 6.4 of the Bibliography.
7. PROCUREMENT CONTROL
8. TESTING
attached to this document and the reader is referred to the references given in
Section 8 therein in order to establish an appropriate and high quality programme
of software testing.
(1) Delineation and description of the purpose and scope of each level of testing
to be conducted on each deliverable item or support item
(2) Identification of the responsibilities for each level of testing of each item
(3) Identification and description of the pre- and post-test documentation that will
be generated for each level of testing, including test specifications, procedures,
logs and records
(4) The test methods which will be used to establish compliance, i.e. test by func-
tion or test by structure
(5) Identification and use of the test support software and computer hardware to
be used in testing
(6) The test standards and quality criteria for acceptance that will be employed
(7) Identification and schedule for the user's manuals and operator handbooks
(8) Test reviews.
Performance of the actual testing should follow the test plan that has been
developed. During the course of performance, the test procedures should be followed
in detail and appropriate records kept. When modules and subsystems have been
tested by the software developer and independently verified, the elements of the
system should be passed to an independent group responsible for testing the complete
system. This may involve software from third parties; acceptance tests for such
external software should be devised. System integration testing should be carried out
using the operating hardware environment. All the test data should be compiled
independently of those data used by the software developer, which should have been
designed and verified in accordance with the same principles as those for the
designed software.
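The following sketch is illustrative only; it shows one simple way the records kept during test performance might be captured as entries in a test log, so that results can later support reviews and configuration status reports. The field names and file name are assumptions.

```python
# Minimal sketch of appending test results to a test log as testing proceeds.
import csv
from dataclasses import dataclass, asdict

@dataclass
class TestLogEntry:
    test_id: str
    item_under_test: str
    procedure_ref: str
    date: str
    tester: str
    result: str          # e.g. "pass", "fail - see non-conformance report NCR-012"

def append_to_log(entry: TestLogEntry, log_file: str = "integration_test_log.csv") -> None:
    with open(log_file, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(asdict(entry).keys()))
        if f.tell() == 0:          # write the header only for a new, empty log
            writer.writeheader()
        writer.writerow(asdict(entry))

append_to_log(TestLogEntry("IT-004", "pump_ctrl v1.0", "ITP-2 step 3",
                           "1988-05-12", "integration team", "pass"))
```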
Several national and international standards are listed in Section 8 of the
Bibliography for guidance when planning tests for the software. The design of
appropriate individual tests can be obtained by referring to the text books identified
in the same section of the Bibliography.
The test plan should identify all the review type functions related to software
testing, including:
The review procedures should follow the guidance given in the texts cited in
Section 8 of the Bibliography.
9. NON-CONFORMANCE CONTROL
Copies of all reports on software non-conformances should be deposited with
the software library and feature in configuration status reports.
All amended (repaired) software should be retested against existing or new test
specifications and reconfigured. A typical software non-conformance report form
and a non-conformance report log are shown in Annexes T-1 and T-2.
Reports for verification, validation and audit activities should contain details
of the non-conformances raised during these activities.
The project manager should be responsible for the review, approval and dispo-
sition of the reported non-conformances.
11. RECORDS
(2) Tested software: to determine compliance with the requirements specification
(3) Consistency between software and documentation before delivery
(4) Configuration.
Ideally, configuration audits should be carried out when the configuration item
is first brought under configuration control, and whenever any changes are made.
Checks should be made to verify that the configuration control procedures have been
applied, and that the item is uniquely identified and mutually traceable to the
associated documentation.
Configuration audits should also include verification of the accuracy of the
master configuration index and the functioning of the software library.
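As an illustrative sketch only, the following check compares a master configuration index against the contents of the software library, reporting items that are indexed but missing from the library and items held in the library but not indexed; the data structures are assumptions made for the example.

```python
# Illustrative audit check: reconcile the master configuration index with the library.
from typing import Dict, Set, List

def audit_index_against_library(index: Dict[str, str], library: Set[str]) -> List[str]:
    """index maps item identifier -> current version; library holds 'identifier version' strings."""
    findings = []
    indexed = {f"{ident} {ver}" for ident, ver in index.items()}
    for missing in sorted(indexed - library):
        findings.append(f"Indexed but not in library: {missing}")
    for unindexed in sorted(library - indexed):
        findings.append(f"In library but not indexed: {unindexed}")
    return findings

# Example usage: an empty findings list supports a satisfactory audit conclusion.
print(audit_index_against_library({"PUMP-CTRL-001": "1.1"}, {"PUMP-CTRL-001 1.0"}))
```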
Annexes V-1 to V-3 contain examples of the check-lists that can be used for
auditing.
ANNEXES
Annex A
IAEA DOCUMENTS
REFERRED TO IN THE MANUAL
Annex B
GLOSSARY
code audit: An independent review of source code done by a person, team or tool
to verify compliance with software design documentation and programming
standards. Correctness and efficiency may also be evaluated. (See also static
analysis, inspection, walk-through.)
code inspection: See inspection,
code walk-through: See walk-through.
compile: To translate a higher order language program into its relocatable or
absolute machine code equivalent,
compiler: A computer program used to compile. (See assembler, interpreter.)
component: A basic part of a system, computer program, or module,
computer: A functional programmable unit that consists of one or more associated
processing units and peripheral equipment that is controlled by internally
stored programs and that can perform computation, including arithmetic opera-
tions or logic operations, without human intervention,
computer program: A sequence of instructions suitable for processing by a
computer. Processing may include the use of an assembler, an interpreter,
or a translator (assembler or compiler) to prepare the program for execution
as well as to execute it.
computer program abstract: A brief description of a computer program,
providing sufficient information for potential users to determine the
appropriateness of the computer program to their needs and resources,
computer program configuration identification: See configuration identifi-
cation.
computer software: See software.
computer system: A functional unit, consisting of one or more interconnected
computers and associated software, that uses common storage for all or part
of a program and also for all or part of the data necessary for the execution
of the program; executes user written or user designated programs; performs
user designed data manipulation, including arithmetic operations and logic
operations; executes programs that modify themselves during their execution.
A computer system may be a stand alone unit or may consist of several inter-
connected units.
computing facility: A service which is available to several users, enabling them to
run their own software or that provided by the service,
configuration: The complete description of a product, which is a collection of
its descriptive and governing characteristics that can be expressed in functional
and physical terms. The functional terms express the performance that the item
is expected to achieve; the physical terms express the physical appearance and
composition of the item,
configuration audit: The process of verifying that all the required configuration
items have been produced, that the current version agrees with the specified
requirements, that the technical documentation completely and accurately
describes the configuration items, and that all the change requests have been
resolved.
configuration baseline: A specific reference point of product definition to which
changes or proposed changes may be related,
configuration control: The process of evaluating, approving or disapproving and
co-ordinating changes to configuration items after formal establishment of
their configuration identification,
configuration identification: The process of designating the configuration items in
a system and recording their characteristics,
configuration item: A collection of hardware or software elements treated as a unit
for the purpose of configuration management,
configuration management: The process of identifying and defining the configura-
tion items in a system, controlling the release and change of these items
throughout the system life-cycle, recording and reporting the status of configu-
ration items and change requests, and verifying the completeness and correct-
ness of configuration identification, configuration control, configuration
status accounting and configuration audit,
configuration status accounting: The recording and reporting of the information
that is needed to manage a configuration effectively, including a listing of the
approved configuration identification, the status of proposed changes to the
configuration and the implementation status of approved changes,
database: (1) A set of data items, together with the defined relationships between
them
(2) A collection of data fundamental to a system
(3) A collection of data fundamental to an enterprise,
defect: See error.
design language: A language with special syntax and sometimes verification
protocols used to develop, analyse and document a software design,
design level: The design decomposition of the software item (for example, system,
subsystem, computer program or module),
design methodology: A systematic approach to designing software consisting of
the ordered application of a specific collection of tools, techniques and
guidelines.
design requirement: Any requirement that impacts or constrains the design of
a software system or software system component, for example, functional
requirements, physical requirements, performance requirements, software
development standards, software quality assurance standards. (See also
requirements specification.)
design review: The formal review of an existing or proposed design for the purpose
of detection and remedy of design deficiencies that could affect fitness for use
and environmental aspects of the product, process or service and/or for iden-
tification of potential improvements of performance, and safety and economic
aspects.
desk checking: The manual simulation of program execution to detect faults
through step by step examination of the source code for errors in logic or
syntax. (See also static analysis),
detailed design: The process of refining and expanding the preliminary design
to contain more detailed descriptions of the processing logic, data structures
and data definitions to the extent that the design is sufficiently complete to be
implemented.
development methodology: A systematic approach to the creation of software
that defines development phases and specifies the activities, products, verifica-
tion procedures and completion criteria for each phase,
dynamic analysis: The process of evaluating a program based on execution of the
program. (See static analysis.)
enhancement: The process adopted by the user to change software to improve
or add to its performance.
error: A discrepancy between a computed, observed or measured value or
condition and the true, specified or theoretically correct value or condition.
(See also non-conformance.)
error analysis: The process of investigating observed software errors with the
purpose of tracing each error to its source and of determining quantitative
rates and trends.
flow chart: A graphic representation of the definition, analysis or solution of a
problem in which symbols are used to represent operations, data, flow and
equipment.
formal testing: The process of conducting testing activities and reporting results
in accordance with an approved test plan,
functional requirement: A requirement that specifies a function that a system
or system component must be capable of performing,
functional specification: A specification that defines the functions that a system
or system component must perform,
implementation: The process of translating a design into a correct usable code,
independent verification and validation: Verification and validation of a soft-
ware product by individuals or groups other than those who performed the
original design, but who may be from the same organization,
inspection: An evaluation technique in which the software requirements, design
or code are examined in detail by a person or group other than the author to
detect errors. (See also walk-through and code audit.)
interface: (1) An interaction or communication with another system component
or another system across a shared boundary.
(2) See Section 4.2 of text for organization interfaces.
interpreter: A program that interprets and executes source language statements at
run time in a computer,
machine code: A representation of instructions and data that is directly executable
by a computer. (See assembly language.)
maintenance: Any change that is made to the software, either to correct a deficiency
in performance, as required by the original requirements specification, or to
improve its operation (i.e. enhancement),
methodology: A system of methods and rules applicable to research or work in a
given science or art.
module: A logical separable part of a program.
non-conformance: A deficiency in characteristics, documentation or procedure
which renders the quality of an item unacceptable or indeterminate. For the
purpose of the Manual, the terms 'error' and 'non-conformance' are
synonymous,
non-conformance analysis: (See error analysis.)
object program: A program in machine code that is the output after translation
from the source program and is ready to be executed in the computer. This
is also known as executable code. (See source program.)
operating system: Software that controls the execution of computer programs,
provides hardware resource allocation, input/output control and related
services. The most important part of the system software is sometimes called
supervisor, monitor, executive or master control program. Although operating
systems are predominantly software, partial or complete hardware implemen-
tations are possible. (See also system software.)
operational: Pertaining to the status given a software product once it has entered the
operation and maintenance phase,
operational reliability: The reliability of a system or software subsystem in its
actual use environment. Operational reliability may differ considerably from
reliability in the specified or test environment,
patch: The modification made to an object program without recompiling the source
program.
Note: The practice of patching is strongly discouraged for safety related
software.
program instrumentation: The process of preparing and inserting probes, such as
instructions or assertions into a computer program to facilitate execution
monitoring, to prove correctness and to aid resource monitoring or other
activities.
real time: The processing of data by a computer in connection with another process
outside the computer according to the time requirements imposed by the
outside process. This term is also used to describe systems operating where
operation can be influenced by human intervention.
requirements specification: A specification that sets forth the requirements for
a system component, for example, a software configuration item. Typically
included are functional requirements, performance requirements, interface
requirements, design requirements and development standards,
routine: A set of instructions arranged in proper sequence to enable the computer
to perform a desired task,
simulation: The representation of selected characteristics of the behaviour of one
physical or abstract system by another system.
software: Computer programs, procedures, rules and possibly associated
documentation and data pertaining to the operation of a computer system. (See
also application software, system software, utility software and computer
software.)
software database: A centralized file of data definitions and present values
for data common to, and located internal to, an operational software system,
software documentation: Data or information, including computer listings and
printouts, in human readable form, that describe or specify the design or
details, explain the capabilities, or provide operating instructions for using the
software to obtain the desired results from a software system. (See also system
documentation, user documentation.)
software engineering: The systematic approach to the development, operation,
maintenance and decommissioning of software,
software item: Source code, object code, job control code, control data, or a
collection of these items,
software librarian: The person responsible for establishing, controlling and
maintaining a software library,
software library: A controlled collection of software and related documentation
available to aid in software development, use or maintenance. Categories of
software held in the library may include development software, released soft-
ware and archive software,
software life-cycle: The period of time that starts when a software product
is conceived and ends when the product is decommissioned,
software maintenance: Modification of a software product after delivery to correct
faults, to improve performance or other attributes, or to adapt the product to
a changed environment,
software non-conformance: See error.
software product: A software entity designated for delivery to a user,
software tool: A computer program or device used to help develop, test, analyse
or maintain another computer program or its documentation, for example, an
automated design tool, translator, test tool, maintenance tool,
source language: A language used to write source programs,
source program: A computer program that must be translated or interpreted before
being executed by a computer. (See object program.)
static analysis: The process of evaluating a program without executing the program.
(See also desk checking, code audit, dynamic analysis, inspection,
walk-through.)
structured design: A disciplined approach to software design that adheres to
a specified set of rules based on principles such as top-down design, stepwise
refinement and data flow analysis,
structured program: A program constructed of a basic set of control structures,
each having one entry point and one exit. The set of control structures typically
includes: a sequence of two or more instructions; conditional selection of one
of two or more instructions or sequences of instructions; and repetition of an
instruction or a sequence of instructions.
structured programming: Any well defined software development technique that
incorporates top-down design and implementation and strict use of structured
program control constructs. (See top-down.)
support software: See utility software.
system: A collection of people, machines and methods organized to accomplish
a set of specific functions,
system architecture: The structure and relationship among the components of
a system. It may also include the system's interface with its operational
environment.
system design: The process of detailing the hardware and software architectures,
components, modules, interfaces and data for a system to satisfy specified
system requirements,
system documentation: Documentation conveying the requirements, design
philosophy, design details, capabilities, limitations and other characteristics of
a system (see user documentation), for example, quality records such as
quality assurance reports and non-conformance reports,
system software: Software designed for a specific computer system or family of
computer systems to facilitate the operation and maintenance of the computer
hardware and application programs. Typically included are operating
systems, translators and utility software. (See application software.)
system testing: The process of testing an integrated hardware and software system
to verify that the system meets its specified requirements. (See also
acceptance testing.)
test plan: A document prescribing the approach to be taken for intended testing
activities. The plan typically identifies the items to be tested, the testing to be
performed, test schedules, personnel requirements, reporting requirements,
evaluation criteria and any risks requiring contingency planning,
top-down: Pertaining to an approach that starts with the highest level component
of a hierarchy and proceeds through progressively lower levels, for example,
top-down design, structured programming, top-down testing.
top-down design: The process of designing a system by identifying its major
components, decomposing them into their lower level components and iterat-
ing until the desired level of detail is achieved,
top-down testing: The process of checking out hierarchically organized programs
progressively, from top to bottom, using simulation of lower level
components.
translator: A computer program used to convert programs from one programming
language (source language) to another (object language). (See compiler or
assembler.)
user documentation: Documentation conveying to the end user of a system instruc-
tions for using the system to obtain the desired results, for example, a user's
manual. (See system documentation.)
utility software: Computer programs or routines designed to perform some
general support function required by other application software, by the
operating system, or by system users. It also includes tools,
validation: Term that has a more global connotation than verification. It is an
activity that is performed at the end of development and integration to check
that the final software meets the original intention. The activity is usually test-
ing/exercising the system (hardware and software) functions within the bounds
of the system requirements specification(s). User documentation is also
applied during execution of the validation test plan by qualified personnel.
Validation also checks the resolution of all anomalies noted during all software
project verification activities,
verification: An activity which assures that the results of each successive step in the
software development cycle correctly encompass the intentions of the previous
step. This activity is performed by qualified personnel at convenient quality
verification points, e.g. between project milestones such as the completion of
a module and its release to the library, etc. Examples of verification activities
are testing, checking, analysing, design review, structured walk-throughs,
quality measurements, code inspections, etc.
walk-through: A review process in which a designer or programmer leads one or
more other members of the development team through a segment of design or
code that he or she has written, while the other members ask questions and
make comments about technique, style, possible errors, violation of develop-
ment standards and other problems. (See inspection.)
Annex C
Error handling capability refers to the code's ability to handle errors due to
hardware or software failures or operator input in such a way that the resulting
system performance degrades gracefully rather than catastrophically.
Human engineering (ergonomics) refers to the extent that the code fulfils its
purpose without wasting the user's time and energy, or degrading his morale.
Inputs and outputs should be self-explanatory, easy to learn and understand,
unambiguous and designed to avoid misinterpretation. This attribute implies
robustness and communicativeness.
Integrity refers to the extent to which access to software or data by unauthorized
persons can be controlled.
Interoperability refers to the effort required to couple the code system to
another independent code system.
Maintainability refers to the extent that the code facilitates updating to satisfy
new requirements, to correct deficiencies or to move to a different but similar
computer system. This implies that the code is understandable, testable and
modifiable.
Modifiability refers to the characteristics of the design and implementation of
the code which facilitates the incorporation of changes, once the nature of the
desired change has been determined.
Portability refers to the extent that the code can be operated easily and well
on computer configurations other than its current one.
Readability refers to the extent that the code's function and design can easily be
read and understood. (Example: complex expressions have meaningful
mnemonic variable names, and the code is extensively annotated.) Readability
is necessary for understandability.
Reliability refers to the extent that the code can be expected to continue to
perform its intended functions.
Reusability refers to the extent to which a program, or pieces of it, can be used
in other applications. This is related to the packaging and scope of the functions
that programs perform.
Robustness refers to the extent that the code can continue to perform despite
some violation of the assumptions in its specification. This implies, for
example, that the program will handle inputs or intermediate calculated
variables which are out of range, or in a different format or type than specified,
without degrading its performance of functions not dependent on the inputs or
variables.
Self-containedness refers to the extent that the code performs all its explicit and
implicit functions within itself, e.g. initialization, input checking, diagnostics,
etc.
Self-descriptiveness refers to the extent that the code listing contains enough
information for a reader to determine or verify its objectives, assumptions,
constraints, inputs, outputs, components and revision status.
Simplicity refers to those attributes of the software that provide implementation of
functions in the most understandable manner, and usually requires avoidance
of practices which increase complexity.
Structuredness refers to the extent that the code possesses a definite pattern of
its interdependent parts. This implies that the program design has proceeded
in an orderly and systematic manner (e.g. top-down design or functional
decomposition) and that standard control structures have been defined and fol-
lowed in coding the program, e.g. DO-WHILE, IF-THEN-ELSE, resulting in
well structured paths through the program.
Testability refers to the extent that the code facilitates the establishment of test plans,
specifications, procedures and their implementation, e.g. modular construction
with well defined interfaces that can be independently tested.
Traceability refers to those attributes of the software that provide a thread
from the requirements to the implementation with respect to the specific
development and operational environment.
Understandability refers to the extent that the code's functions are clear to the
reader. This implies that variable names or symbols are used consistently, that
modules of code are self-descriptive, and that the control structure is simple
or in accordance with a prescribed standard, etc. There should be no hidden
meanings or operating characteristics that come to light only after months of
use.
Usability refers to the effort required to learn, operate, prepare input and interpret
output of a program.
Annex K-2
LIST OF DOCUMENTS
Programmatic documents
Development plan
Configuration plan
Documentation plan
Test plan
Verification plan
Requirements specification 1
Software system design 1
Detailed functional specification 1
Program design document 1
Program test document 1
System test document 1
Acceptance test document 1
Commissioning and handover documents 1
Operation and maintenance documents 1
1 Indicates documents which, as a minimum, are subject to configuration control.
Annex V-3
TYPICAL PROCEDURE:
PROCEDURE FOR THE USE OF COMPUTER PROGRAMS
FOR SAFETY RELATED DESIGN PURPOSES
Contents
(1) Scope
(2) Responsibilities
(3) Requirements
(4) Records
1. SCOPE
This document defines the requirements for controlling the use of computer
programs for design purposes. The objectives are to ensure that:
(1) Computer programs which are used for design calculations produce results
which are valid for the actual physical problems under study.
(2) A comprehensive record is kept of the modelling assumptions and the data
used, so that a recalculation is possible at a later date using whatever tools are
available at the time.
2. RESPONSIBILITIES
It is the responsibility of all computer users to ensure that the programs which
they use for design calculations are included on the index of programs.
3. REQUIREMENTS
An index of programs used for design purposes shall be produced and shall
include:
(1) Program name and identifier. The identifier shall be sufficiently detailed so as
to indicate a specific version and/or an amendment or revision of the program;
each change must be recorded. Programs which are no longer used should not
be removed from the index.
(2) Installation used.
(3) Organization originating the program.
(4) Validation report references.
(5) Details of the program control arrangements. It is essential that the control
arrangements should be sufficient to ensure that programs are not inadvertently
modified from the validated and documented version. Any change in the pro-
gram identifier, as new releases of a program become available, shall be
recorded as a new entry on the index and an amendment issued to the computer
program report.
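The following sketch is illustrative only; it shows one possible representation of an entry in the index of programs, reflecting items (1) to (5) above, including the rule that superseded entries are retained rather than removed. The field names are assumptions, not a prescribed layout.

```python
# Hedged sketch of an entry in the index of programs used for design purposes.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ProgramIndexEntry:
    name: str
    identifier: str                  # includes version/amendment, e.g. "THERMOHYD v3.2"
    installation: str                # installation on which the program is run
    originating_organization: str
    validation_report_refs: List[str] = field(default_factory=list)
    control_arrangements: str = ""   # how inadvertent modification is prevented
    withdrawn: bool = False          # entries are never deleted, only marked withdrawn

index: List[ProgramIndexEntry] = []

def register_new_release(old: ProgramIndexEntry, new_identifier: str) -> ProgramIndexEntry:
    """A new release becomes a new entry on the index; the old entry is retained."""
    entry = ProgramIndexEntry(old.name, new_identifier, old.installation,
                              old.originating_organization,
                              list(old.validation_report_refs), old.control_arrangements)
    index.append(entry)
    return entry
```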
The report shall include details of the mathematical model used and the method
of solution, and quote the test cases which have been used. All routes through the
program which are tested should be identified. If possible, analytical results should
be included for comparison purposes. User instructions should be given. Each
change in the program (which will automatically require a change of identification),
will require an associated amendment to the computer program report.
3.4. Use of computer programs
Each use of a computer program for design purposes will require the retention
of the following information:
(1) The associated job control language in order to demonstrate that the program
used is contained in the appropriate index.
(2) The associated input data and all or part of the output results.
(3) Any associated user supplied routines which have been linked with the
protected program at run time, unless these have already been separately
included on the index.
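Purely as an illustration, the sketch below archives the retained items listed above for a single design run into one file so that the calculation could be reconstructed later; the paths and archive layout are assumptions made for the example.

```python
# Illustrative sketch: archive the retained items for one design run of a program.
import zipfile
from pathlib import Path
from typing import List

def archive_design_run(run_id: str, job_control: Path, input_data: Path,
                       output_results: Path, user_routines: List[Path]) -> Path:
    """Collect the retained items for one program run into a single archive file."""
    archive = Path(f"design_run_{run_id}.zip")
    with zipfile.ZipFile(archive, "w") as zf:
        zf.write(job_control, arcname="job_control.jcl")
        zf.write(input_data, arcname="input.dat")
        zf.write(output_results, arcname="output.lis")
        for routine in user_routines:    # only routines not already on the index
            zf.write(routine, arcname=f"user_routines/{routine.name}")
    return archive

# Usage (hypothetical paths):
# archive_design_run("1988-041", Path("run.jcl"), Path("core.dat"),
#                    Path("core.out"), [Path("usermod.f")])
```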
4. RECORDS
Annex F
TYPICAL RESPONSIBILITIES
OF THE QUALITY ASSURANCE FUNCTION
Annex G
There are many guides and standards available for project management. An
extensive list is given in Section 4(a) of the Bibliography. Reading and implementing
the concepts that are detailed in these guides and tutorials will greatly enhance the
quality and ultimately the reliability of the software.
Project management should be responsible for:
(1) Determining responsibilities, providing plans and identifying internal and
external interfaces for the project organization, and the lines of direction and
communication among groups.
(2) Controlling preparation, submittal and review of all quality assurance records,
and overseeing the effective implementation of all quality assurance functions
by the software project team.
(3) Ensuring the definition of quality assurance requirements for the software
project team and other contractors, as well as the approval of their quality
assurance documents, such as the quality assurance manual and programme
procedures.
(4) Approving the software quality plans and schedules.
(5) Ascertaining standards, codes, guides and procedures applicable to the soft-
ware project and directing the development and approval of specific proce-
dures and instructions to be used during the development of the activities.
(6) Establishing the software project file system.
(7) Ensuring the management of the configuration control system.
(8) Reporting on an established schedule, to the management of the responsible
organization, on the progress of the software project, as well as when problems
and deficiencies are identified in the process, and providing the approach and
schedule for their resolution.
(9) Originating the recommendation for the resolution of non-conformance
reports.
(10) Proposing, reviewing and approving corrective actions.
Annex V-3
(1) Project management has overall responsibility for the project. These respon-
sibilities include ensuring the preparation, implementation and maintenance of
the project quality assurance programme and all referenced plans and proce-
dures. Project management also has the responsibility for ensuring that the
quality assurance programme is reviewed and audited, and that any resulting
corrective actions are implemented adequately and in a timely manner.
Annex J
STAFF TRAINING RECORD FORM
[Form: staff training record, with entries for project, sheet number and revision number, and columns recording each qualification obtained with its mark and date.]
Annex K-1 (cont.)

Document | Purpose | Prepared by | Verified/approved by
Requirements specification | To establish functional, performance, interface, design, validation and acceptance requirements | Software user | Independent verifier within user organization
Functional specification | Completion of preliminary design; establishment of test plan (approved by the software user) | Software designer | Design team leader
Software configuration submission forms | Completion of detailed design, code and development testing stages | Software design/team leader | Integration team leader, after inspection of documents, verification of development tests and successful production of new version of software
Software non-conformance reports | Recording of errors observed in the software | Anyone working with (employing) the configured software | Librarian maintains these reports; the integration team member will analyse the significance of the errors and file a software change request form
Software change request form | Request design team to correct known software errors | Integration team leader | Senior software designer
Design change order | To request project management to make available additional resources to modify existing software because of a design change | Design team leader | Senior software designer
Integration test procedures | To document the detailed description of the tests to be performed, the purpose of each test, the evaluation and acceptance criteria to be used, the methods and the basis for generation of test inputs | Integration test team | Design team leader
Verification and validation plan | To document the criteria, techniques and tools to be utilized and to describe the activities to be performed for software verification and validation | Verification/validation team | Independent verifier
Verification test specifications | To establish tests which will demonstrate the correctness of the software at each stage and between each stage of the development life-cycle | Verification/validation team | Design team leader
Validation test specifications | To establish tests which will demonstrate the correctness of the final software with respect to the user needs and requirements | Verification/validation team | Design team leader
Verification/validation test procedures | To document the detailed description of the tests to be performed, the purpose of each test, the evaluation and acceptance criteria to be used, the methods and the basis for generation of the test inputs | Verification/validation test team | Independent verifier
Integration and test log book (a) | To record test events and test results of integration tests | Integration team member | Integration team leader
System test log book | To record test events and test results of the computer system | System test team | System test team leader and user
Final test report | Documents the results of all the tests described in the software test specifications in terms of whether or not the software has achieved the performance requirements given in the software functional requirements | System test team | Test team leaders

(a) See Annex K-2 for an example of a test record log.
Note: The verification test specifications and the validation test specifications are often combined into one document, the verification and validation
plan. Also, one team, the verification and validation team, can carry out verification and validation activities independent of the design team.
Annex K-2
TEST RECORD LOG
(Form rows identified by test record numbers TR/...)
Annex K-2
FORM FOR MEDIA CONTROL: REGISTER OF MEDIA
(Form fields: project, sheet number, revision date; register of media entries and a program location record with its own revision date.)
Annex K-2
FORM FOR MEDIA CONTROL: ARCHIVES RECORD
(Form fields: system, project, program identifier and title; items archived (master, listing, specification documents, test documents, user documents), each with version, location, date and submitted by; authorized by and date. A companion sheet records, for each version and date, the access information, link information, outstanding change requests (authorized), comments and notes.)
Annex V-3
MODULE RELEASE
(Form fields: project, system name, module specification No., date configured, comments.)
Annex K-2
SOFTWARE CHANGE REQUEST
(Form fields: project, SCR No., sheet number; changes required to; software affected; distribute to.)
SOFTWARE CHANGE REGISTER
(Form fields: project, sheet number, revision date.)
CONFIGURATION STATUS REPORT
(Form fields: project, revision date.)
DOCUMENT REGISTER
(Form fields: project, sheet number, revision date.)
(Example document register: the register lists the project documents, including the quality assurance programme, software quality plan, requirements specification (customer approval), functional specification, detailed software design, test plan, system test specification, system test schedule, acceptance test specification (customer approval), acceptance test schedule (customer approval), computer system requirements, hardware requirements, integration and test requirements, architectural design, detailed design, coding, build and integration, hardware/software integration and computer system testing, set against the project phases.)
Annex K-2
(Figure fragment: data items comprising module parameters, input data and databases. See Fig. 3 in the text.)
Annex V-3
1. Purpose
2. Definitions
3. Verification and validation summary
3.1. Organization
3.2. Schedule
3.3. Resources summary
3.4. Responsibilities
3.5. Tools, techniques and methods
4. Life-cycle verification and validation (V&V)
Each subsection shall address the following topics as appropriate:
Verification and validation tasks
Methods and criteria
Inputs/outputs
Schedule
Resources
Assumptions and reservations
Responsibilities
4.1. Management of V&V
4.2. Design specification V&V
4.3. Design V&V
4.4. Implementation V&V
4.5. Testing V&V
4.6. Operation and maintenance V&V
5. Software verification and validation reporting
5.1. Required reports
5.2. Optional reports
6. Verification and validation administrative procedures
6.1. Non-conformance reporting and resolution
6.2. Corrective action policy
6.3. Control procedures
6.4. Standards, practices and conventions
7. Referenced documents
1 Based on Draft Standard for Software Verification and Validation Plans, IEEE Draft
Std P1012/D6.0, Institute of Electrical and Electronics Engineers, Inc., New York
(6 Jun. 1986).
Annex K-2
VERIFICATION RECORD
(Form fields: project, sheet number, VR No., method of verification, verification comments; for verification by test: test location, tested by, witnessed by.)
General
(A) Is the detailed functional specification (DFS) or the program design document
(PDD) in conformance with the design documentation guideline and any other
project guidelines?
(1) Does the basic design comply with the established design guidelines?
(2) Does the detailed design comply with the established design guidelines?
(3) Have the standard naming, interface, etc. conventions been followed?
(4) Does the design as documented (in the DFS or PDD) conform to existing
documentation standards?
(5) Assuming the design standards call for structured design, does the detailed
design conform to these precepts?
(6) Do the flow charts, narratives, code diagrams, data flow diagrams, or other
form of the design description agree with the established structural and
documentation standards?
(7) Has the hierarchical structure been defined to the point where the scope of each
identified module or subsystem is narrowed down to a set of specific
operations?
(8) Does the modular decomposition reflect intermodular or intersubsystem
independence?
(9) If not stipulated earlier in the requirements specification (RS), has the
programming language that will be used been identified?
(10) Will global constants be expressed in terms of parameters? For example, will
the design lead to code of the form AREA = 3.14159*R**2 or AREA =
PI*R**2, where 'PI' is defined globally? (A short sketch follows this list.)
(11) Are parameters consistently named and referenced only by name throughout
the program?
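As an illustration of item (10) above, the following sketch shows a globally defined constant referenced by name, so that a later change of value or precision is made in a single place. It is written in Python purely for illustration; the names are hypothetical and not requirements of the Manual.

# A global constant is defined once and referenced by name everywhere.
PI = 3.14159

def circle_area(radius: float) -> float:
    # AREA = PI*R**2 rather than AREA = 3.14159*R**2, as in item (10) above
    return PI * radius ** 2

print(circle_area(2.0))  # 12.56636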
(1) Does the design (DFS or PDD) decompose the overall system requirements
into the hardware and software components which are correct and such that
there are no ambiguities or deficiencies?
(2) Does the design give an accurate, consistent, complete translation of the RS?
(3) Do the software module structures, interfaces and layout conform to the RS?
(4) Are all the support software requirements properly referenced for proper
execution of the design?
(5) Does each subsystem (or module) described in the DFS (or PDD) correspond
to a previously defined element? Is it properly identified?
(6) Do module specifications contain information from which traceability back to
the software requirements can be made and in turn verified?
(7) Have all software requirements been addressed in the design?
(8) Does the design (DFS or PDD) comply with each and every requirement in the
RS without introducing extraneous functions?
(9) Is the design internally consistent, complete and consistent with the previous
documents (DFS with RS, DFS with PDD)?
(C) Have all the overall considerations in the design been dealt with?
(1) Have all the necessary preliminary technical design reviews been made? Have
they been recorded and the recommendations incorporated into the design?
(2) Does the DFS (or PDD) meet product assurance requirements? (Allow for
review and audits, detail facilities and procedures for testing.)
(3) Is there agreement between the project manager and user management that the
alternative selected is the best one? Are there unanswered questions concerning
trade-offs?
(4) Is each design structure chart (or equivalent) being reviewed uniquely identi-
fied by name, version and date?
(5) Has the software verification and validation plan been updated to reflect any
changes in the test schedules, test facilities and test requirements resulting from
information developed during the design activity?
(6) Is there agreement from quality assurance representatives that the DFS (or
PDD) is ready for approval?
(7) Is there agreement between the developer and user that the design will fulfil
the user needs?
(8) Has sufficient attention been paid to external (user oriented) requirements?
(9) Is the design expansible?
(10) Will it be necessary to acquire any additional software/hardware resources to
carry out further development consistent with the design? If so, have they been
specified in sufficient detail to permit their acquisition to proceed?
(11) Has each design alternative been clearly identified?
(12) Have all assumptions been identified?
(13) Were the design alternatives objectively compared and objectively selected?
(1) Database
(a) Is the refinement of all data storage and timing allocations complete?
(b) Is the detailed definition of the database content, structure and control
complete?
(c) Are the methods of accessing all database elements defined?
(d) Is the database design consistent with the overall design?
(e) Is each data structure referenced within a module defined within the data
dictionary?
(f) Has the definition of database organization and content been completely
generated?
(g) Are the control structures and data linkages consistently defined?
(h) Have all major parameters been adequately described?
(i) Does the database specify the structure of the tables and, to the extent
possible, the format, range and scale of the table entries?
(j) Has the data structure been examined for inconsistency or
awkwardness?
(2) Interfaces
(a) Are the identification and specification of parameters, entries and normal
and abnormal exits for all common subroutines complete?
(b) Is the detailed definition and documentation of software internal and
external interfaces completed?
(c) Are all external interfaces well defined (control, support, test)?
(d) Is the design compatible with external interfaces?
(e) Does each subsystem (or module) have well defined outputs and inputs?
Is its function clearly understood?
(f) Have the methods of sequencing and controlling module execution and
interfaces between modules been completely analysed?
(g) Has each interface between the module and the submodules been
completely specified?
(h) Have all human factors been properly addressed?
(i) Has hardware/telecommunications/software compatibility been assured?
(j) Have all the interface requirements between the specified computer
program and other system elements or other computer systems as speci-
fied in the RS been addressed?
(k) Does the planned user interface to the computer program provide for use
of all the computer program functions and control of the computer
program in its operational environment?
(l) Are all functional interfaces with exterior system equipment and
communication links correct?
(m) Are all functional interfaces within the system correct, i.e. soft-
ware/software and software/hardware? (Analyse word format, transfer
rates, etc. for incompatibilities.)
(n) Will the design when implemented interface with other system elements
or other systems in accordance with the interface specifications?
(o) Have all module interfaces been examined to assure that calling routines
provide the information and environment required by the called routines
in the forms, format and units assumed?
(p) Are the functional interfaces between all modules (subsystems)
complete, correct and named, and have they been referenced only by
name throughout the program?
(a) Is the logical interrelation of parts of the DFS (or PDD) clear? Are there
any major flaws?
(b) Is the control flow clear and acceptable?
(c) Is the program control system clearly outlined?
(d) Does the control logic trap infinite loops?
(e) Does the logic prevent self-destruction? (Overflow of a data table should
be diagnosed and reported to the user with appropriate recovery action;
see the sketch following this list.)
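Items (d) and (e) above can be illustrated with a minimal sketch of such traps (Python; the iteration bound, the table capacity and the messages are assumptions made only for the example).

MAX_ITERATIONS = 1000   # assumed bound used to trap a runaway loop
TABLE_CAPACITY = 100    # assumed fixed size of a data table

def append_to_table(table: list, value: float) -> bool:
    """Diagnose and report a table overflow instead of corrupting storage."""
    if len(table) >= TABLE_CAPACITY:
        print("data table full: entry rejected, operator notified")  # recovery action
        return False
    table.append(value)
    return True

def iterate_until_converged(update, x0: float, tolerance: float = 1e-6) -> float:
    """Iterate update(), but trap a potentially infinite loop."""
    x = x0
    for _ in range(MAX_ITERATIONS):
        x_next = update(x)
        if abs(x_next - x) < tolerance:
            return x_next
        x = x_next
    raise RuntimeError("iteration limit reached: possible non-convergence")

# e.g. iterate_until_converged(lambda x: 0.5 * (x + 2.0 / x), 1.0) approximates 2**0.5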
(a) Are the specified data rates, accuracies, response time and the like
reflected in the design?
(b) Does the design avoid potential snags caused by defective external
devices?
(c) Have input data validity checks been stipulated? Can corrective action be
taken by the module/subsystem that traps the error? (A sketch follows this list.)
(d) Does the design assume the same form, format and units for input and
output that are stated in the requirements?
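A possible form of the input validity check asked for in item (c) above is sketched below (Python; the valid range and the fall-back value are assumptions made for the example, not values prescribed by the Manual).

TEMP_MIN, TEMP_MAX = 0.0, 350.0   # assumed valid range for an input reading
LAST_GOOD_VALUE = 250.0           # assumed fall-back, e.g. the last validated sample

def validate_temperature(raw: float) -> float:
    """Range-check an input; the trapping module takes the corrective action."""
    if TEMP_MIN <= raw <= TEMP_MAX:
        return raw
    # Corrective action: report the bad input and substitute the last good value.
    print(f"input {raw} outside [{TEMP_MIN}, {TEMP_MAX}]: substituting last good value")
    return LAST_GOOD_VALUE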
(5) Algorithms/models/functions
(f) Do the individual algorithms or special techniques really work for the
given problem and data? (This could involve hand calculations for
representative test cases.)
(g) Is the logic of individual algorithms clear and acceptable?
(a) Has a functional description and design been prepared for each module?
(b) Has the specification of each module/subsystem (in the DFS) been
developed to a level of detail sufficient to permit the detailed design (lead-
ing to the PDD) to begin?
(a) Does every function identified in the DFS appear either as a module or
is it embedded identifiably within some module in the PDD?
(b) Do all submodules of a given module perform actions identifiable as
subfunctions of the stated function of the given module? (Any
subfunctions that seem to be missing or extra should be noted as a
discrepancy.)
(c) Are module functions allocated to the lowest level of individually iden-
tifiable computer program components?
(d) Is the design representation from which the software will be coded
prepared in sufficient detail to permit direct translation to a computer
program code and to derive requirements for developmental testing of
the design?
(e) Are the functions allocated to each subsystem further allocated to the
modules comprising the subsystem?
(f) Is the design detailed to the point where each design quantum (e.g. flow
chart symbol, design language statement) clearly indicates the scope of
the code that will implement it?
(g) Has each of the functions that were defined in the module specification
(DFS) been addressed?
(h) Do all decisions within each module test explicit determinable condition
flags, either defined within that module, passed to it as an argument (or
globally), or returned to it by one of its submodules?
(1) Does the design satisfy the attributes of quality: correctness, reliability, effi-
ciency, integrity, usability, maintainability, testability, flexibility, portability,
reusability and interoperability?
(2) Does the design specification reflect the quality criteria required in the RS?
(3) Is the design modular?
(4) Is the design free from error, as determined by simulations or walk-throughs?
(1) Does the design contain adequate provisions for error corrections, testing and
maintenance?
(2) Are corrective actions for potentially invalid conditions (e.g. dividing by zero,
or the 'noise' remaining after the subtraction of two similar values) reasonable
with regard to minimizing the effect of system or input failures? (A sketch
follows this list.)
(3) Has sufficient attention been paid to abnormal conditions or exceptions and
have they been handled adequately?
(4) Does the design approach for all the functions satisfy the performance require-
ments of the RS?
(5) Is the design compatible with the timing requirements?
(6) Are the test requirements and test tools sufficient to determine if all the func-
tions meet the requirements of the RS?
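Item (2) of the list above mentions division by zero and the 'noise' remaining after subtracting two similar values. The following sketch indicates how such potentially invalid conditions might be guarded (Python; the thresholds are assumed for illustration only).

EPS = 1.0e-12  # assumed threshold below which a divisor is treated as zero

def safe_ratio(numerator: float, denominator: float, default: float = 0.0) -> float:
    """Guard a division against a (near-)zero denominator."""
    if abs(denominator) < EPS:
        print("divisor effectively zero: returning default value")
        return default
    return numerator / denominator

def relative_difference(a: float, b: float) -> float:
    """Flag the loss of significance when subtracting two similar values."""
    diff = a - b
    scale = max(abs(a), abs(b), EPS)
    if abs(diff) / scale < 1.0e-7:   # assumed significance threshold
        print("warning: result dominated by rounding noise")
    return diff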
(H) Is the design feasible, i.e. achievable within the allocated resources and
schedules and with acceptable risks?
(I) Does the design appropriately meet all the constraints?
(1) Have time critical functions been analysed to verify that the system operations
timing requirements can be met?
(2) Have adequate time and memory space been correctly allocated to individual
software items?
(3) Are data handling limitations of the hardware and resource utilization reason-
able and are they consistent/compatible with interface and accuracy control
specifications, hardware/system specifications and resource allocations?
(4) Have the overall memory required, computation speed and other resource
consumptions been estimated?
(5) Do timing and sizing margins exist?
(6) Has critical timing, if any, been analysed? If a module execution load budget
applies, are the timing estimates consistent with it?
(7) Does the module design imply previously unstipulated limitations or
constraints? Are they acceptable with regard to the user's needs?
(8) Is there a memory budget allocating estimated storage requirements for each
module? Does it have adequate reserve?
(9) Is there an execution load budget by module or subsystem functions? If the
latter, is the grain fine enough to be credible? Does it leave adequate reserve?
(A bookkeeping sketch follows this list.)
(10) Does the design predicate any constraint or limitation beyond those found in
the specification? Are they compatible with the end and systems specifications?
(11) Have all constraints specified by the requirements been met?
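Items (8) and (9) above refer to memory and execution load budgets with adequate reserve. A simple bookkeeping sketch is given below (Python; the module names, the estimates and the 20% reserve requirement are hypothetical).

# Hypothetical per-module storage estimates in kilobytes.
memory_budget_kb = {"scan_inputs": 12, "limit_check": 8, "alarm_logic": 20, "display": 32}

TOTAL_MEMORY_KB = 96      # assumed memory available to the application
REQUIRED_RESERVE = 0.20   # assumed reserve fraction required by the project

def check_budget(estimates: dict, total: float, reserve: float) -> bool:
    """Return True if the summed estimates leave at least the required reserve."""
    used = sum(estimates.values())
    margin = (total - used) / total
    print(f"used {used} of {total} kB, margin {margin:.0%}")
    return margin >= reserve

check_budget(memory_budget_kb, TOTAL_MEMORY_KB, REQUIRED_RESERVE)  # margin 25%, passes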
Annex K-2
NON-CONFORMANCE REPORT
(Form fields: project, NCR No., details of non-conformance, raised by and date, cause of non-conformance, distribute to, approved by and date.)
Annex K-2
NON-CONFORMANCE REPORT LOG
(Form fields: project, sheet number, revision date; columns for NCR No., identifier, brief description, date raised, date cleared and comments.)
Annex K-2
VALIDATION RECORD
(Form fields: project, sheet number, VL No., method of validation, validation comments; for validation by test: test location, tested by, witnessed by.)
TYPICAL CHECK-LIST
FOR AUDIT OF COMPUTER PROGRAMS
(1) Does the program have a unique reference and status identification?
(2) Does the identifier identify the program, and its version and revision status?
(3) Do all outputs include the identifier? (A sketch follows this check-list.)
(4) Is the responsibility for recording the program name and identifier defined?
(5) Is the responsibility for program security defined?
(6) Do the defined methods of program security prevent inadvertent modification
from the verified and approved version?
(7) Is the responsibility for program verification defined?
(8) Has a program verification report been produced?
(9) Has the verification report been reviewed for scope and adequacy by someone
other than the originator?
(10) Has the program been validated by the user for his application?
(11) Has a validation report been produced?
(12) Is there a set of program documentation which includes: user guides; descrip-
tion of program, including theoretical basis; verification report; and validation
report?
(13) Has the program documentation been subject to review and approval?
(14) Do records of program use contain: listings of the data used for the run; and
associated job control language?
(15) Are changes to the program recorded and verified?
(16) Is documentation revised when program changes are made?
(17) Are changes communicated to all users of the program?
(18) Is the program maintained by an outside organization?
(19) If the program is maintained by an outside organization has the user ascer-
tained that: security procedures are applied; and that the program operates as
the user intended?
(20) Is there a computer program index containing all the programs used by the
user?
(21) Does the computer program index include all applicable versions of the
program?
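Items (1) to (3) of this check-list concern a unique program identifier, with version and revision status, appearing on all outputs. One possible arrangement is sketched below (Python; the identifier scheme and the values are invented for the example).

# Program identification defined once per release (values are hypothetical).
PROGRAM_ID = "SAFE-CALC"
VERSION = "2.1"
REVISION = "C"

def identification_banner() -> str:
    """Identifier, version and revision status to appear on every output."""
    return f"{PROGRAM_ID} V{VERSION} Rev.{REVISION}"

def write_report(lines: list, path: str) -> None:
    # Every output file begins with the identification banner (items (2) and (3)).
    with open(path, "w") as report:
        report.write(identification_banner() + "\n")
        for line in lines:
            report.write(line + "\n")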
Annex V-3
TYPICAL CHECK-LIST
FOR CONFIGURATION MANAGEMENT AUDITS
(1) Have configuration reference points been established for: requirement specifi-
cation; system description; system test; and handover?
(2) Do configuration reference points for in-service use include: major enhance-
ments of operational or application software; upgrading of hardware; and
introduction of new versions of software?
(3) Has a configuration baseline been established?
(4) Are all changes to the configuration baseline subject to configuration control
procedures and recorded?
(5) Have all changes from previous configuration baselines been incorporated into
succeeding baselines?
(6) Have all configuration items been identified and recorded?
(7) Does each configuration item have a complete description which includes:
unique identification; type of software (i.e. master, archive, object, etc.);
nature and identification of the machine readable form of the software; and
application details, i.e. where used? (A sketch follows this check-list.)
(8) Are all introductions or changes to system configuration controlled by proce-
dure to produce documentation to: specify nature of change; identify all config-
uration items involved or affected; uniquely identify new configuration items;
and identify the effects of the change on system performance?
(9) Do the configuration control procedures specify the level of authority for
approving a change?
(10) Do the procedures require communication of details of the change to all who
need to know?
(11) Do the procedures ensure that all relevant configuration documentation is
updated and cross-referenced?
(12) Do the procedures provide for backward traceability from an approved
changed configuration to that which existed before the change?
(13) Do the configuration procedures provide for tests or retests which verify the
change and retest any aspects which may have been affected by the changes?
(14) Do the procedures provide control of implementation of changes to define: the
responsibility for making changes, progressing approval of changes and main-
taining records of the status of changes?
(15) Do all changes have an individual identification number?
(16) Does configuration documentation conform to the prescribed project standard?
(17) Is there a correspondence between documentation and the subject configuration
items?
(18) Does configuration documentation contain adequate cross-references, e.g.
between hardware and software?
(19) Does configuration documentation identify dependencies and interrelationships
between configuration items?
(20) Is the identity and issue status of all configuration documentation established?
(21) Is there a master configuration index which identifies and lists all configuration
items, together with their documentation and respective issue status?
(22) Is there a security copy of the master configuration index?
(23) Are copies of all previous issues of the master configuration index provided?
(24) Do the historical records of the master configuration index contain details of
all changes between issues?
(25) Is a copy of the master configuration index available in the software library
for issue and return control of software and data?
(26) Is the master configuration index structured and organized to reflect the
systems and subsystems it comprises?
(27) Does a system of configuration status accounting exist?
(28) Does the accounting contain the following elements: dates when baselines and
changes were established; date of introduction of each configuration item;
current issue and change status (including pending changes); deficiencies
identified by quality auditing, configuration auditing, design reviews and non-
conformance reports?
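Items (7) and (21) of this check-list describe the content of a configuration item description and of a master configuration index. The sketch below records those fields in a simple structure (Python; the field names and example values are assumptions, not a prescribed format).

from dataclasses import dataclass, field

@dataclass
class ConfigurationItem:
    identifier: str               # unique identification (item (7))
    software_type: str            # master, archive, object, etc.
    machine_readable_form: str    # nature and identification of the medium
    where_used: str               # application details
    issue_status: str             # current issue and change status
    documentation: list = field(default_factory=list)

# A master configuration index as a simple mapping from identifier to item (item (21)).
master_configuration_index = {
    "PSS-MOD-012": ConfigurationItem(
        identifier="PSS-MOD-012",
        software_type="object",
        machine_readable_form="magnetic tape T-047",
        where_used="reactor trip logic build 3",
        issue_status="issue 2, change 5 pending",
        documentation=["module specification MS-012", "test report TR/012"],
    )
}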
Annex V-3
TYPICAL CHECK-LIST
FOR AUDIT OF SOFTWARE TESTING
(1) Has a test strategy been defined and documented for the project?
(2) Has the strategy been approved and issued?
(3) Does the strategy include: methods of test; testing aids and monitoring
techniques; methods for constructing and maintaining test data; communication
links between designers, testers and users; methods of submitting the tests; test
duration; organization and responsibilities for activities; special testing
arrangements; library control; and document and record filing?
(4) Has a test plan been established and approved?
(5) Does the test plan identify the sequence of testing and the test specifications
and procedures to be used?
(6) Has the adequacy of the specifications and procedures been reviewed and the
documents approved and issued?
(7) Does the test procedure contain the following: definition of what is to be tested;
objectives of the test; test conditions; test data to be used; test date and
duration; methods of test; versions of test aids and tools to be used; test output
expected; analysis of results to be carried out; test records required; and
actions required if the test fails? (A sketch of a corresponding test log record
follows this check-list.)
(8) Have all test facilities and data been identified and provided?
(9) Does the test log provide evidence that the test was conducted according to
procedure?
(10) Do the recorded results verify that the test acceptance criteria have been met?
(11) Are all deviations from the test plan, procedure and specification recorded?
(12) Was the test date recorded?
(13) Did the following activities take place after the test: all non-conformances
recorded and analysed for cause; documentation corrected; repeat tests carried
out; test documentation completed; and test documentation and results
reviewed and approved?
(14) Was a test summary prepared which included the following points: non-
conformances which are to be corrected or are subject to concession; and any
actions to be taken and time-scales for completion?
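Items (7), (9) and (10) of this check-list call for a documented test procedure, a test log showing that the test was run to procedure, and recorded results checked against the acceptance criteria. A small sketch of such a test log record is given below (Python; the acceptance criterion and the values are invented).

from dataclasses import dataclass

@dataclass
class TestLogEntry:
    test_id: str
    date: str
    procedure_ref: str      # which approved test procedure was followed (item (9))
    expected_output: float
    observed_output: float
    tolerance: float

    def acceptance_met(self) -> bool:
        """Item (10): do the recorded results meet the acceptance criterion?"""
        return abs(self.observed_output - self.expected_output) <= self.tolerance

entry = TestLogEntry("ST-014", "1987-11-05", "TP-07 issue B", 100.0, 100.4, 0.5)
print(entry.test_id, "PASS" if entry.acceptance_met() else "FAIL")  # PASS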
BIBLIOGRAPHY
The following list of standards, books and papers is included as a set of useful
references for readers who wish to study these subjects in more detail than is
possible in the Manual.
The subjects of software engineering in general and software quality assurance
in particular are covered; the entries are grouped by the chapter and section titles of the main text.
2. SOFTWARE LIFE-CYCLE
KEROLA, P., FREEDMAN, P., "A comparison of life-cycle models", Proc. 5th Int. Conf.
on Software Engineering, IEEE Computer Society Catalog No. 81CH1627-9, Institute of
Electrical and Electronics Engineers, Inc., New York (1981) 90.
2.6. Maintenance
BARIKH, G., Techniques of Program and Systems Maintenance, Ethnotech Inc., Lincoln,
NB (1980).
ALBIN, J.L., FERREOL, R., Collecte et analyse des mesures du logiciel, Tech. Sci. Inf. 1
4 (1982).
ANSI/IEEE, Standard for Software Quality Assurance Plans, Standard 730, Institute of Elec-
trical and Electronics Engineers, Inc., New York (1984).
ASSOCIATION FRANCAISE DU CONTROLE INDUSTRIEL DE LA QUALITE,
Démonstration de la fiabilité des logiciels, Section qualité du logiciel, AFCIQ, Paris.
COOPER, J.D., FISHER, M.J. (Eds), Software Quality Management, Petrocelli Books, New
York and Princeton (1979).
Datafrance 8 (1984) 37-41.
IVES, K.A., Computer Software Quality Assurance, Research Report, Prior Data Sciences
Ltd, Rep. INFO-0201, Atomic Energy Control Board, Ottawa, Canada.
MEEKEL, J., TROY, R., "Comparative study of standards for software quality assurance
plans", Proc. Software Engineering Standards Workshop (SESAW-III), Computer Society,
Institute of Electrical and Electronics Engineers, Inc., New York (Oct. 1984).
NERSON, J.M., Panorama des outils d'amélioration de la qualité des logiciels (Atelier
Logiciel No. 40), EDF-DER, Service informatique et mathématiques appliquées, Clamart
(1983).
NORTH ATLANTIC TREATY ORGANIZATION, Guide for the Evaluation of Contractors'
Software Quality Control System for Compliance with AQAP-13, Allied Quality Assurance
Publication No. AQAP-14, NATO, Brussels.
ENDRES, A., An analysis of errors and their causes in system programs, IEEE Trans. Soft-
ware Eng. SE-1 2 (Jun. 1975).
GANNON, C., "Software error studies", Proc. Nat. Conf. on Software Test and Evaluation,
National Security Industrial Association (Feb. 1983) 1-1.
GLASS, R.L., Persistent software errors, IEEE Trans. Software Eng. SE-7 2 (Mar. 1981)
162-168.
SHOOMAN, M.L., BOLSKY, M.I., "Types, distribution, and test and correction times for
programming errors", Proc. Int. Conf. on Reliable Software, Computer Society, Institute of
Electrical and Electronics Engineers, Inc. (Apr. 1975) 347.
ALBIN, J.L., FERREOL, R., Collecte et analyse des mesures du logiciel, Tech. Sci. Inf. 1
4 (1982).
BOEHM, B.W., et al., Characteristics of Software Quality, North-Holland, Amsterdam and
New York (1978).
McCALL, J.A., "An introduction to software quality metrics", Software Quality Manage-
ment, Petrocelli Books, New York and Princeton (1979) 127.
ANSI/IEEE, Standard for Software Quality Assurance Plans, Standard 730, Institute of
Electrical and Electronics Engineers, Inc., New York (1984).
ANSI/IEEE, Standard for Software Unit Testing, Standard P1008-1987, Institute of Electrical
and Electronics Engineers, Inc., New York (1987).
BRITISH STANDARDS INSTITUTION, Control of the Operation of a Computer (Code of
Practice), BS-6650, BSI, London.
INSTITUTE OF ELECTRICAL AND ELECTRONICS ENGINEERS, INC., IEEE Standard
for Software Test Documentation, IEEE Standard 829-1983, IEEE, New York (1983).
4. ORGANIZATION
(a) General
DE MARCO, T., Controlling Software Projects: Management, Measurement and Estima-
tion, Yourdon Press, New York (1982).
(b) Training
BARNES, K., Doing it yourself: A blueprint for training, PC Mag. (6 Aug. 1985) 147.
McGILL, J.P., The software engineering shortage, IEEE Trans. Software Eng. SE-10 1
(Jan. 1984) 42.
RASKIN, R., Individual training: A matter of style, PC Mag. (6 Aug. 1985) 121.
5. SOFTWARE DOCUMENT, CONFIGURATION,
MEDIA AND SERVICES CONTROL
INSTITUTE OF ELECTRICAL AND ELECTRONICS ENGINEERS, INC., Standard for
Software Configuration Management Plan, IEEE Standard 828-1983, IEEE, New York
(1983).
SHANKAR, K.S., The total computer security problem: An overview, IEEE Computer 10
6 (Jun. 1977) 50.
6. DESIGN CONTROL
BARIKH, G., Techniques of Program and Systems Maintenance, Ethnotech Inc., Lincoln,
NB (1980).
DE MARCO, T., Structured Analysis and System Specification, Yourdon Press, New York
(1978).
EUROPEAN WORKSHOP ON INDUSTRIAL COMPUTER SYSTEMS, TECHNICAL
COMMITTEE 7, Development of Safety Related Software, Position Paper No. 1 (Oct. 1981).
GANE, C., SARSON, T., Structured Systems Analysis: Tools and Techniques, McDonnell
Douglas Automation Company, St. Louis, MO (1977).
INSTITUTE OF ELECTRICAL AND ELECTRONICS ENGINEERS, INC., Draft Standard
for Software Verification and Validation Plans, IEEE Computer Society Draft Standard
P1012/D4.0, IEEE, New York (31 Jul. 1985).
POWELL, P.B. (Ed.), Planning for Software Validation, Verification, and Testing, National
Bureau of Standards Special Publication NBS-SP-500-98, NBS, Washington, DC
(Nov. 1982).
POWELL, P.B. (Ed.), Software Validation, Verification and Testing, Technique and Tool
Reference Guide, National Bureau of Standards Special Publication NBS-SP-500-93, NBS,
Washington, DC (Sep. 1982).
BROWN, J.R., "Programming practices for increased software quality", Software Quality
Management, Petrocelli Books, New York and Princeton (1979) 197.
FISHER, C.F., "Software quality assurance tools: Recent experience and future require-
ments", Software Quality Assurance (Proc. Workshop San Diego, 1978), Association for
Computing Machinery, New York (1978) 116.
HOUGHTON, R.C., Jr., OAKLEY, K.A., NBS Software Tools Data Base, National Bureau
of Standards Publication NBS-IR2159, NBS, Washington, DC (Oct. 1980).
NERSON, J.M., Panorama des outils d'amélioration de la qualité des logiciels (Atelier
Logiciel No. 40), EDF-DER, Service informatique et mathématiques appliquées, Clamart
(1983).
POWELL, P.B. (Ed.), Software Validation, Verification and Testing, Technique and Tool
Reference Guide, National Bureau of Standards Special Publication NBS-SP-500-93, NBS,
Washington, DC (Sep. 1982).
REIFER, D.J., "Software quality assurance tools and techniques", Software Quality
Management, Petrocelli Books, New York and Princeton (1979) 209.
7. PROCUREMENT CONTROL
8. TESTING
ANSI/IEEE, Standard for Software Unit Testing, Standard P1008-1987, Institute of Electrical
and Electronics Engineers, Inc., New York (1987).
BEIZER, B., Software System Testing and Quality Assurance, Van Nostrand Reinhold,
New York (1984).
GLASS, R.L., Software Reliability Guide Book, Prentice-Hall, Englewood Cliffs, NJ (1979).
INFOTECH, State of the Art Report — Software Testing, Vol. 2, Invited Papers, Infotech
International Limited, Berkshire, UK (1979).
McCABE, T.J., Structured Testing, IEEE Computer Society Catalog No. EH0200-6,
Institute of Electrical and Electronics Engineers, Inc., New York (1982) 90.
MYERS, G.J., Software Reliability: Principles and Practices, Wiley, New York (1976).
POWELL, P.B. (Ed.), Planning for Software Validation, Verification, and Testing, National
Bureau of Standards Special Publication NBS-SP-500-98, NBS, Washington, DC
(Nov. 1982).
POWELL, P.B. (Ed.), Software Validation, Verification and Testing, Technique and Tool
Reference Guide, National Bureau of Standards Special Publication NBS-SP-500-93, NBS,
Washington, DC (Sep. 1982).
10. CORRECTIVE ACTION
FOSTER, K.A., Error sensitive test case analysis (ESTCA), IEEE Trans. Software Eng.
SE-6 3 (May 1980) 258.
GANNON, C., Error detection using path testing and static analysis, Computer 12 8
(Aug. 1979) 26.
OSTERWEIL, L.J., FOSDICK, L.D., "DAVE - A validation error detection and documen-
tation system for Fortran programs", Tutorial: Software Testing and Validation Techniques,
IEEE Catalog No. EH0138-8, Institute of Electrical and Electronics Engineers, Inc.,
New York (1978) 473.
11. RECORDS
BARIKH, G., Techniques of Program and Systems Maintenance, Ethnotech Inc., Lincoln,
NB (1980).
NATIONAL BUREAU OF STANDARDS, Guidelines for Automatic Data Processing
Physical Security and Risk Management, Federal Information Processing Standards Publica-
tion 31, NBS, Washington, DC (Jun. 1974).
12. AUDITING
LIST OF PARTICIPANTS
Miani, G. (III, IV), Ente Nazionale per l'Energia Elettrica (ENEL),
Viale Regina Margherita 135, I-00198 Rome, Italy
HOW TO ORDER IAEA PUBLICATIONS
An exclusive sales agent for IAEA publications, to whom all orders
and inquiries should be addressed, has been appointed
in the following country:
Orders from countries where sales agents have not yet been appointed and
requests for information should be addressed directly to:
Division of Publications
International Atomic Energy Agency
Wagramerstrasse 5, P.O. Box 100, A-1400 Vienna, Austria
INTERNATIONAL ATOMIC ENERGY AGENCY
VIENNA, 1988
SUBJECT GROUP: V, Reactors and Nuclear Power/Reactor Technology