Software Validation
NT TECHN REPORT 535
Approved 2003-04
Published by Nordtest
Tekniikantie 12, FIN–02150 Espoo, Finland
Phone: + 358 9 455 4600, Fax: + 358 9 455 4272
E-mail: [email protected], Internet: www.nordtest.org
Institution:
Danish Institute of Fundamental Metrology
Title (English):
Title (Original): Method of Software Validation
Abstract:
This Method of Software Validation is a tool intended to assist in validation of small and
medium scale software used in accredited and other laboratories where software
validation is required. The tool encompasses this technical report, which describes how to
use the method, and a Microsoft® Word 2000 report template, which guides the user
through the validation task.
The Microsoft® Word 2000 report template can be downloaded from the Nordtest website at:
http://www.nordtest.org/register/techn/tlibrary/tec535/tec535_valid.dot.
The Microsoft® Word 2000 report template has also been converted to a PDF document and
included in this report as an appendix.
[Figure: Software life cycle model, Phase 1 (Requirements and system acceptance test specification). Inputs and outputs cover functionality/limitations, defaults, security; platform/system requirements; special requirements/risk analysis; preparation of the system acceptance test; service and maintenance/phase out; and design verification.]
Table of contents
Introduction
1 Definition of terms
2 Scope
  2.1 Purchased software products
  2.2 Self-developed software products
  2.3 Development, verification, and validation
3 Software life cycle model
  3.1 Requirements and system acceptance test specification
    3.1.1 Requirements specification
    3.1.2 System acceptance test specification
  3.2 Design and implementation process
    3.2.1 Design and development planning
    3.2.2 Design input
    3.2.3 Design output
      3.2.3.1 Implementation (coding and compilation)
      3.2.3.2 Version identification
      3.2.3.3 Tips on good programming practice
      3.2.3.4 Tips on Windows programming
      3.2.3.5 Dynamic testing
      3.2.3.6 Utilities for validation and testing
      3.2.3.7 Tips on inactive code
      3.2.3.8 Documentation
    3.2.4 Design verification
    3.2.5 Design changes
  3.3 Inspection and testing
  3.4 Precautions
  3.5 Installation and system acceptance test
  3.6 Performance, servicing, maintenance, and phase out
4 Validation report
5 References
Introduction
This method was developed primarily to assist accredited laboratories in validating software for cali-
bration and testing. The main requirements on the laboratories are stated in the standard ISO/IEC
17025 [5]. The Danish Accreditation Body has prepared a DANAK guideline, RL 10 [1], which inter-
prets the requirements in ISO/IEC 17025 with respect to electronic data processing in the accredited
laboratories. That guideline and this method are closely related.
If the laboratories comply with the requirements in ISO/IEC 17025, they will also meet the require-
ments of ISO 9001. A further goal of this method is to cover the situation where an accredited labo-
ratory wants to develop and sell validated computer software on a commercial basis. Therefore the
guideline ISO 9000-3 [2], which outlines the requirements such suppliers must meet, is taken into ac-
count.
Furthermore, the most rigorous validation requirements come from the medical and pharmaceutical
industry. In order to let this method benefit from the ideas and requirements used in this area, the
guidance from the U.S. Food and Drug Administration (FDA), “General principles of software validation”
[3], and the GAMP Guide [4] have been used extensively as inspiration.
This method is not a guideline. It is a tool to be used for systematic and straightforward validation of
various types of software. The laboratories may simply choose which elements they want to validate
and which they do not. It is their option and their responsibility.
1 Definition of terms
To ensure consistency, the terms used in this document follow the definitions below:
• Computer system. A group of hardware components and associated software designed and assem-
bled to perform a specific function or group of functions [4].
• Software. A collection of programs, routines, and subroutines that controls the operation of a com-
puter or a computerized system [4].
• Software product. The set of computer programs, procedures, and associated documentation and
data [2].
• Software item. Any identifiable part of a software product [2].
• Standard or configurable software packages. Standard or configurable software packages are com-
mercial products that are typically used to produce customized applications (e.g. spreadsheets
and executable programs). Even if the software packages themselves do not require validation, new
versions should always be treated with caution and be approved before use. The applications built
with them should always be validated [4].
• Custom built or bespoke systems. Software products categorized as custom built or bespoke sys-
tems are applications that should be validated in accordance with a validation plan based on a full
life cycle model [4].
• Testing. The process of exercising or evaluating a system or system component by manual or auto-
mated means to verify that it satisfies requirements or to identify differences between expected and
actual results [4].
• Verification. Confirming that the output from a development phase meets the input requirements
for that phase [3].
• Validation. Establishing by objective evidence that all software requirements have been imple-
mented correctly and completely and are traceable to system requirements [3].
• Revalidation. Repetition of the validation process or a specific portion of it [4].
• Retrospective validation. Establishing documented evidence that a system does what it purports to
do based on analysis of historical information [4].
• Reverse engineering. Preparing retrospective validation tasks to be conducted on existing software
products (in contrast to software products under development).
• Life cycle model. A framework containing the processes, activities, and tasks involved in the
development and maintenance of a software product, spanning the life of the software from the
definition of its requirements to the termination of its use, i.e. from concept to retirement [2].
• Design process. Software life cycle process that comprises the activities of input requirements
analysis, architectural design, and detailed function design. The design process is that which trans-
forms the requirements into a software executable.
• Development process. Software life cycle process that comprises the activities of system require-
ments analysis, design, coding, integration, testing, installation, and support for acceptance. The
development process is that which transforms the requirements into a software product [2].
• System acceptance testing. Documented validation that the software performs as defined in the re-
quirements throughout anticipated operating ranges in the environment in which it will be used.
• Dynamic testing. Testing performed in the development process to ensure that all statements, func-
tions, cases, and loops have been executed at least once.
• Regression testing. Testing to determine that changes made to correct defects have not introduced
additional defects. [2]
• Replication. Copying a software product from one medium to another. [2]
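As an illustration of the dynamic testing definition above, the following sketch shows a test that executes every statement and branch of a small routine at least once. The `classify` function is hypothetical, used only to make the idea concrete:

```python
def classify(value, limit=100.0):
    """Classify a measured value against an upper limit (illustrative only)."""
    if value < 0:
        raise ValueError("negative input")
    if value > limit:
        return "out of range"
    return "ok"

# Dynamic testing: exercise every statement and branch at least once.
assert classify(50.0) == "ok"
assert classify(150.0) == "out of range"
try:
    classify(-1.0)
except ValueError:
    pass  # the error branch has now also been executed
```

In practice a coverage tool would be used to confirm that no statement was missed, but the principle is the same: the test data must drive execution through every path.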
2 Scope
This method may be used by persons who use, develop, and validate software, especially software
products used for calibration and testing in accredited laboratories. Most such software products require
validation and are commonly categorized as custom built or bespoke systems. They are programs and
spreadsheets that the laboratory itself develops or purchases.
This method is based on a common life cycle model and takes into consideration most aspects of normal
(prospective) and retrospective validation. This method may be used for validation of:
• Purchased software products that are not standard or configurable software packages
• Self-developed or purchased software products where the source code is available and known
• Software being developed under the control of the laboratory
• Version control. How to identify different versions of the software product and to distinguish out-
put from the individual versions.
• Dedicated platform. The operating hardware and software environment in which to use the soft-
ware product, e.g. laboratory or office computer, the actual operating system, network, third-party
executables such as Microsoft Excel and Word, etc.
• Installation. Installation requirements, e.g. how to install and uninstall the software product.
• How to upgrade. How to upgrade to new versions of platforms, support tools, etc.
• Special requirements. Requirements stated by the International Standards to which the laboratory is
committed. Security requirements, traceability, change control and back-up of records, protection
of code and data, confidentiality, precautions, risks in case of errors in the software product etc.
The requirements also specify which software items must be available for correct and unambiguous
use of the software product.
• Documentation. Description of the modes of operation and other relevant information about the
software product.
• User manual. How to use the software product.
• On-line help. On-line Help provided by Windows programs.
• Validation report. Additional documentation stating that the software product has been validated to
the extent required for its application.
• Service and maintenance. Documentation of service and support concerning maintenance, future
updates, problem solutions, requested modifications, etc.
• Special agreements. Agreements between the supplier and the end-user concerning the software
product where such agreements may influence the software product development and use, e.g. spe-
cial editions, special analysis, or extended validation, etc.
• Phase out. Documentation on how (and when) to discontinue the use of the software product and
how to avoid impact on existing systems and data.
• Errors and alarms. How to handle errors and alarms.
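As an illustration of the version control requirement above, the sketch below stamps every output line with the software version and a timestamp, so output can be traced to the version that produced it. The version string, function name, and file layout are assumptions for illustration only:

```python
from datetime import datetime, timezone

__version__ = "1.0.0"  # assumed single source of the version identifier

def write_result(path, result):
    """Append a result line prefixed with a UTC timestamp and the version."""
    stamp = datetime.now(timezone.utc).isoformat(timespec="seconds")
    with open(path, "a", encoding="utf-8") as fh:
        fh.write(f"{stamp}\tversion {__version__}\t{result}\n")
```

Keeping the version identifier in one place makes it harder for the on-screen version, the printouts, and the stored results to drift apart.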
Anomalies found and circumvented in the Design and implementation process should be described in
phase 4, Precautions.
Windows allows executables to run in more than one instance, unless the programmer explicitly pre-
vents the start of another instance when one is already running. The programmer should be aware that
multiple instances will have access to the same files and data and that this may cause problems and
sometimes even errors.
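A minimal sketch of such a single-instance guard is shown below, assuming a lock file is an acceptable mechanism on the target platform; on Windows a named mutex would be the more idiomatic choice. The lock file name and function are illustrative, not part of this method:

```python
import atexit
import os
import tempfile

LOCK_PATH = os.path.join(tempfile.gettempdir(), "myapp.lock")  # assumed name

def acquire_single_instance_lock():
    """Return True if this is the only running instance, else False."""
    try:
        # O_EXCL makes creation fail atomically if the lock already exists.
        fd = os.open(LOCK_PATH, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
    except FileExistsError:
        return False  # another instance already holds the lock
    os.write(fd, str(os.getpid()).encode())
    os.close(fd)
    atexit.register(os.remove, LOCK_PATH)  # release the lock on normal exit
    return True
```

A program would call `acquire_single_instance_lock()` at startup and refuse to continue if it returns False; note that a crash can leave a stale lock file behind, which a real implementation would have to handle.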
3.2.3.8 Documentation
Human readable source code printouts are valid documentation. Programs should be properly docu-
mented so that all necessary information becomes available for the user to operate the software
product correctly. The preparation of a user manual may be specified in the requirements, but addi-
tional user manuals and/or an On-line Help facility may be produced if required.
3.4 Precautions
When operating in a third-party software environment, such as Microsoft Windows and Office, some
undesirable, inappropriate, or anomalous operating conditions may exist. In cases where such condi-
tions impact the use of the software product in some irregular way or cause malfunction, they must be
clearly registered, documented, and avoided (if possible). All steps taken to work around such condi-
tions should also be verified and tested.
Precautionary steps may also be taken in case of discrepancies between the description of the way an
instrument should operate, and the way it actually does. In either case it is a good idea to maintain a
logbook of registered anomalies for other operators and programmers to use.
Minor errors in a software product may sometimes be acceptable if they are documented and/or prop-
erly circumvented.
• Problem / solution. This involves detection of software problems causing operational troubles. A
first step could be to suggest or set up a well-documented temporary solution or workaround.
• Functional maintenance. If the software product is based on international standards, and these stan-
dards are changed, the software product, or the way it is used, should be updated accordingly.
• Functional expansion and performance improvement. User suggestions and requests should be re-
corded in order to improve the performance of the software product. Such records may influence
the development or evaluation of future versions of the software product.
• New versions. When a new version of the software product is taken into use, the effect on the exist-
ing system should be carefully analyzed and the degree of revalidation decided. The most common
result of these considerations will be reentrance into the design changes sub-phase where further
decisions will be made and documented. Special attention should be paid to the effect on old
spreadsheets when upgrading the spreadsheet package.
• Phase out. Considerations should be taken on how (and when) to discontinue the use of the soft-
ware product. The potential impact on existing systems and data should be examined prior to with-
drawal.
Corrective actions due to errors detected in a released software product are addressed under the disci-
pline described in the design changes clause.
4 Validation report
All validation activities should be documented, which may seem an overwhelming job. How-
ever, if the recommendations in this method are followed systematically, the workload becomes
reasonable and producing a proper validation report is straightforward.
This method provides a Word 2000 template “Nordtest Software Validation Report.dot” which is or-
ganized in accordance with the life cycle model stated above. There are two main tasks associated with
each life cycle phase:
• Preliminary work. To specify/summarize the requirements (forward/reverse engineering for
prospective/retrospective validation), to manage the design and development process, make the
validation test plan, document precautions (if any), prepare the installation procedure, and to plan
the service and maintenance phase. All documents and actions should be dated and signed.
• Peer review and test. To review all documents and papers concerning the validation process and
conduct and approve the planned tests and installation procedures. All documents and actions
should be dated and signed.
It is recommended always to mark topics that are excluded from the validation as “not relevant” or
“not applicable” (n/a) – preferably with an argument – so it is evident that they have not been forgotten
but are deliberately skipped. Additional rows may optionally be inserted into the tables if required.
It is the intention that the validation report shall be a “dynamic” document, used to keep track
of all changes and all additional information that may become relevant for the software
product and its validation over time. Such continual updating can make the document more difficult to
read, but it is the contents, not the format, that matter.
When validating software used in accredited work, the laboratories must be aware of the requirements
specified by their National Accreditation Body and especially how to handle the option to include or
exclude validation tasks. Excluded validation tasks should never be removed, but always marked as
excluded with an explanatory statement. Thus, the laboratories themselves are responsible for using
this method in a way that can be accepted by their National Accreditation Body.
The software product should be designed to handle critical events (in terms of when, where, whom,
and why) applied during use. Such events should be traceable through all life cycle phases and meas-
ures taken to ensure the traceability should be stated in the validation report.
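One possible shape of such a traceable record of critical events (when, where, whom, and why) is sketched below; the field names and JSON format are assumptions for illustration, not requirements of this method:

```python
import json
import os
import socket
from datetime import datetime, timezone

def audit_record(event, reason):
    """Build one traceable audit-trail entry as a JSON line."""
    return json.dumps({
        "when": datetime.now(timezone.utc).isoformat(timespec="seconds"),
        "where": socket.gethostname(),
        # user name from the environment; "unknown" if not available
        "whom": os.environ.get("USER") or os.environ.get("USERNAME") or "unknown",
        "what": event,
        "why": reason,
    })
```

Appending one such line per critical event gives a record that can be followed through all life cycle phases and cited in the validation report.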
It may be good validation practice to sign (by date and initials) the different parts of the report as the vali-
dation proceeds, e.g. the Requirements specification should be approved and signed before the Design is
done, and Test specifications should be approved and signed before the tests are carried out. It is also
important to identify the persons who are involved in the validation and are authorized to approve and
sign the report, e.g.
− Persons other than those who built the software product should do the testing.
− Acceptance testing should be done by the system user/owner rather than by the development team.
− The persons approving documents should not be the same as those who have authored them.
5 References
[1] DANAK retningslinie, Anvendelse af edb i akkrediterede laboratorier,
RL 10 af 2002.01.01
[2] DS/EN ISO 9000-3, Quality management and quality assurance standards - Part 3: Guide-
lines for the application of ISO 9001:1994 to the development, supply, installation and
maintenance of computer software, Second edition, 1997-12-15
[3] U.S. Food and Drug Administration: General Principles of Software Validation,
Draft Guidance Version 1.1, June 9, 1997 (www.fda.gov/cdrh/ode/swareval.html)
[4] GAMP Guide. Validation of Automated Systems in Pharmaceutical Manufacture. Version:
V3.0, March 1998
[5] DS/EN ISO/IEC 17025, General requirements for the competence of testing and calibration
laboratories, First edition, 2000-04-27
[6] ISO/DIS 15189.2, Medical laboratories – Particular requirements for quality and
competence, Draft 2002.
Software Product:
Preface
This software validation method, described in the document “Nordtest Method of Software Valida-
tion”, was developed primarily to assist accredited laboratories in validating software for calibration
and testing. The actual report is provided via a Word 2000 template “Nordtest Software Validation
Report.dot”, which is organized in accordance with the life cycle model used in the validation method.
There are two main tasks associated with each life cycle phase:
• Preliminary work. To specify/summarize the requirements (forward/reverse engineering for
prospective/retrospective validation), to manage the design and development process, make the
validation test plan, document precautions (if any), prepare the installation procedure, and to plan
the service and maintenance phase.
• Peer review and test. To review all documents and papers concerning the validation process and
conduct and approve the planned tests and installation procedures.
The report template contains 5 sections:
1. Objectives and scope of application. Tables to describe the software product, to list the involved
persons, and to specify the type of software in order to determine the extent of the validation.
2. Software life cycle overview. Tables to specify date and signature for the tasks of preliminary
work and the peer reviews assigned to each life cycle phase as described above.
3. Software life cycle activities. Tables to specify information that is relevant for the validation. It is
the intention that having all topics outlined, it should be easier to write the report.
4. Conclusion. Table for the persons responsible to conclude and sign the validation report.
5. References and annexes. Table of references and annexes.
Even when possible, it is recommended not to delete irrelevant topics but instead mark them as excluded
from the validation with a “not relevant” or “not applicable” (n/a) note – preferably with an argument –
so it is evident that they have not been forgotten but are deliberately skipped.
It is the intention that the validation report shall be a “dynamic” document, used to keep track
of all changes and all additional information that may become relevant for the software
product and its validation over time. Such continual updating can make the document more difficult to
read, but it is the contents, not the format, that matter.
Table of contents
Software Product:
Preface
1 Objectives and scope of application
2 Software life cycle overview
3 Software life cycle activities
  3.1 Requirements and system acceptance test specification
  3.2 Design and implementation process
  3.3 Inspection and testing
  3.4 Precautions
  3.5 Installation and system acceptance test
  3.6 Performance, servicing, maintenance, and phase out
4 Conclusion
5 References and annexes
Development team...
Testing team...
Activity 2.6 Performance, servicing, maintenance, and phase out (Date / Initials)
Task 3.6.1 Performance and maintenance / Method 3.6.1 Peer review
Task 3.6.2 New versions (1. Version: 2. Version: 3. ...) / Method 3.6.2 Peer review (1. Action: 2. Action: 3. ...)
Task 3.6.3 Phase out / Method 3.6.3 Peer review
validation (where the development phase is irrelevant) it can at least be specified what the software is
purported to do based on actual and historical facts. The requirements should encompass everything
concerning the use of the software.
Topics 3.1.1 Requirements specification
• Objectives. Description of the software product to the extent needed for design, implementation, testing, and validation.
• Version of requirements. Version of, and changes applied to, the requirements specification.
• Input. All inputs the software product will receive. Includes ranges, limits, defaults, response to illegal inputs, etc.
• Output. All outputs the software product will produce. Includes data formats, screen presentations, data storage media, printouts, automated generation of documents, etc.
• Functionality. All functions the software product will provide. Includes performance requirements, such as data throughput, reliability, timing, user interface features, etc.
• Traceability. Measures taken to ensure that critical user events are recorded and traceable (when, where, whom, why).
• Hardware control. All device interfaces and equipment to be supported.
• Safety. All precautions taken to prevent overflow and malfunction due to incorrect input or use.
• Default settings. All settings applied after power-up, such as default input values, default instrument or program control settings, and options selected by default. Includes information on how to manage and maintain the default settings.
• Version control. How to identify different versions of the software product and to distinguish output from the individual versions.
• Dedicated platform. The hardware and software operating environment in which to use the software product, e.g. laboratory or office computer, the actual operating system, network, third-party executables such as Microsoft® Excel and Word, the actual version of the platform, etc.
• Installation. Installation requirements, e.g. installation kit, support, media, uninstall options, etc.
• Special requirements. Requirements the laboratory is committed to: security, confidentiality, change control and back-up of records, protection of code and data, precautions, risks in case of errors in the software product, etc.
• Documentation. Description of the modes of operation and other relevant information about the software product.
• User manual. User instructions on how to use the software product.
• On-line help. On-line Help provided by Windows programs.
• Validation report. Additional documentation stating that the software product has been validated to the extent required for its application.
• Service and maintenance. Documentation of service and support concerning maintenance, future updates, problem solutions, requested modifications, etc.
• Phase out. Documentation on how (and when) to discontinue the use of the software product, how to avoid impact on existing systems and data, and how to recover data.
The system acceptance test specification contains objective criteria on how the software product
should be tested to ensure that the requirements are fulfilled and that the software product performs as
required in the environment in which it will be used. The system acceptance test is performed after the
software product has been properly installed and thus is ready for the final acceptance test and
approval for use.
Topics 3.1.2 System acceptance test specification
• Objectives. Description of the operating environment(s) in which the software product will be tested and used.
• Scope. Scope of the acceptance test, e.g. installation and version, startup and shutdown, common, selected, and critical requirements, and areas not tested.
• Input. Selected inputs the software product must receive and handle as specified.
• Functionality. Selected functions the software product must perform as specified.
• Personnel. Description of operations the actual user(s) shall perform in order to make evident that the software product can be operated correctly as specified and documented.
• Errors and alarms. How to handle errors and alarms.
• Development plan. Development tools, manpower, and methods.
• Review and acceptance. How to review, test, and approve the design plan.
The design input phase establishes that the requirements can be implemented. Incomplete, ambiguous,
or conflicting requirements are resolved with those responsible for imposing these requirements. The
input design may be presented as a detailed specification, e.g. by means of flow charts, diagrams,
module definitions etc.
The design output must meet the design input requirements, contain or make references to acceptance
criteria, and identify those characteristics of the design that are crucial to the safe and proper func-
tioning of the product. The design output should be validated prior to releasing the software product
for final inspection and testing.
Topics 3.2.3 Design output
• Implementation (coding and compilation). Development tools used to implement the software, notes on anomalies, plan for module and integration test, etc.
• Version identification. How to identify versions on screen, printouts, etc. Example: “Version 1.0.0”.
• Inactive code. Inactive (dead) code left for special purposes.
• Documentation. Documentation provided as output from the Design Output section.
At appropriate stages of design, formal documented reviews and/or verifications of the design should
take place before proceeding with the next step of the development process. The main purpose of such
actions is to ensure that the design process proceeds as planned.
Topics 3.2.4 Design verification
• Review. Review the current development stage according to the design and development plan.
The Design Change section serves as an entry for all changes applied to the software product, including
software products subjected to retrospective validation. Minor corrections, updates, and en-
hancements that do not impact other modules of the program are regarded as changes that do not re-
quire an entire revalidation. Major changes are reviewed in order to decide the degree of necessary
revalidation or updating of the requirements and system acceptance test specification.
• Evaluation. Evaluation of the consequences of the change. (1. Description: 2. Description: 3. ...)
• Implementing. Implementing and verifying the change. (1. Action: 2. Action: 3. ...)
• Validation. The degree of revalidation or updating of requirements. (1. Action: 2. Action: 3. ...)
The test plan is created during the development or reverse engineering phase and identifies all elements
that are to be tested. The test plan should explicitly describe what to test, what to expect, and
how to do the testing. Subsequently it should be confirmed what was done, what the result was, and
whether the result was approved.
Topics 3.3.2 Test plan and performance (Date / Initials)
Test objectives: Description of the test in terms of what, why, and how.
Relevancy of tests: Relative to objectives and required operational use.
Scope of tests: In terms of coverage, volumes, and system complexity.
Levels of tests: Module test, integration test, and system acceptance test.
Types of tests: E.g. input, functionality, boundaries, performance, and usability.
Configuration tests: Platform, network, and integration with other systems.
Calculation tests: To confirm that known inputs lead to specified outputs.
Regression tests: To ensure that changes do not cause new errors.
Traceability tests: To ensure that critical events during use are recorded and traceable as required.
Special concerns: Testability, analysis, stress, reproducibility, and safety.
Acceptance criteria: When the testing is completed and accepted.
Action if errors: What to do if errors are observed.
Follow-up of tests: How to follow up the testing.
Result of testing: Approval of performed tests. Testing approved / Comments:
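A calculation test of the kind listed above can be automated so that it doubles as a regression test: the same known input/expected output pairs are re-run after every change. A minimal sketch, assuming a hypothetical dilution calculation as the function under test:

```python
# Hypothetical calculation under test: concentration after a dilution step.
def diluted_concentration(c0: float, v_sample: float, v_total: float) -> float:
    """Concentration after diluting v_sample of stock (c0) to v_total."""
    if v_total <= 0 or v_sample < 0:
        raise ValueError("volumes must be positive")
    return c0 * v_sample / v_total

# Known inputs and specified outputs taken from the test plan.  Re-running
# this table after every change turns the calculation test into a
# regression test as well.
CASES = [
    # (c0 mg/L, v_sample mL, v_total mL, expected mg/L)
    (10.0, 1.0, 10.0, 1.0),
    (5.0, 2.0, 4.0, 2.5),
]

for c0, vs, vt, expected in CASES:
    result = diluted_concentration(c0, vs, vt)
    assert abs(result - expected) < 1e-9, (c0, vs, vt, result)
print("all calculation tests passed")
```

Keeping the cases in a plain table makes it straightforward to document in the test plan what was tested, what was expected, and what the result was.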
3.4 Precautions
When operating in a third-party software environment, such as Microsoft® Windows and Office, some undesirable, inappropriate, or anomalous operating conditions may exist. A discrepancy between the description of the way an instrument should operate and the way it actually does may also be regarded as an anomaly. Minor errors in a software product may sometimes be acceptable if they are documented and/or properly circumvented.
Spreadsheet: Anomalous operating conditions in e.g. Excel.
Instruments: Anomalous operating conditions in the used instruments.
General precautions: Anomalous operating conditions associated with the software product itself.
The steps taken to work around anomalous, inappropriate, or undesired operating conditions are verified and tested.
Topics 3.4.2 Precautionary steps taken (Date / Initials)
Operating system: Precautionary steps taken in e.g. Windows settings.
Spreadsheet: Precautionary steps taken to work around problems using e.g. Excel.
Instruments: Precautionary steps taken to work around problems with the used instruments.
General precautions: Precautionary steps taken to work around problems with the software product itself.
Installed files: List of (relevant) installed files, e.g. EXE and DLL files, spreadsheet Add-ins and Templates, On-line Help, etc.
Supplementary files: Readme files, license agreements, examples, etc.
After installation the program is tested to an extent depending on the use of the product and the actual requirements, e.g. an adequate test following the validation test plan. It is sometimes advisable to carry out the installation testing in a copy of the true environment in order to protect original data from possible fatal errors caused by using a new program.
Topics 3.5.2 Installation procedure (Date / Initials)
Authorization: Approval of installation in actual environment. Person responsible:
Installation test: The following installations have been performed and approved...
  Tested and approved in a test environment
  Tested and approved in actual environment
  Completely tested according to test plan
  Partly tested (known extent of update)
  Comments:
The system acceptance test is carried out in accordance with the system acceptance test specifications
after installation. The software product may subsequently be approved for use.
Functional expansion and performance improvement: List of suggestions and requests that can improve the performance of the software product.
When a new version of the software product is taken into use, the effect on the existing system is carefully analyzed and the degree of revalidation decided. Special attention is paid to the effect on old spreadsheets when upgrading the spreadsheet package.
Topics 3.6.2 New versions (Date / Initials)
Description: Description of the new version to the extent needed to decide whether or not to upgrade.
  1. Version:
  2. Version:
  3. ...
Action: Action to be taken if upgrade is decided. See also the Design Changes section.
  1. Action:
  2. Action:
  3. ...
Consideration is given to how (and when) to discontinue the use of the software product. The potential impact on existing systems and data is examined prior to withdrawal.
Topics 3.6.3 Phase out (Date / Initials)
How and when: To discontinue the use of the software product.
Consequences: Assumed impact on existing systems and data, and how to avoid or reduce the harm.
4 Conclusion
The signatures below confirm that all validation activities are documented and approved.
Date: Signature:
Conclusion
All check boxes are locked for editing (to avoid inadvertent change of settings)
Comments:
Date: Signature:
Notice: Only technical reports with a bold number on the left leaf of the page can be ordered free of charge from the
Nordtest secretariat. Others have to be ordered from the publishing organisation or institute. Information for ordering
those reports can be obtained from Nordtest secretariat and Nordtest Web-site.
403 Holmgren, M., Observing validation, uncertainty determination and traceability in developing Nordtest test
methods. Espoo 1998. Nordtest, NT Techn Report 403. 12 p. NT Project No. 1277-96.
418 Views about ISO/IEC DIS 17025 - General requirements for the competence of testing and calibration
laboratories. Espoo 1999. Nordtest, NT Techn Report 418. 87 p. NT Project No. 1378-98.
419 Virtanen, V., Principles for measuring customer satisfaction in testing laboratories. Espoo 1999. Nordtest, NT Techn Report 419. 27 p. NT Project No. 1379-98.
420 Vahlman, T., Tormonen, K., Kinnunen, V., Jormanainen, P. & Tolvanen, M., On-site calibration of the continuous gas emission measurement methods at the power plant. Espoo 1999. Nordtest, NT Techn Report 420. 18 p. NT Project No. 1380-98.
421 Nilsson, A. & Nilsson, G., Ordering and reporting of measurement and testing assignments. Espoo 1999.
Nordtest, NT Techn Report 421. 7 p. NT Project No. 1449-99.
430 Rasmussen, S.N., Tools for the test laboratory to implement measurement uncertainty budgets. Espoo 1999.
Nordtest, NT Techn Report 430. 73 p. NT Project No. 1411-98.
431 Arnold, M., Round robin test of olfactometry. Espoo 1999. Nordtest, NT Techn Report 431. 13 p. NT Project No. 1450-99.
429 Welinder, J., Jensen, R., Mattiasson, K. & Taastrup, P., Immunity testing of integrating instruments. Espoo 1999.
Nordtest, NT Techn Report 429. 29 p. NT Project No. 1372-97.
443 Guttulsrød, G.F., Nordic interlaboratory comparison measurements 1998. Espoo 1999. Nordtest, NT Techn Report 443. 232 p. (in Dan/Nor/Swed/Engl) NT Project No. 1420-98.
452 Gelvan, S., A model for optimisation including proficiency testing in the chemical laboratories. Espoo 2000. Nordtest, NT Techn Report 452. 15 p. NT Project No. 1421-98.
501 Lau, P., Study to characterize thermal convection effects in water. Espoo 2003. Nordtest, NT Techn Report 501.
15 p. NT Project No. 1543-01.
503 Erikoinen, O., Kiiskinen, J. & Pajari, M., Interlaboratory comparison tests of reinforcing steel products. Espoo 2003. Nordtest, NT Techn Report 503. 31 p. NT Project No. 1446-99.
512 Merryl, J., Estimation of assigned values and their uncertainties for use in interlaboratory comparisons. Espoo
2003. Nordtest, NT Techn Report 512. 57 p. NT Project No. 1496-00.
513 Hovind, H., Severinsen, G. & Settergren-Sørensen, P., Nordic standard interface for transfer of data and graphics
between proficiency test webs and statistical software. Espoo 2003. Nordtest, NT Techn Report 513. 72 p. NT
Project No. 1542-01.
514 Petersen, L., Frølund, H. & Lorentzen, E., Qualification of personnel in laboratories, inspection and certification
bodies. Knowledge management in laboratories. Espoo 2003. Nordtest, NT Techn Report 514. 29 p. Appendix 1:
514_A1-Knowledge management (power point presentation) 10 slides. NT Project No. 1564-01.
533 Svensson, T., Holmgren, M., Johansson, K. & Johnson, E., Inter-laboratory comparison of fatigue test with evaluation of the participating laboratories' calculations of measurement uncertainty. Espoo 2003. Nordtest, NT Techn Report 533. 19 p. NT Project No. 1591-02.
535 Torp, C.E., Method of software validation. Espoo 2003. Nordtest, NT Techn Report 535. 31 p. NT Project No.
1594-02.
NORDTEST
TECHNICAL REPORT 535
Nordtest endeavours to
• promote viable industrial development and industrial competitiveness, remove technical barriers to trade and promote the concept "Approved Once Accepted Everywhere" in the conformity assessment area
• work for health, safety, environment in methods and standards
• promote Nordic interests in an international context and Nordic participation in European co-operation
• finance joint research in conformity assessment and the development and implementation of test methods
• promote the use of the results of its work in the development of techniques and products, for technology transfer, in setting up standards and rules and in the implementation of these
• co-ordinate and promote Nordic co-operation in conformity assessment
• contribute to the Nordic knowledge market in the field of conformity assessment and to further development of competence among people working in the field