
GUIDE FOR
INDEPENDENT SOFTWARE
VERIFICATION AND
VALIDATION

prepared by/préparé par: Produced under ESA contract 18466/04/NL/AG by Det Norske
Veritas A/S (N), supported by the subcontractors Critical Software (P), SciSys Ltd (GB)
and Terma A/S (DK)

reference/référence: ESA ISVV Guide
issue/édition: 1
revision/révision: 0
date of issue/date d'édition: November 22, 2005
status/état: Draft
Document type/type de document: Technical Note
Distribution/distribution:

ESTEC
Keplerlaan 1 - 2201 AZ Noordwijk - The Netherlands
Tel. (31) 71 5656565 - Fax (31) 71 5656040



Disclaimer:
This ISVV Guide is the first draft issue of the document. It is made available for use and review by the
European space industry. The ISVV Guide is provided as is: the Agency gives no warranty or guarantee
whatsoever as to its completeness, adequacy or suitability, and shall not be held liable for any direct,
indirect or consequential damages. Use of the ISVV Guide by readers/users is made fully at the latter's
own risk.

Feedback:
Readers/users of this guide are requested to provide comments to ESA on any inconsistencies in the
guide or suggestions for its improvement.

Comments shall be sent by e-mail to: [email protected]



APPROVAL

title/titre: ESA Guide for Independent Software Verification and Validation, issue 1 revision 0

author/auteur: Frode Høgberg (DNV), Editor          date: November 22, 2005

approved by/approuvé par: Kjeld Hjortnæs            date:

CHANGE LOG

reason for change/raison du changement   issue/issue   revision/revision   date/date
draft release                            draft         1                   15-Nov-2005

CHANGE RECORD

Issue: 1 Revision: 0

reason for change/raison du changement   page(s)/page(s)   paragraph(s)/paragraphe(s)


contents:
1.0 Introduction .................................................................................................... 1
1.1 Background and Motivation .............................................................................. 1
1.2 Purpose ............................................................................................................ 1
1.3 Definitions......................................................................................................... 1
1.4 Acronyms ......................................................................................................... 4
1.5 References ....................................................................................................... 6
1.6 Outline .............................................................................................................. 7
2.0 What is Independent Software Verification and Validation? ...................... 8
2.1 Types of Independence .................................................................................... 8
2.2 Objectives of ISVV ......................................................................................... 10
2.3 ISVV is Complementary to Developer’s V&V ................................................. 10
3.0 ISVV Process Overview ............................................................................... 12
4.0 ISVV Process Management ......................................................................... 15
4.1 Activity Overview ............................................................................................ 15
4.1.1 Roles and Responsibilities ......................................................................................... 16
4.1.1.1 Responsibilities of Software suppliers........................................................................ 17
4.1.1.2 Interface with Software Validation Facility supplier .................................................... 17
4.1.2 Criticality Analysis, Definition of Scope and Budgeting .............................................. 17
4.1.3 Scheduling and Milestones ........................................................................................ 19
4.1.4 Quality Management System ..................................................................................... 19
4.1.5 Non-Disclosure and Security...................................................................................... 20
4.1.6 Competence ............................................................................................................... 20
4.2 Activity Inputs and Prerequisites .................................................................... 21
4.2.1 Software Criticality Analyses ...................................................................................... 21
4.2.2 Documents and Code from Software Development ................................................... 21
4.2.3 ISVV Findings Resolution Report............................................................................... 22
4.3 Activity Outputs .............................................................................................. 22
4.3.1 ISVV Plan ................................................................................................................... 22
4.3.2 Requests for Clarification ........................................................................................... 22
4.3.3 ISVV Report (with ISVV Findings).............................................................................. 23
4.3.4 Progress Reports ....................................................................................................... 23
4.4 Activity Management ...................................................................................... 23
4.4.1 Initiating and Terminating Events ............................................................................... 23
4.4.2 Completion Criteria..................................................................................................... 24
4.4.3 Relations to other Activities ........................................................................................ 24
4.5 Task Descriptions ........................................................................................... 24
4.5.1 ISVV Process Planning .............................................................................................. 24
4.5.2 ISVV Process Execution, Monitoring and Control ...................................................... 25
4.6 Methods.......................................................................................................... 26
5.0 Criticality Analysis ....................................................................................... 27
5.1 Activity Overview ............................................................................................ 27
5.1.1 ISVV Levels and Software Criticality Categories........................................................ 29
5.1.2 Adjusting the ISVV Level............................................................................................ 31
5.1.3 Treatment of Diverse Criticality Categories................................................................ 32
5.2 Activity Inputs and Prerequisites .................................................................... 33
5.3 Activity Outputs .............................................................................................. 33


5.4 Activity Management ...................................................................................... 34


5.4.1 Initiating and Terminating Events ............................................................................... 34
5.4.2 Completion Criteria..................................................................................................... 34
5.4.3 Relations to Other Activities ....................................................................................... 34
5.5 Task Descriptions ........................................................................................... 34
5.5.1 System Level Software Criticality Analysis................................................................. 34
5.5.2 Software Technical Specification Criticality Analysis ................................................. 35
5.5.3 Software Design Criticality Analysis ........................................................................... 36
5.5.4 Software Code Criticality Analysis.............................................................................. 37
5.6 Methods.......................................................................................................... 38
6.0 Technical Specification Analysis ................................................................ 40
6.1 Activity Overview ............................................................................................ 40
6.1.1 Requirements traceability verification......................................................................... 41
6.1.2 Software requirements verification ............................................................................. 42
6.2 Activity Inputs and Prerequisites .................................................................... 42
6.3 Activity Outputs .............................................................................................. 43
6.4 Activity Management ...................................................................................... 43
6.4.1 Initiating and Terminating Events ............................................................................... 43
6.4.2 Completion Criteria..................................................................................................... 43
6.4.3 Relations to other Activities ........................................................................................ 43
6.5 Task Descriptions ........................................................................................... 43
6.5.1 Requirements Traceability Verification....................................................................... 43
6.5.2 Software Requirements Verification ........................................................................... 45
6.6 Methods.......................................................................................................... 46
7.0 Design Analysis............................................................................................ 48
7.1 Activity Overview ............................................................................................ 48
7.1.1 Software Architectural Design Independent Verification ............................................ 49
7.1.2 Software Detailed Design Independent Verification ................................................... 50
7.1.3 Software User Manual Analysis ................................................................................. 51
7.2 Activity Inputs and Prerequisites .................................................................... 52
7.3 Activity Outputs .............................................................................................. 52
7.4 Activity Management ...................................................................................... 52
7.4.1 Initiating and Terminating Events ............................................................................... 52
7.4.2 Completion Criteria..................................................................................................... 53
7.4.3 Relations to other Activities ........................................................................................ 53
7.5 Tasks Description ........................................................................................... 54
7.5.1 Architectural Design Traceability Verification ............................................................. 54
7.5.2 Architectural Design Verification ................................................................................ 55
7.5.3 Detailed Design Traceability Verification.................................................................... 57
7.5.4 Detailed Design Verification ....................................................................................... 59
7.5.5 Software User Manual Analysis ................................................................................. 61
7.6 Methods.......................................................................................................... 62
8.0 Code Analysis............................................................................................... 64
8.1 Activity Overview ............................................................................................ 64
8.1.1 Software Code Analysis ............................................................................................. 65
8.1.2 Integration and Unit Test Procedures and Data Analysis........................................... 66
8.2 Activity Inputs and Prerequisites .................................................................... 66
8.3 Activity Outputs .............................................................................................. 67
8.4 Activity Management ...................................................................................... 67
8.4.1 Initiating and Terminating Events ............................................................................... 67


8.4.2 Completion Criteria..................................................................................................... 68


8.4.3 Relations to other Activities ........................................................................................ 68
8.5 Tasks Description ........................................................................................... 69
8.5.1 Source Code Traceability Verification ........................................................................ 69
8.5.2 Source Code Verification............................................................................................ 71
8.5.3 Integration Test Procedures and Test Data Verification............................................. 73
8.5.4 Unit Test Procedures and Test Data Verification ....................................................... 74
8.6 Methods.......................................................................................................... 75
9.0 Independent Validation ................................................................................ 77
9.1 Activity Overview ............................................................................................ 77
9.1.1 Identification of Test Cases ........................................................................................ 78
9.1.1.1 Evaluate Task Input.................................................................................................... 79
9.1.1.2 Perform Analysis ........................................................................................................ 79
9.1.1.3 Writing the Independent Validation Test Plan ............................................................ 80
9.1.2 Construction of Test Procedures................................................................................ 81
9.1.2.1 Achieve Knowledge about the Software Validation Facility........................................ 81
9.1.2.2 Implementation of Test Case into Test Procedures ................................................... 81
9.1.2.3 Updating the Independent Validation Test Plan ......................................................... 81
9.1.3 Execution of Test Procedures .................................................................................... 81
9.1.3.1 Execute the Test Procedures ..................................................................................... 82
9.1.3.2 Investigation of Failed Tests....................................................................................... 82
9.1.3.3 Produce Test Reports ................................................................................................ 82
9.2 Activity Input and Prerequisite ........................................................................ 83
9.2.1 Activity Inputs ............................................................................................................. 83
9.2.2 Activity Prerequisites .................................................................................................. 83
9.3 Activity Outputs .............................................................................................. 84
9.4 Process Management..................................................................................... 84
9.4.1 Initiating and Terminating Events ............................................................................... 84
9.4.2 Completion Criteria..................................................................................................... 84
9.4.3 Relations to other Activities ........................................................................................ 84
9.5 Task Descriptions ........................................................................................... 85
9.5.1 Identification of Test Cases ........................................................................................ 85
9.5.2 Construction of Test Procedures................................................................................ 87
9.5.3 Execution of Test Procedures .................................................................................... 88
9.6 Methods.......................................................................................................... 89
10.0 Annex A – ISVV Plan Outline....................................................................... 90
11.0 Annex B – Review Item Discrepancy Form Example ................................ 94
12.0 Annex C – Error Potential Questionnaire ................................................... 96
13.0 Annex D – Software Criticality Categories [ECSS-Q-80-03]...................... 98
Annex E – Procedures for Performing Simplified FMECA ................................... 100
13.1 System FMECA ............................................................................................ 100
13.2 Software Requirements FMECA .................................................................. 100
14.0 Annex F – Methods..................................................................................... 101
14.1 Formal Methods ........................................................................................... 101
14.2 Hardware Software Interaction Analysis....................................................... 101
14.3 Inspection ..................................................................................................... 102
14.4 Modelling ...................................................................................................... 103
14.4.1 Data Flow Analysis................................................................................................... 104


14.4.2 Control Flow Analysis............................................................................................... 105


14.5 Real-Time Properties Verification ................................................................. 105
14.5.1 Schedulability Analysis............................................................................................. 105
14.5.2 Worst Case Execution Time Computation ............................................................... 105
14.6 Simulation (Design execution)...................................................................... 107
14.7 Software Common Cause/Mode Failure Analysis (SCCFA/SCMFA) ........... 107
14.8 Software Failure Modes, Effects and Criticality Analysis (SFMECA)............ 107
14.9 Software Fault Tree Analysis (SFTA) ........................................................... 107
14.10 Static Code Analysis..................................................................................... 108
14.10.1 Coding Standard Conformance................................................................................ 108
14.10.2 Bug Pattern Identification ......................................................................................... 108
14.10.3 Software Metrics Analysis ........................................................................................ 108
14.11 Traceability Analysis..................................................................................... 108
14.12 Walkthrough ................................................................................................. 109
15.0 Annex G – Checklists................................................................................. 110
15.1 Generic Document Review Checklists.......................................................... 110
15.2 Requirements Review Checklists ................................................................. 111
15.3 Architectural Design Review Checklist ......................................................... 112
15.4 Detailed Design Review Checklist................................................................ 113
15.5 Code Inspection Checklist ............................................................................ 114
15.6 Test Plan Review Checklist .......................................................................... 115
15.7 HSIA Checklist ............................................................................................. 115
15.8 Validation Checklist ...................................................................................... 118
16.0 Annex H – Software Validation Facility .................................................... 119


figures:
Figure 1: ISVV Process Activities.............................................................................................. 12
Figure 2: Software Engineering and ISVV Processes............................................................... 13
Figure 3: ISVV process management in context....................................................................... 15
Figure 4: ISVV Process Management Tasks ............................................................................ 16
Figure 5: ISVV cost model ........................................................................................................ 18
Figure 6: Criticality Analysis in context...................................................................................... 27
Figure 7: ISVV Criticality Analysis Tasks .................................................................................. 28
Figure 8: Visualisation of a software architecture with diverse category levels assigned ......... 32
Figure 9: Technical Specification Analysis in context .............................................................. 40
Figure 10: Technical Specification Analysis activity.................................................................. 41
Figure 11: Software Requirements Independent Verification.................................................... 42
Figure 12: Design Analysis in context ...................................................................................... 48
Figure 13: Software Design Analysis ........................................................................................ 49
Figure 14: Software Architectural Design Independent Verification.......................................... 50
Figure 15: Software Detailed Design Independent Verification................................................. 51
Figure 16: Software User Manual Independent Verification...................................................... 51
Figure 17: Code Analysis in context.......................................................................................... 64
Figure 18: Code Analysis .......................................................................................................... 65
Figure 19: Software Source Code Independent Verification ..................................................... 66
Figure 20: Integration/Unit Test Procedures and Test Data Verification................................... 66
Figure 22: Independent Software Validation ............................................................................. 78
Figure 23: Subtasks to "Identification of Test Cases" ............................................................... 79
Figure 24: Subtasks to "Construction of Test Procedures" ....................................................... 81
Figure 25: Subtasks to "Execution of Test Procedures" ........................................................... 82


tables:
Table 1: Competence requirements for ISVV personnel........................................................... 20
Table 2: ISVV levels.................................................................................................................. 29
Table 3: Default mapping from Software Criticality Category to ISVV level.............................. 31
Table 4: Matrix to derive ISVV level from Software Criticality Category and Error Potential..... 31
Table 5: Dependency between ISVV level, input and analysis ................................................. 80
Table 6: RID Form..................................................................................................................... 94
Table 7: RID Problem Type Categories .................................................................................... 95
Table 8: RID Severity Classes .................................................................................................. 95
Table 9: Error Potential Questionnaire...................................................................................... 96
Table 10: Mapping from error potential score to error potential level........................................ 97
Table 11: Software criticality categories for manned mission ................................................... 98
Table 12: Software criticality categories for unmanned mission ............................................... 98
Table 13: System reliability criticality categories....................................................................... 99
Table 14: System hazard severity categories ........................................................................... 99
Table 15: UML 2 diagram types .............................................................................................. 104
Table 16: UML 2.0 to UML 1.x mapping ................................................................................. 104


Foreword
This ISVV Guide is the result of work carried out for the European Space Agency by a
consortium of European companies under ESA contract no. 18466/04/NL/AG. The companies
involved were:
• Det Norske Veritas (N)
• Terma (DK)
• SciSys (UK)
• Critical Software (P)

The following persons have contributed to the guide:


• Benedikte Larsen (Terma)
• Frode Høgberg (DNV)
• João Esteves (Critical Software)
• Kjeld Hjortnæs (ESA)
• Leslie Baldwin (SciSys)
• Maria Hernek (ESA)
• Narve Mjøs (DNV)
• Nuno Silva (Critical Software)
• Patricia Rodriguez Dapena (SoftWcare)
• Philippe Robert (ISOSCOPE)
• Poul Hougaard (Terma)
• Ricardo Maia (Critical Software)
• Roger Ward (SciSys)
• Sabine Krüger (ESA)
• Siegfried Eisinger (DNV)

In addition, representatives of primes have contributed valuable input during dedicated
industry workshops.


1.0 Introduction
1.1 Background and Motivation
Independent Software Verification and Validation (ISVV) is an engineering practice intended to
improve the quality and reduce the costs of a software product, as well as to reduce development
risks, by having an organisation independent of the software developer perform verification and
validation of the specifications and code of the software product.

The overall objective of this guide is to help establish an improved and coherent ISVV
process across the European space industry by consolidating existing practice. Special
emphasis is placed on process efficiency. It is hoped that the guide will also prove
useful in other industries where software is a component of safety- and dependability-critical
systems (e.g. automotive, rail, medical systems).

The guide defines an ISVV process with management, criticality analysis, verification, and
validation activities. It provides advice on ISVV roles, responsibilities, planning, and
communication as well as methods to use for the various verification and validation tasks.
1.2 Purpose
The purpose of this guide is to:
• Define a uniform, cost-effective and reproducible ISVV process across projects, and to guide
its adaptation to each specific project;
• Assist the industry in getting predictable cost and quality out of the ISVV process;
• Clarify the benefits of applying ISVV;
• Improve ISVV project execution by highlighting the many different issues that need to be
clarified and considered in the various phases of the project;
• Disseminate best practices with respect to recommended methods for the different
verification and validation activities;
• Present a summary of the required capabilities of the independent SVF in preparation for the
development and utilisation of a specific one for each project.

The assumed readership of the ISVV Guide is primarily the customers and suppliers of ISVV
services, but software developers, system suppliers (primes) and system customers are also
likely to find the guide useful, be they verification/validation personnel, quality assurance
managers or technical managers.

The guide should be used in the preparation of a request for quotation for an ISVV service, in
the preparation of a bid, and during the planning, execution and re-planning of an ISVV project.
1.3 Definitions
The definitions presented herein are provided for the readability of this document. These
definitions prevail when any discrepancy occurs with other standards' definitions.

activity: A defined body of work to be performed, including its required input and
output information. [IEEE 1074:1997]

critical item: Component, material, software, sub-assembly, function, process or
technology which requires special project attention [ECSS-P-001B:2004].
NOTE: In this document, critical item is used as a common term denoting a
critical system function, critical software requirement, critical software
component, or critical software unit.
critical software components list: List of critical software components as determined by the
Design Analysis Criticality Analysis (ISVV task), with assigned software criticality
categories and ISVV levels.

critical software requirements list: List of critical software requirements as determined by the
Technical Specification Analysis Criticality Analysis (ISVV task), with assigned software
criticality categories and ISVV levels.

critical software units list: List of critical software units as determined by the Code Analysis
Criticality Analysis (ISVV task), with assigned software criticality categories and ISVV levels.

critical system functions list: List of critical system functions as determined by system-level
safety and dependability analyses, with assigned software criticality categories and ISVV
levels.
criticality: A measure of the consequence of an undesirable event.
NOTE: The consequence may be in terms of safety, dependability,
maintainability, security, environmental impact, economic impact, etc.

dependability: Collective term used to describe the availability performance and its
influencing factors: reliability performance, maintainability performance and
maintenance support performance.
NOTE: Dependability is used only for general descriptions in non-quantitative terms.
[ISO 9000:2000] in [ECSS-P-001B:2004]

error potential level: An assessment of the potentially negative impact of characteristics of the
development organisation, the development process or the software itself on
software quality.

ISVV customer: An organisation or person that receives an ISVV service. The ISVV customer
is one of the two parties to an ISVV contract, the other being the ISVV supplier.
The ISVV customer is usually either a system supplier (system integrator, prime,
software customer) or a system customer (system owner).
NOTE: See also the definition of customer in [ECSS-P-001B:2004].

ISVV level: A number on an ordinal scale assigned to a software component or function
to designate the required level of verification and validation to apply to that
component or function. The ISVV level ranges from 0 to 2, with 0 meaning
no ISVV, 1 meaning light ISVV, and 2 meaning rigorous ISVV.

ISVV supplier: An organisation or person that provides an ISVV service. The ISVV supplier
is one of the two parties to an ISVV contract, the other being the ISVV customer.
The ISVV supplier must have full technical, managerial, and financial independence
with respect to the ISVV customer (in some cases independence is reduced; there
can be various degrees of independence).
NOTE: See also the definition of supplier in [ECSS-P-001B:2004].

item: Item is used as a common term denoting a system function, software
requirement, software component, or software unit.

process: Set of interrelated or interacting activities which transform inputs into outputs.
NOTE 1: Inputs to a process are generally outputs of other processes.
NOTE 2: Processes in an organization are generally planned and
carried out under controlled conditions to add value.
NOTE 3: A process where the conformity of the resulting product
cannot be readily or economically verified is frequently
referred to as a "special process".
[ISO 9000:2000] in [ECSS-P-001B:2004]


safety: System state where an acceptable level of risk is not exceeded with respect to:
- fatality,
- injury or occupational illness,
- damage to launcher hardware or launch site facilities,
- damage to an element of an interfacing manned flight system,
- the main functions of a flight system itself,
- pollution of the environment, atmosphere or outer space, and
- damage to public or private property.
NOTE 1: The term "safety" is defined differently in ISO/IEC Guide 2 as
"freedom from unacceptable risk of harm".
[ECSS-P-001B:2004]
software: See 'software product'.

software component: Part of a software system.
NOTE 1: Software component is used as a general term.
NOTE 2: Components can be assembled and decomposed to form new
components. In the production activities, components are implemented as
modules, tasks or programs, any of which can be configuration items. This
usage of the term is more general than in ANSI/IEEE parlance, which defines
a component as a "basic part of a system or program"; in this Standard,
components are not always "basic" as they can be decomposed.
[ECSS-E-40B:2003]

software critical item list: A general term covering the critical system functions list, critical
software requirements list, critical software components list and critical software units list.

software criticality analysis: An analysis resulting in the definition of a software critical item
list; it is carried out with the purpose of defining the scope of ISVV. Criticality is related to
safety and dependability but may refer to, for example, security, maintainability or any other
property defined by the ISVV customer.
NOTE: This definition deviates from the usage (there is no definition) in
[ECSS-E-40B:2003] and [ECSS-Q-80B:2003], where software criticality
analysis is a safety and dependability analysis and a requirement for all
software.

software criticality category: A number or letter designating the criticality of a failure mode or
an item. Software criticality categories are defined as part of a software criticality scheme.

software criticality scheme: The definition of a set of software criticality categories used for a
specific project or purpose. The categories are ordered from low to high criticality.
NOTE 1: There are usually 4 or 5 software criticality categories in a software
criticality scheme.
NOTE 2: In space projects, software criticality categories are usually named A
to D, with A being the most critical. For ISVV, software criticality categories
are numbered 1 to 4. This numerical scale will normally correspond to the
alphabetical software criticality categories so that 4 is equivalent to A, 3 to B,
2 to C, and 1 to D. However, the software criticality categories assigned to
functions, software products, software requirements, software components,
and software units for the purposes of ISVV may differ from those
assigned for development. The two scales are intended to avoid confusion.
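
For illustration only, the following minimal Python sketch captures the correspondence between the two category scales described in NOTE 2 above, together with the ISVV levels (0 to 2) defined earlier in this section. The enum, dictionary and helper names are illustrative assumptions, not part of the guide:

from enum import IntEnum

class IsvvLevel(IntEnum):
    """ISVV levels as defined in this guide: 0 = no ISVV, 1 = light, 2 = rigorous."""
    NONE = 0
    LIGHT = 1
    RIGOROUS = 2

# Correspondence between the numerical ISVV categories (1-4) and the usual
# alphabetical space-project categories (A-D): 4 <-> A, 3 <-> B, 2 <-> C, 1 <-> D.
CATEGORY_TO_LETTER = {4: "A", 3: "B", 2: "C", 1: "D"}
LETTER_TO_CATEGORY = {letter: number for number, letter in CATEGORY_TO_LETTER.items()}

def category_from_letter(letter: str) -> int:
    """Convert an alphabetical category (A-D) to the ISVV numerical scale (4-1)."""
    return LETTER_TO_CATEGORY[letter.upper()]

assert category_from_letter("a") == 4   # A is the most critical category
assert CATEGORY_TO_LETTER[1] == "D"     # 1 is the least critical category

The default mapping from software criticality category to ISVV level is given in Table 3 of this guide and is project-adjustable, so it is deliberately not reproduced in this sketch.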
software product: Set of computer programs, procedures, documentation and their associated
data [ECSS-E-40B:2003].
NOTE: Software and software item are synonyms of software product.


software unit: Separately compilable piece of source code.
NOTE: In this Standard no distinction is made between a software unit and a
database; both are covered by the same requirement.
[ECSS-E-40B:2003]

test case: A specification of a test in terms of:
- a description of the purpose of the test,
- preconditions (e.g., the state of the software under test),
- actions, and
- postconditions (expected reactions).
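
A minimal sketch of how a test case with this structure could be represented, again in Python; the class and field names are illustrative assumptions, not part of the guide:

from dataclasses import dataclass, field

@dataclass
class TestCase:
    """A test case per the definition above: purpose, preconditions,
    actions, and expected postconditions."""
    purpose: str                                               # what the test is intended to show
    preconditions: list[str] = field(default_factory=list)    # e.g. required state of the software under test
    actions: list[str] = field(default_factory=list)          # stimuli applied during the test
    postconditions: list[str] = field(default_factory=list)   # expected reactions

# Hypothetical example: a robustness-oriented test case.
tc = TestCase(
    purpose="Verify rejection of an out-of-range telecommand parameter",
    preconditions=["Software initialised and in nominal mode"],
    actions=["Send a telecommand with a parameter outside its valid range"],
    postconditions=["Command rejected", "Error event reported", "Nominal mode maintained"],
)

A test procedure, as defined next, would then instantiate such a test case in the specific test language of the validation facility.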
test procedure: The instantiation of a test case, using a specific test language.

validation: Confirmation, through the provision of objective evidence, that the
requirements for a specific intended use or application have been fulfilled.
NOTE: The validation process (for software) is the process to confirm that the
requirements baseline functions and performances are correctly and
completely implemented in the final product.
[ECSS-E-40B:2003]

verification: Confirmation, through the provision of objective evidence, that the specified
requirements have been fulfilled.
NOTE: The verification process (for software) is the process to confirm that
adequate specifications and inputs exist for any activity, and that the outputs
of the activities are correct and consistent with the specifications and inputs.
[ECSS-E-40B:2003]

1.4 Acronyms
AR Acceptance Review
B The B method (Formal Methods)
CA Code Analysis
CAR Code Analysis Review
CCS Calculus of Communicating Systems (Formal Methods)
CDR Critical Design Review
CFL Critical Function List
CR Criticality Analysis
CSP Communicating Sequential Processes (Formal Methods)
DAR Design Analysis Review
DDF Design Definition File
DDR Detailed Design Review
DJF Design Justification File
ESA European Space Agency
FDIR Fault Detection, Isolation and Recovery
FMEA Failure Mode and Effects Analysis
FMECA Failure Modes, Effects, and Criticality Analysis
FSM Finite State Machines
FTA Fault Tree Analysis
HOOD Hierarchical Object Oriented Design
HRT-HOOD Hard Real-Time HOOD


HSIA Hardware Software Interaction Analysis


HW Hardware
ICD Interface Control Document
ISVV Independent Software Verification and Validation
ISVVF ISVV Facility
ISVVL ISVV Level
IV&V Independent Verification and Validation. Synonym for ISVV. IV&V is the
acronym used by NASA and in [IEEE 1012:1998]
IVA Independent Validation
IVE Independent Verification
IVR Independent Validation Review
IVTP Independent Validation Test Plan
IVTPR Independent Validation Test Plan Review
LOTOS Language Of Temporal Ordering Specification (Formal Methods)
MISRA C Motor Industry Software Reliability Association (C language)
NASA National Aeronautics and Space Administration
NDA Non-Disclosure Agreement
P&F Process & Facility
PAF Product Assurance File
PDR Preliminary Design Review
PN Petri Nets (Formal Methods)
QR Qualification Review
RAISE Rigorous Approach for Industrial Software Engineering (Formal Methods)
RAVEN Reliable Ada Verifiable Executive Needed
RB Requirements Baseline
RID Review Item Discrepancy
SCC Software Criticality Category
SDL Specification and Description Language (Formal Methods)
SFMECA Software Failure Mode Effects and Criticality Analysis
SFTA Software Fault Tree Analysis
SPR Software Problem Report
SRR System Requirements Review
SUM Software User Manual
SVF Software Validation Facility
SW Software
TA Technical Specification Analysis
TAR Technical Specification Analysis Review
TS Technical Specification
UML Unified Modelling Language
V&V Verification and Validation
VDM Vienna Development Method (Formal Methods)
WBS Work Breakdown Structure


Z The Z method or Z Notation (Formal Methods)

1.5 References
[AFISC:1985] Software System Safety, AFISC SSH 1-1, Headquarters Air Force
Inspection and Safety Center, 5 September 1985
[ARTHUR:1999] James D. Arthur, Markus K. Gröner, Kelly J. Hayhurst, and C.
Michael Holloway, “Evaluating the Effectiveness of Independent
Verification and Validation,” IEEE Computer, October 1999.
[AUDSLEY:1991] Hard Real-Time Scheduling: the Deadline-Monotonic Approach, N.
C. Audsley, A. Burns, M. F. Richardson, and A. J. Wellings, IEEE
Workshop on Real-Time Operating Systems, 1991
[BS 7799-2:2002] BS 7799 Part 2, Specification for information security management
systems, 2002.
[BURNS:1993] HRT-HOOD: A Structured Design Method for Hard Real-Time Ada
Systems, A. Burns, A. Wellings, University of York, Version 2.0
Reference Manual, September 1993
[DETECT:1995] Comparing Detection Methods For Software Requirements
Inspections, IEEE Transactions on Software Engineering, 06/1995.
[DNV ISVV:1992] Sven-Arne Solnørdal, Torbjørn Skramstad, and Jan Tore
Henriksen, Presentation of the ISVV Concept, ESSDE Reference
Facility Project (ESA 8900/90/NL/US(SC)), Doc.Ref.
ESSDE/MISC/B31, DNV Technical Report, April 9, 1992.
[DO-178B:1992] RTCA, DO-178B: Software Considerations in Airborne Systems
and Equipment Certification, December 1992.
[ECSS-E-40B:2003] ECSS, ECSS-E-40 Part 1B, Space engineering, Software – Part 1:
Principles and requirements, 28 November, 2003.
[ECSS-M-00-03B:2004] ECSS, ECSS-M-00-03B, Space project management, Risk
Management, 16 August, 2004.
[ECSS-P-001B:2004] ECSS, ECSS-P-001B, Glossary of terms, 14 July 2004.
[ECSS-Q-40B:2002] ECSS, ECSS-Q-40B, Space product assurance - Safety, 17 May
2002.
[ECSS-Q-80-03d:2004] ECSS, ECSS-Q-80-03 Draft 1, Space product assurance, Methods
and techniques to support the assessment of software
dependability and safety, 8 April 2004.
[ECSS-Q-80B:2003] ECSS, ECSS-Q-80B, Space product assurance, Software product
assurance, 10 October, 2003.
[EN 50128:1997] CENELEC, EN 50128: Railway Applications: Software for Railway
Control and Protection Systems, 1997.
[HRTOSK:1991] Hard Real-Time Operating System Kernel: Overview and Selection
of Hard Real-Time Scheduling Model, British Aerospace and
University of York, ESTEC Contract “HRTOSK” - Task 1 Report,
1991
[IEC 60880:1986] IEC, IEC 60880: Software for Computers in Safety Systems of
Nuclear Power Stations, 1986.
[IEC 61508-1:1998] IEC, IEC 61508: Functional safety of
electrical/electronic/programmable electronic safety-related
systems – Part 1: General requirements, First Edition,1998.


[IEEE 1012:1998] IEEE, IEEE Standard 1012: IEEE Standard for Software Verification
and Validation, 1998.
[IEEE 1074:1997] IEEE, IEEE Standard 1074: IEEE Standard for Developing
Software Life Cycle Activities, 1997.
[INSPEC:1976] Design and Code Inspections to Reduce Errors in Program
Development, IBM Systems Journal, Vol. 15 No. 3, 1976.
[ISO 9000:2000] ISO, ISO 9000: Quality management systems – Fundamentals and
vocabulary, 2000.
[ISVV TN2:2005] ISVV Process and Facility, TN 2 - Methods And Tools Tradeoff
Analysis, DNV Report No.: 2005-1033, Rev. 1.2, 28 August 2005.
[LEVESON:1987] Safety Analysis Using Petri Nets, Leveson, Nancy G., Janice L.
Stolzy, IEEE Transactions on Software Engineering, Vol. SE-13,
No. 3, The Institute of Electrical and Electronics Engineers, March
1987
[NASA IV&V] NASA, Software Independent Verification and Validation
(IV&V)/Independent Assessment (IA) Criteria,
http://ivvcriteria.ivv.nasa.gov.
[NIST5589:1995] A Study on Hazard Analysis in High Integrity Software Standards
and Guidelines, U.S. Department of Commerce, Technology
Administration, National Institute of Standards and Technology,
January 1995
[PASCON WO12-TN2.1:2000] RAMS related static methods, techniques and procedures
concerning software, Issue 1.0, 2 May 2000.

1.6 Outline
The document consists of the following sections:
• The introduction, of which this outline is a part, describes the background and motivation for
ISVV as well as the purpose of the ISVV guide.
• Section 2.0 elaborates on the topic of ISVV, describing types of independence, the
objectives of ISVV as well as its relationship to development verification and validation.
• Section 3.0 provides an overall view of the ISVV process.
• Section 4.0 describes the ISVV process management activity, detailing ISVV roles,
responsibilities, tasks and other aspects of management.
• Section 5.0 describes the Criticality Analysis activity, and how Criticality Analysis can be
used to identify the scope of ISVV.
• Sections 6.0, 7.0, 8.0 describe the verification activities of Technical Specification Analysis,
Design Analysis, and Code Analysis respectively.
• Section 9.0 describes the Independent Validation Activity.
• Finally, there are a number of annexes providing more detailed information related to the
various ISVV activities.


2.0 What is Independent Software Verification and Validation?


2.1 Types of Independence
Independence is seen as very important in many areas of society: in the governance systems of
democratic countries (the separation of power into executive, legislative, and judicial
branches), within the judicial system itself (requirements on the competence of judges, jurors,
and specialist witnesses), in financial auditing, in all types of product, process or personnel
certification, in ship classification, in other types of safety assessments required by public
authorities, and in the independent second opinion provided by medical experts, as well as in
many other institutions.

In the context of ISVV, independence is intended to introduce three important benefits:


• Separation of concerns. Any person or organisation is likely to discover that their activity
inevitably produces conflicting demands and interests. Clearly separating roles and
responsibilities ensures that such conflicts do not arise, and also gives other stakeholders
confidence that this is the case.
• A different view. All persons have a limited horizon of understanding within which texts (both
written and oral) are interpreted and produced. A second opinion complements this view
by identifying omissions, ambiguities, factual errors, logic errors, etc.
• Effectiveness and productivity in verification and validation activities. Staff specialised in
independent software verification and validation develop technical competence and
motivation that should lead to more effective and productive work. This is especially the
case with verification and validation methods that necessitate the application of sophisticated
tools.

ISVV implies independence of verification and validation from development. Such
independence is a requirement in legal regulations and software safety standards within the rail
industry [EN 50128:1997], the nuclear industry [IEC 60880:1986], and the aviation
industry [DO-178B:1992]. There are different types of independence, and the degree of
independence may vary.

ISVV also implies verification and validation additional and complementary to that carried out by
the software developer. Research has shown that such independence in verification and
validation of software produces better software for less money [ARTHUR:1999].

The fundamental idea of independence is that some verification and validation activities are
carried out by a person other than the person responsible for (the development/design of) the
product or process being verified. Independence is strengthened by increasing the emotional
and organisational distance between the developer and the verifier. Many safety-related
standards (e.g. [IEC 61508-1:1998]) thus distinguish between:
• independent person,
• independent department, and
• independent organisation.

The higher the criticality of the system (and the software), the more independence is required.
The independent person may belong to the same department as the writer/developer, but
should not have been involved in writing the specification or the code. This is the minimum level
of independence, frequently used for document reviews or desk checking within most
companies. The independent department requires verification to be carried out by people from
a different department within the same organisation. The department could be the quality
assurance department, or a department dedicated to V&V on a specific project. For two
organisations to be independent they must be different legal entities with different management


groups and preferably different owners. This level of independence is required for auditors in
financial auditing as well as various types of third party certification.

Two of the main benefits of independence mentioned above are that it provides a different
point of view and a separation of concerns. The IEEE Standard for Software Verification and
Validation [IEEE 1012:1998] distinguishes between different types of independence addressing
these concerns:
• technical independence,
• managerial independence, and
• financial independence.

The definitions used by the IEEE standard are as follows:

• Technical independence requires the V&V effort to utilize personnel who are not involved in
the development of the software. The IV&V effort must formulate its own understanding of
the problem and how the proposed system is solving the problem. Technical independence
("fresh viewpoint") is an important method to detect subtle errors overlooked by those too
close to the solution. For software tools, technical independence means that the IV&V effort
uses or develops its own set of test and analysis tools separate from the developer's tools.
Sharing of tools is allowable for computer support environments (e.g., compilers,
assemblers, utilities) or for system simulations where an independent version would be too
costly. For shared tools, IV&V conducts qualification tests on tools to ensure that the
common tools do not contain errors which may mask errors in the software being analyzed
and tested.
• Managerial independence requires that the responsibility for the IV&V effort be vested in an
organization separate from the development and program management organizations.
Managerial independence also means that the IV&V effort independently selects the
segments of the software and system to analyze and test, chooses the IV&V techniques,
defines the schedule of IV&V activities, and selects the specific technical issues and
problems to act upon. The IV&V effort provides its findings in a timely fashion simultaneously
to both the development and program management organizations. The IV&V effort must be
allowed to submit to program management the IV&V results, anomalies, and findings without
any restrictions (e.g., without requiring prior approval from the development group) or
adverse pressures, direct or indirect, from the development group.
• Financial independence requires that control of the IV&V budget be vested in an
organization independent of the development organization. This independence prevents
situations where the IV&V effort cannot complete its analysis or test or deliver timely results
because funds have been diverted or adverse financial pressures or influences have been
exerted.

The primary purpose of technical independence is thus to ensure a different point of view, while
the purpose of managerial and financial independence is separation of concerns. An
independent person may be sufficient to achieve technical independence. However, being part
of the same technical culture may still lead to basic assumptions being unquestioned.
Managerial independence requires at least an independent department. The same is true for
financial independence. However, for all of these types of independence, an increased
organisational distance will increase the independence and thus reduce the risk of assessments
being unduly influenced by non-relevant (to the verification/validation objective) concerns.

In the European space industry, full technical, managerial and financial independence is required
for ISVV of critical software. The ISVV supplier is required to be an organisation independent of
the software supplier as well as of the prime (system integrator).


In European space projects, the ISVV customer has traditionally been either the prime or the
end customer. The prime should not be the ISVV customer if the prime itself (or any of its
subsidiaries) is also developing software subject to ISVV.

The recommendation of this ISVV guide is that the ISVV supplier should be a fully independent
company and that the ISVV customer should be the end customer or the prime (unless the
prime is developing software subject to ISVV).

2.2 Objectives of ISVV


It is ultimately up to the ISVV customer to decide what the objective of ISVV should be.

As with any verification and validation activity, the objectives of ISVV are to find faults and to raise confidence in the software subject to the ISVV process. The emphasis placed on each of these objectives may vary, depending on the maturity of the software, budget, time, the maturity of the software supplier, the complexity of the software product, and the distribution of responsibility between the software developer’s V&V and the ISVV supplier’s V&V.

The effectiveness of the ISVV process is evident when faults are actually found. However, if faults continue to be found in large numbers, the software must be considered immature and cannot be trusted. If, on the other hand, no faults are found, the reason may be either that the software is actually free of problems or that the ISVV process is not effective. For the ISVV process to be a confidence-raising measure for the software, there must therefore be trust in the process itself and in the people executing it.

Raising the confidence is particularly important for critical software, whose failure may lead to
hazardous events, damage to health, environmental damage, grave economic losses, or loss of
reputation. ISVV is therefore usually targeted to find critical faults with respect to safety or
dependability. This is also the main emphasis of this guide. However, in other cases, ISVV may
target other quality attributes, including security, maintainability, reusability, and usability.
2.3 ISVV is Complementary to Developer’s V&V
ISVV should provide added value over the verification and validation carried out by the software developer. The approach of the ISVV supplier thus has to be complementary. What does this mean in practice?

Complementarity may be introduced by having different:


• organisational missions and values,
• objectives,
• processes,
• methods,
• tools,
• people.

Both the developer’s verification and validation team and the ISVV team share the objective of
finding faults as early as possible. This requires a “destructive” attitude contrary to the
“constructive” attitude of developers. However, especially under time and budget pressure, the tendency towards positive thinking in the developer organisation (they have developed the product, so they know it) may work to the detriment of product quality, in particular as concerns robustness. The ISVV team, not being subject to the same pressures, can focus solely on finding possible weaknesses and faults, trying to break the software.


The developer’s verification and validation process will have to comply with the requirements of
standards forming the basis for the contract with the software customer as well as company
internal requirements. Still, there is considerable room for customisation and interpretation. Even within the same company, the verification and validation plan of one software product may thus look different from that of another. The developer’s verification and validation plan is one of the factors that the ISVV supplier should take into account to ensure complementarity.

ISVV may choose to use methods and tools different from those of the development
organisation. In some cases, one method is an alternative to another; in others, methods are
complementary and not substitutable.

There are often many different tools in the market supporting the same method. Where the
functionality of tools overlaps considerably, the complementarity of using a different tool may be
questioned. However, different implementations may still yield slightly different results,
reflecting particular strengths and weaknesses of the tools themselves. Where two tools
produce the same results, at least the confidence in the findings is increased.

Even if a verification or validation task is repeated, using the same methods and tools as before,
having it done by another person may still yield interesting results. Few verification and validation methods and techniques are deterministic, unless wholly automated by a tool. Two persons carrying out the same verification activity will rarely obtain identical results or discover the same problems; problems overlooked by one person may be found by the other.


3.0 ISVV Process Overview


The ISVV Process consists of six activities: two management activities, three verification activities, and one validation activity. The activities are shown in the figure below:

[Figure 1 shows the ISVV process activities:
MAN. Management: MAN.PM ISVV Process Management; MAN.CR Criticality Analysis
IVE. Independent Verification: IVE.TA Technical Specification Analysis; IVE.DA Design Analysis; IVE.CA Code Analysis
IVA. Independent Validation: IVA Validation]

Figure 1: ISVV Process Activities

ISVV Process Management (MAN.PM) is concerned with issues such as roles, responsibilities,
planning, budgeting, communication, competence, confidentiality etc. It involves responsibilities
of both the ISVV customer and the ISVV supplier.

Criticality Analysis (MAN.CR) is an activity supporting both ISVV Process Management and the
Verification and Validation tasks. It provides important input for ISVV planning: How can the
available budget best be used? The activity defines the scope and rigour of subsequent V&V
activities by assigning software criticality categories and ISVV levels to software requirements,
components and units.

Technical Specification Analysis (IVE.TA) is verification of the Technical Specification, i.e. the software requirements. The activity ends with a Technical Specification Analysis Review (TAR).

Design Analysis (IVE.DA) is verification of the Software Architectural Design and the Software
Detailed Design. The activity ends with a Design Analysis Review (DAR).

Code Analysis (IVE.CA) is verification of the software source code. The activity ends with a
Code Analysis Review (CAR).

Validation (IVA) is testing of the software to demonstrate that the implementation meets the
Technical Specification in a consistent, complete, efficient and robust way. The activity ends
with an Independent Validation Review (IVR).

Figure 2 relates the ISVV activities to the software development processes and the review milestones defined by [ECSS-E-40B:2003]. In addition, four ISVV reviews are defined. The figure indicates possible early and likely start times as well as end times of the activities. More specific guidance is provided with the individual activity descriptions.


Figure 2: Software Engineering and ISVV Processes


Each of the ISVV activities is described in detail in the following sections. The ISVV Process
Management activity is given special treatment, but otherwise the main structure of an activity
description is:
• Activity overview
• Activity inputs and prerequisites
• Activity outputs
• Activity management
− Initiating and terminating events
− Completion criteria
− Relations to other activities
• Task descriptions
• Methods

Every activity is broken down into tasks and sometimes sub-tasks. Each task (with the exception of project management tasks) is described in a table format with the following fields:

Title: Name of the task

Task ID: Task identifier

Activity: Identifier and name of the activity

Start Event: Start constraint for the task (might be tailored depending on the
characteristics/objectives of specific ISVV projects)

End Event: End constraint for the task (might be tailored depending on the
characteristics/objectives of specific ISVV projects)

Responsible: Identification of the party responsible for task execution: the ISVV supplier or the ISVV customer.

Objectives: Main objectives to be accomplished by the task

Inputs: Inputs to the task

Sub Tasks (per ISVV Level): Task breakdown into subtasks, organised per ISVV level.

Outputs: Outputs of the task
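
For readers who prefer a machine-readable view of this format, the fields can be mirrored as a simple record. The following Python sketch is purely illustrative; the field names and types are assumptions of this example, not a normative schema defined by this guide.

```python
# Illustrative record mirroring the task description fields above.
# Field names and types are assumptions of this sketch, not a normative schema.
from dataclasses import dataclass, field

@dataclass
class TaskDescription:
    title: str                # Name of the task
    task_id: str              # Task identifier, e.g. "MAN.PM.T1"
    activity: str             # Identifier and name of the owning activity
    start_event: str          # Start constraint (may be tailored per project)
    end_event: str            # End constraint (may be tailored per project)
    responsible: str          # "ISVV supplier" or "ISVV customer"
    objectives: list[str] = field(default_factory=list)
    inputs: list[str] = field(default_factory=list)
    # Sub tasks organised per ISVV level, e.g. {"ISVVL 1": [...], "ISVVL 2": [...]}
    sub_tasks: dict[str, list[str]] = field(default_factory=dict)
    outputs: list[str] = field(default_factory=list)
```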

A specific ISVV project may include all or only some of the verification and validation activities referred to above, and in turn only some of their tasks and subtasks. There are dependencies between the activities; the output of earlier activities is often used as input to later ones. If one or more activities are defined to be outside the scope of the ISVV project, some of their tasks may nevertheless have to be performed so that the prerequisites (in terms of required input) of other activities are fulfilled. The dependencies are described as part of the individual activity descriptions.


4.0 ISVV Process Management


4.1 Activity Overview
The objective of ISVV Process Management is to define the overall ISVV plan and to control and monitor the ISVV process. As can be seen from Figure 3, it is a management activity of the ISVV process.

[Figure 3 shows MAN.PM ISVV Process Management highlighted within the activity structure of Figure 1 (Management: MAN.PM, MAN.CR; Independent Verification: IVE.TA, IVE.DA, IVE.CA; Independent Validation: IVA).]

Figure 3: ISVV process management in context

ISVV Process Management (PM) consists of two tasks as shown in Figure 4:


• ISVV Process planning
• ISVV Process monitoring and control

The figure also shows the inputs and outputs of each task.


[Figure 4 shows the two tasks with their main inputs and outputs. ISVV Process Planning takes the Software Development Plan, the Software Product Assurance Plan, the Software Verification and Validation Plan, and the Criticality Analyses, and produces the ISVV Plan. ISVV Process Monitoring and Control takes the ISVV Plan, Documents and Code from Software Development, the ISVV Findings Resolution Report, and the Criticality Analyses, and produces Requests for Clarification, the ISVV Report (with ISVV Findings), and Progress Reports.]

Figure 4: ISVV Process Management Tasks (note: the figure shows only the most important inputs and outputs)

The following subsections introduce important aspects to consider in the management of any ISVV process:
• Roles, responsibilities, and tasks;
• Criticality analysis, definition of scope, and budgeting;
• Scheduling and milestones;
• Communication and reporting;
• Competence and motivation;
• Non-disclosure and security.
4.1.1 Roles and Responsibilities
Independent Software Verification and Validation is a service provided by an ISVV supplier to
an ISVV customer. In addition, the ISVV process may have interfaces to other roles:
• Software supplier (software developer)
• Software validation facility supplier
• System supplier (system integrator, prime, software customer)
• System customer (system owner)

One of the latter two roles is also likely to be the ISVV customer.

The responsibilities of the ISVV customer and the ISVV supplier are clearly indicated in the
task descriptions in section 4.5.


The following subsections describe the interfaces to the software supplier and the software
validation facility supplier.
4.1.1.1 Responsibilities of Software suppliers
The involvement of the software supplier in ISVV includes:
• Providing documents and code for ISVV;
• Assisting the ISVV customer in responding to requests for clarifications from the ISVV
supplier;
• Assisting the ISVV customer in assessing the findings of the ISVV supplier, their criticality
and resolution;
• Investigating and following up software problem reports resulting from ISVV findings.

All communication between the ISVV supplier and any of the software suppliers (when allowed) shall be copied to the ISVV customer.
4.1.1.2 Interface with Software Validation Facility supplier
The Software Validation Facility supplier is the party providing the Software Validation Facility
for the ISVV supplier’s independent validation activity.

The involvement of the SVF supplier could be minimal, i.e. just providing the SVF for a given
period, or it could involve tasks such as specification and execution of test procedures, and
reporting of test results.

It is the ISVV customer’s responsibility to ensure that the ISVV supplier gets (or gets access to) the SVF. The recommendation of this ISVV guide is that the SVF be provided to the ISVV supplier. This secures the ISVV project’s access to the SVF, even in critical phases of the project where resource contention would otherwise easily occur.
4.1.2 Criticality Analysis, Definition of Scope and Budgeting
The budget for ISVV should reflect the criticality of the software to be scrutinised. It is also desirable to distinguish between criticality categories, so that more effort can be spent on the verification and validation of highly critical software than on software of lesser criticality.

This is the objective of the so-called Criticality Analysis (see definition in section 1.3), which
identifies the Software Criticality Category and ISVV Level of software items at various levels of
specification (software requirements, component, unit), both reducing the number of items
subject to ISVV and determining which verification and validation tasks to carry out for each
individual item.

As already indicated, Criticality Analysis is carried out throughout the ISVV project, refining the
scope as software development becomes more detailed. However, the first Criticality Analysis
task is the most important one as it is carried out by the ISVV customer and forms the basis for
allocating a budget for the ISVV contract. The ISVV supplier may also repeat the analysis to
verify the realism of the budget.

Later analyses are intended to refine the scope even further and should be done by the ISVV
supplier, with the ISVV customer reviewing the results and accepting the specified scope. The
criticality analyses may lead to updates to the ISVV plan and budget.


[Figure 5 shows the cost model: resource attributes (competence, experience, productivity), product attributes (component maturity, complexity, size), and process attributes (ISVV task and rigour, ISVV Level, ISVV methods, number of rounds, tools) drive the cost elements (man-hours and man-hour cost, tool cost, and ISVV task cost).]

Figure 5: ISVV cost model

Figure 5 shows a model breaking ISVV costs into two major components, man-hour costs and
tool costs (there may be other costs, but these are for the moment ignored).

Man-hour costs depend on hourly rates and the estimated number of hours. Work hours are
spent on specific verification/validation tasks as well as on management. The number of hours
is a function of the competence, experience and productivity of the person carrying out the
activity, the size, complexity, and maturity of the work product (document or code) under
scrutiny, the number of rounds of verification/validation (repetitions), the ISVV task carried out,
as well as the type of verification/validation method applied. The rigour with which an ISVV
task is to be carried out (if it is to be carried out at all), depends on the ISVV Level. The ISVV
task is supported by one or more ISVV methods. The number of repetitions may have a big
impact on costs, and for fixed-price contracts it is thus crucial that the number of repetitions is
defined.

Tools support specific methods for specific verification and validation tasks. Tool costs can be broken down into investment costs (or depreciation costs) and costs for using the tool (based on hourly rates). The use of tools may greatly increase the efficiency of carrying out ISVV tasks, thereby reducing man-hours.

Determining the total ISVV cost requires calculation of man-hour costs and tool costs for all of
the V&V tasks. To ensure a repeatable process, the work breakdown structure should be
defined. The ISVV Level will affect the cost through the set of verification and validation tasks
defined, and the rigour with which tasks are carried out.
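
To make the cost structure concrete, the following Python sketch shows a minimal bottom-up calculation over a work breakdown structure. All identifiers, hours, and rates are invented for illustration, and counting tool cost once per task is a simplification of this example, not a rule of this guide.

```python
# Minimal ISVV cost-model sketch; all figures are hypothetical.
from dataclasses import dataclass

@dataclass
class TaskEstimate:
    task_id: str            # e.g. "IVE.TA.T1" (illustrative identifier)
    hours_per_round: float  # driven by competence, product size/complexity, method
    rounds: int             # number of verification/validation repetitions
    hourly_rate: float      # man-hour cost rate
    tool_cost: float = 0.0  # investment/depreciation plus usage cost, per task

    def cost(self) -> float:
        # Man-hour cost grows with the number of rounds; tool cost is
        # counted once per task in this simplified model.
        return self.hours_per_round * self.rounds * self.hourly_rate + self.tool_cost

def total_isvv_cost(estimates: list[TaskEstimate]) -> float:
    # Bottom-up total over the work breakdown structure.
    return sum(e.cost() for e in estimates)

wbs = [
    TaskEstimate("IVE.TA.T1", hours_per_round=80, rounds=2, hourly_rate=100.0),
    TaskEstimate("IVE.CA.T1", hours_per_round=120, rounds=1, hourly_rate=100.0,
                 tool_cost=5000.0),
]
print(total_isvv_cost(wbs))  # 33000.0
```

A fixed number of rounds is assumed per task, which mirrors the remark above that for fixed-price contracts the number of repetitions must be defined up front.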

A fundamental question is whether the ISVV budget should somehow be linked to the budget of the software development project, e.g. as a percentage adjusted according to the ISVV level of the software. The advantage is that the budget is immediately scaled to reflect the financial realities of the project and the expectations of stakeholders; it makes for a good rule of thumb.


The alternative is a bottom-up approach calculating the budget from the number of critical
functions and their ISVV levels.

With any estimation activity there is uncertainty. As work progresses through the phases of the ISVV project (technical specification analysis, design analysis, code analysis), more information becomes available (e.g. about the maturity of documents) and better estimates can be made for the remaining phases. The criticality analyses performed also provide input to the scoping of the ISVV activities, both in terms of which software items to submit to ISVV and in terms of which ISVV tasks to carry out (derived from the ISVV Level).
4.1.3 Scheduling and Milestones
Scheduling the ISVV project is difficult, as there is a strong dependence on the progress of the
software development projects. Delays in software development activities will cause
corresponding delays in ISVV activities. A scheduled date for the start of activities may be
provided, with the understanding that this date may have to change if deliverables from the
software development projects are late.

This guide does not identify specific initiating events for ISVV activities. A general recommendation is that input documents and code from the software suppliers should be sufficiently mature. This is usually the case after the corresponding development reviews: PDR for the Technical Specification, DDR for the Architectural and Detailed Design, and CDR for the Code. The Independent Validation activity consists of three major tasks: identification of test cases, specification of test procedures, and execution of test procedures. Identification of test cases may start as early as PDR, when stable documentation becomes available. Carrying out the independent software validation testing effectively requires completion of the software development validation testing, which usually finishes at QR.

In some cases, documents may mature earlier, or it may be desirable to have ISVV provide
input to the development review – along with other review comments. In this case, ISVV
activities might start earlier. More guidance on initiation of the different verification and
validation activities is provided with the description of the activities.

A set of additional milestones has been defined for the ISVV project. These milestones were presented in section 3.0.
4.1.4 Quality Management System
The ISVV supplier should have a suitable quality management system (fulfilling the
requirements of [ISO 9000:2000]) as well as an information management system (see section
4.1.5).

The quality management system should cover topics such as:


• Organisation
• Operations
• Sub-contractors
• Internal audits
• Documentation
• Recruitment, Training, and Update of skills

The ISVV supplier should have a proper software documentation and configuration management system to manage and control all inputs from the software supplier, as well as all outputs of its ISVV activities (verification reports and test documentation, procedures and reports).


As already mentioned, the ISVV process has many uncertainties (availability of inputs from the software supplier, etc.) and risk elements (maturity of the items under ISVV, maturity of the SVF, etc.) that should be properly managed and controlled by the ISVV supplier through a formalised risk management process.
4.1.5 Non-Disclosure and Security
Spacecraft software is high-value intellectual property. It is therefore important that access to documents and code (both source and executable) is strictly controlled when they are handed over to the ISVV supplier (the software developer will most likely also require that the ISVV supplier not be involved in any kind of competing software development). The ISVV supplier and other stakeholders involved in the ISVV process must fulfil requirements both with respect to non-disclosure and with respect to secure handling of information.

The ISVV customer must, in cooperation with the software suppliers (or the intellectual property owner) and other stakeholders (system supplier/customer), determine the confidentiality requirements, including:
• whether there should be different confidentiality classes for documents and what those
classes are;
• requirements for distribution and storage of confidential documents;
• requirements for personnel authorised to handle confidential documents.

The ISVV customer must also identify the documents required for the ISVV process and their
confidentiality class.

The ISVV supplier should have an information security management system in place to ensure
that distribution, storage, and handling of data fulfils the confidentiality requirements (e.g.
based on [BS 7799-2:2002]).
4.1.6 Competence
Independent software verification and validation requires special competence.

Requirements on the competence of the individual should cover formal education and experience, as well as personal traits. There is still no consensus on what the requirements for ISVV personnel should be, but an example is included below (adapted from [DNV ISVV:1992]):

Formal education: ISVV personnel should have a university degree in software engineering or computer science. It will also be beneficial to have some formal background in hardware electronics and systems engineering, as well as in the domain itself, e.g. aerospace. Personnel should also have received proper training in quality assurance, quality control and testing.

Experience: ISVV personnel should have at least 5 years of working experience with software development, of which at least 2 should be related to verification, validation or quality assurance. ISVV personnel should also have at least 2 years of experience within the space domain.

Personal traits: ISVV personnel should be “creative destructors”: rigorous, process mindful, objective, and result-oriented.

Table 1: Competence requirements for ISVV personnel


The “creative destructor” characteristic deserves a comment. Verification/validation personnel (both of the software developer and of the ISVV team) are charged with finding as many faults as possible. They must thus exhibit creativity in breaking the system, in being “destructive”.

An ISVV project is usually carried out by an ISVV team, not a single individual. The ISVV team
should be composed to provide a mix of complementary competencies. The team as such
should be familiar with all methods and tools to be employed for the analyses. In addition, the
ISVV team manager should be experienced with project management, including the
management of ISVV projects. The project manager must also be able to handle the
contractual and human relations aspects of the project, and should also have sufficient
personal authority to defend the findings of the ISVV team.

4.2 Activity Inputs and Prerequisites


The following work products are input for the ISVV Management activity:
• From ISVV Customer:
− Criticality Analyses (System Level)
− Software Development Plan
− Software Product Assurance Plan
− Software Verification and Validation Plan
− Documents and Code from Software Development
− ISVV Findings Resolution Report

• From ISVV Supplier:


− Criticality Analyses

There are no particular prerequisites for starting the ISVV Management Activity.

The following subsections provide more detail about the activity inputs.

4.2.1 Software Criticality Analyses


The ISVV software criticality analysis activity (see section 5.0) produces critical items lists defining the scope for subsequent verification and validation activities. The analyses are carried out either by the ISVV customer or by the ISVV supplier, but in any case need to be communicated to the other party and approved by the ISVV customer. The scope resulting from the criticality analyses must be reflected in the ISVV plan. The initial ISVV plan is based (among other inputs) on the initial criticality analysis results and is updated as later criticality analyses refine the scope.

4.2.2 Documents and Code from Software Development


The main input to the ISVV project consists of the documents and code to be verified and validated.
Additional documentation may also be required, e.g. system requirements, system
architecture, safety analyses, etc. In addition, process documents such as development
standards and quality assurance procedures may also be required. Documents and code to be
verified or validated should be reasonably mature and stable before being subject to ISVV
activities. This normally means that they have been submitted for or been through
development reviews. Earlier versions of the documents and code may be provided to the
ISVV supplier for familiarisation, especially if deadlines for the actual ISVV are short.


4.2.3 ISVV Findings Resolution Report


For each ISVV report, the ISVV customer should produce an ISVV findings resolution report,
describing how the ISVV findings have been dealt with, whether they are discarded or result in
a software problem report. This may be part of the review item discrepancy form in which
findings are reported.

4.3 Activity Outputs


The following work products are produced by the ISVV Management activity:

• ISVV Plan
• Requests for Clarification
• ISVV Report (with ISVV Findings)
• Progress Reports

The following subsections provide more detail about the activity outputs.

4.3.1 ISVV Plan


The ISVV plan is the primary tool of the ISVV project manager for planning and executing the
ISVV process. It describes the purpose, scope, assumptions and constraints of the project, the
activities, the schedule and milestones, the resources, meetings and reviews, as well as the
interface with the ISVV customer and other parties. A sample ISVV plan outline is provided in
section 9.0. A preliminary ISVV plan should be prepared as part of the bid for the ISVV
contract, and afterwards updated at the start of the project. The ISVV plan (or at least parts of
it) should later be updated as the project progresses (as required). One of the major inputs to the plan is the criticality analyses, which identify the scope for the verification and validation activities (see section 5.0). The initial ISVV plan must be approved by the ISVV customer and possibly also by other parties (e.g. software end customer, software suppliers).

4.3.2 Requests for Clarification


In some cases, the contents of documents and/or code may be considered unclear or difficult to understand by the ISVV supplier. The way to handle this depends on how strictly one wants to preserve the technical independence of the ISVV supplier. There are two main options: either to report the obscurity as an ISVV finding or to request a clarification from the ISVV customer.

Reporting everything as findings has the advantage that:


• with time, it will probably lead to better, more precise specifications.

The disadvantage is that:


• a number of these findings are likely not to be problems, but may reflect the ISVV supplier’s limited understanding of the system and the software application (especially early in the ISVV project). Receiving too many findings which are seen as irrelevant may undermine the credibility of the ISVV supplier in the view of the ISVV customer and the software developers.

Requesting clarifications has the advantage of:


• allowing the ISVV supplier to continue with the ISVV activity with a better understanding of the system and the software.


The disadvantage is that:


• the ISVV supplier may in this way adopt faulty assumptions and presuppositions held by
the ISVV customer, thus effectively masking a potential problem.

This is a real dilemma. Both approaches have been tried in real projects and there is no definite conclusion as to which is best. The recommendation is to allow requests for clarification, but to ensure that such requests are properly recorded so that the obscurity found is not simply forgotten. The requests made and the responses given should be included in the ISVV report.
4.3.3 ISVV Report (with ISVV Findings)
The findings of ISVV are reported in an ISVV report. There will usually be several ISVV reports produced by an ISVV project, e.g. one per ISVV activity per software product. The ISVV report shall highlight all the potential problems identified by the ISVV activity. The ISVV supplier shall classify the findings into e.g. ‘major’, ‘minor’, and ‘comment’, based on an assessment of the potential consequence of each finding. The classification must later be assessed by the ISVV customer, who may reclassify findings. The ISVV report should be presented to the ISVV customer, who should review and accept it. The purpose of the review meeting is to make sure that the ISVV report is understood by the customer. If requests for clarification have been allowed, the requests and the clarifications should also be included in the report. In the end, the ISVV customer must approve the ISVV report. An example form for reporting individual review item discrepancies is included in section 11.0.

The earlier an ISVV finding is implemented to correct the supplied software, the lower the cost. The ISVV supplier may therefore provide early feedback on major issues to the ISVV customer. The ISVV customer is not required to respond to these early findings.

It is the responsibility of the ISVV customer to filter the ISVV findings as presented in the ISVV
report and consider whether a particular finding warrants the creation of a software problem
report. The software problem report (SPR) is the usual mechanism by which a software
supplier is notified that a problem exists with the software. The status of all SPRs sent to the software supplier could be reported to the ISVV supplier to optimise the ISVV activities and tasks.
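
As an illustration of the reporting chain described above (supplier classification, possible customer reclassification, and an optional SPR), a finding might be recorded as follows. This is a minimal sketch; the enumeration values follow the ‘major’/‘minor’/‘comment’ scheme above, while the field names are assumptions of this example.

```python
# Minimal sketch of an ISVV finding record; field names are illustrative.
from dataclasses import dataclass
from enum import Enum

class FindingClass(Enum):
    MAJOR = "major"      # potentially critical consequence
    MINOR = "minor"
    COMMENT = "comment"

@dataclass
class IsvvFinding:
    finding_id: str
    item: str                                   # requirement/component/unit concerned
    description: str
    supplier_class: FindingClass                # assigned by the ISVV supplier
    customer_class: FindingClass | None = None  # reclassification by the ISVV customer
    spr_raised: bool = False                    # whether an SPR was created for it
```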
4.3.4 Progress Reports
For ISVV projects of more than a few weeks’ duration (as is likely to be the case for most projects), the ISVV supplier should provide regular progress reports to the ISVV customer. The progress report describes the progress of the project with respect to the plan and (if not a fixed-price contract) the budget, also notifying the customer of any problem areas. Progress reports will often be issued in conjunction with progress meetings.
4.4 Activity Management
4.4.1 Initiating and Terminating Events
The ISVV Management activity starts for the ISVV customer when it becomes clear that a software product for which the customer is responsible (either as developer or integrator) will require ISVV. This may be at an early stage in the ISVV customer’s process of bidding for the development of the software or the system containing the software. The ISVV supplier will start the activity when deciding to prepare a response to a request for ISVV services.

ISVV Management ends with the close of the ISVV contract, i.e. with the acceptance by the
ISVV customer of all deliverables required by the contract and described in the ISVV Plan.


4.4.2 Completion Criteria


The ISVV Management activity is complete when all ISVV activities, tasks and subtasks defined by the ISVV Plan have been carried out and all deliverables have been accepted by the ISVV customer.
4.4.3 Relations to other Activities
The ISVV Management activity shall manage all of the other activities of the ISVV project.

Criticality Analysis provides important input to the ISVV Management activity for budgeting and
planning.

4.5 Task Descriptions


Note that responsibility for the ISVV Management tasks is shared between the ISVV Customer
and the ISVV Supplier. Responsibility is defined at the subtask level.

4.5.1 ISVV Process Planning


TASK DESCRIPTION
Title: ISVV Process Planning Task ID: MAN.PM.T1
Activity: ISVV Management
Start event: Start of project
End event: End of project
Responsible: ISVV Customer and ISVV Supplier
Objectives:

- Plan the ISVV process

Inputs:

- From ISVV Customer:


- Criticality Analyses (System Level)
- Software Development Plan
- Software Product Assurance Plan
- Software Verification and Validation Plan
- From ISVV Supplier:
- Criticality Analyses
Sub Tasks (per ISVV Level):

- MAN.PM.T1.S1: Define ISVV objectives (ISVV Customer)


The main objectives of the ISVV (what quality attribute(s) are critical?) must be defined by the
ISVV Customer. See also section 2.2.
- MAN.PM.T1.S2: Perform System Level Software Criticality Analysis (ISVV Customer)
The ISVV Customer should perform a criticality analysis to identify the need for ISVV, its scope
and the initial critical items list (see section 5.5.1).
- MAN.PM.T1.S3: Define the ISVV scope and determine the ISVV budget (ISVV Customer)
The ISVV Customer should determine the overall ISVV budget frame based on the mission costs and the ISVV scope and level (see section 4.1.2). There are of course also other parties influencing this process, e.g. the ISVV suppliers.
- MAN.PM.T1.S4: Perform Technical Specification Criticality Analysis (ISVV Customer or
ISVV Supplier)


Perform the Technical Specification Criticality Analysis to identify the ISVV scope, level, and
critical software requirements list. This may be carried out by the ISVV Customer or the ISVV
Supplier. See also section 5.5.2.
- MAN.PM.T1.S5: Estimate ISVV budget (ISVV Supplier)
The ISVV Supplier should do an independent estimation of the ISVV budget. See section 4.1.2.
- MAN.PM.T1.S6: Develop ISVV plan (ISVV Supplier)
The ISVV Supplier must define an ISVV plan (a draft could be part of the proposal). The plan
should be approved by the ISVV Customer. The developer’s software development plan,
software product assurance plan, and software verification and validation plan should be taken
into account if available (overall coordination planning data is to be provided by the ISVV
Customer). See section 4.3.1. An outline of a sample ISVV plan is found in section 10.0.
- MAN.PM.T1.S7: Approve ISVV Plan (ISVV Customer)
The ISVV Customer should approve the ISVV plan developed by the ISVV Supplier.
- MAN.PM.T1.S8: Determine confidentiality issues and prepare NDAs (ISVV Customer)
It is the responsibility of the ISVV Customer to clarify the confidentiality requirements and to ensure these are kept throughout the project through the signing of Non-Disclosure Agreements with the ISVV Supplier and any of its sub-contractors (see section 4.1.5).
- MAN.PM.T1.S9: Approve scope definition resulting from Criticality Analysis (ISVV
Customer)
All criticality analyses must be approved by the ISVV Customer. See also section 4.2.1.
Outputs:

- ISVV plan (ISVV Supplier)

4.5.2 ISVV Process Execution, Monitoring and Control


TASK DESCRIPTION
Title: ISVV Process monitoring and control Task ID: MAN.PM.T2
Activity: ISVV Management
Start event: Project start
End event: Project end
Responsible: ISVV Customer and ISVV Supplier
Objectives:

- Execute, monitor, and control the ISVV process

Inputs:

- From ISVV Customer:


- Criticality Analyses (System Level)
- Software Development Plan
- Software Product Assurance Plan
- Documents and Code from Software Development
- ISVV Findings Resolution Report
- From ISVV Supplier:
- Criticality Analyses
- ISVV Plan
Sub Tasks (per ISVV Level):

- MAN.PM.T2.S1: Manage ISVV project (ISVV Supplier)


The ISVV Supplier must manage the project in accordance with the ISVV plan. This includes

schedule management (see section 4.1.3), budget management, resource management, activity
management, risk management, quality management, document management, and security
management.
- MAN.PM.T2.S2: Submit documentation and code to ISVV Supplier (ISVV Customer)
It is the responsibility of the ISVV Customer to provide all documentation and code necessary for
ISVV planning and for the verification and validation activities to the ISVV Supplier. See also
section 4.2.2.
- MAN.PM.T2.S3: Check received documentation (ISVV Supplier)
Any documentation and code received from the ISVV Customer or other parties of the ISVV
should be registered and checked by the ISVV Supplier.
- MAN.PM.T2.S4: Perform verification and validation activities (ISVV Supplier)
The ISVV Supplier must carry out the verification and validation activities as described in the
ISVV plan.
- MAN.PM.T2.S5: Request clarifications (ISVV Supplier)
The ISVV Supplier may request clarification from the ISVV Customer. See section 4.3.2.
- MAN.PM.T2.S6: Respond to Requests for Clarification (ISVV Customer)
Whenever the ISVV Supplier issues a Request for Clarification, the ISVV Customer should
provide feedback in a timely manner (see section 4.3.2)
- MAN.PM.T2.S7: Report early ISVV findings (ISVV Supplier)
The ISVV Supplier may provide early feedback on findings to the ISVV Customer.
- MAN.PM.T2.S8: Review early ISVV Findings (ISVV Customer)
The ISVV Customer shall review received early ISVV findings for criticality and impact on the
software/system, and shall take action as appropriate.
- MAN.PM.T2.S9: Produce ISVV verification report (ISVV Supplier)
For each ISVV activity (as defined by the ISVV plan), the ISVV Supplier must produce an ISVV verification report in which all of the findings are reported. See section 4.3.3.
- MAN.PM.T2.S10: Conduct Review Meeting (ISVV Customer)
The findings and their resolution are discussed during a review meeting with participation of all
related parties. The meeting is the responsibility of the ISVV Customer.
- MAN.PM.T2.S11: Produce ISVV findings resolution report (ISVV Customer)
In response to each ISVV report, the ISVV Customer should produce an ISVV findings resolution
report, describing how each finding is resolved. The reports should be distributed to the ISVV
Supplier and the end customer (see section 4.2.3).
- MAN.PM.T2.S12: Implement resolutions (ISVV Customer)
The ISVV Customer is responsible for ensuring that the resolutions described in the ISVV
findings resolution report are implemented. The ISVV Supplier is not responsible for following-up
the findings.
- MAN.PM.T2.S13: Update criticality analyses (ISVV Supplier)
The criticality analysis may be updated throughout the project to further limit the scope of
subsequent verification and validation activities. This is the responsibility of the ISVV Supplier,
although the ISVV Customer may also be involved. See sections 5.5.3 and 5.5.4.
Outputs:

- Requests for Clarification (ISVV Supplier)


- ISVV Report (with ISVV Findings) (ISVV Supplier)
- Progress Reports (ISVV Supplier)

4.6 Methods
Methods used for ISVV Process Management are not different from project management
methods in general and will not be further discussed in this Guide.


5.0 Criticality Analysis


5.1 Activity Overview
The objective of the Software Criticality Analysis is to limit the scope of subsequent verification and validation activities and to guide their performance. As can be seen from Figure 6, it is a management activity of the ISVV process.

[Figure 6 shows MAN.CR Criticality Analysis highlighted within the activity structure of Figure 1 (Management: MAN.PM, MAN.CR; Independent Verification: IVE.TA, IVE.DA, IVE.CA; Independent Validation: IVA).]

Figure 6: Criticality Analysis in context

Software Criticality Analysis (CR) consists of four tasks as shown in Figure 7:


• System level software criticality analysis
• Software technical specification criticality analysis
• Software design criticality analysis
• Software code criticality analysis

The figure also shows the inputs and outputs of each task.


Figure 7: ISVV Criticality Analysis Tasks (note: the figure shows only the most important inputs and outputs)


If ISVV is to be limited to only a subset of the verification and validation activities, some of the Software Criticality Analysis tasks may not be included. For example, if Code Analysis is not to be carried out, there is no need to carry out Software Code Criticality Analysis. If both Design Analysis and Code Analysis are to be excluded, both corresponding Software Criticality Analysis tasks may be excluded as well.

The System Level Criticality Analysis and the Software Technical Specification Criticality Analysis will always have to be carried out, even when ISVV consists only of Independent Validation. Where earlier verification activities have been left out (e.g. there is no Technical Specification Analysis or Design Analysis) but other verification activities are still included in ISVV, the Criticality Analyses corresponding to the omitted activities may still have to be carried out to ensure that the prerequisites for the remaining ISVV activities are fulfilled.

Software Criticality Analysis is carried out using (Software) Failure Modes, Effects and Criticality Analysis ((S)FMECA), supported by traceability analysis, control flow/call graph analysis, and complexity measurements.

It is important to emphasise that the use of these methods would not be as rigorous for the
Software Criticality Analysis as for the Safety and Dependability Analyses to be carried out as
part of the verification activities. The purpose here is not to find all potential problems
(hazards, failures, etc), but to scope the verification and validation activities. Also, the
performance of these specific analyses depends to some degree on what analyses are already
available from the software developer or system integrator.

The main outputs of the ISVV Criticality Analysis activity are:


• Critical system functions list
• Critical software requirements list
• Critical software components list
• Critical software units list

Items of the lists will usually be grouped by software product. For a given software product, the
list will include all items included in the product with a software criticality category and an ISVV
level assigned to each of them.

The following sub-sections discuss important concepts of criticality analysis.


5.1.1 ISVV Levels and Software Criticality Categories
The ISVV Level is a number on an ordinal scale assigned to a system function, software
requirement, component or unit to designate the required level of verification and validation.
System functions will not be verified as part of ISVV, but will nevertheless have to be assigned
ISVV Levels because the critical system functions list will be the basis for budgeting of the
ISVV project.

The following ISVV Levels are defined:

Level Description
ISVVL 0 No ISVV activities are required.
ISVVL 1 Basic ISVV is required.
ISVVL 2 Full ISVV is required.
Table 2: ISVV levels


For a given verification and validation activity, the ISVV Level provides guidance for the selection of tasks and the rigour with which each task within the activity is performed. As a guiding principle, verification and validation tasks at ISVVL 1 consist of scrutinising analyses already performed by the software developer, whereas at ISVVL 2 the ISVV supplier performs independent analyses.

When a given verification task is to be applied at ISVVL 2, the input to the task will be all items of the critical items list which have been assigned an ISVV Level of 2. Some tasks shall only be applied at ISVVL 1, whereas other tasks apply to both ISVVL 1 and ISVVL 2.

In some cases, the verification tasks cannot be applied to individual items, but the entire
specification of a software product should be taken as input. Examples of this are the
verification of readability of a design document (IVE.DA.T2.S5) or the verification of timing and
sizing budgets of software (IVE.DA.T2.S6). To determine whether such a task should be
carried out for a specific software product, one has to consider the ISVV Level of any item
contained in it.

The ISVV Level is derived from the Software Criticality Category (SCC) but may be adjusted
upward if there are other risk factors warranting increased verification and validation (see next
section).

Criticality is defined as a measure of the consequence of an undesirable event. What constitutes an undesirable event must be defined when defining the ISVV objective (see section 2.2). This ISVV guide focuses on consequences in terms of safety or dependability, but other issues could also be addressed, such as security, maintainability, or economic loss.

A software criticality scheme, defining software criticality categories, will usually have been
defined for the software development projects, to allow tailoring of the development process to
the criticality of the software. A common scheme may have been defined for all of the software
products embedded in the system or several different schemes may have been in use.

What was deemed critical for the development project may or may not be what is considered
critical for ISVV. Before adopting any existing criticality scheme, it should be carefully
scrutinised to ensure it is aligned with the ISVV objectives.

If the existing software criticality scheme is not appropriate for the purpose of the ISVV, a new
scheme will have to be defined. Inspiration may be taken from the examples included in
section 13.0 of this document (from [ECSS-Q-80-03d:2004, Annex B]). There are two examples from the domain of space engineering, where the distinction between unmanned and manned missions is fundamental. The examples relate to safety and dependability. It has often proved difficult to include mission success criteria (e.g. related to availability) in the definition of software criticality categories. This could be taken into account when defining software criticality categories for ISVV.

The software criticality scheme may be based either on a scale of letters, A to D, or on numbers 1 to 4. In this guide we assume a numerical scale, with 1 being the lowest criticality and 4 the highest. This is partly to ensure that the ISVV software criticality categories are not confused with development project criticality categories, which (in the space domain) are often defined in terms of letters. Where ISVV adopts the criticality scheme of the development project, having a separate scale is not strictly necessary, but it could still be useful, since the ISVV criticality categories assigned to items may (at times at least) deviate from those assigned by the development project. A mapping will then have to be made between the two schemes. For the example criticality categories presented in annex D, the natural mapping is to make 4 equivalent to A, 3 to B, 2 to C, and 1 to D.
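
As a trivial illustration of that mapping, assuming the A-to-D development scheme and the 1-to-4 ISVV scheme described above:

```python
# Natural mapping between development-project criticality letters and
# numeric ISVV software criticality categories, as described above.
DEV_TO_SCC = {"A": 4, "B": 3, "C": 2, "D": 1}
SCC_TO_DEV = {scc: letter for letter, scc in DEV_TO_SCC.items()}

assert DEV_TO_SCC["A"] == 4 and SCC_TO_DEV[1] == "D"
```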

Throughout the various stages of the criticality analysis activity, criticality categories are
assigned to failure modes (of functions/requirements/components), as well as to system
functions, software requirements, software design components and software units.

As a baseline, the ISVV Level for a software item is determined by the software criticality
category as follows:

SCC ISVVL
SCC 4 ISVVL 2
SCC 3 ISVVL 1
SCC 2 ISVVL 0
SCC 1 ISVVL 0
Table 3: Default mapping from Software Criticality Category to ISVV level
5.1.2 Adjusting the ISVV Level
The ISVV Level of a system function, software requirement, component, or unit is primarily
determined by its criticality category.

However, there may be a range of factors associated with a specific software product which may lead one to consider intensifying the verification and validation of the software. These factors are not related to the criticality of the software as determined by safety or dependability analyses, but to characteristics of the development organisation, the development process, or the software itself that may affect the quality of the software. We call this the error potential.

This type of adjustment is most appropriate for the initial software criticality analyses, i.e.
System Level Software Criticality Analysis or Software Technical Specification Criticality
Analysis, because the characteristics considered may be different for the different software
products (e.g. because sub-systems with their software are developed by different suppliers).

The factors influencing the error potential are listed as yes/no questions in a questionnaire.
The more ‘yes’ responses, the higher the potentially negative impact on software quality.
Based on a qualitative assessment one may then decide to increase the ISVV level, i.e. from
ISVVL 1 to ISVVL 2 or from ISVVL 0 to ISVVL 1 or 2. It should be noted that when the ISVV
level of an item is raised from 0 to 1, this is effectively equivalent to increasing the number of
items subject to ISVV. The table below shows the mapping from software criticality category
and error potential to ISVV Level:

Software Criticality        Error Potential
Category              Low          Medium          High
SCC 4                 ISVVL 2      ISVVL 2         ISVVL 2
SCC 3                 ISVVL 1      ISVVL 1 or 2    ISVVL 2
SCC 2                 ISVVL 0      ISVVL 1         ISVVL 1 or 2
SCC 1                 ISVVL 0      ISVVL 0         ISVVL 1

Table 4: Matrix to derive ISVV level from Software Criticality Category and Error Potential


For some software criticality categories and error potential levels, the table provides two
choices of ISVV Level, the decision being left to expert judgement after having assessed the
error potential.
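
The derivation can be sketched as a simple table lookup, as shown below. The matrix encodes Table 4 as reconstructed above, with two-choice cells returned as a tuple for expert judgement; the questionnaire scoring thresholds (one third and two thirds of ‘yes’ answers) are assumptions of this example, not values defined by this guide.

```python
# Table 4 as a lookup table; two-choice cells are tuples left to expert judgement.
ISVVL_MATRIX = {
    # SCC: (Low, Medium, High) error potential
    4: (2, 2, 2),
    3: (1, (1, 2), 2),
    2: (0, 1, (1, 2)),
    1: (0, 0, 1),
}

def error_potential(yes_answers: int, total_questions: int) -> str:
    """Map the number of 'yes' responses in the questionnaire to an error
    potential level. The one-third/two-thirds thresholds are illustrative
    assumptions, not values defined by this guide."""
    ratio = yes_answers / total_questions
    if ratio < 1 / 3:
        return "Low"
    if ratio < 2 / 3:
        return "Medium"
    return "High"

def isvv_level(scc: int, potential: str):
    """Return the ISVV Level, or a tuple of admissible levels where the
    matrix leaves the decision to expert judgement."""
    column = {"Low": 0, "Medium": 1, "High": 2}[potential]
    return ISVVL_MATRIX[scc][column]

# Example: SCC 3 with 10 of 18 'yes' answers gives Medium error potential.
print(isvv_level(3, error_potential(10, 18)))  # (1, 2): expert judgement decides
```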

The questionnaire itself is contained in chapter 12.0 (Annex C). There may be a need to fill in
several instances of the questionnaire if the different software products are developed by
different organisations/project groups or there are other reasons for believing the response to
the questions would vary.

One of the questions of the questionnaire is related to complexity. At the early stages of a
software development project, this is not quantifiable and will have to be based on experience
and sound judgement. For Software Code Criticality Analysis, complexity can be measured,
and the measurements will be used as input to error potential determination. A complexity
measure must be defined, with a threshold to distinguish between non-complex and complex
software units. At code level, complexity is given more weight than the other error potential
factors.
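
A minimal sketch of such a screen is shown below. Cyclomatic complexity with a threshold of 10 is a common choice, but both the measure and the threshold are project decisions; the values and unit names here are assumptions of this example.

```python
# Illustrative complexity screen; the measure (cyclomatic complexity) and the
# threshold (10) are assumptions of this example, to be defined per project.
COMPLEXITY_THRESHOLD = 10

def classify_units(complexities: dict[str, int]) -> dict[str, bool]:
    """Return, per software unit, whether it counts as 'complex'."""
    return {unit: cc > COMPLEXITY_THRESHOLD for unit, cc in complexities.items()}

units = {"attitude_ctrl.c": 23, "telemetry_fmt.c": 7}
print(classify_units(units))  # {'attitude_ctrl.c': True, 'telemetry_fmt.c': False}
```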
5.1.3 Treatment of Diverse Criticality Categories
In some instances, system design splits critical software from non-critical, allowing the non-
critical software to follow a less strict development process than the critical software, thereby
potentially saving costs. However, such a split is only viable if it can be demonstrated that a
fault of the non-critical (or less critical) component cannot cause the critical component to fail.

Demonstrating this is a verification task, but for the criticality analysis, it must be ensured that
the functions/requirements/components/units constituting the boundary between the two
components are verified to the same ISVV level as the most critical items of the critical
software.

The boundary will be some sort of communication channel with built in checks to ensure that
fault propagation cannot occur. If the components reside on different processors, the
communication channel is likely to be a communication protocol stack; if they reside on the
same processor, it may be shared buffers, pipes, or files, probably managed by the operating
system and with extra hardware support to ensure that processes are strictly separated except
as managed by the operating system.

[Figure 8 shows two components running on a shared Operating System: Component X (criticality 4) and Component Y (criticality 1). The ISVV boundary is drawn around Component X and includes its interactions with Component Y.]

Figure 8: Visualisation of a software architecture with diverse category levels assigned

Figure 8 provides a visualisation of a part of a system where the criticality is diverse: Component X has a function with criticality category 4 assigned to it, while Component Y has only a criticality category 1 function. With respect to ISVV this means that Component Y is assigned ISVV level 0 and no verification is done. Component X is assigned ISVV level 2, the highest possible level.


Since Component Y may trigger failures of the critical functions of Component X, it is important that the interactions between Components X and Y (indicated by the double arrows in the figure) cannot cause Component X to fail in any way. With respect to ISVV, this means that the ISVV boundary must include the interactions between Components X and Y; this is indicated by the dashed line around Component X. It is also a prerequisite that the Operating System itself has been sufficiently validated.

The fact that the boundary must be taken into account should follow from system or software
level safety and dependability analyses, but it is highlighted here as a case deserving special
attention.
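
As a purely illustrative sketch of the kind of built-in checks such a channel might apply, the fragment below rejects malformed frames at the boundary instead of passing them on to the critical component. The frame layout, size limit, and CRC trailer are invented for this example; a real channel would be defined by the system design and verified to the ISVV level of the critical component.

```python
# Hypothetical boundary check: bounded frame length plus CRC32 trailer.
import zlib

MAX_PAYLOAD = 64  # bytes; hypothetical channel limit

def accept_message(msg: bytes) -> bool:
    """Validate a frame consisting of a payload plus a 4-byte CRC32 trailer.
    Malformed input is rejected at the boundary instead of being passed on
    to the critical component."""
    if not 4 <= len(msg) <= MAX_PAYLOAD + 4:
        return False
    payload, crc = msg[:-4], int.from_bytes(msg[-4:], "big")
    return zlib.crc32(payload) == crc

payload = b"mode=SAFE"
frame = payload + zlib.crc32(payload).to_bytes(4, "big")
print(accept_message(frame))       # True: well-formed frame accepted
print(accept_message(b"A" * 200))  # False: oversized frame rejected
```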
5.2 Activity Inputs and Prerequisites
The following work products are input for the criticality analysis activity:
• From ISVV Customer:
− Software Criticality Scheme
− Critical System Functions List
− Mission and System Requirements Specification
− System Architecture
− Requirements Baseline
− System FMECA
− Technical Specification including Interface Control Documents
− SFMECA based on Technical Specification (if existent)
− Design Definition File: Software Architectural Design and Traceability Matrices
− Design Definition File: Software Detailed Design and Traceability Matrices (optional)
− Software safety/dependability analyses based on software architectural design or
software detailed design (if existent)
− Design Definition File: Software code
− Software safety/dependability analyses based on code (if existent)

• From ISVV Supplier:


− ISVV Findings (from TS Analysis)
− Software safety/dependability analysis based on Technical Specification (from TS
Analysis if existent)
− ISVV Findings (from Design Analysis)
− Software safety/dependability analysis based on software architectural design or
software detailed design (from Design Analysis if existent)

Unlike the Independent Verification activities, the Criticality Analysis activity is split into four
tasks with different starting points in time. A prerequisite for starting any of these tasks is
the availability of the required input at a satisfactory level of maturity. Please refer to the
individual tasks for a more detailed view.
5.3 Activity Outputs
The following work products are produced in the scope of the Criticality Analysis activity:
• Software Criticality Scheme
• Error Potential Questionnaires
• Critical System Functions List
• Critical Software Requirements List
• Critical Software Components List
• Critical Software Unit List


5.4 Activity Management


5.4.1 Initiating and Terminating Events
The four tasks of Software Criticality Analysis will normally be carried out prior to the
corresponding verification or validation activity. The initiating event of each
verification/validation activity may thus also be seen as the initiating event of the Software
Criticality Analysis task that defines the scope of that activity.

The initial Criticality Analysis (System Level Software Criticality Analysis) will normally be
carried out by the ISVV customer (and reviewed by the ISVV supplier during the tendering
process) as it is an important input for the cost estimation of the ISVV project.
5.4.2 Completion Criteria
The outputs of each of the Software Criticality Analysis tasks shall be reviewed in a joint review
meeting between the ISVV supplier and the ISVV customer to determine whether the output
provides a sufficient basis for the execution of subsequent verification and validation activities.
5.4.3 Relations to Other Activities
The primary relation of the Software Criticality Analysis to other activities is the Verification and
Validation activities, which uses the output of the Software Criticality Analysis to limit scope
and guide the performance of the different analyses.

Input to the Software Criticality Analysis activity comes from System and Software Engineering
activities as well as from Independent Verification activities previously carried out (ISVV
findings).

The initial Software Criticality Analysis (System Level Software Criticality Analysis) is also an
important input to the cost estimation task of ISVV management.

These Software Criticality Analyses will normally not provide any feedback to the System or
Software Engineering activities; they are only used to scope the ISVV activities.
5.5 Task Descriptions
5.5.1 System Level Software Criticality Analysis

TASK DESCRIPTION
Title: System Level Software Criticality Analysis Task ID: MAN.CR.T1
Activity: MAN.CR – Criticality Analysis
Start event: SRR – System Requirements Review
End event: PDR – Preliminary Design Review
Responsible: The System Level Software Criticality Analysis shall be carried out by the ISVV customer. The
result of the analysis will be reviewed by the ISVV supplier during the tendering process.
Objectives:

- Establish the critical function list.

Inputs:

- From ISVV Customer:


- Software criticality scheme [from Software Engineering]
- Critical function list with criticality categories assigned [from System Engineering]
- Mission and system requirements specification [from System Engineering]


- System architecture [from System Engineering]


- Requirements Baseline [from System Engineering]
- System FMECA [from System Engineering]
Sub Tasks (Procedure):

- MAN.CR.T1.S1: Identify the software criticality scheme used for the mission.
- MAN.CR.T1.S2: Evaluate whether the defined software criticality scheme is relevant for the ISVV
objective. If it is not, then define a new software criticality scheme for ISVV.
- MAN.CR.T1.S3: If there is a Critical Function List and the criticality scheme it is based on is relevant
for the ISVV objective, then use this CFL.
- MAN.CR.T1.S4: If there is no Critical Function List or the ISVV objective does not match the criteria
used to derive it, perform a simplified system FMECA along the lines described in section 13.1.
- MAN.CR.T1.S5: Identify each software product and its supplier. Fill in the error potential questionnaire
(see section 5.1.2) for each software product.
- MAN.CR.T1.S6: Assign ISVV level to each system function based on the software criticality category
of the function and error potential.
Outputs:

- Critical system functions list


- Error potential questionnaires
- Software criticality scheme
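
By way of illustration only, the following sketch (Python) shows one possible way to
mechanise subtask MAN.CR.T1.S6. The category scale (1 to 4) and ISVV level scale (0 to 2)
mirror the examples of section 5.1.3; the mapping itself is a project-specific decision and the
function names below are hypothetical:

def assign_isvv_level(criticality_category: int, error_potential: str) -> int:
    # Categories run 1 (lowest) to 4 (highest); ISVV levels 0 to 2 (cf. section 5.1.3).
    if criticality_category <= 1:
        return 0  # non-critical functions receive no ISVV (cf. Comp Y in Figure 8)
    if criticality_category == 4:
        return 2  # the most critical functions receive full ISVV (cf. Comp X)
    # intermediate categories: let the error potential assessment tip the balance
    return 2 if error_potential == "high" else 1

# hypothetical example functions with (category, error potential) assessments
for name, (cat, pot) in {"AOCS mode management": (4, "high"),
                         "Telemetry formatting": (2, "low")}.items():
    print(name, "-> ISVV level", assign_isvv_level(cat, pot))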

5.5.2 Software Technical Specification Criticality Analysis


TASK DESCRIPTION
Title: Software Technical Specification Criticality Analysis Task ID: MAN.CR.T2
Activity: MAN.CR – Criticality Analysis
Start event: PDR – Preliminary Design Review
End event: TAR – Technical Specification Analysis Review
Responsible: This task will be performed by the ISVV customer or the ISVV supplier as determined by the project
contract. If carried out by the ISVV supplier, the results shall be reviewed and approved by the ISVV
customer.
Objectives:

- Establish the critical software requirements list.

Inputs:

- From ISVV Customer:


- Technical Specification including Interface Control Documents [from Software Engineering]
- SFMECA based on Technical Specification (if existent) [from Software Engineering]
- Critical system functions list [from System Level Software Criticality Analysis]
- Error potential questionnaires [from System Level Software Criticality Analysis]
- Software criticality scheme [from System Level Software Criticality Analysis]
Sub Tasks (Procedure):

- MAN.CR.T2.S1: For each software product implementing critical system functions, identify any
SFMECA based on the Technical Specification available.
- MAN.CR.T2.S2: If an SFMECA exists and the criticality scheme used as a basis is relevant for the
ISVV objective, then it may be used as a basis for deriving the critical software requirements list.
- MAN.CR.T2.S3: If no such analyses have been carried out, the quality is too poor, or the ISVV


objective differs from the presumptions of the SFMECA, perform a simplified SFMECA based on the
Technical Specification including Interface Control Documents. Another simplified way of doing this
step is described in section 13.2.
- MAN.CR.T2.S4: Verify the consistency of the SFMECA with the Critical system functions list. If
discrepancies are found, notify the ISVV customer who will have to consider consequences in terms of
re-analysis.
- MAN.CR.T2.S5: For each software requirement, derive the software criticality category by identifying
the highest criticality category of any failure mode associated with it.
- MAN.CR.T2.S6: Assign an ISVV level to each software requirement based on the software criticality
category of the requirement and error potential (there is no need to reassess error potential unless
different answers to the error potential questionnaire are expected at this level).
Outputs:

- Critical system functions list (update)


- Critical software requirements list

5.5.3 Software Design Criticality Analysis


TASK DESCRIPTION
Title: Software Design Criticality Analysis Task ID: MAN.CR.T3
Activity: MAN.CR – Criticality Analysis
Start event: PDR – Preliminary Design Review
End event: DAR – Design Analysis Review
Responsible: This task will be performed by the ISVV supplier and the result reviewed and approved by the ISVV
customer.
Objectives:

- Establish the critical software component list.

Inputs:

- From ISVV Customer:


- Technical Specification including Interface Control Documents [from Software Engineering]
- Design Definition File: Software architectural design and traceability matrices [from Software
Engineering]
- Design Definition File: Software detailed design and traceability matrices (optional) [from Software
Engineering]
- Software safety/dependability analyses based on software architectural design or software
detailed design (if existent) [from Software Engineering].
- From ISVV Supplier:
- Critical system functions list [from System Level Software Criticality Analysis]
- Error potential questionnaires [from System Level Software Criticality Analysis]
- Software criticality scheme [from System Level Software Criticality Analysis]
- Critical software requirements list [from Software Technical Specification Criticality Analysis]
- Software safety/dependability analysis based on Technical Specification (if existent) [from
Technical Specification Analysis]
- ISVV Findings [from Technical Specification Analysis]
Sub Tasks (Procedure):

- MAN.CR.T3.S1: Review the findings of and the safety/dependability analysis performed as part of the
Technical Specification Analysis. Evaluate the consistency with the critical function list and the critical
software requirements list produced by the preceding Criticality Analyses. If discrepancies are found,
notify the ISVV customer who will have to consider consequences in terms of re-analysis.
- MAN.CR.T3.S2: If design level safety and dependability analyses exist from the developer, investigate


whether these may be used to assign software criticality categories to design components. The
software criticality scheme should be relevant for ISVV, the analysis should be based on the same
versions of documents as ISVV (or else a delta analysis must be carried out), and the results of any
higher level analyses it is based on should not be in conflict with the results of the Technical
Specification Analysis.
- MAN.CR.T3.S3: If not, trace the software requirements to software architectural design components.
Assign to each software component the highest software criticality category of any requirement tracing
to it.
- MAN.CR.T3.S4: Alternatively, extend the SFMECA carried out at software requirements level by
identifying software components as causes for requirements failure modes. This creates an alternative
trace from requirements to design components. Assign to each software component the highest
software criticality category of any failure mode to which it may contribute.
- MAN.CR.T3.S5: Identify any dependency mechanisms for the design language used (e.g. use or call
relationships).
- MAN.CR.T3.S6: Analyse the dependency of critical components on other components and adjust the
software criticality category of these components to be the same as that of the critical component
depending on them. Some components may be used by several critical components. For these, assign
the highest criticality category of any dependent component (an illustrative sketch of this propagation
follows this task description).
- MAN.CR.T3.S7: Assign an ISVV level to each software component based on the software criticality
category of the component and error potential (there is no need to reassess error potential unless
different answers to the error potential questionnaire are expected at this level).
- MAN.CR.T3.S8: Software criticality categories and ISVV levels may also be assigned to detailed
design software components. The benefit of going to this level of detail for the criticality analysis
should be balanced against the costs incurred.
Outputs:

- Critical system functions list (update)


- Critical software requirements list (update)
- Critical software component list
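
The sketch below (Python, with hypothetical component and requirement identifiers)
illustrates subtasks MAN.CR.T3.S3 and MAN.CR.T3.S6: components first inherit the highest
criticality category of the requirements tracing to them, and categories are then propagated
along use/call dependencies until a fixed point is reached:

# hypothetical traceability, requirement categories and dependency graph
trace = {"CompA": ["R1", "R2"], "CompB": ["R3"], "CompLib": []}
req_category = {"R1": 4, "R2": 2, "R3": 1}
uses = {"CompA": ["CompLib"], "CompB": ["CompLib"], "CompLib": []}

# S3: each component gets the highest category of any requirement tracing to it
category = {c: max((req_category[r] for r in reqs), default=1)
            for c, reqs in trace.items()}

# S6: fixed-point propagation, so used components inherit their users' categories
changed = True
while changed:
    changed = False
    for comp, deps in uses.items():
        for dep in deps:
            if category[dep] < category[comp]:
                category[dep] = category[comp]
                changed = True

print(category)  # {'CompA': 4, 'CompB': 1, 'CompLib': 4}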

5.5.4 Software Code Criticality Analysis


TASK DESCRIPTION
Title: Software Code Criticality Analysis Task ID: MAN.CR.T4
Activity: MAN.CR – Criticality Analysis
Start event: CDR – Critical Design Review
End event: CAR – Code Analysis Review
Responsible: This task will be performed by the ISVV supplier and the result reviewed and approved by the ISVV
customer.
Objectives:

- Establish the critical software unit list.

Inputs:

- From ISVV Customer:


- Design Definition File: Software architectural design [from Software Engineering]
- Design Definition File: Software detailed design (optional) [from Software Engineering]
- Design Definition File: Software code [from Software Engineering]
- Software safety/dependability analyses based on code (if existent) [from Software Engineering]
- From ISVV Supplier:
- Critical system functions list [from System Level Software Criticality Analysis]
- Error potential questionnaires [from System Level Software Criticality Analysis]
- Software criticality scheme [from System Level Software Criticality Analysis]
- Critical software requirements list [from Software Technical Specification Criticality Analysis]


- Critical software components list [from Software Design Criticality Analysis]


- Software safety/dependability analysis based on Design Definition (if existent) [from Design
Analysis]
- ISVV Findings [from Design Analysis]
Sub Tasks (Procedure):

- MAN.CR.T4.S1: Review the findings of and the safety/dependability analysis performed as part of the
Design Analysis. Evaluate the consistency with the critical system function list, the critical software
requirements list and the critical software component list produced by earlier criticality analyses. If
discrepancies are found, notify the ISVV customer who will have to consider consequences in terms of
re-analysis.
- MAN.CR.T4.S2: If code level safety and dependability analyses exist from the developer, investigate
whether these may be used to assign software criticality categories to software units. The software
criticality scheme should be relevant for ISVV, the analysis should be based on the same versions of
code as ISVV (or else a delta analysis must be carried out), and the results of any higher level
analyses it is based on should not be in conflict with the results of the Design Analysis.
- MAN.CR.T4.S3: If not, identify mapping rules from software design components to software units. For
each software component (either architectural design component or detailed design component) trace
the software component to source code. Assign to each software unit the software criticality category
of the software component it implements.
- MAN.CR.T4.S4: Define a complexity measure for software units. The complexity measure could, for
example, be based on the cyclomatic complexity of the procedures contained in the unit as well as the
number of other units using the unit. Define a threshold to distinguish non-complex from complex units
(an illustrative sketch follows this task description).
- MAN.CR.T4.S5: Perform complexity measurements on source code.
- MAN.CR.T4.S6: Fill in the error potential questionnaire (see section 5.1.2) for each software unit,
taking into account the complexity measures.
- MAN.CR.T4.S7: Assign ISVV level to each software unit based on the software criticality category of
the software unit and error potential.
Outputs:

- Critical system functions list (update)


- Critical software requirements list (update)
- Critical software component list (update)
- Critical software unit list
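
As an illustration of subtasks MAN.CR.T4.S4 and MAN.CR.T4.S5, the sketch below (Python)
computes a composite complexity score per software unit. The weights, the threshold and the
unit names are assumptions made for the example and would have to be agreed per project:

# assumed weights and threshold; these are not values prescribed by this guide
CYCLOMATIC_WEIGHT, FAN_IN_WEIGHT, THRESHOLD = 1.0, 2.0, 20.0

def unit_score(cyclomatic_per_procedure: list, fan_in: int) -> float:
    """Combine the worst cyclomatic complexity in the unit with its fan-in."""
    return (CYCLOMATIC_WEIGHT * max(cyclomatic_per_procedure, default=0)
            + FAN_IN_WEIGHT * fan_in)

# hypothetical measurement results: unit -> (cyclomatic complexities, fan-in)
units = {"tm_encode": ([12, 4], 1), "os_queue": ([6, 3], 9)}
for name, (ccs, fan_in) in units.items():
    score = unit_score(ccs, fan_in)
    print(name, score, "complex" if score > THRESHOLD else "non-complex")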

5.6 Methods
Some of the methods supporting the Criticality Analysis are not used for verification and
validation and are thus not listed in chapter 14.0 (Annex F). The comment field indicates
where further information can be found.

Subtask | Method | Method Coverage | ISVV Level | Comment

MAN.CR.T1 – System Level Software Criticality Analysis
MAN.CR.T1.S4 | Simplified System FMECA | Complete | N/A | See section 13.1.
MAN.CR.T1.S5 | Error potential assessment | Complete | N/A | See section 5.1.2.

MAN.CR.T2 – Software Technical Specification Criticality Analysis
MAN.CR.T2.S2 | Walkthrough | Complete | N/A | None.
MAN.CR.T2.S3 | Simplified SFMECA | Complete | N/A | See section 13.2.

MAN.CR.T3 – Software Design Criticality Analysis
MAN.CR.T3.S2 | Walkthrough | Complete | N/A | None.
MAN.CR.T3.S3 | Traceability Analysis | Complete | N/A | None.
MAN.CR.T3.S4 | Simplified SFMECA | Complete | N/A | See section 13.2.
MAN.CR.T3.S6 | Modelling | Complete | N/A | Note that the objective here is not necessarily to build a separate representation of the software design, but to use existing models to understand dependencies between software components.

MAN.CR.T4 – Software Code Criticality Analysis
MAN.CR.T4.S3 | Traceability Analysis | Complete | N/A | None.
MAN.CR.T4.S5 | Software metrics analysis | Complete | N/A | None.
MAN.CR.T4.S6 | Error potential assessment | Complete | N/A | See section 5.1.2.


6.0 Technical Specification Analysis


6.1 Activity Overview
Technical Specification Analysis is one of the verification activities of the ISVV process (Figure
9). The Technical Specification Analysis is the first verification activity, in general being
performed after the Criticality Analysis at the software requirements level.

[Figure 9: Technical Specification Analysis in context. Diagram of the ISVV process structure:
MAN. Management (MAN.PM ISVV Process Management, MAN.CR Criticality Analysis); IVE.
Independent Verification (IVE.TA Technical Specification Analysis, IVE.DA Design Analysis,
IVE.CA Code Analysis); IVA. Independent Validation (IVA Validation).]

The Technical Specification Analysis activity aims to verify the software requirements against
the following criteria:
• software requirements traceable to system partitioning and system requirements
• software requirements externally and internally consistent (not implying formal proof of
consistency)
• software requirements unambiguous and verifiable
• software design feasible
• operations and maintenance feasible
• the software requirements related to safety and criticality correct (as shown by suitably
rigorous methods)

The Activity also aims to identify safety-critical and mission-critical design drivers and potential
test cases which may be given special attention during subsequent activities of the
independent software verification and validation processes.

A graphical view of the Activity is given in Figure 10 below.


[Figure 10: Technical Specification Analysis activity (see footnote 5). Diagram: the inputs are
the System Requirements allocated to Software (RB), the Software Requirements Specification
(TS), the Software-Hardware Interface Requirements (RB), the Interface Control Document
(TS), the Critical Software Requirements List (ISVV), the Software Logical Model (TS) and the
Software Criticality Analysis Report (PAF). The Requirements Traceability Verification and
Software Requirements Verification tasks produce the traceability between system
requirements and software requirements, the traceability between system requirements and
interface requirements, the Requirements Verification Report and a contribution to
Independent Validation.]


6.1.1 Requirements traceability verification
The Technical Specification contains the software requirements and the ICDs which have been
derived by the software supplier from the system requirements allocated to software and the
software-hardware interface requirements contained in the Requirements Baseline. To verify
that this derivation has been conducted completely, correctly, consistently, and accurately, it is
necessary to identify the two-way relationships between the derived items in the Technical
Specification and the originating items in the Requirements Baseline. The identified
relationships may then be analysed for completeness, correctness, consistency, and accuracy.

The traceability verification is indicated in Figure 11 below by the relationships between higher-
level and lower level documents.

5 Note that the figure shows only the most important inputs and outputs.


[Figure 11: Software Requirements Independent Verification. Diagram: the SW Requirements
Specification (PDR) is verified for traceability against the System Requirements allocated to
Software (SRR), and the Interface Control Document (PDR) against the SW-HW Interface
Requirements (SRR); in addition, the following verifications are performed:
• Verify SW requirements correctness with respect to system requirements
• Verify consistent documentation of SW requirements
• Verify dependability and safety of requirements
• Verify readability of SW requirements
• Verify timing and sizing budgets of SW requirements
• Verify that SW requirements are testable
• Verify feasibility of producing an architectural design
• Verify SW requirements conformance with applicable standards]


6.1.2 Software requirements verification
The Technical Specification contains the software requirements which have been defined by
the software supplier to represent the system requirements allocated to software in the
Requirements Baseline. It is necessary to verify that this representation in terms of function,
capability, performance, safety, dependability, qualification, human factors, data definitions,
documentation, installation and acceptance, and operation and maintenance is complete,
correct, consistent, accurate, readable, and testable.

This verification is shown in Figure 11 as attached specifically to the SW Requirements
Specification.

6.2 Activity Inputs and Prerequisites


The following work products are input to the Technical Specification Analysis activity:

• From ISVV Customer:


− System Requirements allocated to Software [RB; SRR]
− Hardware-Software Interface Requirements [RB; SRR]
− Software Requirements Specification [TS; PDR]
− Interface Control Document [ICD(TS); PDR]
− Software Logical Model [TS; PDR]
− Software Criticality Analysis Report [PAF; SRR]

• From ISVV Supplier:


− Critical Software Requirements List (refer to [MAN.CR.T2])

The inputs to the Technical Specification Analysis activity should comprise a mature, stable,
and self-consistent set to ensure that the analysis conducted on them is useful. A set of inputs
which meet these criteria is available for the customer’s Preliminary Design Review.


6.3 Activity Outputs


The following work products are produced in the scope of the Technical Specification Analysis
activity:

• Traceability Between System Requirements and Software Requirements


• Traceability Between System Requirements and Interface Requirements
• Requirements Verification Report
• Contribution to Independent Validation

Verification reports include at least an overall analysis of the work products analysed, the
findings, a list of open issues to probe further in subsequent analyses, suggested modifications
(if any), and inputs for the independent validation test case specification. Traceability matrices
may be provided as annexes to the verification reports or as separate documents.
6.4 Activity Management
6.4.1 Initiating and Terminating Events
The activity will be initiated on receipt of the required inputs. A suitable set of input documents
will be contained in the Datapack submitted by the software supplier for the customer’s
Preliminary Design Review. The Datapack is normally submitted some weeks prior to the
Review, but an earlier initiation of the activity could be achieved if a set of mature, stable, and
self-consistent documents can be made available by the software supplier at an earlier date.

The activity will be terminated on completion of the verification tasks which have been selected
by the customer during verification process implementation as identified in the ISVV Plan. The
required outputs will be submitted to the customer’s ISVV Technical Specification Analysis
Review Meeting.
6.4.2 Completion Criteria
The completion of the Requirements Traceability Matrices and Requirements Verification
Report and their submission to the ISVV customer contribute to the completion of the activity.
The customer’s ISVV Technical Specification Analysis Review Meeting, with the participation of
all involved parties, will allocate final dispositions to the findings of the activity.
6.4.3 Relations to other Activities
Safety-critical and mission-critical design drivers may be identified for further analysis in the
Design Analysis Activity. Potential test cases may be identified for the Validation Activity.
6.5 Task Descriptions
6.5.1 Requirements Traceability Verification
TASK DESCRIPTION
Title: Requirements Traceability Verification Task ID: IVE.TA.T1
Activity: IVE.TA - Technical Specification Analysis
Start event: PDR - Preliminary Design Review
End event: TAR - Technical Specification Analysis Review
Responsible: ISVV Supplier
Objectives:
- Identify the two-way relationships between the software requirements and interface specifications and the
system requirements allocated to software and interface requirements and analyse the identified relationships
for completeness, correctness, consistency, and accuracy.
Inputs:


- From the ISVV Customer:


- System Requirements allocated to Software [RB; SRR]
- Hardware-Software Interface Requirements [RB; SRR]
- Software Requirements Specification [TS; PDR]
- Interface Control Document [ICD(TS); PDR]
- From the ISVV Supplier:
- Critical Software Requirements List (refer to [MAN.CR.T2])
Sub Tasks (per ISVV Level):
- ISVV Level 1:
- IVE.TA.T1.S1: Verify the traceability matrix for the Software Requirements
By reviewing the traceability matrices produced by the software supplier:
Ensure that all system requirements allocated to software are traceable to software requirements (forward
traceability).
Ensure that every software requirement is traceable to a system requirement (backward traceability).
Ensure that the relationship between each software requirement and its originating system requirement is
correct.
Ensure that the relationships between the software requirements and their originating system
requirements are specified in a uniform manner (in terms of level of detail and format).
Ensure that the characteristics specified in the system requirements allocated to software are accurately
specified by the traced software requirements.
- IVE.TA.T1.S2: Verify the traceability matrix for the Interface Requirements
By reviewing the traceability matrices produced by the software supplier:
Ensure that all system requirements referring to interfaces and all interface requirements are traceable to
interface specifications (forward traceability).
Ensure that every interface specification is traceable to a system or interface requirement (backward
traceability).
Ensure that the interface specifications correctly represent the system interface requirements allocated to
software and the interface requirements. Ensure that data and control flows, data usage and format, and
performance are considered.
Ensure that the relationships between the interface specifications and their originating system or interface
requirements are specified to a consistent level of detail.
Ensure that the characteristics specified in the system requirements referring to interfaces and the
interface requirements are accurately specified by the traced interface specifications.
- ISVV Level 2:
- IVE.TA.T1.S3: Independently construct and verify the traceability matrix for the Software
Requirements
By independently constructing the traceability matrices, address the same topics as described in
IVE.TA.T1.S1.
- IVE.TA.T1.S4: Independently construct and verify the traceability matrix for the Interface
Requirements
By independently constructing the traceability matrices, address the same topics as described in
IVE.TA.T1.S2.
Outputs:
- Traceability Between System Requirements and Software Requirements
- Traceability Between System Requirements and Interface Requirements
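
The forward and backward checks of IVE.TA.T1.S1 lend themselves to simple tool support.
The sketch below (Python, with hypothetical requirement identifiers) reports system
requirements with no tracing software requirement and software requirements with no
originating system requirement; correctness and accuracy of the individual links still require
engineering review:

# hypothetical baseline and supplier traceability matrix (SW req -> system reqs)
system_reqs = {"SYS-001", "SYS-002", "SYS-003"}
matrix = {"SW-010": {"SYS-001"}, "SW-011": {"SYS-002"}, "SW-012": set()}

traced = set().union(*matrix.values())
forward_gaps = system_reqs - traced                              # untraced system reqs
backward_gaps = {sw for sw, srcs in matrix.items() if not srcs}  # orphan software reqs

print("Forward traceability gaps:", sorted(forward_gaps))    # ['SYS-003']
print("Backward traceability gaps:", sorted(backward_gaps))  # ['SW-012']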


6.5.2 Software Requirements Verification


TASK DESCRIPTION
Title: Software Requirements Verification Task ID: IVE.TA.T2
Activity: IVE.TA - Technical Specification Analysis
Start event: PDR - Preliminary Design Review
End event: TAR - Technical Specification Analysis Review
Responsible: ISVV Supplier
Objectives:
- Verify that the representation by the software requirements of the system requirements allocated to software is
complete, correct, consistent, accurate, readable, and testable.
Inputs:
- From the ISVV Customer:
- System Requirements allocated to Software [RB; SRR]
- Software Requirements Specification [TS; PDR]
- Interface Control Document [ICD(TS); PDR]
- Hardware-Software Interface Requirements [RB; SRR]
- Software Criticality Analysis Report [PAF; SRR]
- From the ISVV Supplier:
- Critical Software Requirements List (refer to [MAN.CR.T2])
Sub Tasks (per ISVV Level):
- ISVV Level 1 and Level 2:
- IVE.TA.T2.S1: Verify software requirements correctness with respect to system requirements
Ensure that the software requirements represent the system requirements allocated to software within the
assumptions and constraints identified for the system. Ensure that state transitions, data and control
flows, and data usage and format are considered.
Ensure that the software requirements comply with applicable documents and physical laws.
Ensure that the precision specified for interfaces and calculations represents the requirements of the
system.
Ensure that the modelled physical phenomena agree with system accuracy requirements and physical
laws.
- IVE.TA.T2.S2: Verify the consistent documentation of the software requirements
Ensure that the software requirements are documented to a consistent level of detail.
Ensure that the interface specifications are documented to a consistent level of detail.
Ensure that interactions between software requirements and assumptions embedded in them are
consistent and represent system requirements.
- IVE.TA.T2.S3: Verify software requirements completeness
Ensure that the software requirements, within the assumptions and constraints of the system, represent
all the characteristics of the system designated to the software including functional and performance
specifications, software product quality requirements, security specifications, human factors engineering
(ergonomics) specifications and data definition and database requirements.
Ensure that the software requirements include also the specification of the interfaces external to the
software item.
When in-flight modification is specified for flight software, ensure also that the software requirements
include specifications for in-flight modification.
- IVE.TA.T2.S4: Verify the dependability and safety of the requirements
Ensure that the software requirements and interface specifications correctly represent the system
requirements relating to safety and dependability allocated to software.
Ensure that the software requirements and interface specifications address all the safety and
dependability aspects introduced by the system requirements allocated to software.
Ensure that the software is not contributing to system hazardous events by analysing software failure
modes and their propagation to system level.
Ensure that the software requirements describe proper features for Fault Detection Isolation And
Recovery (FDIR) in accordance with the system requirements allocated to software.


Ensure that the implemented FDIR mechanisms are independent of the faults that they are supposed to
deal with.
- IVE.TA.T2.S5: Verify the readability of the software requirements
Ensure that the software requirements documentation has a clear and consistent structure.
Ensure that the documentation is intelligible for its target readers and that all the required elements for its
understanding are provided (e.g. definition of acronyms, terms, and conventions).
- IVE.TA.T2.S6: Verify the timing and sizing budgets of the software requirements
Ensure that the software requirements for timing and sizing budgets (e.g. memory usage, CPU
utilization) correctly represent the system performance requirements allocated to software.
Ensure that the software requirements for timing and sizing budgets are specified with the accuracy
required by the system performance requirements allocated to software.
Ensure that the acceptance criteria for validating the software timing and sizing budget requirements
are objective and quantified.
- IVE.TA.T2.S7: Identify test areas and test cases for Independent Validation
Identify software requirements which cannot be analysed adequately by independent verification and
which, therefore, require execution of independent validation tests. Annotate this information (e.g.
requirements, test cases) as a contribution to the Independent Validation activities.
- ISVV Level 2 only:
- IVE.TA.T2.S8: Verify that the software requirements are testable
Ensure that the acceptance criteria for validating the software requirements are objective and quantified.
Ensure that each software requirement is testable to objective acceptance criteria.
Ensure that software requirements are unambiguous.
- IVE.TA.T2.S9: Verify the feasibility of producing an Architectural Design
Ensure that it is possible to produce an architectural design from the defined software requirements.
- IVE.TA.T2.S10: Verify software requirements conformance with applicable standards
Ensure that the software requirements are compliant to applicable standards, references, regulations,
policies, physical laws, and business rules.
Outputs:
- Requirements Verification Report
- Contribution to Independent Validation
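
Parts of IVE.TA.T2.S8 can be supported by a crude lexical screen for wording that is commonly
considered untestable or ambiguous. The sketch below (Python) uses an assumed keyword
list and hypothetical requirements; it only flags candidates for inspection and does not
replace the engineer's judgement:

import re

# assumed list of weak words; to be agreed per project
WEAK_WORDS = ["as appropriate", "adequate", "sufficient", "fast", "user-friendly", "TBD", "etc"]

def flag_ambiguity(text: str) -> list:
    """Return the weak words found in a requirement text (case-insensitive)."""
    return [w for w in WEAK_WORDS
            if re.search(r"\b" + re.escape(w) + r"\b", text, re.IGNORECASE)]

reqs = {"SW-031": "The software shall provide an adequate response to TC errors, etc.",
        "SW-032": "The software shall detect a TC CRC error within 10 ms."}
for rid, text in reqs.items():
    hits = flag_ambiguity(text)
    print(rid, "->", hits if hits else "no weak wording found")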

6.6 Methods
The table below identifies which methods can be used for the work to be performed within the
scope of each subtask. For each method it is stated whether it covers the purposes of the
subtask completely or partially.

Subtask | Method | Method Coverage | ISVV Level | Comment

IVE.TA.T1 - Requirements Traceability Verification
IVE.TA.T1.S1 | Inspection | Complete | 1 | None.
IVE.TA.T1.S2 | Inspection | Complete | 1 | None.
IVE.TA.T1.S3 | Traceability Analysis | Complete | 2 | None.
IVE.TA.T1.S4 | Traceability Analysis | Complete | 2 | None.

IVE.TA.T2 - Software Requirements Verification
IVE.TA.T2.S1 | Inspection | Complete | 1 & 2 | None.
IVE.TA.T2.S1 | Modelling | Partial | 2 | Applicable only to a limited range of software requirements. May be performed at ISVV Level 1 if software requirements are also presented in graphical form.
IVE.TA.T2.S1 | Formal Methods | Partial | 2 | Applicable only to a limited range of software requirements. Automatic derivation of a formal specification is possible from a UML model.
IVE.TA.T2.S2 | Inspection | Complete | 1 & 2 | None.
IVE.TA.T2.S2 | Modelling | Partial | 2 | Applicable only to a limited range of software requirements. May be performed at ISVV Level 1 if software requirements are also presented in graphical form.
IVE.TA.T2.S2 | Formal Methods | Partial | 2 | Applicable only to a limited range of software requirements. Automatic derivation of a formal specification is possible from a UML model.
IVE.TA.T2.S3 | Inspection | Complete | 1 & 2 | Although scenarios can be developed to cover all aspects of the inspection method, the support is partial in that the document must still be manually inspected.
IVE.TA.T2.S3 | Modelling | Partial | 2 | Applicable only to a limited range of software requirements. May be performed at ISVV Level 1 if software requirements are also presented in graphical form.
IVE.TA.T2.S4 | HSIA | Complete | 1 & 2 | The preferred method at ISVV Level 1; should always be used at ISVV Level 2, but is applicable only where the software interacts with hardware.
IVE.TA.T2.S4 | SFMECA | Complete | 1 & 2 | The alternative method at ISVV Level 1, when HSIA is not applicable, but should always be used at ISVV Level 2.
IVE.TA.T2.S4 | SFTA | Partial | 1 & 2 | May be the only method used at ISVV Level 1 but should always be used at ISVV Level 2.
IVE.TA.T2.S5 | Inspection | Complete | 1 & 2 | None.
IVE.TA.T2.S6 | Inspection | Complete | 1 & 2 | None.
IVE.TA.T2.S6 | Modelling | Partial | 2 | Applicable only to a limited range of software requirements. May be performed at ISVV Level 1 if software requirements are also presented in graphical form.
IVE.TA.T2.S7 | N/A | N/A | 1 & 2 | No particular method applies to this subtask. It comprises a collation of the inputs to Independent Validation identified in the other subtasks.
IVE.TA.T2.S8 | Inspection | Complete | 2 | None.
IVE.TA.T2.S9 | Walkthrough | Complete | 2 | None.
IVE.TA.T2.S10 | Inspection | Complete | 2 | None.


7.0 Design Analysis


7.1 Activity Overview
Design Analysis is one of the verification activities of the ISVV process (Figure 12). The Design
Analysis is in general performed after the Technical Specification Analysis and after the
Criticality Analysis has been performed at the component/software unit level.

[Figure 12: Design Analysis in context. Diagram of the ISVV process structure: MAN.
Management (MAN.PM ISVV Process Management, MAN.CR Criticality Analysis); IVE.
Independent Verification (IVE.TA Technical Specification Analysis, IVE.DA Design Analysis,
IVE.CA Code Analysis); IVA. Independent Validation (IVA Validation).]

The Design Analysis consists of the evaluation of the design of each software product, i.e.
analysis of the Design Definition File (DDF) and Design Justification File (DJF), focusing on
aspects such as:
• reliability, availability and safety, ensuring that sufficient and effective fault detection,
isolation and recovery mechanisms are included,
• error handling mechanisms,
• initialisation/termination of software components,
• interfaces between software components and between software and hardware components,
• thread/process synchronisation and resource sharing, and
• budget analysis, including schedulability analysis.

Design Analysis focuses on two main products, the Software Architectural Design and the
Detailed Design, corresponding to the two main phases of the analysis. In addition, Design
Analysis should cover the software user manual (Figure 13).


[Figure 13: Software Design Analysis (see footnote 6). Diagram: the Design Justification File,
Technical Specification, Interface Control Documents, SW Architectural Design, SW Detailed
Design, SW Item (application), Software User Manual and Criticality Analysis Report feed five
verification tasks: Architectural Design Traceability Verification (producing the traceability
between TS and SW Architectural Design and between ICD and SW Architectural Design);
Architectural Design Verification (producing the Architectural Design Independent Verification
Report and a contribution to IVA); Detailed Design Traceability Verification (producing the
traceability between TS and SW Detailed Design, between ICD and SW Detailed Design and
between SW Architectural and Detailed Design); Detailed Design Verification (producing the
Software Detailed Design Independent Verification Report and a contribution to IVA); and
Software User Manual Verification (producing the SW User Manual Independent Verification
Report).]


7.1.1 Software Architectural Design Independent Verification
The architectural design expresses the high-level organisation of the software (i.e. how the
software will be arranged in terms of components). The architectural design is derived from the
software requirements expressed in the Technical Specification and the interfaces described in
the applicable Interface Control Documents. To verify that the architectural design fulfils the
requirements and presents an adequate quality level, two tasks shall be carried out.
• First, one shall verify the architectural design traceability against the Technical
Specification and the ICDs. This task aims at ensuring that the architectural design has
been completely, correctly, consistently and accurately derived from the Technical
Specification and the applicable ICDs.

6 Note that the figure shows only the most important inputs and outputs.


• Second, one shall verify the architectural design itself in order to check whether it is
consistent, correct, complete and readable such that it can be effectively tested, is
sufficient to produce a detailed design and is in conformance with the applicable standards.
Figure 14 illustrates the verification subtasks to be performed as part of the software
architectural design independent verification.

[Figure 14: Software Architectural Design Independent Verification. Diagram: the SW
Architectural Design (PDR) is verified for traceability against the Technical Specification (PDR)
and the Interface Control Documents (PDR); in addition, the following verifications are
performed:
• Verify interface consistency between different SW components
• Verify architectural design correctness with respect to the Technical Specification
• Verify architectural design completeness
• Verify the dependability & safety of the design
• Verify the readability of the architectural design
• Verify that the software architectural design components are testable
• Verify the feasibility of producing a Detailed Design
• Verify architectural design conformance with applicable standards]

7.1.2 Software Detailed Design Independent Verification


The detailed design expresses the low-level organisation of the software (i.e. how the software
will be organised in terms of units). The detailed design is primarily derived from the
architectural design; however, it also derives from the Technical Specification and the
applicable ICDs. To verify the detailed design, two tasks shall be carried out.
• First, one shall verify the detailed design traceability against the Architectural Design, the
Technical Specification and the Interface Control Documents in order to assert whether the
detailed design has been correctly derived.
• Second, one shall verify the detailed design itself in order to check whether it is consistent,
correct, complete and readable, such that it can be effectively tested, is sufficient to produce
the code, and is in conformance with the applicable standards.


Figure 15 illustrates the verification tasks to be performed as part of the software detailed
design independent verification.

[Figure 15: Software Detailed Design Independent Verification. Diagram: the Detailed Design
(DDR) is verified for traceability against the Technical Specification (DDR), the SW
Architectural Design (DDR) and the Interface Control Documents (PDR/DDR); in addition, the
following verifications are performed:
• Verify interface consistency between different SW units
• Verify detailed design correctness with respect to the Technical Specification
• Verify detailed design completeness
• Verify the dependability & safety of the design
• Verify the readability of the detailed design
• Verify the timing and sizing budgets of the software
• Verify that the software units are testable
• Verify the feasibility of coding
• Verify detailed design conformance with applicable standards]

7.1.3 Software User Manual Analysis


The software user manual describes the aspects of the software that are relevant for the end
user. It constitutes an essential aspect for the operation of the software. The software user
manual mainly derives from the user requirements but it is also affected by all the other
software project lifecycle phases. It is therefore necessary to verify the software user manual in
terms of completeness, correctness and readability. Figure 16 illustrates the verification tasks
to be performed as part of the software user manual verification.

[Figure 16: Software User Manual Independent Verification. Diagram: the Software User
Manual (DDR) is verified against the Technical Specification (DDR), the SW Architectural
Design (DDR), the Detailed Design (DDR) and the SW Item (application, CDR); the following
verifications are performed:
• Verify the readability of the User Manual
• Verify the completeness of the User Manual
• Verify the correctness of the User Manual]


7.2 Activity Inputs and Prerequisites


The following work products are input for the design analysis activity:
• From ISVV Customer:
− Technical Specification
− Interface Control Documents
− Software Architectural Design
− Software Detailed Design
− Software User Manual
− Software Dependability and Safety Analysis Reports
− Software Architectural Design to Requirements Traceability Matrices
− Detailed Design Traceability Matrices
− Schedulability Analysis
− Technical Budgets
− Criticality Analysis
− Software Item

• From ISVV Supplier:


− Project Management Plan
− Technical Specification Independent Verification Report
− Requirements Traceability Matrices from Technical Specification Analysis
− Safety and Dependability Analysis from Technical Specification Analysis
− Contribution to Independent Validation (from Technical Specification Analysis)

The prerequisite for starting the Design Analysis activity is the availability of the listed inputs.
Moreover, the design artefacts shall be at a satisfactory level of maturity.

7.3 Activity Outputs


The following work products are produced in the scope of Design Analysis activity:

• Software Architectural Design Independent Verification Report


• Software Detailed Design Independent Verification Report
• Software User Manual Independent Verification Report
• Traceability Between TS and SW Architectural Design
• Traceability Between ICD and SW Architectural Design
• Traceability Between TS and SW Detailed Design
• Traceability Between SW Architectural Design and SW Detailed Design
• Traceability Between ICD and SW Detailed Design
• Contribution to Independent Validation (updated with Design Analysis findings)

Verification reports include at least an overall analysis of the work products analysed, the
findings, a list of open issues to probe further in subsequent analyses, suggested modifications
(if any), and inputs for the independent validation test case specification. Traceability matrices
may be provided as annexes to the verification reports or as separate documents.

7.4 Activity Management


7.4.1 Initiating and Terminating Events
The Design Analysis activities may be initiated as soon as mature design artefacts are
available. In general this coincides with the approval of the Software Architectural Design at
the PDR.


Although several iterations of the design activity may be performed, potentially extending it
until the end of the development project, the Design Analysis activity ends with the Design
Analysis Review (DAR) (as defined in section 3.0 above), which in general takes place before
the CDR.
7.4.2 Completion Criteria
Design Analysis is complete once the Architectural Design, Detailed Design, and Software
User Manual have been verified in accordance with tasks IVE.DA.T1 to IVE.DA.T5 (refer to
section 7.5).
7.4.3 Relations to other Activities
This section identifies the relations between this activity and the remaining ISVV activities.

The tailoring of the Design Analysis activity is performed as part of the Criticality Analysis
activity. Criticality Analysis may also provide useful inputs to the Design Analysis activity,
namely to the subtask "Verify the dependability & safety of the design" (refer to section 7.5).

Strong relations exist between the Technical Specification Analysis and the Software Design
Analysis. The outputs of the Technical Specification Analysis are applicable inputs to the
Design Analysis. In addition Technical Specification Analysis may raise issues to be closed
during Design Analysis.

Design Analysis is also likely to provide inputs to independent validation test cases
specification.


7.5 Task Descriptions


7.5.1 Architectural Design Traceability Verification
TASK DESCRIPTION
Title: Architectural Design Traceability Verification Task ID: IVE.DA.T1
Activity: IVE.DA - Design Analysis
Start event: PDR – Preliminary Design Review
End event: DAR – Design Analysis Review
Responsible: ISVV Supplier
Objectives:
- Verify Architectural Design external consistency with Technical Specification and Interface Control Documents by
analysing the relationships between architectural design elements (e.g. software components) and technical
specification and interface control documents.
Inputs:
- From ISVV Customer:
- Software Requirements Specification [TS; PDR]
- Interface Control Documents [ICD(TS); PDR]
- Software Architectural Design [DDF; PDR]
- Software Architectural Design to Requirements Traceability Matrices [DJF; PDR]
Sub Tasks (per ISVV Level):
- ISVV Level 1 only:
- IVE.DA.T1.S1: Verify the traceability matrix with the Technical Specification
By reviewing the traceability matrices produced by the software developer:
Ensure that all software item requirements are traceable to a software component and that the functionality
described in the requirement is implemented by the software component (forward traceability).
Ensure that all software components have allocated requirements and that each software component is not
implementing more functionalities than the ones described in the requirements allocated to it (backward
traceability).
For each requirement traced to more than one component ensure that implementation of functionalities is not
repeated.
Ensure that all the relationships between the architectural design elements and the technical specification are
specified in a uniform manner (in terms of level of detail and format).
- IVE.DA.T1.S2: Verify the traceability matrix with the Interface Control Documents
By reviewing the traceability matrices produced by the software developer:
Ensure that the interface design (with other software units, hardware, the user, etc.) is consistent with the
applicable Interface Control Documents.
Ensure that interfaces are designed in a uniform way.
Ensure that each interface provides all the required information from the underlying component.
- ISVV Level 2 only:
- IVE.DA.T1.S3: Independently construct the traceability matrix with the Technical Specification
By independently constructing the traceability matrices, address the same topics as described in IVE.DA.T1.S1.
- IVE.DA.T1.S4: Independently construct the traceability matrix with the Interface Control Documents
By independently constructing the traceability matrices, address the same topics as described in IVE.DA.T1.S2.
Outputs:
- Traceability Between TS and SW Architectural Design
- Traceability Between ICD and SW Architectural Design


7.5.2 Architectural Design Verification


TASK DESCRIPTION
Title: Architectural Design Verification Task ID: IVE.DA.T2
Activity: IVE.DA - Design Analysis
Start event: PDR – Preliminary Design Review
End event: DAR – Design Analysis Review
Responsible: ISVV Supplier
Objectives:
- Evaluate the software architectural design for internal consistency, correctness, completeness, testability, feasibility of
detailed design, readability, timing & sizing budgets and dependability & safety.
Inputs:
- From ISVV Customer:
- Software Requirements Specification [TS; PDR]
- Interface Control Documents [ICD(TS); PDR]
- Software Architectural Design [DDF; PDR]
- Software Dependability and Safety Analysis Reports [PAF; PDR]
- Schedulability Analysis [DJF; PDR]
- Technical Budgets [DJF; PDR]
- Criticality Analysis (refer to [DNV ISVV-SN4.1:2005])
- From ISVV Supplier:
- Traceability Between TS and SW Architectural Design
- Traceability Between ICD and SW Architectural Design
- Technical Specification Independent Verification Report
- Safety and dependability analysis from Technical Specification Analysis
- Contribution to Independent Validation (from TS Analysis)
Sub Tasks (per ISVV Level):
- ISVV Level 1 and Level 2:
- IVE.DA.T2.S1: Verify interface consistency between different SW components
Ensure that software item internal interfaces (e.g. interfaces between software components) are consistent.
Consider both data and control flows. Include verification of data format, accuracy, and timing/performance.
Ensure that all inputs of one software component are produced by some other component and that all outputs of a
component are consumed by some other component.
- IVE.DA.T2.S2: Verify architectural design correctness with respect to Technical Specification
Ensure that the static architecture (e.g. software decomposition into software elements such as packages, and
classes or modules) and the dynamic architecture (e.g. specification of the software active objects such as thread /
tasks and processes) described in the software architectural design adequately implement the software
requirements.
Ensure that architectural design complexity and modularity are in accordance with quality requirements.
Ensure that the software architectural design implements proper sequence of events, inputs, outputs and interfaces
logic flow.
For real-time software ensure the correctness and the consistency of the computational model (in the case it is
provided).
- IVE.DA.T2.S3: Verify architectural design completeness
Ensure that the software architectural description includes (according to [ECSS-E40B:2003]): the
hierarchy, dependency and interfaces of software components; the process, data and control aspects of
the software components; and the static and dynamic architecture of the software and the mapping
between them.
For real-time software ensure also that a computational model is provided as part of the software
architectural design.
- IVE.DA.T2.S4: Verify the dependability & safety of the design
Ensure that the software architectural design minimises the number of critical software components
without introducing undesirable software complexity.


Ensure that the software is not contributing to system hazardous events by analysing software failure modes and
their propagation to system level.
Ensure that the software architectural design implements proper features for Fault Detection Isolation And
Recovery (FDIR) in accordance with the technical specification.
Ensure that the implemented FDIR mechanisms are independent of the faults that they are supposed to deal with.
- IVE.DA.T2.S5: Verify the readability of the architectural design
Ensure that the architectural design documentation has a clear and consistent structure.
Ensure that the documentation is intelligible for the target readers and that all the required elements for its
understanding are provided (e.g. definitions of acronyms, terms and conventions used).
- IVE.DA.T2.S6: Verify the timing and sizing budgets of the software
Ensure that software architectural design implements proper allocation of timing and sizing budgets (e.g. memory
usage, CPU utilization, etc.) by reviewing the analysis performed by the software developer.
For real-time software verify developer’s schedulability analysis.
- IVE.DA.T2.S7: Identify test areas and test cases for independent Validation
Identify areas and items that cannot be sufficiently analysed by means of Independent Verification alone and therefore require the execution of validation tests. Annotate this information (test areas/items, test cases, etc.) as a contribution to the Independent Validation activities.
This subtask shall receive, refine and update the contribution to Independent Validation from the TS Analysis.
- ISVV Level 2 only:
- IVE.DA.T2.S8: Verify that the software architectural design components are testable
Ensure that there are objective acceptance criteria for validating each software architectural design component.
Ensure that each software design component/unit is testable against those acceptance criteria.
- IVE.DA.T2.S9: Verify the feasibility of producing a Detailed Design
Ensure that it is possible to derive the detailed design and the code from the defined software architectural design.
- IVE.DA.T2.S10: Verify architectural design conformance with applicable standards
Ensure that the design is compliant with applicable standards, references, regulations, policies, physical laws, and business rules.
Outputs:
- Software Architectural Design Independent Verification Report
- Contribution to Independent Validation (updated)
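
For illustration only, the sketch below shows one possible automation of the producer/consumer cross-check described in IVE.DA.T2.S1, assuming the component interfaces have been captured in a machine-readable form. The component names, data items and the Python representation are invented for the example; they are not part of this guide's prescribed methods.

```python
# Illustrative sketch (invented component names and data items).
components = {
    "AOCS_Manager": {"produces": {"attitude_estimate"}, "consumes": {"gyro_samples"}},
    "Gyro_Driver":  {"produces": {"gyro_samples"},      "consumes": set()},
    "TM_Formatter": {"produces": set(),                 "consumes": {"attitude_estimate"}},
}

def check_interface_consistency(components):
    """Report consumed items nobody produces and produced items nobody consumes."""
    findings = []
    for name, iface in components.items():
        produced_elsewhere = set().union(
            *(c["produces"] for n, c in components.items() if n != name))
        consumed_elsewhere = set().union(
            *(c["consumes"] for n, c in components.items() if n != name))
        for item in iface["consumes"] - produced_elsewhere:
            findings.append(f"{name}: input '{item}' has no producer")
        for item in iface["produces"] - consumed_elsewhere:
            findings.append(f"{name}: output '{item}' has no consumer")
    return findings

# An empty report means every input has a producer and every output a consumer.
for finding in check_interface_consistency(components):
    print(finding)
```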


7.5.3 Detailed Design Traceability Verification


TASK DESCRIPTION
Title: Detailed Design Traceability Verification Task ID: IVE.DA.T3
Activity: IVE.DA - Design Analysis
Start event: DDR – Detailed Design Review
End event: DAR – Design Analysis Review
Responsible: ISVV Supplier
Objectives:
- Analyse relationships between detailed design elements (e.g. software units) and technical specification including
interface control documents.
Inputs:
- From ISVV Customer:
- Software Requirements Specification [TS; DDR]
- Interface Control Documents [ICD(TS); DDR]
- Software Detailed Design [DDF; DDR]
- Detailed Design Traceability Matrices [DJF; DDR⁷]
Sub Tasks (per ISVV Level):
- ISVV Level 1 only:
- IVE.DA.T3.S1: Verify the traceability matrix with the Technical Specification
By reviewing the traceability matrices produced by the software developer:
Ensure that all software requirements allocated to a software component are traceable to its software units and that the functionality described in the requirements is correctly implemented by the corresponding software unit (forward traceability).
Ensure that all software units have allocated requirements and that each software unit is not implementing more functionalities than the ones described in the requirements allocated to it (backward traceability).
For each requirement traced to more than one software unit ensure that implementation of functionalities is not repeated.
Ensure that the relationships between the software units and the software requirements are specified in a uniform manner (in terms of level of detail and format). An illustrative sketch of a forward/backward traceability check is given after this task description.
- IVE.DA.T3.S2: Verify the traceability matrix with the Interface Control Documents
By reviewing the traceability matrices produced by the software developer:
Ensure that the interface detailed design (with other software units, hardware, the user, etc.) is consistent with the
applicable Interface Control Documents.
Ensure that interfaces are designed in a uniform way.
Ensure that each interface provides all the required information from the underlying component.
- IVE.DA.T3.S3: Verify the traceability matrix with the Architectural Design⁸
By reviewing the traceability matrices produced by the software developer:
Ensure that the static and dynamic design is consistent with the static and dynamic architecture defined in the software architectural design.
Ensure that the software units correctly implement the internal interfaces described in the software architectural design.
Ensure that the software design method used for the detailed design is consistent with the one used for the software architectural design.
For real-time systems ensure the consistency of the detailed design with the computational model defined in the software architectural design (e.g. the SW units implementing a given component are consistent with the computational model of that component).
- ISVV Level 2 only:

⁷ According to ECSS-E-40 the Detailed Design Traceability Matrix is only due at CDR. However, if it is available at the beginning of Detailed Design Traceability Verification it can be considered as an input.
⁸ The verification of the Detailed Design traceability to the Architectural Design is only considered under the condition that traceability matrices are available at the beginning of Detailed Design Traceability Verification.


- IVE.DA.T3.S4: Independently construct the traceability matrix with the Technical Specification
By independently constructing traceability matrices address the same topics as described in IVE.DA.T3.S1.
- IVE.DA.T3.S5: Independently construct the traceability matrix with the ICDs
By independently constructing traceability matrices address the same topics as described in IVE.DA.T3.S2.
- IVE.DA.T3.S6: Independently construct the traceability matrix with the Architectural Design
By independently constructing traceability matrices address the same topics as described in IVE.DA.T3.S3.
Outputs:
- Traceability Between TS and SW Detailed Design
- Traceability Between ICD and SW Detailed Design
- Traceability Between SW Architectural Design and SW Detailed Design
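
As an illustrative aid for IVE.DA.T3.S1 (and for the independent construction in IVE.DA.T3.S4), the following minimal sketch shows forward and backward traceability checks over a requirements-to-units matrix. The requirement identifiers, unit names and the matrix itself are invented assumptions for the example.

```python
# Illustrative sketch: forward/backward traceability checks (invented data).
trace = {                       # requirement ID -> software units implementing it
    "SRS-0010": ["tc_decoder.c"],
    "SRS-0020": ["tm_encoder.c", "tm_buffer.c"],
    "SRS-0030": [],             # deliberately untraced, to show a finding
}
all_units = {"tc_decoder.c", "tm_encoder.c", "tm_buffer.c", "legacy_utils.c"}

# Forward traceability: every requirement must map to at least one unit.
for req, units in sorted(trace.items()):
    if not units:
        print(f"FINDING: {req} is not traced to any software unit")

# Backward traceability: every unit must be covered by at least one requirement.
covered = {u for units in trace.values() for u in units}
for unit in sorted(all_units - covered):
    print(f"FINDING: {unit} has no allocated requirement (possible extra functionality)")
```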


7.5.4 Detailed Design Verification


TASK DESCRIPTION
Title: Detailed Design Verification Task ID: IVE.DA.T4
Activity: IVE.DA - Design Analysis
Start event: DDR – Detailed Design Review
End event: DAR – Design Analysis Review
Responsible: ISVV Supplier
Objectives:
- Evaluate the software detailed design for internal consistency, correctness, completeness, accuracy, testability, feasibility
of coding, readability and consistency with software detailed design of other software items.
Inputs:
- From ISVV Customer:
- Software Requirements Specification [TS; DDR]
- Interface Control Documents [ICD(TS); DDR]
- Software Architectural Design [DDF; DDR]
- Software Detailed Design [DDF; DDR]
- Software Dependability and Safety Analysis Reports [PAF; DDR]
- Criticality Analysis (refer to [DNV ISVV-SN4.1:2005])
- From ISVV Supplier:
- Traceability Between TS and SW Architectural Design
- Traceability Between ICD and SW Architectural Design
- Traceability Between TS and SW Detailed Design
- Traceability Between ICD and SW Detailed Design
- Traceability Between SW Architectural Design and SW Detailed Design
- Technical Specification Independent Verification Report
- Software Architectural Design Independent Verification Report
- Safety and dependability analysis from Technical Specification Analysis
- Safety and dependability analysis from Architectural Design Analysis
- Contribution to Independent Validation (from Architectural Design Verification)
Sub Tasks (per ISVV Level):
- ISVV Level 1 and Level 2:
- IVE.DA.T4.S1: Verify interface consistency between different SW units
Ensure that software component internal interfaces (e.g. interfaces between software units) are consistent. Consider both data and control flows. Include verification of data format, accuracy, and timing/performance.
Ensure that all inputs of one software unit are produced by some other unit and that all outputs of a unit are consumed by some other unit.
- IVE.DA.T4.S2: Verify detailed design correctness with respect to Technical Specification
Ensure that the static architecture (e.g. software decomposition into software elements such as packages, and classes or modules) and the dynamic architecture (e.g. specification of the software active objects such as threads / tasks and processes) described in the software detailed design adequately implement the software requirements.
Ensure that detailed design complexity and modularity are in accordance with quality requirements.
Ensure that the software detailed design implements proper sequence of events, inputs, outputs, interfaces logic flow, allocation of timing and sizing budgets, and error handling.
Ensure that the detailed design is compatible with the target platform (i.e. ensure that platform-dependent issues are compatible with the target hardware).
- IVE.DA.T4.S3: Verify detailed design completeness
Ensure that the software detailed design description includes (according to [ECSS-E40B:2003]): decomposition of the software into software units, update of the software item internal interfaces design, and the physical model of the software items described during the software architectural design.


For real-time software ensure also that a computational model is provided as part of the software architectural
design.
- IVE.DA.T4.S4: Verify the dependability & safety of the design
Ensure that the software detailed design minimises the number of critical software units without introducing
undesirable software complexity.
Ensure that the software is not contributing to system hazardous events by analysing software failure modes and
their propagation to system level.
Ensure that the software detailed design implements proper features for Fault Detection, Isolation and Recovery (FDIR) in accordance with the technical specification.
Ensure that the implemented FDIR mechanisms are independent of the faults that they are supposed to deal with.
Ensure that the software correctly handles hardware faults and that the implemented software logic is not harming
the hardware in any way.
Ensure that the detailed design includes proper verification of inputs and consistency checking.
Ensure that software detailed design implements proper error handling mechanisms.
- IVE.DA.T4.S5: Verify the readability of the detailed design
Ensure that the detailed design documentation has a clear and consistent structure.
Ensure that the documentation is intelligible for the target readers and that all the required elements for its understanding are provided (e.g. acronyms, terms, conventions used, etc.).
- IVE.DA.T4.S6: Verify the timing and sizing budgets of the software
Ensure that the software detailed design implements proper allocation of timing and sizing budgets (e.g. memory usage, CPU utilization, etc.) by reviewing the analysis performed by the software developer.
For real-time software verify the developer's schedulability analysis (an illustrative sketch is given after this task description).
- IVE.DA.T4.S7: Identify test areas and test cases for independent Validation
Identify areas and items that cannot be sufficiently analysed by means of Independent Verification alone and therefore require the execution of validation tests. Annotate this information (test areas/items, test cases, etc.) as a contribution to the Independent Validation activities.
This subtask shall receive, refine and update the contribution to Independent Validation from the Architectural
Design Verification (IVE.DA.T2.S7).
- ISVV Level 2 only:
- IVE.DA.T4.S8: Verify that the software units are testable
Ensure that every single software unit is testable and that a clear and objective criterion for validating it exists.
- IVE.DA.T4.S9: Verify the feasibility of coding
Ensure that it is possible to implement the defined software detailed design, i.e. to translate it into source code. The detailed design shall be such that it is possible to fully implement the source code without the need for the Technical Specification (i.e. it shall contain all the necessary information).
- IVE.DA.T4.S10: Verify detailed design conformance with applicable standards
Ensure that the detailed design is compliant with applicable standards, references, regulations, policies, physical laws, and business rules.
Outputs:
- Software Detailed Design Independent Verification Report
- Contribution to Independent Validation (updated)
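
In support of the timing budget verification of IVE.DA.T4.S6 (and IVE.DA.T2.S6), the following minimal sketch applies the classic Liu & Layland rate-monotonic utilisation bound as a first, sufficient-only cross-check of the developer's budgets. Task names, periods and execution-time budgets are invented; exceeding the bound does not prove unschedulability, it only signals that an exact analysis is required.

```python
# Illustrative sketch (invented task set; times in milliseconds).
tasks = [  # (name, period T, budgeted worst-case execution time C)
    ("attitude_control", 10.0, 2.5),
    ("thermal_monitor", 100.0, 8.0),
    ("housekeeping_tm", 250.0, 20.0),
]

utilisation = sum(c / t for _, t, c in tasks)
n = len(tasks)
rm_bound = n * (2 ** (1 / n) - 1)  # Liu & Layland bound for n periodic tasks

print(f"utilisation {utilisation:.3f} vs RM bound {rm_bound:.3f} for {n} tasks")
if utilisation <= rm_bound:
    print("PASS: schedulable under rate-monotonic priorities (sufficient test)")
else:
    print("INCONCLUSIVE: bound exceeded, exact response-time analysis needed")
```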


7.5.5 Software User Manual Analysis


Although Software User Manual Analysis is described in a single verification task, it can be executed in two iterations. The first iteration will start at DDR⁹ and end at DAR. The second iteration will start at CDR and end at CAR.

TASK DESCRIPTION
Title: Software User Manual Verification Task ID: IVE.DA.T5
Activity: IVE.DA - Design Analysis
Start event: DDR – Detailed Design Review
End event: DAR – Design Analysis Review
Responsible: ISVV Supplier
Objectives:
- Ensure the User Manual readability, completeness and correctness.
Inputs:
- From ISVV Customer:
- Software User Manual [DDF; DDR]
- Software Technical Specification [TS; DDR]
- Software Architectural Design [DDF; DDR]
- Software Detailed Design [DDF; DDR]
- Software Item (application) [DDF; CDR]
Sub Tasks (per ISVV Level):
- ISVV Level 2 only:
- IVE.DA.T5.S1: Verify the readability of the User Manual
Ensure that the user manual has a clear and consistent structure.
Ensure that the user manual is intelligible for the target software users and that all the required elements for its understanding are provided (e.g. acronyms, terms, conventions used, etc.).
- IVE.DA.T5.S2: Verify the completeness of the User Manual
Ensure that the User Manual describes all the functionalities implemented by the software. Check if all the
necessary information for performing the required operations is provided.
- IVE.DA.T5.S3: Verify the correctness of the User Manual
Ensure that the information provided in the User Manual is consistent with the software implementation i.e. the
software behaves as described.
Outputs:
- Software User Manual Independent Verification Report

⁹ Please note that one of the prerequisites for the verification tasks is that the documentation should be mature. Usually this is not the case for the Software User Manual at DDR.


7.6 Methods
The table below identifies which methods can be used for the work to be performed within the
scope of each subtask. For each method it is stated whether it covers the purposes of the
subtask completely or partially.

Subtask / Method / Method Coverage / ISVV Level / Comment
IVE.DA.T1 - Architectural Design Traceability Verification
IVE.DA.T1.S1 Inspection Complete 1 None.
IVE.DA.T1.S2 Inspection Complete 1 None.
IVE.DA.T1.S3 Traceability Analysis Complete 2 None.
IVE.DA.T1.S4 Traceability Analysis Complete 2 None.
IVE.DA.T2 - Architectural Design Verification
IVE.DA.T2.S1 Walkthrough Partly 1 None.
Inspection Partly 2 None.
Modelling (UML: component, activity, communication, interaction, sequence, timing) Partly 1&2 None.
IVE.DA.T2.S2 Walkthrough Partly 1 None.
Inspection Partly 2 None.
Modelling (UML: component, composite, deployment, package, activity, sequence, state machine) Partly 1&2 None.
Simulation Partly 2 Simulation may be used to validate high level algorithms.
IVE.DA.T2.S3 Walkthrough Partly 1 Results depend on the completeness of the checklist.
Inspection Complete 2 None.
IVE.DA.T2.S4 Inspection Partly 1&2 None.
HSIA Partly 2 HSIA is used to evaluate HW/SW interfaces.
SFMECA Partly 2 None.
SFTA Partly 2 None.
SCCFA/SCMFA Partly 2 This method aims at testing the independence of the FDIR mechanisms from the faults they are supposed to handle.
Formal Methods Partly 2 None.
IVE.DA.T2.S5 Walkthrough Partly 1 None.
Inspection Complete 2 None.
IVE.DA.T2.S6 Inspection Partly 1&2 Sizing budgets need to be manually inspected.
Schedulability analysis Partly 1&2 This applies for timing budgets only.
IVE.DA.T2.S7 N/A N/A 1&2 No particular method applies to this subtask, which consists of gathering the inputs to Independent Validation identified in all the other subtasks.
IVE.DA.T2.S8 Walkthrough Complete 2 None.
IVE.DA.T2.S9 Walkthrough Complete 2 None.
IVE.DA.T2.S10 Inspection Complete 2 None.
IVE.DA.T3 - Detailed Design Traceability Verification
IVE.DA.T3.S1 Inspection Complete 1 None.

IVE.DA.T3.S2 Inspection Complete 1 None.
IVE.DA.T3.S3 Inspection Complete 1 None.
IVE.DA.T3.S4 Traceability Analysis Complete 2 None.
IVE.DA.T3.S5 Traceability Analysis Complete 2 None.
IVE.DA.T3.S6 Traceability Analysis Complete 2 None.
IVE.DA.T4 - Detailed Design Verification
IVE.DA.T4.S1 Walkthrough Partly 1 None.
Inspection Partly 2 None.
Modelling (UML: class, object, activity, communication, interaction, sequence, timing) Partly 1&2 None.
IVE.DA.T4.S2 Walkthrough Partly 1 None.
Inspection Partly 2 None.
Modelling (UML: class, component, package, activity, interaction, sequence, state machine) Partly 1&2 None.
Simulation Partly 2 Simulation can be used to validate specific algorithms (e.g.
communication protocols).
IVE.DA.T4.S3 Walkthrough Partly 1 None.
Inspection Complete 2 None.
IVE.DA.T4.S4 Inspection Partly 1&2 None.
HSIA Partly 2 HSIA is used to evaluate HW/SW interfaces.
SFMEA Partly 2 None.
SFTA Partly 2 None.
SCCFA/SCMFA Partly 2 This method aims at testing the independence of the FDIR mechanisms from the faults they are supposed to handle.
Formal Methods Partly 2 None.
IVE.DA.T4.S5 Walkthrough Partly 1 None.
Inspection Complete 2 None.
IVE.DA.T4.S6 Inspection Partly 1&2 Sizing budgets need to be manually inspected.
Schedulability analysis Partly 1&2 This applies for timing budgets only.
IVE.DA.T4.S7 N/A N/A 1&2 No particular method applies to this subtask, which consists of gathering the inputs to Independent Validation identified in all the other subtasks.
IVE.DA.T4.S8 Walkthrough Complete 2 None.
IVE.DA.T4.S9 Walkthrough Complete 2 None.
IVE.DA.T4.S10 Inspection Complete 2 None.
IVE.DA.T5 - Software User Manual Verification
IVE.DA.T5.S1 Inspection Complete 2 None.
Walkthrough Partly 2 None.
IVE.DA.T5.S2 Inspection Complete 2 None.
Walkthrough Partly 2 None.
IVE.DA.T5.S3 Inspection Complete 2 None.
Walkthrough Partly 2 None.


8.0 Code Analysis


8.1 Activity Overview
Code Analysis is one of the verification activities of the ISVV process (Figure 17). Code Analysis is in general performed after the Design Analysis, and after the Criticality Analysis has been performed at the software unit level.

Figure 17: Code Analysis in context (the ISVV process activities: MAN.PM ISVV Process Management, MAN.CR Criticality Analysis, IVE.TA Technical Specification Analysis, IVE.DA Design Analysis, IVE.CA Code Analysis, IVA.Validation)

The Code Analysis consists of the evaluation of the source code of each selected software product, focusing on aspects such as:
• reliability, availability and safety, ensuring that sufficient and effective fault detection, isolation and recovery mechanisms are included,
• error handling mechanisms,
• initialisation / termination of software components,
• interfaces between software components and between software and hardware components,
• thread / process synchronisation and resource sharing (an illustrative sketch is given after this list), and
• budget analysis, including schedulability analysis.
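
As an illustration of the synchronisation aspect mentioned above, the sketch below shows one review criterion that rules out the classic two-lock deadlock: all tasks must acquire shared locks in a single agreed global order. The lock names and the enforcing helper are invented for the example and do not represent any mandated mechanism.

```python
# Illustrative sketch: enforcing a global lock-acquisition order (invented names).
import threading

LOCK_ORDER = ["bus_lock", "buffer_lock", "log_lock"]   # the agreed global order
locks = {name: threading.Lock() for name in LOCK_ORDER}
held = threading.local()

def acquire_in_order(*names):
    """Acquire the named locks, asserting the global ordering is respected."""
    indices = [LOCK_ORDER.index(n) for n in names]
    assert indices == sorted(indices), f"lock order violation: {names}"
    for n in names:
        locks[n].acquire()
    held.names = list(names)

def release_all():
    for n in reversed(getattr(held, "names", [])):
        locks[n].release()
    held.names = []

acquire_in_order("bus_lock", "buffer_lock")   # OK: respects the global order
release_all()
# acquire_in_order("buffer_lock", "bus_lock") would trip the assertion,
# flagging a potential deadlock before it can occur at run time.
```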


The Code Analysis activity comprises the analysis of the application source code and of the test procedures and test data (Figure 18).

Figure 18: Code Analysis¹⁰ (inputs: Technical Specification, SW Architectural Design, SW Detailed Design, Interface Control Documents, Source Code, Criticality Analysis, SW Integration Test Plan and SW Unit Test Plan; tasks: Source Code Traceability Verification, Source Code Verification, Integration Test Procedures and Test Data Verification, Unit Test Procedures and Test Data Verification; outputs: the corresponding traceability matrices, independent verification reports and contributions to IVA)

8.1.1 Software Code Analysis


The software source code is the ultimate product of a software project. Within the software
development process it is the expression of the software design. Source code independent
verification is one of the most demanding undertakings of an ISVV project. The software
source code independent verification encompasses two major tasks:
• First, one shall verify the source code traceability against the different elements it derives
from, i.e. the Detailed Design, the Architectural Design, the Technical Specification and the
applicable Interface Control Documents. This task aims at asserting the source code
external consistency. Note that traceability to source code level should only be done if the detailed design does not specify the software down to the lowest level.
• Second, one shall verify the source code itself in order to check whether it presents consistency, correctness and accuracy. It shall also be verified whether dependability and safety issues have been correctly addressed, whether the source code is readable and maintainable, and whether it can effectively be tested. In the case of real-time software one shall also verify the timing and sizing budgets.

Figure 19 illustrates the verification tasks to be performed as part of the code analysis.
¹⁰ Note that the figure shows only the most important inputs and outputs.


Figure 19: Software Source Code Independent Verification (traceability of the SW units source code (CDR) against the Technical Specification (DDR), SW Architectural Design (CDR), Detailed Design Document (CDR) and Interface Control Documents (CDR); verification of interface consistency between SW units, source code correctness with respect to technical specification, architectural design and detailed design, readability, maintainability and conformance with applicable standards, dependability & safety, accuracy, testability, and timing and sizing budgets)

8.1.2 Integration and Unit Test Procedures and Data Analysis


The integration/unit test plan includes the integration/unit test procedures and test data. It is necessary to verify the consistency of the test procedures and test data against the design (architectural and detailed, respectively), the technical specification and the ICDs. Furthermore, it is necessary to verify their correctness, completeness and feasibility.

Figure 20: Integration/Unit Test Procedures and Test Data Verification (consistency of the Integration/Unit Test Plan (DDR) with the Technical Specification (DDR), SW Architectural/Detailed Design (DDR) and Interface Control Documents; verification of integration test procedures correctness, completeness and feasibility)

8.2 Activity Inputs and Prerequisites


The following work products are input for the Code Analysis activity:
• From ISVV Customer:


− Requirements Baseline
− Technical Specification
− Interface Control Documents
− Software Architectural Design
− Software Detailed Design
− Software Units Source Code
− Software Integration Test Plan
− Software Unit Test Plan
− Software User Manual
− Software Dependability and Safety Analysis Reports
− Software Code Traceability Matrices
− Schedulability Analysis
− Technical Budgets
− Criticality Analysis

• From ISVV Supplier:


− Project Management Plan
− Technical Specification Independent Verification Report
− Software Architectural Design Independent Verification Report
− Software Detailed Design Independent Verification Report
− Traceability Between TS and SW Architectural Design
− Traceability Between ICD and SW Architectural Design
− Traceability Between TS and SW Detailed Design
− Traceability Between SW Architectural Design and SW Detailed Design
− Traceability Between ICD and SW Detailed Design
− Safety and Dependability Analysis from Design Analysis
− Contribution to Independent Validation (from Design Analysis)

The prerequisite for starting the Code Analysis activity is the availability of the listed inputs. Moreover, the listed inputs shall present a satisfactory maturity level.
8.3 Activity Outputs
The following work products are produced in the scope of the Code Analysis activity:
• Software Source Code Independent Verification Report
• Integration Test Procedures and Data Independent Verification Report
• Unit Test Procedures and Data Independent Verification Report
• Traceability Between TS and Source Code
• Traceability Between ICD and Source Code
• Traceability Between Software Architectural Design and Source Code
• Traceability Between Software Detailed Design and Source Code
• Contribution to Independent Validation (updated with Code Analysis findings)

Verification reports include at least an overall analysis of the work products analysed, findings, a list of open issues to probe further in subsequent analyses, suggested modifications (if any), and inputs for independent validation test case specification. Traceability matrices might be provided as annexes of verification reports or as separate documents.

8.4 Activity Management


8.4.1 Initiating and Terminating Events
The Code Analysis activities may be initiated as soon as mature Source Code and/or Test
Procedure/Data are available. In general this coincides with the CDR.


Although several iterations of the Code Analysis activity may be performed, thus potentially extending it until the end of the development project, the Code Analysis activity ends with the Code Analysis Review (CAR) (as defined in section 3.0 above), which in general takes place before the QR.

8.4.2 Completion Criteria


Code Analysis is complete after the Source Code, Unit and Integration Test Procedures and Test Data have been verified in accordance with tasks IVE.CA.T1 to IVE.CA.T4 (refer to section 8.5).

8.4.3 Relations to other Activities


This section identifies the relations between this activity and the remaining ISVV activities.

The tailoring of the Code Analysis activity is performed as part of the Criticality Analysis activity. Criticality Analysis may also provide useful inputs to the Code Analysis activity, namely to the subtask "Verify the dependability & safety of the source code" (refer to section 8.5).

Strong relations exist between the Technical Specification Analysis, the Software Design Analysis and the Code Analysis. The outputs of the Technical Specification Analysis and the Design Analysis are applicable inputs to the Code Analysis. In addition, the Technical Specification Analysis and the Design Analysis may raise issues to be closed during Code Analysis.

Code Analysis is also likely to provide inputs to independent validation test cases specification.


8.5 Tasks Description


8.5.1 Source Code Traceability Verification
Note that traceability to source code level should only be done if detailed design does not
specify to the lowest level.

TASK DESCRIPTION
Title: Source Code Traceability Verification Task ID: IVE.CA.T1
Activity: IVE.CA Code Analysis
Start event: CDR – Critical Design Review
End event: CAR – Code Analysis Review
Responsible: ISVV Supplier
Objectives:
- Verify source code external consistency with Technical Specification, Interface Control Documents, Architectural Design
and Detailed Design.
Inputs:
- From ISVV Customer:
- Software Requirements Specification [TS; DDR]
- Interface Control Documents [ICD; CDR]
- Software Architectural Design [DDF; CDR]
- Software Detailed Design [DDF; CDR]
- Source Code [DDF; CDR]
- Source Code Traceability Matrices [DJF; CDR]
- From ISVV Supplier:
- Traceability Between TS and SW Architectural Design
- Traceability Between ICD and SW Architectural Design
- Traceability Between SW Architectural Design and SW Detailed Design
- Traceability Between TS and SW Detailed Design
- Traceability Between ICD and SW Detailed Design
Implementation:
- ISVV Level 1 only:
- IVE.CA.T1.S1: Verify the traceability matrix with the Technical Specification
By reviewing the traceability matrices produced by the software developer:
Ensure that all software item requirements are traceable to a software unit (source code) and that the functionality described in the requirement is implemented by the source code unit (forward traceability).
Ensure that all software units (source code) have allocated requirements and that each software unit (source code) is not implementing more functionalities than the ones described in the requirements allocated to it (backward traceability).
For each requirement traced to more than one software unit (source code) ensure that implementation of functionalities is not repeated.
Ensure that the relationships between the software units (source code) and the software requirements are specified in a uniform manner (in terms of level of detail and format).
- IVE.CA.T1.S2: Verify the traceability matrix with the Interface Control Documents
By reviewing the traceability matrices produced by the software developer:
Ensure that the interfaces implementation (with other software units, hardware, the user, etc.) is consistent with the
applicable Interface Control Documents.
Ensure that interfaces are designed in a uniform way.
Ensure that each interface provides all the required information from the underlying component.
- IVE.CA.T1.S3: Verify the traceability matrix with the Architectural Design and Detailed Design
By reviewing the traceability matrices produced by the software developer:


Ensure that the static architecture (e.g. software decomposition into software elements such as packages, and classes or modules) and the dynamic architecture (e.g. specification of the software active objects such as threads / tasks and processes) are implemented according to the design.
Ensure that the software units (source code) correctly implement the internal interfaces described in the software architectural design.
- ISVV Level 2 only:
- IVE.CA.T1.S4: Independently construct the traceability matrix with the Technical Specification
By independently constructing traceability matrices, address the same topics as described in IVE.CA.T1.S1 (an illustrative sketch of extracting such a matrix from requirement tags in the source code is given after this task description).
- IVE.CA.T1.S5: Independently construct the traceability matrix with the ICDs
By independently constructing traceability matrices address the same topics as described in IVE.CA.T1.S2.
- IVE.CA.T1.S6: Independently construct the traceability matrix with the Architectural and Detailed design
By independently constructing traceability matrices address the same topics as described in IVE.CA.T1.S3.
Outputs:
- Traceability Between TS and Source Code
- Traceability Between ICD and Source Code
- Traceability Between Software Architectural Design and Source Code
- Traceability Between Software Detailed Design and Source Code
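
For illustration of the independent construction subtasks (IVE.CA.T1.S4 to S6), the sketch below derives a requirements-to-source traceability matrix by scanning for requirement tags in code comments. The '@req' tagging convention, file names and contents are assumptions made for the example; real projects may use different conventions or none at all.

```python
# Illustrative sketch: building a traceability matrix from '@req' tags (invented).
import re

sources = {   # file name -> content; in practice these would be read from disk
    "tc_decoder.c": "/* @req SRS-0010 */\nvoid decode_tc(void) { /* ... */ }\n",
    "tm_encoder.c": "/* @req SRS-0020 */\nvoid encode_tm(void) { /* ... */ }\n",
    "legacy_utils.c": "void helper(void) { /* no requirement tag */ }\n",
}

TAG = re.compile(r"@req\s+(SRS-\d+)")

matrix = {}                                   # requirement -> list of files
for fname, text in sources.items():
    tags = TAG.findall(text)
    if not tags:
        print(f"FINDING: {fname} carries no requirement tag")
    for req in tags:
        matrix.setdefault(req, []).append(fname)

for req in sorted(matrix):
    print(f"{req} -> {', '.join(matrix[req])}")
```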


8.5.2 Source Code Verification


TASK DESCRIPTION
Title: Source Code Verification Task ID: IVE.CA.T2
Activity: IVE.CA Code Analysis
Start event: CDR – Critical Design Review
End event: CAR – Code Analysis Review
Responsible: ISVV Supplier
Objectives:
- Evaluate the software source code for internal consistency, correctness, completeness, accuracy, testability, feasibility of
operations and maintenance, and readability.
Inputs:
- From ISVV Customer:
- Software Requirements Specification [TS; DDR]
- Interface Control Documents [DDF; CDR]
- Software Architectural Design [DDF; CDR]
- Software Detailed Design [DDF; CDR]
- Software dependability and safety analysis reports [DJF; CDR]
- Schedulability Analysis [DJF; CDR]
- Technical Budgets [DJF; CDR]
- Criticality Analysis (refer to [DNV ISVV-SN4.1:2005])
- From ISVV Supplier:
- Traceability Between TS and SW Architectural Design
- Traceability Between ICD and SW Architectural Design
- Traceability Between TS and SW Detailed Design
- Traceability Between ICD and SW Detailed Design
- Traceability Between SW Architectural Design and SW Detailed Design
- Traceability Between TS and Source Code
- Traceability Between ICD and Source Code
- Traceability Between Software Architectural Design and Source Code
- Traceability Between Software Detailed Design and Source Code
- Technical Specification Independent Verification Report
- Software Architectural Design Independent Verification Report
- Software Detailed Design Independent Verification Report
- Safety and dependability analysis from Design Analysis
- Contribution to Independent Validation (from Design Analysis)
Implementation:
- ISVV Level 1 and 2:
- IVE.CA.T2.S1: Verify interface consistency between different SW units
Ensure that software component internal interfaces (e.g. interfaces between software units) are consistent.
Consider both data and control flows. Include verification of data format, accuracy, and timing/performance.
Ensure that all inputs of one software unit are produced by some other unit and that all outputs of a unit are
consumed by some other unit.
- IVE.CA.T2.S2: Verify source code correctness with respect to technical specification, architectural design and
detailed design
Ensure that the static architecture (e.g. software decomposition into software elements such as packages, and classes or modules) and the dynamic architecture (e.g. specification of the software active objects such as threads / tasks and processes) adequately implement the software design.


Ensure that source code complexity and modularity are in accordance with quality requirements.
Ensure that the software source code implements proper sequence of events, inputs, outputs and interfaces logic
flow.
Ensure that correct use of programming language, libraries, system calls, etc. is being made.
- IVE.CA.T2.S3: Verify the source code readability, maintainability and conformance with the applicable standards.
Ensure that the source code is written in a clear way and that it is properly documented.
Ensure that all source code files adhere to the same coding style and that the applicable coding conventions, if any,
are followed.
Ensure that applicable coding standards, if any, are followed (e.g. Ada RAVEN, MISRA C, etc.).
Ensure that every single source file has a descriptive header and that the file history was recorded there.
Ensure that a description is provided for every single subprogram.
- IVE.CA.T2.S4: Verify the dependability & safety of the source code
Ensure that the software source code minimises the number of critical software units without introducing undesirable software complexity (e.g. critical software units are not sharing resources with non-critical software units, thus increasing their criticality).
Ensure that the software is not contributing to system hazardous events by analysing software failure modes and
their propagation to system level.
Ensure that the software source code implements proper features for Fault Detection, Isolation and Recovery (FDIR) in accordance with the technical specification.
Ensure that the implemented FDIR mechanisms are independent of the faults that they are supposed to deal with.
Ensure that the software correctly handles hardware faults and that the implemented software logic is not harming
the hardware in any way.
Ensure that defensive programming techniques are used.
Ensure that the source code includes proper verification of inputs and consistency checking.
Ensure that all relevant events are reported by the software using the appropriate channels.
Ensure that the source code does not include any hazardous programming language construct or library function (an illustrative scanning sketch is given after this task description).
Ensure that no dead or deactivated code exists. If deactivated code exists ensure that its activation will not lead to a
hazardous condition.
For concurrent systems ensure that no deadlock or race conditions exist.
- IVE.CA.T2.S5: Verify the accuracy of the source code
Ensure that the source code implements the required computational precision (e.g. rounding, truncation, etc.).
Ensure that the granularity of the reported error information is sufficient to trigger the necessary corrective actions.
Ensure that the parameter values and the computation made are conformant with the required units (e.g. meters,
inches, volts, etc.).
- IVE.CA.T2.S6: Identify test areas and test cases for independent Validation
Identify areas and items that cannot be sufficiently analysed by means of Independent Verification alone and therefore require the execution of validation tests. Annotate this information (test areas/items, test cases, etc.) as a contribution to the Independent Validation activities.
This subtask shall receive, refine and update the contribution to Independent Validation from the Design Analysis.
- Level 2 only:
- IVE.CA.T2.S7: Verify that the source code is testable
Ensure that the source code can be easily tested (e.g. check if every single subprogram implements a single
function).
- IVE.CA.T2.S8: Verify the timing and sizing budgets of the software
For real-time software, verify the developer’s computation of the Worst Case Execution Time (WCET) of each task
and compare the obtained values with those provided in the design and/or technical specification.
Verify the developer's schedulability analysis of the implemented application (it should be based on the computed WCETs).
Verify the sizing budgets of the software (e.g. executable image size, stack size, buffers, etc.) and compare them against the design and requirements.
Outputs:
- Software Source Code Independent Verification Report
- Contribution to Independent Validation (updated)
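
As an illustrative aid for the hazardous-construct check in IVE.CA.T2.S4 (and for the "bug pattern identification" method of section 8.6), the sketch below scans source text for a small, invented list of banned C constructs. A real rule set would come from the applicable coding standard, and such scans complement rather than replace inspection.

```python
# Illustrative sketch: pattern scan for hazardous constructs (invented rule set).
import re

BANNED = {
    r"\bstrcpy\s*\(":  "unbounded copy - prefer a length-checked alternative",
    r"\bsprintf\s*\(": "unbounded format - prefer snprintf",
    r"\bgoto\b":       "flag for manual review against the coding standard",
}

def scan(fname, text):
    for lineno, line in enumerate(text.splitlines(), start=1):
        for pattern, why in BANNED.items():
            if re.search(pattern, line):
                print(f"{fname}:{lineno}: {line.strip()!r} -> {why}")

# Invented example input: the first line should be flagged, the second should not.
scan("tc_decoder.c", 'strcpy(buf, input);\nsnprintf(msg, sizeof msg, "%d", n);\n')
```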


8.5.3 Integration Test Procedures and Test Data Verification


TASK DESCRIPTION
Title: Integration Test Procedures and Test Data Verification Task ID: IVE.CA.T3
Activity: IVE.CA Code Analysis
Start event: CDR – Critical Design Review
End event: QR
Responsible: ISVV Supplier
Objectives:
- Evaluate the Integration Test Procedures and Test Data for consistency with the technical specification and the architectural design, and for correctness, completeness and feasibility.
Inputs:
- From ISVV Customer:
- Software Requirements Specification [TS; DDR]
- Interface Control Documents [ICD;CDR]
- Software Architectural Design [DDF; CDR]
- Integration Test Plan [DJF; CDR]
- Traceability of Architectural Design to Integration Tests [DJF; CDR]
- From ISVV Supplier:
- Traceability Between TS and SW Architectural Design
- Traceability Between ICD and SW Architectural Design
- Integration Test Plan Independent Verification Report
Implementation:
- ISVV Level 2 only:
- IVE.CA.T3.S1: Verify consistency with Technical Specification
Ensure that test procedures and data are traceable to software requirements.
- IVE.CA.T3.S2: Verify consistency with Software Architectural Design
Ensure that test procedures and data are traceable to software architectural design components.
- IVE.CA.T3.S3: Verify integration test procedures correctness and completeness
Ensure that the integration test plan is in accordance with the defined test strategy, namely with respect to: the
types of tests to be performed (e.g. functional, boundary, performance, usability, etc.) and test coverage goals
(such as call graph and parameter passing).
Verify that the integration test procedures and data are in accordance with the integration test plan, namely with respect to: the types of tests to be performed (e.g. functional, robustness, performance, usability, etc.) and test coverage goals (such as call graph and parameter passing).
Verify that there is a clear acceptance criterion for every single test case (an illustrative sketch is given after this task description).
Ensure that every single test contains all the necessary information to test the addressed component.
- IVE.CA.T3.S4: Verify integration test procedures feasibility
Verify whether the defined integration tests are possible to implement (consider both the environment needs and
selected tools).
Verify whether the particular approach used in every single test is feasible and conformant with the design of the
component.
Outputs:
- Integration Test Procedure and Test Data Independent Verification Report
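
For illustration of IVE.CA.T3.S3, the following minimal sketch checks that every test case in a machine-readable test plan carries a requirement trace and a clear acceptance criterion. The record format, identifiers and values are invented assumptions for the example.

```python
# Illustrative sketch: completeness checks over test case records (invented data).
test_cases = [
    {"id": "ITC-001", "traces": ["SRS-0010"], "acceptance": "TM frame CRC valid"},
    {"id": "ITC-002", "traces": [],           "acceptance": "response within 50 ms"},
    {"id": "ITC-003", "traces": ["SRS-0020"], "acceptance": ""},
]

for tc in test_cases:
    if not tc["traces"]:
        print(f"FINDING: {tc['id']} is not traced to any requirement")
    if not tc["acceptance"].strip():
        print(f"FINDING: {tc['id']} has no clear acceptance criterion")
```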


8.5.4 Unit Test Procedures and Test Data Verification


TASK DESCRIPTION
Title: Unit Test Procedures and Test Data Verification Task ID: IVE.CA.T4
Activity: IVE.CA Code Analysis
Start event: CDR – Critical Design Review
End event: QR
Responsible: ISVV Supplier
Objectives:
- Evaluate the Unit Test Procedures and Test Data for consistency with the software architectural design and the software detailed design, and for correctness, completeness and feasibility.
Inputs:
- From ISVV Customer:
- Software Requirements Specification [TS; DDR]
- Interface Control Documents [ICD;CDR]
- Software Architectural Design [DDF; CDR]
- Software Detailed Design [DDF;CDR]
- Software Unit Test Plan [DJF; CDR]
- Traceability of Detailed Design to Unit Tests [DJF; CDR]
- From ISVV Supplier:
- Traceability Between TS and SW Architectural Design
- Traceability Between ICD and SW Architectural Design
- Traceability Between TS and SW Detailed Design
- Traceability Between ICD and SW Detailed Design
- Traceability Between SW Architectural Design and SW Detailed Design
- Unit Test Plan Independent Verification Report
Implementation:
- ISVV Level 2 Only:
- IVE.CA.T4.S1: Verify consistency with Software Architectural Design
Ensure that unit test procedures and data are traceable to software architectural design components.
- IVE.CA.T4.S2: Verify consistency with Software Detailed Design
Ensure that unit test procedures and data are traceable to detailed design elements.
- IVE.CA.T4.S3: Verify unit test procedures correctness and completeness
Verify that the unit test procedures and data are in accordance with the unit test plan, namely with respect to: the types of tests to be performed (e.g. functional, robustness, performance, usability, etc.) and test coverage goals (such as statement, decision and branch condition coverage).
Verify if there is a clear acceptance criterion for every single test case.
Ensure that every single test contains all the necessary information to test the addressed unit.
- IVE.CA.T4.S4: Verify unit test procedures feasibility
Verify whether the defined unit tests are possible to implement (consider both the environment needs and selected
tools).
Verify whether the particular approach used in every single test is feasible and conformant with the design of the
unit.
Outputs:
- Unit Test Procedures and Test Data Independent Verification Report


8.6 Methods
The table below identifies which methods can be used for the work to be performed within the
scope of each subtask. For each method it is stated whether it covers the purposes of the
subtask completely or partially.

Subtask / Method / Method Coverage / ISVV Level / Comment
IVE.CA.T1 – Source Code Traceability Verification
IVE.CA.T1.S1 Inspection Complete 1 None.
IVE.CA.T1.S2 Inspection Complete 1 None.
IVE.CA.T1.S3 Inspection Complete 1 None.
IVE.CA.T1.S4 Traceability analysis Complete 2 None.
IVE.CA.T1.S5 Traceability analysis Complete 2 None.
IVE.CA.T1.S6 Traceability analysis Complete 2 None.
IVE.CA.T2 – Source Code Verification
IVE.CA.T2.S1 Walkthrough Partly 1 None.
Inspection Partly 2 None.
Modelling (UML: component, activity, communication, interaction, sequence, timing; HOOD; Control Flow; Data Flow) Partly 1&2 The model may be constructed or extracted from the source code by reverse engineering.
IVE.CA.T2.S2 Walkthrough Partly 1 None.
Inspection Partly 2 None.
Modelling (UML: class, component, package, activity, interaction, sequence, state machine; HOOD) Partly 1&2 By modelling in the code analysis case is meant extraction of the software design from the source code (i.e. reverse engineering).
Bug pattern identification Partly 1&2 None.
IVE.CA.T2.S3 Walkthrough Partly 1 None.
Inspection Complete 2 None.
Software Metrics Analysis Partly 1&2 None.
Coding standard conformance Partly 1&2 None.
IVE.CA.T2.S4 Inspection Partly 1&2 None.
HSIA Partly 2 HSIA is used to evaluate HW/SW interfaces.
SFMECA Partly 2 None.

SFTA Partly 2 None.
SCCFA/SCMFA Partly 2 None.
Software Metrics Analysis Partly 2 None.
Bug pattern identification Partly 1&2 None.
IVE.CA.T2.S5 Walkthrough Partly 1 None.
Numeric Analysis Complete 2 None.
IVE.CA.T2.S6 N/A N/A 1&2 No particular method applies to this subtask, which consists of gathering the inputs to Independent Validation identified in all the other subtasks.
IVE.CA.T2.S7 Walkthrough Partly 2 None.
Inspection Complete 2 None.
IVE.CA.T2.S8 Inspection Partly 2 None.
Worst Case Execution Time computation Partly 2 Only provides the worst case time of isolated tasks.
Schedulability analysis Partly 2 Shall be applied in conjunction with WCET computation.
IVE.CA.T3 – Integration Test Procedures and Test Data Verification
IVE.CA.T3.S1 Traceability analysis Complete 2 None.
IVE.CA.T3.S2 Traceability analysis Complete 2 None.
IVE.CA.T3.S3 Walkthrough Partly 2 None.
Inspection Complete 2 None.
IVE.CA.T3.S4 Walkthrough Partly 2 None.
Inspection Complete 2 None.
IVE.CA.T4 – Unit Test Procedures and Test Data Verification
IVE.CA.T4.S1 Traceability analysis Complete 2 None.
IVE.CA.T4.S2 Traceability analysis Complete 2 None.
IVE.CA.T4.S3 Walkthrough Partly 2 None.
Inspection Complete 2 None.
IVE.CA.T4.S4 Walkthrough Partly 2 None.
Inspection Complete 2 None.


9.0 Independent Validation


9.1 Activity Overview
Independent validation is the validation activity of the ISVV process. It can in general be performed after the independent verification activities.

Figure 21: Independent Validation in context (the ISVV process activities: MAN.PM ISVV Process Management, MAN.CR Criticality Analysis, IVE.TA Technical Specification Analysis, IVE.DA Design Analysis, IVE.CA Code Analysis, IVA.Validation)


However, subactivities can be executed in parallel with the independent verification (if this is carried out), and can start at the earliest at CDR, as shown in Figure 2. This is described in the following subsections.

Independent validation consists of three tasks, as shown in Figure 22.


Figure 22: Independent Software Validation¹¹

Notice that the ‘Construction of Test Procedures’ subtask can start when the SVF is delivered.

The ‘Execution of Test Procedures’ task requires the object code of the software under test, and can be started at QR. At QR the first version of the software is delivered and it is expected that the software has been through development validation.

9.1.1 Identification of Test Cases


The purpose of this task is to identify areas to be subject to independent validation. The task
relies on input from the ISVV Customer and the ISVV Supplier.

The identification of test cases is an iterative process, where new test cases might be identified
during the establishment and execution of previously identified test cases.

The Identification of Test Cases task is divided into the subtasks shown in Figure 23; input to
the task is also illustrated. The needed documents are highlighted in the figure. This first task in
the IVA activity will take 20%-40% of the total effort.

¹¹ Note that the figure shows only the most important inputs and outputs.


Figure 23: Subtasks to "Identification of Test Cases"

The input to this task originates from the ISVV customer and from the ISVV supplier.

9.1.1.1 Evaluate Task Input


It is a prerequisite that the ISVV supplier has a basic knowledge about the software. This might be achieved by performing the preceding IVE activities. An evaluation of the Test Cases and Test Reports delivered by the software supplier might also be useful at this stage of the IVA activity¹². A software user manual (if existing) might provide a valuable overview of the system.
9.1.1.2 Perform Analysis
The identification of test cases takes into account that the validation performed by the software
supplier has demonstrated that the user and software requirements have been satisfied.

The analysis will reuse as much as possible from the preceding IVE activity. If none or only some of the IVE activities have been performed, it may be necessary to include elements of the verification analysis in the preparation of the IVA activity.

Table 5 shows when analyses are performed by the IVA activity dependent on the existence of
independent verification results and ISVV level.

¹² If the ISVV supplier uses the test cases and test reports from the development validation to identify missing test cases, the ISVV supplier must be careful not to adopt the developer's way of thinking.


ISVV Level   IVE         IVA
1            None        Analysis based on checklists
1            Performed   Reuse IVE results
2            None        Dedicated analysis performed
2            Performed   Additional analysis pointed at in the IVE analysis is performed

Table 5: Dependency between ISVV level, input and analysis

A full independent validation (ISVV Level 2) must always rely on independent analysis, which shall preferably be performed within the IVA activity to ensure that the analysis focus is on the validation.

The most important analyses for identification of test cases are:

• Worst Case Analysis
Investigate combinations of worst case situations, e.g. several inputs at the boundary at the same time.
• Worst Case Load Analysis
If scheduling is a critical item a worst case load analysis must be performed. The analysis looks into e.g. blocking time, execution time and response time (an illustrative sketch follows this list).
• Requirement stretching: what happens if data outside the boundaries are given?
• FDIR analysis: inspect "Fault Detection, Isolation and Recovery" requirements.
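
As an illustration of worst case load analysis, the sketch below performs the classic response-time iteration R = C + B + sum over higher-priority tasks of ceil(R/Tj) * Cj, covering blocking time, execution time and response time as mentioned above. All task parameters are invented examples, not data from any real mission.

```python
# Illustrative sketch: fixed-priority response-time analysis (invented task set).
import math

tasks = [   # sorted by priority, highest first: (name, period T, WCET C, blocking B), ms
    ("attitude_control", 10.0, 2.5, 0.5),
    ("thermal_monitor", 100.0, 8.0, 0.5),
    ("housekeeping_tm", 250.0, 20.0, 0.0),
]

for i, (name, T, C, B) in enumerate(tasks):
    R = C + B
    while True:   # iterate until the response time converges
        interference = sum(math.ceil(R / Tj) * Cj for _, Tj, Cj, _ in tasks[:i])
        R_next = C + B + interference
        if R_next == R:
            break
        R = R_next
    verdict = "meets" if R <= T else "MISSES"
    print(f"{name}: worst-case response time {R:.1f} ms {verdict} its {T:.0f} ms deadline")
```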
9.1.1.3 Writing the Independent Validation Test Plan
The Independent Validation Test Plan (IVTP) describes in a programming language neutral
way each identified test. The test plan contains:

• The basic understanding of the software


• Detailed understanding of the software behavior in areas where independent tests are
foreseen.
• Description of identified Test Cases
Each Test Case consists of the following sections:
• Test Overview
• Purpose of the Test Case
• How was the Test Case identified
• Test Environment
• Input Data
• Output Data
• Starting Conditions
• Detailed Test Steps

Each step will be described in a table with the following contents:

Step No. | Step Description | Expected Result | Comment | Pass/Fail
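
Purely as an illustration, a test case and its step table could be captured in a machine-readable form such as the one sketched below, which the later ‘Construction of Test Procedures’ task can translate into SVF scripts. All identifiers and field values are invented for the example.

```python
# Illustrative sketch: a machine-readable IVTP test case record (invented values).
test_case = {
    "id": "IVTC-042",
    "purpose": "Stretch telecommand length beyond the specified maximum",
    "identified_by": "Requirement stretching analysis",
    "environment": "SVF with nominal on-board software image",
    "input_data": {"tc_length": 1025},          # one byte above the boundary
    "output_data": "TC rejected, event report generated",
    "starting_conditions": "Software in NOMINAL mode",
    "steps": [
        # (step no., description, expected result)
        (1, "Send oversized telecommand", "TC rejected with length error"),
        (2, "Read event log", "One 'TC length' event, no side effects"),
    ],
}
```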


At the end of the task, it is important that the ISVV customer reviews and accepts the test plan.
This will make it possible for the ISVV customer and ISVV supplier to discuss the intended
behavior of the software and focus on essential areas.
9.1.2 Construction of Test Procedures
Test procedures are the implementation of the test cases, i.e., test cases expressed in the test
language as provided by the software validation facility.

The Construction of Test Procedures task is divided into three subtasks. Figure 24 shows the
subtasks and input to each of the subtasks.

Figure 24: Subtasks to "Construction of Test Procedures"


9.1.2.1 Achieve Knowledge about the Software Validation Facility
A prerequisite for starting the SVF familiarisation is the availability of the SVF user guide. Requirements for an SVF are described in section 16.0.
9.1.2.2 Implementation of Test Case into Test Procedures
The test procedures shall be automated to the extent possible, i.e. the need for user interaction during execution shall be minimised. The test result analysis should also be done without user intervention. The intention is to allow for automatic regression testing using the IVA test procedures (a minimal harness sketch is given below).

The test procedures are part of the Independent Validation Test Report along with the test
results.
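
The following minimal sketch illustrates such an automated, non-interactive procedure with recorded pass/fail verdicts, suitable for regression runs. The send_tc and read_event calls are invented placeholders, not a real SVF interface, and the test logic corresponds to the hypothetical test case IVTC-042 sketched earlier.

```python
# Illustrative sketch: automated test procedure with pass/fail logging (invented).

def send_tc(length):                    # placeholder for a real SVF command
    return "REJECTED" if length > 1024 else "ACCEPTED"

def read_event():                       # placeholder for a real SVF query
    return "TC_LENGTH_ERROR"

def procedure_ivtc_042(log):
    """Automated version of the hypothetical test case IVTC-042."""
    ok = send_tc(1025) == "REJECTED"
    log.append(("IVTC-042 step 1", "PASS" if ok else "FAIL"))
    ok = read_event() == "TC_LENGTH_ERROR"
    log.append(("IVTC-042 step 2", "PASS" if ok else "FAIL"))

log = []
procedure_ivtc_042(log)
for step, verdict in log:               # the log feeds the test report
    print(step, verdict)
```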
9.1.2.3 Updating the Independent Validation Test Plan
When implementing the test cases into test procedures new test cases might appear; these test cases must be added to the independent validation test plan to be used as documentation for the test execution.
9.1.3 Execution of Test Procedures
The Execution of Test Procedures task is divided into three subtasks as shown in Figure 25.


Figure 25: Subtasks to "Execution of Test Procedures"


9.1.3.1 Execute the Test Procedures
Information about how to perform the tests goes into the Test Report. It is important to notice
that this also might identify additional test cases resulting in a repetition of ‘Construction of Test
Procedures’.
9.1.3.2 Investigation of Failed Tests
Failed tests must be analyzed to ensure that the failure is due to problems in the software under test, and not to shortcomings in the software validation facility or errors in the test procedures.

Investigating the failed tests might lead to additional or modified test cases. If new test cases are identified they must be added to the IVTP, implemented into test procedures and then executed.
9.1.3.3 Produce Test Reports
This task will result in two test reports; the recommended contents are:
• Independent Validation Test Report
  • Description of how to execute the tests
  • Description of observations and problems
  • Test procedures (scripts)
  • Test results (log files) including pass/fail status
• Independent Validation Findings
  • Summarised findings during IVA
  • List of failed tests


9.2 Activity Inputs and Prerequisites


9.2.1 Activity Inputs
The following work products are input for the independent validation activity:
• From the ISVV Customer
  − Documents
    − Requirements Baseline
    − Technical Specification
    − Software Requirements
    − Interface control documents
    − Architectural Design
    − Detailed Design
    − Software User Manual
    − ISVV Criticality Analysis
    − Safety and Dependability Analysis Reports (SFMECA and SFTA reports)
    − Test Procedures and Test Report
  − Software
    − Object code for the software product
    − Source code for the software product
• From the ISVV Supplier
  − Documents (if any)
    − Independent Technical Specification Analysis Result
    − Independent Design Analysis Result
    − Independent Code Analysis Result
    − Critical System Functions List
    − Critical Software Requirements List
    − Critical Software Components List
    − Critical Software Units List
• From the SVF Supplier
  − Documents
    − Software Validation Facility User Guide
  − Software Validation Facility
    − Hardware (if any)
    − Software

The inputs to the individual tasks are discussed in section 9.1 and listed in section 9.5.
9.2.2 Activity Prerequisites
To ensure an efficient independent validation activity, it is important that the software under test is in a mature and healthy state:
• The software under test has already been validated by the software supplier. The ISVV supplier is not expected to redo or replace the software supplier’s validation activities, so these must have been performed prior to the independent validation.
• A suitable software validation facility is available. The SVF can be constructed by the ISVV supplier or delivered by another supplier.
• If possible, the independent verification analysis should have been performed in order to support the identification of the test cases. If this is not the case, corresponding activities must be performed as part of the test case identification.


9.3 Activity Outputs


As shown in Figure 22, the specific outputs of the independent validation activity are:
• Independent Validation Test Plan
• Independent Validation Test Procedures
• Independent Validation Test Report.

The Independent Validation Test Plan (IVTP) contains the ISVV supplier’s basic knowledge of
the software and the identified test cases. This test plan is an output of the activity, but also a
document used during the IVA activity. The IVTP is reviewed by the ISVV customer before
continuing with the “Construction of Test Procedures” task.

The test procedures delivered can be used for regression testing of future versions of the software product, i.e., they might be added to the set of acceptance tests that the software customer requires to be executed as part of accepting a delivery.

The IVA activity will produce a test report holding all test execution results and the findings from the test execution. Before the test report is produced, the ISVV supplier must investigate failed tests to ensure that the problem revealed is located in the software under test, and not in the software validation facility or in the test procedures. It is the responsibility of the ISVV customer to review the Independent Validation Test Report and to decide whether failed tests should result in problem reports.
9.4 Process Management
9.4.1 Initiating and Terminating Events
The IVA activity can be initiated as soon as sufficient documentation is available. This means that the independent validation activity can start at CDR, or as soon as corresponding information is available and mature. If independent verification is being performed, it is recommended that the IVA activity start during the independent code analysis.

The completion of the independent software validation activity does not have to be linked with
the development process, but should take place while the software supplier is still available for
maintenance of the software and preferably close to the AR.
9.4.2 Completion Criteria
The independent validation activity closes with the delivery and review of the test report.

It can be an advantage to hand the applied software validation facility over to the operations phase along with the test procedures, the SVF User Guide and the independent validation test plan. This will enable execution of the independent validation test suite when updates to the software are to be investigated.
9.4.3 Relations to other Activities
The IVA activity depends on the results of the IVE activities. If important analyses do not exist and sufficient knowledge about the software under test has not been achieved, corresponding activities must be executed prior to the formal IVA.


9.5 Task Descriptions


9.5.1 Identification of Test Cases
TASK DESCRIPTION
Title: Identification of test cases Task ID: IVA.T1
Activity: IVA - Independent Validation
Start event: DAR - Design Analysis Review (during the independent code analysis)
End event: IVTPR – Independent Validation Test Plan Review
Responsible: ISVV Supplier
Objectives:

- The purpose of this task is to identify areas to be subjected to independent validation. The task relies on the development documentation for the system, including requirements and design specifications, and on output from the IVE analysis. The identified test cases must be described in the test plan to be used when implementing the tests.
Inputs:

- From ISVV Customer:


- Criticality Analysis
- Requirements Baseline
- Software Requirements
- Interface Control Documents
- Software User Manual
- Test Cases & Test Reports
- Software Architectural Design
- From the ISVV Supplier (if available)
- Result of technical specification analysis
- Result of design analysis
- Result of code analysis
- Result of criticality analysis
Sub Tasks (per ISVV Level):

- ISVV Level 1 and 2:


- IVA.T1.S1: Evaluate Task Input
Evaluate the results from the IVE analysis.
Evaluate the validation tests performed by the software supplier.
Achieve basic knowledge about the software, either during the IVE activities or during this evaluation subtask.
Achieve detailed knowledge about the subjects from the critical function list.
- ISVV Level 1 only:
- IVA.T1.S2: Perform Analysis
Use the checklist in section 15.8.
- ISVV Level 2 only:
- IVA.T1.S2: Perform Analysis
Analyze the interaction with external software, with focus on degraded functionality of the external software products.
Investigate worst case load scenarios, covering robustness of the software with respect to deviations from timing requirements, e.g., events happening too early or too late.
Investigate worst case scenarios, including robustness of the software with respect to inputs outside or at the boundary of their valid ranges.
Investigate potential runtime errors, overflow/underflow, and dataflow conflicts.


- ISVV Level 1 and 2:


- IVA.T1.S3: Writing Independent Validation Test Plan
Describe each identified test case in terms of: Test Rationale, Test Overview, Test Environment, Input Data, Output Data, Starting Conditions and Detailed Test Steps, including step description, expected results and pass/fail comments.
Outputs:

- Independent Validation Test Plan


9.5.2 Construction of Test Procedures


TASK DESCRIPTION
Title: Construction of Test Procedures Task ID: IVA.T2
Activity: IVA - Independent Validation
Start event: At delivery of the SVF User Guide
End event: When all Test Cases are implemented and described in the Test Plan.
Responsible: ISVV Supplier
Objectives:

- The purpose of this task is to express the test cases in the test language provided by the software validation facility.
Inputs:

- From the ISVV Customer


- Object code for the software product [QR]
- Source code for the software product [QR]
- From the SVF Supplier
- SVF hardware and software
- SVF User Guide
- From the ISVV Supplier
- Output from IVA.T1: Independent Validation Test Plan
Sub Tasks (per ISVV Level):

- ISVV Level 1 and 2:


- IVA.T2.S1: Achieve knowledge about the SVF
Achieve knowledge about the SVF.
Achieve knowledge about the test language to implement the test cases.
- IVA.T2.S2: Implement Test Cases into Test Procedures
Express the test cases in the test language provided by the software validation facility:
Relate test case parameters with software parameters
Relate test case actions with software functions
Setup conditions for test procedure failure/success
Include test report generation commands
Describe test data TM/TC in the Test Plan
- IVA.T2.S3: Updating the Independent Validation Test Plan
Update the test plan with possibly additional test cases.
Outputs:

- Independent Validation Test Procedures


- Updated Independent Validation Test Plan


9.5.3 Execution of Test Procedures


TASK DESCRIPTION
Title: Execution of Test Procedures Task ID: IVA.T3
Activity: IVA - Independent Validation
Start event: When IVA.T2 is performed.
End event: IVR – Independent Validation Review
Responsible: ISVV Supplier
Objectives:

- The purpose of this task is to execute the test procedures and to generate a test report.

Inputs:

- From the ISVV Customer


- Object code for the software product [QR]
- Source code for the software product [QR]
- From the SVF Supplier
- Object code for the SVF
- SVF User Guide
- From the ISVV Supplier
- Output from IVA.T2: Test Procedures and Independent Validation Test Plan
Sub Tasks (per ISVV Level):

- ISVV Level 1 and 2:


- IVA.T3.S1: Execute the Test Procedures
Execute all implemented test procedures and generate report.
- IVA.T3.S2: Investigation of failed tests
Check that the failure is due to the software under test.
- IVA.T3.S3: Produce Test Report
Describe all tests and observations.
Attach all test procedures (scripts) and test logs to the report.
Outputs:

- Independent Validation Test Report


9.6 Methods
The table below identifies which methods can be used for the work to be performed within the
scope of each subtask. For each method it is stated whether it covers the purposes of the
subtask completely or partially.

Subtask | Method | Method Coverage | ISVV Level | Comment

IVA.T1 – Identification of Test Cases
IVA.T1.S1 | Inspection | Partial | 1&2 | None.
IVA.T1.S1 | Walkthrough | Partial | 1&2 | None.
IVA.T1.S2 | SFTA | Partial | 2 | None.
IVA.T1.S2 | SFMECA | Partial | 2 | Establish worst case scenarios.
IVA.T1.S2 | Schedulability analysis | Partial | 2 | Establish worst case load scenarios.
IVA.T1.S3 | N/A | N/A | 1&2 | None.
IVA.T2 – Construction of Test Procedures
IVA.T2.S1 | Inspection | Partial | 1&2 | None.
IVA.T2.S1 | Walkthrough | Partial | 1&2 | None.
IVA.T2.S2 | N/A | N/A | 1&2 | None.
IVA.T2.S3 | N/A | N/A | 1&2 | None.
IVA.T3 – Execution of Test Procedures
IVA.T3.S1 | N/A | N/A | 1&2 | None.
IVA.T3.S2 | Inspection | Partial | 1&2 | None.
IVA.T3.S3 | N/A | N/A | 1&2 | None.


10.0 Annex A – ISVV Plan Outline


This annex contains an outline of the ISVV supplier’s plan for the execution of an ISVV project. The outline also includes text describing the contents of each section, as well as pointers to sections of this document relevant to the plan.
<1> Introduction
<1.1> Background
The background should set the stage for the ISVV project, describing the context in which the
software subject to ISVV is to be used, the end customer, other relevant organisations involved
in the construction of the final system, the overall schedule of the project etc. The section
should also describe the background for the current ISVV project: why is ISVV being done for
this project?

<1.2> Purpose
This section should describe the purpose of the ISVV plan. Who are the readers of the plan
and how will it be used?

<1.3> Scope and Objectives


The scope shall define the boundaries of the work. It should provide a short description of the
software products to be subject to ISVV (as identified by the initial criticality analyses – see
sections 4.1.2 and 5.0) as well as a summary of the verification and validation activities to be
carried out. The verification and validation objectives should also be described (see section
2.2).

<1.4> Assumptions and Constraints


This section should describe any assumptions and constraints of the ISVV plan. This could be
related to the size or maturity of documents and code, timing of execution of ISVV activities or
any other aspect whose change may impact the execution of the ISVV project. These
assumptions and constraints should of course also be reflected by the ISVV contract.

<1.5> Outline
This section should provide an outline of the rest of the plan.

<2> Applicable and reference documents


This section should list all references made in the plan. References may be divided into
applicable and referenced documents. Typical documents to refer to include:
• ISVV customer’s requirements for project management and reporting,
• Applicable international standards on software engineering, software product assurance or other relevant areas,
• Applicable project specific standards, e.g. the ISVV customer’s software product assurance
plan, software development standards etc.
• ISVV project Statement of Work,
• ISVV project Contract,
• ISVV supplier’s plans for Configuration Management, Document Control, Product
Assurance etc.
• ISVV supplier’s quality management system and ISVV process definition.


<3> Terms, definitions and abbreviated terms


<3.1> Definitions
This section should list all definitions used in the plan.

<3.2> Acronyms and Abbreviations


This section should list all acronyms and abbreviations used in the plan.

<4> Project Plan Overview


<4.1> General
The purpose of the Project Plan Overview is to provide a short summary of the ISVV project,
describing roles, work packages, schedule, and deliverables. This is an executive summary of
the plan, also serving as a map to later sections.

<4.2> Project Organisation and Management


<4.2.1> Organizational structure
This section should identify the organizational structure including the relationships with other
organizational parties involved in the ISVV contract.

<4.2.2> Roles and Responsibilities


This section should identify the parties involved in the ISVV contract (ISVV supplier and
customer), parties with which communication is or may be necessary (e.g. software developers,
end customer of system, sub-contractors) as well as other parties relevant to the project (see
section 4.1.1). The section should also describe the ISVV supplier’s internal organisation,
clearly identifying the roles of contractual manager, project manager, technical manager,
product assurance manager, configuration manager as well as technical personnel. The
responsibilities and authorities associated with each role should be described. The names and
contact information (phone/fax number and email/mail address) of each person taking up the
different roles should also be included.

<4.3> Scheduling and Milestones


The schedule and milestones section should identify the major milestones of the project and
map out the activities described in the preceding section in time (see section 3.0). The schedule
should be described in terms of a Gantt chart or some other graphical representation (which
may be a separate document, but which should be referred to). Assumptions and constraints of
the scheduling should be clearly defined. The schedule may also show budget and resource
allocation of individual activities. See the description of the various ISVV activities for guidance
on when activities should be initiated.

<4.4> Resources and Infrastructure


This section should describe the non-personnel resources required for the execution of the
project, including office tools, configuration management tool, verification and validation tools,
special facilities etc. Purchase of new software or equipment should be planned. The availability
of resources should be planned for; if possible and necessary, resources may have to be
reserved. In some cases, the ISVV supplier may not be in control of all necessary infrastructure
(e.g. if validation is to be performed on the validation facility of the ISVV customer). Such use
should be planned (and regulated by contract).


<4.5> Meetings and Reviews


The meetings and reviews section should list the progress and review meetings to be held
throughout the project with their scheduled date and location. It should also provide instructions
for preparation of the meetings as well as writing of minutes etc. In addition, the possibility of
technical meetings when needed should be mentioned. See section 3.0 for an overview of the
review milestones defined for the ISVV process and section 4.1.3 for setting the review
meetings in context.

<5> Control procedures for verification and validation activities


The plan shall contain information about (or reference to) applicable management procedures concerning the following aspects:
a. problem reporting and resolution;
b. deviation and waiver policy;
c. control procedures.

<6> ISVV activities and tasks identification


<6.1> Management activities
<6.1.1> Quality Management
The plan should describe the quality assurance scheme of the ISVV project. This may be a
reference to the quality management system of the ISVV supplier.

<6.1.2> Document and Code Management


This section should describe (or provide reference to a document describing) handling of
documents and code received from outside the project and produced by the project. This
includes the identification, registration, configuration management, review, approval, and filing
of documents and code.

<6.1.3> Risk Management


This section should describe how the ISVV supplier intends to identify, register, analyse, prioritise, and follow up risks related to the execution of the ISVV project. The plan may contain an initial risk analysis. However, it is advised to keep the risk management database outside the plan; it should be continuously updated throughout the project.

<6.1.4> Security Management


The confidentiality requirements of the project should be referenced in the plan. Risks related to breach of confidentiality should be identified, and suitable security measures described and implemented, both for sending information across the internet or via mail and for physically securing documents and PCs. See also section 4.1.5.

<6.1.5> Metrics
This section should list the metrics that should be collected during the ISVV project, also
describing the purpose and use of the metrics.

<6.2> Work Breakdown Structure


The work breakdown structure describes how the ISVV work is split into individual work
packages, possibly organised in a hierarchical fashion. The section should be introduced by a
graphical overview of all of the work packages followed by a subsection (or sub-subsection) per
work package.


The work breakdown structure should reflect the ISVV supplier’s defined process (based on the ISVV process of this guide) and the defined scope of the process. The process description can be part of the ISVV plan or a separate document, which should then be referenced.

Each work package should be described in terms of:


• Identification
• Name
• Input documents
• Output documents
• Controlling documents
• Activities to be carried out with supporting methods and tools
• Budget

Responsibility and start and end events may also be described here, or in the schedule if this is more convenient.

<7> Communication and Reporting


This section should describe the communication which will take place between the ISVV
supplier and other parties within the scope of the ISVV project, both formal and informal and
both technical and managerial. It should clarify responsibilities for producing and distributing
documents and indicate when communication will take place. It should also specify what
communication must be formal and how such communication shall take place.
This section should include:
• Formal communications, such as software supplier’s documentation and code, the ISVV
reports, the ISVV project progress reports
• Internal communications, such as early findings, etc.

<Appendix A>: Project Directory


The Project Directory should include contact information to all persons involved in the project.

<Appendix B>: Deliverables


The deliverables section should list all the formal deliverables of the project with reference to
the work package producing the deliverable, title, and document identification.


11.0 Annex B – Review Item Discrepancy Form Example


The form below is an example of a RID form.

Review Item Nº | Unique identifier of the RID.
Date | Origination date of the RID (e.g. in DD.MM.YYYY format).
Author | Author/originator of the RID.
Title | Title of the RID.
Originating subtask | Identification of the verification subtask in which the RID was identified (e.g. IVE.CA.T2.S1).
Document reference | Identification of the document against which the RID was raised.
Problem location | Identification of the inconsistency location (document page, software component, source file and line).
Problem description | Claim: the description of the problem found. Recommendation: a recommendation aiming at the resolution of the RID (optional field).
Problem type (see Table 7) | RID classification according to the problem types from Table 7.
Severity classification (see Table 8) | RID severity classification according to Table 8. The severity classification is originally assigned by the ISVV Supplier and then reviewed and possibly updated together with the ISVV Customer.
Table 6: RID Form


The Problem Type may be described in terms of the categories below:

Problem type | Description
External consistency | The item presents an inconsistency against an item of an applicable or referenced document (e.g. the component design does not conform to an applicable software requirement). What is implemented is not what is required.
Internal consistency | The item presents an inconsistency against another item in the same document (e.g. the description of an interface is not consistent between the interface user and the interface provider).
Correctness | The item is incorrectly implemented; technical constraints are violated. An activity diagram that contains a deadlock condition, for instance, is a case to which correctness applies.
Technical feasibility | The item is not technically feasible taking into account the applicable constraints (e.g. the architecture makes extensive use of advanced object oriented techniques but the application is to be written in the C language).
Readability & Maintainability | The item is hard to read and/or to maintain. An individual other than the author will have serious difficulties in implementing and/or maintaining the item. The information provided is confusing and may therefore lead to wrong interpretations.
Completeness | The item is not completely defined or the provided information is not sufficient (e.g. the description of the service 5 telemetry provided in the user manual does not allow for a clear identification of the error cause). If an item does not completely implement a requirement or interface defined in an applicable or reference document, “external consistency” shall be used instead.
Table 7: RID Problem Type Categories

Each RID may be described in terms of the following severity classes:

Severity classification | Description
Major | The discrepancy found presents a major threat to the system. Its correction is imperative.
Minor | The discrepancy found is a minor issue. Although it does not present a major threat to the system, it should be corrected.
Comment | The discrepancy found does not present any threat to the system. The RID was raised as a recommendation aiming at improving the quality of the affected item. The implementation of the recommended correction is optional.
Table 8: RID Severity Classes


12.0 Annex C – Error Potential Questionnaire [13]


The table below shows the error potential questionnaire with the weight of each question. For software code criticality analysis, the weight of the question related to complexity is 5. In all other cases, the weight of each question is 1. The score is computed as the weighted sum of all ‘yes’ answers, which is then normalised.

ID | Question | Weight | Yes | No | Score
1 | Is the number of people in the software development team (including development verification and validation) more than 20? | 1 | | | [0..1]
2 | Is the development team split across several geographical working locations (more than 5 minutes walking distance)? | 1 | | | [0..1]
3 | Is the maturity of the software development team’s process low, as measured by a suitable internationally recognised software process assessment approach? | 1 | | | [0..1]
4 | Is the software development team lacking in experience with the software technology, the domain, or the application? | 1 | | | [0..1]
5 | Is the software supplier lacking in experience with development of software of the required criticality level? | 1 | | | [0..1]
6 | Does development of the software require innovative designs? | 1 | | | [0..1]
7 | Are software requirements still unstable? | 1 | | | [0..1]
8 | Is the complexity of the software high? | 0/1/5 | | | [0..5]

NOTE 1: Software complexity is inherently difficult to define. Some of the factors which may influence complexity include the size of the software, the number of components and the number of relationships between components, the complexity of algorithms, and the number of internal and external interfaces. Complexity is not only an intrinsic property of the software itself, but also dependent on the tools available to visualise the software as well as the cognitive capacity of the observer.
NOTE 2: The weight of question 8 is 5 when evaluating the error potential for the software code criticality analysis; otherwise it is either 0 (when no information is available) or 1.

Sum | | 7/8/12 | | | [0..12]
Normalised score (sum score / sum weight) | | | | | [0..1]
Table 9: Error Potential Questionnaire

[13] The questions have been inspired by [NASA IV&V]


The mapping between the normalised score and the error potential levels is shown in the table below:

Error potential score | Error potential level
[0 .. 1/3) | Low
[1/3 .. 2/3) | Medium
[2/3 .. 1] | High
Table 10: Mapping from error potential score to error potential level
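The scoring rules of Tables 9 and 10 can be restated compactly; the sketch below mirrors them (weight 5 for the complexity question only in the code criticality analysis case, 0 when no complexity information is available).

# Sketch of the error potential computation from Tables 9 and 10.
def error_potential(answers, complexity_weight=1):
    """answers: booleans for questions 1..8 (True = 'yes').
    complexity_weight: 1 normally, 5 for software code criticality
    analysis, 0 when no complexity information is available."""
    weights = [1, 1, 1, 1, 1, 1, 1, complexity_weight]
    score = sum(w for w, yes in zip(weights, answers) if yes)
    normalised = score / sum(weights)               # in [0..1]
    if normalised < 1 / 3:
        level = "Low"
    elif normalised < 2 / 3:
        level = "Medium"
    else:
        level = "High"
    return normalised, level

# Example: 'yes' to questions 1, 3 and 8 in a code criticality analysis.
answers = [True, False, True, False, False, False, False, True]
print(error_potential(answers, complexity_weight=5))  # (0.583..., 'Medium')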


13.0 Annex D – Software Criticality Categories [ECSS-Q-80-03]


The following criticality category definitions are provided as examples only.

Category Definition
SCC A Software involved in a hazard severity CAT I control function or software failure
causing a hazard leading to consequences of hazard severity category I where
in the event of software failure no immediate hardware or independent software
backup can be activated and no time is available to effectively intervene to
prevent the occurrence of the consequences.
SCC B Software involved in a hazard severity CAT I control function or software failure
causing a hazard leading to consequences of hazard severity category I where
in the event of software failure immediate hardware or independent software
backup can be activated without external intervention or time and means are
available to effectively intervene to prevent the occurrence of the consequences.
or
Software involved in a hazard severity CAT II control function or software failure causing a hazard leading to consequences of hazard severity category II, and software controlling reliability criticality functions of category 1, where in the event of
software failure no immediate hardware backup or independent software can be
activated and no time is available to effectively intervene to prevent the
occurrence of the consequences or failure modes effects.
SCC C All other software implementing a hazard severity CAT III function or a reliability category 2 or 3 function
or
All other software which is used to control or generate the above, or to control category A and B software
or
Software used to check-out or qualify system critical equipment/subsystems.
SCC D Any other software.
Table 11: Software criticality categories for manned mission

Category Definition
SCC A Software whose anomalous behaviour would cause or contribute to a failure of
the satellite system resulting in loss of life, personnel injuries, or damage to
other equipment.
SCC B Software whose anomalous behaviour would cause or contribute to a failure of
the satellite system resulting in permanent or non-recoverable loss of the
satellite’s capability to perform its planned mission.
SCC C Software whose anomalous behaviour would cause or contribute to a failure of
the satellite system with negligible or minor effect on the satellite’s mission and
operability.
SCC D Any other software.
Table 12: Software criticality categories for unmanned mission


The following tables provide sample definitions of the reliability and hazard severity categories.

Reliability category | Definition
1 | Functions whose failure results in loss of the flight configuration
2 | Functions whose failure results in loss of all operational capability
3 | All others
Table 13: System reliability criticality categories

Hazard severity category | Definition
Catastrophic Hazard (I) | Events or conditions which result in loss of life, life threatening or permanently disabling injury, or loss of:
• The launch facility or servicing vehicle
• The Space Station program elements
• Public or private property on the ground
Critical Hazard (II) | Events or conditions which result in temporarily disabling injury, severe occupational illness of personnel, loss of or major damage to the manned system itself, or major damage to:
• The launch facility or servicing vehicle
• The Space Station program elements
• Public or private property on the ground
Marginal Hazard (III) | Events or conditions which result in minor injury, illness of personnel, or minor damage to:
• The launch facility or servicing vehicle
• The Space Station program elements
• The manned system itself
• Public or private property on the ground
Table 14: System hazard severity categories


Annex E – Procedures for Performing Simplified FMECA


13.1 System FMECA
In the event that we would like to redefine the criticality categories for the purpose of ISVV, the
existing safety/dependability analyses cannot be directly reused.

The existing system FMECA may be used as a starting point, by adding an additional column
where the consequence severity (criticality) in terms of the newly defined scheme may be
annotated for every currently defined failure mode. However, the existing failure modes were
identified with the project defined consequence severity categories in mind. The new scheme
(defined for the purpose of scoping the ISVV activity as defined in section 5.0) may raise the
criticality of previously non-critical failure modes which for this reason were not included in the
initial FMECA but should be in the amended one. Identifying these failure modes is a creative
process requiring system understanding. It is thus best done by the ISVV customer during the
original analysis, possibly with participation by the ISVV supplier.

If no system FMECA has been carried out at all, and there is no Critical Function List, the
system FMECA will have to be prepared from scratch. However, an FMECA carried out for the
purposes of determining (i.e. limiting) the scope of ISVV could be somewhat simplified. The
columns that need to be filled in are:
• FMECA #
• Item
• Function
• Failure mode
• Consequence
• Operational phase/mode
• Severity (criticality)
13.2 Software Requirements FMECA
If no software requirements FMECA is available, a simplified analysis (e.g. a simplified SFMECA) must be produced to identify the most critical software requirements. The following procedure represents the most basic way of identifying the criticality category of the software requirements. The procedure is based on expert sessions. The traceability matrix from software requirements to system requirements will aid the process considerably.

1. Obtain the software requirements specification and the system requirements specification.
2. Obtain the system critical functions list.
3. Go through all software requirements together with at least one expert who knows the software design and the system design well (two experts may be necessary for that). Ask the question: to which system critical function failure mode(s) is the software requirement linked?
   a. If no link is found, the software requirement might be non-critical and nothing needs to be done with respect to ISVV.
   b. If one or more links are found, document the link and the reasoning behind it.
4. When all information is collected, the links can be collected in a table, with one critical function failure mode per column and the software requirements as rows. There should always be at least one entry per column (at least one software requirement per critical system function). In an additional column of the table, the maximum criticality category is written down for each software requirement, as sketched below.
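As a sketch of step 4 only (all identifiers and criticality values below are invented for illustration), the table-building can be mechanised once the expert sessions have produced the links:

# Illustrative sketch of step 4: derive the maximum criticality category
# per software requirement from the documented links. All names and
# criticality assignments below are invented examples.
links = {
    # software requirement -> linked critical function failure modes
    "SRS-0010": ["CF-01/fails-silent", "CF-02/wrong-output"],
    "SRS-0023": ["CF-02/wrong-output"],
    "SRS-0051": [],     # no link found: candidate non-critical requirement
}
criticality = {
    # critical function failure mode -> criticality category (A highest)
    "CF-01/fails-silent": "A",
    "CF-02/wrong-output": "B",
}

for req, modes in sorted(links.items()):
    if not modes:
        print(f"{req}: no link found, possibly non-critical")
        continue
    # Categories A..D sort alphabetically, so min() yields the highest.
    worst = min(criticality[m] for m in modes)
    print(f"{req}: linked to {', '.join(modes)}; maximum criticality {worst}")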


14.0 Annex F – Methods


This annex provides a brief description for each of the methods considered for design analysis.
Portions of the definitions presented herein were extracted from [PASCON WO12-TN2.1:2000]
and [ECSS-Q80-03:2004]. The methods are presented in alphabetical sequence:
• Formal Methods
• Hardware Software Interaction Analysis
• Inspection
• Modelling
− Data Flow Analysis
− Control Flow Analysis
• Real-Time Properties Verification
− Schedulability Analysis
− Worst Case Execution Time Computation
• Simulation (Design execution)
• Software Common Cause/Mode Failure Analysis (SCCFA/SCMFA)
• Software Failure Modes, Effects and Criticality Analysis (SFMECA)
• Software Fault Tree Analysis (SFTA)
• Static Code Analysis
− Coding Standard Conformance
− Bug Pattern Identification
− Software Metrics Analysis
• Traceability Analysis
• Walkthrough
14.1 Formal Methods
Formal Methods provide a means of developing an analysable description of a software system at some stage in its development life-cycle: specification, design, or code. Formal Methods generally offer a notation (usually some form of discrete mathematics), a technique for deriving a description in that notation, and various forms of mathematical analysis for checking a description in order to detect various classes of inconsistency or incorrectness. Some examples of Formal Methods and formal specification languages are B, RAISE, VDM, Z, Petri Nets, SDL and Finite State Machines. Further information on these methods can be found in [PASCON WO12-TN2.1:2000].

Formal Methods do suffer from certain limitations. In particular, Formal Methods can prove that
an implementation satisfies a formal specification, but they cannot prove that a formal
specification captures a user's intuitive informal understanding of a system. In other words,
Formal Methods can be used to verify a system, but not to validate a system. The extent of this
limitation should not be underestimated - the reduction of informal application knowledge to a
rigorous specification is a key problem area in the development of large systems.
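To make the distinction concrete, consider a toy specification stated as a Hoare triple (this illustration is ours, not taken from the cited references): a proof can establish that a program satisfies the stated pre/postcondition pair, but it cannot establish that this pair captures what the user actually wanted.

% A Hoare triple for an integer square root routine: if x >= 0 holds
% before executing y := isqrt(x), then afterwards y is the integer
% square root of x. Proving this triple verifies the code against the
% specification; whether isqrt is what the user needed is validation.
\{\, x \ge 0 \,\}\quad y := \mathrm{isqrt}(x) \quad \{\, y^2 \le x < (y+1)^2 \,\}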
14.2 Hardware Software Interaction Analysis
The area covering the interfaces between the software (especially the critical software) and the hardware on which it runs is particularly difficult to analyse and can become problematic or simply forgotten; in fact, it tends to stay in no-man’s land. The architecture of embedded systems has also changed with the evolution of computer systems: from simple software controlling and interfacing directly with the hardware, we now have systems with complex operating systems, Java virtual machines and other SW layers. These layers increase the distance between the application software and the hardware, and add extra complexity that can lead to covert (or not well understood) communication channels between the SW and the HW. This is why the use of techniques such as HSIA is becoming necessary.


The HSIA method is defined in ESA standard [ECSS-Q-80-03]. It consists of the systematic
analysis of the HW/SW interfaces, with particular focus on the hardware faulty conditions and
the handling of those faulty conditions by the SW.

Several assessment techniques exist to verify safety and dependability properties of critical
systems. For example, FMECA and SFMECA are used to identify failure modes, causes,
effects and severity in HW and SW respectively. These techniques are not strongly coupled
and are usually performed at early stages of the project lifecycle. HSIA is meant to complement
and be used with FMECA/SFMECA, to analyse how the hardware might be affected by
software failures or stressed by software actions (in both nominal and error conditions).

The core of the HSIA method is a checklist consisting of a set of questions. These questions
are aimed at providing answers to the following fundamental issues:

• Is the software able to detect, recover, compensate and report a specific hardware failure
mode?
• Is the software using the hardware properly, not harming it in any way?
• Are the recovery actions independent from the hardware components that failed?

Question one helps determine whether the HW/SW system has the ability to monitor the failure modes provoked by hardware faults and provides mechanisms to recover from and limit the effect of such failures. Question two serves to determine that the SW does not stress the HW, in both nominal and error recovery cases. The last question aims at determining whether the recovery mechanisms avoid making use of (or depending on) the component that has failed (e.g., sending an error report when the failure affects the communication link). These fundamental questions are a general overview of the original HSIA checklist. The application of HSIA is expected to yield a list of software actions that may have adverse effects on the hardware, recommendations for adding or improving the HW and SW FDIR mechanisms, and information to complement the SFMECA and FMECA worksheets.
14.3 Inspection
The basic method of software inspection for design and code was defined by Fagan in 1976
[INSPEC:1976]. The method has subsequently been applied to the verification of software
requirements where it is said to be most effective when individual reviewers are assigned
specific responsibilities and where they use systematic techniques for meeting those
responsibilities [DETECT:1995, §I.A]. Alternatively [ISO 9000:2000] defines Inspection as an
activity such as measuring, examining, testing or gauging for conformity evaluation.

From the ISVV point of view the inspection can be defined as an evaluation technique in which
software requirements; design, code, or other work products are formally examined by a
person or group (the inspection team) to detect faults, violations of development standards,
and other problems. The author of the work product may or may not be part of the inspection
team. The inspection team typically ranges from two to seven members, four being the commonly recommended number. All members are inspectors, but some have special roles in the team. The typical layout includes a moderator (manages the inspection), a reader
(performs full reading of the review item in the inspection meeting) and a recorder (annotates
all the inconsistencies found in the inspection meeting). An inspection begins with the
distribution of the item to be inspected (e.g., a specification, some code and test data). Each
participant is required to analyse the item on his own. During the inspection, which is the
meeting of all the participants, the item is jointly analysed to find as many errors as possible.
All errors found are recorded, but no attempt is made to correct the errors at that time.


However, at some point in the future, it must be verified that the errors found have actually
been corrected. Inspections may also be performed in the design and implementation phases.
14.4 Modelling
Modelling consists of the elaboration of a model of the system using a modelling tool and/or language (e.g. UML, SDL, etc.). The primary aim of this method is to aid understanding of the system. Modelling can be used to cover a broad range of analyses or subtasks, such as data flow analysis, control flow analysis, state machine diagrams, etc. The method may be applied to all or to specific parts of the system under verification.

Nowadays UML is practically synonymous with modelling. Some modelling languages have been deprecated in favour of UML, and UML is being complemented with features from those languages. Consequently, special attention is given to UML whenever modelling plays a role. The current UML specification is version 1.5, but UML 2.0 is arriving and several tools already include support for it. In order to provide precision when referring to UML, the diagram type shall be mentioned.

UML 2 defines 13 basic diagram types, divided into two general classes:
• Structural diagrams. This class comprises diagrams that define the static architecture of a
model (elements of a specification that are irrespective of time). These diagrams are used
to model the building blocks that make up the full model – classes, objects, interfaces and
physical components. Structure diagrams are also used to model the relationships and
dependencies between elements. This class includes class, component, composite structure, deployment, object and package diagrams.
• Behavioural diagrams. This class comprises diagrams that depict behavioural features of
a system or business process. It includes activity, state machine, and use case diagrams
as well as the interaction diagrams.

The Behavioural class is further divided into a subclass named Interaction Diagrams, defined as the subset of behavioural diagrams which emphasize object interactions.
This includes communication, interaction overview, sequence, and timing diagrams.

The next table presents the UML 2 diagram types grouped by class.

UML 2 Diagram Type | Description

Structural Diagrams
Class Diagram | Shows a collection of static model elements such as classes and types, their contents, and their relationships. This type of diagram defines the basic building blocks that are used to build the full model.
Component Diagram | Depicts the components that compose an application, system, or enterprise, together with their interrelationships, interactions, and public interfaces. This type of diagram is used to model higher level or more complex structures, usually built up from one or more classes and providing a well defined interface.
Composite Structure Diagram | Depicts the internal structure of a classifier (such as a class, component, or use case), including the interaction points of the classifier to other parts of the system. It provides a means of layering an element’s structure and focusing on inner detail, construction and relationships.
Deployment Diagram | Shows the execution architecture of systems. This includes nodes, either hardware or software execution environments, as well as the middleware connecting them. In other words, this type of diagram shows the physical disposition of significant artefacts within a real-world setting.
Object Diagram | Depicts objects and their relationships at a point in time, typically as a special case of either a class diagram or a communication diagram. Shows how instances of structural elements are related and used at run-time.
Package Diagram | Shows how model elements are organized into packages as well as the dependencies between packages. Diagrams of this type are used to define the high level architecture of the system.
Behavioural Diagrams
Activity Diagram | Depicts high-level business processes, including data flow, or models the logic of complex processes within a system. This type of diagram has a wide number of uses, from defining basic program flow to capturing the decision points and actions within any generalized process.
Communication Diagram | Shows instances of classes, their interrelationships, and the message flows between them. Communication diagrams typically focus on the structural organization of objects that send and receive messages. Formerly called a Collaboration Diagram in UML 1.x.
Interaction Overview Diagram | A variant of an activity diagram which overviews the control flow within a system or business process. Each node/activity within the diagram can represent another interaction diagram.
Sequence Diagram | Models sequential logic, in effect the time ordering of messages between classifiers. Closely related to communication diagrams, sequence diagrams show the sequence of messages passed between objects along a vertical timeline.
State Machine Diagram | Describes the states an object or interaction may be in, as well as the transitions between states. Formerly referred to as a state diagram, state chart diagram, or state-transition diagram.
Timing Diagram | Depicts the change in state or condition of a classifier instance or role over time. Typically used to show the change in state of an object over time in response to external events.
Use Case Diagram | Shows use cases, actors, and their interrelationships. This type of diagram is used to model user/system interactions, defining behaviour, requirements and constraints in the form of scripts or scenarios.

Table 15: UML 2 diagram types

UML 2 adds four new diagram types to the UML 1.x set and renames one. The next table
presents the UML 2 to UML1.x mapping.

UML 2.0 | UML 1.x | Subtype

Structural Diagrams
Class Diagram | Class Diagram | -
Component Diagram | Component Diagram | -
Composite Structure Diagram | - | -
Deployment Diagram | Deployment Diagram | -
Object Diagram | Object Diagram | -
Package Diagram | - | -
Behavioural Diagrams
Activity Diagram | Activity Diagram | -
Communication Diagram | Collaboration Diagram | Interaction
Interaction Overview Diagram | - | Interaction
Sequence Diagram | Sequence Diagram | Interaction
State Machine Diagram | Statechart Diagram | -
Timing Diagram | - | Interaction
Use Case Diagram | Use Case Diagram | -
Table 16: UML 2.0 to UML 1.x mapping

14.4.1 Data Flow Analysis


Data flow analysis checks the behaviour of program variables as they are initialised, modified or referenced along the possible execution paths of the program. Data flow diagrams are used to facilitate this analysis.

The purpose is to detect poor and potentially incorrect program structures. Data flow analysis
combines the information obtained from the control flow analysis with information about which
variables are read or written in different portions of code. It may also be used in the design and
implementation phases.

Data flow analysis can be used to support the dependability and safety assessment with respect to the analysis of failures and faults in the product. It complements the engineering activities: if data flow diagrams are already provided there, they can be reused for the dependability and safety analyses. Many tools and methods used for the design engineering of the product already offer the possibility to define data flows, and these same diagrams should be reused to analyse dependability and safety aspects of the software product, that is, potential faults existing in the product.
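A minimal made-up example of the kind of defect data flow analysis targets (shown in Python, though the pattern exists in any language): on one execution path the variable is referenced before any value has been assigned to it.

# Defect detectable by data flow analysis: 'status' is read on a path
# where it was never initialised (use before definition).
def classify(reading):
    if reading > 100:
        status = "HIGH"
    elif reading < 0:
        status = "NEGATIVE"
    # No else branch: for 0 <= reading <= 100, 'status' is undefined,
    # and the return below raises UnboundLocalError at run time.
    return status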
14.4.2 Control Flow Analysis
Control flow analysis is most applicable to real time and data driven systems. Logic and data requirements are transformed from text into graphic flows, which are easier to analyse. Examples of control flow diagrams include, among others, PERT, state transition, and transaction diagrams. These analyses are intended to identify unreachable code, dead code, inconsistent or incomplete interface mechanisms between modules, and logic errors inside a module.

Control flow analysis can be used to support the dependability and safety assessment with respect to the analysis of failures and faults in the product and of their propagation. It complements the engineering activities: if control flow diagrams are already provided there, they can be reused for the dependability and safety analyses. Many tools and methods used for the design engineering of the product already offer the possibility to define the control flow (IDEF0, etc.), and these same diagrams should be used to analyse dependability and safety aspects of the software product, that is, potential faults existing in the product.
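For illustration, a made-up fragment containing the kinds of anomaly such an analysis flags:

# Defects detectable by control flow analysis (illustrative example).
def checked_divide(a, b):
    if b == 0:
        return None
        print("division by zero")    # unreachable: follows a return
    if b != 0 or b == 0:             # condition is always true
        return a / b
    return -1                        # dead code: can never be reached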
14.5 Real-Time Properties Verification
14.5.1 Schedulability Analysis
Schedulability Analysis aims at determining whether a specific software system is schedulable or not; in other words, whether the software system meets the deadlines it was designed for (does each function execute within the required time limit?).

Cyclic models are by their nature deterministic, and the duration and completion of each function can be determined. Pre-emptive models, on the contrary, are non-deterministic, since the functions may be triggered by the asynchronous occurrence of events. In the case of hard real-time systems, deadlines are defined and must imperatively be respected when a service is provided. In this case, if a pre-emptive model is adopted, the application shall be analysed in order to verify that it meets the deadline requirements.

To this end, Scheduling Models have been defined, based on the Rate Monotonic or Deadline Monotonic scheduling algorithms and the Ceiling Priority Inheritance Protocol [AUDSLEY:1991, HRTOSK:1991]. Such Scheduling Models allow verifying that all critical tasks are schedulable (that is, can be executed within their deadlines) under their worst case execution time conditions.

The Schedulability Analysis is supported by tools which allow the off-line static verification of
hard real-time systems, the simulation of their run-time behaviour (related to timing aspects)
and the evaluation of the worst case execution time. In addition, based on the Schedulability
Analysis theory, an extension of the HOOD method, named HRT-HOOD has been defined
[BURNS:1993].
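As a worked illustration of the classical sufficient test of Liu and Layland for Rate Monotonic Scheduling (the task data below are invented): a set of n independent periodic tasks with utilisation U = Σ Ci/Ti is schedulable if U ≤ n(2^(1/n) − 1).

# Liu & Layland sufficient schedulability test for Rate Monotonic
# Scheduling; passing it guarantees schedulability, failing it does
# not prove the opposite. Task data are invented for illustration.
def rms_schedulable(tasks):
    """tasks: list of (C, T) pairs (worst case execution time, period)."""
    n = len(tasks)
    utilisation = sum(c / t for c, t in tasks)
    bound = n * (2 ** (1 / n) - 1)
    return utilisation, bound, utilisation <= bound

tasks = [(1, 4), (1, 5), (2, 10)]      # three periodic tasks
u, bound, ok = rms_schedulable(tasks)
print(f"U = {u:.3f}, bound = {bound:.3f}, schedulable: {ok}")
# U = 0.650, bound = 0.780 -> the sufficient test passes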
14.5.2 Worst Case Execution Time Computation
Schedulability Analysis serves to exercise the scheduling algorithm in order to check whether the software system is schedulable or not. However, the method requires vital input information, namely the duration of each task. That information can be obtained by applying another method: the Worst Case Execution Time (WCET) calculation.


The execution time of a program generally depends on its input data, which determine a certain execution path within the code according to its control flow instructions. The Worst Case Execution Time (WCET) is thus defined as the maximum value of this execution time over the set of all possible input data. In the following paragraphs, we first review the existing techniques for calculating the WCET. We then focus on techniques based on static code analysis, which are today well studied and widely applied in both industry and academia.

There are a number of methods developed for the prediction of the WCET of a function or
program. These methods can be grouped into two main categories, namely, dynamic methods
and static methods. Dynamic methods consist in measuring the execution time of a program
directly on the target system or on a simulator of the target system. The first technique is
referred to as testing, while the second is called simulation. Both techniques require a set of
input data to be found that can lead to the maximum execution time. Conversely, static
methods are based on a static analysis of the code of the program, so no input data is
required. These three different techniques are discussed in more detail hereafter.

Testing: This is a method of evaluating the timing characteristics of code fragments by actually
running them on representative input data. However, since test data may not fully cover the
domain of interest, and measurement may not be possible without setting up an actual
environment, this approach may not be acceptable in the hard real-time domain. Indeed, the
duration of the testing process (i.e., time needed for generating all possible combinations of the
input data and then applying them) is generally too high (however, it can be feasible for very
simple programs defining few input data with a limited value domain, and whose code can be
easily understood and handled). Consider for example a program with a single input consisting
of a 32-bit integer variable. In a general case (i.e., considering no knowledge of the internal
structure of the program), to be sure to find the WCET, it would be necessary to systematically
measure the execution time of the program for all possible input values, i.e., a total of 2^32
measurements assuming that the program does not rely on internal hardware state. Unless the
execution times of the program are extremely short, this process would take years to perform.

Simulation: As already mentioned in section 14.6, this method consists in analysing some
behavioural characteristics of any software system, for example the timing properties of a
program, by simulating in this specific case, the behaviour of the target system. This method
can be used during the design phase of the system, when the hardware platform is not yet available. The simulator heavily relies on the model of the underlying hardware, which is an
approximation of the actual system, so it may not accurately represent the worst case situation.
Note that the problem related to finding a set of input data that lead to the longest execution
time also applies here.

Static code analysis: This is an analytical method that relies on a static analysis of the
program code to find worst case execution paths within the code. The program code is
usually analysed at both high language level and low language level. The former is based
on the study of the control flow of the program (e.g., loops, conditional branches, etc.) and
aims at isolating basic sequential high-level instruction blocks. The objective of the latter
is to calculate the execution times of the corresponding basic assembly blocks by considering
the specific architecture of the target system (e.g., instruction set, caches, pipelines, branch
prediction units, etc.).
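To make the two levels concrete, the annotated C fragment below sketches how the high-level analysis isolates basic blocks and a statically known loop bound, while the low-level analysis is assumed to have assigned a cycle cost to each block; the block costs are illustrative, not measured values.

    #include <stdint.h>

    /* Assume the low-level analysis assigned cycle costs to blocks B1..B4. */
    int32_t sum_positive(const int32_t v[16])
    {
        int32_t acc = 0;                   /* B1: entry block                */
        for (int32_t i = 0; i < 16; i++) { /* loop bound is statically known */
            if (v[i] > 0) {                /* B2: branch test                */
                acc += v[i];               /* B3: taken branch               */
            }
        }
        return acc;                        /* B4: exit block                 */
    }

    /* WCET estimate = cost(B1) + 16 * (cost(B2) + cost(B3)) + cost(B4),
       taking the longest alternative at every branch. */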

Unlike static code analysis, techniques based on testing and simulation are unlikely to
characterise accurately the worst case timing properties of a program. For this reason, static
code analysis has become very popular and has been the object of many studies in
both industry and academia. The next section describes in more detail the fundamentals of
techniques based on static code analysis.

14.6 Simulation (Design execution)


Simulation consists of exercising parts of, or a model of, the overall software/system in order
to check its behavioural characteristics and to assess its feasibility, accuracy, etc. This method
implies the elaboration of a model of the system and of the environment it interacts with.
14.7 Software Common Cause/Mode Failure Analysis (SCCFA/SCMFA)
Software common cause failure analysis is intended to identify, and to define corrective
measures for, potential software failures in multiple systems or multiple sub-systems which
undermine the benefits of redundancy, design diversity or loose coupling of components,
because the same software failure appears in each of the multiple components.

An important objective of the Software Common Cause Failure Analysis is to ensure real
independence of failures across multiple systems. The effects of failures in the components
that defeat this independence should be analysed.

Techniques for Software Common Cause/Mode Failure Analysis include a general SFMECA,
quality control, design review, and verification and testing by an independent team.
14.8 Software Failure Modes, Effects and Criticality Analysis (SFMECA)
The software failure modes and effects analysis (SFMEA) is an iterative method intended to
analyse the effects of failure modes of the software within a system. SFMECA extends
SFMEA by assigning a criticality category to each software failure. SFMEA and SFMECA are
based on FMEA and FMECA respectively, the latter two being targeted at hardware/equipment
analysis.

The main purposes of SFMEA/SFMECA are to reveal weak or missing requirements, to
identify latent software failures, and to assign software criticality categories. SFMEA/SFMECA use
intuitive reasoning to determine the effect on the system of a component failing in a particular
failure mode. For example, if a function of a train crossing system is to turn on warning lights
as a train approaches the crossing and to leave the lights on until the train has passed, some of
its failure modes could be:

• the lights do not turn on when a train approaches
• the lights turn on though no train is coming
• the lights turn off too soon (before the train has fully crossed)
• the lights turn on too late (after the train has begun crossing).

The effect on the system of each component’s failure in each failure mode would then be
assessed by developing a matrix for each component. The criticality factor, that is, the
seriousness of the effect of the failure, can be used in deciding where to apply further
analyses and testing resources.
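
For the train crossing example above, a fragment of such a matrix might look as follows (the effects and criticality categories are illustrative only):

    Failure mode                                  Worst-case system effect                   Criticality
    Lights do not turn on when train approaches   Road users enter an unprotected crossing   Catastrophic
    Lights turn on though no train is coming      Unnecessary traffic stoppage               Minor
    Lights turn off too soon                      Crossing unprotected while train passes    Catastrophic
    Lights turn on too late                       Reduced warning time for road users        Critical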
14.9 Software Fault Tree Analysis (SFTA)
Software Fault Tree Analysis (SFTA) is derived from Fault Tree Analysis (FTA). Fault Tree
Analysis was originally developed in the 1960s for the safety analysis of a missile system, and
it has since become one of the most widely used hazard analysis techniques.

SFTA can be used in conjunction with FTA whereby hardware (system) and software fault
trees are combined in order to analyze the entire system. This is significant since many
hazards can be caused by a combination of a software error with a hardware or human failure.

The goal of SFTA is to show that the logic in the software design or in an implementation will
not produce a hazard or a failure. The basic procedure in an SFTA is to assume that the hazard
or the failure has occurred and then to determine its set of possible causes. The produced fault
tree depicts the logical relationship of basic events that lead to the undesired event, which is
the top event of the fault tree. The design or code is modified to compensate for those failure
conditions deemed to be hazardous threats to the system.

System fault trees can be used to calculate the probability of a hazard (the top event)
occurring, if the probabilities of the lower events are known. This aids in determining which
parts of the system are the most critical and which therefore require more intensive safety
analysis. These probability calculations are, however, not really applicable to the software or
software parts of the system.
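
As a simple illustration for the hardware part of a combined tree, assuming independent basic events with probabilities p1 and p2 (the figures are illustrative only):

    P(top event, OR gate)  = 1 - (1 - p1)(1 - p2), approximately p1 + p2 for small probabilities
    P(top event, AND gate) = p1 * p2

For example, with p1 = 1e-3 and p2 = 2e-3, an OR gate yields approximately 3.0e-3, whereas an AND gate (e.g. a redundant pair) yields 2e-6, illustrating how redundancy reduces the probability of the top event.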
14.10 Static Code Analysis
14.10.1 Coding Standard Conformance
Coding Standard Conformance is a method that aims at checking whether the implemented
source code follows a specific coding convention or set of coding rules (this includes checking
coding style, naming conventions, etc.). Coding Standard Conformance verification is usually
an exhaustive task and therefore a tool for automating it is required.

The Coding Conformance Verification may be used to verify user-defined conventions or
standards such as the Ada RAVEN, MISRA C, etc.
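
As an illustration, the C fragment below shows two constructs that a typical conformance checker would flag, together with conformant rewrites. The findings are given in the spirit of MISRA-C-style rules; the exact rules depend on the standard chosen for the project.

    #include <limits.h>

    void example(void)
    {
        /* Non-conformant: implicit signed-to-unsigned conversion and an
           octal constant that is easily misread. */
        unsigned int u = -1;
        int mask = 0177;

        /* Conformant rewrites making the intent explicit: */
        unsigned int u_ok = UINT_MAX;
        int mask_ok = 0x7F;

        (void)u; (void)mask; (void)u_ok; (void)mask_ok;
    }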
14.10.2 Bug Pattern Identification
Bug Pattern Identification consists of identifying known programming language and
library bug patterns. Like Coding Standard Conformance, Bug Pattern Identification is not a
complete method: it is always possible to identify further patterns.

This method can be applied manually, but that is only feasible for very small systems.
Therefore, a tool for automating the method will be required in the majority of cases.
Fortunately, a significant number of high quality tools are available.
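
The C fragment below sketches two classic bug patterns that such tools typically detect, each with a safer rewrite; it is an illustration only, not an exhaustive catalogue.

    #include <string.h>

    void report(const char *msg)
    {
        char buf[16];

        /* Bug pattern: strcpy(buf, msg) may overflow buf for long inputs. */
        /* Safer idiom: bounded copy with guaranteed NUL termination.      */
        strncpy(buf, msg, sizeof(buf) - 1u);
        buf[sizeof(buf) - 1u] = '\0';
    }

    int check(int status)
    {
        /* Bug pattern: writing "if (status = 0)" assigns instead of
           comparing, so the branch is never taken and status is lost. */
        if (status == 0) {
            return 1;
        }
        return 0;
    }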
14.10.3 Software Metrics Analysis
Software Metrics Analysis consists of evaluating the quality of the software based upon a set
of extracted metrics such as McCabe’s cyclomatic complexity, the percentage of comments per
statement, the number of subprogram exit points, etc.

While some metrics are widely accepted – McCabe’s cyclomatic complexity is
probably the best example – many others vary or are particular to a specific tool. It is therefore
impossible to say that a tool is complete with respect to the set of metrics it provides. This
is not critical, because Software Metrics Analysis is to be used as a companion method; it is not
intended to be complete. Software Metrics Analysis may support the safety and dependability
subtask by using complexity measures to point to complex code areas more likely to contain
software faults.
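
As an illustration of the most widely accepted metric, cyclomatic complexity can be computed as the number of decision points plus one (equivalently v(G) = E - N + 2P over the control flow graph). The function below has two decision points (the for loop and the if), hence v(G) = 3:

    #include <stdint.h>

    int32_t count_negatives(const int32_t v[], int32_t n)
    {
        int32_t count = 0;
        for (int32_t i = 0; i < n; i++) { /* decision point 1 */
            if (v[i] < 0) {               /* decision point 2 */
                count++;
            }
        }
        return count;
    }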
14.11 Traceability Analysis
The traceability analysis method consists of analysing the tracing of (i.e., finding the
correspondence between) specific items of one lifecycle phase and items of another lifecycle phase.
Typically, items are traced across adjacent lifecycle phases, and the traceability can be established
from inputs to outputs (forward traceability, e.g. tracing software requirements to architectural
elements) or from outputs to inputs (backwards traceability, e.g. tracing architectural elements
to technical requirements). The main purpose of traceability analysis is to check the
consistency and completeness of the work products being reviewed.

Traceability analysis is performed by analysing a table with at least two columns, the so-called
traceability matrix. In the case of backwards traceability, the first column is filled with the
outputs of the phase (e.g. for architectural design analysis, all the design elements) and then,
for every output, the analyst checks for the matching inputs (e.g. for architectural design
analysis, all the software requirements).
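
A fragment of a backwards traceability matrix for an architectural design analysis might look as follows (the identifiers are illustrative only); an output with no matching input, as in the last row, points to a potential completeness issue:

    Design element       Traced software requirement(s)
    TM_Router            SWR-0120, SWR-0121
    HK_Collector         SWR-0087
    Watchdog_Manager     (none found – to be raised as a finding)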
14.12 Walkthrough
The walkthrough technique can be defined as a minimalist form of review. The main difference
between the walkthrough and review techniques is that in a walkthrough a full reading of the
review item is not performed. Instead of a full reading, one walks through the items under
review, stopping only whenever one of the reviewers has a question or a comment. The
objective of a walkthrough is to evaluate a specific software element (e.g. document, source
module), attempting to identify defects and consider possible solutions. In contrast with other
forms of review, secondary objectives are to educate and to resolve stylistic problems.
Besides that, the preparatory effort and the number of participants are typically smaller than in a
review. The walkthrough relies more on the discussion between the team members to find
inconsistencies in the work product than on the preparatory work done by each member.

15.0 Annex G – Checklists


15.1 Generic Document Review Checklists
1. Cover page Verified
1.1. Does the document have a title?
1.2. Is the project name given on the cover page?
1.3. Does the document have an identifier that can be used to refer to it?
1.4. Is the document version indicated?
1.5. Is the document release date indicated?
1.6. Is there an identification of the organization that produced the document?
1.7. Is the document status indicated (e.g. draft, final)?

2. Table of contents Verified


2.1. Is there a table of contents?
2.2. Is there a list of figures?
2.3. Is there a list of tables?

3. Revision history Verified


3.1. Is a revision history provided?
3.2. Does each entry in the history include the version, author, date and a description of the changes?
3.3. Are the major document milestones referred to and easily identifiable?
3.4. Is the most recent entry of the history consistent with the date and version given on the cover page?

4. Introduction section Verified


4.1. Does the document include an introductory section or chapter?
4.2. Is the objective of the document stated in the introduction?
4.3. Is the document scope identified?
4.4. Is the intended audience identified?
4.5. Are all the acronyms used throughout the document listed and defined?
4.6. Is there a glossary with the definition of specific terms used throughout the document (if any)?
4.7. Is the document structure described?
4.8. Are all the necessary references to other documents correctly included?

5. Consistency, adequacy, completeness and readability Verified


5.1. Are the document pages numbered?
5.2. Are the styles used in the document consistent throughout the document?
5.3. Does every single figure have a caption?
5.4. Does every single table have a caption?
5.5. Are all the document elements readable?
5.6. Are figures described, and does the description follow the same order as the presented elements?
5.7. Are the presented items described, and do the descriptions appear in the same order as the items?
5.8. Does the document make proper use of language?
5.9. Is the documentation adequate for its purposes?
5.10. Is the documentation complete and consistent?
5.11. Was the documentation preparation timely?
5.12. Does the configuration management of documents follow the specified procedures?

15.2 Requirements Review Checklists


Data Type Consistency Scenario

1. Identify all data objects mentioned in the overview (e.g., hardware component, application variable, abbreviated term or function).
(a) Are all data objects mentioned in the overview listed in the external interface section?

2. For each data object appearing in the external interface section determine the following information:
- Object name:
- Class: (e.g., input port, output port, application variable, abbreviated term, function)
- Data type: (e.g., integer, time, Boolean, enumeration)
- Acceptable values: Are there any constraints, ranges or limits for the values of this object?
- Failure value: Does the object have a special failure value?
- Units or rates:
- Initial value:
(a) Is the object's specification consistent with its description in the overview?
(b) If object represents a physical quantity, are its units properly specified?
(c) If the object's value is computed, can that computation generate a non-acceptable value?

3. For each functional requirement identify all data object references:


(a) Do all data object references obey formatting conventions?
(b) Are all data objects referenced in this requirement listed in the input or output sections?
(c) Can any data object use be inconsistent with the data object's type, acceptable values, failure value, etc.?
(d) Can any data object definition be inconsistent with the data object's type, acceptable values, failure value, etc.?

Incorrect Functionality Scenario

1. For each functional requirement identify all input/output data objects:


(a) Are all values written to each output data object consistent with its intended function?
(b) Identify at least one function that uses each output data object.

2. For each functional requirement identify all specified system events:


(a) Is the specification of these events consistent with their intended interpretation?

3. Develop an invariant for each system mode (i.e. under what conditions the system must exit or remain in a given mode).
(a) Can the system's initial conditions fail to satisfy the initial mode's invariant?
(b) Identify a sequence of events that allows the system to enter a mode without satisfying the mode's invariant.
(c) Identify a sequence of events that allows the system to enter a mode, but never leave (deadlock).

Ambiguities Or Missing Functionality Scenario

1. Identify the required precision, response time, etc. for each functional requirement.
(a) Are all required precisions indicated?

2. For each requirement, identify all monitored events.


(a) Does a sequence of events exist for which multiple output values can be computed?
(b) Does a sequence of events exist for which no output value will be computed?

3. For each system mode, identify all monitored events.


(a) Does a sequence of events exist for which transitions into two or more system modes are allowed?

1. Correctness & completeness Verified


Do the software requirements correctly and completely implement the system architecture and system requirements they
are traced to?
Are the software requirements externally and internally consistent (not implying formal proof of consistency)?
Are the software requirements verifiable?
Do the software requirements imply the feasibility of the software design?

Do the software requirements imply the feasibility of operations and maintenance?


Are the software requirements related to safety, security, and criticality correct as shown by suitably rigorous methods?

15.3 Architectural Design Review Checklist


1. Interfaces Verified
1.1. Is every single interface exposed by a component required by at least one other component?
1.2. Is every single interface required by a component provided by some other component?
1.3. Is the description of the data format, timing characteristics, accuracy, etc. consistent between the IF supplier and the
IF user?
1.4. Is the layout used for interface description consistent throughout the architectural design document?
1.5. Is the level of detail of the interface description the same for all the interfaces?
1.6. Is the description of each interface sufficient and clear?
1.7. Is the number of interfaces reduced to the minimum necessary to expose the component functionality?
1.8. Is the number of interface parameters minimised (i.e. minimum data passed at each interface)?

2. Correctness & completeness Verified


2.1. Does each component correctly implement the requirements it is traced to?
2.2. Is the architectural design of the software components correct with respect to the applicable interface control
documents?
2.3. Has the architecture been adequately decomposed (i.e. complexity and modularity) in accordance with quality
requirements?
2.4. Have the system functions been appropriately allocated to components?
2.5. Is the traceability to the upper level documents – namely the technical specification – provided?
2.6. Does the architecture promote separation between logic and data?
2.7. In case of real-time software is the computational model provided as well as the timing and size budgets?
2.8. Were the timing and sizing budgets of the software correctly allocated?
2.9. Does the design implement proper sequence of events, inputs, outputs, interfaces, logic flow, allocation of timing
and sizing budgets, and error handling?

3. Dependability and safety Verified


3.1. Does the architecture minimise the number of critical components?
3.2. Does the architecture properly implement Fault Detection Isolation and Recovery (FDIR) mechanisms?
3.3. Does the architecture avoid unnecessary redundancy and complexity?
3.4. Does the architecture adequately address maintainability issues?
3.5. Are hazardous design constructs avoided?
3.6. Are the implemented corrective actions independent of the failing device or component?
3.7. Does the design implement safety, security, and other critical requirements correctly as shown by suitable rigorous
methods?

4. Feasibility of subsequent phases Verified


4.1. Does the architecture provide an adequate base for producing a detailed design?
4.2. Can the architectural design be easily tested in an incremental fashion?
4.3. Does the component design allow the specification of a clear acceptance criterion for validating it?
4.4. Does the architectural design provide an adequate base for operation and maintenance?

5. Clarity and readability Verified


5.1. Is an overview of the system context provided?
5.2. Are unnecessary details avoided in the architectural diagrams and their descriptions?

5.3. Is a model provided for each of the main architectural components?


5.4. Are all the components presented using the same layout and similar level of detail?

15.4 Detailed Design Review Checklist


1. Interfaces Verified
1.1. Is every single interface exposed by a unit required by at least one other unit?
1.2. Is every single interface required by a unit provided by some other unit?
1.3. Is the description of the data format, timing characteristics, accuracy, etc. consistent between the IF supplier and the
IF user?
1.4. Is the layout used for interface description consistent throughout the detailed design document?
1.5. Is the level of detail of the interface description the same for all the interfaces?
1.6. Is the description of each interface sufficient and clear?
1.7. Is the number of public interfaces reduced to the minimum necessary to implement the required functionality?
1.8. Is the number of function parameters minimised (i.e. the minimum data passed)?
1.9. Does the detailed design avoid the use of global data (e.g. all class data is declared private or at least protected)?

2. Correctness & completeness Verified


2.1. Do the detailed design units correctly and completely implement the architectural components they are traced to?
2.2. Do the detailed design units correctly implement the requirements they are traced to?
2.3. Is the detailed design of the software units correct with respect to the applicable interface control documents?
2.4. Does each component have high internal cohesion and low external coupling?
2.5. Does the detailed design refer to all the applicable programming standards?
2.6. Are the development tools listed and the development environment described?
2.7. Is the traceability to the upper level documents – namely the architectural design – provided?
2.8. Does the detailed design promote a clear separation between logic and data?
2.9. Were the timing and sizing budgets of the software correctly addressed and expanded into further detail?
2.10. Are all the function parameters described? Is it specified whether they are IN, OUT or IN-OUT?
2.11. Are the function return values (and exceptions that can be raised, if any) described?
2.12. Does the design implement proper sequence of events, inputs, outputs, interfaces, logic flow, allocation of timing
and sizing budgets, and error handling?

3. Dependability and safety Verified


3.1. Does the detailed design minimise the number of critical units?
3.2. Does the detailed design properly implement Fault Detection Isolation and Recovery (FDIR) mechanisms?
3.3. Does the detailed design avoid unnecessary redundancy and complexity?
3.4. Does the detailed design adequately address maintainability issues?
3.5. Are hazardous design constructs avoided?
3.6. Are inputs verified at least at interface level?
3.7. Are there proper handlers for all the errors that can be returned and exceptions that can be raised?
3.8. Does the design implement safety, security, and other critical requirements correctly as shown by suitable rigorous
methods?

4. Feasibility of subsequent phases Verified


4.1. Does the detailed design provide an adequate base for coding?
4.2. Can every single detailed design unit be easily tested?
4.3. Does the unit design allow the specification of a clear acceptance criterion for validating it?
4.4. Does the detailed design provide an adequate base for operation and maintenance?

5. Clarity and readability Verified


5.1. Is an overview of each design component provided before the design units it comprises are described?
5.2. Are unnecessary details avoided in the description of each design unit?
5.3. Are all the units presented using the same layout and a similar level of detail?

15.5 Code Inspection Checklist


1. Structure Verified
1.1. Does the code completely and correctly implement the design?
1.2. Does the code conform to any pertinent coding standards?
1.3. Is the code well-structured, consistent in style, and consistently formatted? Can the internal consistency between
software units be ensured?
1.4. Are there any uncalled or unneeded procedures or any unreachable code?
1.5. Are there any leftover stubs or test routines in the code?
1.6. Can any code be replaced by calls to external reusable components or library functions?
1.7. Are there any blocks of repeated code that could be condensed into a single procedure?
1.8. Are symbolic constants used rather than “magic number” constants or string constants?
1.9. Are any modules excessively complex, such that they should be restructured or split into multiple routines?

2. Documentation Verified
2.1. Is the code clearly and adequately documented with an easy-to-maintain commenting style?
2.2. Are all comments consistent with the code?

3. Variables Verified
3.1. Are all variables properly defined with meaningful, consistent, and clear names?
3.2. Do all assigned variables have proper type consistency or casting?
3.3. Are there any redundant or unused variables?

4. Arithmetic operations Verified


4.1. Does the code avoid comparing floating-point numbers for equality?
4.2. Does the code systematically prevent rounding errors?
4.3. Does the code avoid additions and subtractions on numbers with greatly different magnitudes?
4.4. Are divisors tested for zero or noise?

5. Loops and branches Verified


5.1. Are all loops, branches, and logic constructs complete, correct, and properly nested?
5.2. Are the most common cases tested first in IF-ELSEIF chains?
5.3. Are all cases covered in an IF-ELSEIF or CASE block, including ELSE or DEFAULT clauses?
5.4. Does every case statement have a default?
5.5. Are loop termination conditions obvious and invariably achievable?
5.6. Are indexes or subscripts properly initialized, just prior to the loop?
5.7. Can any statements that are enclosed within loops be placed outside the loops?
5.8. Does the code in the loop avoid manipulating the index variable or using it upon exit from the loop?

6. Defensive programming Verified


6.1. Are indexes, pointers, and subscripts tested against array, record, or file bounds?
6.2. Are imported data and input arguments tested for validity and completeness?
6.3. Are all output variables assigned?

6.4. Are the correct data operated on in each statement?
6.5. Is every memory allocation deallocated?
6.6. Are timeouts or error traps used for external device accesses?
6.7. Are files checked for existence before attempting to access them?
6.8. Are all files and devices left in the correct state upon program termination?
6.9. Are all run-time errors absent?
6.10. Does the code implement a proper event sequence, consistent interfaces, correct data and control flow, completeness,
appropriate allocation of timing and sizing budgets, and error handling?
6.11. Does the code implement safety, security, and other critical requirements correctly, as shown by suitably rigorous
methods?
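
As an illustration of several of the items above (symbolic constants instead of magic numbers, no floating-point equality comparison, a default clause in every case statement), the short C sketch below shows conformant code; the names and tolerance value are illustrative only.

    #include <math.h>

    #define N_SENSORS 8      /* symbolic constant, not a magic number (1.8) */
    #define EPSILON   1.0e-6 /* illustrative tolerance                      */

    double average(const double v[N_SENSORS])
    {
        double sum = 0.0;
        for (int i = 0; i < N_SENSORS; i++) {
            sum += v[i];
        }
        return sum / (double)N_SENSORS;
    }

    int is_zero(double x)
    {
        /* 4.1: compare against a tolerance, never for exact equality. */
        return fabs(x) < EPSILON;
    }

    void dispatch(int mode)
    {
        switch (mode) {
        case 0:  /* nominal mode handling */ break;
        case 1:  /* safe mode handling    */ break;
        default: /* 5.4: every case statement has a default */ break;
        }
    }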

15.6 Test Plan Review Checklist


1. Completeness Verified
1.1. Does the test plan clearly identify the items to be tested (e.g. requirements, specific components, etc.)?
1.2. Does the test plan identify the software features to be tested?
1.3. Does the test plan identify the software features not to be tested?
1.4. Is the reason for not testing a particular feature provided?
1.5. Is the testing approach described?
1.6. Are the pass/fail criteria for each test item defined?
1.7. Are all the test deliverables defined?
1.8. Are the testing tasks specified?
1.9. Are the necessary testing tools and environment described?
1.10. Are the testing coverage goals identified?
1.11. Are the required staffing and training needs specified?
1.12. Are the test milestones identified in the project schedule referenced?
1.13. Were the potential risks and respective contingencies identified?

2. Correctness Verified
2.1. Is the reason for not testing a particular feature acceptable and sufficient?
2.2. Is the pass/fail criterion for each test item correct?
2.3. Is the test plan conformant with the project testing strategy in which respects to the types of tests to be performed?
2.4. Are the specified test coverage goals in conformance with the criticality of the project?
2.5. Is the test plan in conformance with the project plan?
2.6. Are the identified staffing and training appropriate for the testing tasks that are to be performed?
2.7. Can the described contingencies overcome the identified risks?

3. Feasibility Verified
3.1. Is the defined testing approach feasible?
3.2. Are the different testing roles and responsibilities correctly defined and assigned?
3.3. Are the specified environment needs and selected tools feasible for executing the necessary testing activities?
3.4. Is the proposed schedule as well as the identified training needs feasible?
3.5. Are the identified contingencies feasible taking into account this test plan and the project schedule?

15.7 HSIA Checklist


HARDWARE SOFTWARE INTERACTION ANALYSIS (HSIA)

HSIA ID:                    HW System:              SW System:
FMECA Reference:            HW Subsystem:           SW Subsystem:
Failure Mode:               Failure Criticality:
Failure Mode Definition:

Questions & Findings (yes/no):
1. Is sufficient and reliable information about the failure available?

2. Is failure mode information reported beyond the system?

3. Does the software initiate a corrective action that safe-keeps the system and negates the effects of the failure?

4. Are there fault tolerance characteristics in place to compensate for the failure mode?

5. Is there a possibility of external intervention (especially when the reaction fails)?

6. Is the immediate corrective action not built on top of the function that has failed?

7. Is the failure recovery completely performed within the system?

8. Is it verified that software does not stress the hardware?

9. Is the failure detection and reaction executed within appropriate time limits?

10. Does the implemented software remove any possibility of a Single Point of Failure?

Summary:
Software detection:
Risks accepted / identified:
Recommendations:
Update FMECA:
Issues identified:
Comments:

1. Is sufficient and reliable information about the failure available?


This question permits verifying that a software error detection mechanism can potentially
have sufficient information to correctly detect and identify the failure. It should also be verified
that this information does not lead to a false detection. Note that since detection is the first
interaction between the software and the hardware failure, an inaccurate detection will
drastically affect all subsequent software actions regarding the failure.

2. Is failure mode information reported beyond the system?


This question was added to determine whether, regardless of successful or possible failure recovery,
at least the failure is reported (either explicitly or in periodic HK). The main reason for this
question is that it was found, during the functional HSIA, that in several cases the software
is unable to recover from hardware failures. In those cases, but not limited to them, the software
shall attempt to accurately report the detected failure, thus allowing the possibility of external
intervention (such as by the S/C OBS).

3. Does the software initiate a corrective action that safe-keeps the system and negates the effects
of the failure?
This question shall verify that the software initiates a corrective action upon a hardware failure.
By “corrective action” it is meant that the software reaction keeps the hardware failure from
compromising the mission.

4. Are there fault tolerance characteristics in place to compensate for the failure mode?
This question aims to determine whether the failure mode was taken into account by the
design, or imposed by the requirements, of both hardware and software. Examples of
characteristics to look for are the presence of redundant units, alternative
communication links, error correction codes (e.g. CRC), etc.

5. Is there a possibility of external intervention (especially when the reaction fails)?

As mentioned in question two, there are several cases where the recovery fails or cannot be
performed within the system. The answer to this question shall identify whether the HW/SW
system under analysis implements mechanisms that allow the recovery actions to be triggered
or performed through external intervention.

6. Is the immediate corrective action not built on top of the function that has failed?
An example of a situation that shall be checked is trying to send an error report when
the failure mode affects the communication link. Another example is a situation where the
failure detection is based on the received HK, but the HK may be incorrect due to that failure.

7. Is the failure recovery completely performed within the system?


This question shall be used to identify the level of autonomy of the HW/SW system under
analysis. Under ideal conditions, all the failure modes can be recovered from without external
intervention. Note that the relevance of this question depends on how accessible the HW/SW
system is (compare e.g. a personal laptop with a spacecraft orbiting Pluto).

8. Is it verified that software does not stress the hardware?


This point shall verify that the actions taken by the software (in both nominal and failure recovery
modes) do not violate the operational constraints of the hardware. It shall be ensured that
software failures do not stress the hardware. One shall consider not just the possibility of HW
damage, but also the side effects of HW stress and abnormal function due to incorrect SW
actions (e.g. incorrect use of a link may not damage the link controller but may cause further
errors in the SW).

9. Is the failure detection and reaction executed within appropriate time limits?
This question shall verify whether the failure is detected early enough to leave sufficient spare
time to perform the respective corrective action, and whether the corrective action itself is
performed within a time limit that safe-keeps the system.

10. Does the implemented software remove any possibility of a Single Point of Failure?
This is a conclusion drawn from the preceding check-points, adapted to the failure
management philosophy. There shall be no possibility that any single failure can bring the
module into an unrecoverable state.

15.8 Validation Checklist


1. Checklist for Performing Analysis for ISVV level 1
1.1. Are events happening too early/late handled correctly?
1.2. Analyze FDIR requirements
1.3. Are injections outside or at the specified boundaries handled correctly?
1.4. Are potential runtime errors and overflow/underflow taken care of?
1.5. Investigate dataflow conflicts.
1.6. Are worst case situations handled correctly?
1.7. Inspect the Software User Manual focusing on operator failure.

16.0 Annex H – Software Validation Facility


The software validation facility must be able to support all the test cases identified. The final
construction of the validation facility depends on the actual project. This is especially the
case when the facility must provide project-specific reactions to stimuli from the environment:
• The generic software validation facility can provide general facilities for monitoring and
control of the software under test, and
• The software validation facility must allow for extensions that represent the project-specific
aspects.

However, it is possible to state a number of general requirements:


• The software validation facility shall enable execution of the OBS under test in a flight
representative environment (“validation in context”).
• The software validation facility shall enable execution of the flight image of the OBS under test.
• The software validation facility shall support error injection on all levels, e.g.,
- memory corruption
- data bus failure
- erroneous data from subsystems
- communication failure (e.g., response too early/late, missing/duplicated packets)
• The software validation facility shall support event triggered actions. The events shall
include:
- OBS reading or writing to specific memory locations
- time based events
• The software validation facility shall support white box test mode:
- inspection of the internal software state
- control of the execution of the processor
• The software validation facility shall support black box test mode, controlling the input to
the OBS under test and monitoring the output.
• The software validation facility shall use a central clock representing the target processor
time (Simulated Real Time) as its global time reference, and relate all time events to this.
• The software validation facility shall provide a test script language to support regression
testing of OBS through batch test execution.
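
As an illustration of the error injection and event-triggered action requirements above, the C sketch below shows how a fault scenario might be set up against a validation facility. All facility services (sim_register_write_watch, sim_corrupt_memory, sim_schedule_at) and addresses are hypothetical names introduced for this example only.

    #include <stdint.h>

    typedef void (*injection_fn)(void);

    /* Hypothetical facility services; names and signatures are illustrative. */
    extern void sim_register_write_watch(uint32_t addr, injection_fn fn);
    extern void sim_corrupt_memory(uint32_t addr, uint32_t xor_mask);
    extern void sim_schedule_at(uint64_t sim_time_us, injection_fn fn);

    static void corrupt_hk_buffer(void)
    {
        /* Flip one bit in an assumed housekeeping buffer to emulate
           memory corruption. */
        sim_corrupt_memory(0x40001000u, 0x00000001u);
    }

    void setup_fault_scenario(void)
    {
        /* Event-triggered action: inject when the OBS writes this location. */
        sim_register_write_watch(0x40002000u, corrupt_hk_buffer);

        /* Time-based event on the Simulated Real Time clock (5 s). */
        sim_schedule_at(5000000u, corrupt_hk_buffer);
    }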
