Foreword
The development and delivery of materiel capability solutions is a complex business that
requires a considered and consistent Verification and Validation (V&V) process. This V&V
Manual is a major step toward realising such a process for the DMO.
V&V is about managing and reducing the numerous project risks that we face each day. Better
V&V processes will allow us to identify defects or issues associated with our projects earlier,
thus allowing us to remedy problems before they escalate, both in consequence and cost.
The V&V Manual provides guidance to DMO staff on how to gather the V&V evidence
required for system acceptance and acceptance into service. It will ensure that activities
required to achieve V&V are better planned, through the Test Concept Document (TCD) and
the Test and Evaluation Master Plan (TEMP). It will also place greater emphasis on a wider
range of V&V activities, including those conducted early in the Materiel Life Cycle.
The V&V Manual aims to standardise the way in which the planning of V&V activities is
conducted in the DMO. The V&V Manual also discusses how V&V issues should be
prioritised. The V&V Manual will bring DMO processes, such as the development of TEMPs,
in line with best industry practice by using the ASDEFCON philosophy of V&V that is based
on the approach to V&V discussed in the EIA-632 systems engineering standard.
The development of the V&V Manual is part of a wider range of initiatives under the Materiel
Acquisition and Sustainment Framework (MASF) aimed at improving our business outcomes
and delivering complete capability solutions to the ADF.
S. GUMLEY
CEO DMO
May 2004
Approving Authority
Under Secretary Defence Materiel
Defence Materiel Organisation
Russell Offices R2-5-C131
Canberra ACT 2600
Email: [email protected]

SME
Michael Polya
Materiel Policy and Services Branch
Russell Offices R2-6-C147
Telephone: (02) 6266 0105
E-mail: [email protected]

Sponsor
Head Electronic Systems
Russell Offices R2-5-C081
CANBERRA ACT 2600
Produced and supported by the Materiel Policy and Services Branch of Electronic Systems Division, Defence
Materiel Organisation.
© Australian Government (Department of Defence) 2004.
This work is copyright. Apart from any use as permitted under the Copyright Act 1968, no part may be
reproduced by any process without prior written permission from the Department of Defence.
DRAFT
4
Contents
FOREWORD
CONTENTS
1 INTRODUCTION
1.1 INTRODUCTION
1.2 V&V IN DMO
1.3 THE ROLE OF V&V
1.4 DMO POLICY
1.5 V&V MANUAL
1.6 DEFENCE T&E POLICY
1.7 IMPORTANCE OF TIMELY V&V
1.8 FUNDING
1.9 RISK MANAGEMENT
1.10 SYSTEMS ENGINEERING
1.11 VARIATIONS BETWEEN SERVICES AND PROJECT CATEGORIES
1.12 KEY DEFINITIONS
2. SCOPE AND EXCLUSIONS
2.1 SCOPE
2.2 EXCLUSIONS
3. V&V REFERENCES
3.1 CURRENT STANDARDS AND POLICIES RELEVANT TO V&V
3.2 OTHER DOCUMENTS REFERRED TO IN THIS MANUAL
4. V&V PRINCIPLES
4.1 V&V OVERVIEW
4.2 SUMMARY OF DMO RESPONSIBILITIES IN RELATION TO V&V
4.3 T&E ORGANISATIONS
5. V&V PLANNING
5.1 V&V PLANS NEEDED FOR A PROJECT AND THEIR ROLES
6. NEEDS PHASE V&V
6.1 INTRODUCTION
6.2 V&V TASKS IN THE NEEDS PHASE
7. REQUIREMENTS PHASE V&V
7.1 V&V IN THE FIRST PASS PHASE
7.2 V&V IN THE SECOND PASS PHASE
8. ACQUISITION PHASE V&V
8.1 V&V IN THE SOLICITATION PHASE
8.2 V&V WHEN MANAGING THE CONTRACT AND PROJECT COMMITMENTS
8.2.1 V&V required for Design Acceptance
8.2.2 V&V required for System Acceptance
8.3 V&V IN THE TRANSITION INTO SERVICE PHASE
8.3.1 V&V required for Operational Acceptance / Release (Acceptance Into Service)
8.4 V&V REQUIRED FOR PROJECT CLOSURE
9 IN-SERVICE PHASE V&V
9.1 IN-SERVICE PHASE V&V OVERVIEW
1 Introduction
1.1 Introduction
Contemporary capability systems are often characterised by complexity and high levels of
integration across both mission and support systems. The ADF seeks operational advantages
through capability systems that are usually at the forefront of what is technically feasible. The
successful fielding of such systems is often threatened by significant risks in a number of
forms. An essential tool in managing these risks is the ability to make objective assessments of
a materiel system’s development throughout its life cycle, and thus determine that the system
being acquired, upgraded or modified is compliant with requirements and fit for purpose.
1.2 V&V in DMO

Verification and validation (V&V) are the principal activities adopted within DMO to allow
for the objective assessment of a capability system as it progresses through the Materiel Life
Cycle. The DMO approach to V&V is founded on the accepted approach described by the
ASDEFCON contract templates which provide for concurrent V&V of the mission system and
support system, and progressive V&V throughout the duration of the contract. The aim of this
two-pronged strategy is to ensure, through the provision, examination and evaluation of objective evidence, that project acceptance milestones are achieved within the allocated budget and schedule, and that the mission system can be properly supported during its in-service phase. Additionally, the terminology of V&V is consistent with the key international
standards for the design and development of systems and software, which reflect the systems
engineering processes utilised by the DMO.
1.3 The Role of V&V

The role of verification is to provide confirmation that a system complies with its specified
requirements, while the role of validation is to provide proof that the system capability (i.e. the combined capabilities of the mission system and the support system) satisfies the
user’s needs. Given the general use of systems engineering (SE) processes across the Materiel
Life Cycle, there is an additional role for V&V to validate the adequacy of requirements
baselines and system specifications, and to verify that requirements and system specifications
are entirely consistent with higher level requirements.
1.4 DMO Policy

DI (M) X-XX ‘DMO Verification & Validation Policy’ was developed to clearly enunciate the
role of V&V in DMO, how V&V is to be applied and who is responsible for applying it. The
policy also identifies the sources of objective evidence necessary to satisfy DMO V&V
requirements. DI (M) X-XX ‘Acceptance Process for New, Modified or Upgraded
Capability’ describes how V&V is to be applied to support a progressive and sequential
acceptance process thereby ensuring that a capability system achieves the required milestones
on time and on budget, culminating in Operational Release/Acceptance.
1.5 V&V Manual

The V&V Manual was developed to provide DMO System Program Offices (SPOs), in
particular project engineering managers and test and evaluation (T&E) managers, with further
instruction on how to implement the DI (M)s on V&V in major capital equipment acquisition
and sustainment projects.
1.6 Defence T&E Policy

The purpose of T&E in Defence is to obtain information to support the objective assessment
of a capability system with known confidence. Defence T&E Policy provides that the results
from T&E are to be used within the Materiel Life Cycle to inform decision making and
mitigate risk. Objective evidence for the purposes of T&E can be obtained through physical
testing, modelling, simulation, demonstration and inspection. DI (G) OPS 43-1 Defence T&E
Policy provides that key milestones for acquisition life cycle management require some form
of V&V through the results of T&E so that risk is contained within acceptable boundaries, and
that the intended system meets safety standards and the end users’ requirements.
The DMO V&V policy is compliant with Defence T&E Policy while still remaining consistent
with the ASDEFCON approach. The ASDEFCON approach acknowledges T&E's significant role in determining a system's fitness for purpose or contractual compliance, while treating T&E as one of a number of methods that can satisfy DMO V&V requirements. Other methods include analysis methods,
audits, walkthroughs, system reviews and documentation reviews. By acknowledging all the
methods that can be considered for satisfying DMO V&V requirements, project and test
managers can employ the most appropriate methods for determining the acceptability of
systems development products, including concepts, requirements, designs and delivered
equipment.
1.7 Importance of Timely V&V

A key outcome from a well managed V&V program is the management of cost, technical, capability, safety and schedule risk, achieved through the early identification and resolution of defects in requirements, system design, development and construction. Significant
cost savings can be achieved by identifying and rectifying defects early in the Materiel Life
Cycle. As shown in Figure 1, rectifying a defect during concept development can be up to
1000 times cheaper than rectifying the defect once the system has been introduced into
service.
Figure 1: Relative cost of correcting defects over the Materiel Life Cycle (DMO 2003)

[Figure: chart plotting committed costs and cumulative life cycle cost (per cent) against time across the Concept, Design, Development, Test, Production and Operation-through-to-Disposal phases. Committed costs reach approximately 70% by Design, 85% by Development and 95% by Production, while the relative cost to extract defects rises from 3-6× in Development to 20-100× in Test and 500-1000× in Operation, indicating where early focus is needed. Source: Defense Systems Management College, 9-93.]
Consequently, a V&V program must be developed and implemented from early in the Materiel Life Cycle. This requires placing emphasis on requirements analysis, operational assessments and reviews of the contractor's test processes prior to contract signature, and on validation of the outputs from requirements analysis, functional allocation and the elicitation of operational requirements. Commonwealth review of the contractor's progressive testing during integration, from the lowest level of hardware and software assembly upwards, also requires emphasis. Early testing is particularly important in the context of software V&V: if a defect is not identified in unit testing, it is more difficult, and therefore more expensive, to identify and remedy the defect in component, integration or system level testing.
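To make the cost argument in Figure 1 concrete, the following is a minimal sketch assuming the multiplier ranges shown in the figure; the base cost and the midpoints taken within each range are illustrative assumptions only:

```python
# Illustrative only: relative defect-rectification cost multipliers read from
# Figure 1; the midpoints chosen within each range are assumptions.
RELATIVE_COST = {
    "concept/design": 1,   # baseline: defect fixed at its source
    "development": 5,      # 3-6x
    "test": 60,            # 20-100x (midpoint assumption)
    "operation": 750,      # 500-1000x (midpoint assumption)
}

def rectification_cost(base_cost: float, phase_found: str) -> float:
    """Estimated cost to fix a defect, given the phase in which it is found."""
    return base_cost * RELATIVE_COST[phase_found]

# A hypothetical defect costing $2,000 to fix at the concept stage:
for phase in RELATIVE_COST:
    print(f"{phase:15s} ${rectification_cost(2000, phase):>12,.0f}")
```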
1.8 Funding
The TCD indicates, well in advance, to the DMO SPO and the relevant Service(s) what the
T&E activities are expected to cost and therefore reduces the risk of cancellation or
postponement of major T&E activities. The DMO SPO will not usually fund all of the T&E
costs. For example, a part of the OT&E costs may be borne by CDG or the relevant Service(s)
(since the operators will be participating in OT&E). The completed TCD, produced at Second
Pass approval, is to contain a funding estimate which must identify the extent of the costs of
T&E activities that will be paid for by the DMO, using project funding, and what will be paid
for by other groups in Defence. The majority of funds for V&V would be spent in conducting
T&E activities.
T&E, on average, costs around 13% of the total project acquisition cost (Crouch 1995). This
figure would be higher for high-risk projects and lower for low-risk projects. The funds
allocated to T&E would be lower for commercial off-the-shelf (COTS) integration projects than for projects with a high proportion of new design, since less Developmental Test and Evaluation (DT&E) is required.
It is also important when planning a V&V activity to ensure that the value to be obtained from the activity is greater than the cost of conducting it. The extent to which financial, schedule and technical risk is likely to be mitigated by the V&V activity should be significant in comparison to the costs and risks associated with the actual conduct of the activity.
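This value-versus-cost test can be expressed as a simple expected-value comparison. The sketch below is illustrative only; the probabilities, costs and risk-reduction fraction are hypothetical assumptions:

```python
# Minimal sketch: proceed with a V&V activity only if the risk exposure it is
# expected to retire exceeds its cost. All figures are hypothetical.
def worth_conducting(p_defect: float, consequence: float,
                     risk_reduction: float, activity_cost: float) -> bool:
    """Return True if the expected loss avoided exceeds the activity cost.

    p_defect       - estimated probability that the targeted defect exists
    consequence    - estimated cost impact if the defect reaches service
    risk_reduction - fraction of that exposure the activity should retire
    """
    expected_loss_avoided = p_defect * consequence * risk_reduction
    return expected_loss_avoided > activity_cost

# e.g. a $50k test campaign targeting a 30%-likely defect with a $1M impact:
print(worth_conducting(0.3, 1_000_000, 0.8, 50_000))  # True: $240k > $50k
```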
1.9 Risk Management

Risk is defined as the chance of an event happening that will have an impact upon objectives. It is measured in terms of consequence and likelihood: the risk exposure is the estimated probability that the event will occur multiplied by the consequence of that event.
The greater the risk associated with a function performed by a system, the higher the priority
of the V&V activities associated with that function. Thus the need for, and level of, V&V
should be based on a risk assessment.
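As a minimal sketch of this risk-based prioritisation, assuming the definition of risk given above (likelihood multiplied by consequence); the functions and scores below are hypothetical:

```python
# Sketch of risk-based prioritisation of V&V effort: risk exposure is
# likelihood x consequence, per the definition above. Data is hypothetical.
functions = [
    # (system function, likelihood 0-1, consequence in arbitrary cost units)
    ("weapon release interlock", 0.05, 1000),  # safety critical
    ("navigation data fusion",   0.20, 400),   # mission critical
    ("crew comfort monitoring",  0.30, 10),
]

# Higher risk exposure -> higher V&V priority.
by_priority = sorted(functions, key=lambda f: f[1] * f[2], reverse=True)
for name, likelihood, consequence in by_priority:
    print(f"{name:28s} risk exposure = {likelihood * consequence:7.1f}")
```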
Risks may be cost, schedule or performance related. Risks to achieving the desired capability can be categorised as either mission critical or safety critical. The magnitude of the safety critical risk associated with a function is assessed through various types of hazard assessments.
Safety thread in QEMS provides further details on the process of conducting these
assessments.
V&V is an effective risk management tool through which system deficiencies can be
identified and rectified early in the system development process. Early identification and
rectification ensures that the system being developed achieves key milestones and meets the
user’s needs, and results in significant cost savings. An important part of V&V planning is to
prioritise the deficiencies associated with non-compliances, where the prioritisation is based on the assessed likelihood and consequence of each deficiency.
This prioritisation will guide the level of effort applied to remedying the deficiencies by
improving the design processes or the design of the system. Through analysis of the set of
defects, V&V may also allow the identification and rectification of systemic defects in the
developmental processes.
An acceptable level of risk is determined in consultation with the Capability Manager. Based
on these agreed levels the acceptance criteria that the system will be judged against in
Acceptance Testing are determined.
For example, the MIL-STD-498 guide on prioritisation of defects, shown in Table 1, could be
used as guidance for determining the number of unresolved defects, of a given level of
severity, that would be deemed to represent an acceptable level of risk. This could form one
of the acceptance criteria.
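A sketch of one possible form such an acceptance criterion could take is shown below; the five priority levels follow the MIL-STD-498 problem classification, while the thresholds are hypothetical and would be agreed with the Capability Manager:

```python
# Hedged sketch: an acceptance criterion limiting unresolved defects per
# severity level. Thresholds are assumptions, not mandated values.
MAX_UNRESOLVED = {1: 0, 2: 0, 3: 5, 4: 20, 5: 50}  # severity -> allowed count

def meets_acceptance_criterion(unresolved: dict[int, int]) -> bool:
    """unresolved maps severity (1 = most severe) to open defect count."""
    return all(unresolved.get(sev, 0) <= limit
               for sev, limit in MAX_UNRESOLVED.items())

print(meets_acceptance_criterion({1: 0, 2: 1, 3: 4}))  # False: a severity-2 defect is open
```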
Defect resolution can also be managed using a Failure Reporting and Corrective Action
System (FRACAS) as described in MIL-HDBK-2155.
1.10 Systems Engineering

V&V is used across the Materiel Life Cycle and within the underlying SE processes to provide a feedback mechanism whereby the output of each stage of system development is assessed for adequacy. Progression to the next stage of system development cannot take place until the outputs are validated and/or verified as appropriate. V&V across the Materiel Life Cycle is undertaken through the conduct of V&V activities, such as test or analysis activities; the results of these activities are then evaluated to determine whether they confirm the verification of a requirement or the validation of the system. The V&V is then confirmed in a series of system reviews and audits, at which technical data packages are presented for Commonwealth approval.
1.11 Variations between Services and Project Categories

ASDEFCON defines different acquisition processes for each of the materiel acquisition categories (i.e. Complex Materiel 1, Complex Materiel 2, Support or Strategic Materiel). Accordingly, the application of V&V will vary depending on the materiel acquisition category into which the project falls. Further details on the ASDEFCON approach to V&V for
each of these project types are provided in the relevant ASDEFCON Handbooks.
There are some variations to the V&V processes described in this Manual for the different
Services. These relate specifically to the variations in the T&E processes. These are described
in the specific T&E policies for each of the Services, which are listed below:
a. RAN: ABR 6205 – Naval Operational Test and Evaluation Manual (NOTEMAN)
b. RAAF: DI (AF) LOG 2-7 – Test and Evaluation of Technical Equipment
c. Army: DI (A) LOG 12-1 – Regulation of the Technical Integrity of Land Materiel.
1.12 Key Definitions

Listed below are definitions for some key terms used throughout this Exposure Draft V&V Manual. Definitions of other V&V-related terms used in this document are contained in the glossary at Annex A. Annex D contains a list of the acronyms used throughout this manual.
Acceptance Criteria – The criteria upon which a decision to grant contractual acceptance of
the mission system and support system will be determined.
Mission System – That element of the capability that directly performs the operational
function. Examples include platforms (e.g. ship, tank, or aircraft), distributed systems (e.g.
communications network), and discrete systems that integrate into other Mission Systems (e.g.
a radar upgrade for a platform). Major Support System Components (such as simulators,
Automatic Test Equipment (ATE) and Logistic Information Management Systems (LIMS))
could also be classified as Mission Systems if the level of management attention to be applied
to these components warranted this classification.
Validation – Proof through evaluation of objective evidence that the specified intended end
use of a product is accomplished in an intended environment.
2. Scope and Exclusions

2.1 Scope

The V&V Manual is intended as guidance to assist DMO SPO staff to implement the DMO V&V policy and to plan, approve and conduct V&V activities. The detailed explanation in this manual of the V&V activities that should be managed by a SPO follows the phases of the Materiel Life Cycle described in the Quality and Environmental Management System (QEMS), followed by the activities of the In-Service Phase (as defined in QEMS).
The intent of this manual is to provide an overview of DMO V&V processes and of how V&V evidence should be gathered to support informed acceptance decisions. This manual provides only limited detail on how to conduct these processes; it does, however, refer the reader to other references and courses for more detailed information on how to conduct V&V activities.
It should be stressed that during the Needs Phase, First Pass Phase and Second Pass Phase
the Integrated Project Team (IPT) that manages the project is led by the Capability
Development Group (CDG) with support provided by DMO and other groups in the ADO.
This support function includes, as a minimum, the review and endorsement of all
acquisition business case documents including the capability definition documents (CDD).
The CDG may not necessarily manage V&V in the same manner as described in this
document. In subsequent phases the project is managed by the DMO.
2.2 Exclusions
Issues concerning the roles and responsibilities of CDG with respect to T&E are covered in
the Capability Systems Life Cycle Management Manual (CSLCMM) and so they are not
discussed in this manual. The CSLCMM can be found at
http://defweb.cbr.defence.gov.au/home/documents/departmental/manuals/cslcm.htm.
The current portfolio level T&E policy is contained within DI (G) OPS 43-1, which is
maintained by DTRIALS. The V&V Manual is consistent with the approach to T&E
described in DI (G) OPS 43-1.
Where applicable, the V&V Manual also refers users to the ASDEFCON (Strategic Materiel)
Handbook for further information. The ASDEFCON contract templates and associated
handbooks can be found on the DRN at http://intranet.defence.gov.au/dmoweb/Sites/CPO/
3. V&V References
3.1 Current standards and policies relevant to V&V
Draft DI (M) X-XX – DMO Acceptance Process for New, Modified or Upgraded Capability

IEEE Std 1220-1998, IEEE Standard for Application and Management of the Systems Engineering Process.

3.2 Other Documents Referred to in this Manual

Crouch, V. 1995, The Test and Evaluation Process Used to Design and Develop Weapons Systems in the United States Department of Defense, minor thesis for a Master of Engineering (T&E), University of South Australia, Adelaide, SA, Australia.

Defense Acquisition University 2001, Test and Evaluation Guide, 4th edn, DAU Press, Fort Belvoir, VA, USA.

DMO 2003, System Review Guide v1.1, Department of Defence, Canberra, ACT, Australia.

Sessler, A., Cornwall, J., Dietz, B., Fetter, S., Frankel, S., Garwin, R., Gottfried, K., Gronlund, L., Lewis, G., Postol, T. & Wright, D. 2000, Countermeasures: A Technical Evaluation of the Operational Effectiveness of the Planned US National Missile Defense System, Union of Concerned Scientists, Cambridge, MA, USA.
4. V&V Principles
4.1 V&V Overview
Requirements listed in the Functional and Performance Specification (FPS) should each have
an appropriate verification method associated with them. Verification activities must be
traceable to the verification methods listed in the requirements in the FPS, System
Specification (SSPEC), and the contractor’s Verification Cross Reference Matrix (VCRM).
Validation activities must be traceable to operational scenarios that are consistent with those in
the Operational Concept Document (OCD) and the Concept of Operations (CONOPS).
The verification method listed in the FPS, SSPEC and VCRM would usually be one or more of the following:
a. analysis (including modelling and simulation);
b. demonstration;
c. inspection; and
d. test.
Validation should be conducted using scenarios that are consistent with the OCD. This would
normally be performed in an OT&E activity, although it may be conducted in conjunction
with analysis methods such as modelling and simulation.
Requirements, end products and enabling products all need to be validated. The performance
and functionality of systems and sub-systems are verified by demonstrating the system’s
compliance with the validated requirements. Systems and sub-systems are verified prior to
System Acceptance. Requirements should be validated prior to Second Pass approval. End
products (e.g. the mission system) are validated prior to System Acceptance, during the
transition into service and in the In-Service Phase. The enabling products (e.g. the Support
System) are usually validated through Supportability V&V, which is conducted during Acceptance V&V and, later, in the In-Service Phase.
4.2 Summary of DMO Responsibilities in Relation to V&V

The responsibilities of the SPO and the Contractor with respect to approval of the plans,
procedures and results of V&V activities are outlined in Figure 3. This diagram demonstrates
the principles of Clear Accountability in Design (CAID) that are fundamental to the approach
to V&V in the ASDEFCON contract templates:
[Figure 3: Clear Accountability in Design (CAID) diagram. The Commonwealth is responsible for approval at the system definition and verification boundary: operational capability definition and evaluation, the V&V Plan, the System Specification (SSPEC), the VCRM, and test plans and procedures. The contractor is responsible for approval below that boundary: subsystem definitions, subsystem test specifications and test documents.]
The V&V clauses in the ASDEFCON (SM) SOW use the terms ‘Acceptance Verification’
and ‘Acceptance Validation’ to highlight that the DMO’s role in the V&V program is almost
solely related to determining whether or not contractual deliverables are able to be accepted.
The DMO should not have a hands-on role in the contractor’s internal testing activities, unless
the results of those activities are expected to be utilised for the purposes of obtaining
Acceptance. Instead, the DMO should adopt a monitoring role over the contractor’s internal
testing activities for risk-management purposes. This approach helps to ensure that the
contractor retains responsibility for ensuring that the requirements of the contract are met as
determined by the Acquisition Baseline.
An IPT, led by the staff from CDG but including DMO representatives, produces a TCD to
define the T&E strategy required to ensure verification and validation of requirements and
operational needs stated in the FPS and OCD. The financial and managerial responsibilities of
the DMO, the T&E agencies, DSTO, CDG, the relevant Service(s) and the contractor with
respect to T&E conducted throughout the materiel life cycle are identified in the TCD.
The TCD is developed through negotiation between the members of the IPT before Second Pass approval.
For example, if flight testing is required for a project then this will involve using the resources
of various groups. The DMO has to be satisfied that the funds and resources required to
conduct the necessary T&E activities required for the various stages of acceptance will be
available when needed to ensure that the capability will be delivered on time and on budget.
The TCD must state these details.
The TCD has to be approved by the relevant one star officer (or equivalent) from the DMO
before the project can be recommended for second pass approval.
The IPT responsible for the development of the TCD should contract out its development to a CDD Panel qualified expert to ensure that the TCD is a high quality document.
After second pass approval the IPT, now led by the SPO, drafts a Test and Evaluation Master
Plan (TEMP) to more precisely define the T&E strategy stated in the TCD. The TEMP should
also address the other verification methods, which are listed in section 4.1 of this manual. A
guide on how to draft a TEMP is provided in Annex B of this manual. A hierarchy of V&V
related plans is provided in Figure 4.
The contractor must produce five V&V related deliverables that the SPO must review and
approve. These are the V&V Plan (V&VP), the VCRM, the Acceptance Test Plans (ATPs),
the Acceptance Test Procedures (ATProcs) and the Acceptance Test Reports (ATRs).
The SPO must monitor the V&V activities of the contractor prior to Acceptance Testing (i.e.
those V&V activities conducted at the element and sub-system level). The risk that problems
will occur during this stage is mitigated through review of the V&V Plan.
The SPO holds Test Readiness Reviews (TRRs) with the contractor before each acceptance
testing activity to ensure that the acceptance testing activity will provide reliable evidence to
support an acceptance decision.
The SPO must also plan and manage the conduct of V&V activities that may be outside the scope of the V&VP. These will largely consist of validation activities that will be defined in
the Validation Plan and Supportability Verification and Validation (SV&V) activities that will
be defined in the SV&V Plan. SV&V is used to assess the extent to which the mission system
design and the support system facilitates supportability (supportability includes such ILS
considerations as reliability, maintainability and availability).
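As an illustration of the supportability arithmetic just mentioned, the sketch below computes steady-state (inherent) availability from reliability (MTBF) and maintainability (MTTR); the figures are hypothetical:

```python
# Minimal sketch: inherent availability from MTBF and MTTR.
def inherent_availability(mtbf_hours: float, mttr_hours: float) -> float:
    """A_i = MTBF / (MTBF + MTTR)."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

# e.g. a radar with a 400 h MTBF and a 6 h MTTR:
print(f"{inherent_availability(400, 6):.3f}")  # 0.985
```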
Planning and managing the conduct of V&V activities by the SPO will consist of an analysis to determine whether the activity will provide the required data for V&V in a timely and objective manner, and whether it would constitute unnecessary duplication of activities already conducted. The V&V activity will then be planned and conducted, and the results analysed and an evaluation with respect to V&V made.
They will also include other V&V activities as required by the Technical Regulatory Authority
(TRA) in order to verify that all requirements needed to achieve design acceptance have been
met.
The SPO must then coordinate the data analysis from these activities. The results will then be
used to validate the system against the operational needs defined in the OCD and to determine
if there are any deficiencies in the system with respect to fitness for purpose.
For those activities that are directed to be conducted by a higher Defence committee (i.e. to assist a committee to make an informed consideration of a capital equipment project), the project director is responsible for disseminating the results of such T&E in a timely manner to the members of the originating committee.
Project directors are responsible for the sponsorship of T&E and possess the authority to act
upon the results and implement the recommendations arising from those results. Project
directors are also responsible for seeking authorisation for the necessary resources required -
through the appropriate division of CDG - for the conduct of project specific T&E by Defence
T&E agencies.
The SPO must then arrange to have the system accepted into service in consultation with the
relevant Service(s).
The SPO must manage ongoing operational testing to determine the operational suitability and operational effectiveness of the system in new scenarios, or of the system with modifications, after the system has been transitioned into service. This testing, along with supportability testing, will also support the project closure business process. A large part of this testing in the In-Service Phase may be managed by an ADO T&E agency; in this instance the SPO will act only as the sponsor for this testing.
At the end of the capability lifecycle the SPO will then manage the disposal of the system.
The responsibilities normally held by the SPO in relation to V&V are further described in
sections 6 to 9 of this document. The specific responsibilities that the SPO will have in
relation to test activities should be stated in the TCD.
Acceptance V&V is discussed further in DI (M) X-XX ‘Acceptance Process for New,
Upgraded and Modified Capability’.
4.3 T&E Organisations

Within Defence there are a number of T&E agencies that can provide support in relation to the
planning and conduct of T&E activities. These include the Directorate of Trials (DTRIALS),
the Land Engineering Agency (LEA), Royal Australian Navy Test, Evaluation and Analysis
Authority (RANTEAA), Joint Ammunition Logistics Organisation (JALO), Aerospace
Operational Support Group (AOSG) and the RAN Aircraft Maintenance and Flight Trials Unit
(AMAFTU). These agencies also provide services in analysis and interpretation of the results
of T&E activities, particularly in relation to OT&E.
The responsibilities of a T&E agency with respect to major T&E activities must be stated in
the TCD since there may be long lead times associated with T&E agency support to a T&E
activity.
The general roles of the T&E agencies are identified in DI (G) OPS 43-1.
The specific skills that these agencies have are described in Table 2.
In this table the types of materiel systems are divided into four major groups: mobility; firepower; surveillance and target acquisition; and communications and information systems. These categories of systems are described in the Army Technical Staff Officer's Course (ATSOC).
The services that can be provided by the various T&E agencies that can contribute to V&V of
these four types of systems are described in Table 2.
Information on test and evaluation ranges and facilities operated within Defence can be found
on the DRN at http://intranet.defence.gov.au/nod/131/7062_1.pdf
Independent V&V (IV&V) organisations and the DSTO can also provide V&V and T&E
services to the DMO.
In addition to this there are some T&E and V&V related working groups which are described
below.
The T&E Principals Forum is a group that liaises with industry and overseas T&E agencies. The Forum meets at least once annually to discuss and review Defence T&E and DMO V&V policy and to ensure that this policy remains consistent with current best practice. The Forum consists of the heads of the above-mentioned T&E agencies and the DMO Director of Systems Engineering.
The DMO Test and Evaluation Working Group (DMOT&EWG) is a group of T&E practitioners within Defence that exists in the form of an e-mail group and that discusses and reviews Defence T&E and DMO V&V policy.
5. V&V Planning
5.1 V&V Plans Needed for a Project and their Roles

The V&V plans and reports that must be developed for a project, and descriptions of these documents, are provided in Tables 3 and 4 below.
Table 3: Commonwealth produced V&V plans needed for a project and their roles
Commonwealth Plans / Reports: The V&V documents that the Commonwealth must produce for a major capital equipment acquisition project are listed below.

Test Concept Document (TCD): During the First and Second Pass Phases, the IPT led by staff from CDG produces a Test Concept Document (TCD) to define the project's T&E strategy. The CDD consists of the TCD, the OCD and the FPS. The IPT will consist of representatives from CDG, DMO, the Capability Manager and the relevant T&E agency. It should also include a representative from the relevant TRA. The template for the TCD, and the associated guidance on producing the TCD, is contained in the annex to Chapter 6 of the Capability Systems Life Cycle Management Manual (CSLCMM).

Test and Evaluation Master Plan (TEMP): After Second Pass approval, a Test and Evaluation Master Plan (TEMP) is developed by the IPT, under the leadership of the DMO SPO, to describe the DMO V&V strategy in detail. The TEMP forms part of the Project Management Plan and is updated throughout the Acquisition Phase as new test requirements arise.

Validation Plan: The Validation Plan covers the full scope of all mission system validation activities and the SV&V Plan covers the full scope of all SV&V activities. For smaller projects the Validation Plan and the SV&V Plan may be included as annexes to the TEMP.
Supportability V&V Plan (SV&V Plan): The SV&V Plan needs to be developed by the DMO SPO. There will be a need for the SPO, in consultation with the relevant T&E agency, to develop detailed test plans and procedures subordinate to the Validation Plan (e.g. detailed OT&E plans such as an OpEval Plan) and the SV&V Plan. Some of the detailed SV&V plans will be developed by the Contractor in accordance with the relevant ASDEFCON (SM) ILS DIDs (such as the Support Test and Equipment Plan DID). Other detailed SV&V plans will need to be developed according to the SV&V templates that are to be included in the SV&V Guidance to be produced as part of MASF release 4.

Other V&V related Plans / Reports: There will also be a need for additional test plans to be produced by T&E agencies as they conduct additional test activities to address the differences between the Capability Baseline and the Acquisition Baseline or perform OT&E activities. T&E agencies may also need to conduct testing that requires the use of Government Furnished Material (GFM), special test resources (e.g. electromagnetic compatibility (EMC) or environmental testing labs) or government staff (e.g. pilots).
Table 4: Contractor produced V&V plans and reports needed for a project and their roles

Contractor Plans / Reports: The V&V documents that the prime contractor for a strategic materiel project must produce according to the ASDEFCON (Strategic Materiel) template are listed below. The Data Item Descriptions (DIDs) for these can be found in the Asset Library associated with the ASDEFCON (SM) template at http://intranet.defence.gov.au/dmoweb/sites/cpo/default.asp?p=load.asp?page=10888. These DIDs should not normally be tailored to suit the project. Equivalent DIDs for Complex Materiel and Support projects were under development at the time of writing of this manual.

V&V Plan (V&VP): The V&VP defines the contractor's V&V strategy pertaining to the mission system and the support system. The SPO must ensure that the V&VP addresses as much as possible of the testing required in the TCD and TEMP.

Acceptance Test Plan (ATP): The ATPs, for each phase of Acceptance Testing, describe the test cases, test conditions, configuration of the system under test, and the equipment, documentation and personnel required for conducting the acceptance test. Each ATP should also list the requirements to be verified in each test case; at least one requirement should be verified in each test case. The SPO, or an ADO specialist (T&E or technical) as its representative, must formally witness the acceptance testing.

Acceptance Test Procedures (ATProcs): The ATProcs describe the detailed test steps and the acceptance criteria for each test case. The SPO must determine the acceptance criteria that the contractor must meet based on the requirements contained in the FPS. The SPO must ensure that the ATProcs contain test steps that are in an appropriate logical and temporal sequence, contain sufficient detail to be repeatable, and are suitable for determining achievement of the acceptance criteria.

Acceptance Test Report (ATR): The ATR records the results of the acceptance testing, including the PASS/FAIL status of each test case. The SPO approves an ATR only if it believes the ATR represents an accurate record of the test results witnessed during the Acceptance Testing.

Verification Cross Reference Matrix (VCRM): The VCRM is a table that specifies how the contractor will verify each requirement. The contractor completes the VCRM progressively as the design program progresses. The SPO must ensure that the VCRM states the appropriate verification method to be used against each requirement, and must ensure that it is updated and accurately reflects the current status of system verification. At SRR the VCRM is agreed with the contractor; the verification methods are then contractually binding and cannot change without a CCP.
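As a sketch of the VCRM just described, one row per requirement with a verification method and a progressively updated status could be represented as follows; all identifiers are hypothetical:

```python
# Sketch of the VCRM as a data structure, one row per requirement.
from dataclasses import dataclass

@dataclass
class VcrmRow:
    requirement_id: str     # traceable to the FPS/SSPEC
    method: str             # analysis, demonstration, inspection or test
    status: str             # "not started", "in progress", "verified"
    evidence_ref: str = ""  # e.g. an ATR reference once verified

vcrm = [
    VcrmRow("FPS-0031", "test", "verified", "ATR-017"),
    VcrmRow("FPS-0032", "analysis", "in progress"),
    VcrmRow("FPS-0033", "inspection", "not started"),
]

# The check implied by the text: before FCA, every requirement is verified.
outstanding = [row.requirement_id for row in vcrm if row.status != "verified"]
print("ready for FCA" if not outstanding else f"outstanding: {outstanding}")
```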
Test plan contents: As a minimum a test plan should include the following:
- a description of the system under test (SUT) (including the system software and
hardware configurations being tested);
- a description of interdependencies between the SUT and other systems;
- a description of all stakeholder responsibilities;
- a description of the purpose of the testing and its relationship to other V&V processes;
- references to other relevant documentation (e.g. OCD, FPS, VCRM etc.);
- a definition of project specific terms;
- reference to the requirements to be verified, or the operational scenarios that the system will be validated against, ensuring that all tests can be traced through to the FPS or OCD (a traceability sketch follows this list);
- a brief description of the test strategy including a brief description of the testing
conducted to date;
- the test schedule;
- test sequence;
- test constraints and assumptions;
- test limitations (in terms of the extent to which the test will be able to accurately verify
compliance with requirements);
- risks to completing the test (e.g. resource availability / staff availability) and a
description of how those risks will be mitigated;
- a description of the key functionality to be tested;
- a description of the major outstanding defects;
- a list of the test cases, with prioritisation; the prioritisation of test cases will be based partly on the areas perceived to be of highest risk, which include functions in which defects have been found to date and critical functions or critical performance requirements;
- a description of major external and internal interfaces;
- a description of how test results will be recorded and analysed;
- a description of what verification methods will be used for the various requirements
(e.g. demonstration for observable results, inspection for examining code or examining
a system for compliance with a design requirement or standard or analysis for
comparing baseline test results with new test results);
- a description of measurement instrumentation, how the instrumentation will be
configured and how this instrumentation will impact the accuracy of the test results;
and
- an identification of all test tools and test data to be used (this may include references to
files containing the test data).
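The traceability rule flagged in the list above can be checked mechanically. The sketch below is illustrative, with hypothetical identifiers:

```python
# Sketch: every test case must trace to an FPS requirement (verification)
# or an OCD scenario (validation). All identifiers are hypothetical.
fps_requirements = {"FPS-0031", "FPS-0032", "FPS-0033"}
ocd_scenarios = {"OCD-SCN-04", "OCD-SCN-07"}

test_cases = {
    "TC-001": {"FPS-0031"},
    "TC-002": {"FPS-0032", "FPS-0033"},
    "TC-003": {"OCD-SCN-04"},
    "TC-004": set(),  # untraceable: should be flagged
}

untraceable = [tc for tc, refs in test_cases.items()
               if not refs & (fps_requirements | ocd_scenarios)]
print(f"test cases with no FPS/OCD trace: {untraceable}")
```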
Test procedure contents: As a minimum a test procedure should include the following:
- a description of the type and level of the testing (i.e. system level or component level
etc.);
- a description of the objectives for each test case (e.g. verification of a specific
requirement);
- a description of the acceptance criteria;
- a description of the non-system related uncontrollable factors that will influence the
outcome of the test (e.g. environmental factors) and how the impact of these factors
will be minimised (e.g. through randomisation or blocking);
- prioritisation of the tests;
- the test sequence;
- the test steps expressed in sufficient detail such that they are repeatable but in no more
detail than is absolutely necessary for the test steps to be repeatable;
- a column for indicating whether the test has passed or failed, or whether the requirement is only partially verified (a sketch of this recording follows this list);
- reference to the requirement being addressed in each test case or alternatively a
description of the test objective for each test case;
- inputs for each test step (this may include reference to files containing test data);
- the expected result for each test step;
- test preconditions; and
- a signature block for the tester and the witness to sign off each test case.
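The pass/fail recording flagged in the list above might be captured as follows; the step data are hypothetical, and a test case passes only if every step passes:

```python
# Sketch: each test step has an expected result; the test case verdict is
# derived from the per-step results. Step data is hypothetical.
steps = [
    # (step description, expected result, actual result)
    ("apply power",           "self-test completes", "self-test completes"),
    ("enter test waypoint",   "waypoint accepted",   "waypoint accepted"),
    ("select emergency mode", "alarm within 2 s",    "alarm after 5 s"),
]

results = ["PASS" if actual == expected else "FAIL"
           for _, expected, actual in steps]
verdict = "PASS" if all(r == "PASS" for r in results) else "FAIL"
for (desc, _, _), r in zip(steps, results):
    print(f"{r}  {desc}")
print(f"test case verdict: {verdict}")  # FAIL: one step failed
```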
Test report contents: As a minimum a test report should include all of the content of the test procedure (since it is often an annotated version of the test procedure) as well as:
A summary of the V&V and T&E plans hierarchy is provided in Figure 4. This is a list of the
main plans/documents that are required for Strategic Materiel projects. In some projects there
may be a need for additional plans such as Modelling and Simulation Plans or Software V&V
Plans. Some of these additional plans may fall under the scope of detailed OT&E plans for
instance. In the case of minor projects it may be possible to merge some of these plans into
one plan. For instance, the Validation Plan and the Supportability V&V Plan can be included
as annexes to the TEMP in the case of Complex Materiel projects.
[Figure 4: Hierarchy of V&V and T&E plans. The Test and Evaluation Master Plan (TEMP) sits at the top. Subordinate to it are the Supportability V&V Plan (SV&VP), the Verification and Validation Plan (V&VP) and the Validation Plan. Below these sit, respectively, the detailed supportability plans, the Acceptance Test Plans (ATPs) and the detailed OT&E plans; then the supportability procedures, the Acceptance Test Procedures (ATProcs) and the OT&E procedures; and finally the supportability test reports, the Acceptance Test Reports (ATRs) and the OT&E test reports.]
6. Needs Phase V&V

6.1 Introduction

Chapters 6, 7, 8 and 9 of this manual outline what is expected of Defence and the Contractor with respect to V&V throughout the various phases of a project. The phases and Unique Identifiers (UIs) referred to are those described in QEMS (ref. http://qems.dcb.defence.gov.au). Consult the QEMS references and the standards referred to in this section for further information.
An acquisition program is managed by CDG during the Needs Phase and the Requirements Phase, and by the DMO in the remaining phases. The DMO must contribute to the development of, and must review, the CDDs in consultation with the other members of the IPT during the Requirements Phase. The IPT that develops the CDD is led by CDG until Second Pass approval, at which point leadership of the IPT is transferred to the DMO. The purpose
of chapters 6 and 7 is to outline the processes that are to be conducted during the Needs Phase
and the Requirements Phase and how the DMO contributes to this process. The CSLCMM
and the CDD Guide should also be consulted for information on capability development
processes throughout the Needs Phase and Requirements Phase.
6.2 V&V Tasks in the Needs Phase

Below is a list of V&V related activities that need to be conducted in the Needs Phase.
- Review results of tests of the systems to be replaced and review the performance of these
systems with respect to the missions they need to accomplish and determine the required
improvements in capability.
- Review the threats likely to be encountered by the system being acquired at the time that
the system is expected to be introduced into service.
- Conduct the initial stages of an Operational Assessment (OA), usually in consultation with DSTO (see DI (G) ADMIN 06-1 for details on tasking DSTO). This will usually involve modelling and simulation. In the first instance, business process modelling using a computer aided systems engineering (CASE) tool such as CORE is required to ensure that the proposed operational concept is valid.
- Conduct analytic studies and military experimentation to explore the costs and benefits of different force structure options in performing strategic tasks in the context of the Australian Illustrative Planning Scenarios (AIPS) (QEMS reference: UI 2177).
- Identify potential technical risks to achieving the desired capability that may present
throughout the project and identify ways to mitigate these risks through V&V.
7. Requirements Phase V&V

7.1 V&V in the First Pass Phase

In the First Pass Phase a solution-independent set of Capability Definition Documents (CDDs) is produced by the IPT. This will usually consist of sections 1-4 of the OCD.

Requirements definition during the First Pass Phase culminates in production of CDD documentation, of which sections 1 to 4 of the OCD can be viewed as the major Requirements
Baseline resulting from the First Pass process. It should be noted that other ‘intermediate’
Requirements Baselines may be produced progressively as more detail is developed prior to
completing the OCD. Requirements validation is a critical activity during the First Pass Phase
which confirms that each Requirements Baseline developed, including the OCD, is an
accurate and complete representation of the capability need and meets an established set of
quality and content criteria. Requirements validation activities, and requirements quality
characteristics, are described in more detail in Annex C and in the CDD Guide. In summary
the following Validation activities must be performed in the context of Requirements
development.
A Preliminary Capability Options Document (PCOD) is developed in the First Pass Phase to
determine the feasibility of the various capability options that have been proposed. (QEMS
reference: UI 2193).
More detailed modelling and simulation can be performed in the First Pass Phase than is
conducted in the Needs Phase. This initially involves determining what types of models
would be appropriate to conduct an Operational Assessment (OA) to evaluate the overall
likely system effectiveness, implementing the models and validating the models. These
models may include physical models (i.e. mock-ups or prototypes), simulations and
technology demonstrators. These models can then be used to evaluate the capability options
that were identified in the PCOD to determine which of the options best addresses the
operational need. Any models or simulations proposed to be developed should follow the
guidance of the Defence Simulation Proposal Guide. The Australian Defence Simulation
Office (ADSO) recommends the use of the US DoD Modeling and Simulation Office
‘Verification, Validation and Accreditation (VV&A) Recommended Practices Guide’ for
VV&A of simulations.
In the First Pass Phase a Risk Management Plan is developed as part of the Project
Management Plan (PMP). This should include a section on how V&V, including IV&V, will be used to mitigate risks throughout the project.
In the First Pass Phase it may be necessary to conduct a Project Definition Study (PDS) which,
amongst other aims, will identify legal, doctrinal, environmental, electromagnetic spectrum
and TRA constraints (QEMS reference: UI 2194). The PDS should also contain the results of
OA, trade-off studies used to compare the capability options under consideration and the
functional and performance priorities with respect to the mission and support systems, which
would be determined from the results of the OA.
7.2 V&V in the Second Pass Phase

The solution dependent CDD (i.e. the OCD, FPS and TCD), addressing both the mission
system and support system, is to be completed and endorsed in the second pass phase (QEMS
reference: UI 2113).
As part of this, the IPT produces a TCD to describe the V&V strategy to be used for the
system being acquired throughout the Defence Capability Life Cycle. The TCD is produced
according to the process described in the CDD Guide and must comply with the format
provided in that document.
V&V in the second pass phase involves the validation of requirements in the FPS against the
OCD to ensure that compliance with the requirements is likely to address all of the operational
needs.
Requirements must be drafted so that they are unambiguous and verifiable, and so that they address the complete set of operational needs and, in turn, the complete set of Critical Operational Issues (COIs). A verification method must be listed for each requirement.
Requirements validation is often conducted in consultation with IV&V staff at this point since
an experienced systems engineer must review the FPS to ensure that the requirements cover
the full scope of the user needs. Requirements validation is discussed further in Annex C.
The Second Pass process develops the capability requirements articulated in the OCD into
more detailed representations of the requirement through a more developed OCD, FPS and
TCD. This can be achieved, in part, through application of the SE activities of functional
analysis and synthesis, which enable derivation of greater requirements and system detail.
Verification must be applied to the resulting functional and physical architectures, to ensure
complete traceability between the underlying Requirements Baseline and functional and
physical architectures, and a complete absence of voids and conflicts between these items.
Further detail on applying verification is provided in Annex C. The OCD, FPS and TCD can
also be considered the primary Requirements Baselines resulting from the Second Pass Phase
given that these will provide the set of requirements on which the Solicitation Phase will be
based. Accordingly, Requirements Baselines developed through the Second Pass must also be
subjected to Requirements Validation as described in Section 6.3 and Annex C to ensure that
they provide a sound foundation for subsequent work.
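The check for voids between a Requirements Baseline and a functional architecture can be sketched as a simple bidirectional traceability test; all identifiers below are hypothetical:

```python
# Sketch of the void check: every requirement must allocate to at least one
# function, and every function must trace back to at least one requirement.
allocations = {                  # requirement -> allocated functions
    "FPS-0031": {"F1.1", "F1.2"},
    "FPS-0032": {"F2.0"},
    "FPS-0033": set(),           # void: requirement with no function
}
functions = {"F1.1", "F1.2", "F2.0", "F3.4"}  # F3.4 has no parent requirement

unallocated_reqs = [r for r, fs in allocations.items() if not fs]
orphan_functions = functions - set().union(*allocations.values())
print(f"requirements with no allocated function: {unallocated_reqs}")
print(f"functions with no parent requirement:    {sorted(orphan_functions)}")
```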
8. Acquisition Phase V&V

8.1 V&V in the Solicitation Phase

On approval of the Capability Baseline at Second Pass, the leadership of the IPT transfers from CDG to the DMO for the further development of the required V&V products. On the establishment of the SPO, the SPO must commence development of the TEMP (QEMS
reference: UI 2162). The person responsible for development and maintenance of the TEMP
is identified in the Acquisition PMP. The requirements derived from the TEMP and TCD are
then flowed into the FPS and sent out with the Request For Tender (RFT).
During the Solicitation Phase, the prospective Contractor(s) is provided with an OCD
(identifying scenarios detailing ‘fitness for purpose’ requirements), the FPS and the TCD. The
Contractor will then produce a V&V Plan to describe the full scope of the V&V activities that
they intend to conduct and how they intend to satisfy the V&V of the requirements detailed in
the FPS. The Contractor(s) must also produce the System Safety Program Plan (SSPP) in
response to the RFT. These must be reviewed by the project office when evaluating the tender(s), using the DMO Checklists that can be found on the ASDEFCON web site.
During the Solicitation Phase the tenderer’s processes may also be evaluated by suitably
qualified IV&V staff to ensure that the processes will identify and assess defects efficiently
and that the tenderer(s) have a suitable System Problem Reporting process that ensures that
defects are prioritised and remedied according to priority. A CMMI evaluation may also be
useful in the solicitation phase. The project office has a responsibility to ensure that this
prioritisation of defects is in line with the users’ functional and performance priorities.
At the completion of the Solicitation phase (i.e. after contract negotiations) the CDD is further
refined and the TEMP further developed by the project office to reflect any changes that may
arise during the Contract Negotiations. This establishes the Acquisition Baseline that the
contractor will be expected to deliver against for Final Acceptance.
Some of the details in relation to the responsibilities of the project office with respect to
managing the Contractor V&V activities are not included in this manual. This is because
some of these details can already be found in QEMS and in the ASDEFCON (Strategic
Materiel) Handbook; Part 3 – Draft SOW & Annexes – guidance associated with clause 7; and
in section 13 of the Supplementary Information – Philosophy Behind the Draft SOW &
Annexes. The ASDEFCON (Strategic Materiel) Handbook can be found at
http://intranet.defence.gov.au/dmoweb/sites/cpo/default.asp?p=load.asp?page=8176
The Contractor must develop the system and verify that the system meets the requirements
stated in the System Specification (SSPEC) (which is traceable to the FPS) by ensuring that
the system meets the acceptance criteria in the acceptance testing. These specifications
establish the two functional baselines of:
a. Mission System Functional Baseline; and
b. Support System Functional Baseline.
Prior to acceptance V&V, the project office must monitor the contractor’s internal V&V
processes at the sub-system and configuration item level. This is done to ensure that the
system development is proceeding according to schedule, that system problem reports are
being raised, prioritised and resolved in a timely and efficient manner and that project risk is
being systematically and progressively reduced. Importantly, the project office must ensure
that the problem (i.e. defect) reports are prioritised according to the functional and
performance priorities of the users and that the resolution of the defects is proceeding
sufficiently rapidly to ensure the likely achievement of acceptance criteria in the acceptance
testing.
In managing the contractor’s V&V activities the project office must review and approve the
contractor’s V&V Plan, VCRM, ATP, ATProcs and ATR. The project office must check that
these documents comply with the requirements of the associated Data Item Descriptions
(DIDs). The project office must also review and approve the Contractor’s SSPP to ensure that
the V&V methods proposed for assuring compliance with System Safety Requirements
(SSRs) are acceptable. The Safety Case Report will also be developed at this point in the
Materiel Life Cycle. Prior to system acceptance the project office must monitor the
verification of compliance with SSRs raised during the development of the Safety Case.
The DIDs are listed in the ASDEFCON (SM) Asset Library. For strategic materiel projects
the DIDs and the DMO Checklists for system reviews can be found in the ASDEFCON (SM)
Asset Library on the ASDEFCON (SM) web site at:
http://intranet.defence.gov.au/dmoweb/sites/cpo/default.asp?p=load.asp?page=10888
The SPO must monitor the developmental V&V activities of the contractor to ensure that the
development of the system is likely to be within quality, budget and schedule constraints.
This will involve tracking the resolution of problem reports (PR) (i.e. comparing the number
of PRs raised to PRs closed) to ensure that all high priority (i.e. high risk) defects are resolved
in a timely manner. The project office must also ensure that sufficient element and sub-system
level testing is conducted to mitigate the risk of high priority defects being left unresolved
after Acceptance Testing. The SPO manages this risk by evaluating a contractor’s V&V
processes through review of the contractor’s V&V Plan during the Solicitation Phase.
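Tracking the comparison of PRs raised to PRs closed lends itself to simple tooling. The sketch
below is illustrative only; the field names, priority scheme and cutoff are assumptions rather
than a mandated data model.

    # A minimal sketch of tracking problem report (PR) burn-down so that
    # unresolved high priority (i.e. high risk) defects are visible before
    # Acceptance Testing. Field names and thresholds are illustrative only.
    from dataclasses import dataclass

    @dataclass
    class ProblemReport:
        pr_id: str
        priority: int      # 1 = highest priority (highest risk)
        closed: bool

    def open_high_priority(prs: list[ProblemReport], cutoff: int = 2) -> list[str]:
        """Return IDs of unresolved PRs at or above the priority cutoff."""
        return [p.pr_id for p in prs if not p.closed and p.priority <= cutoff]

    prs = [ProblemReport("PR-001", 1, False),
           ProblemReport("PR-002", 3, True),
           ProblemReport("PR-003", 2, False)]
    print(f"{sum(p.closed for p in prs)} of {len(prs)} PRs closed; "
          f"open high-priority: {open_high_priority(prs)}")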
The contractor must establish, through a completed VCRM, that the system requirements have
all been verified such that the system meets the acceptance criteria with respect to the Mission
System Functional baseline and the Support System Functional baseline.
The project office should have visibility of the database containing the up-to-date
VCRM. The final VCRM is presented at the Functional Configuration Audit
(FCA) and the project office must review and approve the final VCRM within
30 days of the FCA. The draft and final VCRM are to be reviewed at each
relevant system review.
c. The ATP is delivered to the project office 30 days prior to the Detailed Design
Review (DDR) and must be ready for approval by the project office at the
DDR. The ATP is to be updated at each relevant system review.
d. The ATProcs are delivered to the project office 30 days prior to the TRR and
must be ready for approval by the project office at the TRR. The ATProcs are
updated at each relevant system review.
These timings may be tailored to the specific needs of the project.
The project office must demonstrate that the system will meet the operational needs stated in
the OCD operational scenarios (against the projected threats if feasible). The scenarios to be
used for this purpose are determined by the IPT when the OCD is developed. Demonstration
that the system will meet operational needs is done through either the demonstrations listed in
the AV&V (minimum requirement) or an OpEval activity (maximum requirement). This will
be determined by the IPT at the time of accepting the Contractor’s V&V Plan. The objectives
of the OpEval should be described in detail in the OpEval Test Plan, at a higher level in the
Validation Plan, at a higher level again in the V&VP (if the Contractor is required to conduct
Mission System Validation according to the contract) and also in the TEMP.
The project office must ensure that the system achieves Design Acceptance and System
Acceptance during the contract management phase of the Materiel Life Cycle through an
effective acceptance V&V Program.
Under the ASDEFCON (Strategic Materiel) SOW template, there are two fundamental groups
of activities that need to be performed before the system(s) can be formally accepted. These
are:
a. Verification of the Mission System and Support System against their respective
functional baselines (Acceptance Verification); and
b. Validation of the Mission System and Support System against the operational
needs defined in the OCD (Acceptance Validation).
The V&V clauses in the ASDEFCON (SM) SOW use the terms ‘Acceptance Verification’ and
‘Acceptance Validation’ to highlight that the Australian Government’s role in the V&V
program is almost solely related to determining whether or not contractual deliverables are
able to be Accepted.
The Australian Government should not have a hands-on role in the Contractor’s internal
testing activities, unless the results of those activities are expected to be used for the purposes
of obtaining Acceptance. Instead, the Australian Government should adopt a monitoring role
over the Contractor’s internal testing activities for risk-management purposes. This approach
helps to ensure that the Contractor retains responsibility for ensuring that the requirements of
the Contract are met as determined by the Acquisition Baseline.
AV&V should be conducted on equipment that is of the same hardware and software
configuration as that which will be offered for System Acceptance. Refer to DI (M)
X-XX ‘DMO Acceptance Process for New, Modified or Upgraded Capability’ for more
information on AV&V.
The V&V clauses have been linked to the Acceptance provisions to clarify the Australian
Government’s role under the Contract, which was unclear under the previous terminology of
‘Developmental T&E (DT&E)’, ‘Operational T&E (OT&E)’ and ‘Acceptance T&E (AT&E)’,
because these terms are not independent and can overlap. The monitoring, rather than
hands-on, role described above is also consistent with the CAID principles; ensuring that the
Contractor retains responsibility for meeting the requirements of the Contract is one of the
primary aims behind the adoption of the CAID philosophy.
Acceptance Test and Evaluation (AT&E) is but one process by which Acceptance Verification
and Acceptance Validation may be conducted. Other processes may include design reviews,
audits, and analysis of modelling and simulation results.
Acceptance Verification
Verification of the Support System involves verifying that each of the Support System
Constituent Capabilities satisfies its relevant specification and that the Support System overall
satisfies the requirements defined in the Support System Functional Baseline. Similarly,
verification of the Support System Components involves verifying that each of these
components satisfies its specification.
Acceptance Validation
The Contractor is to demonstrate to the Australian Government that the Mission System will
satisfy the Mission System Functional Baseline when operated in accordance with the OCD.
In other words, this clause is assessing fitness for purpose in the actual operating environment.
Operational Test and Evaluation (OT&E) is a subset of Mission System Validation. However,
OT&E under ASDEFCON (Strategic Materiel) is limited to that required for Acceptance
purposes. For example, the Mission System elements of the RAN’s processes for Operational
Release could come under this umbrella if desired by the Project Authority. From a
supportability perspective, this aspect of V&V would validate that the supportability
characteristics of the mission system meet the specified requirements. Supportability factors
for the mission system would normally be defined in the FPS for the mission system and, as
part of the validation of the mission system, these supportability factors/characteristics need to
be validated.
The contractor must demonstrate to the Australian Government that the support system will
satisfy the Support System Functional Baseline when operated in accordance with the OCD.
Some V&V is to be performed to provide confidence that:
a. the associated support, including training, documentation and facilities, has been
developed to specification and is operational; and
b. the processes and procedures are mature enough to support the mission
system once it is delivered into the operating field.
This may be achieved through the demonstrations identified by ASDEFCON (listed below)
or through an Operational Evaluation (OPEVAL).
The IPT, through the development of the TEMP, will be required to determine and approve
the level of Support System Validation required (ranging from an ASDEFCON (Strategic
Materiel) demonstration to a full RAN-style OPEVAL). The Support System Validation
demonstrations are listed below.
c. Engineering Support Effectiveness Demonstrations
(1) The purpose of this clause is for the Contractor to validate to the Australian
Government, in the Defence environment, the effectiveness of the
Engineering Support Constituent Capability, developed as part of the
Support System, which has been documented in the Approved Support
System Specification (SSSPEC).
d. Maintenance Support Effectiveness Demonstrations
(1) The purpose of this clause is for the contractor to demonstrate to the Australian
Government the effectiveness of the maintenance support constituent
capability, developed as part of the support system, which has been
documented in the approved SSSPEC.
e. Supply Support Effectiveness Demonstrations
(1) The purpose of this clause is for the contractor to demonstrate to the Australian
Government the effectiveness of the supply support constituent capability,
developed as part of the support system, which has been documented in the
approved SSSPEC.
f. Training Support Effectiveness Demonstrations
(1) The purpose of this clause is for the contractor to demonstrate to the Australian
Government the effectiveness of the training support constituent capability,
developed as part of the support system, which has been documented in the
approved SSSPEC.
g. Support System Endurance Demonstrations
(1) The purpose of this clause is for the contractor to demonstrate to the Australian
Government that the complete integrated support system performs
effectively over an extended period. The extended period means that the
demonstration may extend beyond other acquisition AV&V activities and
the preferred timing for the final payment under the contract. Hence, to
ensure the contractor has provided an effective support system, the
endurance demonstration may be linked to a performance guarantee along
with other extended demonstrations such as for reliability.
(2) The rationale for this activity being conducted over an extended period is that a
number of measures of the effectiveness of the support system are statistical
in nature, and a reasonable amount of time is needed for the results to be
valid. Additionally, implementation problems during rollout and a partial
build-up of a fleet during transition may falsely indicate a support situation
that differs from what will actually be achieved in the long-term in-service
support environment. For these reasons, the start of endurance testing may
also be delayed until late in the transition period.
Refer to ASDEFCON (SM) SOW and DI(G) LOG 03-6 Annex A for further detail.
Because of the complexity of the systems that are being addressed and the significant time and
effort required to conduct a comprehensive V&V program, the likelihood of completing a
V&V program without need for rework is low. It would not be cost-effective for the
Australian Government to demand a complete retest for any rework. The alternative is, where
a design or configuration change is made during the V&V program, to conduct regression
testing based on the knowledge of the implementation and the risk to the Australian
Government.
It is important that all test environments and equipment used during the V&V phases are
controlled and validated to confirm that they will meet their objectives as used in the program.
This includes the more straightforward elements, such as the calibration of test equipment, as
well as the need to validate and accredit more elaborate models used in the V&V program
(e.g. underwater propagation models used as part of the V&V program for a sonar system). In
the latter case, if the models were well established and known to be valid, then they might be
validated by reference to subject matter experts. If however the models are relatively new or
developed specifically for the project then further scrutiny and potential supplementation by
real world trials (i.e. as part of the Acceptance Validation) may be required. It is important
that an appropriately qualified accreditation agent accredit the model.
Regression Testing
If changes are made to the Mission or Support System configuration after starting AV&V, the
contractor must repeat those activities where results are shown by regression analysis to have
been potentially affected by the configuration changes.
The V&V requirements for achieving design acceptance consist of ensuring that the mission
system and support system meet the relevant Technical Regulatory Authority’s (TRA)
requirements. The high level technical regulations applicable to ADF Materiel for the
purposes of achieving Design Acceptance are defined in DI (G) LOG 8-15 ‘Regulation of
Technical Integrity of ADF Materiel’.
DTR-A, DGTA and DGNAVSYS are the TRAs for the Army, RAAF and RAN respectively,
and should be consulted by the IPT to ensure that the TRA requirements for the V&V
evidence required for Design Acceptance are identified. The Ordnance Safety Group (OSG),
formerly known as the Australian Ordnance Council, is the TRA that should be contacted
regarding V&V of the system safety requirements required for design acceptance of ordnance
systems.
Design acceptance requirements, specified by a TRA, usually state that a system must be fit
for purpose in order to achieve design acceptance. This implies that some validation activities
and safety verification activities would need to take place prior to design acceptance. The
Design Acceptance Authority assesses a system for its suitability for acceptance into service
and advises the project director of the status of a system with respect to its suitability for
acceptance into service. Please note that acceptance into service is often referred to in
Defence documents as operational acceptance or operational release.
The Defence Instructions outlining the technical regulation policies of the DTR-A, DGTA,
DGNAVSYS and OSG are:
DI (A) LOG 12-1
DI (G) OPS 02-2
DI (N) LOG 47-3
DI (G) LOG 07-1
The Service-specific TRA requirements for the V&V evidence required to achieve Design
Acceptance are further discussed in the following:
ARMY – TRAMM (http://defweb.cbr.defence.gov.au/home/documents/army/mmanuals.htm)
RAAF - TAMM (http://wilap006.sor.defence.gov.au/aaplib/7001_053(AM1)/prelim.pdf)
RAN - ABR 6492 (http://defweb.cbr.defence.gov.au/home/documents/navy/mabr.htm)
In order to achieve System Acceptance all contractual requirements need to be met. System
Acceptance should occur at approximately the same time as Initial Operational Capability
(IOC), in the RAAF, or Initial Operational Release (IOR), in the RAN.
Depending on the level of AV&V the Contractor has undertaken in the lower level testing
activities, the Contractor may not be required to conduct any further Mission System
Validation (apart from the ASDEFCON demonstrations as a minimum) since this section of
the contract is entirely tailorable in the ASDEFCON (SM) contract template.
The project office must ensure that the system achieves Acceptance Into Service during this
phase of the Materiel Life Cycle.
The transition plan, describing the V&V activities that need to be conducted in order to
achieve Acceptance Into Service, should be included in the TEMP.
8.3.1 V&V required for Operational Acceptance / Release (Acceptance Into Service)
The V&V activities required for operational acceptance/release include testing of product
improvements, operational characteristic modifications and changes implemented to reduce
system life cycle costs.
The system should also be validated against the OCD at this point by conducting further Initial
Operational Test and Evaluation (IOT&E) activities, to demonstrate the operational
effectiveness and operational suitability of the system against the measures described in the
TEMP with typical users in realistic operational scenarios. This testing is conducted in
consultation with the Capability Manager and the relevant T&E agency. The extent of
responsibility that the Contractor has for the mission and support system validation at this
point is determined by the content of the relevant clause in the contract.
Acceptance Into Service (AIS) is granted by the Capability Manager on the basis of
recommendations from the relevant T&E agency and the DMO. AIS can be granted when the
system has been validated against the OCD, all testing identified in the TCD has been completed,
when the TRA requirements have been satisfied and when the Capability Manager is satisfied
that the Functional Inputs to Capability are in place.
Before the OT&E required for Acceptance Into Service can be conducted both System
Acceptance and Final Acceptance (as defined in clause 6.6 of the ASDEFCON (SM) SOW)
must be achieved.
The output of the acceptance into service process is the Transition Certificate.
Further information on the process of obtaining V&V evidence to achieve Operational
Acceptance / Release (Acceptance Into Service), including a Functional Flow Block Diagram
(FFBD) describing the process, can be found in DI (M) X-XX (draft) ‘DMO Acceptance
Process for New, Modified or Upgraded Capability’.
Table 5 provides a summary of the V&V evidence required to achieve the various types of
acceptance and the issues that need to be resolved at each stage.
The relevant project governance board will review the project closure report and the DMO
project manager will produce the project closure certificate.
During the In-Service Phase the SPO manages V&V activities that are outside the scope of the
V&V Plan. These activities include various OT&E and Supportability V&V (SV&V)
activities that should be described in the TEMP, the SV&V Plan and the Validation Plan. The
through life support contractor will also produce a V&V Plan relating to the support contract
in accordance with the ASDEFCON (Support) contract template.
V&V in the In-Service Phase is used to support operational acceptance, business process
closure and acceptance of system upgrades and modifications.
As with the Acquisition Phase, the SPO will review the contractor’s V&V Plan, test plans,
test procedures and test reports during the In-Service Phase. However, during the In-Service
Phase there will be a greater degree of involvement of Commonwealth employees, because
most of the V&V activities will be conducted on operational equipment, often with typical
users of the system.
During the In-Service Phase the SPO, T&E agencies and support agencies manage the V&V
of system modifications and upgrades, conduct further SV&V activities and conduct Follow-
On OT&E (FOT&E) of the system to evaluate the performance of the system in new
operational scenarios and to refine tactics and doctrine. In-Service Phase V&V also includes
V&V of support systems, which includes systems set up for maintenance and training.
FOT&E is conducted during the In-Service Phase to verify the correction of deficiencies with
respect to operational effectiveness and operational suitability. FOT&E is also performed to
validate the system against additional scenarios, environments and threats, derived from those
listed in the OCD, against which the system was not validated in the OpEval.
The key emphasis at this stage of the Materiel Life Cycle is to ensure that all in-service
support arrangements are complete and that the in-service support facilitates operational
suitability of the system.
During the In-Service Phase V&V is conducted against the established and accepted
Acceptance Into Service Baseline.
During mission system availability planning a range of OT&E activities will be conducted
to provide the project office with assurance that the operational availability requirements
will be met. This includes gathering data from in-service use to determine the operational
availability of the system. Operational availability is often defined as the proportion of time
in which a materiel system is available for use in military operations. It depends upon the
time taken to perform various maintenance tasks, the frequency with which those tasks must
be performed, and the reliability of the system. Therefore, maintenance verification tasks
must be performed, and the reliability of the component sub-systems and of the complete
system must be evaluated, in order to assess operational availability.
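As a working formula (a common approximation, not a definition mandated by this manual),
operational availability (Ao) can be related to the parameters discussed below as:

    Ao = MTBF / (MTBF + MTTR + MALDT)

where MTBF is the Mean Time Between Failures, MTTR the Mean Time To Repair and
MALDT the Mean Administrative and Logistics Delay Time. For example, a system with an
MTBF of 500 hours, an MTTR of 5 hours and a MALDT of 20 hours would achieve
Ao = 500 / 525, or approximately 0.95.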
Verification of maintenance tasks, to ensure that the support system enables the RAM
requirements to be met, should be conducted in sustaining the mission system. This would
involve typical maintenance staff performing a range of maintenance tasks, to ensure that the
tasks can be performed in a timely and accurate manner in compliance with the through life
support requirements.
As with other V&V activities, supportability V&V requires identifying the appropriate
parameters that need to be measured, gathering the data, analysing the data, identifying
corrective action (if required) and implementing corrective action (if required). This process
should be outlined in the project’s Supportability V&V Plan. Examples of appropriate
parameters to be measured at this stage of the Materiel Life Cycle could include the Mean
Time To Repair, Mean Time Between Failures and Mean Administration and Logistics Delay
Times.
In sustaining the mission system a Failure Reporting, Analysis and Corrective Action System
(FRACAS) should be implemented to track and report on defects recorded by users and the
rectification of these defects. Further information on FRACAS can be found in
MIL-HDBK-2155.
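By way of illustration only, the sketch below shows how availability-related parameters might
be derived from FRACAS-style records; the field names and figures are assumptions rather
than a prescribed data model.

    # A minimal sketch of deriving MTBF, MTTR and MALDT from FRACAS-style
    # failure records and computing operational availability (Ao).
    # All field names and figures are illustrative only.
    from dataclasses import dataclass

    @dataclass
    class FailureRecord:
        operating_hours_since_last_failure: float  # contributes to MTBF
        repair_hours: float                        # contributes to MTTR
        logistics_delay_hours: float               # contributes to MALDT

    def operational_availability(records: list[FailureRecord]) -> float:
        n = len(records)
        mtbf = sum(r.operating_hours_since_last_failure for r in records) / n
        mttr = sum(r.repair_hours for r in records) / n
        maldt = sum(r.logistics_delay_hours for r in records) / n
        return mtbf / (mtbf + mttr + maldt)

    records = [FailureRecord(480.0, 4.0, 18.0), FailureRecord(520.0, 6.0, 22.0)]
    print(f"Ao = {operational_availability(records):.3f}")  # Ao = 0.952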
This should follow a similar procedure to problem reporting during the development of a
system whereby defects are prioritised and the Contractor that provides the Through Life
Support will rectify the defects and demonstrate through V&V that the defect has been
remedied. Further information on this can be found on QEMS (QEMS Reference: UI 3319).
V&V in this activity associated with the In-Service Phase involves planning and implementing
V&V programs for minor and major changes and enhancements required to rectify
deficiencies discovered through operational use, up to the point at which the modified system
can be accepted into service.
These V&V activities will include more involvement from the end users, since they are
performed on an accepted and operational system.
These V&V activities will involve regression testing to ensure that the original functions
are not adversely affected by the modifications made. This is particularly critical for software
intensive systems. Regression testing should be performed on all units that have been
modified and on all units that are dependent upon the outputs of modified units, a principle
that can be applied to testing both software and hardware (see the sketch below).
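A minimal sketch of this selection rule is shown below, using an assumed dependency map;
the unit names are illustrative only.

    # A minimal sketch of regression test selection: retest every modified
    # unit and every unit that depends (directly or transitively) on the
    # outputs of a modified unit. Unit names are illustrative only.
    from collections import deque

    # unit -> units that consume its outputs
    dependents = {
        "nav_filter": ["display", "autopilot"],
        "autopilot": ["display"],
        "display": [],
    }

    def units_to_retest(modified: set[str]) -> set[str]:
        to_retest, queue = set(modified), deque(modified)
        while queue:
            for dep in dependents.get(queue.popleft(), []):
                if dep not in to_retest:
                    to_retest.add(dep)
                    queue.append(dep)
        return to_retest

    print(units_to_retest({"nav_filter"}))  # all three units require retest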
The V&V in managing the support system should be described in the Supportability V&V
Plan. Guidance on the development of Supportability V&V Plans will be available on QEMS
in MASF 4.
The V&V activities required in this activity associated with the In-Service Phase include
maintaining the TEMP and the Supportability V&V Plan and ensuring that the V&V training
of SPO staff is adequate.
In this activity associated with the In-Service Phase the SPO must determine whether the
outputs of supportability V&V activities indicate that any modifications are required to the
support system, or to the design characteristics of the mission system, in order to enable the
materiel system to meet the supportability requirements. This should be managed through the
Supportability V&V Plan.
V&V in the Disposal Phase involves verification that the system has been safely disposed of
(i.e. verification of the appropriate System Safety Requirements produced in the Safety Case
Report that relate to disposal).
V&V in the Disposal Phase also includes verification of compliance with security
requirements to ensure that the disposal of the system will not result in any security breaches
(i.e. that classified material has been appropriately disposed of). This involves verifying
compliance with the relevant sections of the Defence Security Manual (DSM). This would
include ensuring that the system documentation is securely stored to ensure that the
documentation can be retrieved if there is a need to recommission the system in future.
The logical approach to V&V is to use the techniques and methods that are most effective at a
given level. For example, in software V&V the most effective way to find anomalies at the
component level is inspection. However, inspection is not practical at the system level, where
the higher level of complexity makes test methods far more efficient.
Some methods of V&V, such as design reviews and trade-off analyses, are analysis techniques
that do not require experimentation themselves but which may involve the analysis of the
results of experiments already conducted.
Information on the purpose and conduct of system reviews is contained in the System Review
Guide and therefore detailed information on this method of V&V has not been described in
this manual. The System Review Guide and the DMO Checklists, which can be found in the
ASDEFCON (SM) Asset Library at
http://intranet.defence.gov.au/dmoweb/sites/cpo/default.asp?p=load.asp?page=10888, must be
consulted to obtain information on what is expected of the project office when participating in
system reviews.
The V&V methods listed in section 10.1 are described in further detail below. All of these
methods can be used for both verification and validation.
The most commonly used V&V method, in terms of cost and effort, is T&E.
All forms of T&E have a single purpose, which is to evaluate the results of testing to support
achieving defined objectives. When this principle is applied in decision-making at key
milestones, a traceable link is established through the results of T&E for assuring risk is
contained within acceptable boundaries. The results of T&E are fundamental for decision-
making when validating operational concepts and end-user requirements, evaluating designs
or modifications, identifying alternative designs, comparing and analysing trade-offs when
capability specifications cannot be met, verifying contract compliance, and evaluating system
performance.
While T&E can be applied during all phases within a capability system’s life cycle, a
significant concentration of T&E effort is necessary in the Requirements and Acquisition
phases. T&E is often characterised according to its objective and linkage to key project
milestones. Accordingly, specific terminology can be used to more succinctly describe the
T&E activity being undertaken. The most commonly used descriptions are Developmental
Test and Evaluation (DT&E), Acceptance Test and Evaluation (AT&E), Production Test and
Evaluation (PT&E) and Operational Test and Evaluation (OT&E).
DT&E reflects T&E conducted to assist the system design and development process and
support verification of technical or other performance criteria and objectives. The Contractor
usually manages DT&E. However, there may be DT&E activities conducted prior to Second
Pass approval which will be managed by an ADO T&E agency (eg. environmental testing)
and/or conducted as a joint DSTO/industry arrangement (eg. the development of concept
technology demonstrators). DT&E is conducted in a controlled and repeatable manner to
verify system compliance with contract specifications. DT&E is conducted first at the element
level, then at the sub-system level followed by the system level. In software testing this is
often referred to as unit testing, component testing, integration testing and system testing. In
software testing the system level testing is often performed according to System Level Use
Cases, which describe the inputs and expected outputs for a test.
AT&E is T&E carried out to determine whether or not the materiel developed and produced
fulfils the contractual requirements and specifications. During AT&E the system being
acquired undergoes Factory Acceptance Testing (FAT) and System Acceptance Testing
(SAT). In ASDEFCON (SM) this is referred to as Acceptance Testing. AT&E is usually
conducted by the contractor and culminates in the Final Acceptance Test, which the project
office formally witnesses to verify that the contract specifications and acceptance criteria
have been met. This is referred to in ASDEFCON (SM) as Final Acceptance Testing. The
acceptance criteria are defined in the Acceptance Test Procedures (ATProcs), which are
produced by the Contractor and reviewed and approved by the project office. Detailed test
steps of the acceptance testing are contained in the ATProcs while the
pass/fail status of the system against each test is contained in the Acceptance Test Report
(ATR). The status against the requirements is then provided in the updated Verification
Cross-Reference Matrix (VCRM). The AT&E conducted by the contractor is often referred to
as PT&E in the Navy. With higher risk projects there is usually a need to conduct a further
AT&E activity in consultation with a T&E agency to provide additional confidence that the
system can be accepted and that design acceptance can be achieved.
OT&E is T&E conducted under realistic operational conditions. OT&E is conducted with
representative users of the system, in the expected operational context, for the purpose of
determining a system’s operational effectiveness and suitability to carry out the role and fulfil
the requirement that it was intended to satisfy. OT&E activities can span the entire capability
life cycle. The scope of the OT&E will vary with the level of development involved in the
acquisition. For COTS/MOTS acquisition the amount of OT&E will be less than that
expected for a large developmental project.
OT&E can consist of Operational Assessments (OA), Initial Operational Test and Evaluation
(IOT&E) (also known as Operational Evaluation or OPEVAL) and Follow-On Operational
Test and Evaluation (FOT&E). OA is performed to identify the likely risks to achieving the
desired capability. IOT&E is performed to validate the system against the
COIs listed in the TCD using a scenario based on the scenarios listed in the OCD. FOT&E is
performed to assess the system against COIs listed in the TCD using other scenarios, to
evaluate tactics and doctrine and to assess the need for future modifications to the system.
OT&E is initially identified in the TCD and described in further detail in the TEMP and the
Validation Plan.
The extent to which OT&E will include the involvement of the Contractor will depend upon
the content of the Mission System Validation clause in the contract. This clause is blank in the
ASDEFCON (SM) template. The content of the clause must be tailored to the specific project
needs. The degree of responsibility that the Contractor should have with respect to mission
system validation should be dependent upon the level of risk that the system would not be fit
for purpose after it has undergone acceptance verification. If there is a high risk of the system
not being fit for purpose then the Contractor should have a greater degree of responsibility in
the OT&E program. In this instance the Contractor should be responsible for rectifying defects
to ensure successful mission system validation. The mission system validation should be
conducted as identified and agreed in the TEMP and the V&V Plan.
The level of OT&E required depends upon the magnitude of risk associated with the
project. A strategic materiel project will require OT&E activities, since the risk that the
system will not satisfy the operational needs is higher for strategic materiel projects than for
other acquisition projects. The risk mitigated by OT&E is the risk that the system, although it
has passed development and acceptance testing, may not satisfy the users’ operational needs
when used as intended in a practical scenario.
The following definitions of the various phases of OT&E are adapted from the US Defense
Acquisition University’s T&E Management Guide (Defense Acquisition University 2001) to
reflect the roles of these phases of OT&E as they relate to OT&E processes in Australia.
FOT&E is conducted in the In-Service Phase to ensure that operational suitability and
effectiveness in operations and exercises are as required. FOT&E is performed to:
g. assess the logistics readiness and sustainability;
h. evaluate the weapon support objectives;
i. assess the implementation of ILS planning;
j. evaluate the capability of ILS activities;
k. determine the disposal of displaced equipment; and to
l. evaluate the affordability and life cycle cost of the system.
FOT&E is similar to Supportability T&E. FOT&E is also used to identify capability gaps
which will be addressed either in an upgrade or replacement project.
Test
Test methods in this context refer specifically to physical testing. Test methods should be
used if the associated requirement needs a detailed evaluation of the performance and/or
functionality of a system.
Test methods are used to verify compliance with a requirement only if the requirement can be
verified with sufficient confidence through a physical test within the given resource
constraints. Test methods are also to be used if the requirement cannot be verified with
sufficient confidence by some less resource intensive verification method such as inspection,
demonstration, audit, system review or comparison.
Test methods should be used when the requirement needs a repeatable test sequence to be
conducted to ensure compliance with the requirement and the requirement specifies a level of
performance or functionality that is too complex to be evaluated by either inspection or
demonstration.
Test instrumentation should be appropriately calibrated prior to testing. This requires that the
SPO ensures that the test facilities are National Association of Testing Authorities (NATA)
accredited (or equivalent).
Testing may include comparison tests to guide a decision as to which piece of equipment is
selected for procurement.
Time and cost may preclude actual physical testing of a system/capability. In this case,
modelling and simulation may be a better choice. For example, firing war shot missiles is a
costly process, so the decision may be made to use a test missile or to simulate the firing
process. Modelling, whether physical (eg. wind tunnels) or in software, could also be cost
and/or time effective.
Simulation and modelling methods have some risk inherent in them in that the development of
a simulation model is a development project in itself and considerable effort may be required
to verify, validate and accredit a simulation and model if this is deemed necessary. This may
involve physical testing to validate a simulation model to ensure that the simulation model can
simulate the intended performance of a system with respect to the relevant parameters. The
US DoD Verification, Validation and Accreditation Recommended Practices Guide should be
followed for guidance on verification, validation and accreditation of simulations and models.
The Defence Simulation Proposal Guide should also be consulted for guidance on the
development and implementation of simulations.
It may be useful to conduct automated testing using a software tool. The benefits of this
include that a larger number of tests can be conducted within a given period of time, and that
such tools often allow the simulation of enabling systems, thus providing greater confidence
in the performance of the system in an operational scenario. However, the simulators within
these tools have been accredited for a specific application at a determined level of fidelity,
and it is recommended that the user accredit these simulators for the intended application. It
may also take a considerable amount of time to develop and maintain the test scripts and to
validate the test harness. Consequently, automated test methods should only be used if the
system is not undergoing a rapid rate of modification.
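The sketch below illustrates the general idea of a scripted, repeatable automated test in which
an enabling system is simulated by a stub; the names, values and threshold are assumptions,
not part of any accredited tool.

    # A minimal sketch of an automated, repeatable test in which an enabling
    # system (here, a GPS feed) is simulated by a stub so the test can run
    # without live equipment. All names and values are illustrative only.
    import unittest

    class SimulatedGpsFeed:
        """Stand-in for an enabling system not available in the test lab."""
        def position(self):
            return (-35.3, 149.1)  # fixed, known test input

    def compute_bearing(position):
        lat, lon = position
        return (lat + lon) % 360   # placeholder for the unit under test

    class BearingTest(unittest.TestCase):
        def test_bearing_within_threshold(self):
            gps = SimulatedGpsFeed()
            self.assertAlmostEqual(compute_bearing(gps.position()), 113.8, places=1)

    if __name__ == "__main__":
        unittest.main()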
Demonstration
Demonstration may be used in low risk areas where there is confidence in the system’s ability
to achieve the desired function. The difference between test and demonstration is that
demonstration is less rigorous: in a demonstration the interest is in the outcome, not
necessarily the process that produced it. Demonstration methods involve the use of a
system, sub-system or component operation to show that a requirement can be achieved by the
system. These methods are generally used for basic confirmation of system functionality
without measurement of performance. They should be used as the verification
method for a requirement only where the requirement does not require detailed data on the
performance or functionality of the system in order to confirm compliance with the
requirement.
Inspection and demonstration verification methods are also to be used to verify compliance
with requirements that cannot be expressed meaningfully in quantitative terms (e.g. usability
requirements).
Inspection
In very low risk areas, or where the test criteria are simple, inspection may suffice. Inspection
methods consist of visual examinations of the system, component or sub-system. They are
used to verify physical design features or specific manufacturer identification. They may be
used when a formal test is not required because functionality or performance is not being
verified; instead, only the physical configuration of the system is being verified, as in physical
configuration audits. For example: are the required safety labels present? This has to be
checked, but it is not an involved process.
Experimentation
A Defence Trial will usually comprise one or more of the activities described above.
Analysis
Analysis methods are required when it is not physically possible to conduct a test capable of
confirming compliance with a requirement within the budgetary, schedule or availability
constraints. An example is a requirement to meet certain reliability, availability or
maintainability thresholds (eg. an MTBF of no less than 5000 hours). Such requirements
could only be verified by demonstration in the In-Service Phase; consequently, the only way
of confirming compliance prior to contractual acceptance would be through analysis methods.
Analysis verification methods are to be used in the early stages of the capability life cycle
when a physical prototype is not available or is not cost effective to produce. In summary,
analysis verification methods are used to verify performance requirements that are too costly
or difficult to verify by other means.
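To illustrate why physical testing of such a requirement is impractical, consider a standard
zero-failure demonstration calculation (assuming exponentially distributed times between
failures; this is offered as background, not a method mandated by this manual).
Demonstrating an MTBF of m hours at a confidence level of (1 - a), with no failures
observed, requires a total test time of:

    T = m x ln(1/a)

For m = 5000 hours at 90% confidence (a = 0.1), T = 5000 x 2.30, or approximately 11,500
failure-free operating hours, i.e. well over a year of continuous operation, and longer still if
any failures occur.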
As with simulation and modelling, analysis techniques have an inherent risk in that the
analysis system may itself need to be developed, and so may need to be tested and evaluated
prior to its use.
In software V&V the main analysis methods are: software requirements traceability analysis,
software requirements interface analysis, design traceability analysis, source code traceability
analysis, algorithm analysis and simulation. These are discussed further in IEEE 1012-1998.
Additional optional activities that suit the V&V of software are also described in IEEE
1012-1998.
One important consideration with analysis methods is that often the conduct of an analysis
activity results in the production of a data item, which has to be reviewed by the project office.
The project office must consider whether it has the time, skills and resources to review all of
the data items that will be produced in the V&V program.
Audits
Audits are independent examinations of a work product or set of work products to assess
compliance with specifications, standards, contractual agreements, or other criteria. Audits
are a type of inspection. These are required to provide assurance of functional and physical
configuration of a system. The three main types of audits are Functional Configuration
Audits, Physical Configuration Audits and Quality Audits.
Walkthroughs
Walkthroughs are a process of examining documentation and systems to ensure the technical
integrity and validity of the design. The types of walkthroughs that are conducted are
requirements walkthroughs, design walkthroughs, source code walkthroughs and test
walkthroughs. Requirements walkthroughs are conducted in requirements validation. Design
walkthroughs and source code walkthroughs are conducted in detailed design reviews. Test
walkthroughs are conducted prior to TRRs. These types of walkthroughs are described further
in IEEE 1012-1998.
System Reviews
System reviews are conducted to verify that the status of the system is such that it can be
transitioned into the next phase of the systems engineering process.
A brief summary of the activities conducted in system reviews is provided in Table 6 below:
Preliminary Design Review (PDR): In the PDR the CoA must review the sub-system
specifications to ensure that they will satisfy the requirements. The contractor should present
trade study results or demonstrate how the recommended design will meet or exceed the
functional requirements of the materiel system.

Detailed Design Review (DDR): At the DDR the CoA reviews the product designs to ensure
that they satisfy the parent requirements. It is also necessary to check that the enabling
product requirements have been defined in the design documentation presented by the
contractor at the DDR.

Functional Configuration Audit (FCA): The purpose of the FCA is to verify that the
configuration items comply with the requirements.

Physical Configuration Audit (PCA): The purpose of the PCA is to verify that the
configuration items were built as per the design documentation.

Test Readiness Review (TRR): In the TRR the coverage of the requirements must be checked
to ensure that the tests will enable the accurate verification of compliance with requirements.

Support System Detailed Design Review (SSDDR): In the SSDDR the final RTM is reviewed,
outstanding ATPlans are reviewed and requirements pertaining to support system facilities
and equipment are validated.

Support and Test Equipment Provisioning Preparedness Review (S&TEPPR): In the
S&TEPPR the CoA must confirm with the contractor that the support and test equipment will
enable achievement of the functional requirements baselines for both the mission system and
the support system at minimal support and test equipment cost.

Spares Provisioning Preparedness Review (SPPR): The SPPR is conducted to verify that the
spares to be acquired will enable the mission system and the support system to comply with
requirements at minimal spares cost. The requirements for packaging are also reviewed at the
SPPR. Facilities Requirements Analysis Reports (FRAR) may be used in verifying that the
spares provisioning will allow the requirements to be met.

Task Analysis Requirements Review (TARR): In the TARR the CoA must determine whether
failure modes, preventative maintenance requirements, and operator and non-maintenance
tasks with logistics requirements have been addressed by the documented tasks. The Training
Needs Analysis Report is to be reviewed by the CoA to ensure that the full set of training
requirements has been identified.

Long Lead Time Items Review (LLTIR): In the LLTIR the requirements for long lead time
items (ie. spares, equipment and other supplies) must be validated.
System reviews are discussed further in the System Review Guide which can be found on
QEMS at http://qems.dcb.defence.gov.au.
Historical Data
Data collected in the past on similar systems, used in similar configurations, can be employed
to support the verification of a system being acquired or sustained. This may include data
from analysis, demonstration, inspection or test activities conducted on similar systems, as
well as data collected from the in-service use of a similar system (eg. RAM data).
Using this verification method may involve applying analysis methods to determine whether
the historical data is relevant to the system being acquired or sustained. This may include
checking to see whether the system configuration that was used in deriving the historical data
was significantly different from that intended for the system being acquired or sustained or
checking to see whether a test lab that was used was appropriately certified (eg. NATA
certified) to conduct the relevant test.
Conformance Certificates
The Conformance Certificate method of V&V is to be used when an item that is to be acquired
has already been subjected to a number of tests in the past, either by Defence or by an
independent agency, and the test results indicated that the system conforms to the
requirements of a standard that is quoted in a requirement. A conformance certificate should
only be accepted for testing conducted by appropriately certified test laboratories (ie. NATA
certified). It is also important to consider the configuration of the system under test and the
specific test procedures used, in order to determine whether the test results will be relevant for
verifying whether the system being acquired is fit for purpose. If the configuration of the
system being acquired is to be modified for its intended application in the ADF, then the
results of the previously conducted testing may not be relevant for the purposes of V&V.
T&E differs from Quality Assurance (QA) in that QA aims to ensure the repeatability of
product from a production process, including an audit trail via the process documentation,
whereas the overall aim of T&E is to mitigate the risk that the delivered product does not
fulfil its intended capability role. T&E achieves this by assessing whether the user’s needs
have been fulfilled, and so addresses a higher level goal than that of QA.
Independent staff to conduct V&V (ie. IV&V staff) are employed when financial and
management independence from both Defence and the contractor is required. For example,
safety case evaluators must be IV&V staff. When there are insufficient staff, either within
Defence or within the Contractor’s operation, with the requisite skills to conduct a V&V
activity, IV&V staff may be engaged to conduct the activity.
At the time of writing the preferred method for obtaining IV&V services is to either use the
PMSS panel, details of which can be found at http://intranet.defence.gov.au/dmoweb/sites/BCS/), or to
use the ASDEFCON (Services) RFT and contract templates, which can be found at
http://intranet.defence.gov.au/dmoweb/sites/CPO/). The DMOSS panel is in the process of being
established. It will be possible to select IV&V staff using the DMOSS panel when it is
operating in 2005.
One of the major problems is that people view T&E as something that should be done late in
the project rather than understanding that it is part of a V&V process that provides the best
value for money when it is performed earlier in the process.
The following may not be a comprehensive list of the problems that can occur with V&V, but
it covers some of the major issues.
Broadly, the five main types of errors that can occur in V&V are:
1. The requirements have not been validated, and consequently compliance with the
requirements would not result in satisfaction of the operational need.
2. The requirements have not been communicated clearly to the Contractor, leading to an
inappropriate solution.
3. The Contractor has not implemented the validated requirements correctly in the system
design.
4. The verification of the system with respect to the requirements has not been conducted
properly, leading to the system being erroneously identified as compliant with the
requirements.
5. The validation of the system against the operational need has not been conducted
properly, leading to the system being erroneously identified as fit for purpose.
Some specific mistakes that relate to all of these types of errors are listed
below.
a. The responsibilities of various stakeholders with respect to V&V may be ill defined.
Some specific mistakes relating to the first type of error are listed below.
a. The requirements are not verifiable (e.g. verifying a requirement for an MTBF of
5000 hours with a high degree of confidence would take a significant and excessive
amount of time).
b. The requirements may not be valid, in the sense that they do not indicate the
appropriate verification method, are incomplete or may be ambiguous and so there is
uncertainty in terms of how to go about verifying compliance with the requirement.
c. The requirements may not be valid in that they do not individually or collectively
address the operational need (i.e. you could strictly comply with the requirements but
not address the operational need). For example, a requirement may specify that the
system shall carry out a function but it may not specify the performance required or the
system configuration that the function must be performed in. Consequently, that
function may not be performed sufficiently fast to satisfy the operational need or it
may not function well enough in a configuration that would be used in practice.
e. Suitable measures (parameters) may not have been properly identified to assist in
determining whether the system will actually satisfy the operational need. Performance
against the measures selected may indicate that the system performs well by some
criteria but not by all of the relevant criteria required to demonstrate operational
effectiveness or operational suitability.
Further details relating to the second and third types of error are listed below.
The second and third errors are usually prevented by appropriate requirements elicitation and
ensuring that the system design is valid for addressing the full set of requirements. These
topics are further discussed in Annex C.
The main problem arising from the second and third types of errors is that user requirements
may not all have been captured in the design process or the requirements elicitation process.
Consequently, some of these deficiencies may have been overlooked in a system review
process, and the requirements may not have been recorded or verified.
Some specific mistakes relating to the fourth and fifth types of errors are listed below.
f. The test steps are not adequately described, in that they do not contain enough detail to
be repeatable, and so do not provide Defence with sufficient confidence that the test
will adequately verify compliance with the requirement.
g. The testing does not include representative types of test input data (this is particularly
relevant in software testing).
h. The test results may not be recorded in sufficient detail and consequently a test result
that has only demonstrated partial verification may be recorded in the VCRM as
having passed (particularly if the only types of verification status recorded are PASS or
FAIL).
i. Expensive testing that requires the involvement of large numbers of specialist users,
such as OpEvals, may not be planned or funded far enough in advance for it to be
conducted, or conducted properly, prior to System Acceptance or Acceptance Into
Service. This may cause either a delay in the delivery of the system or a higher
residual risk that the Capability Manager may have to accept.
k. The expected result of the test is not adequately described in the test procedures.
l. The system configuration is not properly recorded, so the tests are not repeatable.
For example, unrecorded changes may have been made prior to testing to configurable
system parameters that affect performance (such as the time delay between sending
packets of data).
m. Emphasis is not placed on planning V&V to ensure that defects will be identified early
in the lifecycle (obviously it is much cheaper to rectify a defect early in the lifecycle).
n. The tests may not be conducted under controlled conditions, meaning that the tests are
not repeatable; or randomisation or blocking may not have been used to minimise the
impact of factors other than those pertaining to the system performance itself.
o. The tests may not have been prioritised according to risk and consequently compliance
with some high-risk requirements may not have been verified as thoroughly as they
should have been.
p. Sometimes contractors do not produce test plans far enough in advance to enable the
plans to be reviewed in detail by Defence T&E staff, thus increasing the risk of invalid
testing being conducted.
q. Various alternative V&V techniques other than physical testing such as analysis
techniques (e.g. simulation) may be insufficiently used. In some instances these may
be the only viable way of determining compliance with a requirement (because of the
difficulty of conducting physical testing or the lack of control often inherent in
physical testing).
s. The OT&E may not exercise the system in a realistic operational scenario (for example
a system may be tested in a very limited configuration or a very limited scenario that is
not representative of how it would be used in practice).
t. Sometimes insufficient care has been given to determining the acceptable level of
residual risk that the Capability Manager is prepared to accept when determining the
acceptance criteria.
u. Sometimes, the requirements database is not kept up to date and so some requirements
may not be adequately verified.
w. The TRA's requirements (including safety issues) may not be considered until too late
in the process - making them more difficult and expensive to comply with.
x. Sufficient time between acceptance testing and delivery to the users has not been
allocated to take into account the amount of time that may be required for rectification
of defects.
There are two main types of critical issues pertaining to the acquisition of a capability. These
are critical operational issues (COIs) and constraints.
COIs form a basis for rejecting or accepting a capability. Identifying COIs is done by
determining what the major risks are to achieving the desired capability. That is, the system
would not be suitable for operational deployment and would not exhibit the desired capability
unless all COIs are resolved. COIs are solution independent. COIs are usually phrased as a
question and not in the form of requirements. COIs are usually resolved in OT&E.
COIs are identified in the OCD and the TCD, before Second Pass approval, and are derived
from the system mission scenarios and the operational needs described in the OCD. COIs
must address a specific system capability and significantly impact system effectiveness or
suitability. COIs state the mandatory operational requirements of a capability. They may
refer to requirements that are already satisfied, partially or fully, by the present capability.
For instance, if a means of enhancing the operational range of frigates is to be acquired by the
Navy, ‘Will the system enable a fully-equipped frigate to deploy (without stopovers) from
Australia to any port in the world?’ may be a relevant COI. If, perhaps, frigates were not
satisfactorily protected against various threats, such as enemy submarines, by our current
capabilities, it may be that we need to find a new solution to enhance the detection of enemy
submarines. There may be various solutions to achieve this. ‘Will the system enhance the
protection of Navy assets against enemy submarines in a combat environment?’ or ‘Will the
system detect the threat in a combat environment at adequate range to allow successful
engagement?’ would be example operational effectiveness related COIs. It should be possible
to relate these to the operational scenarios described in the OCD.
There are also operational suitability COIs which relate to the ability of the system to be
suitable for service. In this example operational suitability COIs could include: ‘Will the
system be able to maintain its detection capabilities throughout a typical mission described in
the OCD?’ or ‘Will the system be safe to operate in the anticipated operational environment?’
Constraints are political, social, legal, technical or economic limitations placed on a system
acquisition. An example of a constraint is a legal limitation that prevents a solution for
achieving a capability that uses prohibited substances (e.g. depleted uranium). This then
places technical constraints on a project, which are listed in the Critical Technical Parameters
(CTP) in the TCD.
Critical technical parameters (CTPs) are also to be identified in the TCD. CTPs state technical
characteristics of the system (i.e. design characteristics) that must be met in order for the
system to be able to satisfy the desired capability. A CTP is so important that failure to meet
the threshold may cause the project to be cancelled. For example, that the bore of a gun must
be 155mm would be a CTP if, in this case, 155mm ammunition were required in order to
achieve the desired capability. Compliance with CTPs is determined in Acceptance Testing.
COIs can be decomposed into a number of testable measures that collectively enable the
satisfaction of each COI to be confirmed. A number of Measures of Effectiveness (MOEs)
and Measures of Suitability (MOSs) can be directly derived from a single COI while Measures
of Performance (MOPs) can be derived from the MOEs and MOSs. The MOEs, MOSs and
MOPs form a hierarchy of measures that describe the desired effectiveness and performance
of a system against the underlying COIs.
MOEs are:
a. metrics of warfighter effectiveness;
b. attributes of a system that directly contribute to the operational effectiveness of a
system in the intended operational scenario;
c. mission oriented;
d. solution independent; and
e. measures of how well the system meets the user needs.
MOPs are the technical parameters that are directly measured when evaluating a system.
Against each MOP is an evaluation threshold.
A certain amount of data needs to be acquired through the various V&V methods to determine
whether the evaluation threshold has been achieved. The data requirements (DRs) are the type
and number of measurements that need to be taken in order to measure the MOP with the
required level of confidence.
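As an illustration only (not a procedure mandated by this manual), the success-run relationship gives one simple way of sizing a data requirement: the number of failure-free trials needed to demonstrate a probability-of-success MOP at a stated confidence level. A minimal Python sketch:

    import math

    def zero_failure_trials(threshold: float, confidence: float) -> int:
        # Smallest n such that threshold**n <= 1 - confidence, i.e. n
        # consecutive successes demonstrate the threshold at that confidence.
        return math.ceil(math.log(1.0 - confidence) / math.log(threshold))

    # e.g. a 0.9 probability-of-detection threshold demonstrated at 90%
    # confidence requires 22 failure-free detection trials.
    print(zero_failure_trials(threshold=0.90, confidence=0.90))  # -> 22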
The number of V&V activities that need to be conducted is dependent upon the factors listed
below.
a. Whether the system is new or is an upgrade or modification of an existing system.
b. The effect of a system failure, which can range from catastrophic (a total failure of the
mission) to minor (results in inconvenience or additional cost).
c. The variability of effectiveness in the operational environment.
d. Whether the system will meet opposition and what the nature of the opposition might
be (Sessler et al 2000).
e. The level of confidence required in the result of the V&V activity.
The level of confidence required in the result, and therefore the level of confidence in meeting
the requirement, is dependent upon the priority of the requirement. This, in turn, is dependent
upon the level of risk associated with the relevant aspect of operational effectiveness or
operational suitability that the requirement concerns.
[Figure: hierarchy of measures, showing COIs and constraints decomposing into effectiveness and suitability MOPs, each with associated data requirements (DRs).]
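To make this hierarchy concrete, the following minimal sketch (Python, with invented names; not a prescribed data model) records a COI, a derived measure, its MOPs, thresholds and data requirements:

    from dataclasses import dataclass, field

    @dataclass
    class MOP:
        description: str         # directly measurable technical parameter
        threshold: str           # evaluation threshold for this MOP
        data_requirements: list  # type and number of measurements needed

    @dataclass
    class Measure:               # an MOE (effectiveness) or MOS (suitability)
        description: str
        mops: list = field(default_factory=list)

    @dataclass
    class COI:
        question: str            # COIs are phrased as questions
        measures: list = field(default_factory=list)

    # Invented example, loosely based on the frigate scenario above.
    coi = COI("Will the system detect the threat at adequate range?",
              [Measure("Detection range against submarine targets",
                       [MOP("Median detection range", ">= the specified range",
                            ["20 detection runs across representative sea states"])])])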
AS 4216 lists the main software quality characteristics. Software only satisfies its intended
need within the context of a complete system, and all complex Defence systems use software
to perform their intended functions, so these characteristics are effectively system quality
characteristics. The characteristics listed in AS 4216 are general enough to be applied to any
type of complex system, and these sorts of quality characteristics should be addressed in the
MOEs and MOSs for a system.
These characteristics, and various sub-characteristics associated with them, from AS 4216 are
listed below.
Functionality
Functionality relates to whether the system performs the functions that it must in order to
satisfy the specified requirements and the operational need.
Reliability
Reliability is the ability of a system to perform its intended function consistently over a given
period of time.
Usability
Usability is the ease with which users can understand, learn and operate the system.
Efficiency
Efficiency is the level of performance of the system given a set of resources (i.e. hardware and
communications systems) under controlled conditions.
Maintainability
Maintainability is the ease with which the system can be modified, including to correct defects.
Portability
Portability is the ability for the system to be integrated into alternative environments.
MOPs need to be determined for each of these MOEs and evaluation thresholds need to be
determined for each MOP.
MOEs are also measures of external quality. External quality is defined in AS1598 as ‘the
extent to which a product satisfies stated and implied needs when used under specified
conditions’.
There is no single course that covers all of the T&E and V&V training requirements of the
ADO at present. The various T&E organisations and DMO have identified some courses
which combined together have the potential to provide a basis for DMO V&V. Some of these
courses are still being assessed or are under review.
The T&E Practitioners Course, run by PlanIT, is currently being reviewed by RANTEAA to
ensure its suitability for use in training T&E practitioners in the DMO and T&E agencies.
ARDU currently require their test engineers to do the Principles of T&E and Operational T&E
courses run by the University of South Australia.
DTRIALS is assessing a combination of the Principles of T&E, Operational T&E and the
T&E Practitioners course as a means of providing the required T&E training. DTRIALS is
also looking to review the competencies required for T&E practitioners and trying to ascertain
if other training providers exist.
End to End Software Testing – a two-day overview of the IV&V Australia software testing
process.
The DMO recommends using CORE for the development of scenarios and recommends using
DOORS or RDT for requirements management, including the tracking of verification of
requirements in the VCRM.
A T&E management tool is to be selected by the ADO. The benefit of a test management tool
would be in providing traceability from requirements to test plans to test procedures and test
reports. A test management tool can be used to monitor progress towards addressing
requirements, to track problem reports and the resolution of defects, and to record variations
to test procedures, MOPs, MOEs and ultimately COIs, ensuring that the project is on schedule
and within budget.
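As a simple illustration of the traceability such a tool provides (a hypothetical sketch, not a depiction of any particular product), each requirement can be linked to the procedures that verify it, so that uncovered requirements and outstanding defects can be reported:

    # Hypothetical VCRM-style records: requirement -> (procedure, outcome) pairs.
    vcrm = {
        "FPS-3.2.1": [("ATProc-017", "passed")],
        "FPS-3.2.2": [("ATProc-018", "failed")],  # outstanding defect
        "FPS-3.2.3": [],                          # no test coverage yet
    }

    for requirement, results in vcrm.items():
        if not results:
            print(requirement, "- no test coverage")
        elif any(outcome != "passed" for _, outcome in results):
            print(requirement, "- defect outstanding")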
Annexes
Annex A: GLOSSARY OF TERMS THAT MAY BE ENCOUNTERED
These definitions have been developed in a generic form and may require qualification when
used in specific contexts. If elaboration is required then the definition of the term should be
used with any explanatory comment following. These definitions include the complete list of
terms from the T&E Lexicon, which has been endorsed by DTRIALS, as well as some
additional definitions from relevant Defence policy.
Acceptance – means acceptance of the Supplies in accordance with clause 6.5 of the
conditions of contract signified by the Project Authority’s signature of the Supplies
Acceptance Certificate; and "Accept" has a corresponding meaning.
Acceptance Into Naval Service (AINS) – this term referred to the milestone, in the Materiel
Life Cycle, at which Chief of Navy was satisfied that the equipment was, in all respects,
suitable for RAN operational service. This milestone has been superseded by Operational
Release (OR), and is used only in reference to legacy projects.
Acceptance Test and Evaluation (AT&E) – T&E carried out to demonstrate whether or not
the materiel developed and produced fulfils the contractual requirements and specifications.
AT&E is a subset of Acceptance V&V.
Aircraft Stores Clearance Test and Evaluation - The process of test and evaluation
necessary to clear carriage and/or release of an airborne store from an aircraft.
Audit – These are checks used to assure the SPO that the functional and physical
configuration of a system is as specified. The three main types of audits are Functional
Configuration Audits, Physical Configuration Audits and Quality Audits.
Critical Issues – the requirements that are of such significance that together they characterise
the required outcomes of the proposed system. They are deemed critical because any one, if
not resolved, can be a ‘show stopper’. Critical issues are usually categorised as either Critical
Operational Issues or Critical Technical Parameters (CTPs). CTPs can be derived from
policy, cost-related or specified technical constraints.
Critical Operational Issue (COI) – questions (hence qualitative issues) that need to be
answered to prove functional aspects (operational and support) of the required capability
system have been met by the acquired system. The COIs or questions are decomposed by
T&E specialists into testable elements to provide data that enables the issues to be
evaluated.
Critical Technical Parameter (CTP) – technical aspects that directly describe or quantify a
system’s performance, or impact compatibility of components within the system or interaction
with other systems. They are ‘system qualifiers’ and can impose design limits in the form of
weight, length, area, volume, frequency response, temperature, electrical or magnetic
thresholds, etc. CTPs will be tested by direct measures.
Defence T&E Community - the Australian Defence Organisation’s test facilities, T&E
agencies/authorities, industry and personnel concerned with T&E.
Defence Trial – A trial conducted under the authorisation of the Director of Trials.
Demonstration – an activity conducted to exhibit a system’s performance capability; it is
differentiated from testing by the level of risk involved and the subsequent reduced detail in
data gathering.
Design acceptance – certifying that an approved design is acceptable and meets the individual
ADO technical requirements, as detailed in the DMO’s Technical Certification Plan. The
design review should involve some assessment of the ability of the design to meet the
operational requirements.
Development Test and Evaluation (DT&E) – T&E conducted specifically to assist the
system design and development process and to verify attainment of technical or other
performance criteria and objectives. It normally applies in the design stage when developing
new systems, but also applies to in-service systems when developing upgrades or modifications
to those systems.
Evaluation – the process of review and analysis of quantitative or qualitative data to provide
an objective assessment of a system’s safety, performance, functionality and supportability in
measurable terms, to determine fitness for purpose or for contractual compliance. Generally,
the review will be to compare results against agreed criteria, but can also be to identify design
or system limitations, and operational use issues.
Final Acceptance – means acceptance of the Capability in accordance with clause 6.6 of the
conditions of contract signified by the Project Authority’s signature of the Final Acceptance
Certificate.
Initial Operational Capability (IOC) – the point in time at which a capability system is
deemed by the Capability Manager as ready to be deployed in an operational role. IOC
coincides with Operational Acceptance or Operational Release of whole or partial capability.
If IOC coincides with the operational acceptance/release of the system’s full capability, the
system is deemed to have achieved its In-Service Date (ISD).
Initial Operational Release – the first operational release in a progressive release of full
operational capability.
Initial Operational Test and Evaluation (IOT&E) – T&E that is the first time the system
(full or partial capability) is tested on production representative test articles used by typical
operators with typical field equipment in a realistic environment. The objective of this type of
testing is to determine operational effectiveness and suitability through resolution of critical
operational issues, and to ensure deficiencies discovered in earlier operational
assessments/evaluations have been corrected. Whether this will include the involvement of
the Contractor will depend upon the Mission System Validation clause in the contract. IOT&E
usually follows system acceptance, but can be combined with T&E in support of system
acceptance.
In-Service Date (ISD) - the point in time when a capability system achieves operational
acceptance / release.
Integrated Logistic Support (ILS) - a disciplined approach to the management and technical
activities necessary to:
- cause support considerations to positively influence concepts, design requirements
and design selection;
- define and integrate logistic support requirements and optimise support requirements
with weapon system performance;
- acquire the required support;
- provide the required support during the operational phase at minimum life-cycle
cost; and
- address logistic support requirements during the disposal phase.
Materiel System - the combination of the Mission System and Support System, which covers
most aspects of the Fundamental Inputs to Capability (FIC), ie organisations, personnel,
collective training, major systems, supplies, facilities and support (for materiel system), and
command and management (for support system), but does not include operational doctrine.
Measure of Effectiveness (MOE) – a parameter that describes how well a system
accomplishes its assigned role, independent of the equipment solution.
Operational Effectiveness – the ability of a system to perform its intended function over its
intended operational spectrum, in the expected operational environment, and in the face of
expected threats, when operated by typical operational personnel.
Operational Suitability – the capacity of the system, when operated and maintained by
typical operational personnel in expected numbers, at the expected level of competency, to be
reliable, maintainable, available, logistically supportable, compatible, interoperable, safe and
ergonomically satisfactory.
Operational Test and Evaluation (OT&E) – T&E conducted under realistic operational
conditions. OT&E is conducted with representative users of the system, in the expected
operational context, for the purpose of determining its operational effectiveness and suitability
to carry out the role and fulfil the requirement that it was intended to satisfy.
Qualification Testing – testing employed to verify that the design and manufacturing
processes comply with mandated specifications and standards and provides a baseline for
subsequent acceptance/production tests. Qualification of a product must be conducted against
set standards. When a design is qualified, product that complies with the design does not need to
undergo additional testing against those standards each time it is manufactured.
Safety and Suitability for Service (S3) – a term used to summarise the requirements for
materiel to be acceptably free from hazards and to have inherent characteristics that meet
specified requirements during its agreed life cycle. This definition generally excludes
operational effectiveness and lethality but may include certain performance characteristics if
these aspects are deemed to be part of the item design function.
Safety and Suitability for Service (Ordnance specific) - the inability of an item which
contains explosives to hazard the lives of Servicemen or the public at large, or to cause
damage to Australian Government or private property; the capability of parts which contain
explosives to function as designed; and the assurance that this functioning will not be
unacceptably degraded by the Service environment.
Simulation – The implementation or exercise of a model over time (DI (G) OPS 42-1).
Support System - the organisation of hardware, software, materiel, facilities, personnel, data,
processes, and services required to enable the mission system to be effectively operated and
supported to meet its operational requirements.
Supportability - the degree to which planned support (including test, measurement, and
diagnostic equipment; spares and repair parts; technical data; support facilities; transportation
requirements; training; manpower; and software support) meets system reliability, availability,
and maintainability requirements.
System Acceptance – the acknowledgment by the DMO project authority that an acquired
system complies with contractual requirements.
System Reviews - a series of system engineering activities by which the technical progress on
a project is assessed relative to its technical or contractual requirements. The formal reviews
are conducted at logical transition points in the development effort to identify and correct
problems resulting from the work completed thus far before they can disrupt or delay
the technical progress. The reviews provide a method for the Contractor and procuring activity
to determine that the development of a CI and its identification has met contractual
requirements.
System Performance Parameter (SPP) – a system parameter, the value of which can be
calculated or derived from a number of direct measurements.
T&E Activity - an activity that has an outcome of producing objective evidence to support
making critical decisions. A T&E activity may encompass a combination of inspections,
testing, demonstration and analysis to deliver the necessary outcome.
T&E Community - the collective term for test facilities, T&E agencies/authorities and
personnel concerned with T&E, particularly within Australia. The T&E Community is
inclusive of the Defence T&E Community. Example members of the T&E community
include the CSIRO, NATA, and SEEC.
T&E Principals’ Forum – a group of Defence T&E Principals representing their Services,
Capability and Analysis Group, DSTO and DMO who have the responsibility of commanding
or directing T&E agencies and authorities.
Test Concept Document (TCD) – The Commonwealth document that outlines the T&E
strategy (including major T&E objectives, activities, funding and identification of
responsibilities) throughout the Materiel Life Cycle. The TCD is one of the Capability
Definition Documents and is produced to assist in achieving Second Pass Government
approval.
Test and Evaluation Master Plan (TEMP) – document that describes the T&E objectives,
activities, funding and organisation and staff roles and responsibilities for planning the
conduct of T&E.
Test – an activity in which a scientific method is used to obtain quantitative or qualitative data
relating to the safety, performance, functionality and supportability of a system.
Trial - an activity consisting of single or multiple tests conducted to establish the performance
and/or characteristics of equipment, a system or a concept. In this context, a trial usually takes
the form of a planned process aimed at exercising the subject under test in its actual or
simulated environment to produce data for analysis.
Validation – Proof through evaluation of objective evidence that the specified intended end
use of a product is accomplished in an intended environment.
Introduction
This annex is a guide to assist test managers in developing a TEMP for strategic materiel
projects.
The TEMP is a single document that outlines the whole V&V program and its relationship to
other engineering activities. It provides traceability up to the CDD (and in particular the TCD)
and down to the lower level test plans. The TEMP summarises the current status of the
system with respect to compliance with the Acquisition, Capability and Acceptance Into
Service Baselines (compliance with respect to the Acquisition Baseline is more
comprehensively described in the VCRM). The TEMP summarises what the major risks of
failing to achieve V&V against these baselines are and how they are to be mitigated. The
TEMP ensures that all of the V&V activities have been planned and funded and have clear
objectives. The TEMP ensures that the status of the system with respect to compliance with
requirements and fitness for purpose (i.e. validation) is clearly documented in a single
document. The TEMP also provides a hierarchy of measures against which the fitness for
purpose of the system can be measured. The TEMP ensures that a consistent program has been set in
place to verify compliance with requirements, manage contractor V&V activities, develop
acceptance criteria and validate the system. The TEMP describes the hierarchy of test plans
and refers to lower level test plans for further details on the V&V program. The TEMP does
not have to be issued to the contractor but the contractor may request it for information to
assist in drafting the V&VP.
The TEMP is developed by an IPT immediately after second pass approval under the direction
of the DMO SPO. However, in some projects it may be necessary to start developing the
TEMP before second pass approval if the TCD does not provide sufficiently detailed
information on the cost of T&E activities for second pass approval to be achieved.
The IPT consists of representatives from the DMO SPO, the relevant T&E agency, CDG and
the Capability Manager. In practice, the TEMP would be developed by the T&E Manager in
the DMO SPO and reviewed by the other representatives of the IPT. A TEMP for a RAAF
project must be endorsed by the project director, the Aerospace Acquisition Division Chief
Engineer, the RAAF operational sponsor and the Design Acceptance Authority for an aircraft
project or the appropriate ground system Design Acceptance Authority Representative for a
ground based project. A TEMP for a RAN or Army project needs to be endorsed by the
project director and the RAN or Army operational sponsor.
The TEMP must initially be consistent with the CDD. However, unlike the TCD, the TEMP
is a living document. The TEMP is updated whenever a major test activity is completed,
whenever there are changes to the Acquisition Baseline, whenever there are changes to the
acceptance criteria or whenever changes need to be made to the nature of V&V activities
required in order to achieve acceptance.
The TEMP provides more detail than is described in the TCD and is the plan that outlines how
the V&V evidence required for Design Acceptance, System Acceptance and Acceptance Into
Service will be obtained. It also describes the status of testing conducted to date and identifies
the necessary verification and validation activities. The TEMP relates the V&V program
schedule and required resources for conducting V&V to the COIs, CTPs, test objectives,
evaluation thresholds and the milestone decision points.
The value of planning, conducting and tracking a V&V program through the TEMP and
subordinate test plans, procedures and reports, in mitigating the risk of non-compliance with
requirements and operational needs, greatly exceeds the cost of producing these documents.
As a rough guide it is expected that the TEMP should be approximately 50 pages in length.
SECTION VI - SAFETY
6.1 Assessment of Safety
6.2 Critical Safety Issues
6.3 Safety Management for V&V activities
• Is the section written from the perspective of Defence monitoring of Contractor DT&E
activities?
• Does it identify the nature and scope of DT&E to be required of the Contractor?
• Are the basic test phases/blocks clearly described?
• Are the criteria expressed in terms of where in the overall program specific values are
to be met (milestones at which incremental/final values are to be met)?
• Is the scope of testing (quantity) clearly stated for each phase/block?
• Is it clear how the objectives are to be met?
• Are items outside the control of the SPO clearly identified?
Section IV (Validation)
• Is there a correlation to TEMP Format paragraph 4.1 - 4.3?
• Has the relevant OT&E agency been involved in the development of Validation
objectives?
Section V (Acceptance V&V)
• Is there a correlation to TEMP Format paragraph 5.1 - 5.3?
Section VI (Safety)
• Is there a correlation to TEMP Format paragraph 6.1 – 6.3?
Section VII (Specialty Test Programs)
• Is there a correlation to TEMP Format para 7.1 - 7.2?
Section VIII (Supportability Test Plan)
• Are the business models identified?
• Are all the business model organisations identified?
• Are the funding/resource requirements identified?
• Is a schedule provided showing when all testing is to be conducted?
Section IX (Transition Plan)
• Have all items been identified that have to be transitioned into service?
• Are all the organisations identified to accept assets?
• Are the funding/resource requirements identified?
• Is a schedule provided showing when all major and key resources are to be
transitioned?
Section X (Special Resource Summary)
• Are the quantities, types and configurations of test articles identified?
• Are all of the major test sites, instrumentation and facilities identified?
• Are all the major support resources identified?
• Is the source organisation identified for each resource?
• Are the funding requirements and any known shortfalls identified?
• Is a schedule provided showing when all major and key resources are required, and any
conflicts identified?
• Are the resources adequate to conduct the test program?
Section XI (Identify/Estimate Funding and Human Resource Limitations)
• Have any/all shortfalls in funding or human resources been identified?
The process for the development of the TEMP is described below in Figure B1 in the form of a Functional Flow Block Diagram (FFBD).
[Figure B1: TEMP development FFBD. Inputs include the TCD, the OCD, the FPS, T&E agency advice, changes in project scope, the Contractor-developed V&VP, the Safety Case, the SV&VP and the Transition Plan. The steps, with the TEMP sections they produce, are: describe key features of the system (s.1.1); identify system performance objectives (s.1.2); identify and allocate critical T&E issues (s.1.3 – 1.6); assess resource requirements (s.2.2, 2.3); allocate management responsibility (s.2.1); develop the DT&E, AV&V and Validation program outlines (s.3, 4 and 5); develop the safety summary (s.6); develop the specialty T&E program summary (s.7); develop the SV&VP summary (s.8); develop the transition plan summary (s.9); and review, approve and maintain the TEMP (approved and updated TEMP).]
This process is then broken down into the development of each section of the TEMP as described below.
The references to sections of the OCD are based on the OCD format contained in version 1.2 of the Capability Definition Documents (CDD) Guide. The
references to sections of the TCD are based on the TCD format, which is also contained in version 1.2 of the CDD Guide. These are the latest versions of these
formats at the time of writing.
Within this section the operational need must be described in section 1.1.1. Section 1.1.1 should draw on section 3.5 of the OCD
for details on the objectives of the system within the context of the sorts of missions it is to be used in and the general capabilities that the system
must have. This section should refer to the OCD rather than paraphrase it, to ensure that the intent of the OCD is fully captured and
to simplify configuration control of the documents.
Section 1.1.2 should describe the missions that need to be accomplished by the system. This should be taken from the relevant annex of the
OCD.
Section 1.1.3 should describe the logistical and operational environment that the system will be used in as per the operational scenarios listed in
Annex A of the OCD. This section should also describe the type of physical environment in which the system will be used to provide a reference
for determining the acceptance criteria for compliance against environmental engineering specifications.
The FFBD describing the process for developing section 1.1 is provided below in Figure B2:
[Figure B2: FFBD for developing TEMP section 1.1: describe the operational need (from OCD s.3.5), describe the missions (from the OCD annexes), and describe the environment.]
Within this section the key functionality of the system must be described in section 1.2.1. This should list the solution independent consolidated
functional needs listed in section 3.6 of the OCD and the solution dependent description of the system functionality and performance in section
5.5 of the OCD.
In section 1.2.2 a brief description of the system architecture, a description of the hardware and software to be used, a general description of the
components and sub-systems to be included for each configuration of the system and the internal and external interfaces of the system must be
included. This information should initially be sourced from section 5 of the OCD. This section will need to be updated as further information
becomes available, in particular after the receipt of the Contractor’s system specification and in the event of any accepted engineering change
proposals.
In section 1.2.3 a list of the critical system characteristics and unique support concepts that will require special verification or validation
requirements must be included. For instance, if the system threat assessment indicates that new countermeasures to the system being acquired are
likely to emerge before the system is accepted for service then it may be that new threat simulators are required. These characteristics are
potentially high-risk elements of the system, since they may impact greatly on the COIs, they may be difficult to verify or validate or they may
be difficult to support logistically. Some of these characteristics can be determined from the system characteristics that impact greatly on the
COIs and CTPs listed in sections 3.1 and 3.2 of the TCD respectively. The T&E support issues listed in section 3.3 of the TCD and the major
T&E activities listed in section 5.2 of the TCD may provide information as to which system characteristics will be difficult to verify or validate.
The currently unsupported requirements in the FPS (i.e. those requirements that are intended to address the capability gap) are also likely to
include unique system characteristics.
The FFBD describing the process for developing section 1.2 of the TEMP is provided below in Figure B3:
[Figure B3: FFBD for developing TEMP section 1.2: describe key functions, describe interfaces, and describe unique characteristics.]
Section 1.3 should contain a list of the COIs taken directly from section 3.1 of the TCD. It may be that as new threats are identified or new
requirements are added the COIs may change. This section of the TEMP will have to be reviewed when changes to the requirements are
approved.
Section 1.4 should refer to the Threat and Risk Assessment for the system and should contain a summary of the threat environment copied
directly from the Threat and Risk Assessment. The Threat and Risk Assessment should be the latest version from the DIO. There may be more
than one Threat and Risk assessment, including perhaps a TEMPEST Threat and Risk Assessment. In this situation, all of those Threat and Risk
Assessments that are relevant to describing the operational environment should be included in this section. This section is necessary because the
likely countermeasures to a system need to be considered when developing plans for mission system validation.
Section 1.5.2 should contain a list of the key operational suitability characteristics. This must consist of a hierarchy of measures including all of
the operational suitability related COIs (i.e. those COIs that relate to the suitability for service of the system), MOSs and MOPs.
Examples of generic effectiveness and suitability characteristics are provided in sections 5.3 and 3.2 of the V&V Manual.
Section 1.5.3 should contain a list of evaluation thresholds. These should be based on the thresholds identified in the FPS. The evaluation
thresholds should represent the level of performance to be demonstrated, corresponding to a level of risk (that the desired level of operational
effectiveness and operational suitability will not be met) that the members of the IPT are willing to accept.
Sections 1.5.1 – 1.5.3 can, alternatively, be represented in the form of a table listing the COI, MOE, MOP, evaluation threshold, validation
method and the CDD reference in a similar format to that in the example in Table B1 below. The validation method should be identified
according to the V&V methods listed in section 8.2 of the V&V Manual and the validation activities should be determined based on the activities
listed in section 5.2 of the TCD. Since the TCD contains only a basic list of V&V activities, further V&V activities will need to be identified in
the TEMP such that sufficient V&V activities are planned to address all of the COIs.
The highest priority performance parameters (Key Performance Parameters) should be marked as such. Typically a system will have no more
than 10 COIs, with a few MOEs per COI, a few MOPs per MOE and an evaluation threshold against each MOP.
A discussion of the identification of COIs, MOEs and MOPs is contained in section 9.2 of the V&V Manual and is also provided at the end of
this section.
The validation method should be included in the table along with details of the test activity at which compliance with the evaluation
threshold is expected. It may be that, due to time or budgetary constraints, there are different evaluation thresholds that the system's
performance will need to be judged against at different test activities. For instance, it may only be possible to test compliance with an evaluation
threshold in a very limited sense in a demonstration but it may be possible to evaluate compliance with the threshold fully in the Operational
Evaluation.
Table B1: An example of a table listing the Required Operational Characteristics for section 1.5 of the TEMP
The key technical effectiveness characteristics need to be defined in section 1.6. This must include the CTPs from section 3.2 of the TCD. The
CTPs may include critical inter-operability requirements (such as a requirement that the system be capable of being powered by a 240V 50Hz power supply)
and critical design constraint requirements from the FPS. The CTPs are any important technical characteristics that the system must
have in order for the operational need to be satisfied.
The CTPs must be described in the form of a table in the following format. An example is provided in Table B2 to demonstrate the concept:
Table B2: An example of a table listing the Key Technical Effectiveness Characteristics for section 1.6 of a TEMP
Critical technical parameter | CDD Reference | V&V activity in which compliance will be verified | Decision supported | Threshold value
CTP 1: Bore of the main tank gun | FPS para x | Inspection prior to Acceptance Test | System Acceptance | Must be 155mm±0.5mm
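By way of illustration only (a hypothetical Python sketch), verifying compliance with a toleranced CTP such as the one in Table B2 amounts to a simple band check on the measured value:

    def ctp_compliant(measured_mm: float, nominal_mm: float = 155.0,
                      tolerance_mm: float = 0.5) -> bool:
        # True if the measured bore falls within the CTP threshold band.
        return abs(measured_mm - nominal_mm) <= tolerance_mm

    print(ctp_compliant(155.3))  # True: within 155mm +/- 0.5mm
    print(ctp_compliant(154.2))  # False: outside the threshold band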
The process for producing sections 1.3 – 1.6 is described in the FFBD in Figure B4.
[Figure B4: FFBD for producing TEMP sections 1.3 – 1.6: list the COIs (from TCD s.3.1; TEMP s.1.5.1); derive MOEs and effectiveness MOPs using V&V Manual guidance; determine effectiveness thresholds (TEMP s.1.5.3); determine system threats from the Threat Assessment (TEMP s.1.5.2); and describe the CTPs (from TCD s.3.2).]
The process for producing the COIs, MOEs, MOSs, MOPs and CTPs
If the COIs are not listed in the TCD or the OCD it will be necessary to produce the COIs in consultation with the other members of the IPT
using the process described in Figure B5.
The MOEs should be derived in consultation with the other members of the IPT as per the process described in Figure B6.
[Figure B6: FFBD for deriving MOEs, including a loop (1.3.3.1) to determine all effectiveness measures measurable in one V&V activity.]
The Effectiveness MOPs should be derived in consultation with the other members of the IPT as per the process described in Figure B7.
The MOSs should be derived in consultation with the other members of the IPT as per the process described in Figure B8.
[Figure B8: FFBD for deriving MOSs, including a loop to determine all suitability measures measurable in one V&V activity.]
The Suitability MOPs should be derived in consultation with the other members of the IPT as per the process described in Figure B9.
[Figure B9: FFBD for deriving suitability MOPs: determine the most significant factors that influence each MOS, then determine objectively measurable parameters (MOPs) for those factors.]
The CTPs should be derived in consultation with the other members of the IPT by examining the ‘essential’ technical requirements from the
FPS. Consideration must also be given to how many ‘important’ requirements can fail before the system under test (SUT) is deemed to have
reached a critical failure level. The OCD should also be checked for possible additional CTPs that were not transferred into the FPS.
A short description of the project phases and the associated V&V phases should be contained in section 2.1. This should include a description of
the project phases, which should be taken from the Equipment Acquisition Strategy (EAS) and the project V&V phases, which should include all
of the phases described in section 5.2 of the TCD. This section will need to be updated if there are any major changes to the scope of any phases
of the project.
A description of the responsibilities of all stakeholders in relation to V&V should be contained in section 2.2. The major stakeholders whose
roles need to be described in this section will include as a minimum the DMO SPO, the Contractor, T&E Agency and the Sponsor. This should
describe the responsibilities of the stakeholders with respect to all of the V&V activities discussed in this TEMP. For instance, the role of the
Contractor in planning, conducting and reporting on the Acceptance Testing should be identified and the role of the DMO (in consultation with
the T&E Agencies if required) in reviewing and approving the ATP, ATProcs (including the acceptance criteria) and the ATR should be
identified. As the roles of stakeholders change this section will need to be updated. This section could be described in a table that clearly
identifies the responsibilities of the various stakeholders. Sources of information for developing this section are the details contained in section
4.1 of the TCD, the Contract, the System Review Guide, the ASDEFCON (SM) Handbook and consultation with the project stakeholders.
The responsibilities of the project stakeholders for the funding of the V&V activities should be discussed in section 2.3. This should be refined
from the TCD and should address the complete list of V&V activities discussed in section 1.5 of the TEMP. This section must be updated to
ensure that the priority of funding of various V&V activities is kept up to date. It is also important to ensure that the level of funding of V&V
activities is commensurate with the level of risk mitigation that the V&V activity will contribute, such that adequate resources will be devoted
to conducting V&V early in the project lifecycle. Advice on the cost of various T&E activities should be sought from the relevant T&E agencies
and the users. The agreement of the various stakeholders with respect to the level of funding that they are responsible for must be obtained prior
to producing this section.
Section 2.4 should contain an integrated schedule in the form of a Gantt Chart detailing the project phases and V&V phases (including
Validation phases such as OpEval, OA, IOT&E or FOT&E, AV&V phases, DT&E phases and other V&V phases). It should also indicate
relevant decision points, such as when Design Acceptance, System Acceptance and Acceptance Into Service should occur. This should also
contain details of funding and responsibilities of the project stakeholders with respect to funding. The decision points should include important
dates such as the various system reviews, contract signature and the In-Service Date. The major V&V activities identified in the tables in
sections 1.5 and 1.6 of the TEMP should also be identified in the integrated summary.
An FFBD describing the process for developing section 2 of the TEMP is described in Figure B10.
Section 3.1 should contain a list of the critical DT&E issues. This would include the critical issues listed in the TCD in section 3.2 that would be
best addressed in DT&E. The V&V methods used to address these critical issues should be discussed here. The discussion of critical DT&E
issues will cover how the achievement of the CTPs will be verified in terms of:
• what activities will be conducted to verify that the CTPs have been met;
• what the critical resource requirements are for these activities;
• what risks there may be to achieving these CTPs; and
• what will be done to mitigate these risks.
Section 3.2 should discuss the DT&E conducted to date. This will include the range of V&V activities described in section 8.2 of the V&V
Manual (i.e. audits, comparisons, inspections, demonstrations and analysis) as they relate to V&V throughout the development of the system.
This section should also provide a brief description of the functional configuration of the system when these activities were conducted. For
example, it may be mentioned that a particular DT&E activity was conducted using a particular version of a software package that is being
developed as a major component of the system. After the V&VP has been produced this section should refer to the V&VP for further details on
the DT&E conducted by the Contractor to date. Importantly, this section should also highlight any high priority deficiencies in the system with
respect to requirements (i.e. defects) identified so far and the methods proposed for rectifying these defects.
Section 3.3 should discuss the future DT&E activities to be conducted. After the V&VP has been produced this section should refer to the
V&VP for further details on the DT&E responsibilities of the Contractor.
Section 3.3.1 should contain a discussion of the objectives of each of the future DT&E phases for the project. This section should also contain a
diagram explaining the hierarchy of DT&E related test plans and procedures. The relationship between the outputs of these DT&E phases and
the decision points throughout the project should be described here.
Section 3.3.2 should contain a listing of the future DT&E activities along with details of the objectives and scope of each of these activities in
terms of what decisions they will support and what parts of the Acquisition Baseline they will verify compliance with. This section should also
provide a brief description of the functional configuration status of the system that is expected when these activities are to be conducted.
Section 3.3.3 should contain a list of the critical resource requirements for conducting these DT&E activities in terms of the personnel, funding
and equipment required. This section should also contain a discussion of how the risks associated with obtaining these resources will be
mitigated.
Section 3.3.4 should contain a discussion of the constraints and limitations that apply to the DT&E activities, how the limitations will affect the
ability to draw conclusions from the data and how these limitations may be overcome in other non-developmental V&V activities (e.g. Mission
System Validation).
Section 4.1 should contain a list of the critical validation issues (i.e. the COIs). The COIs derive from section 3.1 of the TCD. The
methods used to address both the effectiveness and suitability COIs should be discussed here. The main method for validation is OT&E. The
discussion of critical validation issues will cover how the achievement of the COIs will be validated in terms of what activities will be conducted
to verify that the COIs have been resolved, what the critical resource requirements are for these activities, what risks there may be to achieving
these COIs and what will be done to mitigate these risks. There may be a combination of validation methods used to resolve the COIs. A
discussion of why the particular validation methods proposed for resolving the COIs were deemed to be the most accurate, timely and cost
effective methods for resolving the COIs should be included in this section. This section should also highlight any high priority deficiencies in
the system with respect to validation (i.e. defects) identified so far and the methods proposed for rectifying these defects.
Section 4.2 should discuss the validation conducted to date. This discussion of validation conducted to date should include the OT&E activities
as well as the range of validation activities other than OT&E. This will include the range of V&V activities described in section 8.2 of the V&V
Manual (i.e. audits, inspections, demonstrations and analysis) as they relate to validation of the mission system and the validation of the support
system’s contribution to maintaining the operational suitability of the system. This section should also provide a brief description of the
functional configuration status of the system when these validation activities were conducted and the scenarios in which the validation was
conducted. For example, it may be mentioned that validation activity X was conducted using version or configuration Y of the system in
operational scenario Z described in the OCD. After the Validation Plan has been produced this section should refer to the Validation Plan for
further details on validation activities conducted to date. After the V&VP is produced this section should refer to the V&VP for further details
on the Contractor’s involvement in validation activities.
Section 4.3.1 should contain a discussion of the overall objectives of future validation phases for the project (e.g. Operational Evaluation,
FOT&E etc.). This section should also contain a diagram explaining the hierarchy of validation related test plans and procedures. The
relationship between the outputs of these validation phases and the decision points throughout the project should be described here.
Section 4.3.2 should contain a listing of the future validation activities along with details of the objectives and scope of each of these activities
in terms of what decisions they will support and what parts of the Capability Baseline they will validate the system against. This section should
also provide a brief description of the functional configuration status of the system that is expected when these activities are to be conducted.
Section 4.3.3 should contain a list of the critical resource requirements for conducting these validation activities in terms of the personnel,
funding and equipment required. This section should also contain a discussion of how the risks associated with obtaining these resources will be
mitigated.
Section 4.3.4 should contain a discussion of the constraints and limitations that apply to the validation activities and how the limitations will
affect the ability to draw conclusions from the data. It should also discuss how these limitations may be overcome in other V&V activities or
whether the Capability Manager would be willing to accept the risk resulting from the uncertainties in the results of the validation activities due
to the limitations of the validation activities.
Section 5.1 should contain a list of the critical AV&V issues (i.e. the highest priority acceptance criteria). This would include the critical issues
listed in the TCD in section 3.2 that would be best addressed in AV&V. The V&V methods used to address the high priority acceptance criteria
should be discussed here. The main method for AV&V is Acceptance Testing. The discussion of critical AV&V issues will need to include a
discussion of how the achievement of the acceptance criteria will be verified. This will include a list of AV&V activities that will be conducted
to verify that the acceptance criteria have been met, what the critical resource requirements are for these activities, what risks may prevent
meeting the acceptance criteria and what will be done to mitigate these risks. There may be a combination of AV&V methods used to verify that
the acceptance criteria have been met. A discussion of why the particular V&V methods proposed for verifying that the acceptance criteria have
been met were deemed to be the most accurate, timely and cost effective methods for verifying that the acceptance criteria have been met should
be included in this section. Importantly, this section should also highlight any high priority deficiencies in the system with respect to acceptance
(i.e. defects) identified so far and the methods proposed for rectifying these defects.
Section 5.2 should discuss the AV&V conducted to date. This should include the Acceptance Testing as well as the range of other V&V
activities needed to ensure compliance with the requirements listed in the FPS and the acceptance criteria. This will include any mission system
and support system validation activities that the Contractor has been contracted to perform. This may include the range of V&V activities
described in section 8.2 of this manual (i.e. audits, comparisons, inspections, demonstrations and analysis) as they relate to validation of the
mission system and the validation of the support system’s contribution to maintaining the operational suitability of the system. This section
should also provide a brief description of the functional configuration status of the system when these activities were conducted and the
scenarios in which the validation was conducted. After the V&VP has been produced this section should refer to the V&VP for further details on
the Contractor’s involvement in validation activities. This section should also refer to the ATP, ATProcs, ATR and VCRM for further details on
AV&V conducted to date. Whether the DMO has reviewed and approved the Contractor’s V&VP, ATP, ATProcs, ATR and VCRM should also be
mentioned in this section.
Section 5.3.1 should contain a discussion of the overall objectives of future AV&V activities for the project (e.g. Acceptance Testing). This
section should also contain a diagram explaining the hierarchy of AV&V related test plans and procedures. The relationship between the outputs
of these AV&V activities and the decision points throughout the project should be described here.
Section 5.3.2 should contain a listing of the future AV&V activities along with details of the objectives and scope of each of these activities in
terms of what decisions they will support and what parts of the Acquisition Baseline they will verify compliance with. This section should also
provide a brief description of the functional configuration status expected of the system when these activities are to be conducted.
Section 5.3.3 should contain a list of the critical resource requirements for conducting these AV&V activities in terms of the personnel, funding
and equipment required. This section should also contain a discussion of how the risks associated with obtaining these resources will be
mitigated.
Section 5.3.4 should contain a discussion of the constraints and limitations that apply to the AV&V activities and how the limitations will affect
the ability to draw conclusions from the data. It should also discuss how these limitations may be overcome in other V&V activities, such as
Defence-run validation activities, or whether the Capability Manager would be willing to accept the residual risk resulting from the uncertainties
in the results of the AV&V activities due to the limitations of the AV&V activities.
Section 6 – Safety
Section 6.1 should describe how safety issues will be assessed. Under ASDEFCON (SM) the Contractor would be expected to produce a System
Safety Program Plan (SSPP) and a Safety Case Report (SCR) as per the ASDEFCON (SM) DIDs. This section will refer to the SSPP and SCR
and provide a summary of the V&V activities that need to be conducted in order to assure the Capability Manager that the residual safety risks
associated with use of the system are acceptable. The standard that the Contractor will use for assessing safety risks and for safety assurance
(i.e. mitigation of unacceptable safety risks) must be specified in this section of the TEMP.
Section 6.2 should list the highest risk safety issues, as listed in the Safety Case Report. It should also outline the schedule for the verification
activities that will ensure that the highest priority risks will be mitigated to an extent such that the Capability Manager would deem the residual
risk acceptable. The status of safety assessment activities and the status of verification of System Safety Requirements should be updated in this
section after each major safety assessment or safety verification activity to reflect the current status of the safety assurance program.
Section 6.3 should describe the methods to be employed and the standards to be complied with for ensuring safety within V&V activities with
particular emphasis placed on the Mission System Validation activities.
Refer to the Safety thread in QEMS (http://qems.dcb.defence.gov.au) for further information on safety.
Section 7.1 should contain a list of the Specialty Test Program requirements.
The Specialty Test Program includes test and evaluation to verify compliance with requirements pertaining to specialty engineering fields.
This includes verification of requirements in relation to system security, standardisation (e.g. compliance with various safety standards), human
factors engineering, electromagnetic interference and electromagnetic compatibility (EMI/EMC), environmental engineering, vulnerability to
sophisticated threats such as EW systems and compliance with frequency spectrum restrictions.
Section 7.2 should contain a list of critical issues that have been resolved within the Specialty Test Program and the critical issues expected to be
resolved within the remaining parts of the Specialty Test Program. This section should also describe any major system defects with respect to
the specialty engineering fields relating to the system and what will be done to rectify these defects.
This section should contain an overview of the objectives of Supportability Verification and Validation (SV&V), provide a summary of SV&V
activities conducted to date and SV&V activities to be conducted in the future. This section should describe the hierarchy of SV&V plans and
an overview of the SV&V process. This section should also refer to the source in the FPS and OCD for SV&V requirements. This section
should also refer to the SV&VP for further details.
This section should describe the V&V evidence required for Acceptance Into Service and should describe the relationship between the TEMP
and the Transition Plan. It should list the schedule for conducting the major V&V activities that need to be completed to achieve the ISD
(especially the IOT&E activities). This section should describe and list the status of any risks to the successful completion of the test activities
required to achieve AIS. This section should be updated if the scope of the IOT&E activities that need to be completed to achieve AIS changes.
Section 10.1 should contain a schedule of V&V activities that have special resource requirements. These are resources that may be expensive or
difficult to obtain.
Section 10.2 should contain the details of the special resource requirements in terms of facilities, GFM and special instrumentation for each of
the V&V activities listed in section 10.1. This may include targets, threat simulations or other simulations and models, ranges or other T&E
facilities and other special support.
[Figure: relationship between verification and validation: user requirements are validated against the operational capability, while verification is performed at successive lower levels of the system hierarchy.]
12. Feasibility: The requirements must be assessed to determine whether they can be
implemented with the available technology and within other constraints.
13. Validity of Models: Part of the RB may include one or more system models, such
as data-flow models of the system’s functionality, object models, event models
etc., which must be checked for internal and external consistency. All models
must contain all necessary information, with no conflicts between the parts of a
model and no conflicts between different models (a minimal consistency check is
sketched below).
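A minimal sketch of one such inter-model consistency check (Python, with invented event names; illustrative only): every event referenced by the functional model should be defined in the event model.

    # Invented example: events referenced by the data-flow model versus
    # events defined in the event model.
    functional_model_events = {"TrackInitiated", "TrackDropped", "EngageOrder"}
    event_model_events = {"TrackInitiated", "TrackDropped"}

    undefined = functional_model_events - event_model_events
    if undefined:
        print("Inter-model conflict - events not defined in the event model:",
              sorted(undefined))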
Applying the above concepts to the CDD process, RB Validation can be defined as
the activities that evaluate the RB to ensure compliance with Sponsor expectations,
project and organisational constraints and external constraints. Two major iterations
of the RB will be the completed OCD and the FPS, although there may be a number
of iterations of these documents prior to their completion. RB Validation should be
performed at the conclusion of any Requirements Analysis activities within the CDD
process and most importantly at the completion of any major CDD document. The
inputs to the RB Validation process are the RB, higher-level requirements statements,
relevant organisational standards, and organisational and domain knowledge. The
outputs of the RB Validation process are a list of problems of the current RB
document (such as variances and conflicts), an agreed list of actions to rectify the
problems (including iterating through Requirements Analysis), and ultimately, a
validated RB document.