Crusader Plan
November 1998
The Program/Project Manager’s signatures represent an understanding that, as the plans mature and are scheduled and implemented, specific responsibilities, funding, and schedule will be subject to formal negotiation.
Distribution A: Approved for Public Release
COORDINATION PAGE
Executive Summary
The Crusader weapon system development approach follows a streamlined acquisition strategy
that emphasizes a number of “flagship” initiatives such as Integrated Product Teams (IPTs), flexible
program phasing, and Simulation Based Acquisition (SBA) to control schedule, cost, and risk.
Principal among these initiatives is the SBA approach. This approach assumes the use of modeling
and simulation (M&S) as a fundamental element of all aspects of Crusader development, from
requirements generation and trade-off analyses to design engineering, test and evaluation, development
of operational doctrine, and ultimately to development of efficient training systems and system models
that can be upgraded to support operations and modifications across the program’s life cycle. The
Crusader strategy strives to fully leverage M&S to achieve efficiencies in schedule and assets and
reduce technical development and performance risks. At the same time, the strategy strives to
maximize the efficiency with which the M&S tools, techniques, and platforms are applied and shared
across program functional areas and development elements, and reused across program phases.
In order to implement the SBA approach fully, the Crusader program has initiated a System
Simulation Development (SSD) process that provides for early, evolutionary, and iterative design and
integration, coupled with the use of contractor or government constructive and virtual models to
ensure that design evolution is driven by, and consistent with, appropriate performance/cost trade-offs
and evolving operational doctrine. It has also initiated a simulation support coordination process to
encourage interaction and leveraging of M&S-related expertise and assets between the developer and
other government organizations.
Early M&S-based design reduces timelines and cost by allowing logical errors and
requirements ambiguities or conflicts to be caught before the design is implemented and tested. As
simulations and design mature, the modeling allows greater flexibility to optimize and refine designs
while minimizing reliance on costly hardware-in-the-loop processes. Early design models and
simulations evolve into products used during the integration process, and eventually into an integrated
life-cycle model that supports the program as Crusader matures.
As models and simulations are accredited, they will be used to augment the test and evaluation
process. M&S will aid in the creation of realistic test scenarios. Environments planned to be used in
testing will be used in simulations first to verify that the planned test conditions and environmental
states can be met with sufficient realism, and within the bounds of safe operating conditions;
consequently, the potential for unexpected and dangerous test outcomes will be reduced. It is expected
that leveraging M&S will ultimately result in reduced field test assets, resources, test iterations, and
overall duration of testing. Using M&S will allow evaluation of conditions for which tests may be
difficult, if not impossible, to run due to limited test resources, environmental restrictions, and possible
safety or loss-of-asset issues. M&S also extends what is known about the performance of the system
beyond strictly defined test scenarios into areas where it may be impractical to test or demonstrate
system performance.
M&S are also crucial to the development and refinement of combat development issues such as
Tactics, Techniques, and Procedures (TTPs), and for the design and development of trainers. The
U.S. Army Training and Doctrine Command System Manager, Cannon and the Depth and
Simultaneous Attack (D&SA) Battle Lab are teaming to provide simulation capabilities to support
development and training of Crusader. The process requires constructive simulations and virtual
simulator drivers along with live soldier interaction to provide testing capabilities on a synthetic
battlefield. The interaction is transparent to the soldier, allowing realistic assessment of techniques
employed, and a subsequent capability for virtual training.
[Chart not reproduced: program timeline, CY/FY 1995-2007, spanning Dem/Val (Phases 1 and 2), EMD (Phase 3), LRIP, and Full Scale Production, with program milestones (SFR, TRR, MS II, EMD Continuation IPR, LRIP, MS III, FUE), integration and checkout activities, safety certification, and test programs; risks are assessed as moderate to low at MS II and low at the LRIP IPR.]
Figure 1: Crusader Acquisition Strategy
An overview of functional area M&S activities supporting the acquisition strategy is shown in
Figure 1. The heavy reliance that the Crusader development process places on efficient and effective
use of M&S across functional areas requires a parallel process that ensures adequate attention is given
to the validity of the tools being utilized. PM Crusader is establishing a validation management
process to provide early identification of model validation issues and required actions to ensure that all
models are appropriate for their use within the Crusader development life cycle.
TABLE OF CONTENTS
I. PURPOSE.........................................................................................................................................1
II. SYSTEM DESCRIPTION................................................................................................................1
III. PROGRAM ACQUISITION STRATEGY......................................................................................2
IV. PROGRAM SIMULATION APPROACH/STRATEGY.................................................................3
IV.A. M&S Support to Engineering Design........................................................................................5
IV.B. M&S Support of Crusader T&E.............................................................................................10
IV.B.1 Introduction.........................................................................................................................10
IV.B.2 Model and Simulation Support for Milestone II.................................................................13
IV.B.3 M&S Support to Test and Evaluation Events.....................................................................14
IV.B.4 M&S Support for Supplemental Tests................................................................................16
IV.B.5 M&S Applied to System Software Test Plans (SSTP).......................................................17
IV.C. M&S in Combat Developments...............................................................................................19
IV.D. M&S To Support Training.......................................................................................................21
V. MANAGEMENT............................................................................................................................24
VI. FACILITIES and EQUIPMENT....................................................................................................25
VI.A. SIF............................................................................................................................................25
VI.B. Other Facilities........................................................................................................................28
VII. FUNDING...................................................................................................................................30
VIII. VERIFICATION, VALIDATION, & ACCREDITATION.......................................................30
IX. REMARKS/SUPPLEMENTAL INFORMATION.........................................................................31
TABLE OF FIGURES
TABLE OF APPENDICES
I. PURPOSE
The purpose of the Simulation Support Plan (SSP) is to outline the major Modeling and
Simulation (M&S) activities that are supporting or will support the development, acquisition, and
fielding of the Crusader advanced field artillery system. These activities include development and use
of M&S that will affect decisions in the acquisition process. The acquisition strategy for the Crusader
system allows the prime contractor, United Defense, Limited Partnership (UDLP), to choose actual
M&S implementations for the system development. This plan identifies the UDLP approach, and
identifies a process by which the Office of the Program Manager (OPM) and other government
agencies are able to influence, subject to contractual constraints, the planning and utilization of
Crusader M&S capabilities. It also identifies a process by which the contractor personnel can interact
with the government to obtain and utilize existing M&S tools in lieu of developing their own from
scratch. The plan identifies M&S that the government will perform independent of the contractor, but
emphasizes ways in which the government can capitalize on contractor efforts to maximize the utility
of its activities. The plan places emphasis on activities leading to the milestone II (MSII) decision to
enter the Engineering and Manufacturing Development (EMD) phase, and describes continuation of
the M&S approach throughout the program’s acquisition cycle. The SSP is considered by OPM
Crusader to be a living management tool that both articulates the overall M&S strategy and serves to
foster interaction between the contractor, OPM, and other government agencies via a System
Simulation Coordination Group (SSCG). The SSCG handles issues related to actual tool utilization,
and verification, validation, and accreditation (VV&A). As the program progresses, the SSCG will
serve as a filter to ensure that external developing capabilities are appropriately considered for
implementation to support the Crusader program.
II. SYSTEM DESCRIPTION
The RSV will be a self-propelled vehicle designed to support the SPH on the battlefield. The
RSV and the SPH will share a chassis with common components. Commonality between the two
vehicles will be designed to the maximum extent practicable, in accordance with cost effectiveness.
The RSV system can be described as having a chassis or mobility platform; a crew cab or
compartment; and a compartmentalized payload or mission compartment containing the ammunition,
MACS propellant, and resupply handling and transfer equipment. Vetronics will be allocated to the
appropriate hardware systems and subsystems in the vehicle. The crew compartment,
compartmentalized from the ammunition and propellant, will have communications, displays and
controls for all functions. The RSV will be able to communicate with the SPH, Platoon Operations
Center (POC), and organic battalion operational or logistics elements to exchange information and
receive mission orders.
[Chart not reproduced: test planning timeline, CY/FY 1995-2001, spanning RACM, PDRR, and EMD, showing software test plan/procedure development, safety release updates, detailed test planning for the SPH and RSV, EDT/EUE/EUT planning, hardware/software fixture and instrumentation development, the RSV(-) evaluation period, thermal hardstand cannon firings at YPG, software FQT, EDT/reliability growth testing, and combined CEP 1-3 and EUT testing with SPH1 and SPH2 at YPG.]
Consistent with this philosophy, Crusader has planned a phased MSII in which limited funding
is provided at the first decision point (Sep 2000) to begin a limited build of EMD prototypes, with a
follow-on decision to award the remainder of the EMD contract when Early User Tests (EUT), MSII
assessments, and documentation are finalized (Apr 2001). This approach minimizes customer risk
while maintaining concurrent design and evaluation activities that will keep the schedule for FUE on
track.
[Diagram not reproduced: goal of maximizing utilization of M&S tools, with overlap across engineering, development test and evaluation, combat development, and training.]
[Diagram not reproduced: the System Simulation Development (SSD) spectrum, from low fidelity models (solid modeling, screen prototyping) to high fidelity models.]
The engineering development process requires activities that both directly and indirectly
support system integration. System modeling activities that indirectly support integration include
static and dynamic solid modeling, functional modeling, effectiveness modeling, and servo modeling.
- Solid modeling is used to estimate space claims for physical partitioning and mechanical integration, resolve interference issues, and locate space for new equipment. CAD/CAM master models, such as Pro-E, are upgraded as subsystems are developed and refined by periodically incorporating higher fidelity CAD assembly and part models. Dynamic solid models check for interference caused by mechanical motion. Animation allows complex designs to be iterated prior to committing to fabrication and assembly.
- Functional models examine functional elements such as thermal responses, gunfire dynamics, and communication flow to establish expected system responses.
- Force- and system-level effectiveness models indirectly support engineering design efforts by validating design requirements against performance in battlefield situations and by helping to trade design parameters based on prioritized performance outcomes.
- Servo modeling provides a basis for higher fidelity models.
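As a concrete illustration, a functional model in the sense used here can be as simple as a closed-form first-order response curve; the ambient temperature, steady state, and time constant below are illustrative placeholders, not Crusader design values.

```python
import math

# Minimal sketch of a functional model: a first-order thermal response to a
# step heat input, of the kind used to establish expected system responses.
# All numeric values are illustrative placeholders, not Crusader parameters.

def thermal_response(t, t_ambient=20.0, t_steady=120.0, tau=30.0):
    """Temperature (deg C) at time t seconds for a first-order lag."""
    return t_steady - (t_steady - t_ambient) * math.exp(-t / tau)

# The response starts at ambient and asymptotically approaches steady state,
# giving an expected-response curve against which designs can be compared.
curve = [round(thermal_response(t), 1) for t in (0, 30, 60, 300)]
```

A design alternative's measured thermal behavior can then be checked against such an expected-response curve before committing to hardware.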
Activities that directly support integration fall into two broad categories spanning the Crusader
development life cycle: rapid prototyping models (Object Modeling and Visual Prototypes) and SES
models. Integration support models are accessed and managed via a CSF that provides the foundation
for the Crusader M&S infrastructure.
Rapid Prototyping Models:
The rapid prototyping process may be referred to as a “build-a-little, test-a-little” philosophy.
M&S will be used as intermediate integration tools by taking the place of unavailable hardware and
software. Early models and simulations will migrate from workstations to benchtop electronics
located in the Crusader SIF. The CSF allows developers to evaluate models within either a
workstation environment or the target electronics environment. It provides an interface for correct
input/output (I/O) functionality on workstations, which, in turn, allows preliminary unit tests to be
accomplished at the workstation level.
Object Modeling: An object-oriented design model, or object architecture, provides a way to
map the logical Crusader architecture to the physical Crusader architecture. The DOM is a set
of high level system simulations that captures sequencing, timing, and synchronization
associated with the execution of Crusader functions.
Visual Prototyping: Computer imaging capabilities allow visualization to be implemented as a
fundamental part of the design engineering process. During concept and design phases, the use
of visual feedback enables design alternatives to be rapidly explored. It fosters identification
of potential design problems, and serves to provide communication of total system design
issues to subsystem and element designers to foster designing to the system rather than
designing just to subsystem specifications. Visual prototyping activities establish a set of
common graphics and utilities to be used for out-the-window views, DIS capabilities, and
animation of vehicle structures, and will directly support the development of a virtual crew
station.
Virtual Crew Station: The virtual crew station is a reference model that provides man-in-the-
loop simulation depicting how the crew interfaces with the Crusader vehicles. With logical
behavior provided by the DOM, the virtual crew station provides the Crusader program with a
reference model that can be used to refine the functional design and development of the tactical
crew station’s graphical user interface, onboard training capabilities, decision aids and
procedures. It will also serve to aid institutional training development, provide a platform for
computer-based training, and support the SES process as a crew station stimulator.
SES Models:
Integration of the Crusader system employs an incremental philosophy that emphasizes the
identification and resolution of equipment performance and interface difficulties as early as possible in
order to mitigate program cost and schedule impacts downstream. The SES process allows iterative
derivation and integration of hardware, software, and simulations across the Crusader program. The
major SES terms are defined below.
Target: Hardware/Software that is ideally in full compliance with specifications and is
designated for use in the end product, but, at a minimum, is at least a “benchtop” version.
Simulation: Simulation duplicates the functional characteristics of system hardware and
software. A simulator reproduces, under test conditions, results that are likely to occur during
the actual operation of the target system.
Emulation: Emulation duplicates the behavior of electrical interfaces between subsystems.
This emulator duplicates the interfaces to Crusader Standard Electronics, focusing on physical
subsystem behavior but not CPU functionality.
Stimulation: Interaction with functional and electrical interfaces to target hardware. A
stimulator to a given subsystem can be provided by a simulation of another subsystem.
As an example, a command, control, and communication (C3) simulation could provide the
dynamic behaviors of C3 as a stimulant to an armament emulation. The responses of the armament
emulation can then be assessed for compliance with expected behavior, and in turn serve as a stimulant
for other subsystem emulators.
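The stimulation chain above can be sketched in code. The message names and fields here are hypothetical stand-ins, not actual Crusader interface definitions.

```python
# Sketch of the stimulation chain described above: a C3 simulation drives an
# armament emulator, and the emulator's response is checked against expected
# behavior. Message names and fields are hypothetical illustrations only.

class C3Simulation:
    """Duplicates C3 functional behavior well enough to stimulate another subsystem."""
    def fire_mission(self):
        return {"msg": "FIRE_MISSION", "azimuth_mils": 3200, "rounds": 2}

class ArmamentEmulator:
    """Duplicates the armament subsystem's interface behavior (not its CPU)."""
    def handle(self, msg):
        if msg["msg"] == "FIRE_MISSION":
            # This reply can in turn stimulate other subsystem emulators.
            return {"msg": "MISSION_ACK", "rounds_queued": msg["rounds"]}
        return {"msg": "REJECT"}

response = ArmamentEmulator().handle(C3Simulation().fire_mission())
assert response == {"msg": "MISSION_ACK", "rounds_queued": 2}
```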
The SES approach consists of three essential stages of development:
1. Mechanical, electrical, and software behavior are simulated through the use of dynamic
models.
2. Hardware interfaces are emulated by dynamic models running on a real-time I/O controller.
3. Target hardware replaces the simulations and emulations as it becomes available.
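These three stages can be pictured as progressively swapping what occupies each subsystem's slot in the test environment; the subsystem name and stage labels below are illustrative, not program nomenclature.

```python
# Sketch of the three SES stages: each subsystem slot in the test environment
# is filled first by a dynamic-model simulation, then by a real-time interface
# emulator, and finally by target hardware as it becomes available.

class Testbed:
    def __init__(self):
        self.history = {}  # subsystem name -> stages installed, in order

    def install(self, name, stage):
        self.history.setdefault(name, []).append(stage)

    def current(self, name):
        return self.history[name][-1]

bed = Testbed()
bed.install("cannon_servo", "simulation")  # stage 1: dynamic model
bed.install("cannon_servo", "emulator")    # stage 2: real-time I/O emulation
bed.install("cannon_servo", "target_hw")   # stage 3: target hardware
assert bed.current("cannon_servo") == "target_hw"
```

The history kept per slot mirrors how the SES process leaves legacy simulations and emulations behind for later reuse.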
The SES process establishes a set of guideline characteristics to implement during detailed
design. It allows the transition of hardware and software from design to integration as quickly and
cost-effectively as possible. This integration supports development early by allowing functional teams
to analyze, study, and improve products in a fully functional system environment before committing to
a hardware or software build.
The primary benefits of the incremental SES approach can be summarized as:
- Simulation-based integration minimizes cost, schedule, and overall risk.
- The SES approach provides a flexible, reconfigurable, and expandable test environment.
- Design iteration costs are reduced by early detection and resolution of specification and interface problems.
- The resulting legacy simulations and emulations (the Integrated Crusader Emulator) provide a platform for system evaluation, future evaluation of proposed modifications, and troubleshooting of the fielded product.
To obtain the greatest benefit from the approach, SES follows an incremental process outlined
in Figure 5. The steps identified in the figure are described below:
Step 1 represents a high-fidelity non-real-time workstation simulation of the particular element
under test. The functionality of other elements is represented by a low-fidelity simulation that
includes only those functions that are required to interface with the element under test.
Step 2 migrates the simulated element software to the target central processing unit (CPU).
Step 2a represents the development of the servocontrol algorithms and is shown as a separate
development path due to its unique nature. For this step, the simulated load under servocontrol
is replaced by test stand hardware. The servocontrol algorithms are executed in real-time on a
rapid prototyping system (e.g., AC-100 or ADI RTS), which facilitates algorithm development
and testing via a flexible GUI and built-in interfaces to servohardware.
[Figure 5 diagram not reproduced: the incremental SES flow from workstation product design (high and low fidelity simulations) through migration to the target CPU, servocontrol development on teststands, incremental integration of I/O emulators with the target electronics (Step 3), incremental replacement of emulators with target hardware (Step 4), mechanical integration, and upgrade of external stimulators, culminating in the Integrated Crusader Emulator.]
Step 5 physically integrates all mechanical hardware along with the software at the segment level.
Step 6 combines the integrated hardware/software product from Step 5 with all necessary I/O
emulators and simulations to form the system-level emulation known as the ICE.
Formal software qualification and test is a parallel process to hardware integration. The SES
also supports the software qualification and test by providing a real-time high fidelity emulation of the
subsystem’s hardware, real-time low fidelity emulations of other subsystems, and a common real-time
simulated environment controlled by a scripted scenario. The resulting qualified onboard software
becomes available to support other subsystems’ validation testing. When the vehicles are completely
integrated, and all simulations have been replaced with tactical software and hardware, the SES
process is completed.
The ICE is the residual product of the SES process. The ICE collects all simulations,
emulations, and stimulations to support design evolutions beyond MSII, support Test and Evaluation
(T&E) activities through MSIII, and enable post-deployment maintenance activities to continue. The
ICE serves as the Crusader end-to-end model; it has the capability to simulate the entire system and
permit analysis of details down to the subsystem level.
IV.B. M&S SUPPORT OF CRUSADER T&E
IV.B.1 Introduction
The primary objective of T&E in the Crusader program is to verify that requirements are being
met throughout the Crusader development cycle. Since the program is in early PDRR, T&E planning and the supporting role of M&S are currently being defined in detail for PDRR, but less so for the following phases. The use of M&S in support of PDRR T&E will therefore be emphasized below, with more general strategy provided for the EMD and LRIP development phases. The objectives of M&S support
for testing are:
1. to facilitate the integration and test of Crusader hardware and software products at the subsystem,
element, and segment levels;
2. to enable item checkout and debug activities at component and lower levels;
3. to support item acceptance, software test, and hardware test efforts;
4. to provide an infrastructure for integration and interoperability of external models and simulations
used in system level testing and analysis.
Figure 6 depicts the overall approach to incremental system development, integration, and
testing. It also provides a representation of the use of M&S in testing utilizing the SIF at UDLP. The
SIF will be a shared facility used by developers and integrators, as well as by the test community.
Items that are part of delivered system products are white objects. Shaded objects are simulations,
prototypes, or models of items in a product. The interaction of M&S activities with product
development, integration, and test efforts is illustrated, from initial concept through final segment and
system test.
The process starts with models of product items as specified in Product Data Sheets. As
products are developed, the simulation models are updated and validated so that they accurately
represent the behavior of those items. Updated models are then used to test out functionality and
compatibility within the system. After successful completion of this step, the models are replaced by
target hardware and software. The target items are then tested against the integrated system
simulation. This process continues throughout all levels of integration and test until complete products
have been tested in both the simulated system and the actual system.
The incremental integration and test process for the SPH and RSV will be implemented in six
essential steps, each of which requires the use of models and simulations:
- Establish framework
- Develop element subsystem simulations
- Integrate element simulations within the SIF
- Integrate and test element products within the SIF (hardware and software)
- Integrate segment simulations within the SIF
- Integrate and test segment products within the SIF
For testing at the system level, an ICE will be employed by the SIF as defined above. This will
allow interfacing of segments for the purpose of conducting system-level testing using a realistic
system-level simulation.
[Figure 6 diagram not reproduced: incremental system development, integration, assembly, and test activities in the SIF, proceeding from component development and integration through subsystem, element, and segment integration, with simulations, prototypes, and models standing in for software, electrical, and mechanical items as needed.]
Modeling and Simulation (M&S) supports Crusader T&E activities in several important ways:
- Assists in the formulation of Master Test Plan (MTP) test definitions
- Mitigates risk in tests that could compromise program assets, by allowing simulation of conditions that would be hazardous in a live testing situation
- Allows examination of physically untestable requirements
- Increases capability for test automation
- Provides essentially unlimited capability for repeating simulated test events
- Provides cost-effective reuse of simulations developed for other programs or other areas of the Crusader development arena
- Provides very efficient test setup
- Provides very effective control of scenarios and testing parameters
- Performs pre-test modeling to project likely test outcomes
- Supports post-test analysis efforts
- Supports post-test reporting efforts
- Shows the probability that critical requirements can be met during EMD
- Supports software testing for safety release
- Provides opportunities to reduce the number of tests.
The immediate objective of using M&S to support testing is to aid in the development of
efficient and smart test design. M&S will be used by test personnel to create realistic developmental
and operational test scenarios and will improve the test and evaluation planning process. Simulated
test environments will be used to verify that the test conditions and environmental states can be met
with sufficient realism, and within the bounds of safe operating conditions. An equally important role
of M&S, given sufficient confidence in the models or simulations, is for M&S to be used to augment,
or even in lieu of, testing in order to verify requirements are being met. In concert with PM
Crusader’s desire to maximize the process of simulation-based acquisition, the entire test community is
expected to consider all potential uses of M&S for this purpose. Simulation vs. actual testing will
therefore be considered and evaluated for cost-effectiveness and risk of acceptance over the planned
life cycle for all test activities.
Consideration of M&S for support to testing will begin early in the process, as T&E objectives
are first laid out. It is expected that leveraging M&S will ultimately reduce the field test assets and resources required, the number of test iterations, and the overall duration of testing. Using M&S will
allow evaluation of conditions that may be difficult, if not impossible, to test due to limited test
resources, environmental restrictions, and possible safety or loss-of-asset issues. In addition, M&S
will support the data management process by providing synthetic data to exercise the test analysis and
reporting systems. Early evaluation is supported by virtual prototypes, which allow operational test personnel to conduct early operational assessments in multiple threat environments. These synthetic
environments will permit evaluation in environments not easily achievable in actual tests due to safety
or resource constraints.
M&S will extend the usefulness of field test data by exploring and identifying questionable
areas as well as improving the leveraging of test data between developmental and operational testing.
M&S will enhance the sharing of information between system engineers, designers, software
engineers, test engineers, logistics engineers, and users. Eventually, virtual representations of the
manufacturing process will be used to examine how the manufacturing process will adapt as the
Crusader prototypes are changed.
The purpose of supporting MTP development is to establish and maintain a comprehensive
understanding of operational testing requirements with an emphasis on Concept Evaluation Plan
(CEP), Engineering Development Testing (EDT), Early User Testing (EUT), and Operational Testing
(OT). The use of the M&S environment for requirement verification will be applied wherever
possible. Simulation modeling will support test planning decisions. These same models will be used
by developers to create representations of Army scheduled testing, and to predict expected test results.
Common mission scenario and non-Crusader specific data and simulations will be identified
early in the MTP development process. The government and the Crusader team will use the M&S
coordination process to identify new M&S needs, as well as potential sources. The MTP volumes
developed for element, subsystem and component testing within the Product Development Teams
(PDTs) will identify models to be acquired and/or developed as needed (many may be byproducts of
development analysis models). The MTP volumes developed for segment testing will similarly
identify the models needed for testing, but will additionally be able to draw upon the models used for
element level testing. The MTP volume for system test will follow the same approach. The models
initially identified in the Requirements Translation Model (RTM) database for requirements and the
initial list of general models/sources provide a useful guide for the test plan developers. The
evaluation of models for applicability and cost effectiveness for testing will be performed by the test
plan developer. It will be important to catalogue and control the simulation models used for testing
and/or requirement verification. The MTP will maintain a list of the general models and tools
planned for use in testing. The MTP will also include appendices that
associate the general models to specific requirements. The use of a general model for a specific
requirement may necessitate a unique configuration or interpretation of the model, which will be
tracked via the model identification. Each MTP volume will have a unique table of model
identifications that shows the relationship of M&S to the requirements addressed within that volume.
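The table-of-model-identifications concept described above can be sketched in code. The entries below are notional placeholders rather than actual Crusader model or requirement identifiers; the sketch only illustrates associating a general model, its program-specific configuration, and the requirements it addresses:

```python
# Notional sketch of an MTP model-identification table. Each entry ties a
# model identification to its general model, the unique configuration or
# interpretation used for Crusader, and the requirements it addresses.
MODEL_TABLE = [
    # (model identification, general model, configuration, requirements)
    ("TAFSM-01", "TAFSM", "Crusader battalion scenario", ["REQ-RATE-001", "REQ-RANGE-002"]),
    ("JLINK-01", "J-Link", "DIS-compatible terrain set", ["REQ-C2-004"]),
]

def models_for_requirement(req_id):
    """Return the model identifications associated with a requirement."""
    return [mid for mid, _model, _cfg, reqs in MODEL_TABLE if req_id in reqs]

print(models_for_requirement("REQ-RATE-001"))  # ['TAFSM-01']
```

Inverting the lookup (requirements per model identification) yields the per-volume appendix described in the text.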
Pre-test modeling is conducted to derive "predicted" performance measures prior to live events.
Pre-test M&S will aid test planning by providing insight into data requirements, fidelity, and data
reduction before costly and oftentimes unique live test events. Additionally, pre-test modeling
establishes a model or simulation baseline (data set) that, when compared to actual test results, permits
model "calibration", in turn improving modeling capability for future use. Testing-related model
calibration will provide improved predictive capability for all performance measures. Test exit criteria
will be modeled prior to any government test, providing Team Crusader with early insight into high-
risk performance areas for the test.
When it is required, post-test modeling will serve to extrapolate observed test events and
results into additional test environments not represented during limited government testing. Post-test
modeling will be used to determine whether Crusader performance is within predicted tolerance limits
and determine likely cause(s) of requirements non-compliance, if any. The program will use
calibrated models to create a representation of operational activities/events for design decision support
as well as requirements verification.
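The pre-test/post-test comparison described above can be sketched as follows. The performance measure, values, and tolerance are notional placeholders, and the "calibration" is reduced to a simple multiplicative correction purely for illustration:

```python
# Sketch of comparing a model's pre-test prediction with an observed live
# test result: check whether the observation falls inside the predicted
# tolerance band, and derive a correction factor for future model runs.

def within_tolerance(predicted, observed, tolerance):
    """True if the observed result lies inside predicted +/- tolerance."""
    return abs(observed - predicted) <= tolerance

def calibration_factor(predicted, observed):
    """Multiplicative correction applied to future model predictions."""
    return observed / predicted

# Notional values: a predicted vs. observed sustained rate of fire.
predicted, observed, tolerance = 10.0, 9.5, 1.0
print(within_tolerance(predicted, observed, tolerance))  # True
print(calibration_factor(predicted, observed))           # 0.95
```

A real calibration would adjust internal model parameters against many measures at once; the point here is only the predicted-versus-actual feedback loop.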
All simulation models used for qualification testing will be accredited. This is necessary in
order to assure that the models thus used properly reflect the developer’s functional descriptions and
specifications, and that they provide an accurate representation, to the degree expected, of the real
world processes they are being used to simulate.
Models used by the Crusader team in lieu of formal testing for the purpose of verifying that
requirements are met will be required to pass through an accreditation process prior to being used.
Previously certified/accredited models, reused from other sources, will be required to pass a
qualification test if modifications for Crusader have occurred. Model certification/accreditation will
be a condition for passing Test Readiness Review, because it will demonstrate that the contractor’s test
facility will support the planned formal qualification test activities. Contractor and government
personnel will be participants in the accreditation process (currently under development by Team
Crusader) and the VV&A process. Models, simulations, and facilities used only in a support role
for formal testing will be required to pass an acceptance test, which may be less rigorous than
formal verification and validation procedures. It may be necessary in some cases to use live test
results as a source of data to validate models and simulations. This may necessitate additional data
collection during tests to capture data specifically useful for validation.
The objectives of the CEP3, again using the simulation setup as previously described for the
earlier CEP’s, are as follows:
1. Assess the feasibility of operational concepts in terms of command, control, communications
and crew (C4),
2. Determine, to the extent possible, if the notional table of organization and equipment (TOE) is
correct,
3. Assist in refining the operational concepts document,
4. Provide information to Team Crusader for development of the system,
5. Refine the critical operational issues, and
6. Determine test issues for further evaluation.
The test environment functions collectively as a Distributed Interactive Simulation (DIS), and employs a seamless confederation of
models, simulations, and actual field equipment to simulate the digital battlefield of the future on
which Crusader will fight. Data flows digitally among the layers through the use of common protocol
packets, which are used to encode and decode the information from each of the elements. Use of a
DIS environment allows the separate elements of the test environment to function in real time and for
the soldiers participating in the test exercises to feel that they are functioning in support of real
maneuver elements in a real developing battle. The separate layers of the test environment are
described below:
1) J-Link (Modeling The Maneuver Battle): The overall stimuli for tactical fire control and
fire mission processing come from a battlefield maneuver simulation called J-Link, which is a DIS-
compatible version of the Janus simulation. Janus is used at Fort Sill for training basic and advanced
field artillery classes. Janus is a high resolution, two-sided, interactive ground combat simulation with
digitized terrain, line of sight for all platforms or weapons systems, and battle calculus computed on
probability of kill. It is used as a seminar trainer for commanders' use in training subordinates and
principal staff officers in close battle planning and synchronization.
2) TAFSM (Modeling Artillery Operations): TAFSM is a stochastic, two-sided high
resolution simulation that has been applied extensively for combat developments. TAFSM explicitly
plays vehicle movement, target acquisition, and communications of targeting information to fire
direction centers and subsequent fire messages down to individual guns. It uses the fire mission
requirements generated by the tactical fire control process and performs technical fire control -
calculating the ballistic solution, applying it to a Crusader SPH fire mission, firing it and reporting the
results. TAFSM has been modified to become DIS-compliant and interoperable with live (soldier
operated) fielded tactical Command and Control devices. For CEP support, it is linked interactively
with the Janus battle simulation.
3) CEP Process: The CEP process relies on soldiers from the unit to provide fire mission
processing and to issue move orders to the individual howitzers and resupply vehicles. In the CEP’s, a
portion of the Crusader battalion forces are played in real time through workstations at which the
soldiers perform functions typical of howitzer or resupply vehicle crew members. Interactive screen
designs are provided by the contractor building the final prototypes for EUT, to allow assessment of
required message traffic and workload at the individual vehicle level. This approach should provide
the stressors necessary to develop training modules and to subsequently evaluate the ability of soldiers
to perform their missions. All Manpower, Personnel, and Training (MANPRINT) domains can and
should be addressed during these simulations. The balance of the Crusader entities are modeled in
TAFSM. The next paragraph describes how these modeled entities will perform.
For purposes of the CEP, the modeled Crusader howitzers and resupply vehicles 'live' in
TAFSM. Messages sent (usually by the POC) to Crusader are acted on in TAFSM, and TAFSM
responds back in real time to the network with digital messages just as the individual Crusader
elements would. The fidelity and immediacy of these transactions give the whole Synthetic Theatre
of War (STOW) environment the ability to sustain a realistic Operational Tempo (OPTEMPO) during the
battle and to impose a realistic level of stress on the soldiers and the functions (C2 and resupply)
performed by the Crusader artillery battalion during the battles.
4) Situational Awareness: A Maneuver Control System, Plan View Display, or equivalent
display device is used at the Battalion Operations Center to provide an overview of the emerging
battle. This device superimposes icons of the opposing forces onto a pictorial representation of the
digital terrain, much as one would post a map to indicate the locations of various objects. The Crusader
battalion staff is thus able to track progress of the unit both individually and collectively. The CEP
process described above exemplifies the interactive environment in which the Crusader research takes
place. Crusader expects to link to the protocols found in Force XXI, Army After Next, etc.
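The protocol-packet data flow described above can be illustrated in simplified form. Actual DIS traffic uses the IEEE 1278 Protocol Data Unit (PDU) formats; the packet layout below is a deliberately reduced, notional stand-in that shows only the encode/decode idea of passing entity state among the layers:

```python
import struct

# Notional, heavily simplified "entity state" packet: an entity id plus
# x/y position in meters and heading in degrees, packed in network byte
# order so every element of the confederation decodes it identically.
FMT = "!Hddd"  # unsigned short + three doubles, no padding

def encode(entity_id, x, y, heading):
    """Pack one entity's state into a common-protocol packet."""
    return struct.pack(FMT, entity_id, x, y, heading)

def decode(packet):
    """Unpack a received packet back into the entity's state fields."""
    return struct.unpack(FMT, packet)

pkt = encode(42, 1500.0, -320.5, 87.0)
print(decode(pkt))  # (42, 1500.0, -320.5, 87.0)
```

A production implementation would follow the IEEE 1278 PDU field definitions rather than this reduced layout.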
The CEP’s are a good example of how M&S is being used by Team Crusader to support total
system development. The exercises provide important information to the users, system designers, and
to the testers. Crusader System T&E personnel participate in the CEP simulations to support the
simulation center personnel with the latest information for portraying the Crusader system in Janus and
TAFSM. They participate during the simulation to observe, gather data and bring back the lessons
learned that can be used for the design of software, crew interfaces, and developmental and operational
testing. Some of the specific advantages of the CEP M&S exercises to the T&E group are:
1. Provide data to determine the types of scenarios that need to be written for software test,
2. Provide data on the operational effectiveness of the system and provide feedback to design
engineering,
3. Determine the instrumentation requirements for the OT and EUT,
4. Use the lessons learned in the conduct of the simulations as a rehearsal for OT and EUT,
5. Use the lessons learned in the conduct of the simulations to reduce setup and troubleshooting
for EUT’s,
6. Refine test plans for OT and EUT, and
7. Refine software test plans.
subsystems is used as necessary in the test process. The resulting qualified onboard software is
available to support other subsystems’ validation testing within the element’s integration.
The software that supports a typical subsystem X qualification test in the ith build towards
onboard software release N is as shown in Figure 7. An emulation of the onboard software for
subsystem X is also produced as a result of qualification testing and is used in the Element Level
integration of other subsystems. A real-time low-fidelity emulation of the overall subsystem X is also
produced as a result of the testing and made available to other subsystems for testing and integration.
[Figure 7: Software supporting a Subsystem X qualification test in build i toward onboard software release N. Workstation-based simulations, real-time SES models, bench-top simulations, simple bus stimulations, and existing real-time lo- and hi-fidelity subsystem emulations support qualification testing of the Subsystem X functional software capabilities, producing qualified Subsystem X software, an emulation of the qualified Subsystem X software, and a real-time lo-fidelity Subsystem X emulation for (i, N).]
[Figure: Armament element integration flow. Lo-fidelity models of the cannon, resupply, gun mount, gun pointing, ammunition loader, and vehicle electronics subsystems, together with C4, mobility, survivability, and common bench-top electronics products, feed the Armament SES and qualified armament element software; the integrated mechanical assembly, with survivability hardware certified at the segment level, is certified on the turret assembly test stand, yielding a validated armament turret assembly and intermediate products delivered to the segment level per the armament element test plan.]
[Figure: M&S in Combat Development, Crusader Battle Lab Warfighting Experiments (BLWE). M&S tools: J-Link and TAFSM. Exercises evaluate: information throughput; situational awareness; vehicle ammunition status reporting; maintenance requests; ammunition resupply (ATP to SPH); logistics resupply C2. CEP1 (4Q1996, Ft. Sill): evaluate Command and Control at battalion level; develop/verify TTPs. CEP2 (3Q1997, Ft. Hood): CEP1 objectives plus evaluate C2 at battalion, battery, and platoon/section level; use PC-based C2 centers and workstations; use available prototype hardware and software; longer scenarios (8 hrs vs. 4 hrs). EUT (2000); FDTE (2003). Increasing maturity: M&S lets soldiers get involved with the system sooner.]
Maintenance Trainer, Crew Station Trainer (CST), Hull Maintenance Trainer, and Turret Maintenance
Trainer.
As with other system developments, M&S will be used extensively for the design and
development of the trainers. All MANPRINT domains should be addressed by this modeling. They
will also serve as an essential element of the trainers themselves by providing the necessary functional
simulations and scenario drivers. The prototype CST will be developed concurrently with the
prototype vehicles, and will be available for EUT. The development of the CST relies heavily on
expertise, techniques, and technologies drawn from UDLP San Jose’s Combat Simulation and
Integration Lab (CSIL) efforts on the Bradley and other systems. The development process, as
shown in Figure 10, will use lessons learned from, and architectures common with, the preceding
Bradley efforts, but will be based on emulators, simulators, and man-machine interfaces designed,
developed, and demonstrated in the SIF at UDLP in Minneapolis. The same SES process described
above and shown in Figure 5 is also used for trainer development. The simulation development
process for the CST is shown in Figure 11.
[Figure 11: Crusader CST System Simulation Development Process, showing a common operating environment driving embedded training, crew trainers, and maintenance trainers.]
Crusader’s reliance on training aids, devices, simulators, and simulations provides
realistic training at significantly reduced cost compared to current training concepts that rely on live
fire, performance-based, and combined arms training. Each of the Crusader trainers will use
simulation technology to interact with Crusader tactical software to create visual, audio, and sensory
perceptions of operating or performing operator maintenance on Crusader. The CICST will simulate
the full range of SPH and RSV individual tasks. It will also use DIS protocols, common terrain
databases, and both visual and voice technologies to interface with the family of Combined Arms
Tactical Trainers (CATT’s). Interaction between the CICST and the CATT simulators via DIS will
train collective capabilities at levels above the Crusader crew level. Objective CICST reconfigurable
stations will be fielded simultaneously with the LRIP Crusaders in 2005.
In addition to the M&S that are directly used to develop and support the Crusader trainers and
embedded training, constructive M&S tools will also be linked to the trainers and earlier prototype
devices to allow dynamic training within the context of force-level training. The previously discussed
models, Janus, J-Link, and TAFSM, will provide this capability. The constructive models can be
rapidly reconfigured to represent multiple training environments with varied mission, enemy, terrain,
troops, and time available. Use of the constructive models allows simultaneous training of personnel
from crew through command level, with the capability to store and evaluate the results of decisions and
actions made during the training. Field artillerymen can practice the entire military decision-making
process in a relatively low cost environment. This approach allows the flexibility to practice in
multiple scenarios under a variety of conditions and to do it systematically and routinely. These
advantages are not possible with live combat training exercises.
V. MANAGEMENT
The Crusader program has planned from its inception to take maximum advantage of
M&S to augment and streamline the overall development and acquisition of the Crusader system. It
has implemented the objectives of the integrated Simulation, Test and Evaluation Process (STEP) to
combine simulation and testing to improve system design and measurement of system performance. It
has also instituted goals of leveraging existing expertise and M&S capabilities to avoid duplication of
effort; ensure maximum commonality, compatibility, and interoperability of the chosen M&S tool set;
and develop early and practical VV&A planning and implementation to ensure that the use of M&S
provides valid information to benefit the program and aid the decision-making process.
Management of the M&S program follows the Integrated Product Team (IPT) model in use for
the rest of the program, but acts at several distinct levels. In order to achieve the goals identified
above, it is imperative that communication and coordination occur between UDLP’s PDT’s, between
PM Crusader’s PDT’s, between UDLP and PM Crusader, and between Team Crusader (UDLP and
PMO) and other government agencies. Figure 12 shows how the management process will be
implemented. Within UDLP and the PMO there are M&S mission area representatives within each of
the functional area PDT’s. These representatives interact with their counterpart government and contractor PDT’s.
Management of system-level issues is the responsibility of an M&S manager designated within UDLP
and the PMO to ensure coordination between the PDT’s and implementation of system-level planning.
To take advantage of available expertise and capabilities, coordination with the larger government
M&S community is also required. This interaction provides the critical forum in which M&S needs and issues can be
identified by Team Crusader for consideration by the community, with the potential and expected
response of recommendations, products, and other support back to Crusader. The larger government
coordination occurs within the context of a Crusader Simulation Coordination Group (SCG).
Depending on the issues to be addressed, the group may convene as a whole, by meeting or video
teleconference, or issues and needed information may be disseminated individually for consideration
and resolution.
Although individual M&S managers are designated to oversee M&S development and issues,
M&S in general is considered to be an integral part of the total system development process. The
management responsibility serves to ensure that planning is coordinated and remains on track
according to the previously identified goals.
Figure 12: Crusader M&S Management Structure
[Figure shows Team Crusader and its Simulation Coordination Group (SCG) exchanging needs and issues with, and receiving products and support from, the wider M&S community: TRADOC/TSM Cannon (training, doctrine, TTPs, OPs), AMSAA, ARL, OPTEC, STRICOM, TARDEC, TECOM, the RDECs, SARDA, and DMSO/MSMO, covering potential tools, V&V information and efforts, test and evaluation issues, simulator support, DIS, and wider coordination.]
The SIF will integrate existing UDLP capabilities into a single facility. Major SIF capabilities include:
Dedicated product development areas
Dedicated hardware and software integration
Dedicated test data collection and management facilities
Integrated receiving, shipping, and storage area
SIF support staff office facilities
Dedicated modeling and simulation areas
[Figure: SIF site layout, showing Bldg. 48, the Conference Center, the North Engineering expansion, the SIF expansion, and the Crusader Engineering area.]
The overall SIF floor space is estimated to be approximately 60,038 square feet, broken down
as 36,597 square feet of assembly space, 17,497 square feet of lab space, and 5,944 square feet of
office space. Due to the sensitive electronic components and the wide use of workstations within the
SIF, all of the office and much of the laboratory space will need to be environmentally controlled.
Initial estimates are that there will be 23,441 square feet of environmentally controlled area.
As seen in Figure 14, approximately 5% of the total SIF floor space is allocated to Army
modeling and simulation efforts. However, the total Army space allocation supports the development
process that is based in M&S. The combination of development laboratories, assembly areas,
computer resources, and data networks located in the SIF are designed to provide an environment that
allows performance evaluation at successively higher developmental levels. A combination of
multiple-use products and techniques is employed to satisfy the program success criteria, provide a
cost-effective development path, and expose potential development problems as early as possible. A
listing of equipment and their applications is provided in Appendix D.
[Figure: SIF floor plan, showing Army modeling and simulation, Army C4 development, Army secure development and demo areas, Navy secure development and joint simulation, visual training, joint on-site offices, HW/SW test integration, firepower, resupply, and automotive mechanical development, controls development, assembly and integration areas, and Army and Navy materiel storage.]
VII. FUNDING
The funds associated with the development and use of M&S assets in support of Crusader
acquisition are nearly impossible to separate from the overall Research, Development, Test and
Evaluation (RDT&E) funding outlays through 2005. Because the development strategy relies heavily
on leveraging of M&S, the use of M&S and its associated costs are fundamentally interwoven within
total acquisition costs. It must be noted that all costs associated with M&S are assumed to be
investment dollars that will result in net savings over the development cycle. Validation of these
assets is being investigated, and may require additional resources to be invested specifically for this
purpose. Some limited data on funding levels of specific simulations is available and will be provided
upon request by OPM Crusader.
APPENDIX A: ACRONYMS AND ABBREVIATIONS
GSD - Ground Systems Division
GUI - Graphical User Interface
LD - Logistics Demonstration
LFT&E - Live Fire Test and Evaluation
LLI - Long Lead Item
Lo Fi - Low Fidelity
LP - Liquid Propellant
LRIP - Low Rate Initial Production
LRU - Line Replaceable Unit
SBA - Simulation Based Acquisition
SES - Simulators, Emulators, and Stimulators
SFR - System Functional Review
SIF - System Integration Facility
SLAVE - Simple Lethality and Vulnerability Estimator
SP - Solid Propellant
SPH - Self Propelled Howitzer
SRS - Software Requirements Specification
SCG - Simulation Coordination Group
SSD - System Simulation Development
SSP - Simulation Support Plan
SSTP - System Software Test Plan
STOW - Synthetic Theatre of War
STD - Software Test Description
STP - Software Test Plans
STRICOM - Simulation, Training and Instrumentation Command
SW - Software
APPENDIX B: REFERENCES
1. White Paper: “U.S. Army’s Next Generation Cannon Artillery System Relies on Simulation to
Accelerate Design and Mitigate Development Risk,” James DeLabar (OPM Crusader) and Wendy
Sallman (UDLP), undated.
2. “Crusader Master Integration Plan”, Crusader System Integration Team, UDLP, 9 Sep 96.
3. “Crusader Master Integration Plan, Appendix A, System Simulation Development (SSD) Plan”,
Crusader System Integration Team, UDLP, 9 Sep 96.
4. “Crusader Master Integration Plan, Appendix B, System Integration Facility (SIF) Plan”, Crusader
System Integration Team, UDLP, 9 Sep 96.
5. “Crusader Master Test Plan, Volume 3, Modeling and Simulation for the Crusader System”,
Crusader Product Assurance and Test Team, UDLP, 17 Jan 97.
6. “Simulations to Train and Develop the 21st Century FA”, Dr. Linda Pierce and Walter W.
Millspaugh, Field Artillery (HQDA PB6-97-4), Jul-Aug, 97.
7. “Crusader: Training Force XXI’s Firepower”, Maj(Ret) William L. Bell, Jr., Field Artillery (HQDA
PB6-96-2), Mar-Apr 96.
APPENDIX C: Crusader Verification, Validation, and Accreditation Management Plan
(VVAMP)
Purpose: The purpose of this document is to set out a blueprint for identifying and managing
Crusader resources and schedule for verification, validation, and accreditation (VV&A) of the
program’s models and simulations (M&S). This will ensure that valid M&S information will be
available to answer critical issues and impact major program decisions including those at the major
milestones.
Scope: The initial Crusader VVAMP covers the program from Milestone 2 (2QFY01) through the low
rate initial production milestone (4QFY03). Its focus is on managing the VV&A of M&S in the
development and test and evaluation (T&E) functional areas; however, it also covers the principal
models in the combat development and training functional areas. The VVAMP provides a means to
manage M&S VV&A level of effort based on considerations such as intended use of the M&S; the
customer’s requirements; issues addressed by the M&S and the importance of those issues;
development status, use history, and documentation of the M&S product; existing configuration
management and control of the M&S product; and required fidelity of the M&S. Included in this
document are sections covering how to determine which M&S products need to be VV&A’d, how to
determine the priority of and level of effort needed for the VV&A, and how to develop a VV&A plan
for a model. Also included are the regulatory requirements concerning who the accreditation authority
is for a specific model, as well as some information concerning organizational responsibilities and
required resources.
Establishing M&S Requirements: Before being able to specify particular requirements for
accreditation, it is necessary to have a thorough understanding of the M&S application. When VV&A
efforts fail, it is generally due to a failure to define adequately the overall problem and how the models
will be used in resolving all or part of the problem. A clear problem definition and explicit M&S
requirements are the Project Manager’s best tools for controlling M&S costs. The whole purpose of
explicitly defining the problem is to provide a common, well-understood starting point for the analysts
who must define the application parameters, accredit and run the model, and analyze the resulting data
to arrive at a problem resolution. Without such an explicit statement, there is a significant possibility
that analysts, in an attempt to avoid errors, will make conservative assumptions that can lead to gold-
plating the model or unnecessary VV&A. Consequently, one of the first steps is to examine the
requirements for the Crusader program and decide the types of data or information that will be
required to address those requirements.
The Crusader system needs to satisfy criteria in numerous performance areas as defined in the
system operational requirements document (ORD). Primary ORD requirements are shown below:
Success criteria are other sources of requirements and apply to the development contractor for
the Crusader system. The success criteria are written at the system level (Crusader), segment level
(SPH and RSV), element level (armament, mobility, survivability, and others), subsystem level
(cannon), and component level. Success criteria are contractual requirements which, if not met, can
result in rejection of the item to which the criterion applies. System level success criteria are shown in
the table below. Criteria for segment and element are not shown here but are likely to be used in
determining the overall Crusader model VV&A requirements in the management plan. The criteria
not shown here are found in the Master Test Plan (MTP), Volume 3.
Exit criteria are a third source of requirements; they are used to permit an acquisition decision
when one or more of the critical program elements meet a level for acceptable decision risk with
confidence that the level will improve to the required value by the full production decision. Exit
criteria typically comprise a subset of the success criteria. The existing exit criteria pertaining to
Crusader are shown in Crusader Exit Criteria. As exit criteria are developed for later milestones, they
will become useful sources of M&S requirements as well.
Once the necessary data requirements are established, one should begin to lay out the potential
sources for those data. This can be done by establishing reasonable and complete M&S requirements.
M&S requirements are, in simple terms, statements of what the model or simulation is expected to do
and what is needed to run the model. M&S requirements can be stated in terms of criteria that
candidate models must meet in order to be considered acceptable for use in an application. Comparing
these requirements with the information available about a model identifies model deficiencies, and can
aid in specifying requirements for V&V that will lead to a better understanding of the model’s
strengths and weaknesses.
M&S requirements can be grouped into three categories: functional, fidelity, and operating.
Functional requirements are system features or functions, political or environmental conditions,
physical phenomena, or personnel actions that have an important impact on the ultimate solution of the
problem and, therefore, must be represented in a simulation. In all probability, the interactions
between these represented entities must also be simulated by the model or simulation. The functional
requirements proper to individual applications will be influenced heavily by the nature, depth, and
breadth of the application scenario as well as the purpose and objectives of the application. Defining
functional requirements begins with identifying those model outputs that are required to calculate the
key metrics. Once model outputs have been identified, a user can identify contributing functions that
are likely to have a direct impact on model outputs, and then prioritize them in terms of their potential
impact on measures of merit (MOM’s) values and problem outcomes.
Fidelity requirements can be defined as the degrees of correlation between model outputs and
real world phenomena that are necessary for credible use of a model for a particular problem. They
can also be looked upon as the acceptable errors that can be tolerated in model outputs before problem
outcomes will be grossly affected or problem decisions will change from one state to another. Fidelity
requirements are generally determined through sensitivity analyses performed on the problem MOM’s.
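A sensitivity analysis of this kind can be sketched as follows. The stand-in model and MOM band are notional; a real analysis would exercise the actual simulation across its input space rather than a one-line surrogate:

```python
# Sketch of a one-factor sensitivity analysis: perturb a model input in
# small steps and report the largest input error for which the measure of
# merit (MOM) stays within an acceptable band around its baseline value.

def max_tolerable_error(model, x0, mom_band, step=0.01, limit=10.0):
    """Largest symmetric input error keeping |MOM(x) - MOM(x0)| <= mom_band."""
    baseline = model(x0)
    n = 0
    while (n + 1) * step <= limit:
        trial = (n + 1) * step
        if (abs(model(x0 + trial) - baseline) > mom_band or
                abs(model(x0 - trial) - baseline) > mom_band):
            break
        n += 1
    return n * step

# Notional stand-in model: the MOM is simply linear in the input.
linear = lambda x: 2.0 * x
print(round(max_tolerable_error(linear, 5.0, mom_band=0.5), 2))  # 0.25
```

The returned value is the fidelity requirement for that input: errors below it do not change the problem outcome, errors above it may.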
Operating requirements address practical issues surrounding the operation of software for use
in the intended application. The goal is to characterize the computational environment in which the
model must be used so that the resources and capabilities available to the model user can be compared
with actual model usage requirements as defined, for example, in a User’s Manual or other model
documentation. This comparison leads to the identification of unmet operating requirements that must
be addressed before the model can be properly and effectively used. It is rare that unmet operating
requirements alone will derail an accreditation effort.
Appendix E displays a list of models that may be used to address some of the requirements for
the Crusader program. Although the list may not be exhaustive, it may also contain models that
have very little chance of being used in the program. The developer should carefully scrub the
list to pare it to only those models that have a reasonable chance of being used in the program.
The selection of candidate models that totally comply with all problem requirements is usually not
feasible except in the case of very simple problems. In most cases, the available models will have
some functional limitations that must be addressed with model changes or work-arounds. Once this is
done, the next step is to establish a matrix of the uses of the models, both in terms of specific
requirements to be addressed and in terms of phases of development. These steps go a long way toward choosing the appropriate models to provide the requisite data, and they also help focus the VV&A efforts for the models chosen.
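The use matrix described above can be sketched as a small data structure; the model, requirement, and phase names below are purely illustrative.

```python
# Sketch of a model-use matrix: for each candidate model, record which
# requirements it addresses and in which program phases it is used.
# All model, requirement, and phase names are invented examples.

def build_use_matrix(uses):
    """uses: list of (model, requirement, phase) tuples -> nested dict matrix."""
    matrix = {}
    for model, requirement, phase in uses:
        matrix.setdefault(model, {"requirements": set(), "phases": set()})
        matrix[model]["requirements"].add(requirement)
        matrix[model]["phases"].add(phase)
    return matrix

uses = [
    ("ModelA", "mobility analysis", "EMD"),
    ("ModelA", "mobility analysis", "Production"),
    ("ModelB", "lethality analysis", "EMD"),
]
matrix = build_use_matrix(uses)
# Models covering more requirements and phases are higher-value VV&A targets.
for model, cells in sorted(matrix.items()):
    print(model, sorted(cells["requirements"]), sorted(cells["phases"]))
```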
VV&A Definitions: VV&A consists of three major components. Verification is the process of
determining that a model implementation accurately represents the developer’s conceptual description
and specifications. Validation is the process of determining the degree to which a model is an accurate
representation of the real world from the perspective of intended uses of the model. Finally,
accreditation is both a process and an outcome wherein the user of the information provided by the
M&S product certifies that the information is applicable, with acceptable risk, to the intended use.
Verification is accomplished by first decomposing the M&S into its functions. The idea that
V&V should be conducted at the functional level rather than at the subroutine or overall model level is
an essential element of cost effective V&V. Breaking the model into its functions encompasses
identifying and describing major functional capabilities of the model as well as the functional elements
that implement each capability. This can allow streamlining of V&V efforts by permitting parallel
execution of V&V tasks on the functional elements common to several models. Other tasks of
verification include assessing the quality of the software in terms of its conformance to accepted
coding practices; identifying model assumptions, limitations, and errors; producing design
documentation; performing logical verification by comparing the model with modeling requirements
of the problem at hand to determine whether the model can reasonably be expected to produce results
that are realistic enough; performing code verification to ensure that design requirements have been
satisfied and that the algorithms and equations being used are properly implemented in the software;
and documenting the results of the verification effort.
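A sketch of the functional-decomposition step, assuming invented model and function names: identify functional elements shared by several models, since V&V of those elements can be done once and reused in parallel.

```python
# Sketch: decompose each model into its functional elements and find elements
# common to several models, so their V&V can be performed once and shared.
# Model and function names are illustrative only.

def shared_functions(decomposition):
    """decomposition: {model: set of functional elements}.
    Returns {element: set of models} for elements used by more than one model."""
    usage = {}
    for model, functions in decomposition.items():
        for fn in functions:
            usage.setdefault(fn, set()).add(model)
    return {fn: models for fn, models in usage.items() if len(models) > 1}

decomposition = {
    "FireControlSim": {"ballistics", "terrain", "timing"},
    "MobilitySim": {"terrain", "propulsion"},
}
print(shared_functions(decomposition))
```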
Validation tasks include conducting sensitivity analyses, with priority given to those functions
expected to have the largest impact on overall model results; performing face validation, which is a
subjective evaluation of model outputs against expectations of a group of subject matter experts;
performing results validation; and documenting the validation results.
A determination for accreditation depends on a comparison between the modeling requirements
determined by how the model is going to be used in an application and what is known about the
model’s capabilities and characteristics. This comparison should result in a logical rationale that
justifies accreditation. If any deficiencies are identified as a result of the comparison, some means of
correcting or mitigating them must be developed to justify accreditation of the model. The model
might be modified to correct the deficiency, some type of work-around might be used, or some
restrictions might be placed on model use or data interpretation. The accreditation process begins with
execution of any non-V&V tasks identified. Non-V&V tasks typically involve the collection of data
about model characteristics and development background that are usually available in model
documentation or through the model manager. These include such things as configuration
management attributes; documentation; VV&A status; usage history; and hardware, software, and
interface attributes.
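The accreditation comparison can be sketched as a simple set difference between what the application requires and what is known about the model. The entries below are illustrative; each resulting deficiency would then need a model change, a work-around, or a restriction on use.

```python
# Sketch of the accreditation comparison: modeling requirements versus known
# model capabilities. Unmet requirements become deficiencies to be corrected
# or mitigated before accreditation. All entries are invented examples.

def find_deficiencies(requirements, capabilities):
    """Return requirements the model's known capabilities do not satisfy."""
    return sorted(req for req in requirements if req not in capabilities)

requirements = {"real-time output", "30 m terrain resolution", "documented usage history"}
capabilities = {"real-time output", "documented usage history"}
print(find_deficiencies(requirements, capabilities))
```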
Regulatory Requirements: The Department of Defense requires that almost all M&S affecting
acquisition decisions be accredited for use in the required application. Included under the pertinent
regulations are models used for education and training; analysis; test and evaluation; research and
development; and production and logistics. System training devices and embedded tools that do not
communicate outside the host system are the only types of M&S specifically excluded. The
regulations further stipulate who is responsible for conducting VV&A. If an M&S is under Army control and a V&V proponent has already been established, that command is responsible for conducting the VV&A effort. If an M&S is under Army control and is used by only one agency, that agency has responsibility for the VV&A effort. If, however, an Army-controlled model is used by several agencies, the predominant user or the chair of the established users’ group has responsibility for the VV&A effort. If an M&S is under development, the sponsoring agency is responsible for
conducting the VV&A effort. For any contractor models, the Army sponsor will ensure that VV&A is
performed when deemed necessary. This can be accomplished by including requirements for
documentation of VV&A activities and stipulating acceptability criteria in documents such as the
request for proposal and statement of work. Verification, in particular, should normally be
accomplished by an independent agent, but the M&S proponent ultimately has responsibility for
ensuring accomplishment.
Determination of the Need for VV&A: Each model must have its status examined as illustrated in
the flowchart below.
[Flowchart — M&S Product/Tool Process: Determining Need for VV&A. For each model: Are valid data available elsewhere? If no, VV&A is required. If yes, is the model needed to extend the domain? If yes, VV&A is required; if no, no VV&A is required.]
Walking through this process for a particular model results in a determination of whether or not the
model is required to undergo some form of VV&A. What results is a determination of need for no
action, accreditation only, or VV&A. (The chart has been simplified here by treating accreditation and
VV&A as being equivalent actions.)
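The decision flow walked through above can be sketched as a small function, again treating accreditation and VV&A as equivalent outcomes.

```python
# The VV&A need determination, expressed as a small function. Accreditation
# and VV&A are treated as equivalent outcomes, as in the simplified chart.

def vva_required(valid_data_available_elsewhere, model_needed_to_extend_domain):
    """Return True when the model must undergo some form of VV&A."""
    if not valid_data_available_elsewhere:
        return True   # no valid data elsewhere: the model itself must be VV&A'd
    if model_needed_to_extend_domain:
        return True   # data exist, but the model extends the domain: VV&A
    return False      # data exist and the model adds nothing new: no VV&A

print(vva_required(True, False))   # valid data elsewhere, no extension: False
```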
Determination of Priority of VV&A of M&S: Once it has been determined that VV&A is required
for a model, it is necessary to determine the priority that should be given to the VV&A effort for the
model in question. The VV&A decision table that follows expands on the information shown in the
flowchart shown above and provides a mechanism to prioritize the VV&A efforts.
VV&A DECISION TABLE
Determining the Level of VV&A Required: Once one determines that a model requires VV&A to
some level, that level needs to be determined consistent with the Crusader program’s major milestones
and the nature of the model itself. General factors that are considered include the user’s requirements
for the M&S, prior experience with the model, the size and complexity of the model, whether it is still
under development, whether an acceptable V&V plan exists, and whether documentation is or will be
complete when the model is to be run. The key issue in setting the level of effort for VV&A is the program resources that must be devoted to achieving an acceptable level of confidence, or credibility, in the M&S among the model user and the program decision makers.
Crusader models that have been identified for VV&A will receive at least one of the VV&A
levels defined below:
1) Is the model a legacy code with only a few newly added features?
2) Has the model been VV&A’d before and was it reasonably well done?
3) Is the model, with its new features, likely to fit the user’s requirements fairly directly?
4) Are the model and its results generally accepted at large, and would a check test of new features likely be accepted by subject matter experts?
c. Limited Assessment
The limited assessment looks for risk areas in applying the model to the user’s
requirements. It does a check of the V&V of the conceptual model, the computer model, input
data, and output results against real world information. Typically, a limited assessment would
take several months (6 or less), but the timeframe is dependent on the size and complexity of
the model, as well as the level of confidence needed by the user. The limited assessment
produces a report that discusses the degree to which the model meets the user’s requirements,
discusses the bounds on the model’s domain, provides at least a face validation of the model by
subject matter experts, and provides the basis for accrediting the model. The following questions, if answered affirmatively, will result in selecting this level of VV&A:
1) Although not previously VV&A’d for the intended application, does the model have
a VV&A history that can be leveraged?
2) Are the model results expected to directly impact acquisition decisions at
Department of the Army or Department of Defense levels?
3) Is high confidence required in the model results?
d. Full Assessment
The full or formal assessment includes the elements of the limited assessment but goes
into greater depth in identifying risk areas in applying the model. This level of assessment
normally would take more than 6 months; a year or more would be reasonable for new, large
scale, system level models with complex interactions. Answering ‘yes’ to the following
questions results in selecting this most stringent VV&A level.
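One plausible reading of the level-selection questions, sketched as a function. The grouping of question sets into the Audit, Limited, and Full levels is an assumption here, based on the levels named in the model information table rather than on an explicit rule in the text.

```python
# Sketch: choose a VV&A level from yes/no screening answers. The mapping of
# question groups to the Audit/Limited/Full levels is an ASSUMPTION based on
# the levels named in the model information table, not a quoted rule.

def select_level(audit_answers, limited_answers):
    """Each argument is a list of booleans, one per screening question."""
    if audit_answers and all(audit_answers):
        return "Audit"      # legacy code, prior VV&A, fits requirements, accepted
    if limited_answers and all(limited_answers):
        return "Limited"    # leverageable history, DA/DoD impact, high confidence
    return "Full"           # otherwise the most stringent level applies

# Legacy model, previously VV&A'd, fits requirements, widely accepted:
print(select_level([True, True, True, True], []))   # prints Audit
```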
Data Collection for VV&A: The table that follows provides the basic information that should be
collected for each candidate model.
MODEL INFORMATION TABLE

Type of M&S:
    Engineering Design / Item Level Performance / Force-on-force / Reliability / Cost
    Constructive or Virtual
    System Level, Segment Level, or Component Level
Functional Area:
    Engineering Development / Test and Evaluation / Combat Development / Training
Intended Use Known?
Customer's Requirements Known?
Fidelity Required for Main Measures of Merit (H, M, L)
Do Issues Address:
    ORD/KPP
    Exit Criterion
    Success Criterion
Development Status of M&S (Partial or Full); if Partial, % Complete
Use History: Past Application to Similar Problem?
Existing Documentation:
    Analyst Manual
    Users Manual
Accreditation (Y/N; if Y, then date)
Past V&V Efforts (Y/N; if Y, then year completed)
V&V Required:
    Estimated Level (Audit, Limited, Full)
    Required Completion Date
V&V Proponent
Accreditation Authority:
    Required Completion Date(s)
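The table's rows can be sketched as a simple record so that an entry is collected uniformly for each candidate model; the field names paraphrase the table rows, and the sample values are invented.

```python
# The model information table, sketched as a record type. Field names
# paraphrase the table rows; the sample entry is an invented example.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ModelInfo:
    name: str
    ms_type: str                  # e.g. "Engineering Design", "Force-on-force"
    functional_area: str          # e.g. "Test and Evaluation"
    intended_use_known: bool
    fidelity_required: str        # "H", "M", or "L"
    development_complete: bool
    percent_complete: int = 100   # only meaningful when development is partial
    past_vv_year: Optional[int] = None
    estimated_level: str = "Limited"   # "Audit", "Limited", or "Full"

entry = ModelInfo(
    name="FireControlSim",
    ms_type="Item Level Performance",
    functional_area="Engineering Development",
    intended_use_known=True,
    fidelity_required="H",
    development_complete=False,
    percent_complete=60,
    estimated_level="Full",
)
print(entry.name, entry.estimated_level)   # prints: FireControlSim Full
```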
Developing a VV&A Plan: The objective of VV&A planning is to ensure that all V&V efforts are
focused on justifying the accreditation decision. Careful planning will minimize the chance that any
V&V activities will be undertaken without a specific justifying requirement. The first step in the
planning is to determine the accreditation requirements. Accreditation requirements fall into three
categories: V&V data requirements, non-V&V data requirements, and documentation requirements.
The task of determining V&V requirements can be broken down into a series of sub-tasks. The
starting point is a determination of what types of V&V or other information are needed to justify an
accreditation decision. Existing data that match these requirements are collected and used to evaluate
whether or not the M&S requirements are satisfied. This comparison leads to identification of
information voids. The voids are then analyzed to determine which ones are critical; appropriate V&V
tasks are then identified to fill those critical voids and are added to a consolidated task list. Non-V&V
requirements encompass needs for information that is not obtained through traditional V&V activities.
This “other” information includes basic information about the model normally found in model
documentation or other documentation produced by the model manager or configuration manager.
Non-V&V information is typically needed to determine how well the model fulfills operating
requirements associated with the application. The same steps taken to determine V&V data
requirements apply to non-V&V data requirements as well. Requirements for accreditation
documentation include the specification of which documents must be prepared, as well as the format
and content requirements for each. The sources of these requirements are service or DoD policies,
accreditation authority requirements, and any special requirements due to archiving compatibility. All
of the accreditation requirements should be succinctly summarized and recorded as the basis for
eventual accreditation. This summarized list of accreditation requirements will serve as a checklist of
items to be reviewed as part of the accreditation assessment. These accreditation requirements should
be documented in the accreditation plan.
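A sketch of the gap analysis described above, with invented information items: compare the information needed to justify accreditation against existing data, and turn the critical voids into V&V tasks.

```python
# Sketch of the accreditation-requirements gap analysis: match existing V&V
# data against the information needed to justify accreditation, flag the
# voids, and turn the critical ones into tasks. Item names are illustrative.

def plan_vv_tasks(needed, existing, critical):
    """needed/existing: sets of information items; critical: subset of needed
    whose absence would block accreditation. Returns the consolidated tasks."""
    voids = needed - existing          # information not yet available
    critical_voids = voids & critical  # voids that must be filled
    return sorted(f"obtain: {item}" for item in critical_voids)

needed = {"sensitivity results", "code verification report", "usage history"}
existing = {"usage history"}
critical = {"sensitivity results", "code verification report"}
print(plan_vv_tasks(needed, existing, critical))
```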
The next step is doing the actual planning of the V&V and other data collection efforts. An
accreditation plan includes plans for performing V&V tasks, collecting non-V&V data, assessing the
results in light of the accreditation requirements, and documenting both the V&V and assessment
results. For the V&V tasks, the non-V&V tasks, and the documentation of results, the first step is to identify and then prioritize the tasks. Once the tasks are identified, the resources required to accomplish them can be identified. A schedule is then established to execute the tasks, and responsibilities are assigned to appropriate people.
The third necessary element of the planning process is to develop plans for assessing the
suitability of the model for the intended application. There are two commonly used approaches to
performing an accreditation assessment. The first is to place the responsibility on the primary analyst
who obtains and interprets the model results. If this approach is used, the plans for accreditation
assessment should address issues such as: ensuring that specific criteria for evaluating the model’s
suitability are clearly documented and utilized; determining what actions will be taken if criteria are
not specific; identifying what steps will be taken if the model does not fit the criteria; identifying the
reviews that will be done on the analyst’s findings; and identifying sources of assistance if other
problems arise beyond the assessor’s technical capability. The second commonly used approach for
performing an accreditation assessment is the use of an expert review team. If this approach is used,
there is significantly more planning that is necessary to make such a team assessment effective.
Planning for an expert team assessment must address team composition, team leader selection,
assessment criteria, methods for resolving differences of opinion, the mechanics of running and
supporting the assessment meetings, financial support for team members, and documenting the team’s
findings. If these issues cannot be definitively planned during the planning phase, which often takes
place several months before the assessment is actually done, the assessment plans must at least identify
who will do the detailed planning, when it will be done, and who will be responsible for
implementation.
VV&A plans should be developed in the same priority order as determined for the VV&A
efforts for the models themselves.
APPENDIX D: Equipment Type Descriptions and Application
The following types of equipment will be needed to support the development activities carried
out in the SIF. Actual quantities are identified in figure D1 as they support each SIF entity.
Workstations: All UNIX workstations will be networked throughout the SIF to a central file server.
Two different types of workstations are required: a model development workstation and an emulator
workstation. The model development workstation (e.g., Sun Ultra 1) will be used to create and
manage low and high fidelity models generated in support of the SES process. The configuration of
this workstation will include sufficient resources to accomplish the tasks identified above. The
emulator workstation (e.g., Sun Ultra 2) will be dedicated to emulator interfacing, which may include software/hardware/electronics development, integration and testing, rapid prototyping, data analysis, user interface task testing, and simulation. This workstation will include sufficient resources
to accomplish these tasks.
Emulators: Emulators (e.g., VME-bus based expandable systems) are self-contained data
processing systems on the network that are individually capable of implementing high fidelity real-
time emulations of subsystem hardware/software or environmental simulations/stimulations. The
emulators will be capable of implementing prototype subsystem designs that may have been produced
by COTS prototyping tools such as Matrix X. The emulators will be capable of emulating, in real-
time, any device on any of the vehicle’s data buses, and/or providing real-time sensor outputs to the
bus or other devices, and actuator/load simulations via the SAIU. Some of the bus emulators will
require unique features. This may include analog to digital conversion, digital to analog conversion,
digital to resolver conversion, special processing, digital/serial/parallel I/O, and special memory. The
emulators will have common interfacing and processing features but will be tailored as required by
each of the subsystems. The basic features should include at least one high performance CPU that is
equivalent (e.g., commercial version) to the target CPU for the vehicle, an Ethernet interface for T and
M (possibly on the same board), a bus adapter to the real-time, synchronous data acquisition and
control bus, and a bus adapter to extend the bus emulator’s bus. A bus adapter to the vehicle’s
interprocess bus (i.e., high-speed data bus) may also be provided.
The basic emulator should also implement a common real-time operating system (e.g., VxWorks) and interact through its Ethernet interface using standard protocols (e.g., TCP/IP, CORBA). The emulator should support the
boards/features that the subsystem emulation requires on an individual basis. In general, the basic
emulator will be similar to the VME-based bus emulator used in the HLSIM for the Bradley Fighting
Vehicle A3 Electronic Control System. In cases where a subsystem requires a feature/board support
that is not achievable using the basic emulator, and where the subsystem emulation does not require
high bandwidth communication of the common simulation environment with other emulators, other
COTS products (e.g., AC-100) may be used in combination with the basic emulator to address the
specific needs.
Graphics Workstations: Graphics workstations (e.g., Silicon Graphics Maximum Impact, Octane) are utilized to develop solid 3D models; to execute simulations/models for engineering design and front-end analysis; to develop models for the visual simulation; and to monitor/display 2D or 3D graphics that represent situational displays of vehicle subsystems. Graphics workstations
will also be used to generate simulated environments for development and testing purposes applied to
the SES process.
The processors provide the same capabilities as the development workstations but additionally possess special hardware and software that process face boundaries, occlusion, hidden-line removal, and rendering to support 3D graphics with near real-time (30 Hz) performance. The processors also
include monitors and drivers for display as well as auxiliary input equipment. They can also access
the CDE in the same fashion as ordinary workstations, but additionally support specialized graphics
development packages.
Real-Time Graphics Processors: The real-time graphics processors (e.g., Silicon Graphics Onyx) are utilized in the visual simulation to provide high fidelity 3D graphics for real-time simulated “out-the-window” or simulated monitor views. Real-time is defined here as the rate at which the visual scene is updated.
The processors implement the models developed on the non real-time graphics work stations.
The number of models in the field of view depends on the simulated scenario, the real-time geometry, and the viewer’s eyepoint(s); therefore, the number of face boundaries to process is variable and may
be large. The real-time graphics processor will be capable of processing multiple independent and
simultaneous views consisting of but not limited to terrain, other moving vehicles and obstacles in
real-time with processing lags from controller to monitor of no more than 0.5 seconds.
File Server/Network: The file server (e.g., UltraSPARC 1000) provides file services to all
clients on the SIF network. The server will have high speed access to mass storage, including the
CDE, office software packages, CASE and CAE tools, configuration and file management tools, as
well as Web site development utilities as needed. The server will also have lower speed mass storage
devices such as a read/write CD and tape. The file server will also have a backup system that is
capable of performing automated and scheduled backups. The SIF network and server will be
protected by a COTS firewall if internet access is implemented. Additional local area networks, as
part of the SIF network, will be implemented as needed.
APPENDIX E: List of General Models for M&S
The list of models is now maintained in a separate database for management purposes. Specific information on these models may be requested from the Program Office by DoD agencies and DoD contractors.
APPENDIX F: Classified M&S Annex (Under separate cover)