Military Standard Software Development and Documentation
NOT MEASUREMENT SENSITIVE
MIL-STD-498
5 December 1994
(PDF version)
Superseding
DOD-STD-2167A
29 February 1988
DOD-STD-7935A
31 October 1988
DOD-STD-1703(NS)
12 February 1987
MILITARY STANDARD
SOFTWARE DEVELOPMENT
AND DOCUMENTATION
FOREWORD
1. This Military Standard is approved for use by all Departments and Agencies of the
Department of Defense.
2. Beneficial comments (recommendations, additions, deletions) and any pertinent data which
may be of use in improving this document should be addressed to SPAWAR 10-12, 2451 Crystal
Drive (CPK-5), Arlington, VA 22245-5200. The comments may be submitted by letter or by using
the Standardization Document Improvement Proposal (DD Form 1426) appearing at the end of
this document.
4. This standard can be applied in any phase of the system life cycle. It can be applied to
contractors, subcontractors, or Government in-house agencies performing software development.
For uniformity, the term "acquirer" is used for the organization requiring the technical effort, the
term "developer" for the organization performing the technical effort, and the term "contract" for
the agreement between them. The term "software development" is used as an inclusive term
encompassing new development, modification, reuse, reengineering, maintenance, and all other
activities resulting in software products.
5. This standard is not intended to specify or discourage the use of any particular software
development method. The developer is responsible for selecting software development methods
that support the achievement of contract requirements.
6. This standard implements the development and documentation processes of ISO/IEC DIS
12207. It interprets all applicable clauses in MIL-Q-9858A (Quality Program Requirements) and
ISO 9001 (Quality Systems) for software.
8. Data Item Descriptions (DIDs) applicable to this standard are listed in Section 6. These
DIDs describe the information required by this standard.
9. This standard and its Data Item Descriptions (DIDs) are meant to be tailored by the
acquirer to ensure that only necessary and cost-effective requirements are imposed on software
development efforts. General tailoring guidance can be found in Section 6 and in DOD-HDBK-
248. Tailoring guidance specific to this standard can be found in Appendixes G and H and in
guidebooks and handbooks planned for this standard.
CONTENTS
Paragraph Page
1. SCOPE . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.1 Purpose . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.2 Application . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.2.1 Organizations and agreements . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.2.2 Contract-specific application . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.2.3 Tailoring . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.2.4 Interpretation of selected terms . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.2.4.1 Interpretation of "system" . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.2.4.2 Interpretation of "participate" in system development . . . . . . 2
1.2.4.3 Interpretation of "develop," "define," etc . . . . . . . . . . . . . . . 2
1.2.4.4 Interpretation of "record" . . . . . . . . . . . . . . . . . . . . . . . . . . 2
1.3 Order of precedence . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2
2. REFERENCED DOCUMENTS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
3. DEFINITIONS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
4. GENERAL REQUIREMENTS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
4.1 Software development process . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
4.2 General requirements for software development . . . . . . . . . . . . . . . . . . . . . . . . 8
4.2.1 Software development methods . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
4.2.2 Standards for software products . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
4.2.3 Reusable software products . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
4.2.3.1 Incorporating reusable software products . . . . . . . . . . . . . . 8
4.2.3.2 Developing reusable software products . . . . . . . . . . . . . . . . 9
4.2.4 Handling of critical requirements . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
4.2.4.1 Safety assurance . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
4.2.4.2 Security assurance . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
4.2.4.3 Privacy assurance . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
4.2.4.4 Assurance of other critical requirements . . . . . . . . . . . . . . . 9
4.2.5 Computer hardware resource utilization. . . . . . . . . . . . . . . . . . . . . . . 10
4.2.6 Recording rationale . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
4.2.7 Access for acquirer review . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
5. DETAILED REQUIREMENTS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
5.1 Project planning and oversight . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
5.1.1 Software development planning . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
5.1.2 CSCI test planning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
5.1.3 System test planning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
5.1.4 Software installation planning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
5.1.5 Software transition planning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
5.1.6 Following and updating plans . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
5.2 Establishing a software development environment . . . . . . . . . . . . . . . . . . . . . 13
5.2.1 Software engineering environment . . . . . . . . . . . . . . . . . . . . . . . . . . 13
5.2.2 Software test environment . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
5.2.3 Software development library . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
6. NOTES . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28
6.1 Intended use . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28
6.2 Data requirements . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28
6.3 Relationship between standard and CDRL . . . . . . . . . . . . . . . . . . . . . . . . . . . 29
6.4 Delivery of tool contents . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29
6.5 Tailoring guidance . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29
6.6 Cost/schedule reporting . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29
6.7 Related standardization documents . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29
6.8 Subject term (key word) listing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29
APPENDIXES
Appendix Page
A LIST OF ACRONYMS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
A.1 Scope . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
A.2 Applicable documents . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
A.3 Acronyms . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
INDEX . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 61
1. SCOPE
1.1 Purpose. The purpose of this standard is to establish uniform requirements for software
development and documentation.
1.2.3 Tailoring. This standard and its Data Item Descriptions (DIDs) are meant to be tailored
for each type of software to which they are applied. While tailoring is the responsibility of the
acquirer, suggested tailoring may be provided by prospective and selected developers. General
tailoring guidance can be found in Section 6 and in DOD-HDBK-248. Tailoring guidance specific
to this standard can be found in Appendixes G and H and in guidebooks and handbooks planned
for this standard.
1.2.4 Interpretation of selected terms. The following terms have a special interpretation as used
in this standard.
a. The term "system," as used in this standard, may mean: (1) a hardware-software system
(for example, a radar system) for which this standard covers only the software portion, or
(2) a software system (for example, a payroll system) for which this standard governs
overall development.
1.3 Order of precedence. In the event of conflict between the requirements of this standard
and other applicable standardization documents, the acquirer is responsible for resolving the
conflicts.
2. REFERENCED DOCUMENTS
This section does not apply to this standard, since no documents are referenced in Sections 3,
4, or 5. Section 6 contains a list of standardization documents that may be used with this
standard.
3. DEFINITIONS
3.2 Acquirer. An organization that procures software products for itself or another
organization.
3.4 Architecture. The organizational structure of a system or CSCI, identifying its components,
their interfaces, and a concept of execution among them.
3.5 Associate developer. An organization that is neither prime contractor nor subcontractor
to the developer, but who has a development role on the same or related system or project.
3.6 Behavioral design. The design of how an overall system or CSCI will behave, from a
user's point of view, in meeting its requirements, ignoring the internal implementation of the
system or CSCI. This design contrasts with architectural design, which identifies the internal
components of the system or CSCI, and with the detailed design of those components.
3.7 Build. (1) A version of software that meets a specified subset of the requirements that the
completed software will meet. (2) The period of time during which such a version is developed.
Note: The relationship of the terms "build" and "version" is up to the developer; for example, it
may take several versions to reach a build, a build may be released in several parallel versions
(such as to different sites), or the terms may be used as synonyms.
3.9 Computer hardware. Devices capable of accepting and storing computer data, executing
a systematic sequence of operations on computer data, or producing control outputs. Such
devices can perform substantial interpretation, computation, communication, control, or other
logical functions.
3.10 Computer program. A combination of computer instructions and data definitions that
enable computer hardware to perform computational or control functions.
3.12 Computer Software Configuration Item (CSCI). An aggregation of software that satisfies
an end use function and is designated for separate configuration management by the acquirer.
CSCIs are selected based on tradeoffs among software function, size, host or target computers,
developer, support concept, plans for reuse, criticality, interface considerations, need to be
separately documented and controlled, and other factors.
3.13 Configuration Item. An aggregation of hardware, software, or both that satisfies an end
use function and is designated for separate configuration management by the acquirer.
3.14 Database. A collection of related data stored in one or more computerized files in a
manner that can be accessed by users or computer programs via a database management
system.
3.15 Database management system. An integrated set of computer programs that provide the
capabilities needed to establish, modify, make available, and maintain the integrity of a database.
3.16 Deliverable software product. A software product that is required by the contract to be
delivered to the acquirer or other designated recipient.
3.17 Design. Those characteristics of a system or CSCI that are selected by the developer in
response to the requirements. Some will match the requirements; others will be elaborations of
requirements, such as definitions of all error messages in response to a requirement to display
error messages; others will be implementation related, such as decisions about what software
units and logic to use to satisfy the requirements.
3.18 Developer. An organization that develops software products ("develops" may include new
development, modification, reuse, reengineering, maintenance, or any other activity that results
in software products). The developer may be a contractor or a Government agency.
3.20 Evaluation. The process of determining whether an item or activity meets specified
criteria.
3.21 Firmware. The combination of a hardware device and computer instructions and/or
computer data that reside as read-only software on the hardware device.
3.22 Hardware Configuration Item (HWCI). An aggregation of hardware that satisfies an end
use function and is designated for separate configuration management by the acquirer.
3.23 Independent verification and validation (IV&V). Systematic evaluation of software products
and activities by an agency that is not responsible for developing the product or performing the
activity being evaluated. IV&V is not within the scope of this standard.
3.24 Interface. In software development, a relationship among two or more entities (such as
CSCI-CSCI, CSCI-HWCI, CSCI-user, or software unit-software unit) in which the entities share,
provide, or exchange data. An interface is not a CSCI, software unit, or other system component;
it is a relationship among them.
3.25 Joint review. A process or meeting involving representatives of both the acquirer and the
developer, during which project status, software products, and/or project issues are examined and
discussed.
3.26 Non-deliverable software product. A software product that is not required by the contract
to be delivered to the acquirer or other designated recipient.
3.27 Process. An organized set of activities performed for a given purpose; for example, the
software development process.
3.28 Qualification testing. Testing performed to demonstrate to the acquirer that a CSCI or a
system meets its specified requirements.
3.29 Reengineering. The process of examining and altering an existing system to reconstitute
it in a new form. May include reverse engineering (analyzing a system and producing a
representation at a higher level of abstraction, such as design from code), restructuring
(transforming a system from one representation to another at the same level of abstraction),
redocumentation (analyzing a system and producing user or support documentation), forward
engineering (using software products derived from an existing system, together with new
requirements, to produce a new system), retargeting (transforming a system to install it on a
different target system), and translation (transforming source code from one language to another
or from one version of a language to another).
3.30 Requirement. (1) A characteristic that a system or CSCI must possess in order to be
acceptable to the acquirer. (2) A mandatory statement in this standard or another portion of the
contract.
3.31 Reusable software product. A software product developed for one use but having other
uses, or one developed specifically to be usable on multiple projects or in multiple roles on one
project. Examples include, but are not limited to, commercial off-the-shelf software products,
acquirer-furnished software products, software products in reuse libraries, and pre-existing
developer software products. Each use may include all or part of the software product and may
involve its modification. This term can be applied to any software product (for example,
requirements, architectures, etc.), not just to software itself.
3.32 Software. Computer programs and computer databases. Note: Although some definitions
of software include documentation, MIL-STD-498 limits the definition to computer programs and
computer databases in accordance with Defense Federal Acquisition Regulation Supplement
227.401.
3.33 Software development. A set of activities that results in software products. Software
development may include new development, modification, reuse, reengineering, maintenance, or
any other activities that result in software products.
3.34 Software development file (SDF). A repository for material pertinent to the development
of a particular body of software. Contents typically include (either directly or by reference)
considerations, rationale, and constraints related to requirements analysis, design, and
implementation; developer-internal test information; and schedule and status information.
3.36 Software development process. An organized set of activities performed to translate user
needs into software products.
3.37 Software engineering. In general usage, a synonym for software development. As used
in this standard, a subset of software development consisting of all activities except qualification
testing. The standard makes this distinction for the sole purpose of giving separate names to the
software engineering and software test environments.
3.40 Software quality. The ability of software to satisfy its specified requirements.
3.41 Software support. The set of activities that takes place to ensure that software installed
for operational use continues to perform as intended and fulfill its intended role in system
operation. Software support includes software maintenance, aid to users, and related activities.
3.42 Software system. A system consisting solely of software and possibly the computer
equipment on which the software operates.
3.43 Software test environment. The facilities, hardware, software, firmware, procedures, and
documentation needed to perform qualification, and possibly other, testing of software. Elements
may include but are not limited to simulators, code analyzers, test case generators, and path
analyzers, and may also include elements used in the software engineering environment.
3.44 Software transition. The set of activities that enables responsibility for software
development to pass from one organization, usually the organization that performs initial software
development, to another, usually the organization that will perform software support.
3.45 Software unit. An element in the design of a CSCI; for example, a major subdivision of
a CSCI, a component of that subdivision, a class, object, module, function, routine, or database.
Software units may occur at different levels of a hierarchy and may consist of other software
units. Software units in the design may or may not have a one-to-one relationship with the code
and data entities (routines, procedures, databases, data files, etc.) that implement them or with
the computer files containing those entities.
4. GENERAL REQUIREMENTS
4.1 Software development process. The developer shall establish a software development
process consistent with contract requirements. The software development process shall include
the following major activities, which may overlap, may be applied iteratively, may be applied
differently to different elements of software, and need not be performed in the order listed below.
Appendix G provides examples. The developer's software development process shall be
described in the software development plan.
4.2 General requirements for software development. The developer shall meet the following
general requirements in carrying out the detailed requirements in section 5 of this standard.
4.2.1 Software development methods. The developer shall use systematic, documented
methods for all software development activities. These methods shall be described in, or
referenced from, the software development plan.
4.2.2 Standards for software products. The developer shall develop and apply standards for
representing requirements, design, code, test cases, test procedures, and test results. These
standards shall be described in, or referenced from, the software development plan.
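Example: The following Python sketch is illustrative only; the rule checked and all names are hypothetical, and MIL-STD-498 does not prescribe any particular standard or tool. It shows one way a developer might automate part of a documented code standard so that it can be applied consistently.

# Illustrative sketch only (not part of this standard): one way a developer
# might automate part of a project code standard. The assumed standard here
# requires every Python function to have a lower_snake_case name and a
# docstring; real projects would define and check their own standards.
import ast
import sys

def check_file(path: str) -> list[str]:
    """Return a list of code-standard findings for one source file."""
    with open(path, encoding="utf-8") as handle:
        tree = ast.parse(handle.read(), filename=path)
    findings = []
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef):
            if node.name != node.name.lower():
                findings.append(f"{path}:{node.lineno}: '{node.name}' is not lower_snake_case")
            if ast.get_docstring(node) is None:
                findings.append(f"{path}:{node.lineno}: '{node.name}' has no docstring")
    return findings

if __name__ == "__main__":
    findings = [f for p in sys.argv[1:] for f in check_file(p)]
    print("\n".join(findings) if findings else "No findings.")
    sys.exit(1 if findings else 0)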
4.2.3 Reusable software products. The developer shall meet the following requirements.
4.2.3.1 Incorporating reusable software products. The developer shall identify and evaluate
reusable software products for use in fulfilling the requirements of the contract. The scope of the
search and the criteria to be used for evaluation shall be as described in the software
development plan. Reusable software products that meet the criteria shall be used where
practical. Appendix B provides required and candidate criteria and interprets this standard for
incorporation of reusable software products. Incorporated software products shall meet the data
rights requirements in the contract.
4.2.3.2 Developing reusable software products. During the course of the contract, the
developer shall identify opportunities for developing software products for reuse and shall evaluate
the benefits and costs of these opportunities. Opportunities that provide cost benefits and are
compatible with program objectives shall be identified to the acquirer.
Note: In addition, the developer may be required by the contract to develop software products
specifically for reuse.
4.2.4 Handling of critical requirements. The developer shall meet the following requirements.
4.2.4.1 Safety assurance. The developer shall identify as safety-critical those CSCIs or
portions thereof whose failure could lead to a hazardous system state (one that could result in
unintended death, injury, loss of property, or environmental harm). If there is such software, the
developer shall develop a safety assurance strategy, including both tests and analyses, to assure
that the requirements, design, implementation, and operating procedures for the identified
software minimize or eliminate the potential for hazardous conditions. The strategy shall include
a software safety program, which shall be integrated with the system safety program if one exists.
The developer shall record the strategy in the software development plan, implement the strategy,
and produce evidence, as part of required software products, that the safety assurance strategy
has been carried out.
4.2.4.2 Security assurance. The developer shall identify as security-critical those CSCIs or
portions thereof whose failure could lead to a breach of system security. If there is such software,
the developer shall develop a security assurance strategy to assure that the requirements, design,
implementation, and operating procedures for the identified software minimize or eliminate the
potential for breaches of system security. The developer shall record the strategy in the software
development plan, implement the strategy, and produce evidence, as part of required software
products, that the security assurance strategy has been carried out.
4.2.4.3 Privacy assurance. The developer shall identify as privacy-critical those CSCIs or
portions thereof whose failure could lead to a breach of system privacy. If there is such software,
the developer shall develop a privacy assurance strategy to assure that the requirements, design,
implementation, and operating procedures for the identified software minimize or eliminate the
potential for breaches of system privacy. The developer shall record the strategy in the software
development plan, implement the strategy, and produce evidence, as part of required software
products, that the privacy assurance strategy has been carried out.
4.2.4.4 Assurance of other critical requirements. If a system relies on software to satisfy other
requirements deemed critical by the contract or by system specifications, the developer shall
identify those CSCIs or portions thereof whose failure could lead to violation of those critical
requirements; develop a strategy to assure that the requirements, design, implementation, and
operating procedures for the identified software minimize or eliminate the potential for such
violations; record the strategy in the software development plan; implement the strategy; and
produce evidence, as part of required software products, that the assurance strategy has been
carried out.
4.2.5 Computer hardware resource utilization. The developer shall analyze contract requirements concerning computer hardware resource utilization (such as maximum allowable use of
processor capacity, memory capacity, input/output device capacity, auxiliary storage device
capacity, and communications/network equipment capacity). The developer shall allocate
computer hardware resources among the CSCIs, monitor the utilization of these resources for the
duration of the contract, and reallocate or identify the need for additional resources as necessary
to meet contract requirements.
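Example: A minimal Python sketch, illustrative only, of one way to record the hardware resource budget allocated to each CSCI and compare it against measured utilization so that reallocation needs can be identified. The CSCI names, resource categories, and figures are hypothetical.

# Illustrative sketch only: compare each CSCI's measured hardware resource
# utilization against its allocated budget. All names and numbers are
# hypothetical; budgets are expressed as fractions of total capacity.
ALLOCATED = {
    "CSCI-NAV": {"cpu": 0.30, "memory": 0.25},
    "CSCI-DISP": {"cpu": 0.20, "memory": 0.15},
}
MEASURED = {
    "CSCI-NAV": {"cpu": 0.34, "memory": 0.22},
    "CSCI-DISP": {"cpu": 0.18, "memory": 0.16},
}

def utilization_report(allocated, measured):
    """Compare each CSCI's measured utilization against its allocated budget."""
    lines = []
    for csci, budgets in allocated.items():
        for resource, budget in budgets.items():
            actual = measured.get(csci, {}).get(resource, 0.0)
            status = ("over budget - reallocate or identify additional resources"
                      if actual > budget else "within budget")
            lines.append(f"{csci:9s} {resource:7s} budget={budget:.2f} measured={actual:.2f} {status}")
    return lines

print("\n".join(utilization_report(ALLOCATED, MEASURED)))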
4.2.6 Recording rationale. The developer shall record rationale that will be useful to the support
agency for key decisions made in specifying, designing, implementing, and testing the software.
The rationale shall include trade-offs considered, analysis methods, and criteria used to make the
decisions. The rationale shall be recorded in documents, code comments, or other media that
will transition to the support agency. The meaning of "key decisions" and the approach for
providing the rationale shall be described in the software development plan.
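Example: Since 4.2.6 permits rationale to be recorded in code comments that will transition to the support agency, the following Python fragment, illustrative only, shows one possible form for such a record. The unit, the decision, and the figures cited are all hypothetical.

# Illustrative sketch only: rationale for a key design decision recorded as
# a code comment. Everything below (module, algorithm choice, figures) is
# hypothetical.
#
# KEY DECISION (rationale for the support agency):
#   Decision:   use an insertion sort rather than a general-purpose sort for
#               the track-update list.
#   Trade-offs: general sorts are faster for large inputs, but the list is
#               nearly sorted on every update and never exceeds ~50 entries.
#   Analysis:   timing measurements on the (hypothetical) target processor
#               showed insertion sort met the assumed update-time budget.
#   Criteria:   worst-case update time on the target computer, code size.
def insert_track(tracks: list[int], new_track: int) -> None:
    """Insert new_track into the already-sorted tracks list in place."""
    i = len(tracks)
    tracks.append(new_track)
    while i > 0 and tracks[i - 1] > new_track:
        tracks[i] = tracks[i - 1]
        i -= 1
    tracks[i] = new_track

tracks = [3, 7, 12]
insert_track(tracks, 9)
print(tracks)  # [3, 7, 9, 12]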
4.2.7 Access for acquirer review. The developer shall provide the acquirer or its authorized
representative access to developer and subcontractor facilities, including the software engineering
and test environments, for review of software products and activities required by the contract.
5. DETAILED REQUIREMENTS
The order of the requirements in this section is not intended to specify the order in which they
must be carried out. Many of the activities may be ongoing at one time; different software
products may proceed at different paces; and activities specified in early subsections may depend
on input from activities in later subsections. If the software is developed in multiple builds, some
activities may be performed in every build, others may be performed only in selected builds, and
activities and software products may not be complete until several or all builds are accomplished.
Figure 1 provides an example of how each activity may be applied in one or more builds. Non-
mandatory notes throughout section 5 tell how to interpret each activity on a project involving
multiple builds. A project involving a single build will accomplish all required activities in that
build. Appendix G provides guidance for planning builds, determining which activities apply to
each build, and scheduling these activities.
Figure 1 (not reproduced here). Example application of each activity across Builds 1 through 4; rows list the activities, beginning with the integral processes.
5.1 Project planning and oversight. The developer shall perform project planning and
oversight in accordance with the following requirements.
Note: If a system or CSCI is developed in multiple builds, planning for each build should be
interpreted to include: a) overall planning for the contract, b) detailed planning for the current
build, and c) planning for future builds covered under the contract to a level of detail
compatible with the information available.
5.1.1 Software development planning. The developer shall develop and record plans for
conducting the activities required by this standard and by other software-related requirements in
the contract. This planning shall be consistent with system-level planning and shall include all
applicable items in the Software Development Plan (SDP) DID (see 6.2).
Note 1: The wording here and throughout MIL-STD-498 is designed to: 1) Emphasize that
the development and recording of planning and engineering information is an intrinsic part of
the software development process, to be performed regardless of whether a deliverable is
required; 2) Use the DID as a checklist of items to be covered in the planning or engineering
activity; and 3) Permit representations other than traditional documents for recording the
information (e.g., computer-aided software engineering (CASE) tools).
Note 2: If the CDRL specifies delivery of the information generated by this or any other
paragraph, the developer is required to format, assemble, mark, copy, and distribute the
deliverable in accordance with the CDRL. This task is recognized to be separate from the
task of generating and recording the required information and to require additional time and
effort on the part of the developer.
Note 3: The software development plan covers all activities required by this standard.
Portions of the plan may be bound or maintained separately if this approach enhances the
usability of the information. Examples include separate plans for software quality assurance
and software configuration management.
5.1.2 CSCI test planning. The developer shall develop and record plans for conducting CSCI
qualification testing. This planning shall include all applicable items in the Software Test Plan
(STP) DID (see 6.2).
5.1.3 System test planning. The developer shall participate in developing and recording plans
for conducting system qualification testing. For software systems, this planning shall include all
applicable items in the Software Test Plan (STP) DID (see 6.2). (The intent for software systems
is a single software test plan covering both CSCI and system qualification testing.)
5.1.4 Software installation planning. The developer shall develop and record plans for
performing software installation and training at the user sites specified in the contract. This
planning shall include all applicable items in the Software Installation Plan (SIP) DID (see 6.2).
5.1.5 Software transition planning. The developer shall identify all software development
resources that will be needed by the support agency to fulfill the support concept specified in the
contract. The developer shall develop and record plans identifying these resources and
describing the approach to be followed for transitioning deliverable items to the support agency.
This planning shall include all applicable items in the Software Transition Plan (STrP) DID (see
6.2).
5.1.6 Following and updating plans. Following acquirer approval of any of the plans in this
section, the developer shall conduct the relevant activities in accordance with the plan. The
developer's management shall review the software development process at intervals specified in
the software development plan to assure that the process complies with the contract and adheres
to the plans. With the exception of developer-internal scheduling and related staffing information,
updates to plans shall be subject to acquirer approval.
5.2 Establishing a software development environment. The developer shall establish a software development environment in accordance with the following requirements.
5.2.1 Software engineering environment. The developer shall establish, control, and maintain
a software engineering environment to perform the software engineering effort. The developer
shall ensure that each element of the environment performs its intended functions.
5.2.2 Software test environment. The developer shall establish, control, and maintain a software
test environment to perform qualification, and possibly other, testing of software. The developer
shall ensure that each element of the environment performs its intended functions.
5.2.3 Software development library. The developer shall establish, control, and maintain a
software development library (SDL) to facilitate the orderly development and subsequent support
of software. The SDL may be an integral part of the software engineering and test environments.
The developer shall maintain the SDL for the duration of the contract.
5.2.4 Software development files. The developer shall establish, control, and maintain a
software development file (SDF) for each software unit or logically related group of software units,
for each CSCI, and, as applicable, for logical groups of CSCIs, for subsystems, and for the overall
system. The developer shall record information about the development of the software in
appropriate SDFs and shall maintain the SDFs for the duration of the contract.
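Example: A minimal Python sketch of one possible on-disk organization for a software development file. It is illustrative only; MIL-STD-498 does not mandate any SDF format, and the directory names and unit name below are hypothetical.

# Illustrative sketch only: one possible SDF layout, created as a directory
# skeleton per software unit or group of units.
from pathlib import Path

SDF_SECTIONS = [
    "design_considerations",   # considerations, rationale, constraints (3.34)
    "unit_test_cases",         # developer-internal test information (5.7.2)
    "unit_test_results",       # recorded test and analysis results (5.7.5)
    "schedule_and_status",     # schedule and status information (3.34)
]

def create_sdf(root: Path, name: str) -> Path:
    """Create an empty SDF skeleton for one software unit or group of units."""
    sdf = root / "SDF" / name
    for section in SDF_SECTIONS:
        (sdf / section).mkdir(parents=True, exist_ok=True)
    (sdf / "README.txt").write_text(
        f"Software development file for {name}.\n"
        "Material may be included directly or by reference.\n"
    )
    return sdf

if __name__ == "__main__":
    print(create_sdf(Path("."), "CSCI-NAV_unit-track-filter"))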
5.2.5 Non-deliverable software. The developer may use non-deliverable software in the
development of deliverable software as long as the operation and support of the deliverable
software after delivery to the acquirer do not depend on the non-deliverable software or provision
is made to ensure that the acquirer has or can obtain the same software. The developer shall
ensure that all non-deliverable software used on the project performs its intended functions.
5.3 System requirements analysis. The developer shall participate in system requirements
analysis in accordance with the following requirements.
Note: If a system is developed in multiple builds, its requirements may not be fully defined
until the final build. The developer's planning should identify the subset of system
requirements to be defined in each build and the subset to be implemented in each build.
System requirements analysis for a given build should be interpreted to mean defining the
system requirements so identified for that build.
5.3.1 Analysis of user input. The developer shall participate in analyzing user input provided
by the acquirer to gain an understanding of user needs. This input may take the form of need
statements, surveys, problem/change reports, feedback on prototypes, interviews, or other user
input or feedback.
5.3.2 Operational concept. The developer shall participate in defining and recording the
operational concept for the system. The result shall include all applicable items in the Operational
Concept Description (OCD) DID (see 6.2).
5.3.3 System requirements. The developer shall participate in defining and recording the
requirements to be met by the system and the methods to be used to ensure that each
requirement has been met. The result shall include all applicable items in the System/Subsystem
Specification (SSS) DID (see 6.2). Depending on CDRL provisions, requirements concerning
system interfaces may be included in the SSS or in interface requirements specifications (IRSs).
5.4 System design. The developer shall participate in system design in accordance with the
following requirements.
Note: If a system is developed in multiple builds, its design may not be fully defined until the
final build. The developer's planning should identify the portion of the system design to be
defined in each build. System design for a given build should be interpreted to mean defining
the portion of the system design identified for that build.
5.4.1 System-wide design decisions. The developer shall participate in defining and recording
system-wide design decisions (that is, decisions about the system's behavioral design and other
decisions affecting the selection and design of system components). The result shall include all
applicable items in the system-wide design section of the System/Subsystem Design Description
(SSDD) DID (see 6.2). Depending on CDRL provisions, design pertaining to interfaces may be
included in the SSDD or in interface design descriptions (IDDs) and design pertaining to
databases may be included in the SSDD or in database design descriptions (DBDDs).
Note: Design decisions remain at the discretion of the developer unless formally converted
to requirements through contractual processes. The developer is responsible for fulfilling all
requirements and demonstrating this fulfillment through qualification testing (see 5.9, 5.11).
Design decisions act as developer-internal "requirements," to be implemented, imposed on
subcontractors, if applicable, and confirmed by developer-internal testing, but their fulfillment
need not be demonstrated to the acquirer.
5.4.2 System architectural design. The developer shall participate in defining and recording the
architectural design of the system (identifying the components of the system, their interfaces, and
a concept of execution among them) and the traceability between the system components and
system requirements. The result shall include all applicable items in the architectural design and
traceability sections of the System/Subsystem Design Description (SSDD) DID (see 6.2).
Depending on CDRL provisions, design pertaining to interfaces may be included in the SSDD or
in interface design descriptions (IDDs).
5.5 Software requirements analysis. The developer shall define and record the software
requirements to be met by each CSCI, the methods to be used to ensure that each requirement
has been met, and the traceability between the CSCI requirements and system requirements.
The result shall include all applicable items in the Software Requirements Specification (SRS) DID
(see 6.2). Depending on CDRL provisions, requirements concerning CSCI interfaces may be
included in SRSs or in interface requirements specifications (IRSs).
Note: If a CSCI is developed in multiple builds, its requirements may not be fully defined until
the final build. The developer's planning should identify the subset of each CSCI's
requirements to be defined in each build and the subset to be implemented in each build.
Software requirements analysis for a given build should be interpreted to mean defining the
CSCI requirements so identified for that build.
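Example: A minimal Python sketch of one way to record the CSCI requirements called for in 5.5 together with their qualification method, their trace to system requirements, and the build in which each is to be implemented. It is illustrative only; all requirement identifiers and text are hypothetical.

# Illustrative sketch only: a simple CSCI requirement record with the
# traceability and build planning information described in 5.5 and its note.
from dataclasses import dataclass

@dataclass
class CsciRequirement:
    req_id: str           # SRS requirement identifier
    text: str             # requirement statement
    traces_to: list[str]  # parent system (SSS) requirements
    qualification: str    # method to ensure the requirement is met
    build: int            # build in which the requirement is to be implemented

srs = [
    CsciRequirement("SRS-101", "Report position at 1 Hz.", ["SSS-012"], "test", 1),
    CsciRequirement("SRS-102", "Log faults to the maintenance file.", [], "inspection", 2),
]

# Every CSCI requirement should trace to at least one system requirement.
untraced = [r.req_id for r in srs if not r.traces_to]
print("CSCI requirements with no parent system requirement:", untraced or "none")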
5.6 Software design. The developer shall perform software design in accordance with the
following requirements.
Note: If a CSCI is developed in multiple builds, its design may not be fully defined until the
final build. Software design in each build should be interpreted to mean the design necessary
to meet the CSCI requirements to be implemented in that build.
5.6.1 CSCI-wide design decisions. The developer shall define and record CSCI-wide design
decisions (that is, decisions about the CSCI's behavioral design and other decisions affecting the
selection and design of the software units comprising the CSCI). The result shall include all
applicable items in the CSCI-wide design section of the Software Design Description (SDD) DID
(see 6.2). Depending on CDRL provisions, design pertaining to interfaces may be included in
SDDs or in interface design descriptions (IDDs) and design pertaining to databases may be
included in SDDs or in database design descriptions (DBDDs).
5.6.2 CSCI architectural design. The developer shall define and record the architectural design
of each CSCI (identifying the software units comprising the CSCI, their interfaces, and a concept
of execution among them) and the traceability between the software units and the CSCI
requirements. The result shall include all applicable items in the architectural design and
traceability sections of the Software Design Description (SDD) DID (see 6.2). Depending on
CDRL provisions, design pertaining to interfaces may be included in SDDs or in interface design
descriptions (IDDs).
Note: Software units may be made up of other software units and may be organized into as
many levels as are needed to represent the CSCI architecture. For example, a CSCI may
be divided into three software units, each of which is divided into additional software units,
and so on.
5.6.3 CSCI detailed design. The developer shall develop and record a description of each
software unit. The result shall include all applicable items in the detailed design section of the
Software Design Description (SDD) DID (see 6.2). Depending on CDRL provisions, design
pertaining to interfaces may be included in SDDs or in interface design descriptions (IDDs) and
design of software units that are databases or that access or manipulate databases may be
included in SDDs or in database design descriptions (DBDDs).
5.7 Software implementation and unit testing. The developer shall perform software
implementation and unit testing in accordance with the following requirements.
Note: The term "software" includes both computer programs and computer databases. The
term "implementation" means converting software design into computer programs and
computer databases. If a CSCI is developed in multiple builds, software implementation and
unit testing of that CSCI will not be completed until the final build. Software implementation
and unit testing in each build should be interpreted to include those units, or parts of units,
needed to meet the CSCI requirements to be implemented in that build.
5.7.1 Software implementation. The developer shall develop and record software corresponding
to each software unit in the CSCI design. This activity shall include, as applicable, coding
computer instructions and data definitions, building databases, populating databases and other
data files with data values, and other activities needed to implement the design. For deliverable
software, the developer shall obtain acquirer approval to use any programming language not
specified in the contract.
Note: Software units in the design may or may not have a one-to-one relationship with the
code and data entities (routines, procedures, databases, data files, etc.) that implement them
or with the computer files containing those entities.
5.7.2 Preparing for unit testing. The developer shall establish test cases (in terms of inputs,
expected results, and evaluation criteria), test procedures, and test data for testing the software
corresponding to each software unit. The test cases shall cover all aspects of the unit's detailed
design. The developer shall record this information in the appropriate software development files
(SDFs).
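Example: A minimal Python sketch showing unit test cases recorded in terms of inputs, expected results, and evaluation criteria, and executed with a standard test framework. It is illustrative only; the unit under test and all values are hypothetical.

# Illustrative sketch only: unit test cases for one (hypothetical) software
# unit, recorded as (id, input, expected result, evaluation criterion) and
# run with unittest; results would be recorded in the unit's SDF (5.7.5).
import unittest

def scale_altitude(feet: float) -> float:
    """Hypothetical software unit: convert altitude in feet to metres."""
    return feet * 0.3048

UNIT_TEST_CASES = [
    ("UT-ALT-001", 1000.0, 304.8, "within 0.01 m of expected"),
    ("UT-ALT-002", 0.0, 0.0, "exact"),
]

class ScaleAltitudeUnitTests(unittest.TestCase):
    def test_cases(self):
        for test_id, inp, expected, criterion in UNIT_TEST_CASES:
            with self.subTest(test_id=test_id, criterion=criterion):
                self.assertAlmostEqual(scale_altitude(inp), expected, delta=0.01)

if __name__ == "__main__":
    unittest.main()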
5.7.3 Performing unit testing. The developer shall test the software corresponding to each
software unit. The testing shall be in accordance with the unit test cases and procedures.
5.7.4 Revision and retesting. The developer shall make all necessary revisions to the software,
perform all necessary retesting, and update the software development files (SDFs) and other
software products as needed, based on the results of unit testing.
5.7.5 Analyzing and recording unit test results. The developer shall analyze the results of unit
testing and shall record the test and analysis results in appropriate software development files
(SDFs).
5.8 Unit integration and testing. The developer shall perform unit integration and testing in
accordance with the following requirements.
Note 1: Unit integration and testing means integrating the software corresponding to two or
more software units, testing the resulting software to ensure that it works together as
intended, and continuing this process until all software in each CSCI is integrated and tested.
The last stage of this testing is developer-internal CSCI testing. Since units may consist of
other units, some unit integration and testing may take place during unit testing. The
requirements in this section are not meant to duplicate those activities.
Note 2: If a CSCI is developed in multiple builds, unit integration and testing of that CSCI will
not be completed until the final build. Unit integration and testing in each build should be
interpreted to mean integrating software developed in the current build with other software
developed in that and previous builds, and testing the results.
5.8.1 Preparing for unit integration and testing. The developer shall establish test cases (in
terms of inputs, expected results, and evaluation criteria), test procedures, and test data for
conducting unit integration and testing. The test cases shall cover all aspects of the CSCI-wide
and CSCI architectural design. The developer shall record this information in the appropriate
software development files (SDFs).
5.8.2 Performing unit integration and testing. The developer shall perform unit integration and
testing. The testing shall be in accordance with the unit integration test cases and procedures.
5.8.3 Revision and retesting. The developer shall make all necessary revisions to the software,
perform all necessary retesting, and update the software development files (SDFs) and other
software products as needed, based on the results of unit integration and testing.
5.8.4 Analyzing and recording unit integration and test results. The developer shall analyze the
results of unit integration and testing and shall record the test and analysis results in appropriate
software development files (SDFs).
5.9 CSCI qualification testing. The developer shall perform CSCI qualification testing in
accordance with the following requirements.
Note 1: CSCI qualification testing is performed to demonstrate to the acquirer that CSCI
requirements have been met. It covers the CSCI requirements in software requirements
specifications (SRSs) and in associated interface requirements specifications (IRSs). This
testing contrasts with developer-internal CSCI testing, performed as the final stage of unit
integration and testing.
Note 2: If a CSCI is developed in multiple builds, its CSCI qualification testing will not be
completed until the final build for that CSCI, or possibly until later builds involving items with
which the CSCI is required to interface. CSCI qualification testing in each build should be
interpreted to mean planning and performing tests of the current build of each CSCI to ensure
that the CSCI requirements to be implemented in that build have been met.
5.9.1 Independence in CSCI qualification testing. The person(s) responsible for qualification
testing of a given CSCI shall not be the persons who performed detailed design or implementation
of that CSCI. This does not preclude persons who performed detailed design or implementation
of the CSCI from contributing to the process, for example, by contributing test cases that rely on
knowledge of the CSCI's internal implementation.
5.9.2 Testing on the target computer system. CSCI qualification testing shall include testing on
the target computer system or an alternative system approved by the acquirer.
5.9.3 Preparing for CSCI qualification testing. The developer shall define and record the test
preparations, test cases, and test procedures to be used for CSCI qualification testing and the
traceability between the test cases and the CSCI requirements. The result shall include all
applicable items in the Software Test Description (STD) DID (see 6.2). The developer shall
prepare the test data needed to carry out the test cases and provide the acquirer advance notice
of the time and location of CSCI qualification testing.
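Example: A minimal Python sketch of recording the traceability between CSCI qualification test cases and CSCI requirements and flagging any requirement not yet addressed by a test case. It is illustrative only; all identifiers are hypothetical.

# Illustrative sketch only: test-case-to-requirement traceability (5.9.3)
# with a report of requirements that have no qualification test case.
CSCI_REQUIREMENTS = ["SRS-101", "SRS-102", "SRS-103"]

TEST_CASES = {
    # qualification test case -> CSCI requirements it addresses
    "QT-001": ["SRS-101"],
    "QT-002": ["SRS-101", "SRS-103"],
}

def coverage(requirements, test_cases):
    """Return ({requirement: [test cases]}, [uncovered requirements])."""
    matrix = {req: [] for req in requirements}
    for case, reqs in test_cases.items():
        for req in reqs:
            matrix.setdefault(req, []).append(case)
    uncovered = [req for req in requirements if not matrix[req]]
    return matrix, uncovered

matrix, uncovered = coverage(CSCI_REQUIREMENTS, TEST_CASES)
print("Traceability:", matrix)
print("Requirements with no qualification test case:", uncovered)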
5.9.4 Dry run of CSCI qualification testing. If CSCI qualification testing is to be witnessed by
the acquirer, the developer shall dry run the CSCI test cases and procedures to ensure that they
are complete and accurate and that the software is ready for witnessed testing. The developer
shall record the results of this activity in appropriate software development files (SDFs) and shall
update the CSCI test cases and procedures as appropriate.
5.9.5 Performing CSCI qualification testing. The developer shall perform CSCI qualification
testing of each CSCI. The testing shall be in accordance with the CSCI test cases and
procedures.
5.9.6 Revision and retesting. The developer shall make necessary revisions to the software,
provide the acquirer advance notice of retesting, conduct all necessary retesting, and update the
software development files (SDFs) and other software products as needed, based on the results
of CSCI qualification testing.
5.9.7 Analyzing and recording CSCI qualification test results. The developer shall analyze and
record the results of CSCI qualification testing. The results shall include all applicable items in
the Software Test Report (STR) DID (see 6.2).
5.10 CSCI/HWCI integration and testing. The developer shall participate in CSCI/HWCI
integration and testing activities in accordance with the following requirements.
Note 1: CSCI/HWCI integration and testing means integrating CSCIs with interfacing HWCIs
and CSCIs, testing the resulting groupings to determine whether they work together as
intended, and continuing this process until all CSCIs and HWCIs in the system are integrated
and tested. The last stage of this testing is developer-internal system testing.
5.10.1 Preparing for CSCI/HWCI integration and testing. The developer shall participate in
developing and recording test cases (in terms of inputs, expected results, and evaluation criteria),
test procedures, and test data for conducting CSCI/HWCI integration and testing. The test cases
shall cover all aspects of the system-wide and system architectural design. The developer shall
record software-related information in appropriate software development files (SDFs).
5.10.2 Performing CSCI/HWCI integration and testing. The developer shall participate in
CSCI/HWCI integration and testing. The testing shall be in accordance with the CSCI/HWCI
integration test cases and procedures.
5.10.3 Revision and retesting. The developer shall make necessary revisions to the software,
participate in all necessary retesting, and update the appropriate software development files
(SDFs) and other software products as needed, based on the results of CSCI/HWCI integration
and testing.
5.10.4 Analyzing and recording CSCI/HWCI integration and test results. The developer shall
participate in analyzing the results of CSCI/HWCI integration and testing. Software-related
analysis and test results shall be recorded in appropriate software development files (SDFs).
5.11 System qualification testing. The developer shall participate in system qualification testing
in accordance with the following requirements.
Note 1: System qualification testing is performed to demonstrate to the acquirer that system
requirements have been met. It covers the system requirements in the system/subsystem
specifications (SSSs) and in associated interface requirements specifications (IRSs). This
testing contrasts with developer-internal system testing, performed as the final stage of
CSCI/HWCI integration and testing.
5.11.1 Independence in system qualification testing. The person(s) responsible for fulfilling the
requirements in this section shall not be the persons who performed detailed design or
implementation of software in the system. This does not preclude persons who performed
detailed design or implementation of software in the system from contributing to the process, for
example, by contributing test cases that rely on knowledge of the system's internal
implementation.
5.11.2 Testing on the target computer system. The developer's system qualification testing shall
include testing on the target computer system or an alternative system approved by the acquirer.
5.11.3 Preparing for system qualification testing. The developer shall participate in developing
and recording the test preparations, test cases, and test procedures to be used for system
qualification testing and the traceability between the test cases and the system requirements. For
software systems, the results shall include all applicable items in the Software Test Description
(STD) DID (see 6.2). The developer shall participate in preparing the test data needed to carry
out the test cases and in providing the acquirer advance notice of the time and location of system
qualification testing.
5.11.4 Dry run of system qualification testing. If system qualification testing is to be witnessed
by the acquirer, the developer shall participate in dry running the system test cases and
procedures to ensure that they are complete and accurate and that the system is ready for
witnessed testing. The developer shall record the software-related results of this activity in
appropriate software development files (SDFs) and shall participate in updating the system test
cases and procedures as appropriate.
5.11.5 Performing system qualification testing. The developer shall participate in system
qualification testing. This participation shall be in accordance with the system test cases and
procedures.
5.11.6 Revision and retesting. The developer shall make necessary revisions to the software,
provide the acquirer advance notice of retesting, participate in all necessary retesting, and update
the software development files (SDFs) and other software products as needed, based on the
results of system qualification testing.
5.11.7 Analyzing and recording system qualification test results. The developer shall participate
in analyzing and recording the results of system qualification testing. For software systems, the
result shall include all applicable items in the Software Test Report (STR) DID (see 6.2).
5.12 Preparing for software use. The developer shall prepare for software use in accordance
with the following requirements.
Note: If software is developed in multiple builds, the developer's planning should identify what
software, if any, is to be fielded to users in each build and the extent of fielding (for example,
full fielding or fielding to selected evaluators only). Preparing for software use in each build
should be interpreted to include those activities necessary to carry out the fielding plans for
that build.
5.12.1 Preparing the executable software. The developer shall prepare the executable software
for each user site, including any batch files, command files, data files, or other software files
needed to install and operate the software on its target computer(s). The result shall include all
applicable items in the executable software section of the Software Product Specification (SPS)
DID (see 6.2).
Note: To order only the executable software (delaying delivery of source files and associated
support information to a later build), the acquirer can use the SPS DID, tailoring out all but
the executable software section of that DID.
5.12.2 Preparing version descriptions for user sites. The developer shall identify and record the
exact version of software prepared for each user site. The information shall include all applicable
items in the Software Version Description (SVD) DID (see 6.2).
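Example: A minimal Python sketch that identifies the exact version of software prepared for a user site by listing each delivered file with its size and SHA-256 hash, one form such a version record could take. It is illustrative only; the site name and directory are hypothetical, and the SVD DID governs what the delivered record must contain.

# Illustrative sketch only: build a plain-text inventory of every file in a
# (hypothetical) release directory as part of identifying the exact version
# of software prepared for a user site.
import hashlib
from pathlib import Path

def version_description(release_dir: str, site: str) -> str:
    """Build a plain-text inventory of every file under release_dir."""
    lines = [f"Software version inventory - site: {site}"]
    for path in sorted(Path(release_dir).rglob("*")):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            lines.append(f"{path}  {path.stat().st_size} bytes  sha256={digest}")
    return "\n".join(lines)

if __name__ == "__main__":
    print(version_description(".", "User Site A"))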
5.12.3 Preparing user manuals. The developer shall prepare user manuals in accordance with
the following requirements.
Note: Few, if any, systems will need all of the manuals in this section. The intent is for the
acquirer, with input from the developer, to determine which manuals are appropriate for a
given system and to require the development of only those manuals. All DIDs permit
substitution of commercial or other manuals that contain the required information. The
manuals in this section are normally developed in parallel with software development, ready
for use in CSCI testing.
5.12.3.1 Software user manuals. The developer shall identify and record information needed
by hands-on users of the software (persons who will both operate the software and make use of
its results). The information shall include all applicable items in the Software User Manual (SUM)
DID (see 6.2).
5.12.3.2 Software input/output manuals. The developer shall identify and record information
needed by persons who will submit inputs to, and receive outputs from, the software, relying on
others to operate the software in a computer center or other centralized or networked software
installation. The information shall include all applicable items in the Software Input/Output Manual
(SIOM) DID (see 6.2).
5.12.3.3 Software center operator manuals. The developer shall identify and record information
needed by persons who will operate the software in a computer center or other centralized or
networked software installation, so that it can be used by others. The information shall include
all applicable items in the Software Center Operator Manual (SCOM) DID (see 6.2).
5.12.3.4 Computer operation manuals. The developer shall identify and record information
needed to operate the computers on which the software will run. The information shall include
all applicable items in the Computer Operation Manual (COM) DID (see 6.2).
5.12.4 Installation at user sites. The developer shall:
a. Install and check out the executable software at the user sites specified in the contract.
5.13 Preparing for software transition. The developer shall prepare for software transition in
accordance with the following requirements.
Note: If software is developed in multiple builds, the developer's planning should identify what
software, if any, is to be transitioned to the support agency in each build. Preparing for
software transition in each build should be interpreted to include those activities necessary
to carry out the transition plans for that build.
5.13.1 Preparing the executable software. The developer shall prepare the executable software
to be transitioned to the support site, including any batch files, command files, data files, or other
software files needed to install and operate the software on its target computer(s). The result
shall include all applicable items in the executable software section of the Software Product
Specification (SPS) DID (see 6.2).
5.13.2 Preparing source files. The developer shall prepare the source files to be transitioned to
the support site, including any batch files, command files, data files, or other files needed to
regenerate the executable software. The result shall include all applicable items in the source
file section of the Software Product Specification (SPS) DID (see 6.2).
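Note: Part of preparing source files for transition is confirming that the executable software can be regenerated from them. The sketch below (Python; illustrative only) rebuilds a CSCI from C source files using a generic compiler command; the compiler name, flags, and file layout are assumptions, not requirements of this standard.

    import subprocess
    from pathlib import Path

    def regenerate(source_dir: str, build_dir: str) -> Path:
        """Rebuild the executable software from the transitioned source files."""
        src, out = Path(source_dir), Path(build_dir)
        out.mkdir(parents=True, exist_ok=True)
        objects = []
        for c_file in sorted(src.rglob("*.c")):
            obj = out / (c_file.stem + ".o")
            subprocess.run(["cc", "-c", str(c_file), "-o", str(obj)], check=True)  # compile
            objects.append(str(obj))
        exe = out / "csci_executable"
        subprocess.run(["cc", *objects, "-o", str(exe)], check=True)  # link
        return exe

In practice, the batch or command files delivered with the source files (5.13.2) would capture the actual regeneration steps.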
5.13.3 Preparing version descriptions for the support site. The developer shall identify and record
the exact version of software prepared for the support site. The information shall include all
applicable items in the Software Version Description (SVD) DID (see 6.2).
5.13.4 Preparing the "as built" CSCI design and related information. The developer shall update
the design description of each CSCI to match the "as built" software and shall define and record:
the methods to be used to verify copies of the software, the measured computer hardware
resource utilization for the CSCI, other information needed to support the software, and
traceability between the CSCI's source files and software units and between the computer
hardware resource utilization measurements and the CSCI requirements concerning them. The
result shall include all applicable items in the qualification, software support, and traceability
sections of the Software Product Specification (SPS) DID (see 6.2).
Note: In hardware development, the final product is an approved design from which hardware
items can be manufactured. This design is presented in the product specification. In software
development, by contrast, the final product is the software, not its design, and "manufacturing"
consists of electronically duplicating the software, not recreating it from the design. The "as
built" design is included in the software product specification not as the product but as
information that may help the support agency understand the software in order to modify,
enhance, and otherwise support it.
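Note: Two of the items recorded under 5.13.4, traceability between the CSCI's source files and its software units and the measured computer hardware resource utilization, lend themselves to simple automated checks. The sketch below (Python; illustrative only, with hypothetical unit names and resource figures) flags units with no recorded source files and resources whose measured utilization exceeds the CSCI requirement.

    def as_built_checks(unit_to_sources: dict, measured: dict, limits: dict) -> list:
        """Return discrepancies in source-file traceability and resource utilization."""
        problems = []
        for unit, sources in unit_to_sources.items():
            if not sources:
                problems.append(f"unit {unit}: no source files recorded")
        for resource, used in measured.items():
            limit = limits.get(resource)
            if limit is not None and used > limit:
                problems.append(f"{resource}: measured {used}% exceeds the {limit}% requirement")
        return problems

    # Example (hypothetical data):
    # as_built_checks({"nav_filter": ["nav_filter.c"], "display_mgr": []},
    #                 {"cpu": 62.0, "memory": 81.0}, {"cpu": 70.0, "memory": 75.0})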
5.13.5 Updating the system design description. The developer shall participate in updating the
system design description to match the "as built" system. The result shall include all applicable
items in the System/Subsystem Design Description (SSDD) DID (see 6.2).
5.13.6 Preparing support manuals. The developer shall prepare support manuals in accordance
with the following requirements.
Note: Not all systems will need the manuals in this section. The intent is for the acquirer,
with input from the developer, to determine which manuals are appropriate for a given system
and to require the development of only those manuals. All DIDs permit substitution of
commercial or other manuals that contain the required information. The manuals in this
section supplement the system/subsystem design description (SSDD) and the software
product specifications (SPSs), which serve as the primary sources of information for software
support. The user manuals cited in 5.12.3 are also useful to support personnel.
5.13.6.1 Computer programming manuals. The developer shall identify and record information
needed to program the computers on which the software was developed or on which it will run.
The information shall include all applicable items in the Computer Programming Manual (CPM)
DID (see 6.2).
5.13.6.2 Firmware support manuals. The developer shall identify and record information
needed to program and reprogram any firmware devices in which the software will be installed.
The information shall include all applicable items in the Firmware Support Manual (FSM) DID (see
6.2).
5.13.7 Transition to the designated support site. The developer shall:
a. Install and check out the deliverable software in the support environment designated in
the contract.
5.14 Software configuration management. The developer shall perform software configuration
management in accordance with the following requirements.
Note: If a system or CSCI is developed in multiple builds, the software products of each build
may be refinements of, or additions to, software products of previous builds. Software
configuration management in each build should be understood to take place in the context of
the software products and controls in place at the start of the build.
5.14.2 Configuration control. The developer shall establish and implement procedures
designating the levels of control each identified entity must pass through (for example, author
control, project-level control, acquirer control); the persons or groups with authority to authorize
changes and to make changes at each level (for example, the programmer/analyst, the software
lead, the project manager, the acquirer); and the steps to be followed to request authorization for
changes, process change requests, track changes, distribute changes, and maintain past
versions. Changes that affect an entity already under acquirer control shall be proposed to the
acquirer in accordance with contractually established forms and procedures, if any.
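Note: The sketch below (Python; illustrative only) shows one way to represent the levels of control and the authority to authorize changes at each level described in 5.14.2. The role names mirror the examples in that paragraph; an actual project would define its own levels, roles, and change-request workflow in the software development plan.

    from enum import IntEnum

    class ControlLevel(IntEnum):
        AUTHOR = 1      # author control
        PROJECT = 2     # project-level control
        ACQUIRER = 3    # acquirer control

    # Who may authorize changes to an entity at each level (roles are illustrative).
    CHANGE_AUTHORITY = {
        ControlLevel.AUTHOR: {"programmer_analyst"},
        ControlLevel.PROJECT: {"software_lead", "project_manager"},
        ControlLevel.ACQUIRER: {"acquirer"},
    }

    def may_authorize(role: str, entity_level: ControlLevel) -> bool:
        """A change must be authorized by a role designated for the entity's control level."""
        return role in CHANGE_AUTHORITY[entity_level]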
5.14.3 Configuration status accounting. The developer shall prepare and maintain records of the
configuration status of all entities that have been placed under project-level or higher configuration
control. These records shall be maintained for the life of the contract. They shall include, as
applicable, the current version/revision/release of each entity, a record of changes to the entity
since being placed under project-level or higher configuration control, and the status of
problem/change reports affecting the entity.
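Note: A configuration status accounting record, as described in 5.14.3, can be as simple as the structure sketched below (Python; illustrative only). It keeps the current version of an entity, its change history since being placed under control, and the problem/change reports affecting it.

    from dataclasses import dataclass, field

    @dataclass
    class StatusRecord:
        entity: str                 # e.g. a source file, document, or CSCI
        current_version: str
        change_history: list = field(default_factory=list)   # (old, new, change id) tuples
        open_reports: list = field(default_factory=list)      # problem/change reports affecting it

        def apply_change(self, new_version: str, change_id: str) -> None:
            """Record a change and advance the current version."""
            self.change_history.append((self.current_version, new_version, change_id))
            self.current_version = new_version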
5.14.4 Configuration audits. The developer shall support acquirer-conducted configuration audits
as specified in the contract.
Note: These configuration audits may be called Functional Configuration Audits and Physical
Configuration Audits.
5.14.5 Packaging, storage, handling, and delivery. The developer shall establish and implement
procedures for the packaging, storage, handling, and delivery of deliverable software products.
The developer shall maintain master copies of delivered software products for the duration of the
contract.
5.15 Software product evaluation. The developer shall perform software product evaluation in
accordance with the following requirements.
Note: If a system or CSCI is developed in multiple builds, the software products of each build
should be evaluated in the context of the objectives established for that build. A software
product that meets those objectives can be considered satisfactory even though it is missing
information designated for development in later builds.
5.15.1 In-process and final software product evaluations. The developer shall perform in-process
evaluations of the software products generated in carrying out the requirements of this standard.
In addition, the developer shall perform a final evaluation of each deliverable software product
before its delivery. The software products to be evaluated, criteria to be used, and definitions for
those criteria are given in Appendix D.
5.15.2 Software product evaluation records. The developer shall prepare and maintain records
of each software product evaluation. These records shall be maintained for the life of the
contract. Problems in software products under project-level or higher configuration control shall
be handled as described in 5.17 (Corrective action).
5.15.3 Independence in software product evaluation. The persons responsible for evaluating a
software product shall not be the persons who developed the product. This does not preclude
the persons who developed the software product from taking part in the evaluation (for example,
as participants in a walk-through of the product).
5.16 Software quality assurance. The developer shall perform software quality assurance in
accordance with the following requirements.
Note: If a system or CSCI is developed in multiple builds, the activities and software products
of each build should be evaluated in the context of the objectives established for that build.
An activity or software product that meets those objectives can be considered satisfactory
even though it is missing aspects designated for later builds. Planning for software quality
assurance is included in software development planning (see 5.1.1).
5.16.1 Software quality assurance evaluations. The developer shall conduct on-going evaluations
of software development activities and the resulting software products to:
a. Assure that each activity required by the contract or described in the software
development plan is being performed in accordance with the contract and with the
software development plan.
b. Assure that each software product required by this standard or by other contract
provisions exists and has undergone software product evaluations, testing, and corrective
action as required by this standard and by other contract provisions.
5.16.2 Software quality assurance records. The developer shall prepare and maintain records
of each software quality assurance activity. These records shall be maintained for the life of the
contract. Problems in software products under project-level or higher configuration control and
problems in activities required by the contract or described in the software development plan shall
be handled as described in 5.17 (Corrective action).
5.16.3 Independence in software quality assurance. The persons responsible for conducting
software quality assurance evaluations shall not be the persons who developed the software
product, performed the activity, or are responsible for the software product or activity. This does
not preclude such persons from taking part in these evaluations. The persons responsible for
assuring compliance with the contract shall have the resources, responsibility, authority, and
organizational freedom to permit objective software quality assurance evaluations and to initiate
and verify corrective actions.
5.17 Corrective action. The developer shall perform corrective action in accordance with the
following requirements.
5.17.2 Corrective action system. The developer shall implement a corrective action system for
handling each problem detected in software products under project-level or higher configuration
control and each problem in activities required by the contract or described in the software
development plan. The system shall meet the following requirements:
b. The system shall be closed-loop, ensuring that all detected problems are promptly
reported and entered into the system, action is initiated on them, resolution is achieved,
status is tracked, and records of the problems are maintained for the life of the contract.
c. Each problem shall be classified by category and priority, using the categories and
priorities in Appendix C or approved alternatives.
e. Corrective actions shall be evaluated to determine whether problems have been resolved,
adverse trends have been reversed, and changes have been correctly implemented
without introducing additional problems.
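Note: The sketch below (Python; illustrative only, not a requirement of this standard) shows the core of a closed-loop corrective action system: every problem report carries a category and priority (Appendix C) and a status that is tracked until resolution. The category string and report identifier shown are hypothetical.

    from dataclasses import dataclass, field

    @dataclass
    class ProblemReport:
        identifier: str
        category: str        # per Appendix C, Figure 4 (software products) or Figure 1 (activities)
        priority: int        # per Appendix C, Figure 5
        status: str = "open"
        history: list = field(default_factory=list)   # retained for the life of the contract

        def update(self, new_status: str, note: str) -> None:
            """Closed-loop tracking: every status change is recorded, not discarded."""
            self.history.append((self.status, new_status, note))
            self.status = new_status

    # Example: pr = ProblemReport("PR-0042", "design", 2); pr.update("closed", "fixed in build 3")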
5.18 Joint technical and management reviews. The developer shall plan and take part in joint
(acquirer/developer) technical and management reviews in accordance with the following
requirements.
Note: If a system or CSCI is developed in multiple builds, the types of joint reviews held and
the criteria applied will depend on the objectives of each build. Software products that meet
those objectives can be considered satisfactory even though they are missing information
designated for development in later builds.
5.18.1 Joint technical reviews. The developer shall plan and take part in joint technical reviews
at locations and dates proposed by the developer and approved by the acquirer. These reviews
shall be attended by persons with technical knowledge of the software products to be reviewed.
The reviews shall focus on in-process and final software products, rather than materials generated
especially for the review. The reviews shall have the following objectives:
a. Review evolving software products, using as criteria the software product evaluation
criteria in Appendix D; review and demonstrate proposed technical solutions; provide
insight and obtain feedback on the technical effort; surface and resolve technical issues.
b. Review project status; surface near- and long-term risks regarding technical, cost, and
schedule issues.
c. Arrive at agreed-upon mitigation strategies for identified risks, within the authority of those
present.
5.18.2 Joint management reviews. The developer shall plan and take part in joint management
reviews at locations and dates proposed by the developer and approved by the acquirer. These
reviews shall be attended by persons with authority to make cost and schedule decisions and
shall have the following objectives. Examples of such reviews are identified in Appendix E.
a. Keep management informed about project status, directions being taken, technical
agreements reached, and overall status of evolving software products.
c. Arrive at agreed-upon mitigation strategies for near- and long-term risks that could not be
resolved at joint technical reviews.
d. Identify and resolve management-level issues and risks not raised at joint technical
reviews.
e. Obtain commitments and acquirer approvals needed for timely accomplishment of the
project.
5.19 Other activities. The developer shall perform the following activities.
5.19.1 Risk management. The developer shall perform risk management throughout the software
development process. The developer shall identify, analyze, and prioritize the areas of the
software development project that involve potential technical, cost, or schedule risks; develop
strategies for managing those risks; record the risks and strategies in the software development
plan; and implement the strategies in accordance with the plan.
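Note: A risk register like the one sketched below (Python; illustrative only, with hypothetical risk areas and values) is one simple way to record and prioritize the technical, cost, and schedule risks and the handling strategies called for in 5.19.1.

    def prioritize(risks: list) -> list:
        """Rank identified risks by exposure (likelihood times impact), highest first."""
        return sorted(risks, key=lambda r: r["likelihood"] * r["impact"], reverse=True)

    register = [
        {"area": "unproven compiler version", "type": "technical",
         "likelihood": 0.4, "impact": 5, "strategy": "prototype critical units early"},
        {"area": "late delivery of interfacing system", "type": "schedule",
         "likelihood": 0.6, "impact": 4, "strategy": "develop and maintain an interface simulator"},
    ]
    for risk in prioritize(register):
        print(f'{risk["area"]}: {risk["strategy"]}')

The risks, strategies, and their status would be recorded in the software development plan and revisited throughout the project.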
5.19.2 Software management indicators. The developer shall use software management
indicators to aid in managing the software development process and communicating its status to
the acquirer. The developer shall identify and define a set of software management indicators,
including the data to be collected, the methods to be used to interpret and apply the data, and
the planned reporting mechanism. The developer shall record this information in the software
development plan and shall collect, interpret, apply, and report on those indicators as described
in the plan. Candidate indicators are given in Appendix F.
5.19.3 Security and privacy. The developer shall meet the security and privacy requirements
specified in the contract. These requirements may affect the software development effort, the
resulting software products, or both.
5.19.4 Subcontractor management. If subcontractors are used, the developer shall include in
subcontracts all contractual requirements necessary to ensure that software products are
developed in accordance with prime contract requirements.
5.19.5 Interface with software IV&V agents. The developer shall interface with the software
Independent Verification and Validation (IV&V) agent(s) as specified in the contract.
5.19.6 Coordination with associate developers. The developer shall coordinate with associate
developers, working groups, and interface groups as specified in the contract.
5.19.7 Improvement of project processes. The developer shall periodically assess the processes
used on the project to determine their suitability and effectiveness. Based on these assessments,
the developer shall identify any necessary and beneficial improvements to the process, shall
identify these improvements to the acquirer in the form of proposed updates to the software
development plan and, if approved, shall implement the improvements on the project.
6. NOTES
(This section contains information of a general or explanatory nature that may be helpful,
but is not mandatory.)
6.1 Intended use. This standard contains requirements for the development and
documentation of software. Its application is described in 1.2.
6.2 Data requirements. The following Data Item Descriptions (DIDs) must be listed, as
applicable, on the Contract Data Requirements List (DD Form 1423) when this standard is applied
on a contract, in order to obtain the data, except where DOD FAR Supplement 227.405-70
exempts the requirement for a DD Form 1423.
The above DIDs were those cleared as of the date of this standard. The current issue of DOD
5010.12, Acquisition Management Systems and Data Requirements Control List (AMSDL), must
be researched to ensure that only current, cleared DIDs are cited on the Form 1423.
6.3 Relationship between standard and CDRL. If the CDRL calls for a DID different from the
one named in corresponding paragraph(s) of this standard, all references to the DID in the
standard should be interpreted to mean the one in the CDRL.
6.4 Delivery of tool contents. Depending on contract provisions, the developer may be
permitted to satisfy CDRL requirements by delivering: 1) a repository or database containing the
information specified in the cited DID; 2) a means of accessing that repository or database, such
as a CASE tool, if not already available to the recipients designated on the CDRL; and 3) a hard-
copy or electronically stored table of contents, specifying how and where to access the
information required in each paragraph of the DID.
6.5 Tailoring guidance. This standard and its Data Item Descriptions (DIDs) are applied at the
discretion of the acquirer. In each application, the standard and DIDs should be tailored to the
specific requirements of a particular program, program phase, or contractual structure. Care
should be taken to eliminate tasks that add unnecessary costs and data that do not add value
to the process or the product. Tailoring for the standard takes the form of deletion of activities,
alteration of activities to more explicitly reflect the application to a particular effort, or addition of
activities to satisfy program requirements. This tailoring is specified in the Statement of Work.
Tailoring for the DIDs consists of deleting requirements for unneeded information and making
other changes, such as combining two documents under one cover, that do not increase the
required workload. DID tailoring for deliverables is specified in Block 16 of the CDRL.
6.6 Cost/schedule reporting. Developer cost/schedule reports should be prepared at the CSCI
level. The cost reports should indicate budgeted versus actual expenditures and should conform
to the Work Breakdown Structure (WBS) applicable to the development effort. These reports
should also indicate to the acquirer planned, actual, and predicted progress.
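Note: The sketch below (Python; illustrative only, with hypothetical WBS elements and dollar amounts) computes the budgeted-versus-actual variance per CSCI-level WBS element that such a cost report would present.

    def cost_variance(wbs: dict) -> dict:
        """Budgeted minus actual expenditure for each CSCI-level WBS element."""
        return {element: v["budgeted"] - v["actual"] for element, v in wbs.items()}

    # Example (hypothetical figures, in thousands of dollars):
    # cost_variance({"1.2.1 Navigation CSCI": {"budgeted": 400.0, "actual": 430.0},
    #                "1.2.2 Display CSCI":    {"budgeted": 250.0, "actual": 240.0}})
    # -> {"1.2.1 Navigation CSCI": -30.0, "1.2.2 Display CSCI": 10.0}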
6.8 Subject term (key word) listing. The following list of key words may be used to catalog
or characterize key topics in this standard.
Topic (with paragraph reference) and related documents:
Computer security (4.2.4.2): DOD-5200.28-STD, DoD Trusted Computer System Evaluation Criteria
Joint technical and management reviews (5.18, App. E): ANSI/IEEE Std 1028, Standard for Software Reviews and Audits; MIL-STD-499, Engineering Management; MIL-STD-1521, Technical Reviews and Audits for Systems, Equipments, and Computer Software (audit portion superseded by MIL-STD-973)
Software design (5.4, 5.6): ANSI/IEEE Std 1016, Recommended Practice for Software Design Descriptions; IEEE Std 1016.1, Guide for Software Design Descriptions; IEEE/ANSI Std 990, Recommended Practice for Ada as a Program Design Language
Software development environment (5.2): IEEE Std 1209, Recommended Practice for the Evaluation and Selection of CASE Tools; DOD-STD-1467 (AR), Software Support Environment; MIL-HDBK-782 (AR), Software Support Environment Acquisition
Software development planning (5.1.1): ANSI/IEEE Std 1058.1, Standard for Software Project Management Plans
Software management indicators (5.19.2, App. F): ISO/IEC 9126, Quality Characteristics and Guidelines for Their Use; ANSI/IEEE Std 982.2, Guide: Use of Standard Measures to Produce Reliable Software; IEEE Std 1045, Standard for Software Productivity Metrics; IEEE Std 1061, Standard for Software Quality Metrics Methodology
Software problem categories/priorities (Appendix C): IEEE Std 1044, Standard Classification for Software Anomalies
Software product evaluation (5.15): ANSI/IEEE Std 1012, Standard for Software Verification and Validation Plans; IEEE Std 1059, Guide for Verification and Validation Plans
Software quality assurance (5.16): ISO 9001, Quality System - Model for Quality Assurance in Design/Development, Production, Installation, and Servicing; ISO 9000-3, Guidelines for the Application of ISO 9001 to the Development, Supply, and Maintenance of Software; ANSI/IEEE Std 730, Standard for Software Quality Assurance Plans; IEEE Std 1298/A3563.1, Software Quality Management System; DOD-STD-2168, Defense System Software Quality Program; MIL-HDBK-286, A Guide for DOD-STD-2168
Software testing (5.1.2, 5.1.3, 5.7 - 5.11): ANSI/IEEE Std 829, Standard for Software Test Documentation; ANSI/IEEE Std 1008, Standard for Software Unit Testing; ANSI/IEEE Std 1012, Standard for Software Verification and Validation Plans; IEEE Std 1059, Guide for Verification and Validation Plans
Software user documentation (5.12.3): ANSI/IEEE Std 1063, Standard for Software User Documentation
Work breakdown structure (6.6): MIL-STD-881, Work Breakdown Structures for Defense Materiel Items
APPENDIX A
LIST OF ACRONYMS
A.1 Scope. This appendix provides a list of acronyms used in this standard, with their
associated meanings. This appendix is not a mandatory part of the standard. The information
provided is intended for guidance only.
A.3 Acronyms.
APPENDIX B
B.1 Scope. This appendix interprets MIL-STD-498 when applied to the incorporation of
reusable software products. This appendix is a mandatory part of this standard, subject to
tailoring by the acquirer.
B.3 Evaluating reusable software products. The developer shall specify in the software
development plan the criteria to be used for evaluating reusable software products for use in
fulfilling the requirements of the contract. General criteria shall be the software product's ability
to meet specified requirements and to be cost-effective over the life of the system. Non-
mandatory examples of specific criteria include, but are not limited to:
B.4 Interpreting MIL-STD-498 activities for reusable software products. The following rules
apply in interpreting this standard:
a. Any requirement that calls for development of a software product may be met by a
reusable software product that fulfills the requirement and meets the criteria established
in the software development plan. The reusable software product may be used as-is or
modified and may be used to satisfy part or all of the requirement. For example, a
requirement may be met by using an existing plan, specification, or design.
b. When the reusable software product to be incorporated is the software itself, some of the
requirements in this standard require special interpretation. Figure 3 provides this
interpretation. Key issues are whether the software will be modified, whether unmodified
software constitutes an entire CSCI or only one or more software units, and whether
unmodified software has a positive performance record (no firm criteria exist for making
this determination). The figure is presented in a conditional manner: If an activity in the
left column is required for a given type of software, the figure tells how to interpret the
activity for reusable software of that type.
Figure 3. Interpret each required MIL-STD-498 activity as follows for each type of existing, reusable software (CSCIs to be used unmodified, with a positive or with no or a poor performance record; software units to be used unmodified, with a positive or with no or a poor performance record; software units being modified for or during the project):
5.1 Project planning and oversight. All types: include the activities in this figure in project plans.
5.2 Establishing a software development environment. Unmodified CSCIs and units: establish and apply a software test environment, software development library, and software development files as appropriate to perform the activities in this figure. Units being modified: apply the full requirements.
5.3 System requirements analysis. All types: consider the software's capabilities in defining the operational concept and system requirements. Unmodified CSCIs and units with a positive performance record: use test/performance records to confirm the ability to meet needs. Unmodified CSCIs and units with no or a poor performance record: test to confirm the ability to meet needs. Units being modified: use tests or records to determine the potential to meet needs.
5.4.1 System-wide design decisions. All types: consider the software's capabilities and characteristics in designing system behavior and in making other system-wide design decisions.
5.4.2 System architectural design. Unmodified CSCIs: include the CSCI in the system architecture; allocate system requirements to it. Units (unmodified or being modified): consider the unit's capabilities and characteristics in designating CSCIs and allocating system requirements to them.
5.5 Software requirements analysis. Unmodified CSCIs: specify the project-specific requirements the CSCI must meet; verify via records or retest that the CSCI can meet them. Units: consider the unit's capabilities and characteristics in specifying the requirements for the CSCI of which it is a part.
5.6.1 CSCI-wide design decisions. Unmodified CSCIs: no requirement; the CSCI-wide design decisions have already been made (recording the "as built" design is under 5.13). Units: consider the unit's capabilities and characteristics in designing CSCI behavior and making other CSCI-wide design decisions.
5.6.2 CSCI architectural design. Unmodified CSCIs: no requirement; the CSCI's architecture is already defined (recording the "as built" design is under 5.13). Units: include the unit in the CSCI architecture and allocate CSCI requirements to it.
5.6.3 CSCI detailed design. Unmodified CSCIs: no requirement; the CSCI's detailed design is already defined (recording the "as built" design is under 5.13). Unmodified units: no requirement; the unit is already designed (recording the "as built" design is under 5.13). Units being modified: modify the unit's design as needed.
5.7.1 Software implementation. Unmodified CSCIs: no requirement; the software for the CSCI's units is already implemented. Unmodified units: no requirement; the software for the unit is already implemented. Units being modified: modify the software for the unit.
5.8 Unit integration and testing. Unmodified CSCIs with a positive performance record: no requirement; the CSCI's units are already integrated. Unmodified CSCIs with no or a poor performance record: perform selectively if in question and units are accessible. Unmodified units: perform except where integration is already tested/proven. Units being modified: perform this testing.
5.9 CSCI qualification testing. Unmodified CSCIs with a positive performance record: no requirement; the CSCI is already tested and proven. Unmodified CSCIs with no or a poor performance record: perform this testing. Units: include the unit in CSCI qualification testing.
5.11 System qualification testing. Unmodified CSCIs: include the CSCI in system qualification testing. Units: include the unit in system qualification testing.
5.12 Preparing for software use. All types: include the software for the CSCI or unit in the executable software; include it in version descriptions; handle any license issues; cover use of the CSCI or unit, as appropriate, via existing, new, or revised user/operator manuals; install the CSCI or unit as part of the overall system; include its use, as appropriate, in the training offered.
5.13 Preparing for software transition. All types: include the software for the CSCI or unit in the executable software; prepare source files for the CSCI or unit, if available; include it in version descriptions; handle any license issues; prepare or provide "as built" design descriptions for software whose design is known; install the CSCI or unit at the support site; demonstrate regenerability if source is available; include it in the training offered.
5.14 Software configuration management. All types: apply to all software products prepared, modified, or used in incorporating this software.
5.15 Software product evaluation. All types: apply to all software products prepared or modified in incorporating this software; for software products used unchanged, apply unless a positive performance record or evidence of past evaluations indicates that such an evaluation would be duplicative.
5.16 Software quality assurance. All types: apply to all activities performed and all software products prepared, modified, or used in incorporating this software.
5.17 Corrective action. All types: apply to all activities performed and all software products prepared or modified in incorporating this software.
5.18 Joint reviews. All types: cover the software products prepared or modified in incorporating this software.
APPENDIX C
C.1 Scope. This appendix contains requirements for a category and priority classification
scheme to be applied to each problem submitted to the corrective action system. This appendix
is a mandatory part of the standard, subject to the following conditions: 1) these requirements
may be tailored by the acquirer, and 2) the developer may use alternate category and priority
schemes if approved by the acquirer.
C.3 Classification by category. The developer shall:
a. Assign each problem in software products to one or more of the categories in Figure 4.
b. Assign each problem in activities to one or more of the categories in Figure 1 (shown at
the start of Section 5).
C.4 Classification by priority. The developer shall assign each problem in software products
or activities to one of the priorities in Figure 5.
APPENDIX D
D.1 Scope. This appendix identifies the software products that are to undergo software
product evaluations, identifies the criteria to be used for each evaluation, and contains a default
set of definitions for the evaluation criteria. This appendix is a mandatory part of the standard,
subject to the following conditions: 1) these requirements may be tailored by the acquirer, 2) the
developer may use alternate criteria or definitions if approved by the acquirer, and 3) if the
development of a given software product has been tailored out of the standard, the requirement
to evaluate that product does not apply.
D.3 Required evaluations. Figure 6 identifies the software products that are to undergo
software product evaluations and states the criteria to be applied to each one. Each software
product and criterion is labelled for purposes of identification and tailoring. For convenience, they
may be treated as subparagraphs of this paragraph (referring to the first criterion, for example,
as D.3.1.a). The software products are expressed in lower case letters to convey generic
products, not necessarily in the form of hard-copy documents. Evaluations of system-level
products are to be interpreted as participation in these evaluations. Some of the criteria are
subjective. Because of this, there is no requirement to prove that the criteria have been met; the
requirement is to perform the evaluations using these criteria and to identify possible problems
for discussion and resolution.
D.4 Criteria definitions. The following paragraphs provide definitions for the criteria in Figure
6 that may not be self-explanatory. The criteria are listed in alphabetical order, matching as
closely as possible the wording used in Figure 6.
D.4.2 Adequate test cases, procedures, data, results. Test cases are adequate if they cover all
applicable requirements or design decisions and specify the inputs to be used, the expected
results, and the criteria to be used for evaluating those results. Test procedures are adequate
if they specify the steps to be followed in carrying out each test case. Test data are adequate
if they enable the execution of the planned test cases and test procedures. Test or dry run
results are adequate if they describe the results of all test cases and show that all criteria have
been met, possibly after revision and retesting.
D.4.3 Consistent with indicated product(s). This criterion means that: (1) no statement or
representation in one software product contradicts a statement or representation in the other
software products, (2) a given term, acronym, or abbreviation means the same thing in all of the
software products, and (3) a given item or concept is referred to by the same name or description
in all of the software products.
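Note: Consistency of terms across software products can be checked mechanically if each product's terms and meanings are extracted first. The sketch below (Python; illustrative only, with hypothetical product names, terms, and definitions) reports any term or acronym defined differently in different products.

    def inconsistent_terms(glossaries: dict) -> dict:
        """Return terms or acronyms whose meaning differs across software products."""
        definitions = {}
        for product, glossary in glossaries.items():
            for term, meaning in glossary.items():
                definitions.setdefault(term, set()).add(meaning)
        return {term: meanings for term, meanings in definitions.items() if len(meanings) > 1}

    # Example: inconsistent_terms({"SRS": {"TLM": "telemetry"},
    #                              "SDD": {"TLM": "telemetry link monitor"}})
    # -> {"TLM": {"telemetry", "telemetry link monitor"}}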
D.4.4 Contains all applicable information in (a specified DID). This criterion uses the DIDs to
specify the required content of software products, regardless of whether a deliverable document
has been ordered. Allowances are to be made for the applicability of each DID topic. The
formatting specified in the DID (required paragraphing and numbering) is not relevant to this
evaluation.
Evaluation criteria (Figure 6):
2. Software test plan (5.1.2, 5.1.3): a. STP DID; b. c. d. e. f.; g. Covers all software-related qualification activities in the SOW; h. Covers all requirements for the items under test; i. Consistent with other project plans; j. Presents a sound approach to the testing.
3. Software installation plan: a. SIP; b. c. d. e. f.; g. Covers all user site installation activities in the SOW.
4. Software transition plan (5.1.5): a. STrP DID; b. c. d. e. f.; g. Covers all transition-related activities in the SOW; h. Consistent with other project plans; i. Presents a sound approach to the transition.
12. CSCI detailed design (5.6.3): a. SDD, IDD; b. c. d. e. f.; g. Covers CSCI requirements allocated to each unit; h. Consistent with CSCI-wide design decisions.
13. Implemented software (5.7.1): a. N/A; b. c. d. e. f.; g. Covers the CSCI detailed design.
15. CSCI qualification test results (5.9.7): a. STR DID; b. c. d. e. f.; g. Covers all planned CSCI qualification test cases; h. Shows evidence that the CSCI meets its requirements.
17. System qualification test results (5.11.7): a. STR DID; b. c. d. e. f.; g. Covers all planned system qualification test cases; h. Shows evidence the system meets its requirements.
19. Software version descriptions (5.12.2, 5.13.3): a. SVD DID; b. c. d. e. f.; g. Accurately identifies the version of each software component (file, unit, CSCI, etc.) delivered; h. Accurately identifies the changes incorporated.
20. Software user manuals (5.12.3.1): a. SUM DID; b. c. d. e. f.; g. Accurately describes software installation and use to the intended audience of this manual.
25. "As built" CSCI design and related information (5.13.4): a. SPS DID; b. c. d. e. f.; g. Accurately describes the "as built" design of the CSCI; h. Accurately describes compilation/build procedures; i. Accurately describes modification procedures; j. Source files cover all units in the CSCI design; k. Measured resource utilization meets CSCI requirements.
26. "As built" system design (5.13.5): a. SSDD DID; b. c. d. e. f.; g. Accurately describes the "as built" system design.
29. Sampling of software development files: a. N/A; b.; c. N/A; d. e. f.; g. Contents are current with the ongoing effort; h. Adequate unit test cases/procedures/data/results; k. Adequate CSCI/HWCI integration test cases/procedures/data/results; l. Adequate system qualification dry run results.
D.4.5 Covers (a given set of items). A software product "covers" a given set of items if every
item in the set has been dealt with in the software product. For example, a plan covers the SOW
if every provision in the SOW is dealt with in the plan; a design covers a set of requirements if
every requirement has been dealt with in the design; a test plan covers a set of requirements if
every requirement is the subject of one or more tests. "Covers" corresponds to the downward
traceability (for example, from requirements to design) in the requirement, design, and test
planning/description DIDs.
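Note: "Covers" is a traceability check and is easy to automate once the items and the covering products are identified. The sketch below (Python; illustrative only, with hypothetical requirement and test case identifiers) returns the requirements not dealt with by any test case.

    def coverage_gaps(items: set, covered_by: dict) -> set:
        """Return the items (e.g. requirements) not covered by any product (e.g. test cases)."""
        covered = set()
        for product_items in covered_by.values():
            covered |= set(product_items)
        return items - covered

    # Example: coverage_gaps({"SRS-10", "SRS-11", "SRS-12"},
    #                        {"TC-01": {"SRS-10"}, "TC-02": {"SRS-10", "SRS-12"}})
    # -> {"SRS-11"}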
D.4.6 Feasible. This criterion means that, in the knowledge and experience of the evaluator, a
given concept, set of requirements, design, test, etc. violates no known principles or lessons
learned that would render it impossible to carry out.
D.4.7 Follows software development plan. This criterion means that the software product shows
evidence of having been developed in accordance with the approach described in the software
development plan. Examples include following design and coding standards described in the
plan. For the software development plan itself, this criterion applies to updates to the initial plan.
D.4.8 Internally consistent. This criterion means that: (1) no two statements or representations
in a software product contradict one another, (2) a given term, acronym, or abbreviation means
the same thing throughout the software product, and (3) a given item or concept is referred to by
the same name or description throughout the software product.
D.4.9 Meets CDRL, if applicable. This criterion applies if the software product being evaluated
is specified in the CDRL and has been formatted for delivery at the time of evaluation. It focuses
on the format, markings, and other provisions specified in the CDRL, rather than on content,
covered by other criteria.
D.4.10 Meets SOW, if applicable. This criterion means that the software product fulfills any
Statement of Work provisions regarding it. For example, the Statement of Work may place
constraints on the operational concept or the design.
D.4.11 Presents a sound approach. This criterion means that, based on the knowledge and
experience of the evaluator, a given plan represents a reasonable way to carry out the required
activities.
D.4.12 Shows evidence that (an item under test) meets its requirements. This criterion means
that recorded test results show that the item under test either passed all tests the first time or was
revised and retested until the tests were passed.
D.4.14 Understandable. This criterion means "understandable by the intended audience." For
example, software products intended for programmer-to-programmer communication need not be
understandable by non-programmers. A product that correctly identifies its audience (based on
information in Block 3 of the corresponding DID) and is considered understandable to that
audience meets this criterion.
APPENDIX E
E.1 Scope. This appendix describes a candidate set of joint management reviews that might
be held during a software development project. This appendix is not a mandatory part of this
standard. The information provided is intended for guidance only.
a. The acquirer has reviewed the subject products in advance, and one or more joint
technical reviews have been held to resolve issues, leaving the joint management review
as a forum to resolve open issues and reach agreement as to the acceptability of each
product.
b. Any of the reviews may be conducted incrementally, dealing at each review with a subset
of the listed items or a subset of the system or CSCI(s) being reviewed.
E.4 Candidate reviews. Given below is a set of candidate joint management reviews that
might be held during a software development project. There is no intent to require these reviews
or to preclude alternatives or combinations of these reviews. The objectives supplement those
given in 5.18.2.
E.4.1 Software plan reviews. These reviews are held to resolve open issues regarding one or
more of the following:
E.4.2 Operational concept reviews. These reviews are held to resolve open issues regarding
the operational concept for a software system.
E.4.3 System/subsystem requirements reviews. These reviews are held to resolve open issues
regarding the specified requirements for a software system or subsystem.
E.4.4 System/subsystem design reviews. These reviews are held to resolve open issues
regarding one or more of the following:
E.4.5 Software requirements reviews. These reviews are held to resolve open issues regarding
the specified requirements for a CSCI.
E.4.6 Software design reviews. These reviews are held to resolve open issues regarding one
or more of the following:
E.4.7 Test readiness reviews. These reviews are held to resolve open issues regarding one or
more of the following:
E.4.8 Test results reviews. These reviews are held to resolve open issues regarding the results
of CSCI qualification testing or system qualification testing.
E.4.9 Software usability reviews. These reviews are held to resolve open issues regarding one
or more of the following:
E.4.10 Software supportability reviews. These reviews are held to resolve open issues regarding
one or more of the following:
E.4.11 Critical requirement reviews. These reviews are held to resolve open issues regarding the
handling of critical requirements, such as those for safety, security, and privacy.
APPENDIX F
F.1 Scope. This appendix identifies a set of management indicators that might be used on
a software development project. This appendix is not a mandatory part of this standard. The
information provided is intended for guidance only.
F.3 Candidate indicators. Given below is a set of candidate management indicators that might
be used on a software development project. There is no intent to impose these indicators or to
preclude others.
a. Requirements volatility: total number of requirements and requirement changes over time.
b. Software size: planned and actual number of units, lines of code, or other size
measurement over time.
e. Software progress: planned and actual number of software units designed, implemented,
unit tested, and integrated over time.
f. Problem/change report status: total number, number closed, number opened in the
current reporting period, age, priority.
g. Build release content: planned and actual number of software units released in each
build.
h. Computer hardware resource utilization: planned and actual use of computer hardware
resources (such as processor capacity, memory capacity, input/output device capacity,
auxiliary storage device capacity, and communications/network equipment capacity) over
time.
k. Effect of reuse: a breakout of each of the indicators above for reused versus new software
products.
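Note: The sketch below (Python; illustrative only, with hypothetical counts) computes two of the candidate indicators above: requirements volatility as cumulative changes against the baseline, and problem/change report status as totals by state.

    from collections import Counter

    def requirements_volatility(baseline_count: int, changes_by_month: dict) -> dict:
        """Indicator (a): cumulative requirement changes as a fraction of the baseline, by month."""
        total, series = 0, {}
        for month in sorted(changes_by_month):
            total += changes_by_month[month]
            series[month] = round(total / baseline_count, 3)
        return series

    def report_status(reports: list) -> Counter:
        """Indicator (f): problem/change report totals by status."""
        return Counter(r["status"] for r in reports)

    # Example: requirements_volatility(250, {"1994-01": 12, "1994-02": 7}) -> {"1994-01": 0.048, "1994-02": 0.076}
    # Example: report_status([{"status": "open"}, {"status": "closed"}, {"status": "open"}])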
APPENDIX G
G.1 Scope. This appendix identifies three of the program strategies used by DoD and shows
how MIL-STD-498 can be applied under each of these strategies and on a project involving
reengineering. This appendix is not a mandatory part of the standard. The information provided
is intended for guidance only.
G.3 Candidate program strategies. DODI 8120.2 describes three basic program strategies
plus a generic strategy called "other," encompassing variations, combinations, and alternatives
to the three. DODI 5000.2 identifies similar strategies, called acquisition strategies. The three
basic strategies are summarized below and in Figure 7.
a. Grand design. The "grand design" strategy (not named in DODI 5000.2 but treated as
one strategy) is essentially a "once-through, do-each-step-once" strategy. Simplistically:
determine user needs, define requirements, design the system, implement the system,
test, fix, and deliver.
G.4 Selecting an appropriate program strategy. The program strategy is selected by the
acquirer, but may be proposed by prospective or selected developers. Figure 8 illustrates a risk
analysis approach for selecting an appropriate strategy. The approach consists of listing risk
items (negatives) and opportunity items (positives) for each strategy; assigning each item a risk
or opportunity level of High, Medium, or Low; and making a decision on which strategy to use
based on a trade-off among the risks and opportunities. The fill-ins shown are sample
considerations only. An actual analysis may use others. The "DECISION" entry on the bottom
line shows which strategy was selected.
G.5 Relationship of MIL-STD-498 to program strategies. The program strategy usually applies
to the overall system. The software within the system may be acquired under the same strategy
or under a different one, such as requiring that all software be finalized in the first build of the
system. Figures 9, 10, and 11 show how MIL-STD-498 might be applied under each of the
program strategies identified in G.3. Figure 12 shows how MIL-STD-498 might be applied on a
reengineering project. All four figures are, by necessity, simplified. For example, they show MIL-
STD-498 activities in sequence when they might actually be ongoing, overlapping, or iterative;
they show each software product as a single entity, without depicting early drafts or updates; and
they represent each software product by the name of the corresponding DID, when the actual
software product is the information called for by the DID, not necessarily in the form of a hard-
copy document.
G.6 Planning software builds and tailoring MIL-STD-498. Planning the software builds on a
project and tailoring MIL-STD-498 for each build may be accomplished in several ways. The
acquirer might, for example, select an overall program strategy and tailor the standard for the
overall contract, leaving it to the developer to lay out the software builds and propose the tailoring
for each build. Alternatively, the acquirer might lay out the software builds and specify the
tailoring for each as part of the contract. The approach selected will be project-dependent. The
paragraphs below provide guidelines for planning the builds and tailoring the standard without
attempting to divide these activities between the acquirer and developer.
G.6.1 Identifying builds and their objectives. The first step in software build planning is to lay
out a series of one or more builds and to identify the objectives of each build. The top part of
Figure 13 illustrates such planning. In the example, the system/subsystem specification (SSS)
already exists and fulfillment of its requirements is divided into four builds, two of which will be
prototypes delivered to a selected set of users, and two of which will actually be fielded. A further
objective of Build 4 is transitioning the software to the designated support agency. An actual
project would expand on these objectives.
G.6.2 Identifying the MIL-STD-498 activities to be performed in each build. The next step in
build planning is identifying which MIL-STD-498 activities apply in each build and determining the
extent to which they apply. The lower part of Figure 13 shows the start of such planning. Listed
on the left are the paragraphs of MIL-STD-498. The worksheet entries indicate in which builds
each activity is to be performed and include any notes regarding the nature of each activity in
each build. For example, the figure shows that each build will include software development
planning (5.1.1), but that the nature of that planning changes in each build. Some activities will
not apply at all in a given build, some will apply identically in all builds, and some will apply
differently in different builds. Since some aspects of the project, such as number and type of
CSCIs, may not have been identified at the time the worksheet is being filled out, completion of
the worksheet may itself be incremental. The following guidelines apply:
(Figure 8 is a worksheet with one column per strategy: Grand Design, Incremental, and Evolutionary. Each column lists risk items, reasons not to use that strategy, and opportunity items, reasons to use it, each assigned a level of High, Medium, or Low; the DECISION entry on the bottom line records the strategy selected. Sample entries include: Grand Design, risk "System too large to do all at once" (M), opportunities "User prefers all capabilities at first delivery" (M) and "User prefers to phase out old system all at once" (L); Incremental and Evolutionary, risk "User prefers all capabilities at first delivery" (M), opportunities "Early capability is needed" (H) and "System breaks naturally into increments" (M).)
FIGURE 8. Sample risk analysis for determining the appropriate program strategy.
(Figure 9 shows a single pass through the activities: system requirements analysis and system design (SSDD/IDD), then, for each CSCI, software requirements analysis (SRS/IRS), software design (SDD/IDD/DBDD), software implementation and unit testing, unit integration and testing, and CSCI qualification testing (STD, STR), followed by CSCI/HWCI integration and testing, system qualification testing (STD, STR), and preparation for software use and software transition (executable software, SVDs, user/operator manuals, support manuals). HWCIs are not covered by MIL-STD-498. Project planning and oversight, the software development environment, software configuration management, software product evaluation, software quality assurance, corrective action, joint reviews, risk management, software management indicators, security/privacy, interface with IV&V, coordination with associate developers, and improvement of project processes span the entire effort.)
Note: All activities may be more ongoing, overlapping, and iterative than the figure is able to show.
FIGURE 9. One possible way of applying MIL-STD-498 to the Grand Design program strategy.
(Figure 10 repeats the activity flow of Figure 9 once per build. Build 1, "Establish system and software requirements and install software implementing a subset of those requirements at user sites," produces an SDP focused on Build 1 and an STP, SIP, and preliminary STrP; the OCD, SSS/IRS, SSDD/IDD, and SRS/IRS are intended to be complete and stable, while the SDD/IDD/DBDD are partial; CSCI and system qualification testing (STDs, STRs), the executable software, SVDs, and user/operator manuals cover Build 1 only, and there is no software transition in Build 1. Build 2 updates the SDP and STP, prepares a SIP for Build 2, and completes the STrP. HWCIs are not covered by MIL-STD-498; the software development environment, software configuration management, software product evaluation, software quality assurance, corrective action, joint reviews, and other activities span all builds. All activities may be more ongoing, overlapping, and iterative than the figure is able to show.)
FIGURE 10. One possible way of applying MIL-STD-498 to the Incremental program strategy.
(Figure 11 parallels Figure 10, except that Build 1, "Establish preliminary system/software requirements and install a prototype implementing a subset of those requirements at selected user sites," works from preliminary or partial OCD, SSS/IRS, SSDD/IDD, SRS/IRS, and SDD/IDD/DBDD and performs no qualification testing (no STP, STD, or STR for Build 1) and no software transition, delivering the Build 1 executable software, SVDs, and user/operator manuals. Build 2 updates the SDP, prepares an STP and a SIP for Build 2, and completes the STrP. HWCIs are not covered by MIL-STD-498; the supporting activities span all builds, and all activities may be more ongoing, overlapping, and iterative than the figure is able to show.)
FIGURE 11. One possible way of applying MIL-STD-498 to the Evolutionary program strategy.
(Figure 12 applies MIL-STD-498 to a reengineering project: reverse engineering analysis and design activities redocument the existing software, producing derived requirements and design (OCD, SSS/IRS, derived SSDD/IDD, derived requirements, SDD/IDD/DBDD); forward engineering then modifies the software, producing the executable software and source files, followed by transition. The software development environment, software configuration management, software product evaluation, software quality assurance, corrective action, joint reviews, risk management, software management indicators, security/privacy, interface with IV&V, coordination with associate developers, and improvement of project processes span the entire effort.)
Figure 13. Build planning and tailoring worksheet for MIL-STD-498 (columns: Builds 1 through 4).
1. Identify at right the objectives of each build.
   Build 1: Deliver to selected users an operational prototype that meets the following system-level requirements: SSS-1, SSS-5, ..., SSS-1250.
   Build 2: Deliver to selected users an operational prototype that meets the requirements of Build 1 plus: SSS-3, SSS-15, ..., SSS-1249.
   Build 3: Deliver to all users a tested system that meets the requirements of Builds 1 and 2 plus: SSS-2, SSS-4, SSS-7, SSS-10, ..., SSS-1248.
   Build 4: Deliver to all users a tested system that meets all system-level requirements; transition to the designated support agency.
2. Indicate below which activities are to be accomplished during the development of each build. Add clarifying notes as needed.
   5.1.2 Plan for CSCI qualification testing. Build 1: No; no CSCI qual testing in this build. Build 2: No; no CSCI qual testing in this build. Build 3: Yes; plan for CSCI qual testing in this build. Build 4: Yes; update for CSCI qual testing in this build.
   5.1.3 Plan for system qualification testing. Build 1: No; no system qual testing in this build. Build 2: No; no system qual testing in this build. Build 3: Yes; plan for system qual testing in this build. Build 4: Yes; update for system qual testing in this build.
   5.1.4 Plan for installing software at user sites. Build 1: No; let users install on their own. Build 2: No; let users install on their own. Build 3: Yes; plan to install at user sites. Build 4: Yes; update as needed for installation of Build 4.
   5.1.5 Plan for transitioning software to the support agency. Build 1: Yes; very preliminary planning only. Build 2: Yes; update preliminary plans. Build 3: Yes; update preliminary plans. Build 4: Yes; finalize transition planning.
   5.1.6 Follow plans; perform management review. Builds 1 through 4: Yes; for those plans that are in effect.
   5.2.1 Establish a software engineering environment. Build 1: Yes; as needed for Build 1. Builds 2 through 4: Yes; update as needed for that build.
   5.2.2 Establish a software test environment. Build 1: Yes; as needed for Build 1 testing. Build 2: Yes; as needed for Build 2 testing. Build 3: Yes; set up fully for Build 3 qualification testing. Build 4: Yes; update as needed for Build 4 qualification testing.
a. Different decisions will apply to different types of software on a project. These differences
can be shown within the entries of one worksheet or by using different worksheets for
different types of software.
G.6.3 Recording tailoring decisions. Tailoring decisions made by the acquirer before the project
begins are specified in the Statement of Work. Tailoring proposed by the developer may be
communicated via feedback on draft solicitations, proposals written in response to solicitations,
the software development plan, joint reviews during the project, or by other means of
communication. Refinements to the tailoring decisions may be ongoing as the project proceeds.
Those involving contractual changes should be handled accordingly.
G.6.4 Scheduling the selected activities in each build. Another important step in build planning
is scheduling the activities in each build. As with tailoring, the acquirer may set forth general
milestones and have the developer provide specifics or may provide specific schedules. The
following guidelines apply:
a. A common mistake is to treat all CSCIs as though they must be developed in "lock-step,"
reaching key milestones at the same time. Allowing CSCIs to follow different schedules
can result in more efficient development (see the sketch following this list).
b. A similar mistake is to treat software units as though they must be developed in "lock-
step," all designed by a certain date, implemented by a certain date, etc. Flexibility in the
scheduling of software units can also be effective.
c. The activities in MIL-STD-498 need not be performed sequentially. Several may be taking
place at one time, and an activity may be performed continually or intermittently
throughout a build or over multiple builds. The activities in each build should be laid out
in the manner that best suits the work to be done.
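The following sketch, referenced in item a above, is illustrative only: it records per-CSCI milestone dates that are deliberately not in lock-step. The CSCI names, milestone labels, and dates are hypothetical and carry no significance.

# Hypothetical per-CSCI schedules illustrating that CSCIs need not reach
# key milestones at the same time (guidance items a and b above).
from datetime import date

build_1_schedule = {
    "CSCI-A": {"design complete": date(1995, 3, 1), "CSCI qualification test": date(1995, 9, 1)},
    "CSCI-B": {"design complete": date(1995, 5, 15), "CSCI qualification test": date(1995, 11, 15)},
}

# Print each CSCI's milestones in date order; overlapping activities across
# CSCIs are expected rather than treated as a scheduling error.
for csci, milestones in build_1_schedule.items():
    for name, when in sorted(milestones.items(), key=lambda item: item[1]):
        print(f"{csci}: {name} planned for {when.isoformat()}")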
APPENDIX H
H.1 Scope. This appendix provides guidance to the acquirer on the deliverables to be required
on a software development project. This appendix is not a mandatory part of this standard. The
information provided is intended for guidance only.
H.3 Ordering deliverables. MIL-STD-498 has been worded to differentiate between the
planning/engineering activities that make up a software development project and the generation
of deliverables. A key objective of this wording is to eliminate the notion that the acquirer must
order a given deliverable in order to have planning or engineering work take place. Under MIL-
STD-498, the planning and engineering work takes place regardless of which deliverables are
ordered, unless a given activity is tailored out of the standard. In addition, joint technical reviews
have been included to review the results of that work in its natural form, without the generation
of deliverables. Deliverables should be ordered only when there is a genuine need to have
planning or engineering information transformed into a deliverable, recognizing that this
transformation requires time and effort that would otherwise be spent on the engineering effort.
Block 3 of each DID provides information helpful in deciding whether the corresponding
deliverable should be ordered.
H.5 Format of deliverables. Traditional deliverables take the form of paper documents exactly
following DID formats. While this form works well for some deliverables, it is not the only form,
and alternatives should be considered. One variation from paper documents is word processing
files containing those documents. This format saves paper, but still requires the developer to
format the information as required by the DID. Another variation is specifying that a paper or
word processor document is to include all DID contents but may be in the developer's format.
Yet another variation is allowing deliverables to take forms that are not traditional documents at
all, such as data in computer-aided software engineering (CASE) tools. These variations in
required format can be specified on the CDRL, minimizing the time spent transforming actual work
products into deliverables.
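For illustration only, the sketch below records the kinds of format decisions described in this paragraph so that they can be stated unambiguously on the CDRL; the enumeration values and deliverable names are hypothetical and not part of this standard.

# Hypothetical record of deliverable-format choices of the kind discussed in
# H.5; the actual format requirement would be stated on the CDRL itself.
from enum import Enum, auto


class DeliverableFormat(Enum):
    PAPER_DID_FORMAT = auto()      # paper document following the DID format exactly
    WORD_PROCESSING_FILE = auto()  # electronic file, still formatted per the DID
    DEVELOPER_FORMAT = auto()      # all DID contents, but in the developer's own format
    CASE_TOOL_DATA = auto()        # non-document form, e.g., data held in CASE tools


# Example (illustrative) format choices for two deliverables.
format_choices = {
    "Software Development Plan (SDP)": DeliverableFormat.DEVELOPER_FORMAT,
    "Software Design Description (SDD)": DeliverableFormat.CASE_TOOL_DATA,
}

for deliverable, chosen_format in format_choices.items():
    print(f"{deliverable}: {chosen_format.name}")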
H.6 Tailoring the DIDs. Tailoring the DIDs consists of deleting requirements for unneeded
information and making other changes that do not increase the required workload, such as
combining two documents under one cover. DID tailoring for deliverables is specified in Block
16 of the CDRL.
APPENDIX I
I.1 Scope. This appendix provides a conversion guide from DOD-STD-2167A and DOD-STD-
7935A, the two standards that were merged to form MIL-STD-498. It maps key terms from each
of these standards to their counterparts in MIL-STD-498 and shows the relationship of the DIDs
required by these standards to their counterparts in MIL-STD-498. This appendix is not a
mandatory part of the standard. The information provided is intended for guidance only.
I.2 Applicable documents. This appendix references the following standards, both of which
are superseded by this standard: DOD-STD-2167A and DOD-STD-7935A.
I.3 Mapping of key terms. Figure 14 identifies selected terms in MIL-STD-498 and states their
counterparts in DOD-STD-2167A and DOD-STD-7935A.
MIL-STD-498 term: Software system (consisting of software and possibly computers)
   DOD-STD-2167A term: System (no specific term used to distinguish this type of system)
   DOD-STD-7935A term: Automated Information System
MIL-STD-498 term: Hardware-software system (where the hardware may be other than computers)
   DOD-STD-2167A term: System (no specific term used to distinguish this type of system)
   DOD-STD-7935A term: (This type of system not covered by DOD-STD-7935A)
I.4 Mapping of DIDs. Figure 15 identifies the DOD-STD-7935A DIDs and tells which MIL-
STD-498 DIDs contain their contents. Figure 16 provides a similar mapping from the DOD-STD-
2167A DIDs to the MIL-STD-498 DIDs. Figure 17 provides the reverse mapping, identifying the
MIL-STD-498 DIDs and telling which DOD-STD-2167A and/or DOD-STD-7935A DIDs formed the
basis for each.
MIL-STD-498 DID: Software Transition Plan (STrP)
   Basis: 2167A Comp Res Integ Sup Doc (CRISD) - planning info; 7935A Maintenance Manual (MM) - planning info
MIL-STD-498 DID: Operational Concept Description (OCD)
   Basis: 2167A System/Segment Design Doc (SSDD), section 3; 7935A Functional Description (FD), section 2
MIL-STD-498 DID: Software Center Operator Manual (SCOM)
   Basis: 7935A Computer Operation Manual (OM)
MIL-STD-498 DID: Computer Operation Manual (COM)
   Basis: 2167A Computer System Operator's Manual (CSOM)
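Viewed programmatically, this reverse mapping is a lookup table from each MIL-STD-498 DID to the legacy DIDs that formed its basis. The sketch below encodes only the example rows excerpted above; the dictionary and function names are hypothetical and not part of this standard.

# Hypothetical lookup table built from the excerpt above: each MIL-STD-498
# DID maps to the DOD-STD-2167A and/or DOD-STD-7935A DIDs that formed the
# basis for its contents.
basis_of_did = {
    "STrP": ["2167A CRISD (planning info)", "7935A MM (planning info)"],
    "OCD": ["2167A SSDD, section 3", "7935A FD, section 2"],
    "SCOM": ["7935A OM"],
    "COM": ["2167A CSOM"],
}


def legacy_basis(mil_std_498_did: str) -> list[str]:
    """Return the legacy DIDs that a MIL-STD-498 DID was derived from."""
    return basis_of_did.get(mil_std_498_did, [])


print(legacy_basis("STrP"))  # ['2167A CRISD (planning info)', '7935A MM (planning info)']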
INDEX
This index covers both MIL-STD-498 and its DIDs. Paragraphs in the DIDs are indicated by the DID
acronym followed by paragraph numbers; an overall DID is indicated by the DID acronym alone;
paragraphs and figures in the standard have no preceding acronym. DID references that begin with "10.1"
refer to paragraphs in Block 10, section 10.1 of the DID (General Instructions). All other DID references
that do not cite a Block number refer to paragraphs in Block 10, section 10.2 of the DID (Content
Requirements). Entries in bold indicate primary sources of information about a topic. The entry "et al"
indicates that a topic (such as "software") appears too frequently to cite all references.
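For illustration only, the sketch below applies the reference conventions just described to split an index reference into the DID it cites (if any) and the paragraph or figure location; the regular expression and function name are hypothetical and handle only simple, single references.

# Hypothetical helper that interprets an index reference such as "SDP 4.2.7"
# (DID acronym followed by a paragraph number) or "5.1.6" / "Fig 14"
# (a paragraph or figure in the standard itself, with no preceding acronym).
import re

REFERENCE = re.compile(r"^(?:(?P<did>[A-Z]{2,5})\s+)?(?P<location>(?:Fig\s+)?\d[\w.\-]*)$")


def parse_reference(ref: str) -> dict:
    """Split one index reference into its DID acronym (if any) and location."""
    match = REFERENCE.match(ref.strip())
    if not match:
        return {"did": None, "location": ref.strip()}
    return {"did": match.group("did"), "location": match.group("location")}


print(parse_reference("SDP 4.2.7"))  # {'did': 'SDP', 'location': '4.2.7'}
print(parse_reference("5.1.6"))      # {'did': None, 'location': '5.1.6'}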
acceptance 3.1, 3.30, E.3; IRS 3; SRS 3; SSS 3 compliance with the contract 5.1.6, 5.16.3
access computer-aided software engineering (CASE)
access for acquirer review 4.2.7; SDP 4.2.7 Foreword-3, 1.2.4.4, 3.38, 5.1.1, 6.4, Fig 2, H.5;
access to information in a repository 6.4 all DIDs: Block 7; STrP 3.3
acquirer (use of term "acquirer" in MIL-STD-498) computer center, software center 5.12.3.2, 5.12.3.3;
Foreword-4, 1.2.1, 3.2, Fig 14 OCD 7.1; SCOM; SDP 5.12.3; SIOM; SIP 4;
acquisition strategies (see program strategies) SUM
acronyms Appendix A computer hardware characteristics CPM 3, 4;
allocation FSM 3.x.1, 3.x.4, 4.1; SRS 3.10.1; SSDD 4.1;
allocation of computer hardware resources 4.2.5; SSS 3.10.1; STrP 3.2
SDP 4.2.5; SDD 4.1; SSDD 4.1 computer hardware resource utilization 4.2.5, 5.13.4,
allocation of requirements Fig 3, Fig 6; SDD 4.1, 6; F.3; OCD 8.2; SDD 4.1; SDP 4.2.5; SPS 5.4, 6;
SRS 5; SSDD 4.1, 5; SSS 5 SRS 3.10.2; SSDD 4.1; SSS 3.10.2
ANSI/IEEE/EIA 1498 Foreword-1 Computer Operation Manual (COM) 5.12.3.4, 6.2,
application of MIL-STD-498 1.2 Fig 4, Fig 6, E.4.9, Fig 9-12, Fig 16, Fig 17; COM;
approval (by the acquirer) 3.3, 5.1.6, 5.7.1, 5.9.2, SDP 5.12.3
5.13.7, 5.18.1, 5.18.2, 5.19.7, C.1, D.1 computer program 3.10
architecture 3.4 (also see CSCI architectural design, Computer Programming Manual (CPM) 5.13.6.1, 6.2,
system architectural design) Fig 6, Fig 9-12, Fig 16, Fig 17; CPM; SDP 5.13.6
"as built" design descriptions 5.13.4, 5.13.5, Fig 2, Computer Software Configuration Item (CSCI) 1.2.2,
Fig 3, Fig 6, D.4.1, Fig 17; SDP 5.13.4, 5.13.5; 3.12, 3.24, Fig 14, et al
SPS 3, 5.1 CSCI architectural design 3.4, 5.6.2, Fig 3,
associate developer 3.5, 5.19.6, Fig 9-12; SDP 5.19.6 Fig 6, E.4.6; SDD 4; SDP 5.6.2; SPS 5.1
assurance 4.2.4; SDP 4.2.4 CSCI detailed design 3.45, 5.6.3, 5.13.4, Fig 3,
(also see software quality assurance) Fig 6, E.4.6; SDD 5; SDP 5.6.3, 5.13.4;
privacy assurance 4.2.4.3; SDP 4.2.4.3 SPS 5.1 (also see Database Design
safety assurance 4.2.4.1; SDP 4.2.4.1 Description, Interface Design Description)
security assurance 4.2.4.2; SDP 4.2.4.2 CSCI/HWCI integration and testing 4.1, Fig 1, 5.10,
audits (configuration audits) 5.14.4; SDP 5.14.4 Fig 3, Fig 6, Fig 9-12; SDP 5.10
CSCI qualification testing 3.28, 3.43, 4.1, Fig 1,
behavioral design 3.6, 5.4.1, 5.6.1, Fig 2, Fig 3; 5.1.2, 5.2.2, 5.9, 5.9.5, Fig 3, Fig 6, E.4.7, E.4.8,
DBDD 3; OCD 3.3, 5.3; SDD 3; SSDD 3 Fig 9-13; IRS 4; SDP 5.1.2, 5.9; SRS 4;
builds 3.7, 5, Fig 1, G.3, G.5, Fig 10, Fig 11 STD; STP; STR
build planning guidance G.6, Fig 13 CSCI requirements 5.5, 5.6.2, 5.7, 5.9, 5.9.3,
notes on interpreting activities for multiple builds Fig 2-4, Fig 6, Fig 9-12; DBDD 3, 4, 6; IDD 4;
5.1-5.16, 5.18 SDD 4.1, 5, 6; SDP 5.5; SPS 5.4, 6; SRS 3;
STD 4.x.y.1, 5; STP 6
category classifications for problem reporting 5.17.2, (also see software requirements analysis,
Fig 2, C.1, C.3, Fig 4; SDP 5.17.2 Software Requirements Specification,
(also see corrective action) Interface Requirements Specification)
code standards 4.2.2, D.4.7, G.6.2; SDP 4.2.2; CSCI-wide design decisions 3.6, 5.6.1, Fig 3,
SPS 5.3 Fig 6, E.4.6; SDD 3, 4.1; SDP 5.6.1, 5.13.4;
coding (see software implementation and unit testing) SPS 5.1
commercial off-the-shelf (COTS) (see reuse, configuration management
reusable software products) (see software configuration management)
compilers, compilation/build procedures 3.38, 5.13.7, Continuous acquisition and life-cycle support (CALS)
Fig 6; CPM 3; SPS 5.2, 5.3; STP 3.x.1; STrP 3.3 Fig 2; all DIDs: Block 7
firmware 1.2.2, 3.21, 3.38, 3.43; SPS 5.2; STP 3.x.2 key decisions (recording rationale for key decisions)
Firmware Support Manual (FSM) 5.13.6.2, 6.2, Fig 6, 3.34, 4.2.6, 5.2.4; all DIDs: Notes section;
Fig 9-12, Fig 16, Fig 17; FSM; SDP 5.13.6 SDP 4.2.6
forward engineering 3.29, Fig 12 key word listing in MIL-STD-498 6.8
key terms (mapping of key MIL-STD-498 terms to
general requirements of MIL-STD-498 4 DOD-STD-2167A, DOD-STD-7935A) I.3, Fig 14
Government in-house agencies (as developers)
Foreword-4, 1.2.1 licenses, licensing B.3, Fig 3; STP 3.x.4;
Grand Design program strategy G.3, Fig 7, Fig 8, Fig 9 STrP 3.2, 3.3, 3.4 (also see data rights)
life cycle Foreword-4, Fig 2, Fig 5, G.2; SDP 3
Hardware Configuration Item (HWCI) 3.22, 3.24, 5.10; logistics SRS 3.15; SSS 3.15 (also see Continuous
SDP 5.10; SRS 3.10.1; SSDD 4, 4.1; SSS 3.10.1 acquisition and life-cycle support (CALS))
hardware-software systems 1.2.4.1, 1.2.4.2, Fig 14;
SSDD 3; SSS 3.9, 3.10, 3.12 maintenance Foreword-4, 1.2.1, 1.2.4.3, 3.33, 3.41,
Fig 14 (also see software support)
implementation management indicators
(see software implementation and unit testing) (see software management indicators)
Incremental program strategy Foreword-3, G.3, Fig 7, management review 5.1.6; SDP 5.1.6
Fig 8, Fig 10 metrics (see software management indicators)
in-process
in-process and final software product evaluations non-deliverable software products 1.2.2, 3.26, 5.2.5;
5.15.1; SDP 5.15.1 SDP 5.2.5
joint reviews of in-process and final software Notes 6; all DIDs: Notes section
products 5.18.1
independence operational concept, Operational Concept Description
independence in software product evaluation (OCD) 5.3.2, 6.2, Fig 3, Fig 4, Fig 6, E.4.2,
5.15.3; SDP 5.15.3 Fig 9-12, Fig 15-17; OCD; SDP 5.3.2
independence in software quality assurance 5.16.3; operational concept reviews E.4.2
SDP 5.16.3
independence in qualification testing 5.9.1, 5.11.1; packaging, packaging requirements 5.14.5; SDP 5.14.5;
SDP 5.9.1, 5.11.1 SPS 3.3; SRS 3.17; SSS 3.17; STrP 7
Independent Verification and Validation (IV&V) 3.23, participate (interpretation of "participate" in
5.19.5, Fig 9-12; SDP 5.19.5 MIL-STD-498) 1.2.4.2
installation participation in system-level activities
installation at the support site 5.13.7; STrP 7 5.1.3, 5.3, 5.4, 5.10, 5.11, 5.13.5, D.3;
installation at user sites 5.12.4, Fig 6, E.4.9, Fig 14; SDP 5.3, 5.4, 5.10, 5.11
SCOM 4; SDP 5.12.4; SIP; SSS 3.16; planning (see build planning, project planning,
SUM 4.1.3; SVD 3.6 software development planning,
software installation planning 5.1.4, 6.2, Fig 6, software installation planning,
E.4.1, Fig 9-13, Fig 15, Fig 17; SDP 5.1.4; SIP software transition planning, test planning)
integral software development processes 4.1 priority classifications for problem reporting 5.17.2,
integration (see CSCI/HWCI integration and testing, C.1, C.4, Fig 5, F.3 (also see corrective action)
unit integration and testing) privacy 4.2.4.3, 5.19.3, B.3, E.4.11, Fig 9-12;
interface 3.24 all DIDs 1.3; COM 3.2.4; DBDD 3, 4.x, 5.x;
interface design, Interface Design Description (IDD) FSM 3.x.5; IDD 3.x; IRS 3.x, 3.y; OCD 3.3, 5.3;
5.4.1, 5.4.2, 5.6.1, 5.6.2, 5.6.3, 6.2, Fig 6, SCOM 3.2, 3.4, 3.6, 5.5.x; SDD 3, 4.3.x; SDP 3,
Fig 9-12, Fig 15-17; DBDD 3, 5.x; IDD; 4.2.4.3, 5.19.3; SIOM 3.2, 3.4, 3.6, 4.2.1, 4.3.1,
SDD 3, 4.3, 5, 5.x, 6; SPS 5.1; SSDD 3, 4.3, 5 4.3.2; SIP 3.7, SRS 3.3.x, 3.8, 3.18; SSDD 3,
interface requirements, Interface Requirements 4.3.x; SSS 3.3.x, 3.8, 3.18; STD 3, 4; STP 3.x.1,
Specification (IRS) 5.3.3, 5.5, 5.9, 5.11, 6.2, 3.x.2, 3.x.3, 4.2.x.y; STrP 3.1-3.4; SUM 3.2, 3.4,
Fig 6, Fig 9-12, Fig 15-17; IRS; SRS 3.3, 3.4; 3.6, 4.1.2; SVD 3.1, 3.2, 3.6 (also see assurance)
SSS 3.3, 3.4; STD 5; STP 6 problem reporting, problem/change reports 5.3.1,
ISO/IEC DIS 12207 Foreword-6 5.14.3, 5.15.2, 5.16.2, 5.17.1, 5.17.2, Fig 2,
Appendix C, Fig 4, Fig 5, F.3; SDP 5.17.1;
joint technical and management reviews 3.25, 4.1, Fig 1 STR 3.1, 4.x.2.y; SVD 3.3, 3.7
5.18, Fig 2-3, Fig 9-12, G.6.3, H.3; SDP 5.18 process improvement 5.19.7, Fig 9-12; SDP 5.19.7
candidate joint management reviews Appendix E program (computer program) 3.10
program strategies Appendix G, Fig 7, Fig 8, safety 4.2.4.1, Fig 2, B.3, Fig 5, E.4.11; COM 3;
Fig 9-11, H.4; SDP 3 DBDD 3, 5.x; FSM 3.x.6; IDD 3.x; IRS 3.x, 3.y;
programming languages 3.29, 4.2.2, 5.7.1, Fig 2; OCD 3.3, 5.3; SCOM 4, 5; SDD 3, 4.3.x;
DBDD 5.x; SDD 5.x; SDP 4.2.2; SRS 3.12; SDP 4.2.4.1; SIOM 4, 6; SIP 4.x.5, 5.x.2;
SSS 3.12 SRS 3.3.x, 3.7, 3.18; SSDD 3, 4.3.x;
(also see Computer Programming Manual) SSS 3.3.x, 3.7, 3.18; STD 3, 4; STP 4.2.x.y;
project planning 4.1, Fig 1, 5.1, Fig 3, Fig 6, STrP 3.1; SUM 4, 5; SVD 3.6
Fig 9-13; SDP 1.4, 5.1; SIP 1.4; STP 1.4; (also see assurance)
STrP 1.4 (also see software development planning) schedules 3.34; SDP 3, 6, 7.2; SIP 3.1, 4.x.1, 5.x.1;
project-unique identifiers 5.14.1; STP 5; STrP 5, 7
DBDD 3, 4, 4.x, 5, 5.x; IDD 3, 3.1, 3.x; cost/schedule reporting, cost/schedule risks
IRS 3, 3.1, 3.x; SDD 3, 4, 4.1, 4.3.1, 4.3.x, 5, 5.x; 5.18.1, 5.18.2, 5.19.1, 6.6, B.3, Fig 5
SRS 3, 3.3.1, 3.3.x; SSDD 3, 4, 4.1, 4.3.1, 4.3.x; guidance on scheduling activities G.6.4
SSS 3, 3.3.1, 3.3.x; STD 3.x, 4.x, 4.x.y; guidance on scheduling deliverables H.4
STP 4.2.x, 4.2.x.y; STR 4.x, 4.x.2.y, 4.x.3.y scope of MIL-STD-498 1
prototypes 5.3.1, G.6.1, Fig 11, Fig 13 security 4.2.4.2, 5.19.3, Fig 2, B.3, Fig 5, E.4.11,
Fig 9-12; all DIDs 10.1.c, 1.3; COM 3.2.4;
qualification methods IRS 3, 4; SPS 4; SRS 3, 4; DBDD 3, 4.x, 5.x; FSM 3.x.5; IDD 3.x; IRS 3.x,
SSS 3, 4; STP 4.2.x.y 3.y; OCD 3.3, 5.3; SCOM 3.3, 3.4.1, 3.4.2, 3.7,
qualification testing 3.28, 3.43, 5.2.2, 5.4.1, 5.9, 5.11 5.5.x; SDD 3, 4.3.x; SDP 3, 4.2.4.2, 5.19.3, 7.2;
(also see CSCI qualification testing, SIOM 3.2, 3.4, 3.6, 4.2.1, 4.3.1, 4.3.2; SIP 3.7,
system qualification testing) 4.x.2; SRS 3.3.x, 3.8, 3.18; SSDD 3, 4.3.x;
quality assurance (see software quality assurance) SSS 3.3.x, 3.8, 3.18; STD 3, 4; STP 3.x.1, 3.x.2,
quality factors (reliability, maintainability, etc.) 3.x.3, 4.2.x.y; STrP 3.1-3.5; SUM 3.2, 3.4, 3.6,
B.3; DBDD 3; OCD 3.3, 5.3; SDD 3; SRS 3.11; 4.1.2; SVD 3.1, 3.2, 3.6 (also see assurance)
SSDD 3; SSS 3.11 software 3.32, et al
Software Center Operator Manual (SCOM) 5.12.3.3,
rationale (recording rationale) 3.34, 4.2.6, 5.2.4; 6.2, Fig 6, Fig 9-12, Fig 15, Fig 17; SCOM
all DIDs: Notes section; SDP 4.2.6 (also see Software Input/Output Manual,
record (interpretation of "record" in MIL-STD-498) Software User Manual)
1.2.4.4 software configuration management 3.12, 4.1, Fig 1,
redocumentation 3.29, Fig 12 5.14, Fig 2, Fig 3, Fig 9-12; SDP 5.14
reengineering Foreword-4, 1.2.1, 1.2.4.3, 3.29, 3.33, configuration audits 5.14.4; SDP 5.14.4
G.5, Fig 12; SDD 4.1; SSDD 4.1 configuration control, author control, project-level
requirement (definition of requirement) 3.30 control, acquirer control 5.14.1, 5.14.2, 5.14.3,
requirements analysis (see software requirements analysis, 5.15.2, 5.16.2, 5.17.1, 5.17.2, F.3; SDP 5.14.2
system requirements analysis, traceability) configuration identification 5.14.1; SDP 5.14.1
resource utilization configuration status accounting 5.14.3; SDP 5.14.3
(see computer hardware resource utilization) packaging, storage, handling, and delivery (of
restructuring 3.29, Fig 12 software products) 5.14.5; SDP 5.14.5
retargeting 3.29, Fig 12 software design, Software Design Description (SDD)
reuse, reusable software products Foreword-4, 1.2.1, 3.17, 3.39, 3.45, 4.1, 4.2.6, Fig 1, 5.2.4, 5.6, 5.7,
1.2.4.3, 3.31, 3.33, 4.2.3; SDP 4.2.3 5.9.1, 5.11.1, 6.2, Fig 2, Fig 4, Fig 6, F.3, Fig 9-12,
developing reusable software products 4.2.3.2; Fig 15-17; DBDD 5; SDD; SDP 5.6; SPS 5.1
SDP 4.2.3.2; SDD 4.1; SSDD 4.1 (also see "as built" design descriptions,
evaluating reusable software products 4.2.3.1, behavioral design, CSCI architectural design,
B.3; SDP 4.2.3.1 CSCI detailed design, CSCI-wide design decisions,
incorporating reusable software products Database Design Description,
Foreword-3, 3.12, 4.2.3.1, Appendix B, Fig 3; Interface Design Description,
all DIDs: 10.1.i; SDP 4.2.3.1; SDD 4.1; system architectural design,
SSDD 4.1 System/Subsystem Design Description,
reverse engineering 3.29, Fig 12 system-wide design decisions)
reviews (see joint technical and management reviews) software design reviews E.4.6
revision and retesting 5.7.4, 5.8.3, 5.9.6, 5.10.3, software design standards 4.2.2, D.4.7;
5.11.6; SDP 5.7.4, 5.8.3, 5.9.6, 5.10.3, 5.11.6; DBDD 3, 4, 5; IDD 3; SDD 3, 4, 5; SDP 4.2.2;
STD 4.x.y, 4.x.y.3, 5; STP 4.1.3, 5 SPS 5.3; SSDD 3, 4
risk management 5.18.1, 5.18.2, 5.19.1, B.3, Fig 5, software development (meaning of term "software
G.4, Fig 8-12; SDP 4, 5, 5.19.1 development" in MIL-STD-498) Foreword-4, 1.2.1,
3.33
software development environment 1.2.2, 4.1, Fig 1, 5.2, software quality assurance 4.1, Fig 1, 5.16, Fig 2,
5.14.1, Fig 2, Fig 3, E.4.10, Fig 9-13; SDP 5.2 Fig 3, Fig 9-12; SDP 5.16
(also see software engineering environment, independence in software quality assurance 5.16.3;
testing - software test environment) SDP 5.16.3
software development files (SDF) 3.34, 5.2.4, 5.7.2, software quality assurance evaluations 5.16.1;
5.7.4, 5.7.5, 5.8.1, 5.8.3, 5.8.4, 5.9.4, 5.9.6, 5.10.1, SDP 5.16.1
5.10.3, 5.10.4, 5.11.4, 5.11.6, Fig 3, software quality assurance records 5.16.2;
Fig 6; SDP 5.2.4 SDP 5.16.2
software development library (SDL) 3.35, 5.2.3, Fig 3; software requirements, Software Requirements
SDP 5.2.3 Specification (SRS) 5.5, 5.9, 6.2, Fig 2-4, Fig 6,
software development methods Foreword-5, 4.2.1; Fig 9-12, Fig 15-17; SRS; STP 6
SDP 4.2.1 (also see CSCI requirements)
software development planning, Software Development software requirements analysis 4.1, Fig 1, 5.5,
Plan (SDP) 1.2.4.2, 4.1, 5.1.1, 5.14.2, 5.16.1, Fig 3, Fig 9-12; SDP 5.5
5.16.2, 5.17.1, 5.17.2, 5.19.1, 5.19.2, 5.19.7, 6.2, software requirements review E.4.5
Fig 2, B.3, Fig 4, Fig 6, D.4.7, E.4.1, software support Foreword-3, 3.35, 3.41, 3.44, 4.2.6,
Fig 9-13, G.6.3, H.4, Fig 15-17; SDP 5.2.3, 5.2.5, 5.13.4, Fig 2, Fig 3, Fig 6, Fig 14;
software development process 3.36, 4.1, Fig 1, 5.19.1, SPS 5; SRS 3.15; SSS 3.15; STrP
5.19.2, 5.19.7, 6.5, Fig 2, Appendix G, H.4 (also see maintenance)
software engineering environment 1.2.2, 3.37, 3.38, support agency 3.44, 4.2.6 5.1.5, 5.13, 5.13.4,
4.2.7, 5.2.1, 5.2.3, Fig 13; SDP 5.2.1; CPM 3 5.13.7, E.4.10, G.6.1, Fig 13;
software implementation and unit testing 4.1, Fig 1, OCD 3.6, 5.5, 7.1, 7.2, 7.3
5.7, 5.9.1, 5.11.1, Fig 3, Fig 6, Fig 9-12, Fig 14; support concept 3.12, 5.1.5; OCD 3.5, 5.5
SDP 5.7 support manuals 5.13.6, Fig 4, Fig 6, E.4.10;
Software Input/Output Manual (SIOM) 5.12.3.2, 6.2, SDP 5.13.6; STrP 3.2, 3.3, 3.4
Fig 6, Fig 9-12, Fig 15, Fig 17; SIOM support site 5.13.1-5.13.3, 5.13.7;
(also see Software Center Operator Manual, SDP 5.13.3, 5.13.7
Software User Manual) supportability reviews E.4.10
software installation planning, Software Installation Plan software systems 1.2.4.1, 1.2.4.2, 3.42, Fig 14;
(SIP) 5.1.4, 6.2, Fig 4, Fig 6, E.4.1, Fig 9-13, SSS 3.9, 3.10
Fig 15, Fig 17; SDP 5.1.4; SIP software testing (see testing)
software maintenance (see software support) software transition 3.44
software management indicators Foreword-3, 5.19.2, software transition planning, Software Transition Plan
Fig 2, Appendix F, Fig 9-12; SDP 5.19.2 (STrP) Fig 1, 5.1.5, 6.2, Fig 4, Fig 6, E.4.1,
software plan reviews E.4.1 Fig 9-12, Fig 15-17; SDP 5.1.5; STrP
software products 1.2.1, 3.16, 3.26, 3.31, 3.39, et al software transition (preparing for) 4.1, Fig 1, 5.13,
(also see standards for software products) Fig 3, Fig 9-12; SDP 5.13
software product evaluation 4.1, Fig 1, 5.15, 5.16.1, software unit 3.24, 3.45, 5.2.4, 5.6, 5.7, 5.8, 5.13.4,
Fig 2, Fig 3, Appendix D, Fig 6, Fig 9-12; SDP 5.15 5.14.1, B.4, Fig 3, Fig 6, F.3, G.6.4, Fig 14, Fig 15,
independence in software product evaluation Fig 17; DBDD 5, 5.x, 6;
5.15.3; SDP 5.15.3 SDD 3, 4.1, 4.2, 4.3, 4.3.1, 5, 5.x, 6; SPS 5.4, 6;
in-process and final software product evaluations SRS 3.12 (also see testing)
5.15.1; SDP 5.15.1 software use (preparing for) 4.1, Fig 1, 5.12, Fig 3,
software product evaluation criteria 5.15.1, 5.18.1, Fig 9-12
Fig 6, D.4; SDP 5.15.1 software usability reviews E.4.9
software product evaluation records 5.15.2; Software User Manual (SUM) 5.12.3, 5.12.3.1, 6.2,
SDP 5.15.2 Fig 2, Fig 3, Fig 4, Fig 6, E.4.9, Fig 9-12, Fig 15-17;
software products subject to evaluation D.3, Fig 6 SDP 5.12.3; SIP 3.5, 4.x.6, 5.x.2, 5.x.3;
Software Product Specification (SPS) 5.12.1, 5.13.1, STrP 3.2-3.4; SUM (also see Software Center
5.13.2, 5.13.4, 5.13.6, 6.2, Fig 6, E.4.10, Fig 9-12, Operator Manual, Software Input/Output Manual)
Fig 15-17; SPS Software Version Description (SVD) 5.12.2, 5.13.3,
software quality 3.40 6.2, Fig 3, Fig 6, D.4.1, E.4.9, E.4.10, Fig 9-12,
Fig 16, Fig 17; SDP 5.12.2, 5.13.3; SVD
(also see version/revision/release)
source code, source files 3.29, 5.7.1, 5.12.1, 5.13.2,
5.13.4, B.3, Fig 3, Fig 4, Fig 6, F.3;
SDP 5.7.1, 5.13.2; SPS 3.2, 3.3, 5.1; SVD 3.2
delivery of source files SPS 3.2
staffing F.3, Fig 8; OCD 3.4, 5.4, 7.2; SDP 7; CSCI qualification testing 3.28, 3.43, 4.1, Fig 1,
SIP 3.5, 3.6, 4.x.4; STP 3.x.6, 3.x.7; STrP 3.5 5.1.2, 5.2.2, 5.9, 5.9.5, Fig 3, Fig 6, E.4.7, E.4.8,
standardization documents related to MIL-STD-498 Fig 2 Fig 9-13; IRS 4; SDP 5.1.2, 5.9; SRS 4;
standards for software products 4.2.2, SDP 4.2.2; STD; STP; STR
SPS 5.3 (also see code standards, developer-internal testing 3.34, 5.4.1, 5.8, 5.9,
software design standards) 5.10, 5.11; SDP 5.9.4, 5.11.4
states and modes of operation COM 3.2; DBDD 3, 4, 5; dry run of qualification testing 5.9.4, 5.11.4, Fig 6,
IDD 3; IRS 3; OCD 3.3, 5.3, 6, 7.1, 7.2; D.4.2
SCOM 3.5; SDD 3, 4, 5; SIOM 3.5; SRS 3.1; revision and retesting 5.7.4, 5.8.3, 5.9.6, 5.10.3,
SSDD 3, 4; SSS 3.1; SUM 3.5 5.11.6; SDP 5.7.4, 5.8.3, 5.9.6, 5.10.3, 5.11.6;
Statement of Work (SOW) 1.2.1, 6.5, 6.7, Fig 6, STD 4.x.y.3, 4.x.y.5; STP 4.1.3, 5
D.4.5, D.4.10, G.6.3 Software Test Description (STD) 5.9.3, 5.11.3, 6.2,
subcontractor, subcontractor management Foreword-4, Fig 2, Fig 4, Fig 6, D.4.2, E.4.7, Fig 9-12,
1.2.1, 3.5, 4.2.7, 5.4.1, 5.19.4; SDP 4.2.7, 5.19.4 Fig 15-17; STD
subsystem 1.2.4.1, 5.2.4, 5.3.3, E.4.3, E.4.4; SSDD; software test environment 1.2.2, 3.37, 3.43, 4.2.7,
SSS (also see system/subsystem) 5.2.2, Fig 3, E.4.7, Fig 13; SDP 5.2.2; STP 3;
support (of software) (see software support, STR 3.2
software transition planning) software test planning, Software Test Plan (STP)
system (also see operational concept) 5.1.2, 5.1.3, 6.2, Fig 4, Fig 6, E.4.1, Fig 9-12,
interpretation of "system" in MIL-STD-498 1.2.4.1, Fig 15-17; SDP 5.1.2, 5.1.3; STP
Fig 14 Software Test Report (STR) 5.9.7, 5.11.7, 6.2,
system architectural design 3.4, 5.4.2, Fig 3, Fig 4, Fig 4, Fig 6, D.4.2, E.4.8, Fig 9-12, Fig 15-17;
Fig 6, E.4.4; SDP 5.4.2, 5.13.5; SSDD 4 SDP 5.9.7, 5.11.7; STR
system design 3.17, 4.1, Fig 1, 5.4, 5.13.5, Fig 3, system qualification testing 3.28, 3.43, 4.1, Fig 1,
Fig 4, E.4.4, Fig 9-12; SDP 5.4, 5.13.5 5.1.3, 5.2.2, 5.11, 5.11.5, Fig 3, Fig 6, E.4.7,
system qualification testing 3.28, 3.43, 4.1, Fig 1, E.4.8, Fig 9-13; IRS 4; SDP 5.11; SSS 4;
5.1.3, 5.2.2, 5.11, 5.11.5, Fig 3, Fig 6, E.4.7, STD; STP; STR
E.4.8, Fig 9-13; IRS 4; SDP 5.11; SSS 4; target computer system (testing on) 5.9.2, 5.11.2;
STD; STP; STR SDP 5.9.2, 5.11.2
system requirements 5.3.3, 5.4.2, 5.5, 5.11.3, Fig 3, test classes (timing, erroneous input,
Fig 4, Fig 6; DBDD 3, 4, 6; IDD 4; SDP 5.3.3; maximum capacity) STP 4.1.2, 4.2.x.y
SSDD 3, 4, 4.1, 5; STD 4.x.y.1, 5; STP 6 test information for developer-internal testing
(also see Interface Requirements Specification, (test cases, test descriptions, test procedures,
System/Subsystem Specification) test results) 3.34, 3.39, 5.2.4, 5.7.2, 5.8.1,
system requirements analysis 4.1, Fig 1, 5.3, Fig 3, 5.10.1, Fig 2-4, Fig 6, D.4.2
Fig 9-12; SDP 5.3 test readiness reviews E.4.7
system test planning 5.1.3; STP test results reviews E.4.8
system-wide design decisions 3.6, 5.4.1, 5.10.1, unit testing 5.7, Fig 3, F.3, Fig 9-12; SDP 5.7
Fig 3, Fig 6, E.4.4; SDP 5.4.1; SSDD 3, 4.1 unit integration and testing 4.1, Fig 1, 5.8, Fig 3,
System/Subsystem Design Description (SSDD) F.3, Fig 9-12; SDP 5.8
5.4.1, 5.4.2, 5.13.5, 5.13.6, 6.2, Fig 6, Fig 9-12, witnessed testing 5.9.4, 5.11.4; STR 5
Fig 15-17; SSDD traceability 5.4.2, 5.5, 5.6.2, 5.9.3, 5.11.3, 5.13.4;
system/subsystem design reviews E.4.4 DBDD 6; IDD 4; IRS 3, 5; SDD 6; SPS 5.4, 6;
system/subsystem requirements reviews E.4.3 SRS 3, 5; SSDD 5; SSS 3, 5; STD 4.x.y.1, 5;
System/Subsystem Specification (SSS) 5.3.3, 5.11, STP 4.2.x.y, 6
6.2, Fig 6, Fig 9-12, Fig 15-17; SSS training 5.1.4, Fig 2, Fig 3; OCD 7.1, 7.2;
SIP 3.4, 3.5; SRS 3.14; SSS 3.14; STrP 5, 7
tailoring MIL-STD-498 and its DIDs Foreword-9, 1.2.2, support agency training 5.13.7; STrP 5, 7
1.2.3, 5.12.1, 6.5, Fig 2, G.6, G.6.3, H.3, H.6; user training 5.12.4; SIP 3.4, 3.5; STP 3.x.8
all DIDs: 10.1.f transition (of software) (see software transition)
target computer system (testing on) 5.9.2, 5.11.2; translation 3.29, Fig 12
SDP 5.9.2, 5.11.2
testing Fig 2, D.4.13; STD; STP; STR unit, unit testing, unit integration and testing
advance notice of qualification testing 5.9.3, 5.9.6, (see software unit, testing)
5.11.3, 5.11.6 user manuals (see software user manuals)
CSCI/HWCI integration and testing 4.1, Fig 1, 5.10,
Fig 3, Fig 6, Fig 9-12; SDP 5.10
version/revision/release 3.7, 5.14.1-5.14.3, B.3; Work Breakdown Structure (WBS) 6.6, Fig 2
all DIDs: 10.1.c; DBDD 3, 5.x; FSM 3.x.4;
SDD 4.1, 4.3.1; SDP 4.2.2, 5.17.1; SIP 4.x.2;
SPS 5.2; SRS 3.3.1, 3.10.3; SSDD 4.1, 4.3.1;
SSS 3.3.1, 3.10.3; STR 5; STrP 3.2-3.4; SVD
Review Activities:
OSD - SO, IR, NT
Army - AR, CR, MI, AV
Navy - AS, SH, SA, TD, OM, MC
Air Force - 02, 06, 11, 13, 17, 19
DMA - MP
DNA - DS
NSA - NS
6. SUBMITTER
   a. NAME (Last, First, Middle Initial)
   b. ORGANIZATION
   c. ADDRESS (Include Zip Code)
   d. TELEPHONE (Include Area Code): (1) Commercial; (2) AUTOVON (if applicable)
7. DATE SUBMITTED (YYMMDD)
8. PREPARING ACTIVITY
   a. NAME: Space & Naval Warfare Systems Command
   b. TELEPHONE (Include Area Code): (1) Commercial (703) 602-4491; (2) AUTOVON 332-4491
   c. ADDRESS (Include Zip Code): SPAWAR 10-12, 2451 Crystal Drive (CPK-5), Arlington, VA 22245-5200
   IF YOU DO NOT RECEIVE A REPLY WITHIN 45 DAYS, CONTACT: Defense Quality and Standardization Office,
   5203 Leesburg Pike, Suite 1403, Falls Church, VA 22041-3466, Telephone (703) 756-2340, AUTOVON 289-2340
DD Form 1426, OCT 89          Previous editions are obsolete          198/290