F"DI-TI -89- 303 MIR- I Ui-4

UThe CCPDS-R AD-A216 827


I Software
* Engineering

I
Exercise
(SEE) S DTIC
LECTE
S JAN16 19901D
G. A. HLuffB
S. NI. Mlorowski

Nox&vember 1989

* Scenario
Task Generation
d nication

K NUDET
* Processing

NUDET ~ Preparedfor
Prprm anager t)r CCI'!)- R P'rogra.i

Airs aso )tFAi!rcc,


i o~ Badse, NA.NM~thLISvt(.N

Alp r n/16r
ti, rele'as. (listninitton unlimited

I 90 01 16 048 MITREchsct
REPORT DOCUMENTATION PAGE

1a. REPORT SECURITY CLASSIFICATION: Unclassified
3. DISTRIBUTION/AVAILABILITY OF REPORT: Approved for public release; distribution unlimited.
4. PERFORMING ORGANIZATION REPORT NUMBER(S): MTR-10544
5. MONITORING ORGANIZATION REPORT NUMBER(S): ESD-TR-89-303
6a. NAME OF PERFORMING ORGANIZATION: The MITRE Corporation
6c. ADDRESS: Burlington Road, Bedford, MA 01730
8a. NAME OF FUNDING/SPONSORING ORGANIZATION: Program Manager for CCPDS-R Program, Space and Missile Warning Systems Program Office
8b. OFFICE SYMBOL: ESD/SR
8c. ADDRESS: Electronic Systems Division, AFSC, Hanscom AFB, MA 01731-5000
9. PROCUREMENT INSTRUMENT IDENTIFICATION NUMBER: F19628-89-C-0001
10. SOURCE OF FUNDING NUMBERS: Project No. 1022A
11. TITLE (Include Security Classification): The CCPDS-R Software Engineering Exercise (SEE)
12. PERSONAL AUTHOR(S): Huff, G. A., Maciorowski, S. M.
13a. TYPE OF REPORT: Final
14. DATE OF REPORT (Year, Month, Day): 1989 November
15. PAGE COUNT: 97
18. SUBJECT TERMS: Ada; Contractor Assessment; Software Acquisition; Software Development Methodology; Software Engineering Exercise; Source Selection
19. ABSTRACT: To evaluate the software engineering capabilities of potential offerors during the Command Center Processing and Display System-Replacement (CCPDS-R) Full-Scale Development/Production source selection, ESD and MITRE project personnel devised a software engineering exercise (SEE) to be carried out by all offerors. The SEE, first used on CCPDS-R, has since been utilized as a standard source selection technique by ESD and other agencies. This report describes the CCPDS-R SEE concept and provides a history of the activities and decisions made in defining and carrying out this first SEE. It documents the SEE material contained in the CCPDS-R Request for Proposal package and the SEE ground rules and problem specification issued to the CCPDS-R offerors. It also identifies lessons learned and makes recommendations for future programs which may wish to conduct a SEE.
20. DISTRIBUTION/AVAILABILITY OF ABSTRACT: Unclassified/Unlimited
21. ABSTRACT SECURITY CLASSIFICATION: Unclassified
22a. NAME OF RESPONSIBLE INDIVIDUAL: Judith Schultz
22b. TELEPHONE (Include Area Code): (617) 271-8087
22c. OFFICE SYMBOL: Mail Stop D135

ACKNOWLEDGMENT
This report has been prepared by The MITRE Corporation under Project No. 022A, Contract
No. F19628-89-C-0001. The contract is sponsored by the Electronic Systems Division, Air Force
Systems Command, United States Air Force, Hanscom Air Force Base, Massachusetts 01731-5000.
The authors wish to thank Steven D. Litvintchouk and John A. Maurer of The MITRE
Corporation for the contributions they made to the Government's effort. Their support contributed
greatly to the overall success of the CCPDS-R Software Engineering Exercise.


TABLE OF CONTENTS

SECTION PAGE

1 Introduction 1

1.1 CCPDS-R Background 2


1.1.1 System Description 2
1.1.2 Program Description 3
1.2 Overview 3
1.2.1 SEE Objectives 3
1.2.2 CCPDS-R SEE System Description 4
1.3 Scope 4

2 MITRE Dry Run 7

2.1 SEE Team 7
2.2 Selected Tools and Methodologies 8
2.3 Schedule 8
2.4 Requirements Analysis Activities 10
2.5 Design Activities 13
2.6 Resulting SEE Design 15
2.7 SEE Documentation 15
2.7.1 SEE Specification 15
2.7.2 Instructions for the Offeror 16

3 Plans for Evaluating the Offerors 21

3.1 Source Selection Terminology Overview 21
3.2 SEE Source Selection Approach 21
3.2.1 SEE Evaluation Criteria 21
3.2.2 SEE Scoring Method 22
3.2.3 Discriminator Issues 23
3.3 Evaluation Tools and Techniques 24
3.3.1 EASE 24
3.3.2 Checklist Questions 25
3.4 Release of SEE to Offerors 25
3.5 Government Evaluation Approach 27
3.5.1 First-Pass Evaluation 27
3.5.2 Audit 27

TABLE OF CONTENTS (Continued)

SECTION PAGE

4 Dry Run Lessons Learned 29
4.1 Requirements Analysis 29
4.1.1 Time Allocation 29
4.1.2 Government Interaction 30
4.1.3 Formal Methodology 30
4.1.4 Completion of Phase 30
4.2 Object-Oriented Design 30
4.3 Ada 31
4.3.1 Flow of Control 31
4.3.2 ADL 32
4.3.3 Personnel 32
4.4 DOD-STD-2167 34

5 Formal Conduct of the SEE 37


5.1 Issuance to the Offerors 37
5.2 Products Received 37
5.3 Government Evaluation Team 38
5.4 Evaluation Tools and Techniques 38
5.4.1 EASE 38
5.4.2 Checklist Questions 38
5.5 Government Evaluation Approach 39
5.5.1 First-Pass Evaluation 39
5.5.2 Audit 40
5.5.3 Evaluation Completion 41

6 Source Selection Lessons Learned 43
6.1 Deliverable Products 43
6.2 Exercise Scope and Duration 43
6.3 Evaluation Tools and Techniques 44
6.3.1 EASE 44
6.3.2 Checklist Questions 44
6.3.3 Word Processing Capabilities 44
6.4 In-House Audit 45
6.4.1 Detailed Questions 45
6.4.2 SEE Transcripts 45

TABLE OF CONTENTS (Concluded)

SECTION PAGE

7 Offeror Feedback 47
7.1 Size 47
7.2 Appropriateness 47
7.3 Resources 47
7.4 Benefits 47

8 Conclusions/Recommendations 49
8.1 Dry Run 49
8.1.1 Objectives 49
8.1.2 Software Engineering and Ada 50
8.2 Actual Source Selection 51
8.2.1 CCPDS-R SEE Objectives 52
8.2.2 General Observations 52

List of References 55

Appendix A SEE Instructions for the Offeror and Exercise Specification 57

Appendix B CCPDS-R RFP SEE Material 69

Appendix C CCPDS-R Section M SEE Material 77

Appendix D SEE Questionnaire 79

Glossary 85

LIST OF ILLUSTRATIONS

FIGURE PAGE

1 Projected MITRE SEE Dry-Run Schedule 9
2 Actual MITRE SEE Dry-Run Schedule 11
3 MITRE's Overall SEE Architecture 12
4 Sample Buhr Diagram: RETRIEVE_EVENT 14
5 Government and Offeror Activities During the Source Selection Period 26
6 DOD-STD-2167 Products and Reviews as Tailored for CCPDS-R 35

SECTION 1

INTRODUCTION

During the source selection for a software intensive system, an offeror is usually evaluated on
his software engineering approach for managing and developing the software for the subject system.
Areas of evaluation include the offeror's methodologies, tool sets, software development plan (SDP)
and staffing. While evaluation of an offeror's software engineering approach during source selection
gives insight into how the offeror intends to implement the software for the system, the evaluation is
limited because it cannot give insight into the offeror's own expertise with the selected methodology.
Frequently, the Government has assessed an offeror's software engineering approach during source
selection as adequate only to discover once on contract that the offeror is not well versed in the
proposed methodology and tool set or that the offeror does not follow the SDP. As a result of the
offeror's lack of expertise in the selected software development approach or failure to follow a firm
plan, significant cost and schedule slips are often encountered during the software development phase.

In an attempt to limit further occurrences of this situation, the Electronic Systems Division
(ESD) of the Air Force Systems Command (AFSC) determined the need for a method to be used during
source selection for evaluating not only an offeror's software development plan but also the offeror's
expertise in the proposed software development approach. The need for this method was perceived as
even greater within the next few years due to the recent Department of Defense (DOD) directives that
mandated the use of Ada as an implementation language. It was feared that proposals would be
submitted by offerors who were not well trained in Ada as a software engineering methodology. To
resolve this situation, therefore, ESD and MITRE conceived the idea of a source selection software
engineering exercise (SEE). As conceived, the purpose of the exercise was to measure the degree of
risk associated with the offeror's software development approach by testing the offeror's proposed
methodology, as demonstrated through the offeror's actual implementation of a small exercise system,
and the offeror's ability to organize a SEE team knowledgeable in the proposed software engineering
approach and Ada.
The Command Center Processing and Display System-Replacement (CCPDS-R) program was
the first of five ESD programs to date to use a SEE during source selection. ESD, with technical
support from The MITRE Corporation, tailored the SEE concept for CCPDS-R, determined the
approach for incorporating the SEE into the source selection process, and drafted the actual exercise
specification which would serve as the basis for the CCPDS-R SEE. Prior to actually using the SEE
during the CCPDS-R source selection, the Government determined that the best way to finalize the SEE
concept, specification and evaluation approach was for MITRE to implement the exercise itself. In that
way, the Government would best be able to assess the feasibility of the SEE, including the scope of the
SEE and the time that would be made available to the offerors for conducting the SEE; to ensure that the
exercise specification which would be given to the offerors was well written and sufficiently
challenging; and to identify meaningful criteria that would be used to evaluate the offeror's SEE results.

This report provides a history of the activities and decisions made in defining and carrying out
the SEE for the CCPDS-R full scale development/production (FSD/P) phase source selection, together
with a rationale for those activities and decisions, and a discussion of the Government's experiences
using the SEE on CCPDS-R. In particular, it describes MITRE's dry run of the SEE as part of the
source selection preparation effort, and identifies lessons learned prior to the start of the source
selection. It also describes the actual execution of the SEE during the CCPDS-R source selection and
the lessons learned during that period. Appendices to this report contain the CCPDS-R SEE exercise
specification and ground rules provided to the CCPDS-R FSD/P offerors, as well as the SEE
information included in the CCPDS-R FSD/P request for proposal (RFP) package.

1.1 CCPDS-R BACKGROUND

1.1.1 System Description
CCPDS-R will replace the current Command Center Processing and Display System (CCPDS)
and the missile warning function of the NORAD Computer System. The current CCPDS is located at
four command centers: Headquarters, Strategic Air Command (SAC), Cheyenne Mountain Air Force
Base (CMAFB), National Military Command Center (NMCC), and the Alternate National Military
Command Center (ANMCC). CCPDS is dedicated to the receipt, processing and display of ballistic
missile tactical warning and attack assessment (TW/AA) information. In addition, CCPDS performs
associated command center unique functions for use by the national command authorities, chairman of
the Joint Chiefs of Staff, Commander-in-Chief, SAC, and Commander-in-Chief, North American
Aerospace Defense Command in making decisions related to the execution of the single integrated
operation plan, force/command, control and communications survival, and the use of strategic reserves
during all phases of nuclear engagement.

As defined by the new integrated tactical warning and attack assessment (ITW&A) architecture,
the CCPDS-R will consist of four subsystems. These are the CMAFB subsystem, the Offutt
Processing and Correlation Center (OPCC) subsystem, the SAC subsystem, and the processing and
display subsystem (PDS). The PDS is to be located at the CMAFB, NMCC, ANMCC, OPCC, and
SAC.

The CMAFB and OPCC subsystems, referred to as the common subsystems, will have
identical hardware and software. They will interface to all ballistic missile sensors via survivable and
non-survivable media, process the information received from those sensors, generate displays for local
consoles, integrate the missile warning information with other manually entered data on air, space, and
intelligence, and have the capability for distributing this correlated information to other command
centers and subscribers. The two common subsystems will process the same sensor information and
serve as mutual backups in case of failure of critical components. Both subsystems will be able to
distribute correlated ITW&A data to subscribers at a given time.
The SAC subsystem will be physically separate from the OPCC subsystem and will be solely
devoted to the support of the SAC force management and force survival missions. It will receive data
from either the OPCC or CMAFB common subsystems, from PDS, and from command-unique
interfaces. It will process the data and generate displays for consoles located at the SAC command
center and other locations at Offutt Air Force Base.

The PDS subsystem will be capable of receiving and displaying correlated ITW&A information
from the common subsystem, direct ballistic missile sensor data, and communication systems status
from the Survivable Communications Integration System (SCIS). It will be the primary system for
presentation of ITW&A information at the NMCC, ANMCC, and SAC.

1.1.2 Program Description

The CCPDS-R acquisition program consists of two phases: a concept definition/design (CD/D)
phase and a full-scale development/production phase. The CCPDS-R FSD/P effort is primarily a
software intensive effort using Department of Defense Standard (DOD-STD) 2167 [1], Ada as the
design language, and, unless a waiver is granted, Ada as the implementation language. The CCPDS-R
FSD/P effort thus requires that contractors be prepared to design and develop a real-time system in Ada
using modern software engineering practices. The CCPDS-R FSD/P contract was awarded in June
1987 to TRW.

The CCPDS-R CD/D phase, a year-long effort that concluded in August 1986, was primarily a
study effort. The CD/D contractors were TRW and Ford Aerospace and Communications Corporation,
both of which were expected to bid on the FSD/P contract. During the CD/D phase, the contractors
were required to submit draft SDPs. They were also required to perform several Ada-related activities
to analyze the feasibility of using Ada as the CCPDS-R implementation language and to demonstrate the
contractor's capability to design and develop a system in Ada, should the contractor propose to use Ada
as the implementation language. The specific Ada-related activities included:

a. Assess the feasibility of using Ada as the CCPDS-R implementation language, and evaluate
current Ada programming support environments for their suitability of meeting CCPDS-R
requirements

b. Devise a plan for efficient transition to Ada, if it is determined that the use of Ada is not
feasible on this program at the present time

c. Define and conduct a demonstration that shows the contractor's readiness to use Ada, if it
is determined that the use of Ada is feasible on this program now

d. Provide the rationale for choosing the Ada-based design language (ADL) and present an
example of the ADL.
Despite the above activities, the Government determined that the CD/D contractors had not yet
adequately demonstrated their ability to design and implement a real-time system in Ada using modern
software engineering practices. In general, the contractors had not demonstrated an end-to-end
application of their methodologies and Ada; they had only demonstrated the features of their tool sets.
Therefore, since software development and Ada constitute major risks on CCPDS-R, the Government
recognized the need for an additional method to better evaluate the software engineering and Ada
capabilities of these two contractors and, more importantly, of any other offerors who might submit

proposals for the FSD/P phase. To that end, the Government developed the software engineering
exercise as part of the FSD/P source selection process.

1.2 OVERVIEW

1.2.1 SEE Objectives

The Government viewed the SEE as a practical way to assess each offeror's software
engineering capability prior to FSD/P contract award. The SEE, which consists of a small system to be
designed by each of the offerors, was intended to measure the degree of risk associated with the
offeror's software development methodology, as documented in the offeror's software development
plan. It focused specifically on the offeror's software development methodology, as demonstrated by
the actual application of the methodology to the exercise system, and on the offeror's ability to organize
a team for the SEE, fully knowledgeable in the proposed methodology and Ada.
Based on the offerors' performance on the SEE, the Government expected that it would be
better able to evaluate the offerors' probability of success. In particular, if an offeror failed the exercise,
it would be assessed that the offeror had low probability of implementing CCPDS-R within the
proposed cost and schedule. If an offeror successfully completed the exercise, it would not guarantee
that the offeror would be able to complete CCPDS-R successfully; however, it would provide some
level of confidence in the offeror's ability to implement CCPDS-R. In either case, it would provide
early identification of problem areas in the offeror's software approach, thereby enabling the
Government to concentrate on these areas immediately at the start of the FSD/P phase, should the
offeror be awarded the contract.
The Government did not intend to evaluate every aspect of software development via the SEE.
In particular, the Government did not plan to evaluate those areas that either would not scale up to a
large software development effort or would not provide meaningful or discriminating source selection
information. As eventually defined, the SEE was intended to evaluate the requirements analysis and
design methodologies, the actual SEE design, and the team expertise for the exercise. The SEE was not
intended to evaluate coding, testing, integration, productivity, quality assurance, configuration
management, software metrics, full compliance with DOD-STD-2167, schedule, and management of
subcontractors. The Government elected to evaluate the offerors' proposals in these areas by following
the traditional source selection evaluation approach.
1.2.2 CCPDS-R SEE System Description
For the CCPDS-R SEE to be a meaningful measure of an offeror's ability to design and
develop a real-time system like CCPDS-R, the Government felt that the SEE system would have to
require analysis of quantitative performance requirements and concurrent processing, like CCPDS-R; it
would have to be relevant to the CCPDS-R mission; and it would have to be of suitable size and
complexity so that it could be done in a reasonably short period of time. The system devised for the
SEE consists of a missile-warning scenario generator and simulator. The exercise system allows the
user to create and edit scenarios consisting of missile-warning events (where an event is a missile
launch or nuclear detonation) and to run in real-time a missile warning simulation controlled by a
selected scenario. The exercise system also allows the user to run a particular scenario
simulation while editing that same scenario file. This requirement forces the offerors to address real-
time, concurrent operations comparable to those found in CCPDS-R. Appendix A contains the
CCPDS-R SEE system specification as provided to the offerors.

1.3 SCOPE

This report summarizes the Government's efforts on the software engineering exercise both in
preparation for and during the CCPDS-R FSD/P source selection. Sections 2 through 4 address
Government SEE efforts prior to the start of the CCPDS-R FSD/P source selection. In particular,
section 2 addresses MITRE's own approach for dry running the SEE, section 3 describes the plans

devised by the Government for evaluating the offerors' SEE results during source selection, and section
4 summarizes the lessons learned from the MITRE dry run of the SEE. Sections 5 through 7 describe
the actual execution of the SEE during the CCPDS-R FSD/P source selection. Section 5 summarizes
the actual source selection conduct of the SEE, including the process of issuing the SEE to the offerors,
the SEE products received from the offerors, and the Government's approach to evaluation of the
offeror's SEE products. Section 6 describes the lessons learned from administering the SEE during
source selection, and section 7 summarizes the offerors' feedback concerning the use of the SEE.
Finally, section 8 provides an overall summary of conclusions and recommendations reached both prior
to conducting the SEE during the CCPDS-R FSD/P source selection and as a result of using the SEE in
the CCPDS-R FSD/P source selection.

SECTION 2

MITRE DRY RUN

Once the Government devised the SEE concept and completed the SEE requirements definition
and preliminary SEE system specification (also referred to as the exercise specification), but prior to
giving the SEE to the offerors, MITRE assembled a team to dry run the SEE. The primary objectives of
the dry run were to generate a clearly defined SEE system specification, to develop the ground rules for
the offerors to follow when conducting the SEE, to identify a set of discriminating SEE source selection
evaluation criteria, and to assess whether the SEE could reasonably be done in the time allotted to the
offerors. The secondary objectives of the effort were to further educate CCPDS-R staff in requirements
analysis and design methodologies, ADL, and Ada, and to gain familiarity with DOD-STD-2167, a new
software development standard for DOD acquisitions.

This section describes the MITRE dry run of the SEE. In particular, it discusses the makeup of
the MITRE SEE team; the tools, techniques, and methodologies selected for carrying out the
implementation, and the approaches taken for educating team members in them; the schedule; the
activities that occurred during requirements analysis; the activities that occurred during the design phase;
an overview of the resulting SEE design; and the source selection documentation that was produced for
the SEE using the results of the dry run.

2.1 SEE TEAM

The MITRE SEE team consisted of eight people with assigned roles. The particular roles,
along with the planned percentage of total time to be devoted to the SEE, are as shown below.

Role Number of Individuals Percentage of Time

User/"Government" Representative 1 30
Software Development Manager 1 30
Technical Lead 2 80, 50
Ada Consultant 1 5
Designer 2 80, 60
Designer/Recorder 1 80

All team members had in common a computer software background and knowledge of either
PASCAL or similar higher order languages. Only two team members could be considered both
software engineering/Ada experts with extensive experience. Two other team members had
considerable Ada experience, while the remaining team members had little or no actual Ada experience.
At the start of the dry run, the team had no software development methodology or tool set in place.


2.2 SELECTED TOOLS AND METHODOLOGIES


To overcome its lack of a predetermined software development methodology and environment,
the SEE team selected a number of tools and methodologies for dry running the SEE, and then
undertook efforts to become educated in them. The particular tools and methodologies selected were as
follows:

a. The team elected to develop the exercise system in accordance with DOD-STD-2167, as
tailored for the CCPDS-R program [2], and using Ada as the design language, since the
offerors would be required to do this.

b. For a design methodology, the team selected object-oriented design (OOD) as defined by
Grady Booch in "Software Engineering with Ada" [3]. Booch's version of OOD was
selected because it was one of the better known and documented methodologies, several of
the team members were acquainted with it, and it was expected that potential CCPDS-R
FSD/P offerors might propose a similar methodology.

c. For a software development environment, the team chose the VAX/VMS Ada environment
since it was readily accessible via the MITRE Bedford Computer Center, most team
members were familiar with it, and it provided sufficient capabilities to meet the demands
of the exercise.

d. As a graphical design representation technique, the team selected Buhr diagrams as defined
in "System Design With Ada" [4] because the technique is designed for use with Ada, it is
compatible with Booch's OOD, and it provided a more extensive mapping from Ada and
design constructs than did Booch's notation.

e. For the Ada-based design language, the team chose a draft ADL standard that had been
developed for another ESD project. A number of team members were acquainted with it.
I
f. The team did not select any particular methodology for requirements analysis, primarily
because the team initially felt that the exercise was relatively small, all members understood
the requirements clearly, and no data flow/data dictionary tools were readily available.

To become educated in all of these selected tools and procedures, the team studied numerous
articles, participated in group discussions, and conducted demonstrations under the tutelage of the
technical leads and consultant. The total time allocated throughout the effort for education in the tools
and methodologies was minimal, estimated at approximately five days distributed over a 2-week period.

2.3 SCHEDULE
Prior to commencing the actual design and development of the SEE, the team technical leads
developed a schedule and work plan for the effort. Figure 1 depicts this initial projected schedule. This
schedule, though longer than that anticipated for the CCPDS-R offerors, was considered justifiable
since the team required training in the methodology, which the offerors should not; the team members
were not dedicated full time, as the offerors' members were expected to be; and the team needed to
prepare additional documentation not required of the offerors. As figure 1 reveals, the projected effort
would extend over a 2 1/2 month period, with 1 week allocated for requirements analysis, 3 1/2 weeks
for design, 1 1/2 weeks for coding of selected portions, and 2 1/2 weeks for developing the source
selection documentation (e.g., final exercise specification, evaluation criteria, etc.). This initial
schedule represented an accelerated effort based on an anticipated 15 July 1986 release of the CCPDS-R
RFP package. The actual schedule followed for the SEE dry run, however, turned out to be
considerably longer. Figure 2 depicts the actual MITRE SEE dry run schedule. The primary reasons
why the team deviated from the original schedule were that team members were unable to devote as
much time as originally planned, particular efforts, such as requirements analysis, took much longer
than estimated (see section 4.1.2), and unrelated delays occurred in the CCPDS-R RFP release which
obviated the need for the original, accelerated schedule.
2.4 REQUIREMENTS ANALYSIS ACTIVITIES
The team commenced its dry run of the SEE by conducting an analysis of the SEE specification
requirements. Input to this requirements analysis effort was the draft SEE system specification
described in section 1.2.2. The team assumed at the start of the requirements analysis phase that the
draft exercise specification was essentially free of major ambiguities and inconsistencies. This
assumption was based on a quick reading of the draft specification and the feeling of the team that such
a short specification probably did not have any serious problems in it. The team's main objectives for
this phase were to define a software architecture that clearly identified the computer software
configuration items (CSCIs) for the exercise system, to create adequately detailed software requirements
specifications (SRSs) that provided the technically important portions of the DOD-STD-2167 data item
description (DID), such as the definition of inter-CSCI interfaces, and to identify any ambiguities
remaining in the exercise specification.
The team's first step in the requirements analysis effort was the development of an overall
software architecture for the exercise system. The software architecture which the team developed
consisted of four CSCIs: two application-level CSCIs, the missile warning simulator (MWS) and the
scenario generator (SG); a user-system interface (USI) CSCI; and a file manager (FM) CSCI. Figure 3
uses Buhr notation to depict the major components of this architecture and the control and data flows
among them. As figure 3 reveals, there are two major, independent control threads that tie the system
together: the first passes from USI through SG to FM, and the second passes from USI through the
MWS to FM. In the absence of other guidelines, the team decided to allocate particular requirements to
each CSCI so as to reflect most accurately and straightforwardly the requirements breakdown in the
draft exercise specification. Also, the team decided to decompose the system into these four distinct
CSCIs rather than one CSCI with four computer software components (CSCs) for two primary
reasons: first, the team wanted to make the design non-trivial so that the team would be forced to deal
immediately with issues of interface definition and performance allocation; and second, the team wished
to view the exercise specification as if it was a real specification that required a high level decomposition
and the creation of at least two SRSs.
Upon development of the overall software architecture for the exercise system, the SEE team
carried out a number of other activities and generated specific products. The specific activities
conducted and products generated during the requirements analysis phase included:

a. Documentation for most SRS sections for all four CSCIs. Sections of the SRSs not
prepared included adaptation requirements, qualification requirements, and quality factors.
These sections were not generated either because the team did not have sufficient time to
prepare them, the team did not expect the offerors to complete these sections in their allotted
time to conduct the SEE, or the sections did not provide any elaboration of requirements
contained in the exercise specification.

b. A partial allocation of timing budgets to CSCIs. A complete allocation was not performed
because there was insufficient time to finish this task properly and because the team did not
have control over the target execution environment (a time-shared VAX).

c. Interface specifications for each CSCI-to-CSCI interface.

d. Data flow diagrams for selected functions and a global data dictionary.

e. A "mock" software specification review (SSR) in accordance with DOD-STD-2167.

f. A revised draft of the system specification that reflected the discussions held during the
definition of the CSCIs and their interfaces (see section 2.7.1).
Although the team did not apply a formal requirements analysis methodology, the use of a data
dictionary and data flow diagrams was sufficient for the team to complete the other efforts identified
above, to develop the overall software architecture, and hence, to achieve all of the requirements
analysis phase objectives. The team did feel that a more comprehensive exercise than the one defined
would have forced the use of a formal methodology.

2.5 DESIGN ACTIVITIES

The SEE team started the design phase of the dry run upon completion of the mock SSR and
review of the draft SRS documents. The team's objective for this phase was to raise as many Ada-
related methodology and design issues as possible. It was not the team's objective to develop a
complete, fully documented design. The team selected two of the CSCIs, scenario generator and file
manager, for which to conduct preliminary design. The team picked these two CSCIs for four reasons:
first, these CSCIs shared a non-trivial interface that required the joint, consistent specification of data
elements, control flow, and timing budgets; second, these CSCIs formed a portion of one of the two
major independent control threads in the exercise system; third, the correct operation of the file manager
required that the preliminary design show evidence that certain concurrent read/write issues had been
resolved; and fourth, of the SRS documents prepared by the team, the SRSs for these CSCIs were the
most detailed.

As stated in section 2.2, the team carried out preliminary design for the two selected CSCIs
using Booch's OOD methodology. In addition, the team developed Buhr diagrams and ADL for the SG
and FM CSCIs. Figure 4 is a sample of one of the Buhr diagrams produced for a file manager
function, Retrieve_Event. The team did not develop formal software top-level design documents
(STLDDs) for these CSCIs, due to lack of time; however, the team made most of the technical decisions
needed for these documents and presented the results at a mock preliminary design review (PDR). The
team did no further design work following the mock PDR, the reasons being that team members could

no longer devote large amounts of time to the dry run and members felt that no additional discriminating
design issues would be raised by continued design decomposition.

At the conclusion of the design phase, the team identified a number of Ada-related methodology
and design issues which were encountered during the dry run. The most prominent methodology issue
was the transition to preliminary design from requirements analysis, following the guidelines in
Booch's OOD methodology, and in particular, the introduction of ADL. The most important design
issue was related to the Ada tasking model: the prioritization of tasks and the avoidance of deadlock,
race conditions, and task starvation. A second important issue concerned the treatment of system
initialization and termination, and their interrelation with the Ada elaboration rules. With the
identification of these and other issues (see section 4 for a discussion of these issues), the team satisfied
its design-phase dry-run objective.
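
To make these tasking issues concrete, the following is a minimal, compilable Ada sketch of the kind of construct the team had to reason about. It is not taken from the dry-run design; the task, entry, and type names are invented for illustration. A guarded selective accept gives queued writers precedence over readers (one simple prioritization policy that also keeps updates from being starved), and the terminate alternative gives the server task a clean shutdown path when its enclosing master completes.

   --  Illustrative only: names and policy are assumptions, not the MITRE design.
   procedure Tasking_Sketch is

      type Event is record
         Time_Tag : Duration := 0.0;
      end record;

      --  A server task that serializes concurrent read/write access to a
      --  shared "latest event", standing in for the file manager's role.
      task Event_Store is
         entry Write (E : in Event);
         entry Read  (E : out Event);
      end Event_Store;

      Sample : Event;

      task body Event_Store is
         Latest : Event;
      begin
         loop
            select
               --  Writes are always eligible, so updates are never starved.
               accept Write (E : in Event) do
                  Latest := E;
               end Write;
            or
               --  Reads are accepted only when no writer is queued, giving
               --  updates priority over display refreshes.
               when Write'Count = 0 =>
                  accept Read (E : out Event) do
                     E := Latest;
                  end Read;
            or
               --  Clean shutdown when the enclosing master completes.
               terminate;
            end select;
         end loop;
      end Event_Store;

   begin
      Event_Store.Write ((Time_Tag => 1.5));
      Event_Store.Read (Sample);
   end Tasking_Sketch;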

2.6 RESULTING SEE DESIGN

At the completion of the design-phase dry run, the design for the SEE system which emerged
consisted of a menu-driven system containing four CSCIs, each running asynchronously. In the
design, USI is an Ada task which generates the menus used to solicit input commands from the user,
validates all user inputs; forwards valid scenario generator and missile warning simulator commands for
processing to SG and MWS, respectively; accepts data from SG and MWS, and generates appropriate
menus to solicit input commands or missile warning displays. SG is an Ada task that performs scenario
generation processing, allowing the user to create, edit, delete and save scenario files consisting of
missile launch and nuclear detonation message events. MWS is an Ada task that performs the
simulation processing for a particular scenario file; that is to say, it performs processing on the event
messages in the simulation scenario, calculates missile warning display information elements based on
the contents of the event messages, and makes the information elements available for display by USI.
Finally, FM is an Ada task that provides a common set of mechanisms for SG and MWS to access a
centralized database of scenario files and to prioritize requests by SG and MWS for access to the
scenario files.
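
An ADL-style skeleton of this four-task structure is sketched below. It is illustrative only: the entry names and parameter profiles are assumptions made for this sketch rather than the interfaces actually produced during the dry run, and the task bodies are deferred, as they would be in a compilable design-language representation.

   --  ADL-style sketch only: entry names and profiles are assumed for
   --  illustration; the corresponding task bodies are deferred.
   package SEE_Structure is

      type Command is (Create, Edit, Delete, Save, Run_Simulation, Quit);

      subtype Scenario_Name is String (1 .. 12);

      --  USI: menu generation, input validation, and display output.
      task USI is
         entry Display_Line (Text : in String);
      end USI;

      --  SG: scenario generation (create/edit/delete/save scenario files).
      task SG is
         entry Execute (Cmd : in Command; Scenario : in Scenario_Name);
      end SG;

      --  MWS: real-time simulation of a selected scenario.
      task MWS is
         entry Start_Simulation (Scenario : in Scenario_Name);
      end MWS;

      --  FM: prioritized, shared access to the scenario file database.
      --  Both control threads (USI -> SG -> FM and USI -> MWS -> FM) end here.
      task FM is
         entry Open_Scenario  (Scenario : in Scenario_Name);
         entry Close_Scenario (Scenario : in Scenario_Name);
      end FM;

   end SEE_Structure;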

2.7 SEE DOCUMENTATION

As stated in section 2, two of the primary objectives of the MITRE SEE dry run were to
generate a clearly defined SEE system specification and to develop the ground rules for the offerors to
follow when conducting the SEE. The Government considered these products essential to scope the
exercise, to achieve a meaningful exercise, and to ensure commonality among offeror approaches and
efforts (e.g., what hardware could be used and what products were to be generated by the offerors as
part of the exercise), thereby enabling the Government to evaluate the offerors' SEE results in an
objective and consistent manner. The actual SEE system specification and ground rules, or detailed
instructions for the offeror, which were generated upon completion of the dry run of the SEE, are
contained in appendix A. A description of these documents and their derivation is contained below.

2.7.1 SEE Specification


MITRE commenced the SEE dry run using a draft SEE system specification. At the completion
of the dry run, when determining whether to modify this specification in certain areas, an issue was
raised as to the appropriate level of detail for the specification; some team members wanted very detailed
requirements in the specification, and some felt that very detailed requirements were inappropriate since
they implied design. The resolution for this dilemma was to incorporate the very detailed requirements
into the specification only if they were necessary to bound the scope of the exercise; otherwise, the
detailed requirements were omitted and left as a design issue for the offerors. Thus, the major
modifications the team made to the draft specification as a result of the dry run concerned the areas of
hardware, growth and flexibility, and performance requirements.
2.7.1.1 Hardware

The draft SEE system specification stated only that no special hardware was needed for the
exercise system. The team modified the SEE specification, however, to require that no special graphics
hardware or capabilities be used and that the user interface be designed to operate on a single dumb
terminal with keyboard entry device. The team made these changes to ensure a level of commonality
among the offerors' designs and to preclude the offerors from focusing their efforts on sophisticated
graphics capabilities at the expense of addressing key software design issues.

2.7.1.2 Growth and Flexibility

Initially, the SEE system specification had no requirements for growth and flexibility of the
exercise system. Since growth and flexibility are key requirements of the CCPDS-R system, the team
elected to add requirements to the SEE specification in these areas so that the offerors could be evaluated
on their approaches for handling growth and flexibility. In particular, the team added both a general and
a detailed set of growth and flexibility requirements. The general requirement specified that the design
be modular to facilitate changes in software components which are needed to accommodate future
changes in operational requirements. The detailed requirements specified that the system include the
capability for the user to query an individual scenario file based on a fixed set of criteria, and that the
system be flexible enough to allow as future growth the capability for the user to query across multiple
scenario files for this same set of fixed criteria.
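
A hypothetical Ada sketch of one way to satisfy both the initial and the growth query requirements is shown below. The package, function, and criterion names are invented for illustration; the point is only that writing the query operation against a list of scenario files lets the required single-file query and the future multi-file query share one interface.

   --  Hypothetical sketch: names and criteria are invented for illustration.
   package Scenario_Query is

      type Criterion is (By_Event_Type, By_Launch_Time, By_Launch_Site);

      subtype File_Name is String (1 .. 12);
      type File_List is array (Positive range <>) of File_Name;

      --  Initial requirement: query a single scenario file.
      function Matches (File : File_Name; Filter : Criterion) return Natural;

      --  Growth requirement: the same fixed criteria across multiple files.
      function Matches (Files : File_List; Filter : Criterion) return Natural;

   end Scenario_Query;

   package body Scenario_Query is

      function Matches (File : File_Name; Filter : Criterion) return Natural is
      begin
         --  Placeholder: scan the named scenario file for events that
         --  satisfy the criterion and return the count.
         return 0;
      end Matches;

      function Matches (Files : File_List; Filter : Criterion) return Natural is
         Total : Natural := 0;
      begin
         --  The multi-file query is built directly on the single-file query,
         --  so the growth capability reuses the original interface.
         for I in Files'Range loop
            Total := Total + Matches (Files (I), Filter);
         end loop;
         return Total;
      end Matches;

   end Scenario_Query;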

2.7.1.3 Performance Requirements


The draft exercise specification contained no performance requirements. Again, since
performance is a key requirement of the CCPDS-R system, the team decided to add detailed
performance requirements and load conditions to the SEE specification so that the offerors could be
evaluated on their approaches for addressing performance issues. As part of this approach, the team
included in the SEE specification some exact performance requirements from the CCPDS-R system
specification [2]. The team also left some of the SEE performance requirements ambiguous. For
example, some of the SEE requirements were unclear as to the end points for measuring performance.
This approach of leaving ambiguous requirements in the specification provided the Government the
opportunity to evaluate the offerors' methodologies for their ability to handle one of the key functions of
requirements analysis, namely, the detection and resolution of specification ambiguities.
2.7.2 Instructions for the Offeror
In addition to the SEE system specification, at the completion of the SEE dry run, the SEE team
generated a detailed set of offeror instructions. The major ground rules included in the detailed
instructions for the offeror (see appendix A) pertained to the exercise scope and duration, Government
interaction, the offerors' methodologies and tools, team composition, and deliverable products and their
formats.

2.7.2.1 Exercise Scope and Duration

Based on the results of the dry run of the SEE, the team reached the following conclusions
regarding the scope and duration of the exercise:

a. A complete implementation of the exercise system, from requirements analysis through
coding and testing, could not be done within the time period allocated during the
CCPDS-R source selection.
b. Coding and testing of the exercise system would not provide significant discriminating
information regarding one's ability to design and implement a real-time system in Ada.
Use of Ada as a design language during the preliminary and detailed design phases
provided sufficient information to assess one's ability in these areas.

c. Allowing the exercise period to exceed four weeks was not considered beneficial.
Comparable to a college "take home" examination which has a point at which no further
improvement in quality is achieved, it was determined that no further discriminatory
information could be obtained by allowing the exercise period to extend beyond four weeks
to six or eight weeks, for example. In fact, having the exercise go beyond four weeks
could be detrimental since it could result in an overload of SEE material for the Government
to evaluate.
Given the above conclusions, the team specified in the detailed instructions for the offerors that
the offerors develop a complete software architecture for the exercise system; conduct requirements
analysis and preliminary design for two or more components of that architecture, with the components
to be selected by the offeror, and conduct detailed design for one or more components of the
architecture, again with the components selected by the offeror. This would provide the Government
with sample products from each major software development phase with minimal burden on the offeror.
Also, the team specified that the exercise duration, from offeror receipt of the SEE specification and
ground rules until delivery of the completed products, be limited to 3 1/2 weeks.
2.7.2.2 Government Interaction

The MITRE dry run of the SEE was conducted in accordance with DOD-STD-2167, as tailored
for CCPDS-R [2]. As such, it included some of the typical reviews held during the software
development effort, such as the software specification review and preliminary design review. During
these reviews, participating personnel assumed the roles of Government acquisition agencies,
Government using agencies, and contractors. Conducting these reviews provided the "contractors" the
opportunity to submit questions to the "Government" to obtain clarification of requirements, resolution
of specification ambiguities, and design verification. During the conduct of these reviews, it became
evident that CCPDS-R offerors might develop similar questions during their implementation of the SEE
which would require resolution. In the interest of fairness, it was considered undesirable to have any
interaction between the Government and the offerors during the exercise period, since one offeror might
inadvertently be given more information or direction than another. Therefore, in the recommended
instructions for the offeror, the team explicitly stated that there would be no interaction between the
offerors and the Government during the offerors' execution of the exercise. Should the offerors have
any questions on the exercise, the offerors were instructed to identify appropriate assumptions, to
document those assumptions, and to proceed with the exercise based on those assumptions.

2.7.2.3 Methodologies and Tools

During the dry run of the SEE, the observation was made that while a particular methodology
may be considered complete and satisfactory in theory, it may turn out to require modification once it is
actually used on a real application. This was considered true for the object-oriented design
methodology used by the SEE team (see section 4.2). The instructions for the offeror specified that all
offerors must follow their proposed requirements analysis and design methodologies as documented in
the SDPs submitted with the CCPDS-R technical proposal; however, the offerors were also allowed to
submit with their delivered SEE products changes to their SDPs which provided further concise,
technical details regarding the methodologies used during the SEE requirements analysis and design
phases. These changes would be considered part of the offeror's technical proposal and subject to
Government evaluation.

The observation was also made that familiarity with the selected tool set was essential in order
to promote ease of design and development. The fact that a number of the team members were not well
versed in the selected VAX Ada environment and tool set slowed progress. However, to require that
the CCPDS-R offerors use the actual tool sets proposed for CCPDS-R did not appear suitable since the
offerors might not have all the tools in house. (It was not considered proper for the Government to
mandate that offerors expend funds to obtain these tools for the SEE.) Consequently, in the
recommended instructions for the offeror, the team specified only that the offerors use the tool set
proposed for CCPDS-R to the maximum extent practical, as this would be viewed more favorably by
the Government.

2.7.2.4 Team Composition


During the course of the SEE dry run, it became evident that to develop the system correctly
and with ease, each team member needed to be well versed in the selected methodology and tool set, as
appropriate for the member's role on the team. It also became evident that outside consultants with very
strong skills in these areas could easily be brought in to carry out the exercise, thereby circumventing
the intent of having specific offeror personnel conduct the exercise. Consequently, in the suggested
instructions for the offeror, the team specified that offeror participation in the exercise be limited to
those key individuals identified in the offeror's technical proposal as part of the CCPDS-R FSD/P team,
that subcontractors who will be responsible for software development on CCPDS-R be active
participants, and that consultants be precluded from participating. To verify offeror compliance with
these ground rules and to assess the knowledge of individual offeror personnel in the methodology, the
team further specified that, after submission of the SEE products, the offerors present a briefing to the
Government on their SEE results, at which time the Government would be able to question all offeror
team members on their role in the exercise and on specific technical aspects of the submitted design,
methodology and tool set. Responses to these Government questions would be considered part of the
offeror's SEE products, and subject to evaluation by the Government.

2.7.2.5 Deliverable Products

During the course of the SEE dry run, the question arose as to what materials the offerors
should submit for evaluation and in what format the products should be delivered. The team concluded
that, for those software architecture components the offerors chose to analyze and design, the offerors
should submit all requirements analysis and design products, both textual and graphical, that they
generated as part of their methodology and which are required per DOD-STD-2167, as tailored for
CCPDS-R. These products included, for example, SRSs, STLDDs, software detailed design
documents (SDDDs), and performance analyses. Also, the team concluded that the offerors should
submit all textual products of the exercise, including requirements analysis conclusions and
documentation, ADL listings, and other design documentation both in hardcopy form and in machine-
readable form on 9-track tape. The tape format provided the Government the capability to browse through the
text, to apply certain design analysis tools to the ADL, and to verify that the offerors' ADL was
compilable. Finally, the team concluded that the offerors should present a briefing to the Government
on their SEE results. This briefing would take place following initial Government evaluation of the
SEE products and would enable the Government to verify its rating of the offerors' SEE performance
and to assess the knowledge of the offerors' team members, as described in section 2.7.2.4. The
instructions for the offeror were written to include these specific directions.

SECTION 3

PLANS FOR EVALUATING THE OFFERORS

To evaluate the offerors' performance on the SEE, the MITRE SEE team developed a set of
evaluation criteria based on a set of possible discriminating issues found during the dry run of the SEE.
This section presents a general overview of source selection evaluation terminology. It then describes
the source selection approach chosen for the SEE and delineates how these discriminators were used to
derive a set of objective source selection evaluation criteria. Next, this section presents a description of
some tools and techniques selected to assist the Government in evaluating the offerors' SEE products.
Finally, this section details how the Government presented the SEE to the offerors and how the
Government planned to evaluate the offerors' products.

3.1 SOURCE SELECTION TERMINOLOGY OVERVIEW

As defined in Air Force Regulation (AFR) 70-15, "Source Selection Policy and Procedures" [5]
and Electronic Systems Division supplement 1 to AFR 70-15 [6], during source selection, offerors'
proposals are evaluated against a set of predefined criteria. The evaluation criteria are correlated to
important aspects of the program which are significant to the selection decision and particularly to
aspects of the program that constitute high risk. The evaluation criteria are arranged as evaluation areas
which are broken down further into items, which in turn may be broken down into evaluation factors
and possibly subfactors. The evaluation criteria and order of importance are described to the prospective
offerors in section M of the RFP; however, normally the evaluation factors and subfactors are not
identified in the RFP, section M.

During source selection, offerors' proposals are rated against the evaluation criteria using
predefined standards and scoring methods. At the lowest applicable evaluation criteria category (e.g.,
item, factor, subfactor), standards are prepared and used as positive indicators of the minimum
performance or compliance acceptable to enable an offeror to meet the requirements of that evaluation
criteria. Thus, standards are the measures by which the Government scores an offeror's proposal as
acceptable or unacceptable.

3.2 SEE SOURCE SELECTION APPROACH

3.2.1 SEE Evaluation Criteria

Based on the dry run, the team determined that the critical issues for evaluating the CCPDS-R
FSD/P SEE products were the robustness and cohesion of the offeror's requirements analysis,
preliminary design, and detailed design methodologies; the offeror's familiarity with the methodologies
and tools; the offeror's Ada/software engineering expertise; the robustness, cohesion, and completeness
of the submitted exercise design; the offeror's ability to address and analyze real-time requirements and
issues; the offeror's clarity and communication of design, including the use of ADL to express design;
and the offeror's compliance with the SEE system specification and the offeror's own SDP. The team

assessed that an evaluation of these issues as reflected in the offeror's SEE products would provide
sufficient evidence as to the offeror's ability to design and develop a real-time system in Ada using
modern software engineering practices. Any other issues such as coding, metrics, and full compliance
with DOD-STD-2167 were considered unnecessary. Thus, the Government included only the above
high-level evaluation criteria for the SEE in section M of the CCPDS-R RFP.

Given these high-level criteria, the Government identified where the SEE should be included in
the CCPDS-R FSD/P source selection process. Since the CCPDS-R source selection approach
included only two evaluation areas, technical and cost, the Government determined that the SEE be
included as one of the four items, of equal importance, in the technical area. The Government felt that
the SEE should not be incorporated under the source selection general considerations area, since this
area carries less weight than the evaluation areas. Also, the Government concluded that the SEE item
should be decomposed into three factors and associated subfactors as follows:

a. Factor: methodologies
1. Subfactor: requirements analysis methodology
2. Subfactor: design methodology
3. Subfactor: interrelationship between requirements analysis and design methodology

b. Factor: design

c. Factor: team expertise
1. Subfactor: methodologies
2. Subfactor: team composition

Since these factors and subfactors only reflected a consolidation and reorganization of the SEE criteria
already contained in the RFP, section M, the Government elected not to include these factors and
subfactors in the section M provided to the prospective offerors [2].

3.2.2 SEE Scoring Method


During the dry run of the SEE, it became evident to the team that an offeror's failure to follow a
stated software engineering approach could not be corrected via revision during the source selection
period. Thus, for source selection the Government decided to give the offerors only one opportunity to
carry out the SEE. The Government instructions stated that offeror revisions or changes to SEE
products accomplished after the conclusion of the SEE period would not be evaluated. Further, rather
than use the typical color coding and risk-scoring approach documented in AFR 70-15, which allows
the offerors to submit proposal revisions in response to Government clarification requests and
deficiency reports, the Government decided that a unique scoring approach be used for the SEE in
which no clarification requests or deficiency reports were employed. The scoring method defined for
the SEE was strictly pass/fail with no risk designated. With this approach, an offeror was assessed a
rating of pass for the SEE item if the offeror's SEE proposal was judged outstanding or satisfactory in
two of the three SEE factors; otherwise the offeror was assessed a rating of fail. Each of these three
SEE factors was rated as outstanding, satisfactory, or unsatisfactory. Outstanding indicated that the
offeror exceeded minimum requirements in a beneficial way with no significant weaknesses.
Satisfactory indicated that the offeror met minimum requirements with some weaknesses that could be
controlled. Unsatisfactory denoted that the offeror failed to meet the minimum requirements;
weaknesses could not be readily or reasonably corrected. With this approach, a rating of fail for the
SEE did not render an offeror automatically ineligible for award.

3.2.3 Discriminator Issues

Prior to the commencement of the CCPDS-R source selection technical evaluation, the MITRE
SEE team developed preliminary standards for each of the above SEE factors and subfactors based in
part on the recommended evaluation criteria and a set of lower level discriminating issues identified
during the course of the SEE dry run. The lower level discriminators related to identification of
specification ambiguities, allocation of timing requirements across system components, behavioral
aspects of the exercise system, interface specification, and consistent representation of design
information across ADL, text and graphics.

3.2.3.1 Specification Ambiguities

In the dry run of the exercise, the team discovered instances of incompleteness and ambiguities
in the draft exercise specification. Many of these instances were uncovered during requirements
analysis only after discussion among the team members; initially, each member thought he or she
understood the intent of the requirements and only when two members had to agree did the
incompleteness become apparent. Many of these ambiguities were impossible to resolve fully until
derived requirements were presented at the SRS level. An example is the interpretation of the
requirement that the exercise system will "simulate the CCPDS-R missile warning capability in real-
time" (see appendix A). Other areas of incompleteness related to the difficulty of stating performance
requirements concisely; for example, the requirement that the "time from completion of [data] entry [by
the user] until the database is modified to reflect the update shall not exceed two seconds" (see
appendix A). Such requirements force end-to-end performance measurement across different
components. To prevent incorrect interpretations of specification ambiguities from having later
catastrophic and costly results, it is imperative that the methodology employed for requirements analysis
include approaches for detecting and resolving specification ambiguities and inconsistencies. The
offerors' SEE products were therefore expected to reflect a thorough identification and resolution of
specification ambiguities.
3.2.3.2 Timing Requirements

As a result of the dry run, the team found that allocation of timing budgets to software
components for the SEE was very difficult to support analytically. As mentioned above, this was due
in part to inherent problems in stating quantitative performance requirements in a rigorous, testable
manner. But the primary problem was due to the nature of the exercise: the analysis to support timing
budget allocation requires simulation and/or prototyping activities, and the tools and time needed to do
this were not available to the team during the exercise period. The team also found that timing analyses
must be done during the requirements analysis phase to do a proper allocation of requirements. The
offerors' SEE results were therefore expected to include an explicit performance analysis activity, done
during requirements analysis, which would provide input to the SRSs.
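
For illustration only, the fragment below shows one way such an allocation might be recorded directly in compilable Ada, so that the budgets become part of the ADL rather than prose; the component names and the individual budget values are hypothetical and are not drawn from the SEE system specification.

    -- Illustrative only: a hypothetical allocation of the two-second end-to-end
    -- database update requirement across three components.  In the exercise,
    -- such budgets would have to be defended by simulation or prototyping.
    package Timing_Budgets is

       Entry_Validation : constant Duration := 0.3;   -- check operator input
       Database_Update  : constant Duration := 1.2;   -- apply the change
       Display_Refresh  : constant Duration := 0.5;   -- reflect it on displays

       End_To_End : constant Duration :=
         Entry_Validation + Database_Update + Display_Refresh;   -- 2.0-second budget

    end Timing_Budgets;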

3.2.3.3 Behavioral Aspects of the System

A number of technical questions arose during the exercise dry run that related to the correct,
reliable operation of the system. The team felt that these issues should be addressed in the preliminary

design by means of explicit use of Ada language features. These issues were the clear identification
(from the ADL and the graphical representation) of the major control threads running throughout the
system; the synchronization and prioritization of concurrent tasks; the avoidance of system-wide
deadlock; the effectiveness of the mechanisms used to initialize and terminate the exercise system (these
mechanisms can be implicit via reliance on Ada elaboration order or can involve explicitly implemented
procedures); and the effective use of Ada exception handling.
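
The hypothetical fragment below, which is not taken from any offeror product, sketches how two of these issues might be made explicit in Ada: a task with visible Start and Shutdown entries, so that initialization and termination are deliberate control actions rather than side effects of elaboration order, and an exception handler inside the processing loop so that a single fault does not silently terminate a control thread.

    -- Hypothetical sketch: explicit start-up and shutdown entries plus a
    -- handled processing loop.  The delay stands in for real message work.
    with Text_IO;
    package Message_Processing is
       task Processor is
          entry Start;                  -- explicit initialization rendezvous
          entry Shutdown;               -- explicit, orderly termination
       end Processor;
    end Message_Processing;

    package body Message_Processing is
       task body Processor is
          Running : Boolean := True;
       begin
          accept Start;
          while Running loop
             begin
                select
                   accept Shutdown do
                      Running := False;
                   end Shutdown;
                or
                   delay 0.1;           -- placeholder for real message handling
                end select;
             exception
                when others =>
                   Text_IO.Put_Line ("Processor: fault handled, continuing");
             end;
          end loop;
       end Processor;
    end Message_Processing;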

3.2.3.4 Interfaces

Interface inconsistency has typically plagued DOD software development efforts over the years,
and the advantages of Ada for producing consistent interface package specifications are obvious. While
the team did not really expect that an offeror would fail to use Ada properly for data definition on such a
small exercise, the team felt nevertheless that effective use of Ada should be demonstrated in the SEE
products.
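
As a simple, hypothetical illustration (the names and fields below are invented, not taken from the CCPDS-R or SEE specifications), an interface of this kind can be captured in a single Ada package specification that both communicating components must name in a with clause, so the compiler itself enforces a consistent definition of the shared data:

    -- Hypothetical shared-interface package: the message format used between
    -- two subsystems is declared once, and every unit that withs this package
    -- is checked by the compiler against the same definition.
    package Warning_Interface is

       type Event_Source is (Radar_Site, Satellite_Sensor);

       type Launch_Report is
          record
             Source      : Event_Source;
             Report_Id   : Natural;
             Launch_Time : Duration;    -- seconds past a reference epoch
          end record;

    end Warning_Interface;
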
3.2.3.5 Consistent Representation

The SEE system specification requires that both graphical representation and ADL be used to
describe design information and that they be employed consistently. The team found in the SEE dry run
that graphical representations are necessary and useful to depict top-level and detailed views of the
software architecture as well as relationships among components. ADL is then used to fill in details and
to enhance definitions. The team found that these techniques must supplement one another since they
lose effectiveness if used to describe different things; consequently, the offerors' SEE products were
expected to reflect compatible ADL and graphical design representations. I
3.3 EVALUATION TOOLS AND TECHNIQUES

Given the rather low-level criteria described above against which the offerors' SEE products
would be evaluated, together with the potentially large amount of data to be submitted by the offerors,
the SEE team identified the possible need for some additional tools and techniques to assist the
Government in evaluating the SEE products. To that end, the team recommended that the Government
use the ESD acquisition support environment (EASE) and a set of evaluation checklist questions for the
SEE source selection.
3.3.1 EASE
EASE is a prototype workstation-based tool intended to support Government review of contract
technical documentation. Specifically, EASE, which was under development at the time of the SEE dry
run, is oriented towards the review of contractor products relating to the acquisition of Ada software.
These products will primarily consist of ADL and Ada code. At maturity, EASE will support a wide
range of analytic activities, including RFP preparation, modeling, requirements analysis, design
analysis, and tool assessment. EASE is not intended, however, to support management functions.

The EASE prototype executes on a Sun-3 UNIX®1-based workstation. EASE takes full

1. UNIX® is a trademark of AT&T Bell Laboratories.



advantage of the Sun's large bit-mapped display and windowing system. Different tools execute in their
own windows, and information is managed in a common database hidden from the user. At the time of
the CCPDS-R source selection, the only tools integrated with EASE were the GNU Emacs editor, the
Verdix Ada compiler and several utilities delivered with the compiler.

For the SEE, the team proposed the use of EASE specifically for browsing through the
offerors' textual products and for assisting in the evaluation of the ADL submitted with the products.
Since the SEE system specification required that the offerors' design be documented in compilable
ADL, the team recommended that the Government use EASE to test whether the offerors' ADL did in
fact compile. The Government elected to follow these recommendations.

3.3.2 Checklist Questions

In addition to the use of EASE, the SEE team recommended that a set of informal checklist
questions be employed to assist the SEE evaluators in their rating of the offerors' SEE products against
the factors and standards. The questions would be correlated with specific factors and standards and
would highlight particular issues which must be addressed to determine if a standard is met. The
questions would serve two purposes: for those source selection evaluators who participated in the dry
run, the questions would serve as reminders of key points to look for in the SEE products, and for
those evaluators who were not familiar with the SEE prior to source selection, the questions would
serve as a checklist for evaluating the products and determining whether or not standards had been met.
In support of this recommendation, the SEE team prepared an extensive list of evaluation questions to
serve as a basis for the checklist.

3.4 RELEASE OF SEE TO OFFERORS

Based upon MITRE's dry run of the SEE, it was expected that both the offeror preparation of
the SEE and the Government evaluation of the resulting products would be intensive and time
consuming. Furthermore, the Government resources to review the SEE products would be limited,
since the evaluators, about eight people, would most likely be responsible for reviewing both the SEE
products and the offerors' technical proposals. Therefore, to give the Government time to review the
offerors' CCPDS-R technical proposals and SDPs as well as to give the offerors adequate time to
prepare both the technical proposals and the SEE, the Government elected not to begin the SEE until
after receipt of the offerors' technical proposals and SDPs. Consequently, the Government did not
include the SEE system specification and detailed instructions for the offeror in the RFP released on
10 October 1986; it only included a copy of the SEE section M evaluation criteria and a preliminary set
of SEE instructions for the offeror in the RFP instructions for proposal preparation (IFPP). As stated
in section 3.2.1, the SEE section M evaluation criteria identified the basis on which the offerors' SEE
products would be judged. The preliminary instructions for the offeror contained the general ground
rules for the conduct of the SEE and a brief description of the SEE products to be generated and
submitted by the offerors for Government evaluation. Upon receipt of the offerors' technical
proposals, due on 10 November 1986, the Government planned to supply each offeror the actual SEE
system specification and the detailed instructions for the offeror. Figure 5 shows the interaction of
Government and offeror activities during the timeframe of source selection. Copies of the SEE system
specification and detailed instructions for the offeror, the RFP IFPP preliminary instructions for the
offeror, and the RFP section M may be found in appendices A, B, and C, respectively.

Figure 5. Interaction of Government and Offeror Activities During Source Selection

3.5 GOVERNMENT EVALUATION APPROACH

The Government's planned approach for evaluating the offerors' SEE products, delivered 3 1/2
weeks after receipt of the SEE system specification and detailed instructions for the offeror, consisted of
a first-pass evaluation, an in-house audit at each offeror's facility, and a completed evaluation. The
Government's intent was that upon receipt of the offerors' SEE products, the Government would
perform a preliminary evaluation of the products, allocating approximately one week for each offeror.
Following that evaluation period, the Government would conduct a 1-day audit at each offeror's
facility. The offeror's SEE products and results of the audit would then be factored into a final
evaluation to be completed by the Government within a week of the audit. The following paragraphs
describe the process of the first-pass evaluation and the audit.

3.5.1 First-Pass Evaluation

The purpose of the first-pass evaluation was to obtain a preliminary assessment of each
offeror's performance on the SEE and to identify strengths and weaknesses in the offeror's SEE
products. The Government would carry out the first-pass evaluation by scoring each offeror's SEE
products against the predefined source selection factors and standards. For each offeror, the
Government would prepare draft documentation which would describe the offeror's SEE products, the
offeror's strengths and weaknesses relative to the factors and standards, and an overall assessment of
the offeror's performance on the SEE. Also, the Government would prepare a set of questions tailored
for each offeror which would be posed to the offeror during the on-site audit. The intent of these
questions was to verify the Government's interpretation and evaluation of the SEE products and to
assess the offeror's SEE team capabilities in software engineering, Ada, and the selected methodologies
and tools.

3.5.2 Audit

As stated above, the purpose of the Government audit at each offeror's facility was to verify the
Government's preliminary assessment of the offeror's SEE products and to obtain additional
information to complete its evaluation. As planned, the in-house SEE audit would consist of two parts:
an offeror briefing and a question and answer session. The briefing would provide an opportunity for
the offeror to explain the methodology proposed for CCPDS-R and employed on the SEE. The briefing
would include, at a minimum, a summary of the offeror's management approach, an overview of the
requirements analysis approach, an overview of the preliminary and detailed design approaches, an
identification of any assumptions made while carrying out the SEE and generating the SEE products,
and an identification of any deviations made from the SDP along with the rationale for those deviations.

The briefing would not include any discussion of further work which the offeror may have
completed following the submission of the SEE products, since the Government would not evaluate this
additional work. The question and answer session would provide an opportunity for the Government
to obtain clarifying information about the offeror's SEE products and to query individual offeror team
members spontaneously to test their expertise with the selected methodologies and tools. Each member
of the offeror's SEE team would be required to be present during the audit to respond to specific
questions directed to that individual. The Government would maintain a transcript of the questions and
answers. This transcript together with the briefing presentation material and the SEE products delivered
at the end of the 3 1/2-week exercise period would be considered part of the offeror's proposal and
included in the Government's final evaluation of the offeror's SEE results.


SECTION 4
DRY RUN LESSONS LEARNED

As a result of the MITRE dry run of the SEE, numerous lessons were learned. These lessons
may be divided into two categories: administrative issues that relate to defining, organizing and
including the SEE as part of the source selection process, and technical issues that concern software
development and Ada in general. Lessons learned relating to administrative issues have been described
throughout sections 2 and 3. This section summarizes the major technical lessons learned relating to
software engineering and specifically requirements analysis, object oriented design, Ada, and DOD-
STD-2167. It also highlights how those lessons were factored into either the CCPDS-R FSD/P
program and/or the standards prepared for the CCPDS-R source selection.

4.1 REQUIREMENTS ANALYSIS

During the course of MITRE's dry run of the SEE, the team made observations regarding the
time allocated for requirements analysis, Government interaction, a formal requirements analysis
methodology, and the completion of the requirements analysis phase.
4.1.1 Time Allocation

The team discovered that much more time than anticipated was needed to produce a thorough
requirements analysis and associated documentation. As reflected in section 2.3, the team spent
approximately three times longer on the requirements analysis effort than originally planned. This extra
time was due to the following conditions:

a. The team members initially assumed that the requirements in the exercise specification were
clear and would not require extensive analysis;
b. The original 1-week allotment for requirements analysis was overly optimistic but was
necessary to achieve the scheduled 15 July 1986 RFP release;
c. The team lacked a formal approach to requirements analysis at the start of the exercise; and

d. The DOD-STD-2167 SRS DID was new, requiring a learning curve, and it specified a
lower level of detail than anticipated by the team members.

Eliminating the extra time spent due to conditions a through c above, it was estimated that as
much as twice the originally scheduled time was spent on requirements analysis due to the DOD-STD-
2167 required level of detail. Based on this observation, the Government developed a projected
CCPDS-R FSD/P phase schedule which included approximately one additional month for requirements
analysis beyond that typically estimated. Also, the Government elected to scrutinize carefully during

source selection the offerors' proposed CCPDS-R software development schedules to ensure that
adequate time had been allocated for requirements analysis.

4.1.2 Government Interaction
During the dry run of the SEE, the team observed that the presence of a "Government"
representative during the requirements analysis effort greatly facilitated progress during that phase.
This person was able to assist the development team by clarifying ambiguities and identifying incorrect
assumptions. As mentioned in section 2.7.2.2, the team recognized that the Government could not play
a similar role during the offerors' execution of the SEE. However, based upon this SEE observation,
the Government elected to include in the CCPDS-R statement of work (SOW) a provision for the
Government to maintain a representative on-site in the contractor's facility throughout the requirements
analysis phase to monitor the contractor's effort and to assist in obtaining responses to contractor
questions.

4.1.3 Formal Methodology


The SEE team found that a formal methodology was needed during the requirements analysis
phase to ensure a thorough analysis of the specification functional and performance requirements, and
to maintain control over the effort. By formal methodology, it is meant that the methodology be
documented, that it cover the areas required in DOD-STD-2167, and that it be teachable. Failure to have
such a formal methodology in place and one with which the team was well versed, caused initial delays
in the MITRE dry run. Based on this observation, the Government decided to examine the offerors'
SEE products to assess the presence of a formal requirements analysis methodology in which the
offeror's SEE team was knowledgeable.

4.1.4 Completion of Phase


The SEE dry-run team noted that it was difficult to determine where requirements analysis
stopped and design started, especially given the requirements of DOD-STD-2167. For example, the
DOD-STD-2167 requirements for the software requirements specification imply that design information
related to timing and sizing be included in the SRS (see section 4.4), which is produced during the
requirements analysis phase. It was determined that by having a formal requirements analysis
methodology and a formal design methodology, and by tailoring DOD-STD-2167, this problem would
be minimized. For the source selection, therefore, the Government planned, first, to scrutinize the
offeror's requirements analysis and design methodologies to ensure that they were robust formal
methodologies, and, second, to examine the interrelationship of the two methodologies to ensure that
they were compatible, with clearly defined steps for transitioning from one phase to the next.
4.2 OBJECT-ORIENTED DESIGN
As mentioned in section 2.2, the MITRE SEE team selected OOD as defined by Grady Booch
for the methodology to be used during the design phase. Booch's OOD attempts to map solutions
directly to the problem as viewed in real-world terms, forcing a problem set to be viewed in terms of a
set of software objects, each with its own set of applicable operations. Booch's OOD consists of three
phases: problem definition, informal strategy, and formal strategy. The first phase consists of defining
the problem in English at a high level, the goal being to gain an understanding of the structure of the
problem space. The second phase consists of developing an informal strategy wherein natural English
descriptions of the problem space are used to narrate the problem. The goal of the second phase is to
continue expanding the designers' understanding of the problem without limiting their ability to think

about the problem and without concerning themselves with the structure of the solution. Finally, the
third phase consists of formalizing the strategy. Using the informal strategy developed in the second
phase, nouns and verbs are extracted and become the objects and operations in the solution. The nouns
are used to imply abstract data types and specific real-world objects. The verbs are used to define real-
world operations with particular objects. Also, adverb phrases are extracted to identify attributes of the
operations, and interfaces between objects are described. Finally, the operations previously identified
for each object are implemented in executable form (e.g., ADL). This process is repeated until a point
is reached where the level of decomposition is understandable without further modularity.
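
As a minimal sketch of the formal strategy step, using hypothetical nouns and verbs rather than material from the dry run, the noun "track store" from an informal strategy would become an abstract data type and the verbs "add" and "remove" its visible operations, expressed in ADL as a package specification whose body is deferred to detailed design:

    -- Hypothetical result of noun/verb extraction: the object and its
    -- operations, stated as compilable ADL; the package body would be
    -- supplied during detailed design.
    package Track_Store is

       Capacity : constant := 100;

       type Store is limited private;

       procedure Add    (S : in out Store; Track_Id : in Natural);
       procedure Remove (S : in out Store; Track_Id : in Natural);

    private
       type Id_Array is array (1 .. Capacity) of Natural;
       type Store is
          record
             Ids   : Id_Array;
             Count : Natural := 0;
          end record;

    end Track_Store;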

During the dry run of the SEE, it appeared to the team that Booch's OOD was an incomplete
methodology. While it provided practical guidance for object identification, it lacked support for
requirements traceability and completeness, performance analysis, concurrency (i.e., multitasking),
initialization and termination conditions, and error detection and handling. It did not clearly specify
how to transition from requirements analysis to design nor did it specify guidelines for the completion
of detailed design. Furthermore, the team discovered that the use of OOD's informal strategy was non-
productive in practice. Having the SRS in hand and then having to write the informal strategy resulted in
duplication of effort. The team observed that in many cases the informal strategy could be developed
so as to produce contrived results.

As an outcome of these observations, the Government included specific SEE factors and
standards to ensure that the offerors' design methodologies were complete and robust and that they
contained specific procedures to resolve the above issues. In particular, the Government planned to
assess whether the offerors' design methodologies contained clearly defined procedures for
transitioning between requirements analysis and design phases and for handling initialization and
termination, exception handling, concurrency, and performance analysis.

4.3 ADA
While dry running the SEE, the team made several observations relating to both the
technical and management aspects of Ada. These issues concerned control flow, ADL, and personnel.

4.3.1 Flow of Control


In a multitasking system, many control flow issues must be resolved. These issues typically
relate to shared access to resources, and include deadlock, process starvation, race conditions,
serialization, and synchronization. The team spent considerable time addressing deadlock and process
starvation, in particular.
Deadlock conditions exist when two or more processes cannot execute because each is waiting
for a resource held by another; similarly, process starvation occurs when a process is waiting to access
a resource and the scheduling mechanisms either never service the process' request, or result in an
unpredictably long delay. In the usual sequential programming languages, operating systems and cyclic
executives consider and handle most control flow issues; thus, only the limited group of operating
system programmers need address these issues. However, while dry running the SEE, the team
discovered that, even at the application level, the Ada tasking constructs must be used with great care to
avoid deadlock and process starvation; consequently, all applications software designers and
programmers must address these issues. To address such conditions within the applications software,

the software development methodology must include techniques for designing effective controls for the
detection and/or prevention of deadlock and process starvation.
During the dry run, these issues were dealt with in part through the use of canonical task idioms
and strategies that provided controlled access to shared resources. Given the team's conclusion about
the importance of these control issues and the fact that the SEE subsystem was to be designed using
ADL, the Government chose to consider during source selection, as part of the completeness and
robustness of the offerors' design methodologies, the ability of the offerors' design methodologies to
address flow control, in general, and deadlock and process starvation, in particular.
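
One such canonical idiom, shown below purely as an illustration and not as a reproduction of the dry-run design, is a guardian task that owns a shared value and serializes every access through its entries; callers cannot race because all access is by rendezvous, and the terminate alternative allows the task to shut down cleanly when its callers complete.

    -- Illustrative guardian-task idiom: the shared value is visible only
    -- inside the task body, so access is serialized by the rendezvous
    -- mechanism itself rather than by explicit locks.
    package Shared_Status is
       procedure Set (Value : in  Integer);
       procedure Get (Value : out Integer);
    end Shared_Status;

    package body Shared_Status is

       task Guardian is
          entry Set (Value : in  Integer);
          entry Get (Value : out Integer);
       end Guardian;

       task body Guardian is
          Current : Integer := 0;
       begin
          loop
             select
                accept Set (Value : in Integer) do
                   Current := Value;
                end Set;
             or
                accept Get (Value : out Integer) do
                   Value := Current;
                end Get;
             or
                terminate;
             end select;
          end loop;
       end Guardian;

       procedure Set (Value : in Integer) is
       begin
          Guardian.Set (Value);
       end Set;

       procedure Get (Value : out Integer) is
       begin
          Guardian.Get (Value);
       end Get;

    end Shared_Status;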

4.3.2 ADL
The MITRE SEE team designed the SEE system using ADL, documented the design with the
ADL incorporated into the DOD-STD-2167 products prepared by the team, and presented the design to
the "Government" representatives via the ADL at a mock PDR. The basic conclusion the team reached
from these efforts was that the use of ADL by itself does not present a global picture of the entire
system to the developers. In its design meetings, the team came to rely on Buhr diagrams as the
primary design representation. Moreover, the team found that presenting only ADL at the dry-run PDR
failed to convey design information clearly to all reviewers. As a result of this conclusion, the
Government modified the CCPDS-R system specification to require the use of graphical notation to
convey design information in conjunction with the ADL. Furthermore, since the SEE system
specification contained the same graphical notation requirements as the CCPDS-R system specification,
the Government opted to consider during source selection, as part of the completeness and robustness
of the offerors' design methodologies and clarity and communication of design, the offerors' graphical
notation to ensure that it was well-defined, it was consistent with the ADL, and it contained enough
features to convey the information available via the ADL constructs.
4.3.3 Personnel

During the course of dry running the SEE, the team encountered several issues related to Ada
and personnel. These consisted of personnel training, retention of Ada-trained staff, and presence of
Ada experts on development teams.
4.3.3.1 Training
The team observed during the dry run that Ada training must occur at all levels of the software
development and acquisition teams: from users, programmers, and designers to program managers and
reviewers. The team also noted that training for Ada programmers and designers is slower and more
difficult than training for other programming languages, primarily because Ada imposes the software
engineering discipline of a methodology on its users. To a greater extent than in other languages, an
Ada programmer must be a software engineer and must be knowledgeable of the methodologies
employed, the graphical notation used, ADL, and the Ada language itself. The Ada programmer must
be well versed in all these issues at all stages of development; simply learning Ada syntax and semantics
is not enough. Based on the SEE dry-run results, the team estimated that training for Ada could be at
least two to three times longer than for other languages. Finally, the team concluded that the SEE dry
run served as an excellent vehicle to teach Ada as well as software engineering and software acquisition.
The SEE dry run served as a far more substantive approach for teaching software engineering and Ada
than the typical 5-day courses offered in these areas, which usually concentrate only on theory rather

than practical applications. Specifically, the SEE dry run provided team members with hands-on training in
all aspects of software development, including Ada, methodologies, DOD-STD-2167, requirements
analysis, design, and software management. The one major limitation of the SEE dry run was that it
did not cover the complete software development cycle since it did not progress all the way through
code and testing.

As a result of the above lessons learned, the Government decided to examine the CCPDS-R
FSD/P offerors to ensure that the offerors' companies provided in-depth Ada training which was geared
for all offeror personnel associated with CCPDS-R software development, as appropriate for assigned
roles, and which exceeded the typical 1- to 5-day courses. Furthermore, the Government made plans to
train its own CCPDS-R project personnel in Ada after source selection by rerunning the SEE from
requirements analysis through testing, with select project personnel serving as the team members, and
by having all project individuals participate in at least some typical, formal Ada courses, as appropriate
for their given roles and responsibilities.

4.3.3.2 Retention of Ada-Trained Staff

One of the primary risks associated with Ada today is the ability to obtain and retain highly
qualified Ada engineers, since the number of such individuals is extremely small. The MITRE SEE
team itself experienced problems in these areas during the dry run of the SEE. MITRE had difficulty in
assembling a sufficient number of Ada-trained people who could devote a significant amount of time to
the SEE, and the SEE team itself experienced the departure of one of the technical leads. As a result of
confirming this Ada risk during the SEE dry run, the Government decided to assess each CCPDS-R
offeror's ability to assemble and manage a team of engineers for the SEE who were well versed in Ada,
as well as the proposed methodologies, software development tools and procedures, and ADL. In
addition, to help deter the departure of the FSD/P contractor's key Ada/software engineering personnel,
the Government included an award fee plan in the CCPDS-R FSD/P model contract [2]. The contract
requires that the contractor flow down 50 percent of the award fee directly to the contractor employees
working on CCPDS-R, and not to the company as a whole. As defined, the award fee is tied to the
successful completion of specific CCPDS-R milestones.

4.3.3.3 Ada Expert

As mentioned in section 2.1, the MITRE SEE team included two Ada/software engineering
experts who were familiar with all aspects of Ada and particularly its more complex constructs. These
individuals had significant software development experience as well as a deep understanding of the
relevant and often more complex software engineering issues. The presence of these two experts was
crucial to the SEE development progress. They served as mentors to the rest of the team and as such
were able to keep the rest of the team on track, to point out areas overlooked by the team members, and
to answer or resolve detailed software engineering and Ada questions.

Given the importance of these Ada experts on the MITRE SEE team, the Government
concluded that the software development risks on a complex Ada development could be substantially
reduced if the contractor's team included one or more strong Ada technical leads/experts who
were well versed in all the detailed aspects of Ada and software engineering. As a result of this
observation, the Government elected to consider, as part of the offeror's team expertise, the offeror's
ability to organize a SEE team that included strong Ada/software engineering technical leads.

4.4 DOD-STD-2167

DOD-STD-2167 defines a software development process which is applicable throughout the


system life cycle. The system life cycle is divided into phases. Each phase has associated with it one or
more products to be generated, and it culminates in a review or audit. The products required at each
phase may consist of preliminary or completed products. DOD-STD-2167 is intended to be tailored for
each particular application, as necessary. Figure 6 illustrates the DOD-STD-2167 products and reviews
associated with each of the phases as tailored for the CCPDS-R effort, along with an identification of those
DOD-STD-2167 products and reviews considered applicable to the SEE.

The SEE team performed the dry run in accordance with DOD-STD-2167, as tailored for
CCPDS-R. As the dry run progressed, the team observed that the already existing CCPDS-R tailoring
of DOD-STD-2167 required further tailoring because of what the team considered inappropriate
requirements of the standard. For example, the DOD-STD-2167 DID for the SRS requires data which
seems premature and in some cases impossible to obtain during the requirements analysis stage of
development. In particular, the input, processing, and output sections of the SRS DID require the
specification of items such as units of measure and ranges for inputs and outputs, the exact intent of the
operation, error detection, and algorithms. However, the team found that during the requirements
analysis phase, units of measure at this level may be impossible to define and that delineation of the
processing section seemed to force the conceptualization of a design, which contradicts the intent of the
requirements analysis effort. The SRS DID also requires the specification of timing and sizing data
against which the software will be tested, since the SRS is the baseline document for software formal
qualification testing to the Government. For current Ada developments, this is almost impossible to do,
since previous data on programs developed in Ada is minimal. Thus, the team determined that any
timing and sizing estimates entered into an SRS during the requirements analysis phase for an Ada
development were especially weak; the possibility was extremely high that the timing and sizing data
contained in an authenticated SRS would hold no validity later in the development effort.

As a consequence of these observations, the Government made further modifications of its


tailoring of DOD-STD-2167 as contained in the CCPDS-R SOW. The tailoring included, for example,
the approach of only baselining the sizing and timing estimates contained in an SRS at the software
specification review, but not finalizing these estimates until system PDR, at which time the contractors,
through their performance analysis, design, and prototyping efforts would have substantial evidence to
support these estimates. A complete tailoring of DOD-STD-2167 for CCPDS-R may be found in the
CCPDS-R FSD/P Statement of Work and Contract Data Requirements List [2].

Figure 6. DOD-STD-2167 Products and Reviews as Tailored for CCPDS-R

SECTION 5
FORMAL CONDUCT OF THE SEE

With the completion of the MITRE dry run of the SEE and the associated SEE exercise
specification, ground rules, and evaluation criteria, the Government was fully prepared to conduct the
SEE as part of the CCPDS-R FSD/P source selection. This section describes how the Government
conducted the SEE relative to the plans described in section 3. It details how the Government released
the SEE to the offerors, the products delivered by the offerors, the Government's evaluation team, the
tools and techniques the team used to aid in the evaluation of the SEE products, and the Government's
overall approach for evaluation of the offerors' SEE products.

5.1 ISSUANCE TO THE OFFERORS

The SEE was issued to the offerors following the plan described in section 3.4. The offerors
received a copy of the SEE section M evaluation criteria and a preliminary set of SEE instructions to the
offeror in the request for proposal package, issued on 10 October 1986. Upon submission of the
offerors' proposals to the Government on 10 November 1986, the offerors received the SEE detailed
instructions to the offeror and the SEE system specifications. The offerors were then given 3 1/2 weeks
to deliver their SEE products, due on 3 December 1986. For each offeror, the Government spent
approximately four days evaluating the delivered SEE products, one day conducting an audit at the
offeror's facility, and two days finalizing the evaluation results.

This method of issuing the SEE to the offerors worked out beneficially. First, the offerors
benefitted by not having to write their proposals and develop the SEE products at the same time.
Second, it allowed the Government SEE evaluation team time during the proposal evaluation period to
review each offeror's SDP prior to receiving the SEE products. (The SDP defines the offeror's
software engineering approach -- methodology, tool set, ADL, and terminology -- and is the baseline
against which the offeror's products were to be evaluated.) If the Government had to review each
offeror's SDP and the SEE products at the same time, either the Government would have required a
longer time period to review the SEE products (as opposed to 4 days per offeror) or the staff-hours
required of the SEE evaluation team would have been overwhelming.

5.2 PRODUCTS RECEIVED

As discussed in section 2.7.2.5, the Government expected the offerors to submit all
requirements analysis and design products, both textual and graphic, which the offerors generated as
part of their methodology and which are required by DOD-STD-2167, as tailored for CCPDS-R (e.g.,
software requirements specifications, software top-level design documents, software detailed design
documents (SDDDs), performance analyses, etc.); all textual products the offerors generated (e.g.,
requirements analysis conclusions and documentation, Ada design language listings, etc.) in both
hardcopy form and on machine-readable, 9-track tape; and a briefing to the Government on the offerors'
SEE results.

In general, the Government did receive most of the expected products from the offerors. In
some cases, the Government received documentation that was not required (e.g., diaries of the entire
SEE effort). However, the Government did not receive all of the expected "intermediate" products
(e.g., data flow diagrams), leading the Government to conclude that in the future the instructions may
need to be clarified to ensure that the offerors are aware that the "intermediate" products are required.
Overall, the SEE products delivered were of sufficient quality, content and scope to conduct a thorough
analysis of the offeror's software engineering capabilities.

5.3 GOVERNMENT EVALUATION TEAM


The Government SEE evaluation team consisted of the Source Selection Evaluation Board
(SSEB) and eight technical advisors. The evaluation team was broken down into three groups based on
the three factors (methodologies, design, and team expertise) that comprised the SEE technical
evaluation item. Each member was assigned to one and only one group, and each group was
responsible for evaluating the offerors' SEE products for only that particular assigned factor and
associated standards. This division of labor greatly expedited the process of evaluating the SEE
products, since it reduced the amount of material any one individual needed to evaluate and, more
importantly, it reduced the amount of evaluation assessment documentation that any one individual
needed to prepare. Although the SEE evaluation team members were assigned to only one group, all
evaluators, regardless of the group to which they were assigned, were permitted and encouraged to
provide inputs to any of the three groups/factors. Frequent interaction did in fact occur among the
different group members during the evaluation process and resulted in an effective and rapid interchange
of technical evaluation assessments.

5.4 EVALUATION TOOLS AND TECHNIQUES


As stated in section 3.3, the Government planned to use the Electronic Systems Division
acquisition support environment and a set of checklist questions to assist in the evaluation of the
offerors' SEE products.
5.4.1 EASE
The Government intended to use the EASE tool for browsing through the offerors' textual
products and for assisting in the evaluation of the ADL submitted with the SEE products. However,
due to the limited EASE functionality and logistical problems, the Government used the EASE tool only
for verifying that an offeror's ADL was compilable.
5.4.2 Checklist Questions
Although prior to the actual source selection evaluation the Government thought that checklist
questions, generated by the MITRE dry-run team, would be used by the evaluators as they were going
through the material to determine if factors had actually been met, it was found that this was not what
happened during the evaluation. This change in the use of the checklist questions from what was
originally intended was made because the checklist questions were found to be too general. It was not
until the Government team saw the offerors' actual SEE products that any specific questions could be
generated. The checklist questions were thus found to be of minimal benefit, being used only by those

evaluators who had not been part of the original MITRE dry-run team as a means of coming up to speed
on the type of details the Government was looking for, and by the experienced team members simply as
reminders. They were not used as a means to determine whether or not factors and standards had been
met, or as the basis for the questions to be asked during the in-house visit.

5.5 GOVERNMENT EVALUATION APPROACH

As discussed in section 3.5, upon receipt of the offerors' SEE products, the Government
intended to perform a first-pass evaluation of each offeror's SEE products, lasting approximately one
week per offeror, using each offeror's proposed CCPDS-R FSD/P SDP as the definition of the
offeror's software engineering methodology; conduct a 1-day audit at each offeror's facility; and, using
the first-pass evaluation and the results of the audit, to produce a final evaluation of each offeror's SEE
products within one week of the audit.

5.5.1 First-Pass Evaluation

The Government evaluated the offerors' SEE products against the prepared factors and
standards. As a basis for this evaluation, the Government used each offeror's SDP (evaluated during
source selection prior to receiving the SEE products), along with any augmentations to it, to determine
whether or not the offeror's methodologies, as described in the SDP, were followed during the
development of the SEE products. The Government evaluation was to determine not only that each
offeror's SDP was followed in the development of the SEE products, but that the requirements analysis
and design methodologies defined by the SDPs were adequate.

As strengths and weaknesses in an offeror's SEE products or SDP were identified, vis-a-vis
the factors and standards, the evaluators documented them. For those instances where the evaluators
could not find the information necessary to evaluate a standard, were not sure of the offeror's
motivation or rationale, had any questions about the products, or where the evaluation information
could not be ascertained directly from the SEE products delivered, the evaluators prepared questions to
be asked during the in-house audit. In addition, the Government evaluation team generated questions to
verify its own evaluation of whether or not system requirements had been met.

Although it was originally planned that one list of questions would be prepared for each
offeror, and would be presented to the offeror during the question and answer period of the in-house
audit, the Government concluded during the first-pass evaluation that the questions for each offeror fell
into two categories: those for which the Government would benefit most by allowing the offeror 24 hours
to prepare an answer, and those for which the Government would benefit most by allowing the offeror
no more than 5 minutes to prepare an answer. Therefore, during the
first-pass evaluation, the Government prepared two sets of questions for each offeror based on the
results of the Government's evaluation of the following SEE factors.
5.5.1.1 Methodology Factor

Each offeror's requirements analysis and design methodologies were evaluated to ensure that
they adequately addressed the major issues in each phase and that the methodologies were compatible.
This was accomplished by reviewing the SEE products to determine if the methodologies were robust
and cohesive and to ensure that the methodologies were consistent with each other and provided a

* 39

I
distinction between the end of requirements analysis and the beginning of design. A methodology was
considered robust if it adequately and completely addressed modern software engineering issues for a
real-time system. Attributes of the SEE products that contribute to requirements analysis methodology
robustness include, but are not limited to, inclusion of performance analysis during requirements
analysis; detection and resolution of specification ambiguities known to exist in the SEE system
specification; effective employment of measures for tracing requirements; and identification of derived
requirements. In addition, the SEE products were reviewed to ensure that the SEE products contained
acceptable ADL and graphical representations consistent throughout the SEE products. Questions
generated for the in-house audit were intended to clarify methodology questions.

5.5.1.2 Design Factor


Each offeror's design was evaluated to ensure that it addressed all of the functional and quality
requirements contained in the SEE system specification, as appropriate, for the subset of the architecture
which the offeror elected to design; that it demonstrated sufficient structure to support modularity,
flexibility, and ease of change and growth; and that it demonstrated no deadlock or race conditions. The
design was also evaluated to ensure that it employed effective approaches for managing data and control
flows; for handling initialization, termination, and exceptions; and for meeting performance and capacity
requirements. For this factor, the evaluation team generated questions for the in-house audit which
were intended to clarify design questions and to verify the Government's evaluation of whether or not
certain SEE system specification requirements were met.
5.5.1.3 Team Expertise Factor
Based on the results of the exercise, the Government evaluated the offerors on their compliance
with the SDP, on the knowledge demonstrated by the offerors' SEE team members of their SDP
policies and procedures and their tools, and on the offerors' proper use of Ada as a design language to
represent the design. The offerors were also evaluated to assess whether their teams consisted of
individuals well versed in software engineering and real-time applications, and whether the teams
included strong software engineering/Ada technical leaders. The Government evaluated the offerors on
their teams' knowledge of the SDP policies, tools, software engineering, and real-time applications via
the responses to the "five-minute," spontaneous questions posed to the offerors during the in-house
audit. By addressing many of the spontaneous questions to particular offeror SEE team members, the
Government was able to determine the knowledge of an offeror's entire SEE team and not just that of
those members whom the offeror chose to have respond.

5.5.2 Audit
Following the first-pass evaluations, the Government conducted a 1-day audit at each offeror's
facility. As discussed in section 3.5.2, the purpose of the Government audit was to verify the
Government's first-pass evaluation and to obtain additional information necessary to complete the SEE
evaluation. The offerors were required to prepare a briefing of their SEE conclusions, and to answer
the Government's questions. The offerors were evaluated during the in-house audit based on their
ability to provide the required information during the in-house briefing and adequate answers during the
question and answer session.
In accordance with the procedures defined in section 5.5.1, the Government submitted two sets
of questions to the offerors for the in-house audit. One set of questions was submitted 24 hours in
advance, for which the offerors were required to respond during the audit as well as to provide formal,
written responses to the Government. The other set of questions was given to the offerors 5 minutes in
advance, for which the offerors were required to respond immediately and for which the Government
maintained a record via cassette tape.
5.5.3 Evaluation Completion

Following the in-house audit at each offeror's facility, the Government easily completed its
evaluation of the offerors' SEE products within a week of the in-house audit. The completion consisted
of updating the first-pass evaluation assessments and associated identification of strengths and
weaknesses to reflect the additional, clarifying information obtained from the in-house audit. In
addition, the Government transcribed the cassette-recorded responses to the spontaneous "five-minute"
questions. The transcripts, together with the formal responses to the "twenty-four hour" questions
were then entered into the offerors' official submission of SEE products.

SECTION 6

SOURCE SELECTION LESSONS LEARNED

As a result of conducting the SEE during the CCPDS-R FSD/P source selection, the
Government identified a number of lessons learned concerning the administration of a software
engineering exercise. These lessons learned relate to deliverable products, exercise scope and duration,
Government evaluation tools and techniques, and Government evaluation approach. The lessons
learned, presented herein, are intended to describe not only how the SEE might be changed for future
use or what did not work out as well as possible, but also to discuss those aspects of the SEE that did
work well and should be repeated in the future.

6.1 DELIVERABLE PRODUCTS

In some cases, offerors did not submit all "intermediate" SEE products (e.g., data flow
diagrams) which were expected by the Government. Therefore, future programs which elect to carry
out a SEE may need to evaluate their instructions to the offerors to see if they must be clarified to ensure
that the offerors are aware that all requirements analysis and design products, including "intermediate"
products, are deliverable to the Government.

6.2 EXERCISE SCOPE AND DURATION

As a result of the use of the SEE during source selection, the Government concluded that the
time, level, and coverage of the SEE were adequate. The volume and depth of the offerors' delivered
SEE products indicate that 3 1/2 weeks was sufficient time.

It was apparent during the evaluation of the offerors' performance on the SEE that it did
provide the Government with the answers it was looking for concerning the offerors' ability to
assemble a SEE team and address software engineering and Ada issues, in the context of the offerors'
SDP. The level of requirements in the system specification provided the opportunity for the offerors to
demonstrate their ability in the pertinent areas (e.g., real-time system design, modern software
engineering practices, Ada). It is not felt that a more difficult set of requirements would have added
anything to the Government's knowledge of the offerors' ability. To have increased the coverage of the
SEE requirements, or to have broadened the system, could have had the negative impact of forcing the
offerors to cover more area with less depth. The Government feels that no significant amount of new
evaluation information would have been gained, had more time been allocated to the offerors for
completing the exercise. More ADL might have been generated, or the products might have been more
complete, but it would not have added anything to the Government's assessment of the offerors' ability
to perform requirements analysis and design a real-time system.

6.3 EVALUATION TOOLS AND TECHNIQUES
As mentioned in section 5.4, the Government used the ESD acquisition support environment
and a set of checklist questions to assist in its evaluation of the offerors' SEE products. In addition, the
Government relied heavily on word processors to expedite its evaluation and associated documentation
efforts. The Government made the following observations regarding the use of these tools during the
evaluation of the offerors' SEE products.

6.3.1 EASE

As discussed in section 5.4, EASE use was attempted and largely abandoned during source
selection. The Government had trained a large portion of the evaluation team in the use of EASE;
however, the investment was not worth the return due to the limited EASE functionality and the
logistical problems associated with using a computer facility remote from the source selection. The
team members also felt that EASE was not really essential, given the volume of SEE materials
submitted.

At the time of the CCPDS-R FSD/P source selection, EASE provided text editing and Ada
compilation functions, but did not provide tools to assist in identifying control flows and data flows,
perform syntax-related browsing and cross-referencing, or assess compliance to coding/design
standards. It required significant manual overhead for such activities as loading tapes and providing
backups, and because EASE was not collocated with the source selection facility, there was time-
consuming travel to transport materials to the EASE facility for evaluation. The EASE facility also had
to be locked and other EASE users could not have access while source selection sensitive materials were
installed. Not until these largely logistical deficiencies are overcome will EASE and similar tools
become useful for SEE evaluations. Therefore, before using EASE or similar tools on future
software engineering exercises, programs should first assess the functionality, ease-of-use, logistics,
and potential benefits. If the selected tool is deficient in any of these areas, then its exact use during the
evaluation should be clearly specified prior to source selection. If programs opt to use automated tools
in the future, regardless of whether or not any of these deficiencies still exist, these programs should
consider training fewer evaluation team members since the cost and time required to undergo such
training is likely to be significant.
6.3.2 Checklist Questions

As discussed in section 5.4.2, the Government concluded that the checklist questions prepared
prior to source selection did not prove as useful as had been anticipated and were not worth the amount
of time it took to prepare them. As an evaluation tool, the questions were not very beneficial and
serious consideration should be given to either not using them or not investing so much time in
preparing them. 3
6.3.3 Word Processing Capabilities

At the start of the evaluation of the offerors' SEE products, the Government SEE evaluation
team had only minimal word-processing capabilities. As the evaluation continued, more word-
processing capability was secured, and though it was helpful, it was still not at an adequate level. For
the CCPDS-R FSD/P source selection, the SEE evaluation would have been expedited if there had been
a separate word processor for each of the three groups which made up the team. A laser printer capable

of producing letter-quality text and viewgraphs is also necessary. In the future, programs conducting a
SEE should ensure that sufficient word-processing capability is provided so that there is no contention
of resources when documenting the SEE evaluation results against the factors and standards.

6.4 IN-HOUSE AUDIT

As a result of conducting the in-house audit, the Government concluded that a 1-day audit at the
offerors' facilities was both beneficial and of sufficient time. Future programs which institute software
engineering exercises are strongly encouraged to conduct such audits if time and logistics permit.
Furthermore, as a result of conducting the in-house audit, the Government made several observations
regarding the submission of detailed questions to the offerors and the maintenance of a transcript of
offeror responses.

6.4.1 Detailed Questions

As discussed in section 5.5.1, the Government altered its method for presenting detailed
questions to the offerors as a result of the first-pass evaluation. The revised approach, consisting of
two sets of questions, one submitted 24 hours in advance and one 5 minutes in advance, proved
successful; its use is therefore recommended for other programs that may conduct a SEE audit. In the
case of the 24-hour set, the Government was able to get answers to questions that the offerors could not
have answered as completely or in as much detail if they had not had some time to prepare. For the 5-
minute set, the Government was able to evaluate the offerors based on their ability to answer questions
extemporaneously which should have required no preparation time, assuming the offerors' teams were
fully trained in the methodologies as claimed in their proposals. The Government was able to direct
many questions to particular offeror SEE team members based on their area of responsibility on the
SEE, lending substance to the evaluation of the offerors' entire SEE team. The use of two sets of
questions for the offerors provided discriminating information that could not have been attained through
the use of only one set of questions.

6.4.2 SEE Transcripts

As mentioned in section 5.5.3, the Government completed its evaluation of the offerors' SEE products by updating the first-pass evaluations to reflect the audit results. Transcripts of the offerors' recorded spontaneous audit responses, together with the offerors' formal responses to the "twenty-four hour" questions, were entered into the offerors' official SEE product submissions.

In general, this approach to completing the SEE evaluation worked well. The one major drawback was the method chosen for documenting the offerors' responses to the spontaneous questions. Originally, the SSEB planned to have the offerors maintain the written transcripts of these responses, but this decision was overruled. Consequently, the CCPDS-R FSD/P SSEB had to maintain the transcript. However, transcribing the cassette tapes placed an overwhelming burden on the limited SSEB resources since it was an extremely tedious, time-consuming process. Therefore, it is suggested that, if SEE audits are held in the future, either the SSEB be allowed to have the offerors maintain the written transcripts or some alternative method be found, such as requiring only magnetic recordings or videotapes of the responses.

SECTION 7

OFFEROR FEEDBACK

In addition to making its own observations regarding administering and conducting a software
engineering exercise, the Government solicited feedback from the CCPDS-R FSD/P offerors on their
impressions of the SEE. The Government accomplished this by providing to the offerors an optional
questionnaire at the conclusion of the in-house audit (see appendix D). In general, the offerors
responding to the questionnaire felt that the SEE was a valuable exercise. This section summarizes the
offeror feedback, addressing the areas of the size of the SEE, the exercise appropriateness, the
resources expended, and the benefits to the offerors.

7.1 SIZE

On the average, the offerors considered the size of the SEE to be of an appropriate level, both in
terms of the time required and the time allowed by the Government, and in terms of the SEE system that
the offerors were required to design. The offerors made no recommendations to either increase or
decrease the scope of the SEE or the time allotted for it.

7.2 APPROPRIATENESS

Generally speaking, the offerors considered the SEE to be a challenging, appropriate exercise in
relation to CCPDS-R FSD/P source selection. The offerors indicated that the focus of the SEE on
requirements analysis and design was very appropriate because of the perceived high level of risk
associated with the requirements analysis and design of an Ada system. Some offerors felt that metrics
should have been included in the SEE, since metrics collection, reporting, and evaluation are integral to
program management. Additionally, some offerors felt that not enough opportunity was provided to demonstrate products on which they had expended resources (e.g., prototypes) for use in CCPDS-R FSD/P.

7.3 RESOURCES

On the average, the offerors considered the resources expended on the SEE to be of a reasonable level. The offerors' effort was divided fairly equally among the phases (i.e., requirements analysis, top-level design, detailed design, and preparation for and participation in the in-house audit). The average amount of resources expended by the offerors was a little less than 1 staff-year.

7.4 BENEFITS

Overall, the offerors assessed the SEE as beneficial, for several reasons. First, the SEE
provided the offerors an opportunity to exercise and refine their software engineering methodologies.
Prior to the SEE, the requirements analysis and design methodologies included in the offerors' SDPs
had not been fully utilized. The SEE provided an opportunity for the offerors to exercise their
methodologies on an actual, albeit small, program and to receive feedback internal to the offerors'
organizations on those methodologies and the products delivered as a result of utilizing them. This
feedback provided the offerors with an opportunity to refine their methodologies where necessary, prior
to utilizing them on a large program such as CCPDS-R. Second, the SEE provided an actual illustration
of the benefits of various software engineering approaches (e.g., prototyping, reusable components,
etc.). This provided the offerors insight into the value of these approaches and/or the need to modify them for CCPDS-R FSD/P. Third, the SEE provided CCPDS-R-related experience which could then be applied to the CCPDS-R FSD/P phase. Finally, successfully
accomplishing the SEE using their chosen methodologies and Ada provided the offerors with increased
confidence in those methodologies, their Ada expertise, and their software development core team.


SECTION 8

CONCLUSIONS/RECOMMENDATIONS

Overall, both the SEE dry run and the incorporation of the SEE into the FSD/P source selection
were successful. This section summarizes the Government's conclusions and recommendations
resulting from this successful CCPDS-R SEE, first from the perspective of the dry run of the SEE and
then from the actual use of the SEE during source selection.

8.1 DRY RUN

The Government assessed the dry run of the SEE as extremely beneficial, given the lessons
learned from that effort. More importantly, however, based on the results of the dry run, the
Government determined that the software engineering exercise demonstrated strong potential for being
an effective and discriminating source selection technique. This section summarizes the Government's
pre-source selection SEE conclusions and recommendations relative to the dry-run objectives, software
engineering, and Ada.

8.1.1 Objectives

As stated in section 2, the primary objectives of the MITRE SEE dry run were to generate a
clearly defined SEE system specification, to develop the ground rules for the offerors to follow when
conducting the SEE, to identify a discriminating set of evaluation criteria, and to assess whether the
SEE could reasonably be done in the time allotted to the offerors. The secondary objective was to
educate staff in software methodologies, Ada, ADL and DOD-STD-2167. The SEE dry run achieved
all these objectives satisfactorily. As a result of dry running the SEE, the Government was able to

a. Analyze the draft SEE system specification thoroughly, identify weaknesses in the draft
specification, resolve these weaknesses, and generate a final, concise SEE system
specification that contained heretofore omitted requirements pertinent to CCPDS-R that would serve as key technical discriminators

b. Develop a set of offeror instructions by which the Government scoped the SEE, expedited
both the offeror preparation effort and the Government evaluation, and maintained the
fairness and objectivity of the SEE effort

c. Identify a low-level set of technical discriminators geared specifically to the SEE system specification and Ada, which the Government felt would enable it to separate form from substance in the offerors' results and thus distinguish those offerors who have strong software engineering/Ada capabilities from those who do not

d. Verify that the SEE, as scoped per the detailed offeror instructions, could reasonably be
done within the 3 1/2 weeks allotted to the offerors

e. Gain further knowledge, depending on the skill of the individual SEE team member, in requirements analysis and design methodologies, Ada, ADL, and DOD-STD-2167, all of which would prove useful to the CCPDS-R program office during both the source selection and the FSD/P phase.

MITRE expended approximately fifteen staff-months of effort dry running the SEE, from initial conceptualization of the SEE to completion of all SEE RFP documentation. Thus, the Government considered the SEE dry-run effort somewhat costly; however, the Government judged that the benefits far outweighed the costs. Since the SEE was a new source selection technique at ESD, the
Government considered the dry running of the SEE mandatory to test out the concept of the SEE, to
verify that the SEE was a reasonable and workable source selection technique, and to ensure the overall
success of the SEE as a source selection technique for CCPDS-R and other future programs.
Moreover, by dry running the SEE, the Government was better able to identify specification
requirements and evaluation criteria it felt would serve as true discriminators during source selection and
to train staff in software engineering methodologies and Ada for both present and future use. Given
these benefits, it is strongly recommended that when a program includes a software engineering
exercise as part of its source selection approach, it dry run the exercise to some extent before the release
of the exercise to offerors. As a minimum, the dry run should focus on the generation of an appropriate
system specification and on the development of discriminating evaluation criteria.
8.1.2 Software Engineering and Ada

A number of software engineering and Ada lessons learned resulted from the MITRE SEE dry run. The major conclusions and associated recommendations are as follows:

a. More time is needed for requirements analysis than is traditionally allocated. This extra
time is due to the difficulty of correctly interpreting user requirements and intentions, as
well as the increased level of detail required by DOD-STD-2167. Programs with a large
software development component should therefore plan appropriately for this additional
time.

b. Daily/weekly Government participation during the contractor's requirements analysis effort may facilitate progress during that phase by helping to clarify specification ambiguities and prevent incorrect assumptions by the contractor. It is therefore suggested that software acquisition programs consider having Government representatives on site during the contractor's requirements analysis effort to expedite those activities.
c. Object-oriented design as defined in Booch's "Software Engineering with Ada" provides useful guidelines but does not constitute a complete methodology. Consequently, programs using Booch's OOD should either augment it to overcome its shortfalls or seek alternative methodologies. Alternative methodologies should also be scrutinized for completeness.
I
d. DOD-STD-2167 appears to require specification and authentication of data (e.g., timing and sizing information) that seems premature and in some cases impossible to provide at the given stage of development. This problem is especially true for Ada developments, for which no previous data is available upon which decisions and estimates can be made. Therefore, programs using DOD-STD-2167 should consider tailoring the standard so that specification and authentication of data occur at achievable and realistic milestones.

e. With the Ada language and its tasking construct, applications software, and not just operating system software, must consider and handle control flow issues such as deadlock and process starvation. Thus, programs using Ada should ensure that the software development methodologies employed on the program include techniques for designing effective controls for the detection and/or prevention of deadlock and process starvation (see the illustrative Ada sketch following this list).

f. An Ada-based design language by itself is not a sufficient tool for effecting clarity and
communication of global system design information either among developers or between
developers and the Government. Hence, programs using ADL should require that a
graphical design representation technique, consistent with the ADL, also be used for
portraying design information.

g. Availability and retention of qualified Ada engineers constitutes a high risk on Ada
developments. Consequently, programs using Ada should investigate the use of different
contracting vehicles and incentives to obtain and retain qualified Ada engineers both within
the Government agencies and the contractors' organizations.

h. For programs using Ada, Ada training must occur at all levels of the software development
and acquisition teams. Proper training in Ada, however, takes longer than for other
languages. Therefore, programs designing and/or implementing in Ada should require
extensive Ada training for both Government and contractor personnel, as appropriate, and
should plan and account for any additional time and effort required to do so.
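
As one illustration of the tasking issues raised in item e (this fragment is hypothetical and is not drawn from the dry run or from any offeror's SEE products), the following Ada sketch shows two language features such control techniques typically build on: a guarded selective accept with a terminate alternative, which keeps a server task from blocking indefinitely, and a timed entry call, which bounds how long a caller will wait.

with Text_IO;
procedure Tasking_Sketch is

   --  A single-slot buffer task.  The guards ensure that Put and Get are
   --  never both closed at the same time, and the terminate alternative
   --  lets the task shut down cleanly rather than wait forever.
   task Buffer is
      entry Put (Item : in  Integer);
      entry Get (Item : out Integer);
   end Buffer;

   task body Buffer is
      Value : Integer;
      Full  : Boolean := False;
   begin
      loop
         select
            when not Full =>
               accept Put (Item : in Integer) do
                  Value := Item;
               end Put;
               Full := True;
         or
            when Full =>
               accept Get (Item : out Integer) do
                  Item := Value;
               end Get;
               Full := False;
         or
            terminate;
         end select;
      end loop;
   end Buffer;

   Result : Integer;

begin
   Buffer.Put (1);

   --  A timed entry call bounds the caller's wait, one simple defense
   --  against indefinite blocking on a busy or failed server task.
   select
      Buffer.Get (Result);
      Text_IO.Put_Line ("Received item");
   or
      delay 2.0;
      Text_IO.Put_Line ("Gave up waiting");
   end select;
end Tasking_Sketch;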

The Government also concluded that a software engineering exercise serves as an extremely
effective vehicle for training personnel in all aspects of software acquisition and software engineering.
The unique benefit of the SEE as a training approach is that it provides practical, interactive, hands-on
experience not offered in typical non-interactive theoretical courses, and it covers a range of issues,
such as requirements analysis methodologies, design methodologies, Ada, DOD-STD-2167, software
specifications and reviews, and software tools and techniques. While the SEE is an effective training
technique, it is costly to conduct since participants must dedicate significant amounts of time and effort
to reap the benefits. However, the benefits are considered to far outweigh the cost.

8.2 ACTUAL SOURCE SELECTION

At the start of the CCPDS-R FSD/P source selection, the Government considered the purpose of the SEE to be to provide discriminating information that would enable the Government to determine the degree of risk associated with each CCPDS-R offeror's proposed software development methodology and to determine the offeror's ability to organize a team fully knowledgeable in that methodology and in Ada, the required CCPDS-R implementation language. This section provides a summary of the
conclusions the Government reached regarding the SEE versus its CCPDS-R objectives. It also
provides some general observations regarding the conduct of future software engineering exercises.

8.2.1 CCPDS-R SEE Objectives

At the completion of the FSD/P source selection, the Government concluded that the CCPDS-R
SEE satisfied its objectives resoundingly. General conclusions regarding the CCPDS-R SEE as a
source selection technique may be summarized as follows:

a. The SEE was an extremely beneficial source selection technical area evaluation technique.
By having offerors develop actual products using their proposed software development
approach, the SEE provided the Government invaluable insights as to what an offeror
really can do versus what an offeror claims he can do. It provided a concrete example that
demonstrated the degree of robustness of an offeror's methodology, the offeror's ability to
follow the proposed SDP, and the offeror's expertise in the proposed methodology and
tool set. It clearly demonstrated whether or not an offeror's proposed CCPDS-R FSD/P
team had sufficient expertise to design and develop a real-time system in Ada, as is required
for CCPDS-R.

b. The SEE served as an excellent vehicle by which to identify early problems in an offeror's
software approach. For example, the use of the SEE helped to point out incomplete
methodologies that did not address all of the software engineering issues, areas where the
requirements analysis and design methodologies conflicted, and inadequate ADL and
graphical design representation techniques. By uncovering these problems during source
selection, the Government was better able to focus on these problems immediately at
FSD/P contract award, rather than waiting until they become apparent in the development
phase, when problems are more costly and difficult to correct and the contractor is less
willing to make changes.

In addition to meeting its stated objectives, the SEE also provided some additional benefits not
originally anticipated. In particular, the SEE assisted in the source selection cost area evaluation by
yielding valuable information on offeror capabilities in such areas as level of experience with the
selected programming language and tools. This additional insight into actual offeror capabilities enabled
the Government to generate more representative inputs for its software cost estimation models and
thereby to assess cost and schedule risk associated with an offeror's software development approach for
the CCPDS-R FSD/P phase. Also, as the offeror feedback indicates, the SEE forced offerors to
solidify and test out their methodologies and teams and thus to make modifications, as appropriate, to
eliminate problems on their own prior to the FSD/P phase.

8.2.2 General Observations

Given the overwhelming benefits that were reaped from the CCPDS-R FSD/P SEE, the
Government SEE team strongly recommends the use of a software engineering exercise for other
acquisition programs. However, the team does so with the following caveats:

a. To conduct a software engineering exercise is costly, both for the Government and for the
offerors. For the CCPDS-R SEE, the Government expended approximately twenty staff-
months to dry run the SEE and to evaluate the offerors' SEE products during source selection. Offerors expended approximately ten staff-months each to carry out the exercise. As the Government becomes more experienced in conducting SEEs, the level of Government effort expended will decrease, perhaps to ten staff-months. However, in any case, if a
program elects to conduct a software engineering exercise, it should be aware of and able to
accommodate the additional cost.

b. Evaluation of a software engineering exercise may add significant time to a source selection if many offerors respond or if the Government evaluation team is not well prepared in advance. Consequently, programs that opt to conduct a SEE should consider approaches for minimizing the time required to conduct and evaluate a SEE. Possible approaches include staggered release of the technical exercise, Government dry running of the exercise as was done for CCPDS-R, and strong management teams to evaluate the SEE products.

c. A software engineering exercise is of benefit to a particular program's source selection only if it is tailored for that program. For example, providing a missile warning exercise on a
local area network (LAN) program would be of little value since the offerors, most likely
communications software engineers, would not be evaluated on those LAN-unique
software areas which would truly demonstrate the offerors' capabilities to implement the
real program. Also, a software engineering exercise is beneficial only if it is designed with specific,
realistic goals in mind. For example, in the case of CCPDS-R, concern existed that not all
offerors would be proficient in designing a missile warning, real-time system in Ada, as is
required for CCPDS-R. The CCPDS-R SEE and associated evaluation criteria were
specifically devised to address this concern. Therefore, if a program does choose to
conduct a software engineering exercise, it should ensure that the exercise is applicable to
the real program and that it is geared towards discerning particular, discriminating
information about the offerors.

d. While a software engineering exercise provides discriminating source selection information, it cannot be relied upon solely as a means to select a contractor. For example,
situations may occur, such as offerors not following the ground rules and using resources
not proposed in the SDP, which may invalidate the SEE results and consequently its
usefulness as an evaluation item. Thus, programs which elect to carry out a software
engineering exercise should include other evaluation items besides the exercise upon which
to make a source selection decision.

In some respects, the CCPDS-R program was fortunate in that it was the first program at ESD to conduct a software engineering exercise. Consequently, industry was not sure what to expect and therefore followed the SEE instructions completely and satisfied the SEE intent fully.
However, as software engineering exercises become more common, the response of industry may be to
develop "professional exercise teams" analogous to the specialized proposal preparation teams now
evident, or to bring in outside consultants or employ other similar vehicles (e.g., submitting too much
material) which will in essence circumvent or negate the intent of the exercise. Future programs that
choose to conduct software engineering exercises must be aware of this possibility and thus take
additional precautions where necessary to prevent this situation from arising during source selection.


LIST OF REFERENCES

1. Department of Defense, "Military Standard: Defense System Software Development," DOD-STD-2167, 4 June 1985.
2. Headquarters, Electronic Systems Division, "RFP F19628-86-R-0142, Amendment 001,
CCPDS-R FSD/P," 10 October 1986.
3. Booch, G., Software Engineering With Ada, Menlo Park, California: The
Benjamin/Cummings Publishing Company, Inc., 1983.

4. Buhr, R., System Design With Ada, Englewood Cliffs, New Jersey: Prentice-Hall, Inc.,
1984.
5. Headquarters, U. S. Air Force, "Contracting and Acquisition: Source Selection Policy and
Procedures," AF Regulation (AFR) 70-15, 22 February 1984.
6. Headquarters, Electronic Systems Division, "Contracting and Acquisition: Source Selection
Policy and Procedures," ESD Supplement I to AFR 70-15, 21 March 1986.

APPENDIX A
SEE INSTRUCTIONS FOR THE OFFEROR
AND EXERCISE SPECIFICATION

This appendix contains the detailed instructions for the offeror and the CCPDS-R SEE system specification provided to the offerors upon receipt of the CCPDS-R FSD/P proposals.

CCPDS-R SOFTWARE ENGINEERING EXERCISE

Detailed Instructions for the Offeror

1.0 PURPOSE

The purpose of the software engineering exercise (SEE) is to permit the Government to evaluate
an actual application of each offeror's software development methodology as proposed for use during
the CCPDS-R full-scale development/production (FSD/P) phase. The SEE will concentrate exclusively
on the offerors' approach to requirements analysis, design, and their interrelationship. The
Government will not evaluate as part of the SEE the offeror's approach to implementation, integration,
test, quality assurance, configuration management, staffing level, productivity measures, software
metrics collection, and other development activities not explicitly mentioned in the following
paragraphs.

2.0 INSTRUCTIONS FOR THE OFFERORS

Each offeror will provide a prototypical example of his proposed software development
approach, as applied to a sample problem taken from the missile warning domain. [The attachment],
"Exercise Specification," presents the requirements for the sample problem. In performing the exercise,
the offeror shall comply with all provisions of his proposed software development plan and with section
3.3 of the CCPDS-R system specification. To the maximum extent practical, the offeror shall make use
of development tools and procedures that are proposed for the CCPDS-R FSD/P phase, as this will be
viewed more favorably by the Government; deviations shall be noted by the offerors.

Participation in the exercise shall be limited to those individuals identified in the offeror's
proposal as part of the CCPDS-R full-scale development team. Subcontractors who will be responsible
for software development on CCPDS-R shall be active participants. Consultants shall be precluded
from participating. Each offeror will deliver to the Government all requested materials, in the formats
described in section 3, no later than 12 noon local time, 3 December 1986. The Government will
review this material for a period of time not to exceed two (2) calendar weeks. Following completion of
the Government review, a Government team will conduct an on-site visit at the offeror's facility, at
which time the offeror shall brief his approach and provide responses to Government requests for
clarification. The Government will coordinate the schedule for the on-site visit with the offeror upon
receipt of the offeror's exercise results. Preliminary plans are for the Government to conduct the on-site
visit during the week of 15-19 December 1986. Note that there will be no interaction between the
offeror and the Government during the offeror's implementation of the exercise. Should the offeror
have any questions on the exercise, the offeror is instructed to identify appropriate assumptions, to
document these assumptions, and proceed with the exercise based on those assumptions.

The Government will conduct its evaluation of the offeror's delivered materials and assess the
offeror's proposed methodologies using as a primary reference the offeror's Software Development
Plan (SDP) submitted with the CCPDS-R proposal, and particularly the software standards and
procedures contained within the SDP. The offeror may submit with the SEE materials delivered on 3 December 1986 an augmentation to the SDP, not to exceed fifteen (15) pages, which provides further concise, technical, and explicit details regarding the offeror's proposed software development approach
and methodologies. The Government will consider any such augmentation as part of the offeror's
proposal and subject to Government evaluation.

The Government will employ automated tools to conduct its evaluation of the offeror's
delivered materials. Therefore, as described in section 3, the offeror is required to deliver some of the
exercise products in machine-readable format. In order to assess the compatibility of the Government's
tools and the offeror's machine-readable products, the offeror is requested to deliver to the Government
no later than 12 noon local time, 19 November 1986, a demonstration tape containing sample files of
the offeror's methodology products (e.g., Ada-based design language (ADL) listings, etc.) in the same format as will be submitted at the conclusion of the exercise period. The Government will not evaluate
the contents of this demonstration tape, but will merely use the tape to study and resolve any
compatibility issues that may develop between the Government's tools and the offeror's tape output.
The sample files on the demonstration tape do not need to represent actual products of the exercise; they
need only represent general products of the offeror's proposed methodologies, the types of which the
offeror will submit for evaluation at the end of the exercise period.

3.0 PRODUCTS OF THE EXERCISE

At the conclusion of the exercise period on 3 December 1986, the offeror shall deliver the
following items to the Government for evaluation:

a. A complete software architecture for the sample problem. This architecture shall contain an
identification of software components, an allocation of functions to these software
components, a preliminary specification of interfaces, and an indication of control and data
flow throughout the system.

b. For two or more offeror-selected components of the system, all requirements analysis
conclusions reached and documentation. With respect to the selected components, the
requirements analysis shall represent a complete utilization of the tools and procedures
proposed by the offeror for use on CCPDS-R. The offeror shall identify any deviations
from these tools and procedures and the associated rationale for these deviations in his
briefing to the Government.

c. For two or more offeror-selected components of the system, all preliminary design
documentation, including requirements traceability, ADL listings, and graphics products.
With respect to the selected components, the preliminary design documentation shall
represent a complete utilization of the tools and procedures proposed by the offeror for
CCPDS-R. The offeror shall identify any deviations from these tools and procedures and
the associated rationale for these deviations in his briefing to the Government.

d. For at least one offeror-selected component of the system, all detailed design
documentation, including requirements traceability, ADL listings, and graphics products.
With respect to the selected component(s), the detailed design documentation shall

603

I
I
I
SEE DETAILED INSTRUCTIONS (Continued)

I represent a complete utilization of the tools and procedures proposed by the offeror for

3 CCPDS-R. The offeror shall identify any deviations from these tools and procedures and
the associated rationale for these deviations in his briefing to the Government.

All textual products of the exercise, including requirements analysis conclusions and
documentation, ADL listings, and other design documentation shall be delivered to the Government
both in hardcopy form and in machine-readable, 9-track tape. Exception will be made for materials that
the offeror does not propose to create and/or maintain online during the CCPDS-R FSD/P contract. In
particular, graphical representations shall be submitted in hardcopy form. The tape shall be in 9-track
1600 bpi format in accordance with ANSI X3.27-1978, ASCII labelled, and with an identified record
size and block size. The block size shall be 512 bytes. For readability, all tabs should be expanded to
spaces. The offeror shall provide ten (10) copies of all hardcopy products. The products delivered
shall be clear, coherent, legible, and prepared in sufficient detail for effective evaluation. Elaborate
documentation, expensive binding, detailed art work or other embellishments are unnecessary. The
offeror shall include with these products indices delineating the subject and contents of the hardcopy
material package and the 9-track tape; the operating system command(s) used to create the tape; a list of
ADL compilation units; and a list of the compilation order of these units.

In addition to the delivered products described above, the offeror shall provide a briefing to the
Government during the on-site visit that summarizes his experience in carrying out the exercise and
describes the products generated. The briefing shall not exceed three (3) hours in duration. The topics
presented shall include the following:

1. Management approach, to include:

A. Introduction of team members
B. A description of individual roles and experience

II. An overview of the requirements analysis approach, to include:

A. A rationale for the selection of the software components


B. A description of the tools and procedures employed
C. Significant requirements issues encountered and their resolution
D. A discussion of deviations from the proposed approach, and associated rationale
E. Other topics to be determined by the offeror
III. An overview of the approach to preliminary and detailed design, to include:

A. A rationale for the selection of the software components


B. A description of the tools and procedures employed
C. Significant design issues encountered, alternatives considered, and a rationale
for decisions made
D. A discussion of deviations from the proposed approach, and associated
rationale
E. Other topics to be determined by the offeror



IV. Other topics to be determined by the offeror
The briefing shall not include any discussion of further work which the offeror may have
completed following the submission of the SEE products on 3 December 1986, since the Government
will not evaluate this additional work. All participants in the exercise shall be present at the briefing to
respond to Government requests for clarification. The offeror shall provide ten (10) paper copies of the
briefing slides and accompanying text at the time of the presentation. A transcript of the questions and
answers will be kept. All offeror responses to these Government clarification requests (i.e., the
transcript) together with the briefing presentation material and the products identified in items (a)-(d)
above shall be considered part of the offeror's proposal and subject to evaluation by the Government.

ATTACHMENT

CCPDS-R SOFTWARE ENGINEERING EXERCISE (SEE) SPECIFICATION

1.0 SCOPE

The exercise system will create scenarios under user direction and will simulate the CCPDS-R
missile warning (MW) capability in real time.

2.0 APPLICABLE DOCUMENT

CCPDS-R System Specification, section 3.3, dated 30 September 1986.

3.0 REQUIREMENTS

3.1 GENERAL DESCRIPTION

The exercise system shall maintain MW information and display the information in tabular form
in real time. Specifically, the exercise system shall create scenarios under user direction and store each
created scenario in a separate scenario file. It shall use a generated scenario to run the MW simulation in
real time. The system shall provide the capability for the user to run a simulation while editing,
deleting, creating, or querying a scenario file (possibly the same file). The design for the exercise system shall be modular to facilitate changes in software components which are needed to
accommodate future changes in operational requirements.

3.2 HARDWARE

The exercise system will generate tabular displays only. No special graphics hardware or
capabilities shall be used. The user interface shall be designed to operate on a single dumb terminal
with keyboard entry device.

3.3 SIMULATION DATA

3.3.1 TW/AA Configuration


The TW/AA configuration to be simulated shall be as follows:

1. There shall be one command center, designated as NORAD.


2. There shall be seven sensors, designated as PAVE PAWS EAST (PPE), PAVE
PAWS WEST (PPW), BMEWS I, BMEWS II, BMEWS III, IR I, and IR II.

3. There shall be five missile launch origin locations, designated as MLOC1 through MLOC5, and five predicted impact/nuclear detonation locations, designated as IPLOC1 through IPLOC5.
4. Sensor connectivity shall be from each sensor to the command center.

5. The exercise system shall simulate the transmission and processing delay incurred
from the time a sensor transmits a missile warning message until the message has been
processed by the system and made ready for display. The processing delay parameter shall
be user selectable from 0-99 seconds and shall be constant during a given missile warning
simulation.
3.3.2 Missile Warning Data

Missile warning data shall consist of missile launches and nuclear detonations (NUDETs). A
missile launch message shall consist of launch origin location, launch type (ICBM, SLBM), reporting
sensor, position of predicted impact, and time of launch. Each launch shall be detected by (i.e.,
associated with) only one sensor. A nuclear detonation message shall consist of time and location.
Launch locations and impact locations shall be designated as described in 3.3.1.
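
For illustration only (the package and type names below are hypothetical and do not appear in the specification or in any offeror product), the missile warning data described in 3.3.2 and the related parameters of 3.3.1 might be captured in an Ada package specification along the following lines.

package MW_Data is

   --  Enumerations drawn from the TW/AA configuration in section 3.3.1.
   type Sensor_Id       is (PPE, PPW, BMEWS_I, BMEWS_II, BMEWS_III, IR_I, IR_II);
   type Launch_Location is (MLOC1, MLOC2, MLOC3, MLOC4, MLOC5);
   type Impact_Location is (IPLOC1, IPLOC2, IPLOC3, IPLOC4, IPLOC5);
   type Launch_Type     is (ICBM, SLBM);

   --  Processing delay is user selectable from 0 to 99 seconds (3.3.1).
   subtype Processing_Delay is Integer range 0 .. 99;

   --  Times are held to the nearest tenth of a second over a twenty
   --  minute scenario period (see 3.5).
   type Scenario_Time is delta 0.1 range 0.0 .. 1_200.0;

   --  A missile launch message: launch origin, launch type, reporting
   --  sensor, predicted impact, and time of launch (3.3.2).
   type Launch_Message is
      record
         Origin           : Launch_Location;
         Kind             : Launch_Type;
         Reporting_Sensor : Sensor_Id;
         Predicted_Impact : Impact_Location;
         Launch_Time      : Scenario_Time;
      end record;

   --  A nuclear detonation message: time and location (3.3.2).
   type NUDET_Message is
      record
         Detonation_Time : Scenario_Time;
         Location        : Impact_Location;
      end record;

end MW_Data;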

3.4 DISPLAY FORMATS


Display formats shall consist of menus for the user interface and tabular displays.
3.4.1 User Interface

The user interface shall be menu driven and user friendly. All user inputs shall be validated for
proper format and range of values. The user shall be notified of any entries that are erroneous or that
cannot be processed for any other reason. Error messages shall be self-explanatory and shall specify,
to the extent practical, the cause and location of the error.

General user capabilities to be provided shall include the capability to start and stop a session;
the capability to terminate the scenario generator (SG) and/or missile warning simulator (MWS) and exit
to the main menu upon user request; the capability to display the directory of scenario file names; the
capability to select the processing delay parameter (see 3.3.1); and the capability to interface with the
scenario generator and missile warning simulator as described in 3.5 and 3.6, respectively.
All user inputs shall be acknowledged within one second of the input. For data entered by the
user, the time from completion of entry until the database is modified to reflect the update shall not
exceed two seconds. An advisory shall be provided within two and one half seconds if the system
cannot complete such an update. At a minimum, these performance requirements shall be met on
dedicated processing equipment and with at least twenty stored scenario files, consisting on the average
of 5,000 combined missile launch and NUDET events.

3.4.2 Tabular Displays

The exercise system shall be able to generate three displays for MW data: a missile launch
summary display, a predicted impact/NUDET summary display, and a message display. The summary
displays shall present the MW information received by the command center as generated by a selected
scenario, summarized from the start of the scenario, in real time, and in accordance with the specified
processing delay (see 3.3.1). The formats for the missile launch summary display and the predicted
impact/NUDET summary display shall be as specified in figures 1 and 2, respectively. The message
display shall sequentially list the messages received by the command center, as received in real time.
The capability shall be provided to display the contents of at least the five most recently received
messages in the scenario. Display updates shall be processed and reflect a scenario event within one-
half second of the activation time of the event. (Activation time is defined in section 3.5.)

3.5 SCENARIO GENERATOR

The SG shall only be activated and deactivated as a result of user action. The SG shall be able
to create, delete, edit and save files containing scenario data. Edit capabilities for a selected scenario file
shall include changing the contents of events in the scenario file, adding events to the scenario file, and
deleting events from the scenario file. The capability shall be provided to save a scenario and any
changes to it as a new file or as the current file. Each event in a scenario shall have a unique activation
time to the nearest tenth of a second, where the activation time represents the time the reporting sensor
transmits the missile warning message. The user shall be precluded from entering multiple events into a
scenario with the same activation time. The user shall be able to query an individual scenario file to
search for events based on reporting sensor and/or time of event activation. The design for the exercise
system shall be flexible to allow as future growth the capability to perform this query across all scenario
files. The SG shall accept inputs from the keyboard to perform the above functions. There shall be a
default scenario file consisting of a total of 5,000 individual missile launch and NUDET events and their associated times of activation covering a twenty minute scenario period. The SG shall support a total of at least 40,000 missile launch events and 10,000 NUDET events contained in one or more scenarios.
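
Again for illustration only (the names here are hypothetical, and the sketch assumes the MW_Data package sketched under section 3.3.2), a scenario event keyed by its activation time, together with a simple predicate supporting the query by reporting sensor and/or activation time, might be outlined as follows.

with MW_Data;
package Scenario_Store is

   type Event_Kind is (Launch, NUDET);

   --  Each event carries a unique activation time, the time at which the
   --  reporting sensor transmits the message (section 3.5).
   type Scenario_Event (Kind : Event_Kind := Launch) is
      record
         Activation_Time : MW_Data.Scenario_Time;
         case Kind is
            when Launch =>
               Launch_Data : MW_Data.Launch_Message;
            when NUDET =>
               NUDET_Data  : MW_Data.NUDET_Message;
         end case;
      end record;

   --  True when an event falls within the given activation-time window
   --  and, for launch events, was reported by the given sensor.
   function Match
     (Event      : Scenario_Event;
      Sensor     : MW_Data.Sensor_Id;
      Start_Time : MW_Data.Scenario_Time;
      Stop_Time  : MW_Data.Scenario_Time) return Boolean;

end Scenario_Store;

package body Scenario_Store is

   function Match
     (Event      : Scenario_Event;
      Sensor     : MW_Data.Sensor_Id;
      Start_Time : MW_Data.Scenario_Time;
      Stop_Time  : MW_Data.Scenario_Time) return Boolean
   is
      In_Window : constant Boolean :=
        Event.Activation_Time >= Start_Time
          and then Event.Activation_Time <= Stop_Time;
   begin
      case Event.Kind is
         when Launch =>
            return In_Window
              and then Event.Launch_Data.Reporting_Sensor = Sensor;
         when NUDET =>
            --  NUDET messages carry no reporting sensor (3.3.2), so only
            --  the activation-time window applies in this sketch.
            return In_Window;
      end case;
   end Match;

end Scenario_Store;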

3.6 MW SIMULATION

The MWS shall provide the user with the capability to select and run a scenario contained in a
scenario file. The MWS shall run this scenario in real time, generating the missile launch summary
display, the predicted impact/NUDET summary display, or the message display, as specified by the
user. The MWS shall be activated or deactivated only upon user request. Capabilities shall be provided
for the user to select the processing delay parameter (see 3.3.1), to suspend the simulation, to resume
the simulation, to fast forward the simulation (where fast forward means the run time between event
activations is reduced by two), and to stop the fast forward capability and return to the normal run time
between event activations. The user shall also have the capability to select which of the three MW
displays he wishes to view, and to move to other displays while the simulation is running.



                        NUMBER OF MISSILE LAUNCHES

                               MISSILE LAUNCH ORIGIN
     SENSOR         MLOC1     MLOC2     MLOC3     MLOC4     MLOC5

     PPE
     PPW
     BMEWS I
     BMEWS II
     BMEWS III
     IR I
     IR II

     TOTAL

FIGURE 1. MISSILE LAUNCH SUMMARY


                          IMPACT/NUDET LOCATIONS
                    IPLOC1    IPLOC2    IPLOC3    IPLOC4    IPLOC5

     PREDICTED
     IMPACTS (PI)
       PPE
       PPW
       BMEWS I
       BMEWS II
       BMEWS III
       IR I
       IR II

     TOTAL PI

     TOTAL NUMBER
     OF NUDETS

FIGURE 2. PREDICTED IMPACT/NUDET SUMMARY



3.7 SIMULTANEOUS GENERATION AND SIMULATION

The exercise system shall provide the capability for the user to run the MWS and SG
simultaneously, either on the same or different scenario files, while still meeting the performance
requirements specified herein. Formats for the displays when both are running simultaneously will be
contractor defined as part of the design effort.

When both the SG and the MWS are processing the same scenario, the MWS displays shall
reflect a modification to an event in the scenario only if the event has not yet been processed by the
MWS; otherwise, the MWS displays shall not reflect the changes.

APPENDIX B

CCPDS-R IFPP SEE MATERIAL

This appendix contains the information included in the CCPDS-R RFP Instructions for
Proposal Preparation for incorporating the software engineering exercise as part of the CCPDS-R
source selection. This information, provided to the offerors in the initial release of the RFP, identifies
the requirement for all offerors to carry out the SEE as part of the CCPDS-R proposal effort. It also
provides a high-level set of instructions detailing to the offerors what is expected of them in carrying out
the SEE.

CCPDS-R IFPP SEE MATERIAL

Software Engineering Exercise (SEE). The offeror shall carry out a software engineering
exercise which will be defined by the Government. [The attachment] contains the general ground rules
for the conduct of the SEE and a brief description of the SEE products to be generated and submitted by
the offeror for Government evaluation. The Government will provide the SEE specification and the
detailed SEE ground rules following receipt of proposal, at which time the SEE shall commence.

ATTACHMENT

SOFTWARE ENGINEERING EXERCISE


Preliminary Instructions for the Offeror

1.0 PURPOSE

The purpose of the software engineering exercise (SEE) is to permit the Government to evaluate
an actual application of each offeror's software development methodology as proposed for the
CCPDS-R full-scale development/production (FSD/P) phase. The SEE will concentrate exclusively on
the offerors' approach to requirements analysis, design, and their interrelationship. The offeror's
approach to implementation, integration, test, quality assurance, configuration management, and other
development activities not explicitly mentioned in the following paragraphs will not be evaluated by the
Government as part of the SEE.

2.0 INSTRUCTIONS FOR THE OFFERORS

Each offeror will provide a prototypical example of his proposed software development
approach, as applied to a sample problem taken from the missile warning domain. The Government
will define the sample problem and provide the SEE problem specification to the Offeror following
receipt of proposal. In performing the exercise, the offeror shall comply with all provisions of his
proposed Software Development Plan and with section 3.3 of the CCPDS-R System Specification;
deviations shall be noted by the offerors.
Participation in the exercise shall be limited to those individuals identified in the offeror's
proposal as part of the CCPDS-R full-scale development team. Subcontractors who will be responsible
for software development on CCPDS-R shall be active participants. Consultants shall be precluded
from participating.

Each offeror will be allocated a period of four (4) calendar weeks from receipt of the exercise
materials until delivery to the Government of all requested materials in the formats described below.
The Government will review this material for a period of time not to exceed two (2) calendar weeks.
Following completion of Government review, the Government will conduct an on-site visit at the
offeror's facility, at which time the offeror shall brief his methodology approach to the Government and
provide responses to Government requests for clarification. The Government will coordinate the
schedule for the on-site visit with the offeror upon receipt of the offeror's exercise results. Note that
there will be no interaction between the offeror and the Government during the four week exercise
period. Should the offeror have any questions on the exercise, the offeror is instructed to identify
appropriate assumptions, to note these assumptions, and proceed with the exercise based on those
assumptions.


3.0 PRODUCTS OF THE EXERCISE


At the conclusion of the exercise, the offeror shall deliver the following items to the
Government:
a. A complete software architecture for the sample problem

b. For two or more offeror-selected components of the system, all requirements analysis conclusions reached and documentation

c. For two or more offeror-selected components of the system, all preliminary design documentation, including requirements traceability, Ada-based design language (ADL) listings, and graphics products
d. For at least one offeror-selected component of the system, all detailed design
documentation, including requirements traceability, ADL listings, and graphics products.

All textual products of the exercise, including requirements analysis conclusions and
documentation, ADL listings, and other design documentation shall be delivered to the Government
both in hardcopy form and in machine-readable, 9-track 1600/6250 bpi tape format in accordance with
ANSI X3.27-1978. Exception will be made for materials which the offeror does not propose to create
and/or maintain online during the CCPDS-R FSD/P contract. In particular, graphical representations
shall be submitted in hardcopy form. The offeror shall provide six (6) copies of all hardcopy products.
The products delivered shall be clear, coherent, legible, and prepared in sufficient detail for effective evaluation. Elaborate documentation, expensive binding, detailed art work or other embellishments are unnecessary.

In addition to the delivered products described above, the offeror shall provide a briefing to the
Government that summarizes his experience in carrying out the exercise and describes the products
produced. The briefing shall not exceed three (3) hours in duration. The topics presented shall include
the following:
I. Management approach

II. An overview of the requirements analysis approach

III. An overview of the approach to preliminary and detailed design


IV. Other topics to be determined by the offeror

The briefing to the Government shall be presented between one and two calendar weeks after
delivery to the Government of the products of the exercise described in points (a) - (d) above. The
briefing shall not include any discussion of further work which the offeror may have completed
following completion of the four week SEE period. All participants in the exercise shall be present at
the briefing to respond to Government requests for clarification. All offeror responses to these Government clarification requests together with the briefing presentation material and the products identified in items (a)-(d) above shall be considered part of the offeror's proposal and subject to
evaluation by the Government.

4.0 SCOPE OF THE SEE

The Government will not evaluate the following items:

a. Additional work accomplished on the SEE after the initial 4-week period

b. Level of staffing

c. Measures of productivity and collection of software development metrics

d. Issues that relate to coding, integration, and test.

APPENDIX C

CCPDS-R SECTION M SEE MATERIAL

Listed below is the material included in the CCPDS-R RFP section M, evaluation criteria, for the SEE. This material identifies the basis on which an offeror's SEE products will be judged by the Government.

Item: Software Engineering Exercise

The offeror will be evaluated on his familiarity with the selected software
development methodology and on his capability to utilize Ada. The offeror will be
evaluated on his corporate Ada/Software Engineering expertise; his requirements
analysis and design approaches and their inter-relationships; the robustness and
cohesion of his requirements analysis and design methodologies; his familiarity
and expertise with the methodologies; his familiarity with the tool set and the
development environment; the robustness, cohesion, and completeness of his
exercise design; his ability to address and analyze real-time requirements and
issues; his clarity and communication of design, including the use of ADL to
express design; and his compliance with the exercise specification requirements
and SDP. A visit to each offeror will be scheduled approximately six (6) weeks
after receipt of proposals to evaluate the software engineering exercise. The
evaluation will be considered as pass/fail; there will be no opportunity to re-
accomplish the exercise. The visiting Government team will be assisted by personnel from MITRE and the Software Engineering Institute.

It should be noted that while the Software Engineering Institute (SEI) was identified in section M as a
possible member of the Government SEE evaluation team, no representatives of the SEI did in fact
participate.


APPENDIX D

SEE QUESTIONNAIRE

This appendix contains the optional questionnaire which was submitted to all CCPDS-R FSD/P
SEE offerors.

SEE QUESTIONNAIRE

THE PURPOSE OF THIS QUESTIONNAIRE IS TO ASSESS THE BENEFITS OF THE SEE FROM THE OFFERORS' PERSPECTIVE AND TO EVALUATE ITS POTENTIAL BENEFITS ON FUTURE ACQUISITIONS.

PART I. PLEASE CIRCLE YOUR RESPONSE FOR EACH OF THE QUESTIONS BELOW.

1. Overall, the SEE was
a. beneficial
b. somewhat beneficial
c. not beneficial

2. The instructions to the offeror were


a. adequate
b. marginal
c. not adequate

3. The SEE system specification was


a. adequate
b. marginal
c. not adequate

4. The scope of the SEE was


a. too broad
b. satisfactory
c. too narrow
5. The SEE, as a technical problem to be solved, was
a. overly challenging
b. adequately challenging
c. trivial

6. The SEE, relative to the CCPDS-R acquisition, was


a. relevant
b. somewhat relevant
c. not relevant

7. The time allotted for the SEE was


a. too long
b. adequate
c. too short

8. The requirement to use subcontractors on the SEE was


a. beneficial
b. not beneficial
c. detrimental

9. The instructions given for the briefing were
a. adequate
b. somewhat adequate
c. not adequate
10. The questions you were given the day before the briefing were
a. too numerous
b. adequate in number
c. too few
11. The questions you were given the day before the briefing were
a. relevant
b. somewhat relevant
c. not relevant
12. The questions you were given during the briefing were
a. too numerous
b. adequate in number
c. too few
13. The questions you were given during the briefing were
a. relevant
b. somewhat relevant
c. not relevant
14. Assembling the SEE team was
a. difficult
b. somewhat difficult
c. not difficult

15. Have members of the SEE team worked together previously?


a. yes
b. some members have
c. no

PART II. DESCRIBE THE MAJOR BENEFITS YOU GOT FROM PARTICIPATION IN THE SEE.

PART III. FOR EACH OF THE FOLLOWING PHASES OF DEVELOPMENT FOR THE SEE,
DESCRIBE THE LEVEL OF EFFORT SPENT IN EACH PHASE (PERCENT OF TOTAL SEE
EFFORT) AND ANY DIFFICULTIES YOU RAN INTO DURING EACH PHASE. ALSO,
IDENTIFY THE TOTAL EFFORT (I.E., NUMBER OF STAFF-MONTHS EXPENDED ON THE
SEE).

A. Requirements Analysis

B. Top-Level Design

C. Detailed Design

D. Prototyping

E. Briefing

PART IV. WAS THERE ANYTHING YOU WOULD HAVE LIKED THE GOVERNMENT TO
HAVE SEEN IN THE SEE PRODUCTS BUT THERE WAS NO PLACE TO PUT IT?

PART V. HOW SHOULD THE SEE BE MODIFIED TO INCREASE ITS BENEFITS TO FUTURE
ACQUISITIONS?

PART VI. USE THIS SPACE FOR ANY ADDITIONAL COMMENTS.

GLOSSARY

Acronyms

ADL Ada-based design language
ANMCC Alternate National Military Command Center
ANSI American National Standards Institute
AFR Air Force Regulation
AFSC Air Force Systems Command
ASCII American Standard Code for Information Interchange
CCPDS Command Center Processing and Display System
CCPDS-R Command Center Processing and Display System-Replacement
C(II) concept definition/design
CDR critical design review
CMAFB Cheyenne Mountain Air Force Base
CRISD computer resources integrated support document
CSC computer software component
CSCI computer software configuration item
DBDD database design document
DI data item
DID data item description
DOD Department of Defense
EASE ESD acquisition support environment
ESD Electronic Systems Division
FCA functional configuration audit
FM file manager
FSD/P full-scale development/production

ICBM intercontinental ballistic missile
IDD interface description document
IFPP instructions for proposal preparation
IRS interface requirements specification
ITW&A integrated tactical warning and attack assessment
LAN local area network
MWS missile warning simulator
NMCC National Military Command Center
NORAD North American Aerospace Defense Command
NUDET nuclear detonation
OPCC Offutt Processing and Correlation Center
OOD object-oriented design
PCA physical configuration audit
PDS processing and display subsystem
PDR preliminary design review
RFP request for proposal
SAC Strategic Air Command
SCIS Survivable Communications Integration System
SCMP software configuration management plan
SDDD software detailed design document
SDF software development file
SDP software development plan
SEE software engineering exercise

SEI Software Engineering Institute
SG scenario generator
SLBM sea- or submarine-launched ballistic missile
SOW statement of work
SPS software product specification
SQEP software quality evaluation plan
SRR system requirements review
SRS software requirements specification
SSEB source selection evaluation board
SSPM software standards and procedures manual
SSR software specification review
SSS system segment specification
STD standard
SST2DC software test descriptions
STLDD software top-level design document
STP software test plan
STPR software test procedures
STR software test report
TRR test readiness review
TW/AA tactical warning and attack assessment
USI user-system interface
VDD version description document
