
BRUCE I. BLUM and THOMAS P. SLEIGHT

AN OVERVIEW OF SOFTWARE ENGINEERING

Computer software has become an important component of our defense systems and our everyday
lives, but software development is both difficult and costly. This article examines the similarities and
differences between software and hardware development, the essence of software, modern practices used
to support the software process, and the application of government methods. We also consider the role
of standards and technology centers, and conclude with a view into the future.

INTRODUCTION
Software is a part of everyday life at work and at home. Many things we take for granted are software dependent: watches, telephone switches, air-conditioning/heating thermostats, airline reservations, systems that defend our country, financial spreadsheets. The discipline of managing the development and lifetime evolution of this software is called software engineering.

Software costs in the United States totaled about $70 billion in 1985, of which $11 billion was spent by the Department of Defense. 1 Worldwide, spending was about twice that amount, $140 billion. At a growth rate of 12% per year, the United States will spend almost $0.5 trillion annually on software by the turn of the century.

Studies in the early 1970s projected that software would rapidly become the dominant component in computer systems costs (Fig. 1). The cost of computing hardware over the last few years has fallen dramatically on a per-unit performance basis. That decrease resulted primarily from the mass production of denser integrated circuits. Software remains labor intensive, and no comparable breakthrough has occurred. Thus, the small increases in software productivity have not overcome the increased cost of human resources.

There is broad agreement on what is to be avoided but a diversity of opinions regarding the best way to develop and maintain software. We will examine here why software development is so difficult, what methods are currently available to guide the process, how government methods have responded to those difficulties, and what roads to improvement are being explored. This article, oriented to a technical audience with minimal background in software development, presents a survey of many different methods and tools, for that is the nature of the state of the art in software engineering.

[Figure 1 - Hardware-software cost trends: percent of total cost versus year (1955-1985), with hardware development/maintenance falling and software development and software maintenance rising.]

THE ESSENCE OF SOFTWARE DEVELOPMENT

The software process, sometimes called the software life cycle, includes all activities related to the life of a software product, from the time of initial concept until final retirement. Because the software product is generally part of some larger system that includes hardware, people, and operating procedures, the software process is a subset of system engineering.

There are two dimensions to the software process. The first concerns the activities required to produce a product that reliably meets intended needs. The major considerations are what the software product is to do and how it should be implemented. The second dimension addresses the management issues of schedule status, cost, and the quality of the software deliverables.

In a large system development effort, we commonly find the same management tools for both the hardware and software components. These typically are organized as a sequence of steps and are displayed in a "waterfall" diagram. Each step must be complete and verified or validated before the next step can begin; feedback loops to earlier steps are included. A typical sequence is shown in Fig. 2 for software development. The steps are derived from the hardware development model. In fact, only two labels have been changed to reflect the differences in the product under development: software coding and debugging is similar to hardware fabrication, and software module testing is similar to hardware component testing.
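The waterfall flow just described, a sequence of steps in which each must pass verification before the next begins, with feedback loops to earlier steps, can be sketched as a simple gate-keeping loop. This is a hypothetical illustration, not a prescribed implementation; the phase names follow the typical sequence of Fig. 2, and `phase_passes_review` is a stand-in for a project's real review process:

```python
# Hypothetical sketch of a waterfall flow with verification gates.
# Phase names follow the typical sequence shown in Fig. 2; the
# review callback is a stand-in for a project's actual process.

PHASES = [
    "analysis",
    "analysis of functions",
    "detailed design",
    "code and debug",
    "module test",
    "integration test",
    "system test",
    "operations and maintenance",
]

def run_waterfall(phase_passes_review):
    """Advance through the phases in order; on a failed review,
    loop back one step (the feedback loop in the diagram)."""
    completed = []
    i = 0
    while i < len(PHASES):
        if phase_passes_review(PHASES[i]):
            completed.append(PHASES[i])
            i += 1
        else:
            # feedback loop: revisit the previous phase
            i = max(i - 1, 0)
            completed = completed[:i]
    return completed

# With every review passing, the phases complete in order.
print(run_waterfall(lambda phase: True)[-1])  # operations and maintenance
```

The callback makes the gate explicit: no phase is entered until its predecessor has been verified or validated, which is exactly the discipline the waterfall diagram encodes.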

276 Johns Hopkins APL Technical Digest, Volume 9, Number 3 (1988)


[Figure 2 - Typical software development steps: a waterfall of Analysis, Analysis of functions, Detailed design, Code and debug, Module test, Integration test, System test, and Operations and maintenance, with feedback loops to earlier steps.]

This structural similarity in the flow facilitates the coordination and management of hardware and software activities. There are, however, major differences between hardware and software:

1. Hardware engineering has a long history, with physical models that provide a foundation for decision making and handbooks that offer guidance. But software engineering is new; as its name implies, it relies on "soft" models of reality.

2. Hardware normally deals with multiple copies. Thus, the effort to control design decisions and associated documentation can be prorated over the many copies produced. In fact, it is common to reengineer a prototype to include design corrections and reduce manufacturing (i.e., replication) costs. Conversely, software entails negligible reproduction cost; what is delivered is the final evolution of the prototype.

3. Production hardware is expensive to modify. There is, consequently, a major incentive to prove the design before production begins. But software is simply text; it is very easy to change the physical media. (Naturally, the verification of a change is a complex process. Its cost is directly proportional to the number of design decisions already made.)

4. Hardware reliability is a measure of how the parts wear out. Software does not wear out; its reliability provides an estimate of the number of undetected errors.

These differences suggest that, even with a strong parallel between hardware and software, overcommitment to a hardware model may prove detrimental to the software process. Some common errors are:

1. Premature formalization of the specification. Because the design activities cannot begin until the analysis is performed and the specification is complete, there often is a tendency to produce a complete specification before the product needs are understood fully. This frequently results in an invalid system. Unlike hardware, software can be incrementally developed very effectively. When a product is broken down (decomposed) into many small components, with deliveries every few months, the designer can build upon earlier experience, and the final product has fewer errors. Another development approach is to use prototypes as one uses breadboard models to test concepts and build understanding. Of course, only the essence of the prototype is preserved in the specification; its code is discarded.

2. Excessive documentation or control. Software development is a problem-solving activity, and documentation serves many purposes. It establishes a formal mechanism for structuring a solution, communicates the current design decisions, and provides an audit trail for the maintenance process. But documentation demands often go beyond pragmatic needs. The result is a transfer of activity from problem-solving to compliance with external standards, which is counterproductive.

3. The alteration of software requirements to accommodate hardware limitations. Since software is relatively easy to change, there is the perception that deficiencies in hardware can be compensated for by changes to the software. From a systems engineering perspective, this strategy obviously is inappropriate. Although it may be the only reasonable alternative, it clearly represents an undesirable design approach.

4. Emphasis on physical products such as program code. Because code frequently is viewed as a product, there is a tendency to place considerable store in it. The most difficult part of software design, however, is the determination of what the code is to implement. In fact, production of the code and its debugging typically take one-half the time of its design. Also, most errors are errors in design and not in writing code. Therefore, managers should not be too concerned with the amount of code produced if the design team has a firm understanding of how they intend to solve the problem. And programmers should not be encouraged



Blum, Sleight - An Overview of Software Engineering

to code before they have worked out the full design of their target application. (The exception is the prototype, which is discarded after its lessons have been assimilated into the design.)
If we examine the essential steps of the software process, we see that software development is based on the same general model as that used to build a bridge, conduct a scientific experiment, or manage the development of hardware:

1. First we determine what is to be done (i.e., analysis).

2. Next, we determine how to realize the desired behavior. This is called design, and it includes the allocation of functions, detailed design, and coding. Often this is decomposed into "programming in the large," which involves the architecture of the software system, and "programming in the small," which involves the creation of the code.

3. Following this, we test the product at various levels of system completeness (units, modules, integrated components, and, finally, the full system).

4. Finally, we use the software product, which often changes the environment it was intended to support, thereby altering its initial specification. Consequently, the software will evolve continuously until its structure degrades to the point where it is less expensive to retire it than to modify it. We can view this "maintenance" activity as an iteration of the preceding steps.

[Figure 3 - Distribution by cost percentage of the software life cycle; operations and maintenance account for 67%. (Zelkowitz, Shaw, and Gannon, Principles of Software Engineering and Design, © 1979, p. 9. Reprinted by permission of Prentice Hall, Inc., Englewood Cliffs, N.J.; adapted version appeared in Computer, 1984.)]

Clearly, we can structure these four steps in a waterfall organization. But since true system needs often are not understood without some preliminary experimentation, we also use other development models wherein software evolves from experience with prototypes and earlier system versions. Boehm's spiral model is one example of this revised flow; 2 most process models, however, are built from the four basic activities presented above.

One advantage of the waterfall representation is its long history of use, which has yielded insightful empirical data. For example, a major portion of the software cost is expended on a product after it has been installed. This is called evolution or maintenance, and it can represent one-half to three-quarters of the total life cycle cost. Forty percent of the development cost is spent on analysis and design, 20% on coding, and 40% on integrating and testing (the "40-20-40 rule"). The writing of program code is a very small part of the total cost. A distribution of expenditures for one set of data is shown in Fig. 3. (See the boxed insert for other observations based on empirical data.)

Our discussion thus far suggests that a good approach to software development is one that:

1. Identifies and responds to errors as early as possible in the development cycle.

2. Assumes that there will be continuous change in the product and anticipates its eventual evolution.

3. Minimizes the importance of code production.

4. Maximizes the importance of people, both by bringing experienced people to the difficult tasks and by building the skills of those with less experience.

MODERN PRACTICES AND THE SOFTWARE PROCESS

Given our description of the software process, we now address the modern practices used to support that process. We organize the discussion of methods, tools, and environments according to their application to the major process activities of analysis, design (programming in the large), code (programming in the small), validation and verification, and management. (We make no attempt to provide citations for all the tools and methods described. References can be found in the most modern software engineering textbooks. 4)

Analysis

The objective of the analysis is to produce a description of what the software system is to do. Naturally, this will depend on the domain of application. For example, in an embedded application, the system engineers may have specified all the software requirements as part of the system decomposition process; the functions, timing constraints, and interfaces may already be prescribed, and the design can proceed. But as often happens with an information system, the initial intent may be stated only vaguely, and an analysis of the existing operation, along with a study of how automation may help, will follow. The result will be a specification of the product to be implemented.
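The cost proportions quoted earlier (maintenance at roughly two-thirds of life cycle cost, as in Fig. 3, and the 40-20-40 development split) can be turned into a quick back-of-the-envelope calculation. The total figure below is invented for illustration:

```python
# Back-of-the-envelope life cycle costs using the proportions in
# the text: maintenance at two-thirds of the total (cf. Fig. 3),
# development split 40/20/40 among analysis-and-design, coding,
# and integration-and-testing. The total is a hypothetical figure.

TOTAL = 10_000_000                       # invented life cycle cost ($)
maintenance = TOTAL * 2 / 3              # operations and maintenance
development = TOTAL - maintenance

analysis_and_design = development * 0.40
coding = development * 0.20
integration_and_test = development * 0.40

print(round(coding))  # 666667 -- coding is well under 10% of the total
```

The point the arithmetic makes is the one in the text: under these proportions, writing program code accounts for only about 7 percent of what the product costs over its life.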




Although the parameters affecting costs, scheduling, and quality vary from project to project, there are some generally accepted trends. Boehm recently identified the following ten most important industrial software measures or metrics: 3

1. Finding and fixing a software problem after delivery can be up to 100 times more expensive than finding and fixing it during the phases when the requirements and early design are determined.

2. You can compress a software development schedule up to 25% of nominal, but no more.

3. For every dollar you spend on software development, you will spend two dollars on software maintenance. (Other studies have shown that the costs associated with perfecting the product represent the largest maintenance category, and costs associated with making corrections represent the smallest category. The remaining resources are used to adapt the software to altered requirements. Figure 4 illustrates a typical allocation of costs among maintenance categories.)

4. Software development and maintenance costs are primarily a function of the number of source instructions in the product.

5. Variations between people account for the biggest differences in software productivity.

6. The overall ratio of computer software costs to hardware costs has gone from 15:85 in 1955 to 85:15 in 1985, and this trend is still growing.

7. Only about 15% of a software product development effort is devoted to programming.

8. Software systems and software products each typically cost three times as much per instruction to develop fully as does an individual software program. Software system products cost nine times as much.

9. Walk-throughs can catch 60% of the errors.

10. Many software phenomena follow a Pareto distribution: 80% of the contribution comes from 20% of the contributors.

[Figure 4 - Allocation of system and programming resources to three maintenance categories: perfective (55%, comprising user enhancements at 42%, documentation, efficiency, and other), adaptive (25%), and corrective (20%). (Reprinted by permission, "Software Engineering: Problems and Perspectives," IEEE Computer, © 1984, IEEE.)]

A common method of analysis is called "structured analysis." The operational environment is first modeled as a network of input-output transformations and documented in the form of "data flow diagrams" (DFDs). The nodes in the DFDs represent the transformations, and the arcs represent the data flowing to and from the transformations. Each node is given a title suggesting the activity the transformation represents, and each arc is given the title of the data in the flow. To convey meaning, abstraction is used to reduce the level of detail. For increased information content, each node can be expanded as a DFD; the only restriction is that all data flows to and from that node are retained as inputs to and outputs from the lower-level DFD.

With this approach, one typically models the physical environment and then draws a boundary separating the automated system from the nonautomated system. Data flows crossing that boundary represent the application interfaces. Next, the functions within the boundary are reorganized to provide a more effective implementation of what previously was a nonautomated process. All flows (arcs) and actions (nodes) are labeled, and the nodes are further decomposed into DFDs until each node is well understood. Because the arcs represent abstractions of the data in the flow, "data dictionaries" are created that detail the data organization and content. There are several variations of structured analysis. In the method developed by DeMarco and Yourdon, the lowest-level nodes are described in process- or minispecs that use "structured English" to detail the processing that the transformation must conduct. A sample DFD, dictionary, and minispec are shown in the boxed insert.

Most structured analysis techniques are designed for information processing applications. The initial goal of this method was to provide a manual technique that would allow the analyst to detail his thoughts systematically and communicate the results to the sponsors and users. Recently, automated tools have been developed to assist in drawing the DFD and maintain dictionaries of the transformations and flows. (Such tools are known as CASE: computer-assisted software engineering.) Variations of the DFD also have been adopted for use with real-time systems by adding symbols to model queues and messages transmitted among nodes.

The requirements analysis is conducted in a top-down mode. This decomposition approach imposes some design decisions on the product; for example, the DFD essentially establishes the module structure for the implementation. Some suggest that this is a weakness of such methods: the analyst must make critical design decisions when he least understands the problem being solved. The alternative is a "composition" approach in which one models portions of the system that are well understood and builds the system from those components.

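The one restriction on expanding a DFD node, that the lower-level diagram must retain exactly the parent node's input and output flows, is easy to check mechanically. A minimal sketch, with node and flow names invented for illustration:

```python
# Minimal sketch of a DFD node and the flow-conservation check
# described in the text: a child DFD must present exactly the
# inputs and outputs of the parent node it expands.
# All node and flow names here are invented for illustration.

class Node:
    def __init__(self, title, inputs, outputs):
        self.title = title
        self.inputs = set(inputs)    # arcs flowing in (data names)
        self.outputs = set(outputs)  # arcs flowing out

def expansion_is_consistent(parent, child_nodes):
    """True if the child DFD's external flows match the parent's."""
    all_in = {f for n in child_nodes for f in n.inputs}
    all_out = {f for n in child_nodes for f in n.outputs}
    internal = all_in & all_out          # flows between child nodes
    return (all_in - internal == parent.inputs and
            all_out - internal == parent.outputs)

# Invented example: one bubble expanded into two lower-level bubbles.
parent = Node("Process Order", {"order"}, {"invoice"})
children = [
    Node("Validate Order", {"order"}, {"valid order"}),
    Node("Prepare Invoice", {"valid order"}, {"invoice"}),
]
print(expansion_is_consistent(parent, children))  # True
```

A CASE tool drawing DFDs can apply exactly this kind of check each time an analyst expands a bubble, which is one reason automated support for the method caught on.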



STRUCTURED ANALYSIS DESCRIPTION

The figure below contains a simple example of the representation used during structured analysis. For this data flow diagram (DFD), we assume that there is a parent DFD, with at least five bubbles or activities. This diagram is an expansion of the bubble, 5.0, Determine Schedule, of the parent activity.

Typically a DFD contains five to nine bubbles, although only three are shown. Each bubble is labeled with the activity it represents; the data flows to and from each bubble are labeled; and the data stores (i.e., the file containing the work-breakdown-structure [WBS] data) and external elements (i.e., the printer) are identified with their special symbols.

Because the processing for this DFD is clear, there is no need to expand it to another DFD level. Bubble 5.2, Define Schedule, is described in a minispec, which conveys the processing while avoiding the detail required of a programming language. For example, "get and list WBS# and WBS TITLE" is several instructions, and the reenter statement after printing the error message implies a GOTO (not shown).

Finally, the data dictionary defines the major elements in the data flow. Here, WBS is a table with five columns, and Task Group is a set of WBS#s. More detailed definitions of the element formats and index schemes may be delayed until additional information is compiled.

[DFD 5.0 DETERMINE SCHEDULE - three bubbles with a Schedule Request input flow, a Schedule output flow, the WBS data store, and the printer as an external element.]

PROCESS (MINI) SPECIFICATION

5.2 Define Schedule Process
for each TASK in TASK GROUP
    get and list WBS# and WBS TITLE
    enter START date
    enter STOP date
    if START > STOP then print error and reenter
end

DATA DICTIONARY

WBS = WBS# + Title + Start + Stop + Resources
Task Group = {WBS#}
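The minispec above is already close to executable code. A rough Python rendering follows; the WBS rows and the entered dates are hypothetical stand-ins for the data store and operator input in the DFD, and the reenter loop is replaced by an exception for brevity:

```python
# Rough Python rendering of the "5.2 Define Schedule" minispec.
# The WBS table and the supplied dates are hypothetical stand-ins
# for the data store and the operator's input.

WBS = {  # WBS# -> (Title, Start, Stop, Resources)
    101: ("Requirements analysis", None, None, 2),
    102: ("Preliminary design", None, None, 3),
}

def define_schedule(task_group, dates):
    """For each task, list its WBS entry and record start/stop;
    reject any entry whose start date falls after its stop date
    (the minispec instead prints an error and reenters)."""
    for wbs_no in task_group:
        title = WBS[wbs_no][0]
        print(wbs_no, title)            # "get and list WBS# and WBS TITLE"
        start, stop = dates[wbs_no]     # "enter START date / enter STOP date"
        if start > stop:                # the minispec's error check
            raise ValueError(f"start after stop for WBS# {wbs_no}")
        WBS[wbs_no] = (title, start, stop, WBS[wbs_no][3])

define_schedule([101, 102], {101: (1, 5), 102: (4, 9)})
print(WBS[101][1:3])  # (1, 5)
```

The comparison with the minispec makes the article's point concrete: each structured-English line expands into a handful of statements, and the overall control structure carries over unchanged.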

Examples of composition techniques include the Jackson System Design and object-oriented design ("design" implying that the process steps have considerable overlap). In the Jackson System Design, the target system is represented as a discrete simulation, and the implementation is considered a set of communicating sequential processes; that is, the method allows for the modeling of the real-world environment as a computer simulation, which then is transformed into a set of sequential programs that can operate asynchronously. Conversely, object-oriented design first identifies the real-world objects that the desired system must interact with and then considers how those objects interact with each other. There are several versions of object-oriented design, but experience with its use is limited.

Programming in the Large-Design

The design process begins after there is a specification establishing what functions the software is to provide. From the discussion of analysis, we see that there is no precise division between analysis (the decision of what is to be done) and design (the determination of how to realize it). There sometimes is a contractual need to establish what the procured software is to provide, so the specification becomes part of the contract that defines the deliverable. In the essential model of the software process, however, there is continuity between analysis and design activities, and the methods often support both activities.

The basic process is one of modeling the software system and adding details until there is sufficient information to convert the design into a realization (i.e., program). Design always begins with a specification, which is a product of the analysis step. At times, the specification is a formal document establishing a set of requirements. Here, it is important to maintain traceability to ensure that all design decisions are derived from a requirement and that all requirements are satisfied in the design (i.e., there are neither extra features nor omissions). At other times (e.g., in the internal development of a product), the specification is less formal, and additional subjectivity is needed to determine that the design decisions are valid.

For any set of requirements, there are many equally correct designs. The task of the design team is to select among the alternatives those system decisions yielding



a design that is, in some way, expected to be better than the others. Studies of this activity indicate that considerable domain experience is required. Also, the ability and training of the team members is some two to four times as important as any other factor in determining the cost to produce an acceptable product.

Design methods are extensions of analysis methods. For example, decomposition techniques use the DFD, and composition methods span the analysis and programming-in-the-large tasks. With decomposition techniques, "structured design" is used to model the interactions among software modules. Rules are available to guide the transition from DFDs to the "structure diagrams" depicting module control flow. As with DFDs, data dictionaries are used to describe the elements in the data flow, and the functions of the modules are detailed as "module specs" in structured English.

Other methods begin with models of the data and their temporal changes, and then derive the processes from those data structures. The Jackson Program Design, for example, models the structure of the data and then builds models of the procedures that reflect that structure. For data processing applications, there are several methods used to define the data model. One widely used method is the entity-relationship model. Here, the entities (e.g., employees, departments) and their relationships (e.g., works in) are identified and displayed graphically. Rules then can be applied to convert this conceptual model into a scheme that can be implemented with a database management system.

We have identified here many different (and often mutually incompatible) methods, but the list is incomplete. Many of those methods use some form of diagram. Most CASE tools support the DFD, structure diagram, Jackson System Design notation, and entity-relationship model. There also are proprietary tool sets that are limited to a single method. One of the benefits that any good method provides is a common approach for detailing a solution and communicating design decisions. Thus, for effective communication, an organization should rely on only a limited number of methods. The DFD and the entity-relationship model are the most broadly disseminated and, therefore, frequently will be the most practical for the communication of concepts.

Programming in the Small-Coding

Code involves the translation of a design document into an effective and correct program. In the 1970s, the concept of "structured programming" was accepted as the standard approach to produce clear and maintainable programs. The structured program relies on three basic constructs:

1. Sequence - a set of statements executed one after the other.

2. Selection - a branching point at which one of a set of alternatives is chosen as the next statement to be executed (e.g., IF and CASE statement).

3. Iteration - a looping construction causing a block of statements to be repeated (e.g., DO statement).

Every program can be written using only these three constructs. A corollary, therefore, is that the GOTO statement is unnecessary.

Structured programming introduced other concepts as well. Programs were limited to about 50 lines (one page of output). Stepwise refinement was used to guide the top-down development of a program. When a concept was encountered during programming that required expansion, it would be represented as a procedure in the user program and later refined. This method allowed the programmer to defer design activities; it also resulted in programs that were easier to read and understand. To improve comprehension, indentation and white space were used to indicate the program's structure. In time, the flow chart was replaced by the "program design language" (e.g., the minispec), which captured the program structure but omitted many program details.

Another concept introduced in the late 1970s was "information hiding," which emerged following analysis of what characteristics should bind together, what functions are retained in a module (cohesion), and how modules should interact with each other (coupling). The goal of information hiding is to yield a logical description of the function that a module is to perform and isolate the users of that module from any knowledge of how that function is implemented. Thus, the designers may alter the internal implementation of one module without affecting the rest of the program. This concept was refined and became known as the abstract data type. A data type defines what kinds of data can be associated with a variable symbol and what operators can be used with it. For example, most languages offer an integer, real, and character-string data type. The operator plus (+) has a different meaning for each data type.

With an abstract data type, the designer can specify a new data type (e.g., the matrix) and operators that are valid for that data type (e.g., multiplication, inversion, scalar multiplication). Using the terminology of the Ada 5 programming language, the abstract data type is defined in a package with two parts. The public part includes a definition of the data type and the basic rules for the operations. The private part details how the operations are to be implemented. To use the abstract data type, the programmer includes the package by name and then declares the appropriate variables to be that data type. This is an example of software "reuse." The data type operations are defined once and encapsulated for reuse throughout the software application, thereby reducing the volume of the end product and clarifying its operation.

Another technique to improve program quality is embodied in the concept of "proof of correctness," meaning that the resulting program is correct with respect to its specification. There are some experimental systems that can prove a program to be formally correct. Such systems have been used to verify key software products, such as a security kernel in an operating system. But proof of correctness usually is applied as a less formal design discipline.

"Fourth generation languages" (4GLs) represent another approach to software development. Here, special




tools have been developed for a specific class of application (information processing) that facilitate the development of programs at a very high level. For example, one can produce a report simply by describing the content and format of the desired output; one does not have to describe procedurally how it should be implemented. (Thus, 4GLs generally are described as being nonprocedural or declarative.)

Validation and Verification

In the traditional descriptive flow for software development, the activity that precedes operations and maintenance is called "test." Testing is the process of detecting errors. A good test discovers a previously undetected error. Thus, testing is related to defect removal; it can begin only when some part of the product is completed and there are defects to be removed.

The validation and verification activity includes the process of testing. But it begins well before there is a product to be tested and involves more than the identification of defects. Validation comes from the Latin validus, meaning strength or worth. It is a process of predicting how well the software product will correspond to the needs of the environment (i.e., will it be the right system?). Verification comes from the Latin verus, meaning truth. It determines the correctness of a product with respect to its specification (i.e., is the system right?).

Validation is performed at two levels. During the analysis step, validation supplies the feedback to review decisions about the potential system. Recall that analysis requires domain understanding and subjective decisions. The domain knowledge is used to eliminate improper decisions and to suggest feasible alternatives. The ranking of those alternatives relies on the analysts' experience and judgment. The review of these decisions is a cognitive (rather than a logically formal) activity. There is no concept of formal correctness; in fact, the software's validity can be established only after it is in place. (Prototypes and the spiral model both are designed to deal with the analyst's inability to define a valid specification.)

The second level of validation involves decisions made within the context of the specification produced by the analysis activity. This specification describes what functions should be supported by the software product (i.e., its behavior). The specification also establishes nonfunctional requirements, such as processing time constraints and storage limitations. The product's behavior can be described formally; in fact, the program code is the most

multiplication of early errors and a relatively high cost of correction per defect.

Before a formal specification exists (one that can be subjected to logical analysis), the primary method for both verification and validation is the review. In the software domain, this is sometimes called a walk-through or inspection, which frequently includes the review of both design documents and preliminary code. The review process is intended to identify errors and misunderstandings. There also are management reviews that establish decision points before continuing with the next development step. The two types of reviews have different functions, and they should not be confused or combined. Management reviews should occur after walk-throughs have been completed and technical issues resolved.

Most software tests are designed to detect errors, which sometimes can be identified by examining the program text. The tools that review the text are called "static analyzers." Some errors they can detect (such as identifying blocks of code that cannot be reached) can be recognized by compilers. Other forms of analysis rely on specialized, stand-alone software tools. "Dynamic analysis" tests, concerned with how the program operates, are divided into two categories. "White box" tests are designed to exercise the program as implemented. The assumption is that the errors are random; each path of the program, therefore, should be exercised at least once to uncover problems such as the use of the wrong variable or predicate. "Black box" tests evaluate only the function of the program, independent of its implementation.

As with equipment testing, software testing is organized into levels. Each program is debugged and tested by the individual programmer. This is called unit testing. Individual programs next are integrated and tested as larger components, which are then function tested to certify that they provide the necessary features. Finally, the full system is tested in an operational setting, and a decision is made to deploy (or use) the product. Naturally, if the software is part of an embedded system, then, at some level, the software tests are integrated with the hardware tests.

Management

We have so far emphasized the essential features of software development; that is, what makes the development process unique for this category of product. Some characteristics of the process make it difficult: the
complete expression of that formal statement. Nonfunc- software can be very complex, which introduces the
tional requirements, however, can be demonstrated only potential for many errors; the process is difficult to mod-
when the product is complete. el in terms of physical reality; there is always a strong
Validation and verification are independent concepts. temptation to accommodate change by modifying the
A product may be correct with respect to the contractu- programs; and, finally, the product is always subject to
al specification, but it may not be perceived as a useful change. (In fact, the lifetime cost for adaptation and en-
product. Conversely, a product may correspond to the hancement of a software product usually exceeds its de-
environment's needs even though it deviates from its velopment cost.)
specified behavior. Also, validation always relies on The management of a software project is similar to
judgment, but verification can be formalized. Finally, the management of any other technical project. Man-
both validation and verification can be practiced before agers must identify the areas of highest risk and the
there is code to be tested; failure to exercise quality con- strategies for reducing that risk; they must plan the se-
trol early in the development process will result in the quence of project activities and recognize when devia-

282 Johns Hopkins APL Technical Digest, Volume 9, Number 3 (1988)


Blum, Sleight - An Overview of Software Engineering

tions are imminent; they must budget time, personnel, and dollar resources and adjust these factors throughout the process; they must maintain control over the completed products; and they must establish procedures to ensure a high level of product quality.

As with any project management assignment, the manager must understand something about both the domain of application and the technology to be applied. In well-understood problem areas, this knowledge is less critical because design is reduced to the detailing of some existing design concept. But in new domains there are uncertainties, and management must be sensitive to the early resolution of high-risk problems. (This is one area in which prototyping can be most effective; another is the examination of the human interface.)

Although software engineering is a relatively new discipline, there are many tools available to help support its management. Cost-projection tools have been produced that allow a manager to build upon previous experience to estimate cost and schedule. Commercial "configuration control" systems manage the software versions and supply mechanisms to insulate programs from unauthorized or uncertified changes. Many modern program-support environments also contain tools that give management easy access to schedule and status information.

GOVERNMENT METHODS APPLIED AT APL

The approaches to software engineering taken by APL's sponsoring organizations must be considered both when an operational system is delivered and when a prototype is developed that will evolve into government specifications and procurements. For many experiments (e.g., at-sea or space-based), the whole effort is oriented toward quick deployment of existing or slightly modified sensors and support for recording, telemetry, and analysis. There are no sponsor-specified approaches, and development responsiveness often is the critical component. The software becomes a crucial integrating element. Since existing software must be modified, software engineering techniques are applied less formally.

In its work on Navy tactical systems, however, APL relies more on the use of standards. The development and acquisition of mission-critical defense-systems software is governed by DOD-STD-2167A,6 a standard specifying a number of documents that should be generated during software development. This document-driven approach has been criticized for the lack of proper document selection and tailoring by the government and for ignoring several modern software processes or development strategies as described above. Although this standard has some drawbacks, it has provided a sound and consistent basis for software development over many years. Recently, it was modified (Revision A) to reflect better the methods available via the Ada language. This standards approach is not new; the DOD standard originated in earlier internal Navy standards.7,8 The other main component in the Navy's approach is the use of specific programming languages. The Navy has attempted for many years to standardize languages, beginning with CS-1 in the 1960s, through the CMS-2 and the introduction of Ada.

The Navy also has standardized computer hardware, in particular, the AN/UYK-20, and more recently the AN/UYK-44 and AN/AYK-14, which are the 16-bit common instruction set standards. The AN/UYK-7 is upward-compatible with the newer 32-bit AN/UYK-43 standard computer. For the Navy to support those computers with the new Ada language, the Ada Language System/Navy project is tasked to develop production-quality Ada translators (first released in June 1988) for the AN/UYK-44, AN/AYK-14, and AN/UYK-43. Considerable emphasis has been placed on the support of fast interrupts, a requirement of many Navy embedded systems. Within a common command language framework, there are tools for version and database maintenance, software-problem-report tracking, documentation updating, and report generation.

The Ada language has matured, and the translators are rapidly becoming viable production tools. Some recent benchmark programs executing on the MC68020 processor indicate that the Ada code is almost as fast as code generated by popular C-language translators. Ada is now a workable language for military use, and the benefits of the object-oriented approach can be obtained from specification through operational code and, most important, during evolution and maintenance. APL has explored Ada applications for several tactical systems. Whenever a mature Ada translator and target run-time system have been available, the projects have been successful. One example is the redesign and reimplementation of a simplified version of the Aegis Command and Decision System.9 This project, while successful, also identified several problems with the 1985 Ada technology. Many of those problems have disappeared with more mature Ada products. In follow-on projects, the reuse of both internally developed and externally available Ada modules has been successful. Also, Ada module integration has been faster, and fewer errors were detected than with other languages.

Most software development at APL involves the support of experimental or "one of a kind" systems. Although these systems frequently require the proper use of software engineering techniques, they typically are not controlled by the sponsor as is a mission-critical Navy system (i.e., DOD-STD-2167A). This offers an opportunity to try new techniques. For example, in some satellite altimeter work for NASA, the DeMarco and Yourdon methodology, with Ward/Mellor real-time extensions,10 is being successfully applied using new CASE tools. The recent APL projects requiring Ada have used the Buhr object-oriented design approach,11 which (with some modification) has been particularly useful when the target language is known to be Ada. Some projects derived their approach from a combination and tailoring of DOD-STD-2167A and emerging IEEE software standards. This approach provides guidelines and templates, and helps to alleviate problems arising from staff reassignment and short corporate memory.

ROADS TO IMPROVEMENT

There are many possible roads to improvement for the software engineering process, with numerous feeders and intersections, and some dead ends. Three promising routes are standards and guidelines, technology centers, and research. We focus below on the first two. (Research is covered in the next section.)

Standards and Guidelines

The primary roles of international standards and guidelines in the software arena are to provide application portability by defining interfaces and to define the state of practice. The international standards process is well established. Standards usually deal with what should be done, not how. This is sometimes expressed as the public view (not the private view). When standards deal with interfaces, the "how" often is stated, simply because it is a necessary part of the public view.

Establishing international standards is a time-consuming process that depends on the interplay among industry, government, and professional societies. In the national and international forum, it is a consensus-building process. A typical IEEE standard takes about five years to progress through its various stages. Since most standards are based on experience, the impression frequently is created that standards institutionalize past practices and stifle new technology. Exclusive of military environments, the major players in international software standards development are the IEEE, the International Standardization Organization, and the American National Standards Institute. It is not possible in this article to describe all the international software standards efforts and their interactions. A summary of 11 IEEE standards and 13 standards projects is available, however.12 Significantly, software standards play an ever-increasing role in establishing the state of software engineering practice and in interfacing various software processes.

The primary Defense Department standard for software development is DOD-STD-2167A. The IEEE and other military standards cover similar subjects (e.g., documentation, quality assurance, audits). One major distinction exists. The military standard deals mainly with contract deliverables; thus, document content is emphasized. The IEEE standards capture more of the essence of the software development process. Additional Navy standards traditionally have been stated in different terms: the use of standard, government-furnished equipment and software.

Technology Centers

Technology centers such as the Software Engineering Institute (SEI) and the Software Productivity Consortium (SPC) have been established by the government and industry to improve software engineering technology transition. A third center, the Microelectronics and Computer Technology Corporation (MCC), has a broader mission that also includes software engineering.

The SEI is a federally funded research and development center operated by Carnegie-Mellon University under contract with the U.S. Department of Defense. Its mission is to (1) bring the ablest professional minds and the most effective technology to bear on rapidly improving the quality of operational software in mission-critical computer systems; (2) bring modern software engineering techniques and methods into practice as quickly as possible; (3) promulgate the use of modern techniques and methods throughout the mission-critical systems community; and (4) establish standards of excellence for software engineering practice.

Software technology transition is SEI's major focus. Unlike industrial consortiums, software engineering research is not a significant part of its mission. The institute has convened several workshops over the last few years, and many organizations (including APL) in industry, academia, and government have affiliated with it. As a result, SEI has projects in several areas, including technology surveys, course and textbook development, software reliability techniques, uniform communication medium for Ada programs, documentation and reporting conversion strategy, software process modeling, software maintenance, human interface technology, legal issues, environments, and pilot projects.

Several course modules for undergraduate and graduate software engineering courses have evolved from the institute's software engineering education program. The recently introduced master's degree course entitled "Projects in Software Engineering" at the APL Center of The Johns Hopkins University Continuing Professional Programs is based on SEI material. A full description of the university's professional computer science curriculum was presented at a recent SEI education workshop.13 Besides interesting technical and education material, the institute has produced guidelines for program managers on the adoption of Ada14 and a method for evaluating an organization's software engineering capability.

The SPC was formed in 1984 by 14 companies to close the mission-critical software gap between defense and aerospace system hardware and the availability of software to drive those systems. Its goal is to build the software tools and techniques needed to accelerate the software engineering cycle, improve the quality and functionality of the final software product, and make major systems software easier and less expensive to maintain.

The objective of SPC's technical program is to create a dramatic increase in the software productivity of member companies. Over the next five years, the consortium will develop and provide members with a range of software engineering products and technologies for an integrated development environment. The 14 partners receive exclusive rights to products, technologies, research, and support services developed by SPC.

The products will exploit three key concepts: symbolic representation, prototyping, and reusable components. Symbolic representation makes the process of developing software a "tangible" activity by representing software life-cycle objects in more "natural" ways to the working engineer. (The concepts of prototyping and reusable components are discussed elsewhere in this article.) Within the context of an integrated development environment and project libraries, SPC will develop tools

for building requirement and design specifications and related code, synthesizing prototypes, performing dynamic assessments, and managing software development projects. At a recent meeting of companies developing and marketing CASE tools, SPC launched an initiative to establish an industrywide consensus on effective tool-to-tool interface standards. Those standards will represent the first steps in building an integrated environment.

MCC was established by 21 shareholder companies in 1983. The consortium has several research programs, ranging from semiconductor packaging to software technology. Each program is sponsored by a subset of participating shareholder companies.

The software technology program focuses on the front end, upstream in the software cycle, where little research has been performed. This program has created a computer-aided software design environment called Leonardo. The environment is to concentrate on requirements capture, exploration, and early design. Academic research has focused on downstream activities, where formalism and automation are more obvious. MCC's research emphasis is on defining and decomposing a large problem into smaller problems and on selecting algorithms, and is geared to teams of professional software engineers working on large, complex systems. Researchers are working on the Leonardo architecture and on three components: complex design processes, a design information base, and design visualization.

The corporation does not consider its research complete until it is put to use by the sponsoring companies. Also, MCC believes it is easier to transfer and establish tools and make them consistent than to transfer methodologies and methods.

A VIEW TO THE FUTURE

We began this article with a review of how software differed from hardware and noted that, once the technical manager understands the software process, the management of software is much like that of hardware. We then described the software process, characterized more by variety than by clarity and consistency. Despite our significant accomplishments with software, there remain conflicting methods, limited formal models, and many unsubstantiated biases. But we present below some significant trends in the field.

Formalization

Some new paradigms extend the formalism of the programming language into an executable specification. A specification defines the behavior for all implementations. An executable specification does not exhibit the intended behavior efficiently, but a program is an implementation of that behavior, optimized for a specific computer. We see this because there are systems that we know how to specify exactly, but we do not know how to implement them efficiently. For example, one can specify what a chess-playing program should do without being able to describe an efficient implementation. The hope is that the executable specification will supply a prototype for experimentation that ultimately can be transformed into an efficient program. But the concept has not been demonstrated outside the laboratory, and it is not clear how this process model can be managed.

Automatic Verification

In the discussion of validation and verification, we noted that proofs (verification) could be objective only when the parent specification was clear (formal). There is, therefore, considerable interest in working with mathematically formal specifications at an early stage in the design process, since it then would be possible to prove that each detailing step was correct with respect to this high-level source. A theorem prover could be used to automate the process. Testing then would not be necessary, because no errors would exist. Naturally, validation still would be required.

Automated Tools and Environments

There is considerable interest in the use of CASE tools and in program support environments. Unlike the formalisms addressed above, most tools and environments are commercial products that implement techniques developed in the mid-1970s. Thus, these tools and environments respond to a marketplace demand and provide a means for making the application of current practice more efficient. Their primary benefit is one of reducing the manual effort and thereby the number of errors introduced. As new paradigms are introduced, this focus may limit their use or force changes in the approaches taken.

Artificial Intelligence and Knowledge Representation

Although there are many definitions of artificial intelligence and debates about what it accomplishes, it has had an impact on our perceptions of what computers can do and how to approach problems. The software process is one of representing knowledge about a problem in a way that facilitates its transformation (detailing) into an implementable solution. Thus, there are many software engineering methods and tools that owe their origins to artificial intelligence. Some projects, such as the development of object-oriented programming, have been successful and are available to developers; many others still are in the research stage. One can expect that a knowledge orientation to the problem of software design will have considerable impact.

New High-Order Languages

The major advances of the 1960s can be attributed to the use of high-order languages, but it is doubtful that current language improvements will have much impact on productivity. Many proven modern programming concepts have been incorporated into Ada, and the commitment to this language clearly will familiarize developers with those concepts and thereby improve both product quality and productivity. At another extreme, 4GLs offer tools to end users and designers that, for a narrow application domain, yield a tenfold improvement in productivity at a price in performance. Still, neither high-order languages nor 4GLs can match the improvements we are witnessing in hardware cost performance.

Software Reuse

The concept of software reuse was first perceived in the context of a program library. As new tools have been developed, the goal of reusable components has expanded. For example, Ada packages that encapsulate abstracted code fragments can be shared and reused. The artificial-intelligence-based knowledge perspective also suggests ways to reuse conceptual units having a granularity finer than code fragments and program libraries. Finally, the extension of 4GL techniques provides a mechanism for reusing application-class conventions with a natural human interface.

Training and Domain Specialization

All software development requires some domain knowledge. In the early days of computing, the programmer's knowledge of the new technology was the key, and the domain specialist explained what was needed. Today, almost every recent college graduate knows more about computer science than did those early programmers. Thus, there is an emphasis on building applications. As more tools become available, one can expect software developers to divide into two classes. The software engineer will, as the name implies, practice engineering discipline in the development of complex software products, such as embedded applications and computer tools for end users. The domain specialists will use those tools together with the domain knowledge to build applications that solve problems in their special environment. We can see this trend in the difference between Ada and the 4GLs. Ada incorporates powerful features that are not intuitively obvious; the features are built on a knowledge of computer science and must be learned. The 4GLs, however, offer an implementation perspective that is conceptually close to the end user's view. The software engineer builds the language; the domain specialist uses it.

WHAT OTHERS SAY

What do the experts in software engineering say about the future of this discipline and the hope for significant improvements in productivity? In explaining why the Strategic Defense Initiative is beyond the ability of current (and near-term) software practice, Parnas15 offered a negative critique of most research paths. He said that the problem involves complex real-time communication demands, adding that there is limited experience in designing programs of this architecture and magnitude and that there is no way to test the system thoroughly. No ongoing approach, he concluded, could overcome these difficulties.

Boehm,1 in an article on improving productivity, was more positive. Speaking of state-of-the-art software applications, he offered this advice: write less code, reduce rework, and reuse software, especially commercially available products, where possible.

Brooks16 discusses the possibility of improving software productivity. He has catalogued research directions in some detail and concluded that the biggest payoff would come from buying rather than building, learning by prototyping, building systems incrementally, and, most important to him, training and rewarding great designers. Of those four recommendations, the first reflects the spinning off of tools that can be used by domain specialists, and the next two relate to the need to build up knowledge about an application before it can be implemented. The last of Brooks's positive approaches recognizes that software design (like every other creative activity) depends on, and is limited by, the individual's ability, experience, understanding, and discipline.

REFERENCES and NOTES

1. B. W. Boehm, "Improving Software Productivity," IEEE Computer 20, 43-57 (1987).
2. B. W. Boehm, "A Spiral Model of Software Development and Enhancement," IEEE Computer 21, 61-72 (1988).
3. B. W. Boehm, "Industrial Software Metrics Top 10 List," IEEE Software, 84-85 (Sep 1987).
4. Two books that are highly recommended are R. Fairley, Software Engineering Concepts, McGraw-Hill, New York (1985), and R. Pressman, Software Engineering: A Practitioner's Approach, 2nd ed., McGraw-Hill, New York (1987).
5. Ada is a registered trademark of the U.S. Government, Ada Joint Program Office.
6. DOD-STD-2167A, "Military Standard Defense System Software Development" (29 Feb 1988).
7. MIL-STD-1679 (Navy), "Military Standard Weapon Software Development" (1 Dec 1978).
8. SECNAVINST 3560.1, "Tactical Digital Systems Documentation Standards" (8 Aug 1974).
9. D. F. Sterne, M. E. Schmid, M. J. Gralia, T. A. Grobicki, and R. A. R. Pearce, "Use of Ada for Shipboard Embedded Applications," Annual Washington Ada Symp., Washington, D.C. (24-26 Mar 1985).
10. S. J. Mellor and P. T. Ward, Structured Development for Real-Time Systems, Prentice-Hall, Englewood Cliffs, N.J. (1986).
11. R. J. A. Buhr, System Design With Ada, Prentice-Hall, Englewood Cliffs, N.J. (1984).
12. G. Tice, "Looking at Standards from the World View," IEEE Software 5, 82 (1988).
13. V. G. Sigillito, B. I. Blum, and P. H. Loy, "Software Engineering in The Johns Hopkins University Continuing Professional Programs," 2nd SEI Conf. on Software Engineering Education, Fairfax, Va. (28-29 Apr 1988).
14. J. Foreman and J. Goodenough, Ada Adoption Handbook: A Program Manager's Guide, CMU/SEI-87-TR-9, Software Engineering Institute (May 1987).
15. D. L. Parnas, "Aspects of Strategic Defense Systems," Commun. ACM 28, 1326-1335 (1985).
16. F. P. Brooks, "No Silver Bullet," IEEE Computer 20, 10-19 (1987).

ACKNOWLEDGMENTS: The authors gratefully acknowledge the very helpful suggestions of J. E. Coolahan, M. J. Gralia, R. S. Grossman, and J. G. Palmer.

THE AUTHORS

BRUCE I. BLUM was born in New York City. He holds M.A. degrees in history (Columbia University, 1955) and mathematics (University of Maryland, 1964). In 1962, he joined APL, where he worked as a programmer in the Computer Center. During 1967-74, he worked in private industry, returning to APL in 1974. His special interests include information systems, applications of computers to patient care, and software engineering. From 1975-83, he served as director of the Clinical Information Systems Division, Department of Biomedical Engineering, The Johns Hopkins University.

THOMAS P. SLEIGHT received his Ph.D. from the State University of New York at Buffalo in 1969. Before joining APL, he spent a year as a postdoctoral fellow at Leicester University, England. At APL, Dr. Sleight has applied computers to scientific defense problems. He has served as computer systems technical advisor to the Assistant Secretary of the Navy (R&D) and on the Ballistic Missile Defense Advanced Technology Center's Specification Evaluation Techniques Panel. He has participated in the DoD Weapons Systems Software Management Study, which led to the DoD directive on embedded computer software management. Dr. Sleight served as supervisor of the Advanced Systems Design Group from 1977-82 in support of the Aegis Program and the AN/UYK-43 Navy shipboard mainframe computer development and test program. Since 1982, he has served in the Director's Office, where he is responsible for computing and information systems.
