An Overview of Software Engineering
Computer software has become an important component of our defense systems and our everyday
lives, but software development is both difficult and costly. This article examines the similarities and
differences between software and hardware development, the essence of software, modern practices used
to support the software process, and the application of government methods. We also consider the role
of standards and technology centers, and conclude with a view into the future.
INTRODUCTION
Software is a part of everyday life at work and at home.
Examples of composition techniques include the Jackson System Design and object-oriented design ("design" implying that the process steps have considerable overlap). In the Jackson System Design, the target system is represented as a discrete simulation, and the implementation is considered a set of communicating sequential processes; that is, the method allows for the modeling of the real-world environment as a computer simulation, which then is transformed into a set of sequential programs that can operate asynchronously. Conversely, object-oriented design first identifies the real-world objects that the desired system must interact with and then considers how those objects interact with each other. There are several versions of object-oriented design, but experience with its use is limited.
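To make the contrast concrete, the sketch below is our own illustration, not an excerpt from any method text, and it is written in C even though the design idea is language independent. The object-oriented starting point is to identify a real-world object, here a hypothetical radar track, and to record the operations through which other parts of the system interact with it, deferring every decision about internal representation.

    /* Object-oriented design sketch: identify the real-world
       object first, then define its interactions; the internal
       representation is a later, separate decision. */
    #include <stdio.h>

    typedef struct {
        int    id;         /* track identifier             */
        double range_km;   /* current range to the contact */
        double bearing;    /* bearing, degrees true        */
    } Track;

    /* Interactions are operations on the object, not steps in
       a global procedure. */
    void track_update(Track *t, double range_km, double bearing)
    {
        t->range_km = range_km;
        t->bearing  = bearing;
    }

    void track_report(const Track *t)
    {
        printf("track %d: %.1f km at %.1f deg\n",
               t->id, t->range_km, t->bearing);
    }

    int main(void)
    {
        Track t = { 42, 118.0, 271.5 };
        track_update(&t, 116.5, 270.9); /* sensor interaction  */
        track_report(&t);               /* display interaction */
        return 0;
    }

A Jackson-style design of the same fragment would instead begin with the time-ordered stream of sensor events in the environment and derive a set of communicating processes from that model.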
Programming in the Large-Design

The design process begins after there is a specification establishing what functions the software is to provide. From the discussion of analysis, we see that there is no precise division between analysis (the decision of what is to be done) and design (the determination of how to realize it). There sometimes is a contractual need to establish what the procured software is to provide, so the specification becomes part of the contract that defines the deliverable. In the essential model of the software process, however, there is continuity between analysis and design activities, and the methods often support both activities.

The basic process is one of modeling the software system and adding details until there is sufficient information to convert the design into a realization (i.e., program). Design always begins with a specification, which is a product of the analysis step. At times, the specification is a formal document establishing a set of requirements. Here, it is important to maintain traceability to ensure that all design decisions are derived from a requirement and that all requirements are satisfied in the design (i.e., there are neither extra features nor omissions). At other times (e.g., in the internal development of a product), the specification is less formal, and additional subjectivity is needed to determine that the design decisions are valid.

For any set of requirements, there are many equally correct designs. The task of the design team is to select among the alternatives those system decisions yielding
tools have been developed for a specific class of application (information processing) that facilitate the development of programs at a very high level. For example, one can produce a report simply by describing the content and format of the desired output; one does not have to describe procedurally how it should be implemented. (Thus, 4GLs generally are described as being nonprocedural or declarative.)
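The flavor of this nonprocedural style can be suggested even in a conventional language. In the hypothetical C sketch below (our illustration, not actual 4GL output), the report is described as data, the columns and their formats, and one generic routine interprets that description; a real 4GL also supplies the data selection and layout machinery behind such a description.

    /* The report is described (content and format) rather than
       coded procedurally; a generic routine interprets the
       description, so only the description changes per report. */
    #include <stdio.h>

    struct column { const char *title; const char *format; };

    static const struct column layout[] = {
        { "PART", "%-10s" },
        { "QTY",  "%8s"   },
    };

    static const char *rows[][2] = {
        { "bolt", "120" },
        { "nut",  "300" },
    };

    int main(void)
    {
        int ncols = sizeof layout / sizeof layout[0];
        int nrows = sizeof rows / sizeof rows[0];
        int r, c;

        for (c = 0; c < ncols; c++)   /* header comes from the */
            printf(layout[c].format, layout[c].title);
        printf("\n");                 /* description itself    */
        for (r = 0; r < nrows; r++) {
            for (c = 0; c < ncols; c++)
                printf(layout[c].format, rows[r][c]);
            printf("\n");
        }
        return 0;
    }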
Validation and Verification

In the traditional descriptive flow for software development, the activity that precedes operations and maintenance is called "test." Testing is the process of detecting errors. A good test discovers a previously undetected error. Thus, testing is related to defect removal; it can begin only when some part of the product is completed and there are defects to be removed.

The validation and verification activity includes the process of testing. But it begins well before there is a product to be tested and involves more than the identification of defects. Validation comes from the Latin validus, meaning strength or worth. It is a process of predicting how well the software product will correspond to the needs of the environment (i.e., will it be the right system?). Verification comes from the Latin verus, meaning truth. It determines the correctness of a product with respect to its specification (i.e., is the system right?).

Validation is performed at two levels. During the analysis step, validation supplies the feedback to review decisions about the potential system. Recall that analysis requires domain understanding and subjective decisions. The domain knowledge is used to eliminate improper decisions and to suggest feasible alternatives. The ranking of those alternatives relies on the analysts' experience and judgment. The review of these decisions is a cognitive (rather than a logically formal) activity. There is no concept of formal correctness; in fact, the software's validity can be established only after it is in place. (Prototypes and the spiral model both are designed to deal with the analyst's inability to define a valid specification.)

The second level of validation involves decisions made within the context of the specification produced by the analysis activity. This specification describes what functions should be supported by the software product (i.e., its behavior). The specification also establishes nonfunctional requirements, such as processing time constraints and storage limitations. The product's behavior can be described formally; in fact, the program code is the most complete expression of that formal statement. Nonfunctional requirements, however, can be demonstrated only when the product is complete.

Validation and verification are independent concepts. A product may be correct with respect to the contractual specification, but it may not be perceived as a useful product. Conversely, a product may correspond to the environment's needs even though it deviates from its specified behavior. Also, validation always relies on judgment, but verification can be formalized. Finally, both validation and verification can be practiced before there is code to be tested; failure to exercise quality control early in the development process will result in the multiplication of early errors and a relatively high cost of correction per defect.

Before a formal specification exists (one that can be subjected to logical analysis), the primary method for both verification and validation is the review. In the software domain, this is sometimes called a walk-through or inspection, which frequently includes the review of both design documents and preliminary code. The review process is intended to identify errors and misunderstandings. There also are management reviews that establish decision points before continuing with the next development step. The two types of reviews have different functions, and they should not be confused or combined. Management reviews should occur after walk-throughs have been completed and technical issues resolved.

Most software tests are designed to detect errors, which sometimes can be identified by examining the program text. The tools that review the text are called "static analyzers." Some errors they can detect (such as identifying blocks of code that cannot be reached) can be recognized by compilers. Other forms of analysis rely on specialized, stand-alone software tools.
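The kind of error involved can be seen in a few lines of C (a contrived fragment of ours); the analyzer needs only the program text, never an execution:

    #include <stdio.h>

    int sign(int x)
    {
        if (x >= 0)
            return 1;
        else
            return -1;
        return 0;   /* unreachable: a static analyzer (or a
                       compiler warning) reports this from the
                       text alone */
    }

    int main(void)
    {
        printf("%d %d\n", sign(5), sign(-5));  /* prints: 1 -1 */
        return 0;
    }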
"Dynamic analysis" tests, concerned with how the program operates, are divided into two categories. "White box" tests are designed to exercise the program as implemented. The assumption is that the errors are random; each path of the program, therefore, should be exercised at least once to uncover problems such as the use of the wrong variable or predicate. "Black box" tests evaluate only the function of the program, independent of its implementation.
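A contrived C example of ours shows why each path should be exercised: the wrong-variable defect below survives the first two test cases and is exposed only when the data force execution of the second branch.

    /* White-box testing sketch: a wrong-variable defect that
       only a particular execution path reveals. */
    #include <assert.h>

    int max3(int a, int b, int c)
    {
        int m = a;
        if (b > m) m = b;
        if (c > m) m = a;   /* DEFECT: should read  m = c;  */
        return m;
    }

    int main(void)
    {
        assert(max3(3, 2, 1) == 3); /* no branch taken: passes */
        assert(max3(1, 3, 2) == 3); /* first branch: passes    */
        assert(max3(1, 2, 3) == 3); /* second branch: fails,   */
                                    /* exposing the defect     */
        return 0;
    }

A black-box test could stumble on the same failure, but only white-box reasoning guarantees that the offending path is tried at least once.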
As with equipment testing, software testing is organized into levels. Each program is debugged and tested by the individual programmer. This is called unit testing. Individual programs next are integrated and tested as larger components, which are then function tested to certify that they provide the necessary features. Finally, the full system is tested in an operational setting, and a decision is made to deploy (or use) the product. Naturally, if the software is part of an embedded system, then, at some level, the software tests are integrated with the hardware tests.
Management

We have so far emphasized the essential features of software development; that is, what makes the development process unique for this category of product. Some characteristics of the process make it difficult: the software can be very complex, which introduces the potential for many errors; the process is difficult to model in terms of physical reality; there is always a strong temptation to accommodate change by modifying the programs; and, finally, the product is always subject to change. (In fact, the lifetime cost for adaptation and enhancement of a software product usually exceeds its development cost.)

The management of a software project is similar to the management of any other technical project. Managers must identify the areas of highest risk and the strategies for reducing that risk; they must plan the sequence of project activities and recognize when deviations are imminent; they must budget time, personnel, and dollar resources and adjust these factors throughout the process; they must maintain control over the completed products; and they must establish procedures to ensure a high level of product quality.
As with any project management assignment, the manager must understand something about both the domain of application and the technology to be applied. In well-understood problem areas, this knowledge is less critical because design is reduced to the detailing of some existing design concept. But in new domains there are uncertainties, and management must be sensitive to the early resolution of high-risk problems. (This is one area in which prototyping can be most effective; another is the examination of the human interface.)

Although software engineering is a relatively new discipline, there are many tools available to help support its management. Cost-projection tools have been produced that allow a manager to build upon previous experience to estimate cost and schedule. Commercial "configuration control" systems manage the software versions and supply mechanisms to insulate programs from unauthorized or uncertified changes. Many modern program-support environments also contain tools that give management easy access to schedule and status information.
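Such cost-projection tools are typically built on empirical models. As an illustration (the coefficients below are Boehm's published basic-COCOMO values for "organic" projects, not an APL calibration), effort and schedule can be projected as power functions of estimated size:

    /* Sketch of an empirical cost-projection model: basic
       COCOMO, organic-mode coefficients (Boehm, 1981). */
    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        double kloc = 32.0;                    /* size, KLOC   */
        double effort = 2.4 * pow(kloc, 1.05); /* staff-months */
        double months = 2.5 * pow(effort, 0.38);

        printf("%.0f KLOC: %.0f staff-months over %.0f months "
               "(about %.0f staff)\n",
               kloc, effort, months, effort / months);
        return 0;
    }

For a 32-KLOC project the model suggests roughly 90 staff-months spread over about 14 months; the real value of such a tool comes from recalibrating the coefficients against one's own completed projects.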
GOVERNMENT METHODS APPLIED AT APL

The approaches to software engineering taken by APL's sponsoring organizations must be considered both when an operational system is delivered and when a prototype is developed that will evolve into government specifications and procurements. For many experiments (e.g., at-sea or space-based), the whole effort is oriented toward quick deployment of existing or slightly modified sensors and support for recording, telemetry, and analysis. There are no sponsor-specified approaches, and development responsiveness often is the critical component. The software becomes a crucial integrating element. Since existing software must be modified, software engineering techniques are applied less formally.

In its work on Navy tactical systems, however, APL relies more on the use of standards. The development and acquisition of mission-critical defense-systems software is governed by DOD-STD-2167A,6 a standard specifying a number of documents that should be generated during software development. This document-driven approach has been criticized for the lack of proper document selection and tailoring by the government and for ignoring several modern software processes or development strategies as described above. Although this standard has some drawbacks, it has provided a sound and consistent basis for software development over many years. Recently, it was modified (Revision A) to reflect better the methods available via the Ada language. This standards approach is not new; the DOD standard originated in earlier internal Navy standards.7,8 The other main component in the Navy's approach is the use of specific programming languages. The Navy has attempted for many years to standardize languages, beginning with CS-1 in the 1960s, through CMS-2 and the introduction of Ada.

The Navy also has standardized computer hardware, in particular, the AN/UYK-20, and more recently the AN/UYK-44 and AN/AYK-14, which are the 16-bit common instruction set standards. The AN/UYK-7 is upward-compatible with the newer 32-bit AN/UYK-43 standard computer. For the Navy to support those computers with the new Ada language, the Ada Language System/Navy project is tasked to develop production-quality Ada translators (first released in June 1988) for the AN/UYK-44, AN/AYK-14, and AN/UYK-43. Considerable emphasis has been placed on the support of fast interrupts, a requirement of many Navy embedded systems. Within a common command language framework, there are tools for version and database maintenance, software-problem-report tracking, documentation updating, and report generation.

The Ada language has matured, and the translators are rapidly becoming viable production tools. Some recent benchmark programs executing on the MC 68020 processor indicate that the Ada code is almost as fast as code generated by popular C-language translators. Ada is now a workable language for military use, and the benefits of the object-oriented approach can be obtained from specification through operational code and, most important, during evolution and maintenance.

APL has explored Ada applications for several tactical systems. Whenever a mature Ada translator and target run-time system have been available, the projects have been successful. One example is the redesign and reimplementation of a simplified version of the Aegis Command and Decision System.9 This project, while successful, also identified several problems with the 1985 Ada technology. Many of those problems have disappeared with more mature Ada products. In follow-on projects, the reuse of both internally developed and externally available Ada modules has been successful. Also, Ada module integration has been faster, and fewer errors were detected than with other languages.

Most software development at APL involves the support of experimental or "one of a kind" systems. Although these systems frequently require the proper use of software engineering techniques, they typically are not controlled by the sponsor as is a mission-critical Navy system (i.e., DOD-STD-2167A). This offers an opportunity to try new techniques. For example, in some satellite altimeter work for NASA, the DeMarco and Yourdon methodology, with Ward/Mellor real-time extensions,10 is being successfully applied using new CASE tools. The recent APL projects requiring Ada have used the Buhr object-oriented design approach,11 which (with some modification) has been particularly useful when the target language is known to be Ada. Some projects derived their approach from a combination and tailoring of DOD-STD-2167A and emerging IEEE software standards. This approach provides guidelines and templates, and helps to alleviate problems arising from staff reassignment and short corporate memory.
for building requirement and design specifications and related code, synthesizing prototypes, performing dynamic assessments, and managing software development projects. At a recent meeting of companies developing and marketing CASE tools, SPC launched an initiative to establish an industrywide consensus on effective tool-to-tool interface standards. Those standards will represent the first steps in building an integrated environment.

MCC was established by 21 shareholder companies in 1983. The consortium has several research programs, ranging from semiconductor packaging to software technology. Each program is sponsored by a subset of participating shareholder companies.

The software technology program focuses on the front end, upstream in the software cycle, where little research has been performed. This program has created a computer-aided software design environment called Leonardo. The environment is to concentrate on requirements capture, exploration, and early design. Academic research has focused on downstream activities, where formalism and automation are more obvious. MCC's research emphasis is on defining and decomposing a large problem into smaller problems and on selecting algorithms, and is geared to teams of professional software engineers working on large, complex systems. Researchers are working on the Leonardo architecture and three components: complex design processes, a design information base, and design visualization.

The corporation does not consider its research complete until it is put to use by the sponsoring companies. Also, MCC believes it is easier to transfer and establish tools and make them consistent than to transfer methodologies and methods.

A VIEW TO THE FUTURE

We began this article with a review of how software differed from hardware and noted that, once the technical manager understands the software process, the management of software is much like that of hardware. We then described the software process, characterized more by variety than by clarity and consistency. Despite our significant accomplishments with software, there remain conflicting methods, limited formal models, and many unsubstantiated biases. But we present below some significant trends in the field.
Formalization

Some new paradigms extend the formalism of the programming language into an executable specification. A specification defines the behavior for all implementations. An executable specification does not exhibit the intended behavior efficiently, but a program is an implementation of that behavior, optimized for a specific computer. We see this because there are systems that we know how to specify exactly, but we do not know how to implement them efficiently. For example, one can specify what a chess-playing program should do without being able to describe an efficient implementation. The hope is that the executable specification will supply a prototype for experimentation that ultimately can be transformed into an efficient program. But the concept has not been demonstrated outside the laboratory, and it is not clear how this process model can be managed.
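The idea can be made concrete with something far simpler than chess (a sketch of ours): the integer square root is specified by a condition that can be executed directly, though inefficiently, while an optimized program implements the same behavior.

    /* Executable specification vs. optimized implementation.
       The specification states *what* is required: r is the
       integer square root of n when r*r <= n < (r+1)*(r+1). */
    #include <assert.h>

    /* Specification, executed directly: try candidates in
       order until the defining condition holds. */
    unsigned isqrt_spec(unsigned n)
    {
        unsigned r = 0;
        while (!(r * r <= n && n < (r + 1) * (r + 1)))
            r++;
        return r;
    }

    /* Implementation: same behavior, found by binary search. */
    unsigned isqrt_fast(unsigned n)
    {
        unsigned lo = 0, hi = n / 2 + 1;
        while (lo < hi) {            /* invariant: lo*lo <= n */
            unsigned mid = (lo + hi + 1) / 2;
            if (mid * mid <= n) lo = mid; else hi = mid - 1;
        }
        return lo;
    }

    int main(void)
    {
        unsigned n;
        for (n = 0; n < 1000; n++)   /* spec as a test oracle */
            assert(isqrt_fast(n) == isqrt_spec(n));
        return 0;
    }

Note that the executable specification also serves as a test oracle for the optimized program, which is one way such prototypes earn their keep.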
Automatic Verification

In the discussion of validation and verification, we noted that proofs (verification) could be objective only when the parent specification was clear (formal). There is, therefore, considerable interest in working with mathematically formal specifications at an early stage in the design process, since it then would be possible to prove that each detailing step was correct with respect to this high-level source. A theorem prover could be used to automate the process. Testing then would not be necessary, because no errors would exist. Naturally, validation still would be required.
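The obligations such a prover would discharge can be suggested with Hoare-style annotations. In the C sketch below (ours, with run-time assertions standing in for machine-checked proofs), the loop invariant is what justifies the postcondition:

    /* Each assertion is a proof obligation against the formal
       specification  {n >= 0} sum(n) {result == n*(n+1)/2}. */
    #include <assert.h>

    int sum(int n)
    {
        int i = 0, s = 0;
        /* loop invariant: s == i*(i+1)/2  and  0 <= i <= n */
        while (i < n) {
            i = i + 1;
            s = s + i;
            assert(s == i * (i + 1) / 2); /* invariant holds */
        }
        /* invariant plus i == n implies the postcondition */
        assert(s == n * (n + 1) / 2);
        return s;
    }

    int main(void)
    {
        assert(sum(10) == 55);   /* 1 + 2 + ... + 10 */
        return 0;
    }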
Automated Tools and Environments

There is considerable interest in the use of CASE tools and in program support environments. Unlike the formalisms addressed above, most tools and environments are commercial products that implement techniques developed in the mid-1970s. Thus, these tools and environments respond to a marketplace demand and provide a means for making the application of current practice more efficient. Their primary benefit is one of reducing the manual effort and thereby the number of errors introduced. As new paradigms are introduced, this focus may limit their use or force changes in the approaches taken.

Artificial Intelligence and Knowledge Representation

Although there are many definitions of artificial intelligence and debates about what it accomplishes, it has had an impact on our perceptions of what computers can do and how to approach problems. The software process is one of representing knowledge about a problem in a way that facilitates its transformation (detailing) into an implementable solution. Thus, there are many software engineering methods and tools that owe their origins to artificial intelligence. Some projects, such as the development of object-oriented programming, have been successful and are available to developers; many others still are in the research stage. One can expect that a knowledge orientation to the problem of software design will have considerable impact.

New High-Order Languages

The major advances of the 1960s can be attributed to the use of high-order languages, but it is doubtful that current language improvements will have much impact on productivity. Many proven modern programming concepts have been incorporated into Ada, and the commitment to this language clearly will familiarize developers with those concepts and thereby improve both product quality and productivity. At another extreme, 4GLs offer tools to end users and designers that, for a narrow application domain, yield a tenfold improvement in productivity at a price in performance. Still, neither high-order languages nor 4GLs can match the improvements we are witnessing in hardware cost performance.