UNIT-II Notes
The old way and the new: The principles of conventional software Engineering, principles of
modern software management, transitioning to an iterative process.
Life cycle phases: Engineering and production stages; the inception, elaboration, construction,
and transition phases.
Artifacts of the process: The artifact sets, management artifacts, engineering artifacts,
pragmatic artifacts.
22. Avoid tricks. Many programmers love to create programs with tricks: constructs that
perform a function correctly, but in an obscure way. Show the world how smart you are by
avoiding tricky code.
23. Encapsulate. Information-hiding is a simple, proven concept that results in software that is
easier to test and much easier to maintain.
24. Use coupling and cohesion. Coupling and cohesion are the best ways to measure software's
inherent maintainability and adaptability.
25. Use the McCabe complexity measure. Although there are many metrics available to report
the inherent complexity of software, none is as intuitive and easy to use as Tom McCabe's
(see the sketch after this list).
26. Don't test your own software. Software developers should never be the primary testers of
their own software.
27. Analyze causes for errors. It is far more cost-effective to reduce the effect of an error by
preventing it than it is to find and fix it. One way to do this is to analyze the causes of errors
as they are detected.
28. Realize that software's entropy increases. Any software system that undergoes continuous
change will grow in complexity and will become more and more disorganized.
29. People and time are not interchangeable. Measuring a project solely by person-months
makes little sense.
30. Expect excellence. Your employees will do much better if you have high expectations for
them.
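To make principle 25 concrete, here is a minimal sketch with invented function names. For a control-flow graph with E edges, N nodes, and P connected components, McCabe's cyclomatic complexity is V(G) = E - N + 2P; for a single-entry, single-exit routine this reduces to the number of decision points plus one. The counter below is only an AST approximation, not a full flow-graph analysis.

```python
import ast
import inspect

def cyclomatic_complexity(func):
    """Approximate V(G): count decision nodes in the AST, then add one.
    A rough sketch; real tools build the full control-flow graph and also
    count each boolean operator as an extra path."""
    tree = ast.parse(inspect.getsource(func))
    decisions = sum(isinstance(node, (ast.If, ast.For, ast.While, ast.IfExp))
                    for node in ast.walk(tree))
    return decisions + 1

def classify_reading(reading):
    """Invented example routine with three decision points, so V(G) = 4."""
    if reading is None:
        return "missing"
    if reading < 0:
        return "invalid"
    if reading < 100:
        return "normal"
    return "saturated"

print(cyclomatic_complexity(classify_reading))  # -> 4
```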
The top 10 principles of modern software management are as follows. (The first five, which are
the main themes of my definition of an iterative process, are summarized in Figure 2.1.)
1. Base the process on an architecture-first approach. This requires that a demonstrable
balance be achieved among the driving requirements, the architecturally significant
design decisions, and the life-cycle plans before the resources are committed for
full-scale development.
2. Establish an iterative life-cycle process that confronts risk early. With today's
sophisticated software systems, it is not possible to define the entire problem, design the
entire solution, build the software, and then test the end product in sequence. Instead, an
iterative process that refines the problem understanding, an effective solution, and an
effective plan over several iterations encourages a balanced treatment of all stakeholder
objectives. Major risks must be addressed early to increase predictability and avoid
expensive downstream scrap and rework.
3. Transition design methods to emphasize component-based development. Moving
from a line-of-code mentality to a component-based mentality is necessary to reduce the
amount of human-generated source code and custom development.
The economic benefits inherent in transitioning from the conventional waterfall model to an
iterative development process are significant but difficult to quantify. As one benchmark of the
expected economic impact of process improvement, consider the process exponent parameters of
the COCOMO II model. This exponent can range from 1.01 to 1.26. The parameters that govern
the value of the process exponent are application precedentedness, process flexibility,
architecture risk resolution, team cohesion, and software process maturity.
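As a rough illustration of the leverage in that exponent (the project size and coefficient below are assumptions for arithmetic only, not values from the model tables): COCOMO II estimates effort as Effort = A × Size^E person-months, where A is a calibrated coefficient and Size is in KSLOC. For a 100-KSLOC project, an exponent of 1.01 gives A × 100^1.01 ≈ 105A person-months, while an exponent of 1.26 gives A × 100^1.26 ≈ 331A person-months, roughly a threefold difference in estimated effort attributable to the five exponent parameters alone.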
The following paragraphs map the process exponent parameters of COCOMO II to my top 10
principles of a modern process.
Team cohesion. Successful teams are cohesive, and cohesive teams are successful. Successful
teams and cohesive teams share common objectives and priorities. Advances in technology (such
as programming languages, UML, and visual modeling) have enabled more rigorous and
understandable notations for communicating software engineering information, particularly in
the requirements and design artifacts that previously were ad hoc and based completely on paper
exchange. These model-based formats have also enabled the round-trip engineering support
needed to establish change freedom sufficient for evolving design representations.
Software process maturity. The Software Engineering Institute's Capability Maturity Model
(CMM) is a well-accepted benchmark for software process assessment. One of its key themes is that
truly mature processes are enabled through an integrated environment that provides the
appropriate level of automation to instrument the process for objective quality control.
LIFE CYCLE PHASES
Successful modern projects, and even successful projects developed under the conventional
process, tend to have a very well-defined project milestone when there is a noticeable transition
from a research attitude to a production attitude. Earlier phases focus on achieving functionality.
Later phases revolve around achieving a product that can be shipped to a customer, with explicit
attention to robustness, performance, and finish.
The transition between engineering and production is a crucial event for the various stakeholders.
The production plan has been agreed upon, and there is a good enough understanding of the
problem and the solution that all stakeholders can make a firm commitment to go ahead with
production.
The engineering stage is decomposed into two distinct phases, inception and elaboration, and the
production stage into construction and transition. These four phases of the life-cycle process
are loosely mapped to the conceptual framework of the spiral model, as shown in Figure 2.2.
INCEPTION PHASE
The overriding goal of the inception phase is to achieve concurrence among stakeholders on the
life-cycle objectives for the project.
Primary Objectives
Establishing the project's software scope and boundary conditions, including an
operational concept, acceptance criteria, and a clear understanding of what is and is not
intended in the product
Discriminating the critical use cases of the system and the primary scenarios of operation
Demonstrating at least one candidate architecture against the primary scenarios
Estimating the cost and schedule for the entire project
Estimating potential risks (the sources of unpredictability)
Essential Activities
Formulating the scope of the project. The information repository should be sufficient to
define the problem space and derive the acceptance criteria for the end product.
Synthesizing the architecture. An information repository is created that is sufficient to
demonstrate the feasibility of at least one candidate architecture and an initial baseline
of make/buy decisions so that the cost, schedule, and resource estimates can be derived.
Planning and preparing a business case. Alternatives for risk management, staffing,
iteration plans, and cost/schedule/profitability trade-offs are evaluated.
Primary Evaluation Criteria
Do all stakeholders concur on the scope definition and cost and schedule estimates?
Are requirements understood, as evidenced by the fidelity of the critical use cases?
Are the cost and schedule estimates, priorities, risks, and development processes
credible?
Do the depth and breadth of an architecture prototype demonstrate the preceding criteria?
(The primary value of prototyping candidate architecture is to provide a vehicle for
understanding the scope and assessing the credibility of the development group in
solving the particular technical problem.)
Are actual resource expenditures versus planned expenditures acceptable?
ELABORATION PHASE
At the end of this phase, the "engineering" is considered complete. The elaboration phase
activities must ensure that the architecture, requirements, and plans are stable enough, and the
risks sufficiently mitigated, that the cost and schedule for the completion of the development can
be predicted within an acceptable range. During the elaboration phase, an executable architecture
prototype is built in one or more iterations, depending on the scope, size, and risk.
Primary Objectives
Baselining the architecture as rapidly as practical (establishing a configuration-managed
snapshot in which all changes are rationalized, tracked, and maintained)
Baselining the vision
Baselining a high-fidelity plan for the construction phase
Demonstrating that the baseline architecture will support the vision at a reasonable cost
in a reasonable time
Essential Activities
Elaborating the vision.
Elaborating the process and infrastructure.
Elaborating the architecture and selecting components.
CONSTRUCTION PHASE
During the construction phase, all remaining components and application features are integrated
into the application, and all features are thoroughly tested. Newly developed software is
integrated where required. The construction phase represents a production process, in which
emphasis is placed on managing resources and controlling operations to optimize costs,
schedules, and quality.
Primary Objectives
Minimizing development costs by optimizing resources and avoiding unnecessary scrap
and rework
Achieving adequate quality as rapidly as practical
Achieving useful versions (alpha, beta, and other test releases) as rapidly as practical
Essential Activities
Resource management, control, and process optimization
Complete component development and testing against evaluation criteria
Assessment of product releases against acceptance criteria of the vision
Evaluation Criteria
Is this product baseline mature enough to be deployed in the user community? (Existing
defects are not obstacles to achieving the purpose of the next release.)
Is this product baseline stable enough to be deployed in the user community? (Pending
changes are not obstacles to achieving the purpose of the next release.)
Are the stakeholders ready for transition to the user community?
Are actual resource expenditures versus planned expenditures acceptable?
TRANSITION PHASE
The transition phase is entered when a baseline is mature enough to be deployed in the end-user
domain. This typically requires that a usable subset of the system has been achieved with
acceptable quality levels and user documentation so that transition to the user will provide
positive results. This phase could include activities such as beta testing against user
expectations, operation in parallel with a legacy system being replaced, conversion of
operational databases, and training of users and maintainers.
Primary Objectives
Achieving user self-supportability
Achieving stakeholder concurrence that deployment baselines are complete and
consistent with the evaluation criteria of the vision
Achieving final product baselines as rapidly and cost-effectively as practical
Essential Activities
Synchronization and integration of concurrent construction increments into consistent
deployment baselines
Deployment-specific engineering (cutover, commercial packaging and production, sales
rollout kit development, field personnel training)
Assessment of deployment baselines against the complete vision and acceptance criteria
in the requirements set
Evaluation Criteria
Is the user satisfied?
Are actual resource expenditures versus planned expenditures acceptable?
ARTIFACTS OF THE PROCESS
The artifacts of the process are organized into a management set and four engineering sets. The
engineering sets consist of the requirements set, the design set, the implementation set, and
the deployment set.
Requirements Set
Requirements artifacts are evaluated, assessed, and measured through a combination of the
following:
Analysis of consistency with the release specifications of the management set
Analysis of consistency between the vision and the requirements models
Mapping against the design, implementation, and deployment sets to evaluate the
consistency and completeness and the semantic balance between information in the
different sets
Analysis of changes between the current version of requirements artifacts and previous
versions (scrap, rework, and defect elimination trends)
Subjective review of other dimensions of quality
Design Set
UML notation is used to engineer the design models for the solution. The design set contains
varying levels of abstraction that represent the components of the solution space (their identities,
attributes, static relationships, dynamic interactions). The design set is evaluated, assessed, and
measured through a combination of the following:
Analysis of the internal consistency and quality of the design model
Analysis of consistency with the requirements models
Translation into implementation and deployment sets and notations (for example,
traceability, source code generation, compilation, linking) to evaluate the consistency and
completeness and the semantic balance between information in the sets
Analysis of changes between the current version of the design model and previous versions
(scrap, rework, and defect elimination trends)
Subjective review of other dimensions of quality
Implementation Set
The implementation set includes source code (programming language notations) that represents
the tangible implementations of components (their form, interface, and dependency
relationships).
Implementation sets are human-readable formats that are evaluated, assessed, and measured
through a combination of the following:
Analysis of consistency with the design models
Translation into deployment set notations (for example, compilation and linking) to evaluate
the consistency and completeness among artifact sets
Assessment of component source or executable files against relevant evaluation criteria
through inspection, analysis, demonstration, or testing
Execution of stand-alone component test cases that automatically compare expected results
with actual results
Analysis of changes between the current version of the implementation set and previous
versions (scrap, rework, and defect elimination trends)
Subjective review of other dimensions of quality
Deployment Set
The deployment set includes user deliverables and machine language notations, executable
software, and the build scripts, installation scripts, and executable target-specific data necessary
to use the product in its target environment.
Deployment sets are evaluated, assessed, and measured through a combination of the following:
Testing against the usage scenarios and quality attributes defined in the requirements set to
evaluate the consistency and completeness and the semantic balance between information in
the two sets
Testing the partitioning, replication, and allocation strategies in mapping components of the
implementation set to physical resources of the deployment system (platform type, number,
network topology)
Testing against the defined usage scenarios in the user manual such as installation,
user-oriented dynamic reconfiguration, mainstream usage, and anomaly management
Analysis of changes between the current version of the deployment set and previous
versions (defect elimination trends, performance changes)
Subjective review of other dimensions of quality
Each artifact set is the predominant development focus of one phase of the life cycle; the other
sets take on check and balance roles. As illustrated in Figure 2.4, each phase has a predominant
focus: Requirements are the focus of the inception phase; design, the elaboration phase;
implementation, the construction phase; and deployment, the transition phase. The management
artifacts also evolve, but at a fairly constant level across the life cycle.
The separation of the implementation set (source code) from the deployment set (executable code) is
important because there are very different concerns with each set. The structure of the information
delivered to the user (and typically the test organization) is very different from the structure of the source
code information. Engineering decisions that have an impact on the quality of the deployment
set but are relatively incomprehensible in the design and implementation sets include the
following:
Dynamically re-configurable parameters (buffer sizes, color palettes, number of servers,
number of simultaneous clients, data files, run-time parameters)
Effects of compiler/link optimization (such as space optimization versus speed
optimization)
Performance under certain allocation strategies (centralized versus distributed, primary
and shadow threads, dynamic load balancing, hot backup versus checkpoint/rollback)
Virtual machine constraints (file descriptors, garbage collection, heap size, maximum
record size, disk file rotations)
Process-level concurrency issues (deadlock and race conditions; a minimal sketch follows
this list)
Platform-specific differences in performance or behavior
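As a minimal, hypothetical sketch of the concurrency item above (the counter, thread count, and iteration count are invented for illustration), the following program looks innocuous in the implementation set, yet the deployed executable can lose updates because the shared read-modify-write is not atomic. Only exercising the deployment set under realistic concurrency exposes the defect.

```python
import threading

counter = 0  # shared mutable state

def bump(iterations):
    global counter
    for _ in range(iterations):
        counter += 1  # read, add, write: another thread can interleave between steps

threads = [threading.Thread(target=bump, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Expected 400000; depending on interpreter scheduling, the printed value
# may be lower because concurrent increments overwrite one another.
print(counter)
```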
Each state of development represents a certain amount of precision in the final system description. Early
in the life cycle, precision is low and the representation is general and high-level. Eventually, the
precision of representation is high and everything is specified in full detail. Each phase of development focuses on a
particular artifact set. At the end of each phase, the overall system state will have progressed on all sets,
as illustrated in Figure 2.5.
Figure 2.5: Life-cycle evolution of the artifact sets
The inception phase focuses mainly on critical requirements, usually with a secondary focus on
an initial deployment view. During the elaboration phase, there is much greater depth in
requirements, much more breadth in the design set, and further work on implementation and
deployment issues. The main focus of the construction phase is design and implementation. The
main focus of the transition phase is on achieving consistency and completeness of the
deployment set in the context of the other sets.
TEST ARTIFACTS
The test artifacts must be developed concurrently with the product from inception through
deployment. Thus, testing is a full-life-cycle activity, not a late life-cycle activity.
The test artifacts are communicated, engineered, and developed within the same artifact
sets as the developed product.
The test artifacts are implemented in programmable and repeatable formats (as software
programs; see the sketch after this list).
The test artifacts are documented in the same way that the product is documented.
Developers of the test artifacts use the same tools, techniques, and training as the software
engineers developing the product.
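As a hedged illustration of the "programmable and repeatable" point in the list above (the component and its expected values are invented, not taken from the text), the sketch below shows a stand-alone component test that automatically compares expected results with actual results, so the test artifact itself is executable software.

```python
def moving_average(samples, window):
    """Component under test (invented): trailing moving average."""
    if window <= 0 or window > len(samples):
        raise ValueError("window out of range")
    return [sum(samples[i - window + 1:i + 1]) / window
            for i in range(window - 1, len(samples))]

def test_moving_average():
    # Expected results computed by hand; the comparison is automatic,
    # so the test is repeatable against every baseline.
    actual = moving_average([1, 2, 3, 4], 2)
    expected = [1.5, 2.5, 3.5]
    assert actual == expected, f"expected {expected}, got {actual}"

if __name__ == "__main__":
    test_moving_average()
    print("moving_average: all checks passed")
```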
Test artifact subsets are highly project-specific; the following example clarifies the relationship
between test artifacts and the other artifact sets. Consider a project to perform seismic data
processing for the purpose of oil exploration.
Management set. The release specifications and release descriptions capture the objectives,
evaluation criteria, and results of an intermediate milestone. These artifacts are the test plans
and test results negotiated among internal project teams. The software change orders capture
test results (defects, testability changes, requirements ambiguities, enhancements) and the
closure criteria associated with making a discrete change to a baseline.
Requirements set. The system-level use cases capture the operational concept for the
system and the acceptance test case descriptions, including the expected behavior of the
system and its quality attributes. The entire requirement set is a test artifact because it is the
basis of all assessment activities across the life cycle.
Design set. A test model for nondeliverable components needed to test the product baselines is
captured in the design set. These components include such design set artifacts as a seismic
event simulation for creating realistic sensor data; a "virtual operator" that can support
unattended, after-hours test cases; specific instrumentation suites for early demonstration of
resource usage, transaction rates, or response times; and use case test drivers and component
stand-alone test drivers.
Implementation set. Self-documenting source code representations for test components and
test drivers provide the equivalent of test procedures and test scripts. These source files may
also include human-readable data files representing certain statically defined data sets that
are explicit test source files. Output files from test drivers provide the equivalent of test
reports.
Deployment set. Executable versions of test components, test drivers, and data files are
provided.
MANAGEMENT ARTIFACTS
The management set includes several artifacts that capture intermediate results and ancillary
information necessary to document the product/process legacy, maintain the product, improve
the product, and improve the process.
Business Case
The business case artifact provides all the information necessary to determine whether the project
is worth investing in. It details the expected revenue, expected cost, technical and management
plans, and backup data necessary to demonstrate the risks and realism of the plans. The main
purpose is to transform the vision into economic terms so that an organization can make an
accurate ROI assessment. The financial forecasts are evolutionary, updated with more accurate
forecasts as the life cycle progresses. Figure 2.6 provides a default outline for a business case.
Release Specifications
The scope, plan, and objective evaluation criteria for each baseline release are derived from the
vision statement as well as many other sources (make/buy analyses, risk management concerns,
architectural considerations, shots in the dark, implementation constraints, quality thresholds).
These artifacts are intended to evolve along with the process, achieving greater fidelity as the life
cycle progresses and requirements understanding matures. Figure 2.7 provides a default outline
for a release specification.
Release Descriptions
Release description documents describe the results of each release, including performance
against each of the evaluation criteria in the corresponding release specification. Release
baselines should be accompanied by a release description document that describes the evaluation
criteria for that configuration baseline and provides substantiation (through demonstration,
testing, inspection, or analysis) that each criterion has been addressed in an acceptable manner.
Figure 2.9 provides a default outline for a release description.
Status Assessments
Status assessments provide periodic snapshots of project health and status, including the software
project manager's risk assessment, quality indicators, and management indicators. Typical status
assessments should include a review of resources, personnel staffing, financial data (cost and
revenue), top 10 risks, technical progress (metrics snapshots), major milestone plans and results,
total project or product scope, and action items.
Figure 2.8: Typical software development plan outline
Deployment
A deployment document can take many forms. Depending on the project, it could include several
document subsets for transitioning the product into operational status. In big contractual efforts
in which the system is delivered to a separate maintenance organization, deployment artifacts
may include computer system operations manuals, software installation manuals, plans and
procedures for cut-over (from a legacy system), site surveys, and so forth. For commercial
software products, deployment artifacts may include marketing plans, sales rollout kits, and
training courses.
Environment
An important emphasis of a modern approach is to define the development and maintenance
environment as a first-class artifact of the process. A robust, integrated development
environment must support automation of the development process. This environment should
include requirements management, visual modeling, document automation, host and target
programming tools, automated regression testing, continuous and integrated change
management, and feature and defect tracking.
Most of the engineering artifacts are captured in rigorous engineering notations such as UML,
programming languages, or executable machine code. Three engineering artifacts are explicitly
intended for more general review, and they deserve further elaboration.
Vision Document
The vision document provides a complete vision for the software system under development and
supports the contract between the funding authority and the development organization. A project
vision is meant to be changeable as understanding evolves of the requirements, architecture, plans,
and technology. A good vision document should change slowly. Figure 2.11 provides a default
outline for a vision document.
PRAGMATIC ARTIFACTS
People want to review information but don't understand the language of the artifact.
Many interested reviewers of a particular artifact will resist having to learn the engineering
language in which the artifact is written. It is not uncommon to find people (such as veteran
software managers, veteran quality assurance specialists, or an auditing authority from a
regulatory agency) who react as follows: "I'm not going to learn UML, but I want to review
the design of this software, so give me a separate description such as some flowcharts and
text that I can understand."
People want to review the information but don't have access to the tools. It is not very
common for the development organization to be fully tooled; it is extremely rare that
the other stakeholders have any capability to review the engineering artifacts on-line.
Consequently, organizations are forced to exchange paper documents. Standardized formats
(such as UML, spreadsheets, Visual Basic, C++, and Ada 95), visualization tools, and the
Web are rapidly making it economically feasible for all stakeholders to exchange
information electronically.
Human-readable engineering artifacts should use rigorous notations that are complete,
consistent, and used in a self-documenting manner. Properly spelled English words
should be used for all identifiers and descriptions. Acronyms and abbreviations should be
used only where they are well accepted jargon in the context of the component's usage.
Readability should be emphasized and the use of proper English words should be required in
all engineering artifacts. This practice enables understandable representations, browsable
formats (paperless review), more rigorous notations, and reduced error rates. (A small naming
sketch appears at the end of this section.)
Paper is tangible; electronic artifacts are too easy to change. On-line and Web-based
artifacts can be changed easily and are viewed with more skepticism because of their
inherent volatility.
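As a small, invented illustration of the self-documenting guideline above (both functions are hypothetical):

```python
# Cryptic: reviewers must reverse-engineer what cmp_tvl, a, b, and t mean.
def cmp_tvl(a, b, t):
    return abs(a - b) <= t

# Self-documenting: properly spelled words make the artifact reviewable
# without a separate plain-language description.
def temperature_within_tolerance(measured_celsius, expected_celsius,
                                 tolerance_celsius):
    return abs(measured_celsius - expected_celsius) <= tolerance_celsius
```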