UNIT - I
Software and Software Engineering: The Nature of Software, The Unique Nature of
WebApps, Software Engineering, The Software Process, Software Engineering Practice,
Software Myths
Process Models: A Generic Process Model, Process Assessment and Improvement,
Prescriptive Process Models, Specialized Process Models, The Unified Process, Personal and
Team Process Models, Process Technology, Product and Process.
Software is more than just program code. A program is executable code that serves some computational purpose. Software is a collection of executable programming code, associated libraries, and documentation. Software made for a specific requirement is called a software product.
Engineering, on the other hand, is about developing products using well-defined scientific principles and methods.
Software engineering is an engineering branch associated with the development of software products using well-defined scientific principles, methods, and procedures. The outcome of software engineering is an efficient and reliable software product.
Definitions
IEEE defines software engineering as:
"The application of a systematic, disciplined, quantifiable approach to the development, operation, and maintenance of software; that is, the application of engineering to software."
Software plays a dual role: it is a product and, at the same time, a vehicle for delivering a product. Software delivers the most important product of our time: information.
Defining Software
Software is defined as
1. Instructions: Computer programs that when executed provide desired function, features, and performance
2. Data structures: Structures that enable the programs to adequately manipulate information
3. Documentation: Descriptive information, in both hard copy and virtual forms, that describes the operation and use of the programs
Hardware components suffer from the cumulative effects of environmental factors as they age; stated simply, the hardware begins to wear out. Software is not susceptible to the environmental maladies that cause hardware to wear out.
Open Source: Distributing source code for computing applications so customers can make local modifications easily and reliably ("free" source code open to the computing community).
Legacy Software
• Legacy software refers to older programs that were developed decades ago.
• The quality of legacy software is often poor because it has an inextensible design, convoluted code, poor or nonexistent documentation, and test cases and results that were never archived.
As time passes, legacy systems evolve for the following reasons:
• The software must be adapted to meet the needs of a new computing environment or technology.
• The software must be enhanced to implement new business requirements.
• The software must be extended to make it interoperable with more modern systems or databases.
• The software must be re-architected to make it viable within a network environment.
In the early days of the World Wide Web, websites consisted of little more than a set of
linked hypertext files that presented information using text and limited graphics. As time passed,
the augmentation of HTML by development tools (e.g., XML, Java) enabled Web engineers to
provide computing capability along with informational content. Web-based systems and
applications (WebApps) were born. Today, WebApps have evolved into sophisticated computing
tools that not only provide stand-alone function to the end user, but also have been integrated
with corporate databases and business applications.
WebApps are one of a number of distinct software categories. Web-based systems and
applications “involve a mixture between print publishing and software development, between
marketing and computing, between internal communications and external relations, and between
art and technology.”
The following attributes are encountered in the vast majority of WebApps.
Network intensiveness. A WebApp resides on a network and must serve the needs of a
diverse community of clients. The network may enable worldwide access and
communication (i.e., the Internet) or more limited access and communication (e.g., a
corporate Intranet).
Concurrency. A large number of users may access the WebApp at one time. In many
cases, the patterns of usage among end users will vary greatly.
Unpredictable load. The number of users of the WebApp may vary by orders of
magnitude from day to day. One hundred users may show up on Monday; 10,000 may
use the system on Thursday.
Performance. If a WebApp user must wait too long, he or she may decide to go
elsewhere.
Availability. Although an expectation of 100 percent availability is unreasonable, users of popular WebApps often demand access on a 24/7/365 basis.
Data driven. The primary function of many WebApps is to use hypermedia to present
text, graphics, audio, and video content to the end user. In addition, WebApps are
commonly used to access information that exists on databases that are not an integral part
of the Web-based environment (e.g., e-commerce or financial applications).
Content sensitive. The quality and aesthetic nature of content remains an important
determinant of the quality of a WebApp.
Continuous evolution. Unlike conventional application software that evolves over a
series of planned, chronologically spaced releases, Web applications evolve continuously.
Immediacy. Although immediacy—the compelling need to get software to market
quickly—is a characteristic of many application domains, WebApps often exhibit a time-
to-market that can be a matter of a few days or weeks.
Security. Because WebApps are available via network access, it is difficult, if not impossible, to limit the population of end users who may access the application. In order to protect sensitive content and provide secure modes of data transmission, strong security measures must be implemented throughout the infrastructure that supports a WebApp and within the application itself.
Aesthetics. An undeniable part of the appeal of a WebApp is its look and feel. When an
application has been designed to market or sell products or ideas, aesthetics may have as
much to do with success as technical design.
A task focuses on a small, but well-defined objective (e.g., conducting a unit test) that
produces a tangible outcome.
A process framework establishes the foundation for a complete software engineering
process by identifying a small number of framework activities that are applicable to all software
projects, regardless of their size or complexity. In addition, the process framework encompasses
a set of umbrella activities that are applicable across the entire software process.
A generic process framework for software engineering encompasses five activities:
Communication. Before any technical work can commence, it is critically important to communicate and collaborate with the customer. The intent is to understand stakeholders' objectives for the project and to gather requirements that help define software features and functions.
Planning. Any complicated journey can be simplified if a map exists. A software project
is a complicated journey, and the planning activity creates a “map” that helps guide the
team as it makes the journey. The map—called a software project plan—defines the
software engineering work by describing the technical tasks to be conducted, the risks
that are likely, the resources that will be required, the work products to be produced, and
a work schedule.
Modeling. Creation of models to help developers and customers understand the requirements and the software design.
Construction. This activity combines code generation and the testing that is required to
uncover errors in the code.
Deployment. The software is delivered to the customer who evaluates the delivered
product and provides feedback based on the evaluation.
These five generic framework activities can be used during the development of small, simple
programs, the creation of large Web applications, and for the engineering of large, complex
computer-based systems.
Software engineering process framework activities are complemented by a number of Umbrella Activities. In general, umbrella activities are applied throughout a software project and help a software team manage and control progress, quality, change, and risk. Typical umbrella activities include software project tracking and control, risk management, software quality assurance, technical reviews, measurement, software configuration management, reusability management, and work product preparation and production.
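The relationship between framework activities, actions, tasks, and umbrella activities can be pictured as a simple data structure. The sketch below is only an illustration of that structure; the particular actions and task breakdowns shown are hypothetical examples, not a prescribed standard.

```python
# Illustrative sketch (not from the text): modeling the generic process
# framework as simple data structures. Names and task breakdowns are
# hypothetical examples.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Action:
    name: str
    tasks: List[str] = field(default_factory=list)  # small, well-defined objectives

@dataclass
class FrameworkActivity:
    name: str
    actions: List[Action] = field(default_factory=list)

# The five generic framework activities, each refined into actions and tasks.
framework = [
    FrameworkActivity("communication", [Action("requirements gathering",
                                               ["identify stakeholders", "elicit requirements"])]),
    FrameworkActivity("planning", [Action("project planning",
                                          ["estimate effort", "define schedule", "assess risks"])]),
    FrameworkActivity("modeling", [Action("analysis", ["build requirements model"]),
                                   Action("design", ["build design model"])]),
    FrameworkActivity("construction", [Action("coding", ["implement components"]),
                                       Action("testing", ["conduct unit tests"])]),
    FrameworkActivity("deployment", [Action("delivery", ["release increment", "collect feedback"])]),
]

# Umbrella activities apply across the whole process rather than to one activity.
umbrella_activities = ["project tracking and control", "risk management",
                       "software quality assurance", "technical reviews",
                       "software configuration management", "measurement"]

for activity in framework:
    print(activity.name, "->", [a.name for a in activity.actions])
```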
Always specify, design, and implement knowing someone else will have to understand what you
are doing.
The Fifth Principle: Be Open to the Future
A system with a long lifetime has more value. Never design yourself into a corner. Before
beginning a software project, be sure the software has a business purpose and that users
perceive value in it.
The Sixth Principle: Plan Ahead for Reuse
Reuse saves time and effort. Planning ahead for reuse reduces the cost and increases the value
of both the reusable components and the systems into which they are incorporated.
The Seventh principle: Think!
Placing clear, complete thought before action almost always produces better results. When you
think about something, you are more likely to do it right.
Software Myths
Software myths (beliefs about software and the process used to build it) can be traced to the earliest days of computing. Myths have a number of attributes that make them insidious: they appear to be reasonable statements of fact, they have an intuitive feel, and they are often promulgated by experienced practitioners who "know the score".
Management Myths:
Managers with software responsibility, like managers in most disciplines, are often under
pressure to maintain budgets, keep schedules from slipping, and improve quality. Like a
drowning person who grasps at a straw, a software manager often grasps at belief in a software
myth.
Myth: We already have a book that's full of standards and procedures for building software. Won't that provide my people with everything they need to know?
Reality:
• The book of standards may very well exist, but is it used?
• Are software practitioners aware of its existence?
• Does it reflect modern software engineering practice?
• Is it complete?
• Is it adaptable?
Practitioner's Myths:
Myths that are still believed by software practitioners have been fostered by 50 years of
programming culture. During the early days of software, programming was viewed as an art
form. Old ways and attitudes die hard.
Myth: Once we write the program and get it to work, our job is done.
Reality: Someone once said that "the sooner you begin 'writing code', the longer it'll take you to
get done.” Industry data indicate that between 60 and 80 percent of all effort expended on
software will be expended after it is delivered to the customer for the first time.
Myth: Until I get the program "running" I have no way of assessing its quality.
Reality: One of the most effective software quality assurance mechanisms can be applied from
the inception of a project—the formal technical review. Software reviews are a "quality filter"
that have been found to be more effective than testing for finding certain classes of software
defects.
Myth: The only deliverable work product for a successful project is the working program.
Reality: A working program is only one part of a software configuration that includes many
elements. Documentation provides a foundation for successful engineering and, more important,
guidance for software support.
Myth: Software engineering will make us create voluminous and unnecessary documentation
and will invariably slow us down.
Reality: Software engineering is not about creating documents. It is about creating quality. Better
quality leads to reduced rework. And reduced rework results in faster delivery times. Many
software professionals recognize the fallacy of the myths just described. Regrettably, habitual
attitudes and methods foster poor management and technical practices, even when reality dictates
a better approach. Recognition of software realities is the first step toward formulation of
practical solutions for software engineering.
PROCESS MODELS
In addition, a set of umbrella activities (project tracking and control, risk management, quality assurance, configuration management, technical reviews, and others) is applied throughout the process.
This aspect is called process flow. It describes how the framework activities and the actions and tasks that occur within each framework activity are organized with respect to sequence and time, as illustrated in the following figure.
Figure: A generic process framework for software engineering.
A linear process flow executes each of the five framework activities in sequence, beginning with communication and culminating with deployment.
An iterative process flow repeats one or more of the activities before proceeding to the next. An
evolutionary process flow executes the activities in a “circular” manner. Each circuit through the
five activities leads to a more complete version of the software. A parallel process flow executes
one or more activities in parallel with other activities (e.g., modeling for one aspect of the
software might be executed in parallel with construction of another aspect of the software).
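As an illustration of these different process flows, the following sketch executes the five framework activities linearly, iteratively, and in an evolutionary (circular) manner. The helper functions and the trivial perform callback are assumptions made for the example, not part of any defined process.

```python
# Illustrative sketch (assumed, not from the text): contrasting process flows
# over the five generic framework activities.
ACTIVITIES = ["communication", "planning", "modeling", "construction", "deployment"]

def linear_flow(perform):
    """Execute each activity once, in sequence."""
    for activity in ACTIVITIES:
        perform(activity)

def iterative_flow(perform, needs_rework, max_revisits=3):
    """Repeat an activity (up to a limit) before proceeding to the next."""
    for activity in ACTIVITIES:
        perform(activity)
        revisits = 0
        while needs_rework(activity) and revisits < max_revisits:
            perform(activity)          # revisit the same activity
            revisits += 1

def evolutionary_flow(perform, circuits):
    """Execute the activities in a 'circular' manner; each circuit yields a more complete version."""
    for version in range(1, circuits + 1):
        for activity in ACTIVITIES:
            perform(activity)
        print(f"increment {version} delivered")

# Example usage: a trivial 'perform' callback that just logs the activity.
evolutionary_flow(lambda a: print("performing", a), circuits=2)
```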
Defining a Framework Activity
A software team would need significantly more information before it could properly execute any
one of these activities as part of the software process. Therefore, you are faced with a key
question: What actions are appropriate for a framework activity, given the nature of the problem
to be solved, the characteristics of the people doing the work, and the stakeholders who are
sponsoring the project?
Identifying a Task Set
Different projects demand different task sets. The software team chooses the task set
based on problem and project characteristics. A task set defines the actual work to be done to
accomplish the objectives of a software engineering action.
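For example, a task set for a requirements-gathering action might be chosen differently for a small project than for a large, complex one. The sketch below is a hypothetical illustration; the specific tasks listed are assumptions, not a mandated checklist.

```python
# Illustrative sketch (hypothetical): two task sets for the same software
# engineering action, chosen according to problem and project characteristics.
TASK_SETS = {
    "requirements gathering": {
        "small, simple project": [
            "make a list of stakeholders",
            "discuss requirements informally",
            "develop a brief written statement of scope",
        ],
        "large, complex project": [
            "make a list of stakeholders",
            "conduct facilitated requirements-gathering meetings",
            "build preliminary use cases",
            "review requirements with stakeholders and revise",
        ],
    }
}

def choose_task_set(action, project_profile):
    """Select the task set for an action based on project characteristics."""
    return TASK_SETS[action][project_profile]

print(choose_task_set("requirements gathering", "small, simple project"))
```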
Process Patterns
A process pattern describes a process-related problem that is encountered during
software engineering work, identifies the environment in which the problem has been
encountered, and suggests one or more proven solutions to the problem. Stated in more general
terms, a process pattern provides you with a template —a consistent method for describing
problem solutions within the context of the software process.
Patterns can be defined at any level of abstraction. A pattern might be used to describe a problem (and solution) associated with a complete process model (e.g., prototyping). In other situations, patterns can be used to describe a problem (and solution) associated with a framework activity (e.g., planning) or an action within a framework activity (e.g., project estimating).
Ambler has proposed a template for describing a process pattern:
Pattern Name. The pattern is given a meaningful name describing it within the context of the
software process (e.g., TechnicalReviews).
Forces. The environment in which the pattern is encountered and the issues that make the
problem visible and may affect its solution.
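A pattern described with this template can be recorded as a simple structured entry. The sketch below shows only the two fields listed above; the forces text is a hypothetical example, and the pattern name TechnicalReviews is taken from the template description.

```python
# Illustrative sketch (assumed): recording a process pattern using the template
# fields named above. The forces description is a hypothetical example.
process_pattern = {
    "pattern_name": "TechnicalReviews",          # name given in the text's example
    "forces": ("Defects found late in the process are costly to repair; "
               "the team needs a way to surface problems in work products early."),
}

print(process_pattern["pattern_name"])
```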
Process patterns provide an effective mechanism for addressing problems associated with
any software process. The patterns enable you to develop a hierarchical process
description that begins at a high level of abstraction (a phase pattern).
PROCESS ASSESSMENT AND IMPROVEMENT
Figure: Software process assessment and improvement. The software process is examined by software process assessment, which identifies the capabilities and risk of the process and leads to capability determination; capability determination motivates software process improvement, which in turn identifies modifications to the software process.
PRESCRIPTIVE PROCESS MODELS
Prescriptive process models were originally proposed to bring order to the chaos of software development. They define a prescribed set of process elements and a predictable process work flow. They are called "prescriptive" because they prescribe a set of process elements (framework activities, software engineering actions, tasks, work products, quality assurance, and change control mechanisms) for each project.
The Waterfall Model
The waterfall model, sometimes called the classic life cycle, suggests a systematic,
sequential approach to software development that begins with customer specification of
requirements and progresses through planning, modeling, construction, and deployment.
A variation in the representation of the waterfall model is called the V-model, represented in the following figure. The V-model depicts the relationship of quality assurance actions to the actions associated with communication, modeling, and early construction activities.
As a software team moves down the left side of the V, basic problem requirements are
refined into progressively more detailed and technical representations of the problem and its
solution. Once code has been generated, the team moves up the right side of the V, essentially
performing a series of tests that validate each of the models created as the team moved down the
left side. The V-model provides a way of visualizing how verification and validation actions are
applied to earlier engineering work.
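One common way to picture this pairing is a mapping from left-side engineering work to the right-side tests that validate it. The text does not enumerate the pairs, so the mapping in the sketch below is an assumed example rather than a definitive list.

```python
# Illustrative sketch (assumed): one common pairing of left-side work with the
# right-side test level that validates it in a V-model.
V_MODEL_PAIRS = [
    ("requirements modeling",   "acceptance testing"),
    ("architectural design",    "system / integration testing"),
    ("component-level design",  "unit testing"),
]

# Moving down the left side refines the problem; moving up the right side
# validates each earlier representation once code exists.
for left, right in reversed(V_MODEL_PAIRS):
    print(f"{right:28s} validates {left}")
```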
The waterfall model is the oldest paradigm for software engineering. The problems that
are sometimes encountered when the waterfall model is applied are:
1. Real projects rarely follow the sequential flow that the model proposes. Although the
linear model can accommodate iteration, it does so indirectly. As a result, changes
can cause confusion as the project team proceeds.
2. It is often difficult for the customer to state all requirements explicitly. The waterfall
model requires this and has difficulty accommodating the natural uncertainty that
exists at the beginning of many projects.
3. The customer must have patience. A working version of the program(s) will not be
available until late in the project time span.
This model is suitable when a limited number of new development efforts are undertaken and when requirements are well defined and reasonably stable.
Incremental Process Models
The incremental model delivers a series of releases, called increments, that provide
progressively more functionality for the customer as each increment is delivered.
The incremental model combines elements of the linear and parallel process flows discussed earlier. It applies linear sequences in a staggered fashion as calendar time progresses. Each linear sequence produces deliverable "increments" of the software in a manner that is similar to the increments produced by an evolutionary process flow.
For example, word-processing software developed using the incremental paradigm might
deliver basic file management, editing, and document production functions in the first increment;
more sophisticated editing and document production capabilities in the second increment;
spelling and grammar checking in the third increment; and advanced page layout capability in
the fourth increment.
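The word-processing example can be written down as a simple increment plan, with each entry corresponding to one linear sequence that yields a deliverable increment. The sketch below is only an illustration of that plan; the structure and function names are assumptions.

```python
# Illustrative sketch: the word-processing example above expressed as an
# increment plan. The feature groupings come from the example in the text.
INCREMENT_PLAN = [
    {"increment": 1, "features": ["basic file management", "editing", "document production"]},
    {"increment": 2, "features": ["more sophisticated editing", "advanced document production"]},
    {"increment": 3, "features": ["spelling and grammar checking"]},
    {"increment": 4, "features": ["advanced page layout"]},
]

def deliver(plan):
    """Deliver increments in order; the first increment is the core product."""
    for entry in plan:
        label = "core product" if entry["increment"] == 1 else f"increment {entry['increment']}"
        print(f"{label}: {', '.join(entry['features'])}")
        # customer feedback on this increment would refine the plan for the next one

deliver(INCREMENT_PLAN)
```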
When an incremental model is used, the first increment is often a core product. That is, basic
requirements are addressed but many supplementary features remain undelivered. The core
product is used by the customer. As a result of use and/or evaluation, a plan is developed for the
next increment. The plan addresses the modification of the core product to better meet the needs
of the customer and the delivery of additional features and functionality. This process is repeated
following the delivery of each increment, until the complete product is produced.
Incremental development is particularly useful when staffing is unavailable for a
complete implementation by the business deadline that has been established for the project. Early
increments can be implemented with fewer people. If the core product is well received, then
additional staff (if required) can be added to implement the next increment. In addition,
increments can be planned to manage technical risks.
Concurrent Models
All software engineering activities exist concurrently but reside in different states. Concurrent modeling defines a series of events that will trigger transitions from state to state for each of the software engineering activities, actions, or tasks. For example, an inconsistency uncovered in the requirements model during early design generates the event analysis model correction, which triggers the requirements analysis action from the done state into the awaiting changes state.
Concurrent modeling is applicable to all types of software development and provides an
accurate picture of the current state of a project. Each activity, action, or task on the network
exists simultaneously with other activities, actions, or tasks. Events generated at one point in the
process network trigger transitions among the states.
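In code, this behaves like a small event-driven state machine. The sketch below models only the states and the single event mentioned above (analysis model correction moving requirements analysis from done to awaiting changes); the class and its interface are assumptions made for illustration.

```python
# Illustrative sketch (assumed): a tiny event-driven state machine in the
# spirit of concurrent modeling.
class SoftwareEngineeringAction:
    def __init__(self, name, state="under development"):
        self.name = name
        self.state = state

    def on_event(self, event):
        # An inconsistency discovered elsewhere in the process generates the
        # event "analysis model correction", which moves the requirements
        # analysis action from "done" back to "awaiting changes".
        if event == "analysis model correction" and self.state == "done":
            self.state = "awaiting changes"
        return self.state

requirements_analysis = SoftwareEngineeringAction("requirements analysis", state="done")
print(requirements_analysis.on_event("analysis model correction"))  # -> awaiting changes
```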
THE UNIFIED PROCESS
The Unified Process (UP) is organized into five phases:
• Inception
• Elaboration
• Construction
• Transition
• Production
The inception phase of the UP encompasses both customer communication and planning
activities. By collaborating with stakeholders, business requirements for the software are
identified; a rough architecture for the system is proposed; and a plan for the iterative,
incremental nature of the ensuing project is developed.
The elaboration phase encompasses the communication and modeling activities of the
generic process model. Elaboration refines and expands the preliminary use cases that were
developed as part of the inception phase and expands the architectural representation to include
five different views of the software—the use case model, the requirements model, the design
model, the implementation model, and the deployment model. Elaboration creates an
“executable architectural baseline” that represents a “first cut” executable system.
The construction phase of the UP is identical to the construction activity defined for the
generic software process. Using the architectural model as input, the construction phase develops
or acquires the software components that will make each use case operational for end users. To
accomplish this, requirements and design models that were started during the elaboration phase
are completed to reflect the final version of the software increment. All necessary and required
features and functions for the software increment (i.e., the release) are then implemented in
source code.
The transition phase of the UP encompasses the latter stages of the generic construction
activity and the first part of the generic deployment (delivery and feedback) activity. Software is
given to end users for beta testing and user feedback reports both defects and necessary
changes. At the conclusion of the transition phase, the software increment becomes a usable
software release.
The production phase of the UP coincides with the deployment activity of the generic
process. During this phase, the ongoing use of the software is monitored, support for the
operating environment (infrastructure) is provided, and defect reports and requests for changes
are submitted and evaluated. It is likely that at the same time the construction, transition, and
production phases are being conducted, work may have already begun on the next software
increment. This means that the five UP phases do not occur in a sequence, but rather with
staggered concurrency.
PROCESS TECHNOLOGY
Process technology tools allow a software organization to build an
automated model of the process framework, task sets, and umbrella activities. The
model, normally represented as a network, can then be analyzed to determine typical
workflow and examine alternative process structures that might lead to reduced
development time or cost.
Once an acceptable process has been created, other process technology tools
can be used to allocate, monitor, and even control all software engineering activities,
actions, and tasks defined as part of the process model. Each member of a software
team can use such tools to develop a checklist of work tasks to be performed, work
products to be produced, and quality assurance activities to be conducted. The
process technology tool can also be used to coordinate the use of other software
engineering tools that are appropriate for a particular work task.
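A minimal sketch of this idea, assuming a process model represented as a simple network of activities with attached task lists, is shown below. Real process technology tools are far richer; the data and function names here are purely illustrative.

```python
# Illustrative sketch (hypothetical): a process model as a simple network,
# from which a workflow and per-activity checklists can be derived.
PROCESS_NETWORK = {
    "communication": ["planning"],
    "planning":      ["modeling"],
    "modeling":      ["construction"],
    "construction":  ["deployment"],
    "deployment":    [],
}

TASKS = {
    "modeling":     ["build analysis model", "review analysis model", "build design model"],
    "construction": ["implement components", "conduct unit tests"],
}

def workflow(start="communication"):
    """Walk the network to list the activities in their defined order."""
    order, node = [], start
    while node:
        order.append(node)
        successors = PROCESS_NETWORK[node]
        node = successors[0] if successors else None
    return order

def checklist(activity):
    """Derive a work-task checklist for one activity."""
    return TASKS.get(activity, [])

print(workflow())
print(checklist("construction"))
```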