Explain the Waterfall Model in detail along with its advantages and disadvantages.
Winston Royce introduced the Waterfall Model in 1970. This model has five phases:
requirements analysis and specification; design; implementation and unit testing; integration
and system testing; and operation and maintenance. The phases always follow in this order and
do not overlap. The developer must complete every phase before the next phase begins. The
model is named the "Waterfall Model" because its diagrammatic representation resembles a
cascade of waterfalls.
1. Requirements analysis and specification phase: The aim of this phase is to understand the
exact requirements of the customer and to document them properly. Both the customer and the
software developer work together to document all the functional, performance, and
interfacing requirements of the software. It describes the "what" of the system to be produced,
not the "how." In this phase, a large document called the Software Requirement Specification
(SRS) document is created, which contains a detailed description of what the system will do in
common language.
2. Design phase: This phase aims to transform the requirements gathered in the SRS into a
form suitable for coding in a programming language. It defines the overall software
architecture together with the high-level and detailed design. All this work is documented
as a Software Design Document (SDD).
3. Implementation and unit testing: During this phase, the design is implemented. If the SDD is
complete, the implementation or coding phase proceeds smoothly, because all the information
needed by the software developers is contained in the SDD.
During unit testing, the code is thoroughly examined and modified. Small modules are first
tested in isolation. After that, these modules are tested by writing some overhead (driver and
stub) code to check the interaction between the modules and the flow of intermediate outputs.
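As an illustration of testing a small module in isolation, the following sketch uses Python's standard unittest library; the add and divide functions are hypothetical examples standing in for a real module, not part of any particular system.

import unittest

# A small, hypothetical module under test.
def add(a, b):
    return a + b

def divide(a, b):
    if b == 0:
        raise ValueError("division by zero")
    return a / b

class CalculatorUnitTests(unittest.TestCase):
    # Each test exercises one small piece of behaviour in isolation.
    def test_add(self):
        self.assertEqual(add(2, 3), 5)

    def test_divide(self):
        self.assertAlmostEqual(divide(10, 4), 2.5)

    def test_divide_by_zero_is_rejected(self):
        with self.assertRaises(ValueError):
            divide(1, 0)

if __name__ == "__main__":
    unittest.main()

Such "overhead" test code is thrown away or kept as a regression suite; its only purpose is to exercise the module before integration.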
4. Integration and system testing: This phase is highly crucial, as the quality of the end
product is determined by the effectiveness of the testing carried out. Better testing leads
to satisfied customers, lower maintenance costs, and accurate results. Unit testing determines
the correctness of individual modules; in this phase, the modules are tested for their
interactions with each other and with the system as a whole.
5. Operation and maintenance phase: Maintenance is the activity performed once the software
has been delivered to the customer, installed, and made operational.
When to use the SDLC Waterfall Model?
Some circumstances where the use of the Waterfall Model is most suited are:
o When the requirements are constant and do not change regularly.
o The project is short.
o The situation is calm.
o Where the tools and technology used are consistent and not changing.
o When the resources are well prepared and available to use.
Advantages of the Waterfall Model
o This model is simple to implement, and the number of resources required for it is
minimal.
o The requirements are simple and explicitly declared; they remain unchanged during the
entire project development.
o The start and end points for each phase are fixed, which makes it easy to track progress.
o The release date for the complete product, as well as its final cost, can be determined
before development.
o It gives easy control and clarity to the customer due to a strict reporting system.
Disadvantages of the Waterfall Model
o In this model, the risk factor is higher, so this model is not suitable for large and
complex projects.
o This model cannot accommodate changes in requirements during development.
o It becomes tough to go back to a previous phase. For example, if the application has
moved on to the coding phase and there is a change in requirements, it becomes tough
to go back and change it.
o Since testing is done at a later stage, it does not allow identifying the challenges and
risks in the earlier phases, so a risk-reduction strategy is difficult to prepare.
OR
Explain in detail about the evolving role of software.
EVOLVING ROLE OF SOFTWARE:
Software plays a dual role. It is both a product and a vehicle for delivering a product.
As a product: It delivers the computing potential embodied by computer hardware or by a
network of computers.
As a vehicle: It acts as an information transformer, producing, managing, acquiring, modifying,
displaying, or transmitting information that can be as simple as a single bit or as complex as a
multimedia presentation. Software delivers the most important product of our time: information.
It transforms personal data.
It manages business information to enhance competitiveness.
It provides a gateway to worldwide information networks.
It provides the means for acquiring information.
The role of computer software has undergone significant change over a span of little more than
50 years:
Dramatic improvements in hardware performance
Vast increases in memory and storage capacity
A wide variety of exotic input and output options
1970s and 1980s:
Osborne characterized a "new industrial revolution".
Toffler called the advent of microelectronics part of "the third wave of change" in human
history.
Naisbitt predicted the transformation from an industrial society to an "information society".
Feigenbaum and McCorduck suggested that information and knowledge would be the focal
point for power in the twenty-first century.
Stoll argued that the "electronic community" created by networks and software was the key to
knowledge interchange throughout the world.
As the 1990s began:
Toffler described a "power shift" in which old power structures disintegrate as computers and
software lead to a "democratization of knowledge".
Yourdon worried that U.S. companies might lose their competitive edge in software-related
business and predicted "the decline and fall of the American programmer".
Hammer and Champy argued that information technologies were to play a pivotal role in the
"reengineering of the corporation".
Mid-1990s:
The pervasiveness of computers and software spawned a rash of books by neo-Luddites.
Later 1990s:
Yourdon reevaluated the prospects of the software professional and suggested "the rise and
resurrection" of the American programmer.
The impact of the Y2K "time bomb" was felt at the end of the 20th century.
As the 2000s progressed:
Johnson discussed the power of "emergence", a phenomenon that explains what happens when
interconnections among relatively simple entities result in a system that "self-organizes to form
more intelligent, more adaptive behavior".
Yourdon revisited the tragic events of 9/11 to discuss the continuing impact of global terrorism
on the IT community.
Wolfram presented a treatise on a "new kind of science" that posits a unifying theory
based primarily on sophisticated software simulations.
Daconta and his colleagues discussed the evolution of "the semantic web".
Today a huge software industry has become a dominant factor in the economies of the
industrialized world.
Explain the Prototyping model with a neat diagram along with advantages and disadvantages.
PROTOTYPING:
Context:
If a customer defines a set of general objectives for the software but does not identify detailed
input, processing, or output requirements, the prototyping paradigm is the best approach in
such a situation.
If a developer is unsure of the efficiency of an algorithm or the adaptability of an operating
system, he can also go for the prototyping method.
Advantages:
The prototyping paradigm assists the software engineer and the customer in better
understanding what is to be built when requirements are fuzzy.
The prototype serves as a mechanism for identifying software requirements. If a working
prototype is built, the developer attempts to make use of existing program fragments or applies
tools.
OR
Explain the software process framework.
A PROCESS FRAMEWORK:
A software process must be established for the effective delivery of software engineering
technology.
A process framework establishes the foundation for a complete software process by identifying
a small number of framework activities that are applicable to all software projects, regardless
of their size or complexity.
The process framework also encompasses a set of umbrella activities that are applicable across
the entire software process.
Each framework activity is populated by a set of software engineering actions.
Each software engineering action is represented by a number of different task sets, each a
collection of software engineering work tasks, related work products, quality assurance points,
and project milestones.
In brief:
"A process defines who is doing what, when, and how to reach a certain goal."
A process framework establishes the foundation for a complete software process: it identifies a
small number of framework activities that apply to all software projects, regardless of size or
complexity, together with a set of umbrella activities applicable across the entire software
process. Each framework activity has a set of software engineering actions, and each software
engineering action (e.g., design) has a collection of related tasks (called task sets): work tasks,
work products (deliverables), quality assurance points, and project milestones.
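The containment described above (framework activities populated by actions, each carried out through task sets of work tasks, work products, quality assurance points, and milestones) can be pictured as a simple data structure. The Python sketch below is only an illustration of that hierarchy; the class and example names are generic, not a standard notation.

from dataclasses import dataclass, field
from typing import List

@dataclass
class TaskSet:
    work_tasks: List[str]
    work_products: List[str]   # deliverables
    qa_points: List[str]       # quality assurance points
    milestones: List[str]      # project milestones

@dataclass
class EngineeringAction:
    name: str                  # e.g., "design"
    task_sets: List[TaskSet] = field(default_factory=list)

@dataclass
class FrameworkActivity:
    name: str                  # e.g., "modeling"
    actions: List[EngineeringAction] = field(default_factory=list)

# Example: one framework activity populated by one action and one task set.
modeling = FrameworkActivity(
    name="modeling",
    actions=[EngineeringAction(
        name="design",
        task_sets=[TaskSet(
            work_tasks=["define architecture"],
            work_products=["Software Design Document"],
            qa_points=["design review"],
            milestones=["design approved"],
        )],
    )],
)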
Discuss the principal requirements engineering activities and their relationships.
Requirements engineering is a critical phase in software development where the needs and
expectations of stakeholders are gathered, analyzed, documented, and managed. The principal
activities in requirements engineering can be broken down into several key stages, each with its
own set of tasks and relationships:
1. Elicitation:
Definition: Elicitation involves gathering requirements from stakeholders,
including end-users, customers, domain experts, and other relevant parties.
Activities:
Conducting interviews
Holding workshops
Observing users in their environment
Analyzing existing documentation and systems
Relationships: Elicitation activities are closely related to analysis and validation.
The information gathered during elicitation serves as the foundation for analyzing
requirements and validating them against stakeholders' needs and expectations.
2. Analysis:
Definition: Analysis involves examining and understanding the collected
requirements to identify inconsistencies, conflicts, and ambiguities.
Activities:
Identifying stakeholders' goals and constraints
Analyzing functional and non-functional requirements
Prioritizing requirements based on importance and feasibility
Relationships: Analysis activities are closely linked to elicitation and validation.
The analysis helps to refine and clarify the requirements collected during
elicitation, making them more precise and actionable. Additionally, the analysis
informs the validation process by providing insights into the completeness and
correctness of the requirements.
3. Specification:
Definition: Specification involves documenting the requirements in a clear,
concise, and unambiguous manner using appropriate techniques and tools.
Activities:
Writing requirement documents
Creating use cases, user stories, or user scenarios
Modeling requirements using diagrams such as UML diagrams
Relationships: Specification activities are closely related to elicitation, analysis,
and validation. The specification documents serve as a communication tool
between stakeholders and development teams, ensuring a shared understanding of
the project's requirements. The specification is informed by the information
gathered during elicitation and refined through analysis. Additionally, the
specification is validated to ensure its accuracy and completeness.
4. Validation:
Definition: Validation involves ensuring that the specified requirements meet
stakeholders' needs and expectations and are consistent, complete, and feasible.
Activities:
Reviewing requirement documents with stakeholders
Conducting prototyping and user feedback sessions
Performing walkthroughs and inspections
Using techniques such as requirement tracing and coverage analysis
Relationships: Validation activities are closely linked to elicitation, analysis, and
specification. Validation ensures that the requirements accurately capture
stakeholders' needs identified during elicitation, are logically consistent and
complete, as determined through analysis, and are clearly documented in the
specification. Validation helps to uncover any discrepancies or misunderstandings
early in the development process, reducing the risk of costly errors and rework
later on.
5. Management:
Definition: Requirement management involves tracking, prioritizing, and
controlling changes to requirements throughout the software development
lifecycle.
Activities:
Establishing a baseline of requirements
Managing changes to requirements through a formal change control process
Communicating changes and updates to stakeholders
Tracing requirements to design, implementation, and testing artifacts (see the
traceability sketch after this section)
Relationships: Requirement management activities are interwoven throughout the
entire requirements engineering process. Effective management ensures that
requirements remain relevant, consistent, and aligned with project goals and
constraints. Requirement management activities support and facilitate elicitation,
analysis, specification, and validation by providing a framework for organizing and
tracking requirements and their associated artifacts.
These principal activities in requirements engineering are iterative and interactive, with
feedback loops between them to ensure that the requirements are continuously refined and
validated throughout the software development lifecycle. Effective coordination and
collaboration among stakeholders, analysts, developers, and testers are essential for successful
requirements engineering.
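To illustrate the requirement tracing activity mentioned under Management, the sketch below keeps a simple traceability mapping from requirement IDs to design, code, and test artifacts. The IDs and artifact names are hypothetical; real projects usually rely on a requirements management tool rather than a hand-built dictionary.

# Hypothetical traceability matrix: requirement ID -> downstream artifacts.
traceability = {
    "REQ-001": {"design": ["SDD 3.1"], "code": ["login.py"], "tests": ["TC-12"]},
    "REQ-002": {"design": ["SDD 4.2"], "code": ["report.py"], "tests": ["TC-20", "TC-21"]},
}

def untested_requirements(matrix):
    # A requirement with no linked test case signals a validation gap.
    return [req for req, links in matrix.items() if not links.get("tests")]

print(untested_requirements(traceability))  # [] for the data above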
OR
Explain how a software requirements document (SRS) is structured?
Requirements elicitation is the process of gathering, identifying, and understanding the needs
and expectations of stakeholders regarding a software system. It involves various techniques
and methods to extract relevant information about what the software should accomplish and
how it should behave. Here's a discussion on requirements elicitation and analysis in
requirements engineering:
2. Elicitation Techniques: There are several techniques for gathering requirements, such as
interviews, surveys, questionnaires, brainstorming sessions, workshops, observations, and
prototyping. Each technique has its strengths and weaknesses, and the choice depends on
factors such as project size, complexity, and stakeholder availability.
4. Analysis and Prioritization: After requirements are gathered, they need to be analyzed to
ensure clarity, consistency, and feasibility. This involves identifying conflicts, ambiguities,
or missing requirements and resolving them through further communication with
stakeholders. Additionally, requirements need to be prioritized based on their importance,
urgency, and impact on the project's success.
5. Validation and Verification: Validation ensures that the requirements accurately represent
stakeholders' needs and expectations. Verification ensures that the requirements are
complete, consistent, and feasible. Techniques such as reviews, walkthroughs, and
prototyping are used to validate and verify requirements.
The typical design process involves several key stages, each contributing to achieving design
quality. Here are the key stages and their contributions:
1. Requirements Analysis: In this stage, the design team thoroughly analyzes the
requirements gathered during the requirements engineering phase. By understanding the
needs and constraints of stakeholders, the team can ensure that the design addresses all
essential aspects of the software system. Clear requirements analysis helps prevent
misunderstandings and ensures that the design aligns with stakeholders' expectations, thus
contributing to design quality.
2. System Architecture Design: This stage involves defining the overall structure and
organization of the software system. The design team identifies major components, their
relationships, and the interfaces between them. A well-designed architecture provides a solid
foundation for the system, ensuring scalability, maintainability, and flexibility. It also helps
manage complexity, making the system easier to understand and modify, thus contributing to
design quality.
3. Detailed Design: In this stage, the design team elaborates on the system architecture by
specifying detailed designs for individual components, modules, and subsystems. This
includes defining data structures, algorithms, interfaces, and behaviors. Clear and detailed
designs help ensure that each component functions as intended and interacts correctly with
other parts of the system. It also facilitates implementation and testing, leading to higher
design quality.
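As a small example of what a detailed design pins down (data structures, interfaces, and expected behaviour), the Python sketch below specifies a hypothetical component; the names are illustrative only and not tied to any particular system.

from dataclasses import dataclass
from typing import List, Optional, Protocol

@dataclass
class Order:
    # Data structure fixed at detailed-design time.
    order_id: str
    items: List[str]
    total: float

class OrderRepository(Protocol):
    # Interface that other components depend on; the storage technology is left open.
    def save(self, order: Order) -> None: ...
    def find(self, order_id: str) -> Optional[Order]: ...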
5. Evaluation and Validation: This stage involves evaluating the design against specified
requirements and quality criteria. Techniques such as reviews, inspections, simulations, and
testing are used to identify defects, inconsistencies, or deviations from requirements.
Validation ensures that the design meets stakeholders' needs, performs as expected, and
complies with quality standards. By detecting and addressing issues early, validation
contributes to improving design quality and reducing rework in later stages.
By following these key stages in the design process and ensuring attention to detail,
collaboration, and validation, design teams can achieve higher design quality, leading to
software systems that meet stakeholders' needs, perform reliably, and are adaptable to
changing requirements and environments.
OR
Discuss the design concept of software engineering in detail.
In software engineering, the design phase plays a crucial role in transforming requirements
into a blueprint for building a software system. The design concept encompasses various
principles, methodologies, and practices aimed at creating a well-structured, maintainable,
and scalable software solution. Here's a detailed discussion on the design concept in software
engineering:
2. Abstraction and Decomposition: Design involves breaking down the system into
manageable components, modules, and subsystems. This process of decomposition helps
manage complexity, enhance modularity, and promote reusability. Abstraction techniques,
such as encapsulation, inheritance, and polymorphism, enable designers to hide
implementation details and focus on essential aspects of the system's architecture.
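A minimal sketch of abstraction and polymorphism in Python: the abstract base class hides how each payment method works, and callers rely only on the common interface. The class and method names are hypothetical examples.

from abc import ABC, abstractmethod

class PaymentMethod(ABC):
    @abstractmethod
    def pay(self, amount: float) -> str:
        """Process a payment and return a confirmation message."""

class CardPayment(PaymentMethod):
    def pay(self, amount: float) -> str:
        return f"Charged {amount} to card"      # implementation detail hidden from callers

class WalletPayment(PaymentMethod):
    def pay(self, amount: float) -> str:
        return f"Deducted {amount} from wallet"

def checkout(method: PaymentMethod, amount: float) -> str:
    # Polymorphism: checkout works with any PaymentMethod subclass.
    return method.pay(amount)

print(checkout(CardPayment(), 99.0))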
3. Architectural Design: Architectural design defines the overall structure and organization of
the software system. Designers identify major components, their relationships, and the
interfaces between them. Common architectural styles include layered architecture, client-
server architecture, microservices architecture, and component-based architecture. The
architectural design provides a blueprint for the system's construction, ensuring scalability,
maintainability, and flexibility.
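The layered style named above can be sketched as three thin layers in which each layer talks only to the one directly below it. This is an illustrative Python outline under that assumption, not a prescribed structure.

class DataAccessLayer:
    def fetch_user(self, user_id: int) -> dict:
        # In a real system this would query a database.
        return {"id": user_id, "name": "example"}

class BusinessLayer:
    def __init__(self, data: DataAccessLayer):
        self.data = data

    def greeting_for(self, user_id: int) -> str:
        user = self.data.fetch_user(user_id)    # business logic uses the layer below
        return f"Hello, {user['name']}"

class PresentationLayer:
    def __init__(self, business: BusinessLayer):
        self.business = business

    def render(self, user_id: int) -> None:
        print(self.business.greeting_for(user_id))  # UI talks only to the business layer

PresentationLayer(BusinessLayer(DataAccessLayer())).render(1)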
4. Modularization and Encapsulation: Modularization involves dividing the system into
separate, independent modules or units, each responsible for specific functionalities.
Encapsulation ensures that each module encapsulates its data and behavior, exposing only
necessary interfaces to other modules. Modular design promotes code reuse, enhances
maintainability, and facilitates parallel development by enabling teams to work on different
modules simultaneously.
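A minimal sketch of encapsulation: the module keeps its data internal and exposes only the operations other modules need. Python signals privacy by convention with a leading underscore; the class and method names here are hypothetical.

class BankAccount:
    def __init__(self, opening_balance: float = 0.0):
        self._balance = opening_balance         # internal state, not accessed directly

    def deposit(self, amount: float) -> None:
        if amount <= 0:
            raise ValueError("deposit must be positive")
        self._balance += amount

    def balance(self) -> float:
        # Only this public interface is visible to other modules.
        return self._balance

acct = BankAccount()
acct.deposit(50)
print(acct.balance())   # 50.0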
5. Design Patterns: Design patterns are proven solutions to common design problems
encountered during software development. They provide reusable templates and best practices
for designing software systems. Examples of design patterns include creational patterns (e.g.,
Factory Method, Singleton), structural patterns (e.g., Adapter, Decorator), and behavioral
patterns (e.g., Observer, Strategy). Applying design patterns helps improve the quality,
flexibility, and extensibility of the design.
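As one concrete case, a minimal sketch of the Observer pattern in Python: observers register with a subject and are notified when an event occurs. The class names are generic illustrations, not taken from any particular library.

class Subject:
    def __init__(self):
        self._observers = []

    def attach(self, observer):
        self._observers.append(observer)

    def notify(self, event: str) -> None:
        for observer in self._observers:
            observer.update(event)              # push the event to every registered observer

class Logger:
    def update(self, event: str) -> None:
        print(f"logged: {event}")

class EmailAlert:
    def update(self, event: str) -> None:
        print(f"emailing about: {event}")

subject = Subject()
subject.attach(Logger())
subject.attach(EmailAlert())
subject.notify("order placed")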
7. Iterative and Incremental Design: Design is an iterative and incremental process that
evolves over multiple iterations. Each iteration builds upon the previous one, incorporating
feedback, refining the design, and addressing emerging requirements or changes. Iterative
design enables designers to validate design decisions early, mitigate risks, and adapt to
evolving project needs effectively.