
Software Engineering

Software Development Life Cycle:

Definition: The software development life cycle refers to the series of phases that a software product goes through from its
conception to its retirement. It encompasses all activities involved in the development, deployment, maintenance, and eventual
decommissioning of the software.

Waterfall Model:

Description:
The Waterfall model is a linear sequential approach to software development.
It comprises distinct phases: Requirements, Design, Implementation, Testing, Deployment, and Maintenance.
Progression is rigid, with each phase being completed before moving to the next, resembling a cascading waterfall.
Software Development Life Cycle (SDLC):
1. Requirements Phase:
- Detailed gathering and documentation of software requirements from stakeholders.
2. Design Phase:
- Creation of detailed design specifications based on the gathered requirements.
3. Implementation Phase:
- Translation of the design into code, typically involving programming and configuration.
4. Testing Phase:
- Execution of tests to verify that the software meets requirements and specifications.
5. Deployment Phase:
- Release and deployment of the software to the production environment.
6. Maintenance Phase:
- Addressing any issues discovered post-deployment and providing ongoing support and maintenance.

Advantages:
Clear and well-defined phases make it easy to understand and manage.
Provides a structured approach with distinct deliverables for each phase.
Disadvantages:
Less adaptable to changes, as each phase is dependent on the completion of the previous one.
Testing occurs late in the cycle, increasing the risk of identifying issues at later stages.

V Model:

Description:
The V Model is an extension of the Waterfall model with a strong emphasis on testing.
Each development phase has a corresponding testing phase, forming a "V" shape.
Software Development Life Cycle (SDLC):
Similar to the Waterfall model but with testing activities closely aligned with each development phase.
Advantages:
Ensures early focus on testing and validation, reducing the likelihood of defects.
Provides a clear correlation between development and testing phases.
Disadvantages:
Similar to the Waterfall model, less adaptable to changes in requirements.

Incremental Model:

Description:
The Incremental model involves dividing the project into small, manageable increments or iterations.
Each increment adds new functionality or enhances existing features, with each increment building upon the previous one.
Software Development Life Cycle (SDLC):
- Iterative approach where development occurs incrementally, with each iteration going through the phases of requirements, design,
implementation, testing, and deployment.
Advantages:
Enables early delivery of working software, allowing for feedback and validation.
Provides flexibility and adaptability to changes in requirements.
Disadvantages:
Requires careful planning and coordination of increments to ensure seamless integration.

RAD Model (Rapid Application Development):

Description:
The RAD Model emphasizes rapid prototyping and iterative development.
It involves using pre-built components and tools to accelerate development and reduce time to market.
Software Development Life Cycle (SDLC):
- Iterative process involving rapid prototyping, feedback, and refinement, with a focus on user involvement.
Advantages:
Accelerates development through rapid prototyping and iteration.
Encourages user involvement and feedback, leading to greater stakeholder satisfaction.
Disadvantages:
Risk of incomplete or inaccurate requirements capture.
May lead to scope creep without proper management and control.

Agile Model:

Description:
Agile is an iterative and incremental approach to software development, emphasizing collaboration, flexibility, and adaptability to
change.
It values individuals and interactions over processes and tools, with a focus on delivering working software frequently.
Software Development Life Cycle (SDLC):
Iterative and incremental approach with short development cycles called sprints, typically lasting 2-4 weeks.
Continuous feedback and adaptation are key principles, with requirements and solutions evolving through collaboration between
self-organizing, cross-functional teams.
Advantages:
Responds well to changing requirements and priorities.
Promotes customer satisfaction through early and continuous delivery of valuable software.
Disadvantages:
Requires a high level of collaboration and communication between team members.
May be challenging to scale to larger projects and organizations.

Iterative Model:

Description:
The Iterative Model involves repetitive cycles of development, where each cycle adds new functionality or improves existing
features.
Each iteration goes through the phases of requirements, design, implementation, testing, and deployment, with each iteration
building upon the previous one.
Software Development Life Cycle (SDLC):
- Iterative approach where each cycle builds upon the previous one, allowing for incremental development and refinement.
Advantages:
Enables early delivery of working software, allowing for feedback and validation.
Provides flexibility and adaptability to changes in requirements.
Disadvantages:
Requires careful planning and management of iterations to ensure alignment with project goals and objectives.

Prototype Model:

Description:
The Prototype Model involves building a simplified version of the software, known as a prototype, to demonstrate key features
and functionalities.
Prototypes are iteratively refined based on feedback from users and stakeholders, with each iteration adding more detail and
functionality.
Software Development Life Cycle (SDLC):
Involves phases of requirements gathering, prototype development, evaluation, refinement, final implementation, testing, and
deployment.
Advantages:
Early user involvement leads to a better understanding of requirements and stakeholder needs.
Facilitates rapid development and flexibility in accommodating changes and enhancements.
Disadvantages:
Risk of incomplete or inaccurate requirements capture if the prototype does not accurately represent the final product.
Without proper management, prototype iterations can lead to scope creep and project delays.

Spiral Model:

Description:
The Spiral Model is an iterative model that combines elements of both the waterfall and prototype models.
It is divided into cycles, each involving risk analysis, planning, engineering, and evaluation, with progression occurring outward in
a spiral.
Software Development Life Cycle (SDLC):
- Involves iterative cycles of risk analysis, planning, engineering, and evaluation, with each cycle adding functionality and addressing
identified risks.

Advantages:
Incorporates risk management from the outset, allowing for early identification and mitigation of potential issues.
Allows for iterative development and refinement, making it suitable for projects with evolving requirements.
Disadvantages:
The complexity of the model and the need for expertise in risk analysis can make it time-consuming and costly to implement.
Requires a flexible approach to adapt to changes, which may not be suitable for all projects.
Comparative Studies:

Software Design Analysis:

SRS Document:

A Software Requirements Specification (SRS) document is a comprehensive description of the intended behavior and functionality of a
software system. It serves as a contract between the stakeholders (clients, users, developers, testers, etc.) to ensure a common
understanding of the system requirements throughout the software development lifecycle.

Requirements Principles and Analysis:


Requirements Principles:
Completeness: Ensure all necessary requirements are captured, leaving no gaps or ambiguities.
Consistency: Requirements should not conflict with each other or with the project's objectives.
Correctness: Requirements should accurately reflect stakeholders' needs and expectations.
Feasibility: Requirements should be achievable within the project's constraints, including budget, time, and resources.
Flexibility: Requirements should be easy to modify as needs change.
Traceability: Each requirement should be traceable to its source (e.g., stakeholder, regulation) and justification.
Verifiability: Requirements should be testable to ensure that they are met and can be validated against predefined criteria (see the sketch after this list).
Maintainability: The design should be simple enough that other designers can easily maintain it.
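
A minimal sketch of the Verifiability principle in practice, assuming a hypothetical requirement ("95% of login attempts complete in under 2 seconds") and a stand-in measurement helper; none of these names or numbers come from a real project.

# Hypothetical requirement (illustrative only):
#   "The system shall authenticate a user in under 2 seconds
#    for 95% of login attempts."
# Because the requirement states a measurable criterion, it can be
# verified automatically against predefined acceptance criteria.
import random


def measure_login_time() -> float:
    """Stand-in for a real measurement; returns seconds for one login attempt."""
    return random.uniform(0.5, 1.8)  # simulated latencies


def login_requirement_met(samples: int = 100) -> bool:
    """Check the hypothetical criterion: 95% of attempts under 2.0 seconds."""
    timings = [measure_login_time() for _ in range(samples)]
    within_limit = sum(1 for t in timings if t < 2.0)
    return within_limit / samples >= 0.95


if __name__ == "__main__":
    print("Requirement met:", login_requirement_met())
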
Specification Principles:
Clarity: Specifications should be clear and understandable to all stakeholders, avoiding technical jargon or ambiguous
language.
Completeness: All aspects of the system's behavior, functionality, and performance should be covered in the specifications.
Consistency: Specifications should not contain conflicting information or requirements that cannot be fulfilled
simultaneously.
Precision: Specifications should be precise and unambiguous, leaving no room for interpretation or misinterpretation.
Modifiability: Specifications should be flexible and easy to update as requirements change, ensuring that they remain
relevant throughout the project lifecycle.
Representations:
Textual: Written descriptions of system requirements and behavior, often using natural language or structured formats such
as user stories, use cases, or functional requirements documents.
Graphical: Visual representations of system components, interactions, and behaviors using diagrams such as use case
diagrams, activity diagrams, sequence diagrams, and state transition diagrams.
Mathematical: Formal methods and models for specifying system behavior and properties, including mathematical
notations, formal languages, and formal specification languages such as Z, Alloy, and TLA+.
Tabular: Structured tables representing data structures, decision tables, truth tables, and other tabular representations for
specifying system rules, conditions, and constraints.
Software Design Analysis:
Different Levels of DFD Design:
Context Level: Provides an overview of the system's interactions with external entities, showing high-level processes and
data flows.
Level 0: Decomposes the system into major processes and data flows, representing the main functionalities and
interactions.
Lower Levels: Further decompose processes into sub-processes, data flows, and data stores, providing detailed views of
the system's internal structure and behavior.
Physical and Logical DFD:
Physical DFD: Represents the actual implementation of the system, including hardware components, software modules,
and network connections.
Logical DFD: Represents the system's functionality and behavior without considering implementation details, focusing on
processes, data flows, and data stores.
Use and Conversions between Them:
Logical DFDs are used during requirements analysis and system design to define system functionality, interactions, and
data flows.
Physical DFDs are used during implementation and system architecture design to depict the actual system components,
interfaces, and interactions, incorporating implementation details such as hardware, software, and network configurations.
Decision Tables and Trees:
Decision Tables: Represent complex decision-making processes by listing possible conditions, actions, and corresponding
outcomes in a structured tabular format.
Decision Trees: Graphical representations of decision logic, showing decisions, events, and outcomes as nodes and
branches, facilitating visualization and analysis of decision-making processes.
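
The decision-table idea above can be made executable. The sketch below encodes a small, hypothetical loan-approval table as condition/action rules in Python; the conditions, actions, and ordering are assumptions made purely for illustration.

# Minimal decision-table sketch: each rule pairs condition values with the
# action to take when all conditions match. None means "don't care".
RULES = [
    # (credit_ok, income_ok, debt_low) -> action
    ((True,  True,  True),  "approve"),
    ((True,  True,  False), "approve_with_review"),
    ((True,  False, None),  "request_guarantor"),
    ((False, None,  None),  "reject"),
]


def decide(credit_ok: bool, income_ok: bool, debt_low: bool) -> str:
    """Evaluate the table top-down and return the first matching action."""
    facts = (credit_ok, income_ok, debt_low)
    for conditions, action in RULES:
        if all(c is None or c == f for c, f in zip(conditions, facts)):
            return action
    return "no_rule_matched"


if __name__ == "__main__":
    print(decide(credit_ok=True, income_ok=False, debt_low=True))  # request_guarantor

The same rules could be drawn as a decision tree, with each condition becoming a branching node and each action a leaf.
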
Structured Analysis:
A methodology for analyzing and designing systems based on hierarchical decomposition, modularization, and structured
representation of system components, processes, and interactions.
Focuses on dividing the system into smaller, manageable modules or subsystems, defining their interactions, and specifying
their behavior and interfaces.
Types of Designs:
High-Level Designs:
Architectural Design: Defines the overall structure of the system, including components, modules, and their interactions,
providing a blueprint for system implementation.
System Design: Specifies the internal components, modules, and subsystems of the system, their interfaces, and
interactions, addressing functional and non-functional requirements.
Database Design: Describes the structure and organization of the system's data, including data models, schemas, tables,
and relationships, ensuring data integrity, security, and efficiency.
The following activities are performed at this design level:
- Brief description and name of each module.
- Interface relationships and dependencies between modules.
- Database tables identified along with their key elements.
- Complete architecture diagram along with technology details.
Detailed Designs:
Component Design: Specifies the detailed design of individual system components, modules, or classes, including their
attributes, methods, and relationships, addressing functional requirements at a granular level.
Interface Design: Defines the interfaces between system components, modules, or subsystems, including data formats,
protocols, and communication mechanisms, ensuring seamless interaction and integration.
Algorithm Design: Specifies the algorithms and data structures used in the system, addressing computational and
performance requirements, and ensuring efficiency, scalability, and maintainability.
The following activities are performed in the detailed design phase:
- Functional logic of each module.
- Database tables, including field types and sizes.
- Complete details of the interfaces.
- Listing of error messages.
- Complete inputs and outputs for every module.
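
As a brief illustration of detailed design at the interface and component level, the sketch below defines a contract for a hypothetical notification module and one concrete component behind it; the module, class, and method names are assumptions made for the example rather than anything prescribed above.

# Detailed-design sketch for a hypothetical notification module:
# the interface fixes the data format and operations, while the
# concrete component supplies the functional logic.
from abc import ABC, abstractmethod
from dataclasses import dataclass


@dataclass
class Message:
    recipient: str
    subject: str
    body: str


class Notifier(ABC):
    """Interface design: the contract every notifier component must honour."""

    @abstractmethod
    def send(self, message: Message) -> bool:
        """Return True on successful delivery, False otherwise."""


class EmailNotifier(Notifier):
    """Component design: one concrete module implementing the interface."""

    def __init__(self, smtp_host: str) -> None:
        self.smtp_host = smtp_host

    def send(self, message: Message) -> bool:
        # Placeholder for real delivery logic; failures would map to the
        # module's documented error messages.
        print(f"[{self.smtp_host}] to={message.recipient}: {message.subject}")
        return True


if __name__ == "__main__":
    notifier: Notifier = EmailNotifier("smtp.example.com")
    notifier.send(Message("user@example.com", "Build finished", "All tests passed."))
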

Coupling:

Definition: Coupling refers to the degree of interdependence between modules or components in a software system. It measures how
closely connected or dependent one module is on another.
Cohesion:

Definition: Cohesion refers to the degree to which the elements within a module are related to each other. It measures how strongly
the components of a module are bound together.

Relationship between Coupling and Cohesion:

Inverse Relationship: Typically, there is an inverse relationship between coupling and cohesion.
High cohesion and low coupling are desirable in software design as they lead to modular, maintainable, and flexible systems.
High coupling often correlates with low cohesion, indicating a poorly structured or poorly designed system.
Design Goals: The goal of software design is to achieve high cohesion and low coupling simultaneously.
High cohesion ensures that each module has a single, well-defined purpose, making it easier to understand and maintain.
Low coupling ensures that modules are loosely connected, reducing the impact of changes in one module on other modules and
promoting flexibility and scalability.
Types of Coupling:
Data Coupling: Modules communicate by passing data through parameters or shared data structures, with low coupling
achieved when modules share only necessary data.
Control Coupling: Modules communicate by controlling each other's behavior through control flags or parameters, with low
coupling achieved when modules are independent of each other's control flow.
Common Coupling: Modules communicate by accessing shared global data or variables, with low coupling achieved when
modules interact indirectly or minimally with shared data.
Content Coupling: Modules communicate by sharing internal implementation details or data structures, with low coupling
achieved when modules interact only through well-defined interfaces.
Coincidental Coupling: Modules are not related logically but still interact, usually due to poor design or architecture.
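
The difference between common coupling and data coupling can be seen in a short sketch; the shared global, function names, and tax rate below are hypothetical, chosen only to contrast the two styles.

# Common coupling (undesirable): modules interact through shared global state.
TAX_RATE = 0.2  # global variable that several modules read or modify


def gross_to_net_common(gross: float) -> float:
    # Depends on the global; any module that changes TAX_RATE silently
    # changes this function's behaviour.
    return gross * (1 - TAX_RATE)


# Data coupling (preferred): modules exchange only the data they need
# through parameters, staying independent of each other's internals.
def gross_to_net_data(gross: float, tax_rate: float) -> float:
    return gross * (1 - tax_rate)


if __name__ == "__main__":
    print(gross_to_net_common(1000.0))     # tied to shared state
    print(gross_to_net_data(1000.0, 0.2))  # all inputs passed explicitly
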
Types of Cohesion:
Functional Cohesion: Elements within a module perform closely related functions or tasks, with high cohesion achieved when
all elements contribute to a single, well-defined purpose or objective.
Sequential Cohesion: Elements within a module are executed in a sequential order, with high cohesion achieved when
elements are logically related and perform tasks in a sequential manner.
Communicational Cohesion: Elements within a module operate on the same data or share common inputs and outputs, with
high cohesion achieved when elements collaborate to manipulate shared data or achieve common goals.
Procedural Cohesion: Elements within a module are grouped based on a common sequence of actions or operations, with high
cohesion achieved when elements contribute to a specific procedure or process.
Temporal Cohesion: Elements within a module are executed together at the same time or under the same conditions, with high
cohesion achieved when elements are related by a common timeframe or event.
Logical Cohesion: Elements within a module are grouped based on their logical relationship, with high cohesion achieved when
elements perform operations related to the same logical entity or concept.
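
A short sketch of the two cohesion extremes: the first class bundles unrelated operations (low cohesion, roughly coincidental or logical), while the second groups only operations that serve one purpose (functional cohesion). The class and method names are hypothetical.

# Low cohesion: unrelated responsibilities collected into one module.
class Utilities:
    def parse_date(self, text: str):
        ...

    def send_email(self, recipient: str, body: str):
        ...

    def compute_interest(self, principal: float, rate: float) -> float:
        ...


# High (functional) cohesion: every element serves one well-defined purpose.
class InterestCalculator:
    def __init__(self, rate: float) -> None:
        self.rate = rate

    def simple_interest(self, principal: float, years: int) -> float:
        return principal * self.rate * years

    def compound_interest(self, principal: float, years: int) -> float:
        return principal * ((1 + self.rate) ** years - 1)


if __name__ == "__main__":
    calc = InterestCalculator(rate=0.05)
    print(calc.simple_interest(1000.0, 3))    # 150.0
    print(calc.compound_interest(1000.0, 3))  # about 157.63
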

Software Cost Estimation: COCOMO Model

Definition:
COCOMO (Constructive Cost Model) is a widely used algorithmic cost estimation model developed by Barry Boehm.
It provides a systematic approach to estimate effort, duration, and resources required for software development projects.
COCOMO Sub-models:
1. Basic COCOMO:
Estimates effort and duration based on the size of the software.
Calculated using the equation Effort = a × (KLOC)^b, where KLOC is the estimated size of the
software in thousands of lines of code.
Parameters a and b are constants derived from historical data (a worked sketch follows this list of sub-models).
2. Intermediate COCOMO:
Builds upon Basic COCOMO by incorporating additional factors for complexity, personnel capability, and other project
attributes.
Divided into three modes: Organic, Semi-Detached, and Embedded.
Each mode has specific multipliers for effort and duration estimation based on project characteristics.
3. Detailed COCOMO:
Incorporates further factors such as reuse, documentation, and required reliability.
Provides a more detailed estimation based on specific project attributes and parameters.
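
A worked sketch of the Basic COCOMO equations mentioned above, using the commonly published mode constants (for example a = 2.4, b = 1.05, c = 2.5, d = 0.38 for organic projects); treat these as illustrative defaults rather than values calibrated for a specific organization.

# Basic COCOMO sketch: effort (person-months) and duration (months)
# from the estimated size in KLOC, using the commonly published constants.
BASIC_COCOMO = {
    # mode: (a, b, c, d)
    "organic":       (2.4, 1.05, 2.5, 0.38),
    "semi-detached": (3.0, 1.12, 2.5, 0.35),
    "embedded":      (3.6, 1.20, 2.5, 0.32),
}


def basic_cocomo(kloc: float, mode: str = "organic") -> tuple[float, float, float]:
    """Return (effort in person-months, duration in months, average staff size)."""
    a, b, c, d = BASIC_COCOMO[mode]
    effort = a * kloc ** b      # Effort = a * (KLOC)^b
    duration = c * effort ** d  # Duration = c * (Effort)^d
    staff = effort / duration   # average headcount over the project
    return effort, duration, staff


if __name__ == "__main__":
    effort, duration, staff = basic_cocomo(32.0, "organic")
    print(f"Effort:   {effort:.1f} person-months")
    print(f"Duration: {duration:.1f} months")
    print(f"Staff:    {staff:.1f} people on average")

Intermediate and Detailed COCOMO refine the same effort figure with cost-driver multipliers rather than changing the basic form of the equations.
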
Usage:
COCOMO is used during project planning and early stages of software development to estimate:
Effort: Person-months or person-years required to complete the project.
Duration: Time required to complete the project, typically in months.
Resources: Human resources needed for development, including developers, testers, and managers.
Factors Considered:
Size: Estimated size of the software product, often measured in lines of code (LOC) or thousands of lines of code (KLOC).
Complexity: Factors such as the complexity of requirements, architecture, algorithms, and interfaces.
Personnel Capability: Skills, experience, and productivity of the development team.
Product and Project Attributes: Characteristics of the software product and development project, including schedule
constraints, development environment, and required reliability.
Advantages:
Provides a structured and systematic approach to cost estimation.
Helps project managers make informed decisions about budgeting, staffing, and scheduling.
Can be tailored to specific project characteristics and environments.
Limitations:
Relies on historical data and assumptions, which may not always accurately reflect current project conditions.
May not account for all project-specific factors and uncertainties.
Requires expertise and careful calibration to produce accurate estimates.
Implementation:
Various tools and software packages are available to facilitate COCOMO estimation, including both standalone applications and
integrated project management suites.
These tools typically automate the estimation process, allowing project managers to input project parameters and receive
estimates based on COCOMO equations and algorithms.
