Software Engineering Exam Notes

The document provides an overview of various Software Development Life Cycle (SDLC) models, including RAD, Spiral, V-Model, Incremental, Big Bang, Prototype, and Waterfall, detailing their definitions, phases, advantages, and disadvantages. It also covers cost estimation methods like COCOMO, risk management processes, software metrics, project scheduling techniques, software quality standards, design approaches, testing types, and concepts of software reuse and domain analysis. Additionally, it contrasts modern and traditional project methodologies, emphasizing the shift towards iterative and flexible practices.


Software Engineering Exam Notes (20 Marks)

Software Development Life Cycle (SDLC) Models

1. RAD Model (Rapid Application Development)

o Definition: A software development methodology that prioritizes rapid prototyping and iterative delivery to quickly produce a working system, emphasizing user involvement through frequent feedback loops to refine requirements and ensure the final product meets user needs efficiently.

o Phases: Business Modeling, Data Modeling, Process Modeling, Application Generation, Testing.

o Advantages: Quick development, user feedback.

o Disadvantages: Requires skilled developers, not suitable for large projects.

o Diagram: RAD Model Diagram

2. Spiral Model

o Definition: A risk-driven software development model that integrates iterative development with systematic risk assessment, progressing through cycles (or spirals) that include planning, risk analysis, engineering, and evaluation, allowing for incremental refinement and adaptability to changes.

o Phases: Planning, Risk Analysis, Engineering, Evaluation.

o Advantages: Risk management, flexibility.

o Disadvantages: Complex, costly.

o Diagram: Spiral Model Diagram

3. V-Model

o Definition: A sequential development model, also known as the Verification and Validation model, where each development phase (e.g., requirements, design, coding) is directly associated with a corresponding testing phase, ensuring that testing is planned and executed alongside development to catch defects early.

o Phases: Requirements → System Design → Coding → Testing (Verification & Validation).

o Advantages: Structured, early defect detection.

o Disadvantages: Rigid, no iteration.

o Diagram: V-Model Diagram

4. Incremental Model

o Definition: A development approach where the software is built and delivered in small, manageable increments, with each increment adding new functionality to the previous build, allowing for partial system delivery and iterative refinement based on user feedback.
o Advantages: Early delivery, flexibility.

o Disadvantages: Requires good planning.

o Diagram: Incremental Model Diagram

5. Big Bang Model

o Definition: A simplistic development approach where little to no planning or formal process is followed, and development begins immediately with all resources thrown into coding, often resulting in a system that is built in one go with minimal structure or documentation.

o Advantages: Simple, good for small projects.

o Disadvantages: High risk, poor structure.

6. Prototype Model

o Definition: A development methodology where a working prototype (a simplified version of the system) is created early to visualize and refine user requirements, allowing stakeholders to provide feedback and make adjustments before the final system is developed.

o Advantages: User involvement, reduces risk.

o Disadvantages: May lead to scope creep.

o Diagram: Prototype Model Diagram

7. Waterfall Model

o Definition: A traditional, linear, and sequential development model where each phase (e.g., requirements gathering, system design, implementation, testing, and maintenance) is completed fully before moving to the next, with no overlap or iteration between phases, ensuring a structured but rigid process.

o Phases: Requirements, Design, Implementation, Testing, Maintenance.

o Advantages: Simple, well-documented.

o Disadvantages: Inflexible, late testing.

o Diagram: Waterfall Model Diagram

Cost Estimation

8. COCOMO Model (Constructive Cost Model)

o Definition: An empirical cost estimation model developed by Barry Boehm, used to predict the effort, cost, and schedule of software projects by analyzing factors like project size (in KLOC, thousands of lines of code), complexity, and team experience, offering three levels of detail: Basic, Intermediate, and Detailed.

o Types: Basic, Intermediate, Detailed.

o Formula: Effort = a × (KLOC)^b × EAF person-months, where EAF is the Effort Adjustment Factor (EAF = 1 in Basic COCOMO; the coefficients a and b depend on the project type: organic, semi-detached, or embedded).
o Advantages: Quantitative, scalable.

o Disadvantages: Requires accurate inputs.

o Diagram: COCOMO Model Overview
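As a worked sketch of the formula above, using the standard Basic COCOMO organic-mode coefficients (a = 2.4, b = 1.05) with EAF = 1; the 32 KLOC project size is an invented example:

```python
# Basic COCOMO effort estimate: Effort = a * (KLOC)^b * EAF person-months.
def cocomo_effort(kloc, a=2.4, b=1.05, eaf=1.0):
    """Return estimated effort in person-months (organic-mode defaults)."""
    return a * (kloc ** b) * eaf

# Example: an organic-mode project of 32 KLOC.
effort = cocomo_effort(32)
print(f"Estimated effort: {effort:.1f} person-months")  # ~91.3
```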

Risk Management

9. Risk Management

o Definition: A systematic process in software engineering that involves identifying potential risks (e.g., technical, schedule, or resource-related issues), analyzing their impact and likelihood, planning mitigation strategies, and continuously monitoring them to ensure project success and minimize disruptions.

o Steps: Risk Identification, Risk Analysis, Risk Planning, Risk Monitoring.

o Example: Technical risks (e.g., technology failure), Schedule risks (e.g., delays).

Software Metrics

10. Software Metrics

o Definition: Quantitative measurements used to evaluate various aspects of a software product, process, or project, such as size, complexity, quality, or performance, enabling developers and managers to make informed decisions, track progress, and improve development practices.

o Types:

▪ Product Metrics: Lines of Code (LOC), Cyclomatic Complexity.

▪ Process Metrics: Defect Rate, Development Time.

▪ Project Metrics: Cost, Effort.
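Cyclomatic complexity, listed above as a product metric, is computed as V(G) = E − N + 2P for a control-flow graph with E edges, N nodes, and P connected components. A minimal sketch (the if/else control-flow graph below is an invented example):

```python
# Cyclomatic complexity V(G) = E - N + 2P of a control-flow graph.
def cyclomatic_complexity(edges, nodes, components=1):
    return len(edges) - len(nodes) + 2 * components

# Invented CFG of a single if/else: entry -> test -> (then | else) -> exit.
nodes = ["entry", "test", "then", "else", "exit"]
edges = [("entry", "test"), ("test", "then"), ("test", "else"),
         ("then", "exit"), ("else", "exit")]
print(cyclomatic_complexity(edges, nodes))  # 5 - 5 + 2 = 2
```

A value of 2 matches the intuition: one decision point gives two independent paths through the code.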

Project Scheduling

11. Project Scheduling

o Definition: The process of breaking down a software project into individual tasks,
determining their dependencies, estimating their duration, assigning resources, and
creating a timeline to ensure timely completion, often using tools like Gantt charts or
PERT charts to visualize progress.

o Tools: Gantt Chart, PERT Chart, CPM (Critical Path Method).

o PERT Chart Example (from sample paper):

▪ Activities: A (12 days), B (8 days), C (4 days), D (6 days), E (7 days), F (10 days).

▪ Dependencies: A → B, C; B → D; D → F; C → E; E → F.

▪ Critical Path: A → B → D → F (12 + 8 + 6 + 10 = 36 days).

▪ Diagram: PERT Chart Example
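The critical-path result above can be reproduced with a small longest-path computation over the activity network (durations and dependencies exactly as listed):

```python
from functools import lru_cache

# Activity durations and predecessor lists from the PERT example above.
durations = {"A": 12, "B": 8, "C": 4, "D": 6, "E": 7, "F": 10}
predecessors = {"A": [], "B": ["A"], "C": ["A"],
                "D": ["B"], "E": ["C"], "F": ["D", "E"]}

@lru_cache(maxsize=None)
def earliest_finish(activity):
    """Earliest finish = own duration + latest earliest-finish of predecessors."""
    preds = predecessors[activity]
    start = max((earliest_finish(p) for p in preds), default=0)
    return durations[activity] + start

project_length = max(earliest_finish(a) for a in durations)
print(project_length)  # 36 days, via the critical path A -> B -> D -> F
```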


Software Quality

12. Software Quality

o Definition: The extent to which a software product meets specified requirements and user expectations, encompassing attributes like reliability (consistency of performance), usability (ease of use), maintainability (ease of modification), and efficiency, often measured against standards like ISO 9126.

o Standards: ISO 9126, McCall’s Quality Model.

o Techniques: Reviews, Testing, Quality Assurance.

Software Design Approaches

13. Coupling

o Definition: A measure of the degree of interdependence between software modules, where low coupling (e.g., data coupling, where modules share minimal data) is preferred to reduce the impact of changes in one module on others, enhancing modularity and maintainability.

o Types: Low (Data Coupling) to High (Content Coupling).

o Goal: Minimize coupling.

14. Cohesion

o Definition: A measure of how closely the elements within a single module are
related to each other in terms of functionality, where high cohesion (e.g., functional
cohesion, where all elements contribute to a single task) is desired to ensure the
module is focused and easier to maintain.

o Types: High (Functional Cohesion) to Low (Coincidental Cohesion).

o Goal: Maximize cohesion.
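A minimal illustration of both goals above (the payroll functions are invented for the example): the module is functionally cohesive, since every function serves the single task of computing net pay, and data-coupled to its callers, since they pass only the plain values needed rather than shared state or whole objects.

```python
# High cohesion: all functions serve the single task of computing net pay.
# Low (data) coupling: callers pass plain values, not shared state.

def gross_pay(hours, rate):
    return hours * rate

def tax(amount, tax_rate):
    return amount * tax_rate

def net_pay(hours, rate, tax_rate):
    g = gross_pay(hours, rate)
    return g - tax(g, tax_rate)

print(net_pay(40, 25.0, 0.2))  # 1000.0 gross - 200.0 tax = 800.0
```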

15. Object-Oriented Design (OOD)

o Definition: A design paradigm that organizes software around objects and classes,
leveraging concepts like inheritance (reusing code through parent-child
relationships), polymorphism (allowing objects to be treated as instances of their
parent class), and encapsulation (hiding data within objects) to create modular and
scalable systems.

o Advantages: Reusability, scalability.
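A short sketch of the three concepts above (the Shape classes are invented for illustration): inheritance from a common base class, encapsulation of each shape's data behind a conventionally private attribute, and polymorphism through a shared area() interface.

```python
import math

class Shape:                      # base class (inheritance)
    def area(self):
        raise NotImplementedError

class Circle(Shape):
    def __init__(self, radius):
        self._radius = radius     # encapsulation: internal state kept private

    def area(self):
        return math.pi * self._radius ** 2

class Square(Shape):
    def __init__(self, side):
        self._side = side

    def area(self):
        return self._side ** 2

# Polymorphism: different objects handled uniformly through the Shape interface.
shapes = [Circle(1.0), Square(2.0)]
print([round(s.area(), 2) for s in shapes])  # [3.14, 4.0]
```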

16. Function-Oriented Design

o Definition: A design approach that structures software around functions or procedures, breaking down the system into a set of functional units that perform specific tasks, often using a top-down approach to decompose the system into smaller, manageable functions.

o Advantages: Simplicity, clarity.
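The top-down decomposition described above can be sketched as a hierarchy of functions (the report-generation example is invented): one top-level function delegating to smaller functional units, each performing a specific task.

```python
# Top-down functional decomposition: generate_report is split into
# three smaller functions, each with one specific responsibility.

def read_data(records):
    return [r for r in records if r is not None]   # drop missing entries

def summarize(data):
    return {"count": len(data), "total": sum(data)}

def format_report(summary):
    return f"count={summary['count']} total={summary['total']}"

def generate_report(records):                      # top-level function
    return format_report(summarize(read_data(records)))

print(generate_report([3, None, 4, 5]))  # count=3 total=12
```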

17. User-Interface Oriented Design


o Definition: A design methodology that focuses on creating intuitive and user-friendly
interfaces, prioritizing user experience by ensuring the interface is consistent
(uniform design), responsive (quick feedback), and simple (easy to navigate), often
involving user feedback during design iterations.

o Principles: Consistency, feedback, simplicity.

Software Testing

18. Testing Types

o White Box Testing:

▪ Definition: A testing approach where the tester has full knowledge of the
internal structure and code of the software, allowing them to design test
cases based on the code’s logic, often used to test paths, loops, and
branches within the program.

▪ Example: Code-based testing.

o Black Box Testing:

▪ Definition: A testing approach where the tester focuses solely on the software’s inputs and outputs without knowledge of its internal code or structure, ensuring the system meets functional requirements as specified, often using techniques like boundary value analysis.

▪ Example: Functional testing.

o Gray Box Testing:

▪ Definition: A hybrid testing approach where the tester has partial knowledge
of the internal structure, combining elements of both white box and black
box testing, often used to test APIs or databases where some structural
understanding is beneficial.

▪ Example: API testing.

o Unit Testing:

▪ Definition: A testing method that focuses on verifying the functionality of individual components or modules of a software system in isolation, typically performed by developers to ensure each unit works as intended before integration.

o Integration Testing:

▪ Definition: A testing method that verifies the interactions between integrated modules or components, ensuring they work together correctly and identifying issues in their interfaces or data flow, often following unit testing.

o System Testing:

▪ Definition: A testing method that evaluates the entire software system as a whole, ensuring all components work together to meet the specified requirements, typically performed in an environment that mimics the production setting.

o Acceptance Testing:

▪ Definition: A testing method conducted to determine whether the software meets the end-user requirements and is ready for deployment, often involving real-world scenarios and user feedback to validate the system’s functionality and usability.

o Diagram: Testing Levels
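As a sketch of the unit-testing level described above (the leap-year function is an invented example), a developer verifies a single module in isolation using Python's built-in unittest framework:

```python
import unittest

def is_leap_year(year):
    """Unit under test: divisible by 4, except centuries not divisible by 400."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

class TestIsLeapYear(unittest.TestCase):
    def test_typical_leap_year(self):
        self.assertTrue(is_leap_year(2024))

    def test_century_not_leap(self):
        self.assertFalse(is_leap_year(1900))

    def test_400_year_is_leap(self):
        self.assertTrue(is_leap_year(2000))

# Run the test case without exiting the interpreter.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestIsLeapYear)
unittest.TextTestRunner(verbosity=0).run(suite)
```

Each test exercises the unit in isolation, including the boundary cases (1900, 2000) that black box techniques such as boundary value analysis would also select.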

Software Reuse, Domain Analysis, Component Classification

19. Software Reuse

o Definition: The practice of leveraging existing software components, such as libraries, frameworks, or modules, in new projects to reduce development time, improve quality, and ensure consistency, often requiring careful integration and adaptation to fit new requirements.

o Advantages: Saves time, improves quality.

20. Domain Analysis

o Definition: A process in software engineering that involves studying a specific application domain to identify and document common features, functionalities, and requirements, enabling the creation of reusable components tailored to that domain, often used in product-line engineering.

o Steps: Define domain, analyze requirements, model domain.

21. Component Classification

o Definition: The process of organizing software components into categories based on their characteristics, such as functionality (what they do), structure (how they are built), or behavior (how they interact), to facilitate reuse and management in large-scale systems.

o Types: Functional, Structural, Behavioral.

Additional Notes

• Exploratory Style:

o Definition: A software development approach where the system is built iteratively by exploring and refining requirements through experimentation and user feedback, often associated with Agile methodologies that prioritize adaptability over rigid planning.

• DFDs (Data Flow Diagrams):

o Definition: A graphical representation of how data flows through a system, showing processes (transformations), data stores (storage), external entities (interactors), and data flows (movement), used to model and analyze the system’s data processing at various levels of detail.
o Levels: Level 0 (Context Diagram), Level 1 (Detailed Processes).

o Symbols: Process (Circle), Data Store (Open Rectangle), External Entity (Square), Data
Flow (Arrow).

o Diagram: DFD Example

• Modern vs. Traditional Projects:

o Definition: Modern projects adopt iterative and flexible methodologies like Agile and
DevOps, emphasizing collaboration and adaptability, while traditional projects follow
structured, sequential approaches like Waterfall, focusing on detailed planning and
documentation.
