Unit 3

Modularity

Modularity is a fundamental concept in software engineering, referring to the division of a software system into discrete components, or modules, each with well-defined responsibilities and interfaces. This division aids in managing the complexity of software systems, making them more understandable, maintainable, and scalable.

Module Coupling

Module coupling is a measure of the degree of interdependence between modules in a software system. It indicates how closely connected two modules are and how much one module relies on another. There are several types of module coupling, ranging from low coupling (desirable) to high coupling (undesirable).

Types of Coupling

1. Data Coupling: Modules communicate by passing data. This is the most desirable
type of coupling. It minimizes dependencies between modules.
Example: A module that calculates the area of a circle receives the radius as input and
returns the area as output. It only depends on the data (radius) passed to it.
2. Stamp Coupling: A complete data structure is passed between modules, possibly
including unnecessary data. It's less desirable than data coupling.
Example: A module that manages student records receives a complete student object, including name, address, and courses. If it only needs the student ID to perform its task, this is stamp coupling (contrast this with data coupling in the sketch after this list).
3. Control Coupling: Modules communicate by passing control information, such as
flags. This type of coupling is less desirable as it introduces dependencies based on
control flow.
Example: A module that processes user input and another module that validates the
input both rely on a shared flag to determine if the input is valid. This introduces
control coupling.
4. External Coupling: Modules depend on external entities, such as other software or
hardware. This type of coupling is less desirable as it makes the system less self-
contained.
Example: A module that reads data from a file relies on the file system, an external
entity. Any changes to the file system may impact the module, indicating external
coupling.
5. Common Coupling: Modules share global data. This is less desirable as it can lead to
issues with managing the shared data.
Example: Two modules share a global variable to communicate status information. If
one module modifies the variable, it affects the behavior of the other module,
demonstrating common coupling.
6. Content Coupling: One module directly modifies or accesses the contents of another
module. This is the most undesirable type of coupling as it creates strong
dependencies between modules.
Example: A module responsible for formatting text directly accesses the internal data
structures of another module that stores text. This tightly couples the formatting
module to the internal structure of the text storage module.
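
To make the contrast concrete, here is a minimal Python sketch of data coupling versus stamp coupling; the Student record and its fields are hypothetical and introduced only for this illustration:

from dataclasses import dataclass
import math


@dataclass
class Student:                     # hypothetical record used only for this illustration
    student_id: str
    name: str
    address: str
    courses: list


# Data coupling: the module receives only the elementary data item it needs.
def circle_area(radius: float) -> float:
    return math.pi * radius ** 2


# Stamp coupling: the whole Student structure is passed in,
# even though only the student ID is actually used.
def print_id_card(student: Student) -> None:
    print(f"ID card for student {student.student_id}")


print(circle_area(2.0))            # depends only on the radius value passed to it
print_id_card(Student("S001", "Asha", "12 Main St", ["SE", "DBMS"]))
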
Module Cohesion

Module cohesion is a measure of the degree to which the elements of a module are functionally related. A strongly cohesive module implements functionality that is related to one feature of the solution and requires little or no interaction with other modules.

Types of Cohesion

1. Functional Cohesion: The elements of the module are functionally related and
contribute to a single well-defined task or objective. This is the most desirable type of
cohesion. Example: A module that calculates the area of a circle, given its radius, is
functionally cohesive (see the sketch after this list).
2. Sequential Cohesion: The output of one element becomes the input of another
element within the same module. This type of cohesion is based on the order of
execution. Example: A module that calculates the GPA of a student first calculates the
grade points for each subject, and then calculates the overall GPA.
3. Communicational Cohesion: The elements of the module operate on the same input
data or contribute to the same output data. This type of cohesion is based on data
communication. Example: A module that calculates both the current and cumulative
GPA of a student based on their grade records.
4. Procedural Cohesion: The elements of the module are related by a specific sequence
of execution. This type of cohesion is based on the order in which tasks are
performed. Example: A module that first calculates the student's GPA, then prints the
student's record, followed by calculating the cumulative GPA.
5. Temporal Cohesion: The elements of the module must be executed at the same time.
This type of cohesion is based on timing requirements. Example: A module that
performs initialization tasks, such as setting program counters or control flags
associated with programs.
6. Logical Cohesion: The elements of the module perform logically similar operations,
but there is no significant relationship between them. This type of cohesion is based
on logical classification. Example: A module that contains separate code for checking
the validity of each date in an input transaction.
7. Coincidental Cohesion: The elements of the module have no meaningful relationship
and are grouped together arbitrarily. This is the least desirable type of cohesion.
Example: A module that checks the validity of a date and prints a message, where
these tasks have no logical connection.
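
As a rough illustration of the two most desirable kinds of cohesion, the sketch below assumes a hypothetical grading scale and subject credits; the exact names and values are illustrative only:

import math


# Functional cohesion: every statement contributes to one well-defined task.
def circle_area(radius: float) -> float:
    return math.pi * radius ** 2


# Sequential cohesion: the output of the first step (grade points per subject)
# becomes the input of the next step (the overall GPA).
def compute_gpa(grades, credits):
    points = {"A": 10, "B": 8, "C": 6, "D": 4, "F": 0}   # hypothetical grading scale
    grade_points = {sub: points[g] * credits[sub] for sub, g in grades.items()}
    return sum(grade_points.values()) / sum(credits.values())


print(circle_area(1.0))
print(compute_gpa({"SE": "A", "DBMS": "B"}, {"SE": 4, "DBMS": 3}))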

System Design Approaches

There are several strategies or techniques for performing system design, including the bottom-up approach, the top-down approach, and the hybrid approach.

1. Bottom-Up Design: This approach involves identifying modules required by many programs and collecting them in a "library." Modules are combined to provide larger ones, forming a hierarchy. Starting coding soon after designing a module allows for earlier testing and validation, but it requires intuition to decide exactly what functionality a module should provide.
2. Top-Down Design: In this approach, the specification is viewed as describing a black
box for the program. The designer decides how the internals of the black box are
constructed from smaller black boxes, and those inner black boxes are specified. This
process is repeated until the black boxes can be coded directly. This approach is
suitable when specifications are clear and development is from scratch.
3. Hybrid Design: Pure top-down or pure bottom-up approaches are often not practical. A hybrid approach combines elements of both. Some bottom-up work is usually essential for a top-down approach to succeed: it permits common sub-modules, intuition is easier to apply near the bottom of the hierarchy, and existing modules can be reused.

Function Oriented Design (FOD) is an approach to software design that focuses on decomposing the system into a set of interacting units, each with a clearly defined function. This approach is advocated by Niklaus Wirth, the creator of Pascal and other languages, who emphasizes stepwise refinement, a top-down design method.

In FOD, the design starts with a high-level description of the program's functionality. Each
step then refines one part of this description, specifying in greater detail what that part does.
While this method works well for small programs, its value for large programs is debatable.

The main challenge with FOD for large programs is understanding what the program does overall. For example, determining the function of UNIX, an airline reservation system, or a Scheme interpreter is complex and context-dependent. This can lead to highly artificial descriptions of reality in the design.

For instance, consider a Scheme interpreter. A top-level function might be:

While (not finished):
    Read an expression from the terminal;
    Evaluate the expression;
    Print the value;

This leads to a division of the interpreter into "read," "evaluate," and "print" modules.
However, each module needs to know about the different types of objects (integer, real, list,
etc.) that it manipulates. This tight coupling can lead to difficulties in maintenance and
extension.
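
A minimal sketch of this problem, assuming a hypothetical print_value function from the "print" module: because the module must enumerate every object type, adding a new type forces changes in it (and in the read and evaluate modules as well):

# Hypothetical "print" module: it must know about every object type
# (integer, real, list, ...) that the interpreter supports.
def print_value(value) -> None:
    if isinstance(value, int):
        print(f"integer: {value}")
    elif isinstance(value, float):
        print(f"real: {value}")
    elif isinstance(value, list):
        print("(", " ".join(str(v) for v in value), ")")
    else:
        # A new object type forces an edit here and in the other modules too.
        raise TypeError("unknown object type")


print_value(42)
print_value([1, 2.5, 3])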

As the refinement continues, the program's structure is represented as a tree of refinements, reflecting the top-down design. However, this can result in highly specialized modules, with each module used by at most one other module, its parent. For reusability, it is preferable to have modules used by several others, forming a more reusable structure.

While FOD does not strictly require top-down development, it often results in a function-oriented program structure. To delay decisions about the system's overall functionality, it may be better to structure the program around the data it manipulates rather than the actions it takes.

Object-Oriented Design (OOD) is a fundamental concept in software development, aiming to model real-world entities as objects with attributes and behaviors. Here is a breakdown of the key points (a small code sketch follows the list):

1. Object-Oriented Focus: OOD focuses on the data (objects) and their interactions,
rather than just the functions performed by the program.
2. Objects and Classes:
o Objects: Represent individual entities with a state (attributes) and behaviors.
o Classes: Group objects with similar characteristics, defining their structure
and behavior.
3. Messages: Objects communicate by passing messages, which contain the identity of
the target object and the requested operation.
4. Abstraction: Managing complexity by highlighting essential details and hiding
irrelevant ones. It allows for different levels of understanding (e.g., driving a car vs.
designing a car engine).
5. Inheritance: Allows new classes (subclasses) to inherit attributes and behaviors from
existing classes (superclasses), promoting code reuse and hierarchical organization.
6. Polymorphism: Objects can be treated as instances of their superclass, allowing for
more generic and flexible code.
7. Encapsulation: Also known as information hiding, it separates the external aspects of
an object from its internal implementation, protecting the object's state and
implementation details.
8. Hierarchy: Organizing classes into hierarchies based on similarities and differences,
enabling better organization and understanding of complex systems.
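
As a small sketch of these ideas, the Shape, Circle, and Square classes below are hypothetical examples chosen only to show inheritance, polymorphism, encapsulation, and message passing:

import math


class Shape:                         # superclass: defines the common interface
    def area(self) -> float:
        raise NotImplementedError


class Circle(Shape):                 # inheritance: Circle is a kind of Shape
    def __init__(self, radius: float) -> None:
        self._radius = radius        # encapsulation: state hidden behind the object's interface

    def area(self) -> float:         # the "area" message handled by Circle's own behavior
        return math.pi * self._radius ** 2


class Square(Shape):
    def __init__(self, side: float) -> None:
        self._side = side

    def area(self) -> float:
        return self._side ** 2


# Polymorphism: each object is treated as a Shape, yet responds in its own way.
for shape in (Circle(2.0), Square(3.0)):
    print(shape.area())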

Software risk management is a critical process in software development that aims to identify, assess, and mitigate potential risks that could impact the success of a project.

Software projects can be derailed by a variety of risks, which can be identified through
brainstorming or past project data. Managing these risks requires a combination of experience
and knowledge of current software engineering and management practices. Here are some
common risk factors identified by Capers Jones:

1. Dependencies: Risks can arise from dependencies on external factors or agencies, such as the availability of trained personnel, inter-component dependencies, customer-provided items or information, and relationships with subcontractors. Controlling these external dependencies can be challenging.
2. Requirement Issues: Uncertainty and changes in requirements can pose significant
risks to a project. Failure to resolve these issues can result in building the wrong
product or building the right product poorly. Factors include lack of clear product
vision, agreement on requirements, prioritization of requirements, and inadequate
impact analysis of requirement changes.
3. Management Issues: Project managers play a key role in identifying and managing
risks. Inadequate planning, lack of visibility into project status, unclear project
ownership and decision-making, unrealistic commitments, unrealistic expectations
from managers or customers, personality conflicts, and poor communication can all
contribute to project risks.
4. Lack of Knowledge: The rapid pace of technological change and turnover of skilled
staff can lead to knowledge gaps within project teams. Recognizing these gaps early
and taking preventive actions, such as training, hiring consultants, and assembling the
right team, can mitigate these risks. Factors include inadequate training, poor
understanding of methods and tools, lack of application domain experience, new
technologies, and ineffective or poorly documented processes.

Risk Management Activities

Risk Assessment

• Risk Identification: Identify potential risks using brainstorming, historical data, checklists, and expert judgment.
• Risk Analysis: Analyze identified risks to understand their impact and likelihood of occurrence, prioritizing them based on severity.
• Risk Prioritization: Prioritize risks based on probability and potential impact so that attention is focused on the most critical risks (a small prioritization sketch follows this list).

Risk Control

• Risk Management Planning: Develop a plan outlining how risks will be managed, including strategies for avoiding, mitigating, transferring, or accepting risks.
• Risk Monitoring: Continuously monitor risks to ensure the effectiveness of the risk management plan and to identify any new risks.
• Risk Resolution: Execute the risk management plan if a risk occurs, implementing contingency plans or corrective actions to minimize its impact.
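
One common way to combine probability and impact during risk prioritization is a risk exposure score (probability × impact). The sketch below uses purely illustrative risks and figures, not values prescribed by any particular method:

# Risk exposure = probability of occurrence x potential impact (loss).
risks = [
    {"name": "Key developer leaves", "probability": 0.3, "impact": 8},
    {"name": "Requirements change late", "probability": 0.6, "impact": 5},
    {"name": "Subcontractor delay", "probability": 0.2, "impact": 9},
]

for risk in risks:
    risk["exposure"] = risk["probability"] * risk["impact"]

# Highest exposure first, so attention goes to the most critical risks.
for risk in sorted(risks, key=lambda r: r["exposure"], reverse=True):
    print(f'{risk["name"]}: exposure = {risk["exposure"]:.2f}')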

The Constructive Cost Model (COCOMO) is a widely used hierarchy of software cost estimation models. It was popularized by B. W. Boehm's book "Software Engineering Economics" in 1981. COCOMO consists of basic, intermediate, and detailed sub-models.
Basic Model
The basic model provides a quick and rough estimate for small to medium-sized software
projects. It considers three modes of software development: organic, semi-detached, and
embedded.
• Organic Mode: A small team of experienced developers working in a familiar environment. Project size ranges from small (a few KLOC) to medium (a few tens of KLOC).
• Semi-Detached Mode: An intermediate mode between organic and embedded. Project size ranges from small to very large (a few hundreds of KLOC).
• Embedded Mode: Tight constraints, unique problems, and difficulty in finding experienced developers. Project size is typically over 300 KLOC.
Equations
The basic COCOMO equations are:

    Effort = a × (KLOC)^b  person-months
    Tdev = c × (Effort)^d  months

where KLOC is the estimated size in thousands of delivered lines of code, Effort is the development effort, and Tdev is the nominal development time.

Coefficients
The coefficients for the basic COCOMO model are:

    Mode            a      b      c     d
    Organic         2.4    1.05   2.5   0.38
    Semi-detached   3.0    1.12   2.5   0.35
    Embedded        3.6    1.20   2.5   0.32

Additional Calculations

The basic COCOMO model provides a useful tool for estimating project cost and
development time quickly, once the size of the project is estimated. The estimator needs to
determine which mode (organic, semi-detached, or embedded) is most appropriate for the
project.
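
As a minimal sketch, assuming the standard basic COCOMO coefficients tabulated above, the effort and development time for a given size in KLOC could be computed as follows:

# Basic COCOMO: Effort = a * (KLOC)**b person-months, Tdev = c * (Effort)**d months.
COEFFICIENTS = {
    "organic":       (2.4, 1.05, 2.5, 0.38),
    "semi-detached": (3.0, 1.12, 2.5, 0.35),
    "embedded":      (3.6, 1.20, 2.5, 0.32),
}


def basic_cocomo(kloc: float, mode: str = "organic"):
    a, b, c, d = COEFFICIENTS[mode]
    effort = a * kloc ** b          # person-months
    tdev = c * effort ** d          # months
    return effort, tdev


# Illustrative use: a hypothetical 32 KLOC organic-mode project.
effort, tdev = basic_cocomo(32, "organic")
print(f"Effort: {effort:.1f} person-months, Tdev: {tdev:.1f} months")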
