Assignment 3: Q1. Requirement Engineering Tasks in Detail
ANS:
1. Inception
During inception, a basic understanding of the problem and the nature of the desired solution is established. Two common problems faced here are:
Problem of scope: The customer gives unnecessary technical detail rather than clarifying the overall system objectives.
Problem of volatility: The requirements change over time, which makes developing the project difficult.
3. Elaboration
In this task, the information gathered from the user during inception and elicitation is expanded and refined.
Its main task is to develop a refined model of the software in terms of its functions, features, and constraints.
4. Negotiation
It is a set of activities that help the project team identify, control, and track requirements and the changes made to them at any point during the ongoing project.
These tasks start with identifying the requirements and assigning a unique identifier to each of them.
After the requirements are finalized, a requirements traceability table is developed.
Examples of traceability tables include tables for the features, sources, dependencies, subsystems, and interfaces of the requirements; a small sketch of such a table is given below.
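A minimal sketch in Python of what a traceability table might look like as a data structure; the field names and the two example requirements (REQ-001, REQ-002) are illustrative assumptions, not taken from any particular project.

from dataclasses import dataclass, field

@dataclass
class RequirementTrace:
    req_id: str                                        # unique identifier, e.g. "REQ-001"
    description: str                                   # short statement of the requirement
    source: str                                        # stakeholder or document it came from
    features: list = field(default_factory=list)       # features that realize it
    subsystems: list = field(default_factory=list)     # subsystems that implement it
    depends_on: list = field(default_factory=list)     # ids of prerequisite requirements

# Example rows of the traceability table (hypothetical data).
traceability_table = [
    RequirementTrace("REQ-001", "User can withdraw cash", "Bank customer",
                     features=["Withdrawal"], subsystems=["ATM terminal"]),
    RequirementTrace("REQ-002", "Every withdrawal is logged", "Auditor",
                     features=["Audit log"], subsystems=["Core banking"],
                     depends_on=["REQ-001"]),
]

for row in traceability_table:
    print(row.req_id, "->", row.features, row.depends_on)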
ANS:
Requirements Elicitation:
It is related to the various ways used to gain knowledge about the project domain and requirements. The sources of domain knowledge include customers, business manuals, existing software of the same type, standards, and other stakeholders of the project.
Requirements specification:
This activity is used to produce formal software requirement models. All the
requirements including the functional as well as the non-functional requirements
and the constraints are specified by these models in totality. During specification,
more knowledge about the problem may be required which can again trigger the
elicitation process.
The models used at this stage include ER diagrams, data flow diagrams (DFDs), function decomposition diagrams (FDDs), data dictionaries, etc.
Verification: It refers to the set of tasks that ensures that the software correctly
implements a specific function.
Validation: It refers to a different set of tasks that ensures that the software that
has been built is traceable to customer requirements.
If requirements are not validated, errors in the requirement definitions would propagate to
the successive stages resulting in a lot of modification and rework.
The main steps for this process include:
The requirements should be consistent with all the other requirements, i.e., no two requirements should conflict with each other.
Reviews, buddy checks, making test cases, etc. are some of the methods used for this; a small sketch of an automated consistency check is given below.
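As an illustration of the consistency check, here is a minimal Python sketch; the requirement ids and the conflict list are hypothetical, and in practice conflicting requirements are usually found by human review rather than by code.

# Hypothetical requirement texts, keyed by their unique identifiers.
requirements = {
    "REQ-010": "The system shall lock an account after 3 failed PIN attempts.",
    "REQ-011": "The system shall allow unlimited PIN attempts.",
}

# Pairs of requirement ids that reviewers have flagged as conflicting (hypothetical data).
conflicts = [("REQ-010", "REQ-011")]

def find_open_conflicts(requirements, conflicts):
    # Report only the flagged pairs where both requirements are still in the set.
    return [(a, b) for a, b in conflicts if a in requirements and b in requirements]

for a, b in find_open_conflicts(requirements, conflicts):
    print(f"Conflict to resolve: {a} vs {b}")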
Requirements management:
Requirement management is the process of analyzing, documenting, tracking, prioritizing and agreeing on requirements, and controlling communication with the relevant
stakeholders. This stage takes care of the changing nature of requirements. It should be
ensured that the SRS is as modifiable as possible so as to incorporate changes in
requirements specified by the end users at later stages too. Being able to modify the
software as per requirements in a systematic and controlled manner is an extremely
important part of the requirements engineering process.
ANS:
Requirement gathering:
The process of gathering the software requirements from the client, then analyzing and documenting them, is known as requirements engineering. Some common requirement-gathering techniques are listed below.
Brainstorming:
Brainstorming is used to collect as many ideas and candidate requirements as possible from a group of stakeholders in a short session; the ideas are then reviewed, combined, and prioritized.
Document Analysis:
Reviewing the documentation of an existing system can help when creating the AS-IS process document, as well as when driving the gap analysis for scoping migration projects. In an ideal world, we would even review the requirements that drove the creation of the existing system, which is a starting point for documenting the current requirements. Nuggets of information are often buried in existing documents and help us ask questions as part of validating requirement completeness.
Focus Group:
A focus group is a moderated discussion with a selected group of users or stakeholders, used to gather their opinions, expectations, and feedback about the proposed system.
Interface analysis:
Interface analysis examines the interfaces between the system and its users, hardware, and other software systems; it helps uncover requirements that are otherwise easy to miss, such as data formats and interaction rules.
Interview:
Interviews of stakeholders and users are critical to creating great software. Without understanding the goals and expectations of the users and stakeholders, we are very unlikely to satisfy them. We also have to recognize the perspective of each interviewee so that we can properly weigh and address their input. Listening is the skill that helps a great analyst get more value from an interview than an average analyst.
Prototyping:
A rough working model of the proposed system is built and shown to users so they can confirm, correct, or add requirements before full development starts.
Reverse Engineering:
When a migration project does not have access to sufficient documentation of the existing system, reverse engineering will identify what the system does. It will not identify what the system should do, and it will not identify when the system does the wrong thing.
ANS:
The context diagram is a simple model that defines the boundaries and interfaces of the proposed system with the external world. It identifies the entities outside the proposed system that interact with the system. The context diagram of the student result management system is given below:
Development of a Prototype:
One effective way to find out what the customer wants is to construct a prototype: something that looks, and preferably acts, like part of the system they say they want. The prototype should be built quickly and at a relatively low cost; hence it will always have limitations and would not be acceptable as the final system. This is an optional activity.
After modeling the requirements, we will have a better understanding of the system
behavior. The inconsistencies and ambiguities have been identified and corrected.
The flow of data amongst the various modules has been analyzed. Elicitation and
analysis activities have provided better insight into the system. Now we finalize the
analyzed requirements, and the next step is to document these requirements in a
prescribed format.
ASSIGNMENT 4
Q1.Explain design concepts and explain various architectures in brief.
ANS:
Design concepts:
Abstraction:
Abstraction simply means hiding the details in order to reduce complexity and increase efficiency or quality. Different levels of abstraction are necessary and must be applied at each stage of the design process so that any error that is present can be removed and the software solution can be refined.
The solution should be described in broad terms that cover a wide range of different things at a higher level of abstraction, and a more detailed description of the solution should be given at the lower levels of abstraction, as in the sketch below.
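A minimal Python sketch of the idea; the PaymentMethod and CardPayment names are hypothetical. The abstract class stands for the higher level of abstraction, while the concrete class holds the detailed solution.

from abc import ABC, abstractmethod

class PaymentMethod(ABC):
    # Higher level of abstraction: describes what a payment can do, not how.
    @abstractmethod
    def pay(self, amount):
        ...

class CardPayment(PaymentMethod):
    # Lower level of abstraction: the detailed description of one solution.
    def pay(self, amount):
        # Authorisation, capture and receipts would be handled here.
        print(f"Charging {amount:.2f} to the card")

def checkout(method, amount):
    # Client code works only at the higher level of abstraction.
    method.pay(amount)

checkout(CardPayment(), 49.90)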
Modularity:
Modularity simply means dividing a system or project into smaller parts to reduce its complexity. In the same way, modularity in design means subdividing a system into smaller parts so that these parts can be created independently and then used in different systems to perform different functions, as in the sketch below.
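A minimal Python sketch of modularity, using hypothetical billing and reporting modules; each part can be built and tested independently and reused elsewhere.

# billing module (hypothetical): pricing only, no knowledge of reporting.
def calculate_invoice(items):
    return sum(price * qty for price, qty in items)

# reporting module (hypothetical): reuses billing without knowing its internals.
def monthly_report(all_orders):
    return [calculate_invoice(order) for order in all_orders]

print(monthly_report([[(10.0, 2)], [(5.0, 3), (2.5, 4)]]))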
Architecture:
Architecture refers to the overall structure of the software: the way its components are organized, how they interact with one another, and how that structure satisfies the requirements. It provides the framework within which the detailed design is carried out.
Refinement:
Refinement is the process of elaborating a design step by step: a high-level statement of function is decomposed into progressively more detailed descriptions until the level of programming-language statements is reached. Abstraction and refinement are complementary concepts.
Pattern:
A pattern simply means a repeated form or design in which the same shape is repeated several times. A pattern in the design process means the reuse of a solution to a common, recurring problem within a certain context, as in the sketch below.
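A minimal Python sketch using the Observer pattern as one illustration of a recurring solution; the Publisher name and the subscriber callbacks are hypothetical.

class Publisher:
    def __init__(self):
        self._subscribers = []

    def subscribe(self, callback):
        self._subscribers.append(callback)

    def notify(self, event):
        # The recurring solution: tell every registered subscriber about the event.
        for callback in self._subscribers:
            callback(event)

alerts = Publisher()
alerts.subscribe(lambda event: print("email:", event))
alerts.subscribe(lambda event: print("sms:", event))
alerts.notify("order shipped")   # the same pattern recurs in GUIs, logging, messaging, etc.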
Refactoring:
Refactoring simply means reconstructing something in such a way that it does not affect its behavior or any other features. Refactoring in software design means reconstructing the design to reduce complexity and simplify it without affecting the behavior or its functions. Fowler has defined refactoring as “the process of changing a software system in such a way that it does not alter the external behavior of the code yet improves its internal structure”. A small before/after sketch is given below.
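A minimal Python sketch of refactoring a hypothetical discount calculation; the internal structure is simplified while the asserts confirm that the observable behavior stays the same.

# Before refactoring: duplicated, hard-to-follow branches.
def price_before(amount, is_member):
    if is_member:
        if amount > 100:
            return amount - amount * 0.10
        else:
            return amount - amount * 0.05
    else:
        if amount > 100:
            return amount - amount * 0.05
        else:
            return amount

# After refactoring: simpler internal structure, same observable behaviour.
def price_after(amount, is_member):
    discount = 0.0
    if is_member:
        discount += 0.05
    if amount > 100:
        discount += 0.05
    return amount - amount * discount

assert price_before(200, True) == price_after(200, True)
assert price_before(50, False) == price_after(50, False)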
Various architectures:
Data-centered architecture:
A data store resides at the center of this architecture and is accessed frequently by the other components, which update, add, delete, or modify the data present within the store.
In a typical data-centered style, the client software accesses a central repository. A variation of this approach turns the repository into a blackboard: when data related to a client, or data of interest to a client, changes, the blackboard sends notifications to the client software. A minimal sketch is given below.
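A minimal Python sketch of the data-centered style with a blackboard-like notification; the Repository class and the client callbacks are hypothetical.

class Repository:
    def __init__(self):
        self._data = {}
        self._listeners = []

    def register(self, listener):
        self._listeners.append(listener)

    def update(self, key, value):
        self._data[key] = value
        # Blackboard variation: notify interested clients when data changes.
        for listener in self._listeners:
            listener(key, value)

    def read(self, key):
        return self._data.get(key)

store = Repository()
store.register(lambda k, v: print(f"client notified: {k} changed to {v}"))
store.update("order-42", "shipped")   # one client updates the central store
print(store.read("order-42"))         # another client reads from it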
Data-flow (pipe-and-filter) architecture:
A pipe-and-filter architecture has a set of components, called filters, connected by pipes.
Pipes are used to transmit data from one component to the next.
Each filter works independently and is designed to take data input of a certain form and produce data output of a specified form for the next filter.
The filters do not require any knowledge of the workings of the neighboring filters.
If the data flow degenerates into a single line of transforms, it is termed batch sequential. This structure accepts a batch of data and then applies a series of sequential components to transform it. A minimal sketch of a pipeline follows.
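A minimal Python sketch of a pipe-and-filter pipeline; the three filters are hypothetical, and each one only sees the output of the previous one.

def strip_blank_lines(lines):          # filter 1
    return [line for line in lines if line.strip()]

def to_upper(lines):                   # filter 2
    return [line.upper() for line in lines]

def number(lines):                     # filter 3
    return [f"{i}: {line}" for i, line in enumerate(lines, start=1)]

def pipeline(data, filters):
    # The "pipes": the output of one filter becomes the input of the next.
    for f in filters:
        data = f(data)
    return data

print(pipeline(["alpha", "", "beta"], [strip_blank_lines, to_upper, number]))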
Call and return architecture:
It is used to create a program that is easy to scale and modify. Many sub-styles exist within this category, such as main-program/subprogram architecture and remote procedure call architecture.
Object-oriented architecture:
The components of the system encapsulate data and the operations that must be applied to manipulate the data. Coordination and communication between the components are established via message passing.
Layered architecture:
A number of different layers are defined, with each layer performing a well-defined set of operations. Moving inwards, each layer performs operations that become progressively closer to the machine instruction set.
At the outer layer, components handle user-interface operations; at the inner layers, components perform operating-system interfacing (communication and coordination with the OS). A minimal sketch is given below.
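A minimal Python sketch of a layered structure with hypothetical UI, service, and storage layers; each layer calls only the layer directly below it.

class StorageLayer:                      # inner layer: closest to the machine/OS
    def write(self, record):
        print("writing to disk:", record)

class ServiceLayer:                      # middle layer: application operations
    def __init__(self, storage):
        self._storage = storage

    def register_user(self, name):
        self._storage.write({"user": name})

class UILayer:                           # outer layer: user-interface operations
    def __init__(self, service):
        self._service = service

    def submit_form(self, name):
        self._service.register_user(name)

UILayer(ServiceLayer(StorageLayer())).submit_form("asha")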
ANS:
Horizontal partitioning:
Defines separate branches of the modular hierarchy for each major program function.
The simplest way is to partition a system into: input, data transformation (processing), and output.
Disadvantages:
more data has to be passed across module interfaces
it complicates the overall control of program flow
Vertical partitioning:
Suggests that control and work should be distributed top-down in the program structure.
Advantages:
good at dealing with changes
easy to maintain after changes
reduces change impact and propagation
A small sketch of partitioning by major function follows this list.
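A small Python sketch of partitioning a system by major function into input, processing, and output branches, with a top-level routine holding the control; the function names are hypothetical.

def read_input():                 # input branch
    return [3, 1, 2]

def transform(data):              # data transformation (processing) branch
    return sorted(data)

def write_output(data):           # output branch
    print("result:", data)

def main():                       # control stays at the top of the structure
    write_output(transform(read_input()))

main()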
ANS:
It not only decreases the time involved in design, coding, and testing, but the overall software development cost is also gradually amortized across several projects. A number of studies so far have shown that the reusability of software design is one of the most valuable ways of reducing the cost involved in software development.
Functional Independence:
Functional independence is achieved by developing modules with a single-minded purpose (high cohesion) and minimal interaction with other modules (low coupling). Independent modules are easier to develop, test, maintain, and reuse, and they limit the propagation of errors, as in the sketch below.
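A minimal Python sketch of functionally independent modules with hypothetical names; each function has a single purpose, and the functions are coupled only through the data they pass.

def read_scores(raw):
    # Cohesive: only parses the input string.
    return [int(x) for x in raw.split(",")]

def average(scores):
    # Cohesive: only computes the average.
    return sum(scores) / len(scores)

def format_report(avg):
    # Cohesive: only formats the output.
    return f"average score: {avg:.1f}"

# Modules are coupled only through the data they pass, not through shared state.
print(format_report(average(read_scores("70,80,90"))))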
ANS:
The story of object-oriented design starts right from the moment computers were invented: programming was there, and programming approaches came into the picture. Programming is basically giving certain instructions to a computer.
At the beginning of the computing era, programming was usually limited to
machine language programming. Machine language means those sets of
instructions that are specific to a particular machine or processor, which are in
the form of 0’s and 1’s. These are sequences of bits (0100110…). But it’s quite
difficult to write a program or develop software in machine language.
It is practically impossible to develop the software used in today's scenarios with sequences of bits. This was the main reason programmers moved on to the next generation of programming languages, developing assembly languages, which were close enough to the English language to be easily understood.
These assembly languages were used with microprocessors. With the invention of the microprocessor, assembly languages flourished and ruled the industry, but they were not enough. Again, programmers came up with something new, i.e., structured and procedural programming.
The OOP concept was basically designed to overcome the drawbacks of the above programming methodologies, which were not so close to real-world applications. The demand increased, but conventional methods were still used. This new approach brought a revolution in the programming methodology field.
Object-oriented programming (OOP) is an approach that allows writing programs with the help of classes and real-world objects. We can say that this approach is very close to the real world and its applications, because the state and behaviour of these classes and objects are almost the same as those of real-world objects.
Data Abstraction:
Abstraction refers to the act of representing the important and essential features without including the background details or explanations of those features. Data abstraction simplifies database design.
Encapsulation:
Encapsulation means binding data and the methods that operate on that data into a single unit (a class) and restricting direct access to the internal state, so that it can be changed only through well-defined operations.
Inheritance:
Inheritance allows a new class to acquire the properties and behaviour of an existing class, so that common features are defined once in a base class and reused or specialized in derived classes.
Polymorphism:
Polymorphism is the ability of data to be processed in more than one form. It allows the same task to be performed in various ways. It includes method overloading and method overriding, i.e., writing a method once and performing a number of tasks using the same method name. A combined sketch of these four concepts is given below.
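A minimal Python sketch bringing the four concepts together, with hypothetical Animal, Dog, and Cat classes.

class Animal:
    def __init__(self, name):
        self._name = name            # encapsulation: state kept behind methods

    def name(self):
        return self._name

    def speak(self):                 # abstraction: callers use this interface only
        raise NotImplementedError

class Dog(Animal):                   # inheritance
    def speak(self):
        return f"{self.name()} says woof"

class Cat(Animal):                   # inheritance
    def speak(self):
        return f"{self.name()} says meow"

for pet in (Dog("Rex"), Cat("Mia")):   # polymorphism: one call, many forms
    print(pet.speak())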
ANS:
Q6.Draw DFD level-0 and DFD level-1 for Hospital management system.
ANS:
LEVEL 0 DFD:
LEVEL 1 DFD:
Q7.Prepare an E-R Diagram for a simple Hospital Management System and explain it.
ANS:
Explanation:
Q8.Explain the difference between DFD and E-R diagram with symbols and example.
DFD: It stands for Data Flow Diagram.
ERD: It stands for Entity Relationship Diagram or Model.

DFD: Its main objective is to represent the processes and the data flow between them.
ERD: Its main objective is to represent the data objects or entities and the relationships between them.

DFD: It explains the flow and processing of data: data input, data output, and data storage.
ERD: It explains and represents the relationships between the entities stored in a database.

DFD: Symbols used are rectangles (external entities), circles (processes), arrows (data flows), and ovals or parallel lines (data stores).
ERD: Symbols used are rectangles (entities), diamond boxes (relationships), and lines with standard notations (cardinality).

DFD: The rule followed is that at least one data flow should enter and leave each process or store.
ERD: The rule followed is that every entity must represent a set of similar things.

DFD: It models the flow of data through a system.
ERD: It models entities such as people, objects, places, and events for which data is stored in a system.
ANS:
Data dictionary
It describes the meanings and purposes of data elements within the context of a
project, and provides guidance on interpretation, accepted meanings and
representation. A Data Dictionary also provides metadata about data elements. The
metadata included in a Data Dictionary can assist in defining the scope and characteristics of data elements, as well as the rules for their usage and application. A small sketch of such entries is given below.
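A minimal Python sketch of data dictionary entries for a hospital system; the two data elements and their rules are hypothetical examples.

data_dictionary = {
    "patient_id": {
        "description": "Unique identifier assigned to each patient",
        "type": "string",
        "format": "P followed by 6 digits, e.g. P000123",
        "required": True,
    },
    "admission_date": {
        "description": "Date on which the patient was admitted",
        "type": "date",
        "format": "YYYY-MM-DD",
        "required": True,
    },
}

for element, meta in data_dictionary.items():
    print(element, "->", meta["description"])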
Q10.Explain feasibility study with the example of ATM machine in banking system. Draw
use case diagram of ATM machine.
ANS:
In an ATM banking system, first of all it should be checked whether the product will be feasible or not.
It is tested against various criteria; this is called a feasibility study. There are 4 different dimensions:
Technology
For the ATM system, we have to check whether the project is technically feasible or not.
If any kind of errors are generated inside the application, they may be reduced through the chosen technology.
Finance
The next criterion checked is whether the product is financially feasible or not.
Time
There will be some time duration within which the product should be completed.
Resource
It should also be checked whether the required resources, such as people, hardware, and software, are available.
The use case diagram of the ATM machine involves various actors, such as the bank customer, the cashier, and the maintenance person.
The cashier is the person who deposits money into the machine, but he is not present at the ATM center.
Similarly, the maintenance person checks and repairs the machine, but he is not present at the ATM center.