2 Marks
UNIT I
Ø Most software is custom built rather than being assembled from components.
Ø System software
Ø Application software
Ø Engineering/Scientific software
Ø Embedded software
Ø Web Applications
Ø Heterogeneity challenge
Software process is defined as the structured set of activities that are required to develop the software
system.
Ø Specification
Ø Validation
Ø Evolution
Ø Risk management.
Ø Reusability management.
Ø Measurement.
i. The incremental model can be adopted when there are fewer people involved in the project.
iii. Within a very small time span, at least the core product can be delivered to the customer.
Ø Planning – All planning activities are carried out in order to define resources, timelines, and other project-related activities.
Ø Risk analysis – The tasks required to assess the technical and management risks are carried out.
Ø Engineering – In this task region, the tasks required to build one or more representations of the application are carried out.
Ø Construct and release – All the tasks required to construct, test, and install the application are conducted.
Ø Customer evaluation – The customer's feedback is obtained, and based on the customer evaluation the required tasks are performed and implemented at the installation stage.
i. It is based on customer communication. If the communication is not proper, then the software product that gets developed will not be up to the mark.
ii. It demands considerable risk assessment. Only if the risk assessment is done properly can a successful product be obtained.
Level 1: Initial – Few processes are defined and individual efforts are taken.
Level 2: Repeatable – Basic project management processes are established to track cost, schedule, and functionality.
Level 3: Defined – The processes for both management and engineering activities are documented, standardized, and integrated into an organization-wide software process.
Level 4: Managed – Both the software process and product are quantitatively understood and controlled using detailed measures.
Level 5: Optimizing – Continuous process improvement is enabled by quantitative feedback from the process.
The effector process is a process that verifies itself. The effector process exists only when certain criteria are met.
The computer based system can be defined as “a set or an arrangement of elements that are organized
to accomplish some predefined goal by processing information”.
Verification represents the set of activities that are carried out to confirm that the software correctly
implements the specific functionality.
Validation represents the set of activities that ensure that the software that has been built is satisfying
the customer requirements.
i. Unit testing – The individual components are tested in this type of testing.
iii. Sub-system testing – This is a kind of integration testing in which the various modules are integrated into sub-systems and each sub-system is tested as a whole.
v. Acceptance testing – This type of testing involves testing of the system with customer data. If the system behaves as per the customer's needs, then it is accepted.
The Capability Maturity Model is used in assessing how well an organisation's processes allow it to complete and manage new software projects.
Requirement engineering is the process of establishing the services that the customer requires from the
system and the constraints under which it operates and is developed.
UNIT II
i. Prototype serves as a basis for deriving the system specification.
ii. Design quality can be improved.
i. Evolutionary prototyping – In this approach to system development, an initial prototype is prepared and then refined through a number of stages to a final stage.
ii. Throw-away prototyping – Using this approach, a rough practical implementation of the system is produced. The requirement problems can be identified from this implementation, and it is then discarded. The system is then developed using some different engineering paradigm.
This prototyping is used to pre-specify the look and feel of the user interface in an effective way.
i. Correct – The SRS should be kept up to date as the appropriate requirements are identified.
ii. Unambiguous – Only when the requirements are correctly understood is it possible to write unambiguous software requirements.
iii. Complete – To make the SRS complete, everything the software is required to do should be specified.
vi. Traceable – Why is each mentioned requirement needed? This should be correctly identified.
iii. To devise a set of valid requirements after which the software can be built.
Data modeling is the basic step in the analysis modeling. In data modeling the data objects are examined
independently of processing. The data model represents how data are related with one another.
Data object is a collection of attributes that act as an aspect, characteristic, quality, or descriptor of the
object.
Attributes are the ones which define the properties of a data object.
Modality indicates whether or not a particular data object must participate in the relationship.
Entity Relationship Diagram is the graphical representation of the object relationship pair. It is mainly
used in database applications.
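To make these terms concrete, here is a minimal sketch (the Author and Book names are invented for illustration) showing data objects as Python dataclasses: each field is an attribute, and the author_id field realizes a relationship between the two objects.

    from dataclasses import dataclass

    # Each dataclass models a data object; its fields are the attributes.
    @dataclass
    class Author:          # data object "Author"
        author_id: int     # identifier attribute
        name: str          # descriptive attribute

    @dataclass
    class Book:            # data object "Book"
        isbn: str
        title: str
        author_id: int     # realizes the relationship "Author writes Book"

    # Cardinality: one Author may write many Books (1:N).
    # Modality: a Book must have an Author (mandatory participation),
    # while an Author may exist with no Book (optional participation).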
Data Flow Diagram depicts the information flow and the transforms that are applied on the data as it
moves from input to output.
Level 0 DFD is called the 'fundamental system model' or 'context model'. In the context model the entire software system is represented by a single bubble with input and output indicated by incoming and outgoing arrows.
A state transition diagram is basically a collection of states and events. The events cause the system to change its state. It also represents what actions are to be taken as a consequence of a particular event.
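As an illustrative sketch (the door controller, its states, and its events are all invented), a state transition diagram can be encoded as a small lookup table in Python, with an action performed on each transition.

    # State transition table: (current_state, event) -> (next_state, action)
    TRANSITIONS = {
        ("closed", "open_cmd"):  ("open",   "start motor"),
        ("open",   "close_cmd"): ("closed", "stop motor"),
    }

    def step(state, event):
        """Apply one event; return the new state after performing its action."""
        next_state, action = TRANSITIONS[(state, event)]
        print(f"{state} --{event}--> {next_state}: {action}")
        return next_state

    state = "closed"
    state = step(state, "open_cmd")   # closed --open_cmd--> open: start motor
    state = step(state, "close_cmd")  # open --close_cmd--> closed: stop motor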
The data dictionary can be defined as an organized collection of all the data elements of the system with
precise and rigorous definitions so that user and system analyst will have a common understanding of
inputs, outputs, components of stores, and intermediate calculations.
i. Data Dictionary
UNIT III
The design process is a sequence of steps through which the requirements are translated into a system or software model.
iii. The design should exhibit uniformity and integration.
iv. Design is not coding.
Changes made during testing and maintenance become manageable and do not affect other modules.
A cohesive module performs only “one task” in software procedure with little interaction with other
modules. In other words cohesive module performs only one thing.
i. Coincidental cohesion – A module in which the set of tasks are only loosely related to one another is called coincidentally cohesive.
ii. Logical cohesion – A module that performs tasks that are logically related to each other is called logically cohesive.
iii. Temporal cohesion – A module in which the tasks need to be executed within some specific time span is called temporally cohesive.
iv. Procedural cohesion – When the processing elements of a module are related to one another and must be executed in some specific order, the module is called procedurally cohesive.
v. Communicational cohesion – When the processing elements of a module share the same data, the module is called communicationally cohesive (a sketch contrasting this with coincidental cohesion follows this list).
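A minimal sketch, with invented function names, contrasting a communicationally cohesive module with a coincidentally cohesive one:

    # Communicational cohesion: both functions operate on the same data
    # (an order record), so they belong together in one module.
    def compute_order_total(order):
        return sum(item["price"] * item["qty"] for item in order["items"])

    def print_order_invoice(order):
        print(order["id"], compute_order_total(order))

    # Coincidental cohesion (poor): unrelated tasks grouped only by accident.
    def misc_utils(text, numbers):
        cleaned = text.strip()          # string chore
        total = sum(numbers)            # arithmetic chore
        return cleaned, total           # no single purpose

    order = {"id": 7, "items": [{"price": 2.0, "qty": 3}]}
    print_order_invoice(order)  # prints: 7 6.0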
Coupling is the measure of interconnection among modules in a program structure. It depends on the
interface complexity between modules.
49. What are the various types of coupling?
i. Data coupling – The data coupling is possible by parameter passing or data interaction.
ii. Control coupling – The modules share related control data in control coupling.
iii. Common coupling – Common data or global data is shared among modules.
iv. Content coupling – Content coupling occurs when one module makes use of data or control information maintained in another module (a sketch contrasting data and common coupling follows this list).
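A minimal Python sketch, with invented names, contrasting data coupling (interaction only through parameters) with common coupling (a shared global):

    # Data coupling: modules interact only through parameters.
    def net_price(price, tax_rate):
        return price * (1 + tax_rate)

    # Common coupling (weaker design): modules share a global variable.
    TAX_RATE = 0.18  # global data shared by several functions

    def net_price_global(price):
        return price * (1 + TAX_RATE)  # hidden dependency on the global

    assert net_price(100, 0.18) == net_price_global(100)  # both 118.0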
i. System structuring – The system is subdivided into principal subsystem components, and communications between these subsystems are identified.
ii. Control modeling – A model of control relationships between different parts of the system is
established.
iii. Modular decomposition – The identified subsystems are decomposed into modules.
Vertical partitioning, often called factoring, suggests that control and work should be distributed top-down in the program structure.
i. Data object – The data objects are identified and relationship among various data objects can be
represented using ERD or data dictionaries.
ii. Databases – Using the software design model, the data models are translated into data structures and databases at the application level.
iii. Data warehouses – At the business level useful information is identified from various databases and
the data warehouses are created.
iv. Use information hiding in the design of data structures.
v. Apply a library of useful data structures and operations.
iii. Call and return architecture.
iv. Object-oriented architecture.
v. Layered architecture.
Transform mapping is a set of design steps applied on the DFD in order to map the transform flow characteristics into a specific architectural style.
Real time system is a software system in which the correct functionalities of the system are dependent
upon results produced by the system and the time at which these results are produced.
Software Configuration Management is a set of activities carried out for identifying, organizing and
controlling changes throughout the lifecycle of computer software.
A Software Configuration Item is information that is created as part of the software engineering process.
UNIT IV
Software testing is a critical element of software quality assurance and represents the ultimate review of
specification, design, and coding.
ii. A good test case is one that has a high probability of finding an undiscovered error.
63. What are the testing principles the software engineer must apply while performing the software
testing?
i. Component testing – Individual components are tested. Tests are derived from the developer's experience.
ii. System testing – The components are integrated to create a system or sub-system, and the integrated whole is tested. These tests are based on the system specification.
i. Test planning
v. Effective evaluation
The black box testing is also called as behavioral testing. This method fully focuses on the functional
requirements of the software. Tests are derived that fully exercise all functional requirements.
Equivalence partitioning is a black box technique that divides the input domain into classes of data, from which test cases can be derived. An equivalence class represents a set of valid or invalid states for input conditions.
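For example, suppose a hypothetical function accepts ages from 18 to 60 (the function and range are invented for illustration). Equivalence partitioning yields three classes, and one representative test per class is enough under this technique:

    def is_eligible(age):
        # Valid class: 18..60; invalid classes: below 18 and above 60.
        return 18 <= age <= 60

    # One representative value per equivalence class:
    assert is_eligible(35) is True    # valid class [18, 60]
    assert is_eligible(10) is False   # invalid class: age < 18
    assert is_eligible(75) is False   # invalid class: age > 60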
Boundary value analysis is a testing technique in which the elements at the edges of the domain are selected and tested. It is a test case design technique that complements the equivalence partitioning technique. Here, instead of focusing on input conditions only, test cases are derived from the output domain as well.
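Continuing the same hypothetical age range, boundary value analysis picks test values at and around the edges of each class rather than from its middle:

    def is_eligible(age):          # same invented rule as above
        return 18 <= age <= 60

    # Boundary values for [18, 60]: each edge and its immediate neighbors.
    for age, expected in [(17, False), (18, True), (19, True),
                          (59, True), (60, True), (61, False)]:
        assert is_eligible(age) is expected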
68. What are the reasons behind to perform white box testing?
There are three main reasons behind performing the white box testing.
1. The programmer may make incorrect assumptions while designing or implementing some functions. Due to this there are chances of having logical errors in the program. To detect and correct such logical errors, white box testing is needed.
2. Certain assumptions on the flow of control and data may lead the programmer to make design errors. To uncover the errors on logical paths, white box testing is a must.
3. There may be certain typographical errors that remain undetected even after syntax and type checking mechanisms. Such errors can be uncovered during white box testing.
Cyclomatic complexity is a software metric that gives a quantitative measure of the logical complexity of a program. The cyclomatic complexity defines the number of independent paths in the basis set of the program, which provides an upper bound for the number of tests that must be conducted to ensure that all the statements have been executed at least once.
The cyclomatic complexity can be computed in any one of the following ways.
1. The number of regions of the flow graph corresponds to the cyclomatic complexity.
2. V(G) = E - N + 2, where E is the number of flow graph edges and N is the number of flow graph nodes.
3. V(G) = P + 1, where P is the number of predicate nodes contained in the flow graph (see the worked example below).
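As a worked sketch (the function is invented for illustration), consider code with two predicate nodes; all three formulas agree:

    def classify(x):
        if x < 0:          # predicate node 1
            return "negative"
        if x == 0:         # predicate node 2
            return "zero"
        return "positive"

    # With P = 2 predicate nodes, V(G) = P + 1 = 3; the region count and
    # V(G) = E - N + 2 on the flow graph also give 3. So a basis set of 3
    # independent paths covers the function:
    assert [classify(v) for v in (-1, 0, 5)] == ["negative", "zero", "positive"]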
71. Distinguish between verification and validation.
Ø Verification refers to the set of activities that ensure that software correctly implements a specific function.
Ø Validation refers to a different set of activities that ensure that the software that has been built is traceable to the customer requirements.
According to Boehm, verification answers the question "Are we building the product right?", whereas validation answers the question "Are we building the right product?"
72. What are the various testing strategies for conventional software?
i. Unit testing
ii. Integration testing
iii. Validation testing
iv. System testing
Ø The "driver" is a program that accepts the test data and prints the relevant results.
Ø The "stub" is a subprogram that uses the module's interfaces, performs the minimal data manipulation required, and returns control to the module under test.
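A minimal sketch, with invented names, of how a driver and a stub might be arranged to unit-test a module whose subordinate module is not yet available:

    # Module under test: depends on a subordinate rate-lookup module.
    def gross_salary(basic, lookup_allowance_rate):
        return basic * (1 + lookup_allowance_rate(basic))

    # Stub: stands in for the not-yet-integrated subordinate module.
    def stub_allowance_rate(basic):
        return 0.10  # fixed dummy rate; just enough behavior for the test

    # Driver: feeds test data to the module and prints the result.
    if __name__ == "__main__":
        result = gross_salary(1000, stub_allowance_rate)
        print("gross_salary(1000) =", result)  # expected 1100.0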
Disadvantages:
Ø It is hard to debug.
77. What conditions exist after performing validation testing?
Ø The function or performance characteristics are according to the specifications and are accepted.
Ø The requirement specifications are derived and the deficiency list is created. The deficiencies then can be resolved by establishing proper communication with the customer.
Ø Alpha test: Alpha testing is a testing in which the version of the complete software is tested by the customer under the supervision of the developer. This testing is performed at the developer's site.
Ø Beta test: Beta testing is a testing in which the version of the software is tested by the customer without the developer being present. This testing is performed at the customer's site.
1. Recovery testing – is intended to check the system's ability to recover from failures.
Ø Brute force method: The memory dumps and run-time traces are examined, and the program is loaded with write statements to obtain clues to the causes of errors.
Ø Back tracking method: The source code is examined by looking backwards from symptom to potential
causes of errors.
Ø Cause elimination method: This method uses binary partitioning to reduce the number of locations where errors can exist.
Ø Quality plan – This plan describes the quality procedures and standards that will be used in a project.
Ø Validation plan – This plan describes the approach, resources and schedule required for system
validation.
Ø Configuration management plan – This plan focuses on the configuration management procedures
and structures to be used.
Ø Maintenance plan – The purpose of maintenance plan is to predict the maintenance requirements of
the system, maintenance cost and efforts required.
Ø Staff development plan – This plan describes how to develop the skills and experience of the project
team members.
UNIT V
Measure is defined as a quantitative indication of the extent, amount, dimension, or size of some
attribute of a product or process.
A metric is defined as a quantitative measure of the degree to which a system, component, or process possesses a given attribute.
Ø Direct metrics – It refers to immediately measurable attributes. Example – Lines of code, execution
speed.
Ø Indirect metrics – It refers to the aspects that are not immediately quantifiable or measurable.
Example – functionality of a program.
86. What are the advantages and disadvantages of size measure?
Advantages:
Disadvantages:
Ø A program that is well designed but shorter in size may be penalized by this measure.
Ø Algorithmic cost modeling – the cost estimation is based on the size of the software.
Ø Expert judgement – The experts from software development and the application domain are consulted; each proposes an estimate, and the final cost is arrived at by consensus.
Ø Estimation by analogy – The cost of a project is computed by comparing the project to a similar
project in the same application domain and then cost can be computed.
Ø Parkinson’s law – The cost is determined by available resources rather than by objective assessment.
Ø Pricing to win – The project costs whatever the customer is ready to spend on it.
COCOMO (COnstructive COst MOdel) is a cost model which gives the estimate of the number of man-months it will take to develop the software product.
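A worked sketch using the standard basic COCOMO organic-mode constants (a = 2.4, b = 1.05 for effort; c = 2.5, d = 0.38 for duration); the 32 KLOC product size is an invented example:

    # Basic COCOMO, organic mode: E = a * (KLOC ** b) person-months,
    # D = c * (E ** d) months.
    a, b, c, d = 2.4, 1.05, 2.5, 0.38
    kloc = 32  # hypothetical product size

    effort = a * kloc ** b        # ~91 person-months
    duration = c * effort ** d    # ~14 months
    print(f"effort = {effort:.1f} PM, duration = {duration:.1f} months")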
2. The co-ordinator calls a group meeting in which the experts discuss estimation issues with the co-ordinator and with each other.
5. The co-ordinator then calls a group meeting. In this meeting the experts mainly discuss the points where their estimates vary widely.
6. The experts again fill out forms anonymously.
7. Steps 5 and 6 are repeated until the co-ordinator is satisfied with the overall prediction synthesized from the experts.
The purpose of the timeline chart is to emphasize the scope of each individual task. Hence the set of tasks is given as input to the timeline chart.
Earned Value Analysis is a technique for performing quantitative analysis of a software project. It provides a common value scale for every task of the software project, and it acts as a measure of software project progress.
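A small worked sketch of the standard earned value quantities (BCWS, BCWP, ACWP) and the indices derived from them; the figures are invented:

    # Earned value analysis with invented figures (effort in person-days):
    bcws = 40.0  # budgeted cost of work scheduled to date
    bcwp = 32.0  # budgeted cost of work actually performed (earned value)
    acwp = 36.0  # actual cost of work performed

    spi = bcwp / bcws  # schedule performance index: 0.80 (behind schedule)
    cpi = bcwp / acwp  # cost performance index: ~0.89 (over budget)
    print(f"SPI = {spi:.2f}, CPI = {cpi:.2f}")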
92. What are the metrics computed during error tracking activity?
Ø DRE-requirement analysis
Ø DRE-architectural analysis
Ø DRE-coding (the underlying DRE formula is sketched below).
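These are defect removal efficiency values computed per activity. As a sketch, the standard DRE formula with invented counts:

    # Defect removal efficiency: DRE = E / (E + D), where E is the number of
    # errors found before the activity's output is passed on, and D is the
    # number of defects that slip through and are found later.
    def dre(errors_found, defects_escaped):
        return errors_found / (errors_found + defects_escaped)

    print(dre(45, 5))  # e.g. requirement analysis: 45 caught, 5 escaped -> 0.9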
Software change occurs because of the following reasons:
Ø New requirements emerge when the software is used.
Ø The business environment changes.
Ø Errors need to be repaired.
Ø New equipment must be accommodated.
Ø The performance or reliability may have to be improved.
The software change strategies that could be applied separately or together are:
Ø Software maintenance – The changes are made in response to changed requirements, but the fundamental structure of the software remains stable.
Ø Architectural transformation – It is the process of changing one architecture into another form.
Ø Software re-engineering – New features can be added to the existing system, and then the system is reconstructed for better future use.
Software maintenance is an activity in which program is modified after it has been put into use.
96. Define maintenance.
Maintenance is defined as the process in which changes are implemented by either modifying the existing system's architecture or by adding new components to the system.
Ø Corrective maintenance – Means the maintenance for correcting the software faults.
Ø Perfective maintenance – Means modifying or enhancing the system to meet the new requirements.
Ø Code based testing tools – These tools take source code as input and generate test cases.
Ø Specialized testing tools – Using a specialized testing language, the detailed test specification can be written for the software under test.
Ø Requirement-based testing tools – These tools help in designing the test cases as per user
requirements.