Long Answer Questions: Procedural Abstraction

The document discusses fundamental concepts of software design, including abstraction, modularity, and information hiding, which are essential for effective software development. It also covers software testing strategies, black-box testing metrics, risk management approaches, and the importance of software reviews. Additionally, it highlights various metrics for assessing software quality, source code, and maintenance.


Long Answer Questions

1. Discuss briefly the following fundamental concepts of design:

a) Abstraction b) Modularity c) Information hiding

Fundamental software design concepts provide the necessary framework for "getting it right."

 Abstraction – many levels of abstraction can be posed.

• The highest level states the solution in broad terms using the language of the
problem environment.

• Lower levels give a more detailed description of the solution.

• Procedural abstraction refers to a sequence of instructions that have a specific and
limited function.

• Data abstraction is a named collection of data that describes a data object.

 Modularity – the software is divided into separately named and addressable
components, called modules, that are integrated to satisfy problem requirements.

 Information Hiding – modules should be specified and designed so that
information contained within a module is inaccessible to other modules
that have no need for such information.

Information hiding provides the greatest benefits when modifications are required during testing and
software maintenance.

2. What is a software testing strategy? Explain its characteristics in detail.

 Software Testing
 Testing is the process of exercising a program with the specific intent of finding errors prior
to delivery to the end user.

 A number of software testing strategies have been proposed in the literature. All provide the
software developer with a template for testing, and all have the following generic
characteristics:

• To perform effective testing, a software team should conduct effective formal
technical reviews. By doing this, many errors will be eliminated before testing
commences.

• Testing begins at the component level and works “outward” toward the integration
of the entire computer-based system.

• Different testing techniques are appropriate at different points in time.

• Testing is conducted by the developer of the software and (for large projects) an
independent test group.

• Testing and debugging are different activities, but debugging must be
accommodated in any testing strategy.

3. Define black-box testing. Explain the metrics for testing.

Black-Box Testing alludes to tests that are conducted at the software interface. A black-box test
examines some fundamental aspect of a system with little regard for the internal logical structure of
the software.

Software testing metrics are quantifiable indicators of the software testing process progress, quality,
productivity, and overall health.
 Defect Removal Effectiveness (DRE)

DRE = (defects removed during the development phase) / (defects removed during the development phase + defects latent in the product)

where defects latent in the product are the defects found later by the user.

 Efficiency of the Testing Process (define size in KLOC or FP, Req.)

Testing Efficiency = (size of software tested) / (resources used)

4. What is the difference between reactive and proactive risk strategies?

Reactive vs. Proactive Risk Strategies

Reactive:
 Corrective measures are taken only after the software enters into risk.
 No preventive action is taken; measures follow the occurrence of the risk.
 It is the older risk management approach.
 If the risks are not resolved, the project is in danger.

Proactive:
 Risk management begins before the risks occur.
 First the risks are identified, then their impact on the software is assessed, and then the risks are
prioritized.
 High-priority risks are managed first.

5. Explain RMMM and the RMMM plan.

RMMM (Risk Mitigation, Monitoring, and Management) Plan

• It documents all work performed as a part of risk analysis.

• Each risk is documented individually by using a Risk Information Sheet (RIS).

• The RIS is maintained by using a database system.

6. Explain software reviews in brief.

 Purpose is to find errors before they are passed on to another software engineering activity
or released to the customer.

 Software engineers (and others) conduct formal technical reviews (FTRs) for software
quality assurance.

 Using formal technical reviews (walkthroughs or inspections) is an effective means for
improving software quality.
Formal Technical Review

Review meeting in FTR

The review meeting in an FTR should abide by the following constraints:

1. Between three and five people should attend the review meeting.

2. Every person should prepare for the meeting, and preparation should require no more than two hours
of work per person.

3. The duration of the review meeting should be less than two hours.

Review Reporting and Record Keeping

The review summary report is a single-page form with possible attachments.

The review issues list serves two purposes:

1. To identify problem areas in the product

2. To serve as an action item checklist that guides the producer as corrections are made

Review Guidelines

• Review the product, not the producer

• Set an agenda and maintain it

• Limit debate and rebuttal

• Enunciate problem areas, but don't attempt to solve every problem noted

• Take written notes

• Limit the number of participants and insist upon advance preparation

• Develop a checklist for each product that is likely to be reviewed

• Allocate resources and schedule time for FTRs

• Conduct meaningful training for all reviewers

• Review your early reviews

7. Elaborate on modeling in component-level design.

Component-level design metrics are crucial for assessing the quality of software components. These
metrics help in evaluating the internal structure and interrelationships within the software, providing
insights into maintainability, reliability, and performance.

Cohesion Metrics
Cohesion refers to how closely related and focused the responsibilities of a single module are. High
cohesion within a module implies that its components are strongly related and functionally similar,
which generally leads to more maintainable and understandable code. Metrics for measuring
cohesion include:

 Functional Cohesion: Measures whether the module performs a single well-defined task.

 Sequential Cohesion: Indicates that elements within a module are related by the sequence in
which they must be executed.

 Communicational Cohesion: Reflects that elements within a module operate on the same data or
contribute towards the same output.

Coupling Metrics

Coupling measures the degree of interdependence between software modules. Lower coupling is
generally preferred as it indicates that changes in one module are less likely to impact others,
enhancing modularity and reducing complexity. Coupling metrics consider:

 Data Coupling: Based on the exchange of data via parameters.

 Control Coupling: When modules communicate by passing control information.

 External Coupling: Interaction with external systems or devices.

Complexity Metrics

Complexity metrics quantify various aspects of software complexity to predict the effort needed for
maintenance and to identify potential error-prone parts of the software. These metrics help in
identifying code that may be difficult to understand, test, or modify.

 Cyclomatic Complexity: Developed by Thomas J. McCabe, it measures the number of linearly
independent paths through a program's source code. It is calculated using the control-flow
graph of the program.
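As a rough illustration (not part of the original notes), cyclomatic complexity can be computed from a control-flow graph as V(G) = E − N + 2P, where E is the number of edges, N the number of nodes, and P the number of connected components. The graph counts in the example below are assumed values for a simple if/else construct:

```python
# Sketch: cyclomatic complexity from a control-flow graph.
# V(G) = E - N + 2P, where E = edges, N = nodes, P = connected components.

def cyclomatic_complexity(edges, nodes, components=1):
    """Return V(G), the number of linearly independent paths."""
    return edges - nodes + 2 * components

# A single if/else has 5 nodes (entry, decision, two branches, exit)
# and 5 edges, giving V(G) = 5 - 5 + 2 = 2 independent paths.
print(cyclomatic_complexity(edges=5, nodes=5))  # -> 2
```

A V(G) above 10 is commonly treated as a signal that a module is becoming hard to test.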

8. Compare validation testing and system testing in detail.

 Focus is on software requirements

 Validation Test Criteria

 Configuration review

 Alpha/Beta testing

 Focus is on customer usage

 ALPHA TEST:

 An alpha test is conducted at the developer's site by a customer, in a controlled environment.

 The software is used in a natural setting with the developer looking over the shoulder of the
user and recording errors and usage problems.


BETA TEST:

 A beta test is conducted at one or more customer sites by the end users of the software.

 A beta test is a live application of the software in an environment that cannot be controlled by
the developer.

 The customer records all problems encountered during beta testing and reports them to the
developer at regular intervals. As a result of the problems reported during the beta tests,
software engineers make modifications and then release the corrected product to the entire
customer base.

9. What are the metrics for software quality? Explain.

 Measuring Software Quality

1. Correctness = defects / KLOC (defects per thousand lines of code)

2. Maintainability = MTTC (mean time to change)

3. Integrity = Σ [1 − threat × (1 − security)]

Threat: probability that an attack of a specific type will occur within a given time

Security: probability that an attack of a specific type will be repelled

4. Usability: ease of use

5. Defect Removal Efficiency (DRE)

DRE = E / (E + D)

where E is the number of errors found before delivery and
D is the number of defects reported after delivery.

The ideal value of DRE is 1.
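The quality metrics above translate directly into code. The following is an illustrative sketch with made-up defect counts, not part of the original notes:

```python
# Sketch of the software-quality metrics defined above.

def correctness(defects, kloc):
    """Correctness = defects per thousand lines of code (KLOC)."""
    return defects / kloc

def dre(e, d):
    """Defect Removal Efficiency.
    e = errors found before delivery, d = defects reported after delivery.
    The ideal value is 1 (no defects escape to the user)."""
    return e / (e + d)

print(correctness(defects=15, kloc=30))  # -> 0.5 defects/KLOC
print(dre(e=90, d=10))                   # -> 0.9
```

A DRE of 0.9 means 90% of all defects were removed before delivery; tracking DRE per project surfaces weaknesses in the review and testing process.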

10. Write short notes on:

a) Cost of quality

b) ISO 9000 quality standards

c) Software reviews

d) Review guidelines

Cost of Quality

 Prevention costs: quality planning, formal technical reviews, test equipment, training

 Appraisal costs: in-process and inter-process inspection, equipment calibration and
maintenance, testing

 Internal failure costs: rework, repair, failure mode analysis

 External failure costs: complaint resolution, product return and replacement, help line
support, warranty work

b) ISO 9000 describes the quality elements that must be present for a quality assurance system to be
compliant with the standard, but it does not describe how an organization should implement these
elements.

c) Purpose is to find errors before they are passed on to another software engineering activity or
released to the customer.

d) Review the product, not the producer

• Set an agenda and maintain it

• Limit debate and rebuttal

• Enunciate problem areas, but don't attempt to solve every problem noted

• Take written notes

• Limit the number of participants and insist upon advance preparation

• Develop a checklist for each product that is likely to be reviewed

• Allocate resources and schedule time for FTRs

• Conduct meaningful training for all reviewers

• Review your early reviews

11. Define the terms:

a) What is Risk Identification?

Risk identification involves identifying and classifying the sources of risk so that we know what must be
managed in a project. Risk identification is the first step in the risk management process.

b) Risk refinement?

Risk refinement is the process of explaining a risk in a more detailed way.

It is also called risk assessment.

It refines the risk table by reviewing the risk impact based on the following three factors:

Nature: the likely problems if the risk occurs

Scope: just how serious is it?

Timing: when and for how long will it be felt?

c) Risk projection

Risk projection, also called risk estimation, attempts to rate each risk.

• It estimates the impact of the risk on the project and the product.

Estimation is done by using a risk table.
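A risk table can be sketched as follows; the risk names, probabilities, and impact scores below are hypothetical, chosen only to illustrate sorting risks by exposure (probability × impact) so that high-priority risks are managed first:

```python
# Hypothetical risk table for risk projection.
# exposure = probability x impact; highest exposure is handled first.

risks = [
    {"name": "Staff turnover",        "probability": 0.6, "impact": 5},
    {"name": "Requirements change",   "probability": 0.8, "impact": 3},
    {"name": "Tool underperformance", "probability": 0.3, "impact": 2},
]

for r in risks:
    r["exposure"] = r["probability"] * r["impact"]

# Sort so the highest-exposure risks appear at the top of the table.
risk_table = sorted(risks, key=lambda r: r["exposure"], reverse=True)

for r in risk_table:
    print(f'{r["name"]}: exposure {r["exposure"]:.1f}')
```

In practice a cutoff line is drawn in the sorted table, and only risks above the line receive mitigation, monitoring, and management steps.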

12. What are the metrics for source code and maintenance? Explain.

Source Code:

 Source-code metrics are also called Halstead's Software Science: a comprehensive collection of metrics, all predicated
on the number (count and occurrence) of operators and operands within a component or
program.

• It should be noted that Halstead's "laws" have generated substantial controversy,
and many believe that the underlying theory has flaws. However, experimental
verification for selected programming languages has been performed. The measures
are:

 n1 = the number of distinct operators that appear in a program.

 n2 = the number of distinct operands that appear in a program.

 N1 = the total number of operator occurrences.

 N2 = the total number of operand occurrences.

 Halstead shows that the length N can be estimated as

N = n1 log2 n1 + n2 log2 n2

and the program volume may be defined as

V = N log2 (n1 + n2)
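The two formulas above can be computed directly from the four counts. A minimal sketch, with assumed example counts (n1 = n2 = 4 distinct symbols, N1 = 10 and N2 = 6 occurrences):

```python
import math

def halstead(n1, n2, N1, N2):
    """Halstead measures from the counts defined above:
    n1, n2 = distinct operators / operands,
    N1, N2 = total operator / operand occurrences."""
    # Estimated length: N = n1*log2(n1) + n2*log2(n2)
    n_est = n1 * math.log2(n1) + n2 * math.log2(n2)
    # Program volume: V = N * log2(n1 + n2), with actual length N = N1 + N2
    volume = (N1 + N2) * math.log2(n1 + n2)
    return n_est, volume

n_est, volume = halstead(n1=4, n2=4, N1=10, N2=6)
print(n_est)   # -> 16.0
print(volume)  # -> 48.0
```

Note that the volume formula uses the actual length N = N1 + N2, while the first formula only estimates that length from the distinct counts.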

Maintenance:

The four main types of metrics used widely during the SDLC of a project are design, project,
product, and maintenance metrics. Design metrics help measure the design.


Short Answer Questions

1. **Define System Modelling?**

- System modelling involves creating abstract representations of a system to understand and
communicate its structure and behavior.

2. **Explain context model?**

- A context model defines the boundary between the system and its environment, highlighting
interactions with external entities.

3. **Explain about the importance of test strategies for conventional software?**

- Test strategies ensure systematic, comprehensive, and effective verification of software
functionality and performance.

4. **Compare black box testing with white box testing?**

- Black box testing focuses on input-output validation without internal knowledge, while white box
testing involves internal code structure examination.

5. **Compare validation testing and system testing?**

- Validation testing ensures the product meets user needs, while system testing checks the entire
system’s compliance with specified requirements.

6. **What is glass-box testing?**

- Glass-box testing, also known as white-box testing, examines the internal structures and workings
of an application.

7. **What is metric?**

- A metric is a standard of measurement used to quantify various characteristics of a process or
product.

8. **What is metric for software quality?**

- Metrics for software quality include measurements like defect density, code complexity, and user
satisfaction.

9. **Explain about software risks?**

- Software risks are potential problems that could cause a project to fail or software to malfunction,
such as schedule delays or technical challenges.

10. **Elaborate the concepts of Risk management Reactive vs Proactive Risk strategies?**

- Reactive risk management addresses problems after they occur, while proactive strategies
identify and mitigate risks before they impact the project.

11. **Discuss design model?**

- A design model represents the architecture, components, interfaces, and data flow of a system,
guiding its construction and implementation.

12. **List the design concepts?**

- Key design concepts include abstraction, encapsulation, modularity, hierarchy, separation of
concerns, and design patterns.

13. **Discuss a framework for product metrics?**

- A framework for product metrics involves defining key measurements, collecting data, analyzing
results, and using insights to improve product quality.

14. **Explain about Metrics for maintenance?**

- Maintenance metrics assess aspects like code maintainability, defect density, change impact, and
mean time to repair.

15. **List the metrics for the design model.**

- Metrics for the design model include complexity, cohesion, coupling, modularity, and design
pattern usage.

16. **Describe metrics for source code and for testing.**

- Source code metrics include lines of code and cyclomatic complexity; testing metrics involve test
coverage, defect density, and test execution time.

17. **Discuss risk refinement?**

- Risk refinement involves detailing identified risks to understand their impact, likelihood, and
potential mitigation strategies.

18. **What is software reliability?**

- Software reliability is the probability of a software system functioning without failure under
specified conditions for a defined period.

19. **Demonstrate risk identification?**

- Risk identification involves recognizing potential project threats through techniques like
brainstorming, checklists, and SWOT analysis.

20. **Explain software reviews in brief?**

- Software reviews are evaluations of software products and processes to ensure quality and
compliance with requirements, typically involving inspections, walkthroughs, and audits.
