Chapter 7

Software Coding, Testing and Quality


Assurance

Prachi Shah
IT Dept.
BVM

Overview
 Coding Standards and Coding Guidelines
 Software Documentation
 Code Review
 Introduction to Testing
 Testing Strategies
 Testing Conventional Applications
◦ White Box Testing
◦ Black Box Testing
 Test Case Design
 Quality concepts
 Software quality assurance
 Formal technical reviews
 Software reliability
 The quality standards: ISO 9000, CMM, Six sigma for SE
 SQA plan
Coding Guidelines
Commonly used programming practices:
1. Use standard control constructs – each construct should have a single entry and a single exit point
2. Avoid the use of GOTO statements
3. Practice information hiding – data should be accessed only through its access functions
4. Avoid deep nesting of one structure / function inside another
5. Use user-defined data types to enhance readability
6. There is no standard rule for module size, but a module should not be too large
7. Avoid obscure side effects, e.g., an unexpected change to a parameter
8. Use meaningful variable names that reflect each variable's purpose
9. Document the code well – use comments where required
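As a minimal sketch (the names here are hypothetical, not from the chapter), several of these practices can be seen together in a few lines of Python:

```python
from enum import Enum

# Illustrates guidelines 5 (user-defined type), 8 (meaningful names),
# 9 (comments), and 1 (single entry, single exit).

class OrderStatus(Enum):      # user-defined type makes intent explicit
    PENDING = "pending"
    SHIPPED = "shipped"
    DELIVERED = "delivered"

def is_order_complete(status: OrderStatus) -> bool:
    """Return True when the order has reached its final state."""
    # one entry point, one exit point (a single return statement)
    result = status == OrderStatus.DELIVERED
    return result
```

The enumerated type replaces "magic" string constants, so a misspelled status becomes a visible error rather than a silent bug.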
Coding Standards
Some well-defined standards or rules for coding:
1. Naming conventions for global variables, local
variables and constant identifiers
2. The reader should be able to infer the purpose of a file from its name
3. Comments are important because they explain the logic of the program
4. Declaration and execution of statements – avoid the use of break, continue, complex conditional expressions, do-while statements, etc.
Software Documentation
 Internal Documentation – provided in source code
itself
◦ Appropriate Comments
◦ Meaningful Variable names [Most useful]
◦ Code Structuring
◦ Use of Enumerated / User-defined data types and constant
Identifiers
◦ Code Indentation etc.
 External Documentation – supporting documents to
accompany software product
◦ User manual
◦ SRS
◦ Design document
◦ Test document etc.
V&V
 Verification refers to the set of tasks that
ensure that software correctly implements
a specific function.
 Validation refers to a different set of tasks
that ensure that the software that has been
built is traceable to customer
requirements.
 Boehm states this another way:
◦ Verification: “Are we building the product
right?”
◦ Validation: “Are we building the right product?”
 Method of Verification – Code Review
 Method of Validation – Software Testing
Code Review
 It is undertaken after the module successfully compiles, i.e., after all syntax errors are eliminated
 To eliminate an error through testing, three activities are performed – testing, debugging, and correction of the error
 In code inspection, by contrast, errors are detected directly
 Two types of reviews carried out on module code:
1. Code Walkthrough
◦ Discover algorithmic and logical errors in code
◦ Hand simulation of code execution is carried out before a
meeting
2. Code Inspection
◦ To check for the presence of some common errors due to
programmer’s mistake
◦ To check whether coding standards have been followed or not
Introduction to Software Testing
 “Testing is the process of exercising a program with
the specific intent of finding errors prior to delivery to
the end user.”
 Purpose – to check whether the functions appear to work according to the requirements
 For effective testing, conduct effective technical reviews; many errors will be eliminated this way before testing begins.
 Testing is conducted by the developer of the software
and (for large projects) an independent test group.
 Testing and debugging are different activities, but
debugging must be accommodated in any testing
strategy.
Software Testing Steps
Testing Strategies
1. Unit testing
2. Integration testing
3. Validation testing
4. System testing
1. Unit Testing
Things to be tested: the module interface, local data structures, boundary conditions, independent paths, and error-handling paths
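A minimal unit-test sketch (the `divide` module here is hypothetical): the unit is exercised in isolation, checking a normal path and an error-handling path.

```python
import unittest

def divide(a: float, b: float) -> float:
    """Unit under test: return a / b, rejecting a zero divisor."""
    if b == 0:
        raise ValueError("divisor must be non-zero")
    return a / b

class DivideTest(unittest.TestCase):
    def test_normal_path(self):
        self.assertEqual(divide(10, 2), 5.0)

    def test_error_handling_path(self):
        # the error-handling path must be exercised, not just the happy path
        with self.assertRaises(ValueError):
            divide(1, 0)

# Run the suite explicitly (in a script, unittest.main() would do this).
suite = unittest.defaultTestLoader.loadTestsFromTestCase(DivideTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Driver code such as this suite substitutes for the modules that would normally call the unit, which is why unit testing can begin before integration.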
2. Integration Testing
 Approaches to integration testing:
 ◦ Non-incremental approach – Big-Bang approach
 ◦ Incremental approach – Top-down testing and Bottom-up testing
 ◦ Regression testing
 ◦ Smoke testing
Big-Bang approach
(Non-incremental approach)
 All components are combined in advance.
The entire program is tested as a whole.
 A set of errors is encountered.
 Correction is difficult because isolation of
causes is complicated by the vast expanse
of the entire program.
 Once these errors are corrected, new ones
appear and the process continues in a
seemingly endless loop.
2.3 Regression Testing
 It is the re-execution of some subset of tests
that have already been conducted – to ensure
that changes have not propagated unintended
side effects
 Example:
◦ Yesterday it worked, today it doesn’t
◦ I was fixing X, and accidentally broke Y
◦ That bug was fixed, but now it’s back
 It may be conducted manually, by re-executing
a subset of all test cases or using automated
capture/playback tools
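A sketch of manual regression testing (the function and cases are hypothetical): after a fix elsewhere, the previously passing test cases are re-executed to confirm that no unintended side effects crept in.

```python
def absolute(x: int) -> int:
    """Function under maintenance: returns |x|."""
    return -x if x < 0 else x

# Regression subset: cases that passed before the latest change.
regression_cases = [(3, 3), (0, 0), (-4, 4)]

def run_regression() -> bool:
    """Re-execute the stored cases; True means no unintended side effects."""
    return all(absolute(given) == expected
               for given, expected in regression_cases)
```

Capture/playback tools automate exactly this loop: they record the inputs and expected outputs once, then replay them after every change.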
2.4 Smoke Testing
 A common approach for creating “daily builds” for
product software
 Smoke testing steps:
1. Software components that have been translated
into code are integrated into a “build.” (all data files,
libraries, reusable modules, and engineered
components)
2. A series of tests is designed to expose errors that
will keep the build from properly performing its
function.
3. The build is integrated with other builds and the
entire product (in its current form) is smoke tested
daily.
 The integration approach may be top down or bottom up.
3. Validation Testing
 Validation testing follows integration testing
 Focuses on user-visible actions and user-recognizable output
from the system
 Demonstrates conformity with requirements
 Focus is on
◦ System input/output
◦ System functions and information data
◦ System interfaces with external parts
◦ User interfaces
◦ System behavior and performance
 After each validation test case, either
 ◦ the function or performance characteristic conforms to specification and is accepted, or
 ◦ a deviation from specification is uncovered and a deficiency list is created, which is resolved by communicating with the customer
 Finally, a configuration review or audit ensures that all elements of
the software configuration have been properly developed
3.1 Acceptance Testing
 It is conducted to ensure that the software works correctly in the user's
environment
 Carried out over a period of weeks or months
 Its two types:
1. Alpha testing
◦ Conducted at the developer’s site by customer
◦ Software is used in a natural setting with developers watching intently
◦ Testing is conducted in a controlled environment
2. Beta testing
◦ Conducted at customer’s sites
◦ Developer is generally not present
◦ It serves as a live application of the software in an environment that
cannot be controlled by the developer
◦ The end-user records all problems that are encountered and reports
these to the developers at regular intervals
 After beta testing is complete, software engineers make software
modifications and prepare for release of the software product to
the entire customer base
4. System Testing
1. Recovery testing
◦ Tests for recovery from system faults
◦ Forces the software to fail in a variety of ways and verifies that
recovery is properly performed
◦ Tests reinitialization, checkpointing mechanisms, data
recovery, and restart for correctness
2. Security testing
◦ Verifies that protection mechanisms built into a system will, in
fact, protect it from improper access
3. Stress testing
◦ Executes a system in a manner that demands resources in
abnormal quantity, frequency, or volume
4. Performance testing
◦ Tests the run-time performance of software within the context
of an integrated system
◦ Often coupled with stress testing and usually requires both
hardware and software instrumentation
◦ Can uncover situations that lead to degradation and possible
system failure
Testing Strategies / Levels
 Unit Testing
 Integration Testing
 ◦ Non-incremental approach – Big-Bang
 ◦ Incremental approach – Top-down testing, Bottom-up testing
 ◦ Regression testing
 ◦ Smoke testing
 Validation Testing
 ◦ Acceptance testing – Alpha testing, Beta testing
 System Testing
 ◦ Recovery testing
 ◦ Security testing
 ◦ Stress testing
 ◦ Performance testing
Testing Conventional Applications
 White box Testing
 ◦ Basis path testing
 ◦ Control structure testing – condition testing, loop testing, data flow testing
 Black box Testing
 ◦ Graph-based testing
 ◦ Equivalence partitioning
 ◦ Boundary value analysis
 ◦ Orthogonal array testing
Testing Conventional Applications
1. Black box testing – tests functions, operations, external
interfaces, and external data and information
2. White box testing – tests internal structures, logic paths, control
flows, data flows, internal data structures, conditions,
loops, etc.
White Box Testing
 Also called glass-box testing
 It is a test-case design philosophy that uses the control
structure described as part of component-level design
to derive test cases.
1) Basis Path Testing:
 It is a Structural Testing strategy to exercise all the
independent execution paths at least once
 Steps for path testing:
1. Design Flow graph for the program or component
2. Calculate cyclomatic complexity
3. Select a basis set of paths
4. Generate test cases for these paths
Step 1:
[Figure: a flow chart with nodes 0–11 and the corresponding flow graph; the flow graph encloses four regions, R1–R4]
Cyclomatic Complexity
 Provides a quantitative measure of the logical complexity of a
program
 Defines the number of independent paths in the basis set
 Provides an upper bound for the number of tests that must be
conducted to ensure all statements have been executed at least
once
 Can be computed three ways
1. The number of regions
2. V(G) = E – N + 2, where E is the number of edges and N is the
number of nodes in graph G
3. V(G) = P + 1, where P is the number of predicate nodes in the flow
graph G
 Results in the following equations for the example flow graph
1. Number of regions = 4
2. V(G) = 14 edges – 12 nodes + 2 = 4
3. V(G) = 3 predicate nodes + 1 = 4
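The two formula-based computations can be checked mechanically. The edge list below is transcribed from the example's basis paths (nodes 0–11):

```python
# Edges of the example flow graph: 14 edges over 12 nodes.
edges = [(0, 1), (1, 11), (1, 2), (2, 3), (3, 4), (4, 5), (5, 10),
         (3, 6), (6, 7), (6, 8), (7, 9), (8, 9), (9, 10), (10, 1)]

nodes = {n for edge in edges for n in edge}

# V(G) = E - N + 2
v_edges_nodes = len(edges) - len(nodes) + 2

# V(G) = P + 1, where a predicate node has more than one outgoing edge
out_degree = {}
for src, _ in edges:
    out_degree[src] = out_degree.get(src, 0) + 1
predicates = sum(1 for degree in out_degree.values() if degree > 1)
v_predicates = predicates + 1

# Both formulas agree: V(G) = 4, matching the 4 regions counted by hand.
```

The predicate nodes found this way are 1, 3, and 6, each with two outgoing edges, which is consistent with the "3 predicate nodes + 1 = 4" computation above.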
Independent Program Paths
 Defined as a path through the program from the start
node until the end node that introduces at least one
new set of processing statements or a new condition
(i.e., new nodes)
 Must move along at least one edge that has not been
traversed before by a previous path
 Basis set for flow graph on previous slide
◦ Path 1: 0-1-11
◦ Path 2: 0-1-2-3-4-5-10-1-11
◦ Path 3: 0-1-2-3-6-8-9-10-1-11
◦ Path 4: 0-1-2-3-6-7-9-10-1-11
 The number of paths in the basis set is determined
by the cyclomatic complexity
Generating test case
 Data should be chosen so that conditions at the
predicate nodes are appropriately set as each path is
tested.
 Each test case is executed and compared to
expected results.
 Once all test cases have been completed, the tester
can be sure that all statements in the program have
been executed at least once.
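As a sketch of this step (the unit under test is hypothetical): for each basis path, input data is chosen to force the predicate outcomes along that path, and the actual result is compared with the expected result.

```python
def classify(x: int) -> str:
    """Unit under test with one predicate node, hence V(G) = 2 basis paths."""
    if x < 0:                    # predicate node
        return "negative"
    return "non-negative"

# One test case per basis path: (input forcing the path, expected result)
test_cases = [(-1, "negative"),      # path taking the True branch
              (5, "non-negative")]   # path taking the False branch

results = [classify(given) == expected for given, expected in test_cases]
# When every case passes, each statement has executed at least once.
```

Cyclomatic complexity guarantees only an upper bound on the number of such cases; data values at the predicates still have to be chosen by the tester.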
Another example
[Figure: flow graph for a second basis-path exercise]