
SOFTWARE TESTING

(ITS 64704)
Fundamentals of Testing
(Part 2)
Topics

Terms and Motivation

Principles of Testing

Fundamental Test Process

Test Cases, Expected Results and Test Oracles

Psychology of Testing

Ethics of Testing
What is testing?

• Testing is a systematic process with the main aim of finding defects
• Defects found need to be fixed by developers
Testing Objectives

1. Find bugs and provide programmers with the information they need to fix important bugs
2. Gain confidence about the level of quality of the system
3. Prevent defects (through early involvement in reviews and
advanced test designs)
4. Provide information about the most important aspects of the
quality of the system under test
5. Help management understand system quality
SW Development Model:
Waterfall model (1970)

SW Development Model:
Waterfall model (1981)

Verification
• Examines the consistency of the products being developed with the products developed in previous phases

Validation
• The process of evaluating a system or component during or at the end of the development process to determine whether it satisfies specified requirements

V-Model (B. Boehm, 1979)
[Diagram: V-Model with "Verify" and "Validate" relationships between development phases and test levels]

General V-Model
Test Process

 Closely linked with software development


 It is, however, a separate, independent process
 A test plan is necessary for every test level (level test plan)
 Testing cannot be considered as a single activity
 Many individual tasks are performed within testing
 The test process represents these individual testing activities in a
coherent process
5 Stages of the Fundamental Test Process

1. Planning & Control
2. Analysis & Design
3. Implementation & Execution
4. Evaluating Exit Criteria and Reporting
5. Test Closure Activities

• Each phase may have to be executed a number of times in order to fulfill exit criteria
• Although logically sequential, the activities in the process may overlap or take place concurrently
Stage 1: Planning & Control

1. Specification of the test plan
• Determine test scope, risks, objectives, strategies
• Specify test approach (techniques, objects, coverage, teams, etc.)

2. Details of Test Plan
• Determine required test resources
• Implement test strategies
• Schedule test analysis & design
• Schedule implementation, execution and evaluation of tests
• Determine the test exit criteria (what level of coverage should be reached?)

* Test schedule and test strategy are to be recorded in the test plan
Stage 1: Planning & Control (continued)

3. Drawing up a rough schedule
• Schedule tasks for test analysis and test specification
• Schedule test implementation, test execution and test evaluation

4. Test Control
• Measure & analyze results
• Monitor and document progress, coverage and test exit criteria
• Initiate corrective actions
• Make decisions
Stage 2: Analysis & Design

Analysis
• Review the test basis (e.g. requirements or design specs, quality risks, n/w architecture, SRS, FRS, etc.)
• Identify and prioritize test conditions, test requirements, objectives and required test data based on analysis of the test items
• Evaluate feasibility of the requirements and system

Design
• Design and prioritize combinations of test data, actions and expected results
• Identify the test data needed for test conditions and cases
• Design the test environment
• Identify infrastructure, tools
Stage 3: Implementation & Execution

Implementation
• Develop, implement and prioritize test cases, create data, write procedures (steps)
• Create test scripts
• Organize test suites and sequences of test procedures
• Verify the test environment

Execution
• Execute test cases (manual or automated)
• Log test results, version of the software under test, test tools and the testware (test log)
• Compare actual and expected results
• Report and analyze incidents
• Repeat corrected and/or updated tests
• Run confirmation and/or regression tests

 Re-Testing: repeat of test activities as a result of action taken to resolve each incident/defect
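To make the execution activities concrete (execute a test case, compare the actual result with the expected result, and log the outcome together with the version of the software under test), here is a minimal sketch. The function under test (multiply), the log format and the version string are illustrative assumptions, not part of the course material.

```python
import datetime

def multiply(a, b):
    # Stand-in for the software under test.
    return a * b

def run_test_case(case_id, func, args, expected, version="1.0.0"):
    actual = func(*args)                      # execute the test case
    verdict = "PASS" if actual == expected else "FAIL"
    # Test log entry: timestamp, test case ID, version of the software under
    # test, expected vs. actual result, and the verdict.
    print(f"{datetime.datetime.now().isoformat()} | {case_id} | version={version} "
          f"| expected={expected} | actual={actual} | {verdict}")
    return verdict

run_test_case("TC-01", multiply, (3, 4), 12)   # passes
run_test_case("TC-02", multiply, (3, 0), 1)    # fails: an incident to report and analyze
```

A failing case like TC-02 would be reported as an incident and re-run (confirmation testing) once the underlying defect has been resolved.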
Stage 4 & 5

Evaluating Exit Criteria & Reporting
• Check test logs against the exit criteria in the test plan
• Assess if more tests are needed or if the exit criteria specified should be changed
• Write a test summary report for stakeholders

Test Closure Activities
• Confirm test deliverables, final resolutions or deferral of bug reports, and the acceptance of the system
• Finalize and archive the testware, test environment and test infrastructure
• Deliver testware to the maintenance organization
• Perform a review to capture improvements for future releases, projects and the test process
Test Reporting
Topics

Terms and Motivation

Principles of Testing

Fundamental Test Process

Test Cases, Expected Results and Test Oracles

Psychology of Testing

Ethics of Testing
Criteria for Test Cases

(Note: a test case is a set of documentation detailing the things a tester needs to know and do in order to carry out the specific testing of a particular function/module: preconditions (things required/needed beforehand), step-by-step actions with their expected results, and postconditions (the main things to be achieved by that particular set of testing), so that testers can carry out testing efficiently.)

Test cases for verification of the specified results and reactions delivered by the test object:
• Positive test; expected input (happy path)

Test cases that verify the specified handling of exceptional situations and defects:
• Negative test; expected false input (e.g. exception handling cases)

Test cases for verification of the reactions of the test object to invalid and unexpected inputs or constraints:
• Negative test; unexpected erroneous input (invalid input; expect it to be recognized as invalid)
Eg: a module registration system where no exception handling is specified and the expectation is simply that a module is registered successfully in the system. If the module Software Testing is registered with credit hours 10 instead of 4, module registration is not completed successfully: whether the test is positive or negative and the input valid or invalid, the outcome is not the expected one (unexpected).
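As a rough illustration of the positive and negative cases above, the sketch below models the module registration example. The register_module function, the set of valid credit hours and the messages are assumptions made for this sketch; only the scenario (credit hours 10 instead of 4 leading to an unsuccessful registration) comes from the slide.

```python
# Hypothetical module registration used only to illustrate positive vs. negative tests.
VALID_CREDIT_HOURS = {2, 3, 4}   # assumed valid values, not taken from the slides

def register_module(name: str, credit_hours: int) -> str:
    if credit_hours not in VALID_CREDIT_HOURS:
        # Exception handling for erroneous input.
        raise ValueError("credit hours out of range")
    return f"{name} registered successfully"

# Positive test: expected (valid) input, happy path.
assert register_module("Software Testing", 4) == "Software Testing registered successfully"

# Negative test: erroneous input (credit hours 10 instead of 4); registration must not complete.
try:
    register_module("Software Testing", 10)
    raise AssertionError("invalid credit hours were accepted")
except ValueError:
    pass  # rejected as expected
```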
Test Specification:
High level and specific test cases

Example:
 A company has ordered a program that should calculate the Christmas bonus of the staff in relation to the time they have been working for the company. In the description of the requirement you find the following text:

• Staff who have been with the company for more than 3 years will receive 50% of the monthly salary as Christmas bonus.
• Staff who have been with the company for more than 5 years will receive 75%.
• Staff who have been with the company for more than 8 years will receive 100% of their monthly salary.

What do the test cases look like?


Test Specification:
High level and specific test cases

 You can set up the following relationship between the bonus allowance and the time working for the company:
• Staff who have been with the company for more than 3 years will receive 50% of the monthly salary as Christmas bonus.
• Staff who have been with the company for more than 5 years will receive 75%.
• Staff who have been with the company for more than 8 years will receive 100% of their monthly salary.

Years with the company               Bonus percentage
years with the company <= 3          bonus = 0%
3 < years with the company <= 5      bonus = 50%
5 < years with the company <= 8      bonus = 75%
years with the company > 8           bonus = 100%
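A direct translation of this relationship into code could look like the sketch below. Python is used only for illustration; the function name calculate_bonus and its signature are assumptions, not part of the requirement text.

```python
def calculate_bonus(years_with_company: int) -> int:
    """Return the Christmas bonus as a percentage of the monthly salary,
    following the relationship table above (assumed helper, for illustration)."""
    if years_with_company > 8:
        return 100
    if years_with_company > 5:
        return 75
    if years_with_company > 3:
        return 50
    return 0
```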


Test Specification:
High level and specific test cases (positive tests)

High level (logical) test cases give the conditions and indicate the range of input values; specific test cases give the exact input/data to test. (Exact input values and categories/boundaries will be learned in the EP and BVA topic.)

High level (logical) Test Case             1          2            3            4
Input value x (years with the company)     x <= 3     3 < x <= 5   5 < x <= 8   x > 8
Expected result (bonus in %)               0          50           75           100

Specific Test Case                         1          2            3            4
Input value x (years with the company)     2          4            7            12
Expected result (bonus in %)               0          50           75           100

Remarks:
• No pre- and post-conditions or constraints are considered
• The test cases were not derived systematically
• Only positive tests with expected results
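The four specific positive test cases can be run directly against such an implementation. The sketch below reuses the calculate_bonus function assumed in the earlier sketch; the expected results, determined in advance from the requirement, play the role of the test oracle.

```python
# Specific positive test cases from the table: (input years, expected bonus in %).
positive_cases = [(2, 0), (4, 50), (7, 75), (12, 100)]

for years, expected in positive_cases:
    actual = calculate_bonus(years)   # execute the test object
    assert actual == expected, f"years={years}: expected {expected}, got {actual}"
print("All positive test cases passed")
```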
Test Specification:
High level and specific test cases (negative tests)

High level (logical) Test Case             1          2            3            4
Input value x (years with the company)     x <= 3     3 < x <= 5   5 < x <= 8   x > 8
Expected result (bonus in %)               Invalid input. Bonus can't be calculated (for all four cases)

Specific Test Case                         1          2            3            4
Input value x (years with the company)     4          7            10           2
Expected result (bonus in %)               Invalid input. Bonus can't be calculated (for all four cases)

(Exact input values and categories/boundaries will be learned in the EP and BVA topic.)

Remarks:
• No pre- and post-conditions or constraints are considered
• The test cases were not derived systematically
• Negative tests with expected results
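For the negative tests the expected result is "Invalid input. Bonus can't be calculated". The sketch below assumes, purely for illustration, that the program validates its input and that negative years or non-numeric values count as invalid; the calculate_bonus_checked wrapper and these particular invalid values are assumptions, not taken from the slides.

```python
def calculate_bonus_checked(years_with_company) -> int:
    # Assumed validation rule: years must be a non-negative integer.
    if not isinstance(years_with_company, int) or years_with_company < 0:
        raise ValueError("Invalid input. Bonus can't be calculated")
    return calculate_bonus(years_with_company)

# Assumed invalid inputs for illustration: negative years and a non-numeric value.
for bad_input in (-1, "ten"):
    try:
        calculate_bonus_checked(bad_input)
        print(f"{bad_input!r}: accepted -> negative test FAILED")
    except ValueError as err:
        print(f"{bad_input!r}: rejected ({err}) -> negative test passed")
```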
Expected Results and Test Oracle

 After each executed test case, it must be decided whether there is a failure or
not
 Compare the actual result vs expected result
 The expected behavior of the test object has to be determined in advance for
each test case
 The tester must obtain this information from appropriate sources when
specifying a test case
Topics

Terms and Motivation

Principles of Testing

Fundamental Test Process

Test Cases, Expected Results and Test Oracles

Psychology of Testing

Ethics of Testing
Good tester attributes

• Curiosity
• Professional pessimism
• A critical eye
• Attention to detail
• Good communication skills (speaker and listener)
Defining Tester Skills

Reading
• Specifications, e-mails, test cases, etc

Writing
• Test cases, bug reports, test documentation, etc

Technology, project, testing skills


• Technology: programming languages, OS, networking,
HTML/web, etc
• Application domain: banking, human factors, office applications,
etc
• Testing: scripting, exploring and attacking the system,
automation, etc
Differing Mindset

• Testing and reviewing are different activities from developing, so the mindsets are different too
• Separation of duties helps focus testing (developers not testing their own code)
• Professional testers are more effective at testing activities (especially at finding failures), as they are more objective and less biased
Degrees of Independence

• Though developers can test their own code, independent testers are typically more effective at finding failures
• Levels of test independence:
 Developer testing their own code (low level of independence)
 Testing done by another team member
 Testing done by a member of a different team / a test specialist
 Testing done by a person from a different organization or company (high level of independence)
Clear Objectives

• Testing should have clear objectives
• E.g. should testing find defects or confirm that the software works?
• If it is to find defects, what percentage?
• If it is to build confidence, to what level?
• A test policy can help to clearly state the objectives of testing
Bad news or Bad Guy?

 Testers are sometimes on the receiving end of emotions brought on by news of project problems
 Communication skills can help:
 Collaborate for better quality
 Communicate neutrally, about facts, without criticism
 Understand your colleagues and how they'll react to your findings
 Confirm your colleagues understood what you said
 Confirm you understand your colleagues


Topics

Terms and Motivation

Principles of Testing

Fundamental Test Process

Test Cases, Expected Results and Test Oracles

Psychology of Testing

Ethics of Testing
Tester Ethics

PUBLIC | CLIENT & EMPLOYER | PRODUCT | JUDGEMENT
MANAGEMENT | PROFESSION | COLLEAGUES | SELF
Tester Ethics

PUBLIC
• Certified software testers shall act consistently with the public interest

CLIENT & EMPLOYER

• Certified software testers shall act in a manner that is in the best interests of their client and employer, consistent with the public interest

PRODUCT
• Certified software testers shall ensure that the deliverables they provide
(on the products and systems they test) meet the highest professional
standards possible.

JUDGEMENT
• Certified software testers shall maintain integrity and independence in
their professional judgment
Tester Ethics

MANAGEMENT
• Certified software test managers and leaders shall subscribe to and
promote an ethical approach to the management of software testing.

PROFESSION
• Certified software testers shall advance the integrity and reputation of
the profession consistent with the public interest

COLLEAGUES
• Certified software testers shall be fair to and supportive of their
colleagues, and promote cooperation with software developers.

SELF
• Certified software testers shall participate in lifelong learning regarding the practice of their profession and shall promote an ethical approach to the practice of the profession.
Effective vs Efficient Testing

Effective testing
• Use test design techniques to write tests that find more defects

Efficient testing
• Find the defects with the least cost, time and resources
THANK YOU
