Mastering Software Testing: Interview Preparation Notes

1) Software Testing Life Cycle (STLC) with Documents for Entry and Exit Criteria
STLC consists of multiple phases where each phase has defined entry and exit criteria.

STLC Phases with Entry and Exit Criteria


| STLC Phase | Entry Criteria | Exit Criteria |
|---|---|---|
| Requirement Analysis | BRD (Business Requirement Document), SRS (Software Requirement Specification) | RTM (Requirement Traceability Matrix), Feasibility Report |
| Test Planning | TRD (Test Requirement Document) | Test Plan, Test Strategy Document |
| Test Case Development | Test Plan Approval | Test Cases, Test Data |
| Test Environment Setup | Test Data, Environment Readiness Checklist | Test Environment is Ready |
| Test Execution | Test Cases, Test Environment Setup Completion | Defect Logs, Test Execution Reports |
| Test Closure | Completion of Test Execution | Test Closure Report, Lessons Learned Document |

Example:
If the Test Execution phase is starting, its entry criteria would be that the environment is set
up, test cases are ready, and testers are prepared. The exit criteria would be that all test cases
have been executed, defects are logged, and execution reports are available.

2) Pesticide Paradox
The Pesticide Paradox states that running the same set of test cases repeatedly will not find
new defects over time.
Solution:
• Testers should review and update test cases periodically.
• Apply different testing techniques like exploratory testing or error guessing to uncover
hidden defects.
Example:
If a login page is tested repeatedly with the same username and password combinations,
undiscovered edge cases (like special characters in passwords or session timeouts) might
never be tested.

3) Types of Testing
I. Ad-hoc Testing
Informal testing without any documentation.
Performed when time is limited and there are no structured test cases.
Example: A tester randomly tries invalid data in a payment form to see how the
system behaves.

II. Monkey Testing
Random inputs are given to the system to check for crashes.
Example: Clicking buttons rapidly on a UI to see if it freezes.

III. Gorilla Testing
One module is tested rigorously with many test cases.
Example: Testing only the checkout process of an e-commerce site with various
payment methods.

IV. Exploratory Testing
Testing where testers learn the application while testing.
Example: A tester navigates a new mobile app and discovers UI glitches without
predefined test cases.

V. Smoke Testing
Basic testing to ensure critical functionalities work.
Usually performed on new builds.
Example: Checking if the application launches without errors after deployment.

VI. Sanity Testing
Quick regression test to verify bug fixes without testing the entire application.
Example: If a defect in the login module is fixed, only that module is tested.

4) Dynamic vs Static Testing & Verification vs Validation


I. Static Testing (Verification)
Performed without executing the code.
Involves reviews, walkthroughs, and inspections.
Example: Reviewing the code for security vulnerabilities before deployment.

II. Dynamic Testing (Validation)
Requires executing the software to check functionality.
Example: Running test cases on a login page to ensure credentials work correctly.

5) Bug vs Error vs Defect vs Failure
| Term | Definition | Example |
|---|---|---|
| Error | Mistake made by a developer | Miscalculating tax in an e-commerce app |
| Defect (Bug) | Issue found during testing before release | Login button not working in a test environment |
| Failure | Issue found after release in production | Payment gateway crashes for users |

6) Defect Report
A defect report should include:
• Defect ID
• Summary
• Steps to Reproduce
• Expected Result
• Actual Result
• Severity & Priority
• Attachments (screenshots/logs)
Example: If clicking the "Add to Cart" button does nothing, the defect report should specify
steps, expected behaviour, and actual result.
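
As a rough illustration, the same report can be captured as a structured record. A minimal Python sketch (the field names and IDs are illustrative, not a mandated template):

```python
from dataclasses import dataclass, field

@dataclass
class DefectReport:
    defect_id: str
    summary: str
    steps_to_reproduce: list[str]
    expected_result: str
    actual_result: str
    severity: str                 # e.g., "High", "Medium", "Low"
    priority: str                 # e.g., "P1", "P2", "P3"
    attachments: list[str] = field(default_factory=list)  # screenshots/logs

report = DefectReport(
    defect_id="DEF-101",
    summary='"Add to Cart" button does nothing',
    steps_to_reproduce=["Open any product page",
                        'Click "Add to Cart"',
                        "Observe that the cart count does not change"],
    expected_result="Item is added to the cart and the count increments",
    actual_result="Nothing happens and no error is shown",
    severity="High",
    priority="P1",
    attachments=["add_to_cart_console.log"],
)
```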

7) Test Plan
A Test Plan is a document that outlines:
• Scope of testing
• Objectives
• Testing types
• Environment
• Deliverables
Example:
• Scope: Testing a banking app's transaction module.
• Test Strategy: Functional, API, and security testing.
• Test Deliverables: Test Cases, Defect Reports, Execution Summary.

8) Requirement Traceability Matrix (RTM)
Ensures every requirement is tested and linked to test cases.
| Type | Purpose | Example |
|---|---|---|
| Forward | Req → Test Case | If Req#1 says "Login via OTP," Test#10 must check OTP login. |
| Backward | Test Case → Req | If Test#20 checks "Invalid login," ensure it maps to a login security requirement. |
| Bi-Directional | Both ways | Ensures nothing is missed in testing. |
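
As a rough sketch of how bi-directional tracing can be checked mechanically (the requirement and test IDs below are hypothetical):

```python
# Hypothetical requirement/test IDs; real ones would come from the RTM.
req_to_tests = {
    "Req#1 Login via OTP": ["Test#10"],
    "Req#2 Password reset": [],            # forward gap: untested requirement
}
test_to_reqs = {
    "Test#10": ["Req#1 Login via OTP"],
    "Test#20 Invalid login": [],           # backward gap: orphan test case
}

# Forward traceability: every requirement has at least one test case.
untested = [req for req, tests in req_to_tests.items() if not tests]
# Backward traceability: every test case maps back to a requirement.
orphans = [test for test, reqs in test_to_reqs.items() if not reqs]

print("Untested requirements:", untested)  # ['Req#2 Password reset']
print("Orphan test cases:", orphans)       # ['Test#20 Invalid login']
```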

9) Interface Testing
• Ensures correct communication between Client, Server, and Database.
Example:
In a banking app, checking if a money transfer request correctly updates the database
and displays the correct balance.
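
A hedged sketch of such a check in Python, assuming a hypothetical REST API with /transfer and /balance endpoints (the URL and JSON fields are illustrative, not a real banking API):

```python
import requests

BASE = "https://bank.example.com/api"  # hypothetical base URL

def get_balance(account: str) -> float:
    # Server reads the balance from the database and returns it as JSON.
    resp = requests.get(f"{BASE}/balance", params={"account": account})
    return resp.json()["amount"]

before = get_balance("ACC-1")

# Client -> Server: submit the transfer request.
resp = requests.post(f"{BASE}/transfer",
                     json={"from": "ACC-1", "to": "ACC-2", "amount": 100})
assert resp.status_code == 200, "transfer request should be accepted"

# Server -> Database -> Client: the displayed balance must reflect the update.
assert get_balance("ACC-1") == before - 100, "balance not updated correctly"
```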

10) Integration Testing & Types


Big Bang (Non-Incremental):
• All modules integrated at once and tested together.
Example: Testing a fully developed banking app only after all features are ready.
Incremental:
• Modules tested step by step.
Top-Down: Start with main module → lower modules.
Example: Test Dashboard first, then Transaction History, then Payments.
Bottom-Up: Start with lower modules → main module.
Example: Test Database first, then APIs, then UI.
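
In incremental integration, modules that are not ready yet are replaced by stand-ins: stubs for missing lower modules (top-down) and drivers for missing upper modules (bottom-up). A minimal top-down sketch with hypothetical module names:

```python
class PaymentServiceStub:
    """Stub standing in for the lower-level Payments module (not built yet)."""
    def charge(self, amount: float) -> bool:
        return True  # canned response so the upper module can be tested now

class Dashboard:
    """Upper-level module under test; depends on a payment service."""
    def __init__(self, payments):
        self.payments = payments

    def pay_bill(self, amount: float) -> str:
        return "paid" if self.payments.charge(amount) else "failed"

# Top-down: test the Dashboard first, wiring the stub in for lower modules.
assert Dashboard(PaymentServiceStub()).pay_bill(50.0) == "paid"
```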

11) User Acceptance Testing (UAT) & Handoff


Final testing before release by end-users.
QA Team provides:
• Test Cases, Test Data, Setup Guide.
UAT Team checks:
• Real-world usage & business scenarios.
Example: A hospital system is tested by doctors to ensure:
• Patient booking works.
• Records can be accessed easily.

12) Priority vs Severity with Examples

| Case | Severity | Priority | Example |
|---|---|---|---|
| 1. Sev High, Pri High | Major functionality is broken | Fix immediately | Payment system fails – users can’t buy. |
| 2. Sev Low, Pri High | Minor issue but business-critical | Fix soon | Logo misalignment on homepage – branding issue. |
| 3. Sev High, Pri Low | Major issue but rare occurrence | Can wait | Crash when entering 50-character password – affects few users. |
| 4. Sev Low, Pri Low | Minor issue, not urgent | Can be fixed later | Small typo in Help section. |

13) Regression Testing


Regression Testing ensures that new code changes do not break existing functionalities. It is
performed after bug fixes, enhancements, or system updates to confirm that everything still
works as expected.
Example:
• A developer fixes a login issue in an e-commerce app.
• After the fix, Regression Testing is done to ensure:
o Login works correctly.
o The checkout process, product search, and payment flow still function as
expected.
Manual Regression: Used when the UI changes frequently.
Automated Regression: Used for stable features (e.g., running automated tests on login and checkout).
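
A minimal sketch of an automated regression suite, assuming pytest and hypothetical login/checkout stand-ins; tagging tests with a custom marker lets the team rerun only the regression set after each fix:

```python
import pytest

# Hypothetical stand-ins for the real application code under test.
def login(email: str, password: str) -> bool:
    return bool(email and password)

def checkout(cart_id: str) -> str:
    return "order_confirmed"

@pytest.mark.regression
def test_login_still_works():
    assert login("user@example.com", "secret")

@pytest.mark.regression
def test_checkout_still_works():
    assert checkout(cart_id="CART-1") == "order_confirmed"
```

Running `pytest -m regression` then executes only the marked tests (the custom marker is usually registered in pytest.ini to avoid warnings).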

14) Testing Techniques

Error Guessing:
• Based on tester experience.
Example:
• Entering special characters in a name field to check for errors.

Boundary Value Analysis (BVA):
• Tests values at and just beyond the boundaries (min-1, min, max, max+1).

Example: If an age field accepts 18-60, test 17, 18, 60, and 61.
Equivalence Class Partitioning (ECP):
• Divides inputs into valid and invalid groups.
Example:
• If password length is 8-12, test 7 (invalid), a representative valid value such as 10, and 13 (invalid).
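
Both techniques map naturally onto data-driven tests. A sketch assuming pytest, with hypothetical is_valid_age and is_valid_password functions implementing the rules above:

```python
import pytest

def is_valid_age(age: int) -> bool:
    return 18 <= age <= 60  # hypothetical rule from the BVA example

def is_valid_password(pw: str) -> bool:
    return 8 <= len(pw) <= 12  # hypothetical rule from the ECP example

# BVA: values just outside and exactly on each boundary.
@pytest.mark.parametrize("age,expected",
                         [(17, False), (18, True), (60, True), (61, False)])
def test_age_boundaries(age, expected):
    assert is_valid_age(age) == expected

# ECP: one representative value from each partition.
@pytest.mark.parametrize("pw,expected",
                         [("a" * 7, False), ("a" * 10, True), ("a" * 13, False)])
def test_password_partitions(pw, expected):
    assert is_valid_password(pw) == expected
```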

Decision Table Testing:

A technique that tests combinations of input conditions against their expected outcomes, ensuring coverage of the business rules.
Example: Loan Approval System

| Credit Score | Annual Income | Loan Approved? |
|---|---|---|
| Good (700+) | High (>50K) | Yes |
| Good (700+) | Low (<50K) | No |
| Poor (<700) | High (>50K) | No |
| Poor (<700) | Low (<50K) | No |
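
A decision table maps almost one-to-one onto a parametrized test; a sketch assuming pytest and a hypothetical approve_loan function encoding the rules above:

```python
import pytest

def approve_loan(credit_score: int, annual_income: int) -> bool:
    # Hypothetical implementation of the business rule above.
    return credit_score >= 700 and annual_income > 50_000

# One parametrize row per decision-table rule.
@pytest.mark.parametrize("score,income,approved", [
    (720, 60_000, True),   # Good credit, high income -> Yes
    (720, 40_000, False),  # Good credit, low income  -> No
    (650, 60_000, False),  # Poor credit, high income -> No
    (650, 40_000, False),  # Poor credit, low income  -> No
])
def test_loan_decision_table(score, income, approved):
    assert approve_loan(score, income) == approved
```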

15) Risk Mitigation in Testing


• Identify risks: Poor requirements, tight deadlines.
• Mitigation Strategies:
o Prioritize critical tests.
o Automate regression.
o Communicate risks early.
Example:
• If a team has less time for testing, prioritizing critical test cases helps ensure major
functionalities work.

16) Authentication vs Authorization


Authentication:
• Verifies who you are.
• Done via username/password, OTP, fingerprint, etc.

Authorization:
• Verifies what you can access after authentication.
• Grants permissions based on user roles.
Example:

• Authentication: Entering a correct password to log in.
• Authorization: An admin can access user data, but a regular user cannot.
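
A minimal sketch of the two checks in code (the user store, passwords, and roles are hypothetical; a real system would store hashed passwords):

```python
USERS = {"alice": {"password": "s3cret", "role": "admin"},
         "bob":   {"password": "hunter2", "role": "user"}}

PERMISSIONS = {"admin": {"view_user_data", "view_own_data"},
               "user":  {"view_own_data"}}

def authenticate(username: str, password: str) -> bool:
    """Authentication: verifies who you are."""
    user = USERS.get(username)
    return user is not None and user["password"] == password

def authorize(username: str, action: str) -> bool:
    """Authorization: verifies what you may access after logging in."""
    return action in PERMISSIONS[USERS[username]["role"]]

assert authenticate("alice", "s3cret")         # correct password -> logged in
assert authorize("alice", "view_user_data")    # admin can access user data
assert not authorize("bob", "view_user_data")  # regular user cannot
```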

17) Defect Cycle - Deferred State


Deferred State:
• When a defect is not fixed immediately due to low priority, dependency, or future
releases.
Example:
• A minor UI misalignment in a mobile app may be deferred for a future update.

18) Hotfix vs Patches


Hotfix:
• Immediate fix for a critical production issue.
Example: Fixing a login failure immediately after deployment.
Patch:
• A planned fix for multiple issues, usually included in the next release.
Example: A monthly update fixing bugs and performance issues in a mobile app.

19) Defect Age:


• Defect Age measures the time between a defect's detection and its resolution.
Example:
• A defect is found on March 1st and fixed on March 5th.
• Defect Age = 4 days.
Monitoring defect age helps assess the efficiency of the defect management process.
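
The calculation is a simple date difference; in Python (dates from the example, year chosen arbitrarily):

```python
from datetime import date

detected = date(2025, 3, 1)  # defect found on March 1st
resolved = date(2025, 3, 5)  # defect fixed on March 5th

defect_age = (resolved - detected).days
print(defect_age)  # 4 days
```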

20) Test Closure Report


• A document summarizing testing activities, results, and learnings after testing is
completed.
Includes:
• Test summary (cases executed, passed, failed).
• Defect summary (open/closed defects).
• Lessons learned.
Example:
• After testing an e-commerce app, the Test Closure Report states:
o 95% test cases passed.
o 5 major defects reported & fixed.

21) Error Seeding
• Intentionally adding defects to check if the testing process detects them.
Example:
• A known bug is inserted into the login module to verify if testers find it.
• If testers miss seeded defects, testing effectiveness is low.

22) Showstopper Defect - Suspension Criteria


Showstopper Defect:
• A critical bug that blocks further testing or usage.
Example:
• App crashes immediately after login, making further testing impossible.

Suspension Criteria:
• Testing is paused when major issues prevent meaningful progress.
Example:
• If the database is down, testing is suspended until it's fixed.

24) TDD, BDD, and ATDD


TDD (Test-Driven Development):
• Write tests first, then develop code.
Example: A developer writes a failing unit test before writing the actual login function.
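
A sketch of that red-green rhythm, assuming pytest and a hypothetical login function:

```python
# Step 1 (red): the tests are written first and fail, because login()
# does not exist yet (or returns the wrong result).
def test_valid_credentials():
    assert login("alice", "s3cret") is True

def test_invalid_credentials():
    assert login("alice", "wrong") is False

# Step 2 (green): write just enough code to make the tests pass,
# then refactor while keeping them green.
def login(username: str, password: str) -> bool:
    return username == "alice" and password == "s3cret"
```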

BDD (Behavior-Driven Development):


• Uses natural language (Gherkin: Given, When, Then) to define tests.
Example:
• Given the user is on the login page
• When they enter valid credentials
• Then they should be redirected to the dashboard

ATDD (Acceptance Test-Driven Development):


• Tests are written from the user’s perspective before development starts.
Example:
• A tester, developer, and business analyst define an order placement scenario before
coding.

25) Types of Defects
Latent Defect:
• A hidden defect that remains undetected for a long time.
Example:
• A tax calculation issue only noticed when a rare tax rule applies.

Masked Defect:
• A defect hidden by another defect.
Example:
• A UI crash prevents testers from noticing a broken search function.

Defect Clustering:
• A small part of the application has most of the defects.
Example:
• 80% of bugs in an e-commerce app occur in the checkout module.

Defect Cascading:
• One defect triggers other defects.
Example:
• A wrong database update → causes incorrect billing → leads to failed transactions.

26) Defect Rejection Ratio & Defect Leakage Ratio


Defect Rejection Ratio:
• Measures the percentage of invalid defects rejected by the development team.
• Formula:
(Rejected Defects / Total Reported Defects) × 100
Example:
• 5 defects out of 100 are rejected → Rejection Ratio = (5/100) × 100 = 5%

Defect Leakage Ratio:


• Measures defects missed during testing but found in production.
• Formula:
(Defects Found Post-Release / Total Defects Found) × 100
Example:
• 10 defects found post-release, 90 found in testing → Leakage Ratio = (10/100) × 100 =
10%
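
Both formulas are simple percentages; a small sketch reproducing the examples above:

```python
def defect_rejection_ratio(rejected: int, total_reported: int) -> float:
    return rejected / total_reported * 100

def defect_leakage_ratio(post_release: int, total_found: int) -> float:
    return post_release / total_found * 100

print(defect_rejection_ratio(5, 100))  # 5.0  -> 5% of reported defects were invalid
print(defect_leakage_ratio(10, 100))   # 10.0 -> 10% of defects escaped to production
```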

27) Bug Release & Bug Leakage
Bug Release:
• Intentional release of software with known, low-priority bugs that don’t impact
functionality.
Example:
• A small UI misalignment in an app that doesn’t affect core operations.

Bug Leakage:
• Unintentional defects that escape testing and are found by users in production.
Example:
• A critical checkout error in an e-commerce app discovered after release.

28) A/B Testing & Concurrency Testing


A/B Testing:
• Compares two versions of a feature to see which performs better.
Example:
• Login Button A (blue) vs. Login Button B (green) to check which gets more clicks.

Concurrency Testing:
• Evaluates system performance under multiple users accessing it simultaneously.
Example:
• 100 users booking tickets at the same time on a movie ticketing app.
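
A minimal concurrency sketch using Python threads, with a hypothetical shared seat counter (dedicated tools such as JMeter or Locust do this at realistic scale):

```python
import threading

seats = 100                # shared state: tickets available
lock = threading.Lock()    # guards against lost updates under concurrency

def book_ticket():
    global seats
    with lock:
        if seats > 0:
            seats -= 1

# Simulate 100 users booking at the same time.
threads = [threading.Thread(target=book_ticket) for _ in range(100)]
for t in threads:
    t.start()
for t in threads:
    t.join()

assert seats == 0, f"expected 0 seats left, found {seats}"
print("All 100 concurrent bookings processed correctly")
```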

29) Handling High-Priority Releases in a Short Time


Steps to Handle:
• Prioritize Critical Features: Test only high-risk areas.
• Parallel Testing: Distribute testing across multiple testers.
• Automate Regression Tests: Use automation for stable features.
• Frequent Builds & Quick Fixes: CI/CD ensures fast feedback.
• Risk-Based Testing: Test components that impact business the most.
Example:
• A banking app security update must be tested quickly. Focus on login, payments, and
fund transfers instead of minor UI fixes.

30) Handling a Project Without Documentation
Approach:
• Communicate with Developers & Stakeholders: Understand requirements.
• Explore the Application: Use exploratory testing to identify features.
• Reverse Engineering: Analyze database, APIs, and logs for system behavior.
• Document Key Findings: Create your own test cases & workflows.
Example:
• Testing a legacy billing system with no documentation → Analyze past invoices & user
actions to create test scenarios.

31) Approaching Test Planning for a Defect


Steps:
1. Analyze the Defect: Check logs, screenshots, and steps to reproduce.
2. Impact Assessment: Identify which modules are affected.
3. Define Test Cases: Cover main functionality & related areas.
4. Regression Testing: Ensure fixes didn’t break existing features.
5. Prioritize Tests: Execute high-impact scenarios first.
Example:
• A bug in a payment gateway → Test payment, refunds, and failed transactions to ensure
complete coverage.
