Q.1 Any 4
1. Write test case specifications.
A test case specification is a detailed document that outlines the specific conditions, inputs,
steps, and expected results needed to verify whether a particular functionality or feature of a
system works as intended. It includes:
1. The purpose of the test.
2. Items being tested, along with their version/release numbers as appropriate.
3. Environment that needs to be set up for running the test case.
4. Input data to be used for the test case.
5. Steps to be followed to execute the test.
6. The expected results that are considered to be the “correct” results.
7. A step to compare the actual results produced with the expected results.
8. Any relationship between this test and other tests.
5. State the need for test deliverables and a test plan in test planning.
● Need for Test Deliverables in Test Planning:
1. Provides documentation of testing activities and results.
2. Enables tracking of testing progress and coverage.
3. Ensures quality control and accountability.
4. Facilitates communication among stakeholders.
5. Serves as evidence of completed testing for audit and compliance.
Q2 Any 3
OR
(the introduction remains the same as before)
Activities performed in Test People Management
1. Team Composition: Assemble a diverse team with complementary skills and experiences.
Consider factors like technical expertise, domain knowledge, and testing methodologies.
2. Clear Objectives: Communicate the objectives of the testing phase clearly to your team.
Ensure everyone understands the goals, scope, and expected outcomes of the testing effort.
3. Assigning Roles and Responsibilities: Clearly define roles and responsibilities within the
testing team. Assign tasks based on individual strengths and expertise, while also providing
opportunities for skill development.
4. Setting Expectations: Establish clear expectations regarding timelines, quality standards, and
reporting mechanisms. Ensure everyone understands their individual and collective
responsibilities.
5. Effective Communication: Foster open and transparent communication within the team.
Encourage regular updates, discussions, and feedback sessions to address any issues or
challenges promptly.
6. Risk Management: Identify potential risks and challenges early in the planning phase. Work
with your team to develop mitigation strategies and contingency plans to address any
unforeseen issues during testing.
2. Prepare a defect report after executing test cases for withdrawal of an amount from an
ATM machine.
(follow the same defect report format as given earlier; the attributes remain the same, only the values change)
changes:
project name: ATM Simulator
module: withdrawal
title: ATM cash Withdrawal Defect
description: no option to withdraw an amount in excess of 3000.
resolution comment: only a limited set of preset amount options was offered for cash withdrawal;
an option to enter a custom withdrawal amount was added, fixing the defect.
retest comment: successful withdrawal of an amount in excess of 3000.
3. Calculate effort variance and schedule variance if actual effort = 110, planned effort
= 100, actual calendar days = 310, planned calendar days = 300.
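Assuming the commonly used percentage form of these metrics (variance taken relative to the
planned value; it can also be reported as the absolute difference, i.e. 10 units of effort and
10 calendar days), the values work out as follows:
Effort Variance = ((Actual Effort - Planned Effort) / Planned Effort) × 100
= ((110 - 100) / 100) × 100 = 10%
Schedule Variance = ((Actual Calendar Days - Planned Calendar Days) / Planned Calendar Days) × 100
= ((310 - 300) / 300) × 100 ≈ 3.33%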
1. Defect Prevention:
Using techniques, methods, and standard processes to reduce the chance of defects.
2. Deliverable Baseline:
Setting milestones where deliverables are marked as complete and ready for the next stage.
Changes are controlled after this point, and errors are only considered defects after the baseline
is set.
3. Defect Discovery:
Finding and reporting defects for the development team to acknowledge. A defect is considered
discovered only when documented and confirmed by the responsible team.
4. Defect Resolution:
The development team prioritizes, schedules, fixes defects, and documents the fixes. The tester
is informed to verify the resolution.
5. Process Improvement:
Defects highlight issues in the development process. Fixing these processes leads to better
products with fewer defects.
6. Management Reporting:
Analyzing and reporting defect data helps management with risk management, process
improvements, and project oversight.
1. Introduction
○ Brief overview of the project and goals of the test plan.
○ Example: "This test plan is for a banking mobile application. The purpose is to
verify functionality, security, and performance before the production launch."
2. Scope of Testing
○ Defines the features to be tested and what is out of scope.
○ Example: “Testing will cover login, account balance check, fund transfer, and
transaction history. Features like bill payments and account settings are out of
scope.”
3. Test Objectives
○ Clearly state what the testing aims to accomplish.
○ Example: “Ensure all critical functionalities operate as expected, identify and fix
critical bugs, and ensure compliance with security standards.”
4. Test Strategy
○ Specifies the approach for testing (e.g., functional, performance, regression).
○ Example: “Functional testing will validate core features; regression testing will
ensure new changes do not disrupt existing features.”
5. Resources and Roles
○ Defines team members, their roles, and the required tools or environments.
○ Example: “The team will consist of a test manager, two functional testers, and one
performance tester. Tools required include JIRA for defect tracking and Selenium
for automation.” (A minimal Selenium sketch follows this answer.)
6. Test Schedule and Milestones
○ Details the timeline, including key milestones and deadlines.
○ Example: “Test execution to start on 1st Nov, regression testing to begin on 15th
Nov, and final testing report to be completed by 25th Nov.”
7. Risk Management
○ Lists potential risks (e.g., delays in development) and plans to mitigate them.
○ Example: “Risk of feature delays due to development challenges will be mitigated
by having a buffer week for testing.”
8. Entry and Exit Criteria
○ Establishes conditions to begin (entry) and conclude (exit) testing.
○ Example: “Testing begins when the development team delivers a stable build.
Testing concludes when all critical defects are resolved and pass rate is 95%.”
9. Deliverables
○ Specifies expected outputs like test cases, defect reports, and test summary
reports.
Example: For a banking app, the plan might specify critical test cases for transaction accuracy,
security, and compatibility with different devices.
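As a minimal illustration of the Selenium automation mentioned in the Resources and Roles item
above (this sketch is an assumption, not part of the original answer), an automated regression
check for the login feature could look like the following Python script. The URL, element IDs,
and credentials are hypothetical placeholders.

# Minimal sketch (hypothetical): automated login regression check with Selenium WebDriver.
# The URL, element IDs, and credentials below are illustrative placeholders only.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
try:
    driver.get("https://staging.example-bank.test/login")              # hypothetical staging URL
    driver.find_element(By.ID, "username").send_keys("test_user")      # hypothetical field id
    driver.find_element(By.ID, "password").send_keys("Secret@123")     # hypothetical test credential
    driver.find_element(By.ID, "login-button").click()                 # hypothetical button id
    # Expected result: a valid login lands on the account dashboard
    assert "Dashboard" in driver.title
finally:
    driver.quit()

Such a script would typically be run against the stable build delivered by development (see the
Entry and Exit Criteria item) and its results logged as part of the regression suite.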
1. Introduction
● This section provides a high-level overview of the project and the purpose
of the test plan.
● Example: "This test plan is for an online shopping application. The primary
goal is to validate the functionality, security, and performance of core
features such as product search, cart, checkout, and payment before
launch."
2. Test Items
● Lists specific components or modules of the application to be tested.
● Example: "Test items include modules for user authentication, product
search, shopping cart management, order placement, and payment
processing."
3. Features to be Tested
● Specifies the features or functionalities included in the testing scope
based on project requirements.
● Example: "Features to be tested include user login and registration,
product search, cart functionality, checkout process, payment gateway
integration, and order tracking."
4. Approach
● Outlines the methods and strategies for testing, such as manual or
automated testing and black-box or white-box testing.
● Example: "A combination of manual and automated testing will be used.
Functional testing will be performed manually, while regression testing will
utilize automated scripts with Selenium."
5. Quality Objectives
● Defines the quality benchmarks that the software must meet in terms of
performance, reliability, and usability.
● Example: "Quality objectives include achieving 99.9% uptime, a maximum
page load time of 3 seconds, and zero high-priority defects at the time of
release."
6. Item Pass/Fail Criteria
● Sets the criteria to determine if a test case has passed or failed based on
expected results.
● Example: "A test case passes if the actual outcome matches the expected
result and the feature works as specified. A fail occurs if there is any
deviation from the expected outcome or a critical error that impacts the
user experience."
7. Suspension Criteria
● Defines conditions under which testing should be paused temporarily.
● Example: "Testing will be suspended if the application experiences
repeated server failures, critical defects in the checkout module, or
database connection issues."
8. Resumption Criteria
● Lists the conditions required to resume testing after a suspension.
● Example: "Testing will resume once server stability is restored, critical
defects are resolved, and the application passes smoke testing."
9. Test Deliverables
● Identifies the documents and artifacts produced during and after the
testing process.
● Example: "Deliverables include test cases, test execution reports, defect
logs, a test summary report, and a final test closure report."
10. Test Tasks
● Outlines specific tasks to be completed as part of the testing process,
such as preparing test cases, executing tests, logging defects, and
creating reports.
● Example: "Tasks include creating functional test cases, executing smoke
tests, performing regression tests, documenting defects, and preparing the
final test summary report."
11. Environmental Needs
● Specifies the hardware, software, and network requirements for the test
environment.
● Example: "Testing will be conducted on Windows and macOS operating
systems with browsers Chrome, Firefox, and Safari. The test environment
requires a stable internet connection, access to the staging server, and a
test database."
12. Responsibilities
● Defines the roles and responsibilities of each team member involved in
testing.
● Example: "The test lead will oversee the testing process and report
progress to stakeholders. Test engineers will create and execute test
cases, while the automation engineer will develop and maintain automated
test scripts."
13. Testing Types and Objectives
● Lists the types of testing to be performed (e.g., functional, performance,
security) and their specific objectives.
● Example: "Functional testing will validate core features work as expected,
performance testing will measure response times and load capacity, and
security testing will verify data protection during transactions."
14. Staffing & Training Needs
● Identifies the necessary team members, their skills, and any required
training.
● Example: "The testing team includes one test lead, two functional testers,
and one performance tester. New testers will undergo training on the
Selenium automation tool and JIRA defect management to ensure smooth
test execution."
15. Schedule
● Provides a timeline for test activities, key milestones, and deadlines.
● Example: "Test execution will start on 1st December, regression testing on
10th December, and the final report will be submitted by 25th December.
Milestones include completing functional testing by 15th December and
performance testing by 20th December."
16. Risks and Contingencies
● Identifies potential risks during testing and outlines mitigation strategies.
● Example: "Potential risks include delayed code delivery, hardware
malfunctions, and resource unavailability. Mitigation strategies involve
allocating extra buffer time in the schedule, ensuring backup devices, and
having standby testers."
Q1. Any 4
1. Describe the contents of "Test Summary Report” used in test reporting with suitable
example.
A Test Summary Report provides a comprehensive overview of the testing phase, summarizing
test results, highlighting major defects, and making recommendations about the product’s
release. Attributes include:
1. Report Identifier
○ A unique ID for tracking the report.
○ Example: "TSR-BankingApp-2024-Q1"
2. Overview and Description
○ Briefly outlines the project and scope of testing.
○ Example: "This report summarizes the functional and performance testing
conducted on the banking mobile application.”
3. Test Objectives
○ States what the testing aimed to achieve.
○ Example: "To verify that all critical functionalities operate as expected and ensure
high-priority bugs are resolved.”
4. Test Execution Summary
○ Provides details on total test cases executed, passed, and failed.
○ Example: "150 test cases executed, 135 passed, 15 failed." (The corresponding pass
rate is worked out after this list.)
5. Defect Summary
○ Lists critical and major defects found, including their current status.
○ Example: "Two high-severity defects were identified in the fund transfer module;
one is resolved, one is open.”
6. Test Results
○ Detailed results for each test phase, such as functional, regression, and
performance testing.
○ Example: "Regression testing showed no critical failures; performance testing
met required benchmarks."
7. Deviations
○ Describes any deviations from the test plan, such as schedule changes.
○ Example: "Testing schedule was extended by two days due to feature changes in
the transaction module."
8. Conclusion and Recommendation
○ Final assessment of product readiness.
○ Example: "The application is stable and meets all functional requirements.
Recommended for release.”
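As a small worked check on the Test Execution Summary in item 4 above (assuming the pass rate is
computed simply as passed cases over executed cases):
Pass Rate = (Passed / Executed) × 100 = (135 / 150) × 100 = 90%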
2. Design any two test cases for a chatting application and prepare a defect report.
3. How to select a testing tool? Explain in detail.
Industry experts have suggested the following four major criteria for the selection of testing tools.
1) Meeting requirements.
2) Technology expectations.
3) Training / skills.
4) Management aspects.
1. Meeting Requirements: The tool should align with the project’s needs to ensure efficient and
effective testing. Choosing an unsuitable tool can lead to wasted time and reduced
effectiveness. Furthermore, it should facilitate seamless collaboration among team members,
enhancing communication and improving project outcomes.
2. Technology Expectations: The tool must be compatible with the current technology and allow
for easy modifications without excessive costs or vendor dependency. Additionally, it should
integrate smoothly with existing systems to minimize disruption & enhance overall productivity.
3. Training/Skills: Proper training is essential for all users of the tool. Without adequate skills,
the tool may not be used to its full potential. Ongoing support and resources should also be
provided to help users stay updated on best practices and new features.
4. Management Aspects: The tool should be affordable and not require significant upgrades.
Consider the overall cost-benefit before making a final decision. It should also provide clear
analytics and reporting features to help management make informed decisions about resource
allocation and project progress.
1) Severity Wise
1. Major: A defect that will cause an observable product failure or departure from
requirements.
2. Minor: A defect that will not cause a failure in execution of the product.
3. Fatal: A defect that will cause the system to crash or close abruptly or affect other
applications.
2) Status Wise:
1. Open: The defect is acknowledged and needs to be addressed.
2. Closed: The defect has been fixed and verified.
3. Deferred: The defect is postponed for future evaluation.
4. Canceled: The defect will not be fixed and is disregarded.
3) Work product wise:
1. SSD: A defect from System Study document
2. FSD: A defect from Functional Specification document
3. ADS: A defect from Architectural Design Document
4. DDS: A defect from Detailed Design document
5. Source code: A defect from Source code
6. Test Plan/ Test Cases: A defect from Test Plan/ Test Cases
7. User Documentation: A defect from User manuals, Operating manuals
4) Errors Wise:
1. Comments: Inadequate, incorrect, misleading, or missing comments in the source code
2. Data Error: Incorrect data population/update in the database
3. Database Error: Error in the database schema/design
4. Incorrect Design: Wrong or inaccurate design
5. Navigation Error: Navigation not coded correctly in the source code
6. System Error: Hardware- or operating-system-related errors, memory leaks