Test Plan For ERA 2
1. Introduction
1.1 Purpose
The purpose of this test plan is to define the scope, approach,
resources, schedule, and deliverables for the testing activities of
the ERA 2 project. The main objective is to verify that the ERA 2
system meets all functional and non-functional requirements as
specified in the Software Requirements Specification (SRS)
document.
1.2 Scope
The testing scope covers all functional modules and user interfaces of
the ERA 2 system, together with non-functional aspects such as
performance, usability, and security. Testing will verify that the
system operates correctly across the supported environments, including
the specified operating systems and browsers.
1.3 Objectives
Validate that all functionalities work according to the SRS.
Ensure the system meets performance, usability, and
security requirements.
Identify and document defects for timely correction.
Verify compatibility with specified browsers and operating
systems.
Confirm the system is user-friendly and meets end-user
expectations.
1.4 References
Software Requirements Specification (ERA 2 SRS
V1.5)
Project Plan Document
Industry Standards and Guidelines (e.g., ISO/IEC 9126,
IEEE 829)
2. Test Items
2.1 Modules to be Tested
User Management: Registration, Login, Role Management,
Permissions
Standards Management: Adding, Editing, Reviewing,
Publishing Standards
Projects Management: Creating, Editing, Reviewing
Projects
Requirements Management: Adding, Reviewing,
Traceability of Requirements
System Navigation: User Dashboard, Standard and Project
Navigation
Security Features: Authentication, Authorization, Data
Protection
Notifications: Alerts and Notifications System
2.2 Features to be Tested
Functional Testing:
o User Authentication and Role-Based Access (see the test sketch after this list)
o CRUD Operations on Standards and Projects
o Requirement Traceability and Impact Analysis
o System Dashboard and Navigation
o Alerts and Notifications
Non-Functional Testing:
o Performance: System response times, resource usage,
and scalability
o Usability: Interface ease of use, accessibility, and
overall user experience
o Security: Data encryption, access control, and
vulnerability assessment
o Compatibility: Supported browsers (Chrome, Firefox) and
operating systems (Windows 10/11)
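To make the functional items above concrete, the following is a minimal
sketch of how an automated test for user authentication and role-based
access might be scripted. It assumes a hypothetical login endpoint
(/api/login), example role accounts, and pytest with the requests
library; none of these are mandated by this plan.

```python
# Hypothetical sketch: functional test for authentication and role-based
# access. The base URL, endpoint paths, and credentials are illustrative
# assumptions, not part of the ERA 2 specification.
import requests

BASE_URL = "http://era2-test.example.local"  # placeholder test-environment URL


def login(username: str, password: str) -> str:
    """Authenticate and return the session token assumed to be in the response."""
    response = requests.post(f"{BASE_URL}/api/login",
                             json={"username": username, "password": password})
    response.raise_for_status()
    return response.json()["token"]


def test_admin_can_open_user_management():
    token = login("admin.user", "Admin#2024")
    response = requests.get(f"{BASE_URL}/api/users",
                            headers={"Authorization": f"Bearer {token}"})
    assert response.status_code == 200  # Admin role is authorized


def test_standard_user_is_denied_user_management():
    token = login("standard.user", "User#2024")
    response = requests.get(f"{BASE_URL}/api/users",
                            headers={"Authorization": f"Bearer {token}"})
    assert response.status_code == 403  # Standard User role is denied
```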
2.3 Features Not to be Tested
External systems or APIs not integrated into the ERA 2
system.
Any features explicitly marked as out-of-scope in the SRS.
3. Test Strategy
3.1 Testing Approach
Testing will be conducted in phases, starting with unit testing by
developers, followed by integration testing, system testing, and
finally, user acceptance testing (UAT). The strategy is designed to
ensure comprehensive coverage of all functional and non-functional
requirements.
3.2 Testing Levels
Unit Testing: Individual components or modules will be
tested by developers.
Integration Testing: Interactions between modules, such
as user authentication and project management, will be
tested.
System Testing: Full end-to-end testing of the entire
system against the SRS.
User Acceptance Testing (UAT): Testing by end-users to
validate that the system meets their requirements and is
ready for deployment.
3.3 Testing Types
Functional Testing: Verifies that the system functions as
expected according to the SRS.
Performance Testing: Assesses the system’s performance
under various load conditions (a load-test sketch follows this list).
Security Testing: Ensures that the system is secure and
protects sensitive data.
Usability Testing: Validates that the user interface is
intuitive, accessible, and user-friendly.
Compatibility Testing: Ensures the system works on the
specified browsers and operating systems.
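As an illustration of the performance testing described above, the
sketch below shows a simple load scenario written for Locust, a Python
load-testing tool. The tool choice, endpoint paths, and request pacing
are assumptions for illustration only; the actual scenarios will follow
the SRS.

```python
# Hypothetical load-test sketch (Locust). Endpoints and task weights are
# assumptions, not taken from the ERA 2 specification.
from locust import HttpUser, task, between


class Era2User(HttpUser):
    # Each simulated user waits 1-3 seconds between requests.
    wait_time = between(1, 3)

    @task(3)
    def view_dashboard(self):
        self.client.get("/dashboard")

    @task(1)
    def list_projects(self):
        self.client.get("/projects")
```

Such a scenario could be run, for example, with
`locust -f era2_load.py --host http://era2-test.example.local` while
response times and server resource usage are monitored.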
3.4 Test Environment
3.4.1 Hardware
Servers: Intel Xeon E5-2670 v3, 128GB RAM, 2TB SSD,
Windows Server 2019
Client Machines: Intel i7, 16GB RAM, 512GB SSD, Windows
10/11
3.4.2 Software
Operating Systems: Windows 10, Windows 11
Browsers: Chrome (v104+), Firefox (v91+)
Database: SQL Server 2019
Front-End: React v17.0.2
3.4.3 Network
Simulated network environments will be used to test the system under
different conditions, including 100 Mbps, 1 Gbps, and low-bandwidth
scenarios.
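One possible way to exercise these conditions in browser-level tests is
Chrome's network emulation, reachable through Selenium's Chrome driver.
The sketch below mirrors the low-bandwidth scenario listed above; the
tool choice, throttling values, and URL are assumptions, not
requirements of this plan.

```python
# Hypothetical sketch: throttling browser network conditions during a test
# run. Selenium with Chrome is assumed; values are illustrative.
from selenium import webdriver

driver = webdriver.Chrome()

# Emulate a low-bandwidth scenario: roughly 500 kbps up/down with 400 ms
# of added latency (throughput values are in bytes per second).
driver.set_network_conditions(
    offline=False,
    latency=400,
    download_throughput=500 * 1024 // 8,
    upload_throughput=500 * 1024 // 8,
)

driver.get("http://era2-test.example.local/dashboard")  # placeholder URL
# ... timing measurements and assertions would go here ...
driver.quit()
```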
3.5 Test Data
Test data will be prepared based on scenarios outlined in the SRS,
including:
User credentials for various roles (Admin, Project Manager,
Standard User).
Standard and project data for CRUD operations.
Requirements data for traceability testing.
Alerts and notifications data for testing the notification
system.
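A small sketch of how such test data might be captured as reusable
fixtures is shown below. The role names follow the list above, while
the field names and example values are illustrative assumptions; the
real data will be derived from the SRS scenarios.

```python
# Hypothetical test-data fixtures for ERA 2 testing. Field names and values
# are placeholders only.
TEST_USERS = [
    {"username": "admin.user",    "role": "Admin",           "password": "Admin#2024"},
    {"username": "pm.user",       "role": "Project Manager", "password": "Pm#2024"},
    {"username": "standard.user", "role": "Standard User",   "password": "User#2024"},
]

TEST_STANDARDS = [
    {"id": "STD-001", "title": "Sample Standard A", "status": "Draft"},
    {"id": "STD-002", "title": "Sample Standard B", "status": "Published"},
]

TEST_REQUIREMENTS = [
    # Traceability links a requirement to the standard it derives from.
    {"id": "REQ-001", "traces_to": "STD-001", "description": "Sample requirement"},
]
```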
3.6 Entry and Exit Criteria
3.6.1 Entry Criteria
Approval of all relevant documents (SRS, design documents).
Completion of the test environment setup.
Availability of validated test data.
All necessary tools and resources are in place.
3.6.2 Exit Criteria
All planned tests have been executed.
All high-priority defects have been resolved or deferred with
proper justification.
Successful completion of UAT.
Approval of the final test reports.
3.7 Test Deliverables
Test Plan Document
Test Cases and Scenarios
Test Scripts
Test Data
Defect Reports
Test Execution Reports
Final Test Summary Report
4. Test Schedule
4.1 Milestones
Test Planning Completion: August 20, 2024
Test Case Development Completion: August 25, 2024
Unit Testing Completion: August 30, 2024
Integration Testing Completion: September 5, 2024
System Testing Completion: September 10, 2024
User Acceptance Testing Completion: September 15,
2024
Final Test Report Submission: September 20, 2024
4.2 Schedule
Task                    Start Date            End Date
Test Planning           August 10, 2024       August 20, 2024
Test Case Development   August 15, 2024       August 25, 2024
Unit Testing            August 20, 2024       August 30, 2024
Integration Testing     August 25, 2024       September 5, 2024
System Testing          September 1, 2024     September 10, 2024
UAT                     September 10, 2024    September 15, 2024
Final Test Report       September 15, 2024    September 20, 2024
5. Test Management
5.1 Roles and Responsibilities
Test Manager: Sarah Johnson – Responsible for overseeing
the testing process, ensuring adherence to the schedule, and
managing resources.
Test Engineers: John Doe, Emily Clark – Responsible for
writing and executing test cases, logging defects, retesting
after fixes, and providing daily status updates.
Developers: Alice Brown, David Miller – Conduct unit tests
and fix defects found during testing.
UAT Participants: Selected End-Users from Client's
Organization – Validate that the system meets their
requirements and is ready for deployment.
5.2 Communication Plan
Daily Stand-up Meetings: Held at 9:00 AM each working day to
discuss progress, issues, and next steps.
Weekly Status Reports: Summarized progress, issues, and
risks sent every Friday by 5:00 PM.
Defect Triage Meetings: Held every Wednesday at 3:00
PM to prioritize and discuss defects.
Test Summary Reports: Delivered at the end of each
testing phase.
6. Risk Management
6.1 Risks
Schedule Delays: Potential delays due to unforeseen issues
in development or testing.
Incomplete Requirements: Ambiguities in requirements
may lead to testing challenges.
Resource Unavailability: Unavailability of testing
resources or environments.
High Number of Defects: A large number of defects may
delay testing and subsequent phases.
6.2 Mitigation Strategies
Buffer Time: Allocate 10% additional time in the schedule
to account for unexpected delays.
Requirement Clarification: Regular reviews with
stakeholders to ensure requirement clarity.
Backup Resources: Maintain a pool of backup resources to
mitigate unavailability risks.
Defect Prioritization: Focus on resolving high-priority
defects first and deferring lower-priority ones if necessary.
7. Test Process
7.1 Test Case Development
Test cases will be developed based on scenarios outlined in the
SRS. Each test case will include:
Test Case ID: TC-001, TC-002, etc.
Description: Detailed description of what is being tested.
Preconditions: Any conditions that must be met before the
test case is executed.
Test Steps: Step-by-step instructions to execute the test.
Expected Results: The outcome the system should produce.
Actual Results: The actual outcome during execution.
Status: Pass/Fail based on the comparison of expected and
actual results.
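A lightweight sketch of how this structure could be represented if test
cases are managed in code is shown below; the field names follow the
list above, and the example content is purely illustrative.

```python
# Hypothetical representation of a test case record using the fields above.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class TestCase:
    test_case_id: str                      # e.g. "TC-001"
    description: str
    preconditions: List[str]
    test_steps: List[str]
    expected_results: str
    actual_results: Optional[str] = None   # filled in during execution
    status: Optional[str] = None           # "Pass" or "Fail" after comparison


# Example instance (content is illustrative, not taken from the SRS):
tc_001 = TestCase(
    test_case_id="TC-001",
    description="Verify that a registered user can log in with valid credentials.",
    preconditions=["User account exists", "System is reachable"],
    test_steps=["Open the login page", "Enter valid credentials", "Submit"],
    expected_results="User is redirected to the dashboard.",
)
```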
7.2 Test Execution
Test cases will be executed according to the test schedule. Results
will be logged, and any defects will be reported and tracked using
Jira. Retesting will be conducted after defect fixes.
7.3 Defect Management
Defects will be logged in Jira with detailed information, including:
Defect ID: Generated by Jira.
Description: Detailed description of the defect.
Steps to Reproduce: Clear steps to reproduce the defect.
Severity: Categorized as Critical, Major, Minor, or Trivial.
Priority: Assigned as High, Medium, or Low.
Status: Open, In Progress, Resolved, Closed.
Assigned To: Developer responsible for fixing the defect.
Resolution Date: Date when the defect is fixed and
verified.
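For illustration, the sketch below shows how a defect with these fields
might be logged programmatically through Jira's standard REST API. The
server URL, project key, and credentials are assumptions, and severity
is commonly a custom field whose identifier depends on the Jira
configuration, so it is only noted in the description here.

```python
# Hypothetical sketch: creating a defect in Jira via its REST API.
# URL, project key, and credentials are placeholders.
import requests

JIRA_URL = "https://jira.example.local"            # placeholder Jira server
AUTH = ("test.engineer", "api-token-placeholder")  # placeholder credentials

defect = {
    "fields": {
        "project": {"key": "ERA2"},                # assumed project key
        "issuetype": {"name": "Bug"},
        "summary": "Login fails for Project Manager role",
        "description": ("Steps to reproduce: ...\n"
                        "Severity: Major"),
        "priority": {"name": "High"},
    }
}

response = requests.post(f"{JIRA_URL}/rest/api/2/issue", json=defect, auth=AUTH)
response.raise_for_status()
print("Created defect:", response.json()["key"])   # Jira returns the new defect key
```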
7.4 Test Reporting
Daily Reports: Summarize the number of test cases
executed, passed, failed, and any blockers.
Weekly Reports: Provide a more detailed summary,
including defect status and testing progress.
Final Test Summary Report: A comprehensive report
summarizing all testing activities, outcomes, and
recommendations.
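As a small illustration of how the daily figures could be compiled from
executed test cases, the sketch below aggregates pass/fail counts. The
input structure mirrors the test case record sketched in section 7.1
and is an assumption, not a prescribed reporting tool.

```python
# Hypothetical sketch: compiling a daily execution summary from test results.
from collections import Counter


def daily_summary(test_cases):
    """test_cases: iterable of dicts with a 'status' key (Pass/Fail/Blocked/None)."""
    counts = Counter(tc.get("status") or "Not Run" for tc in test_cases)
    return {
        "executed": counts["Pass"] + counts["Fail"],
        "passed": counts["Pass"],
        "failed": counts["Fail"],
        "blocked": counts["Blocked"],
    }


# Example (illustrative data):
results = [{"status": "Pass"}, {"status": "Fail"}, {"status": "Blocked"}, {"status": None}]
print(daily_summary(results))  # {'executed': 2, 'passed': 1, 'failed': 1, 'blocked': 1}
```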