ST Project Final Report 11 (1) - Final
CHETAN S (1CR21IS039)
HARISH M (1CR21IS057)
DEEPAK L C (1CR21IS046)
CERTIFICATE
Certified that the Mini Project work in the Software Testing Laboratory entitled “Shoe
Portal Testing Using Selenium” is a bona fide work carried out by Chetan S (1CR21IS039)
in partial fulfillment for the award of Bachelor of Engineering in Information Science and
Engineering of the Visvesvaraya Technological University, Belgaum, during the year
2023–2024. It is certified that all corrections/suggestions indicated for internal assessment
have been incorporated in the report and deposited in the department library. The mini-
project report has been approved as it satisfies the academic requirements regarding
project work prescribed for the said Degree.
NAME: Deepak L C
USN: 1CR21IS046
Examiners' Details:
ABSTRACT
This project involves the automated testing of a web-based Shoe Portal using Selenium
WebDriver, Java, TestNG, and Maven. The primary objective is to ensure the functionality
and reliability of the portal's user interface by automating navigation through the website,
performing login operations, and validating search functionalities.
The test scenario includes several key steps:
1. Clicking on the overflow menu on the index page.
2. Selecting the login option from the menu.
3. Logging into the portal using a predefined username and password.
4. Searching for the term "formal shoes" after logging in.
5. Verifying the presence of the search term in the results to determine the build's
success.
The project leverages the capabilities of Selenium WebDriver for browser automation,
ChromeDriver for interacting with the Chrome browser, and TestNG for structuring and
executing test cases. Maven is used for project management and dependency resolution.
The test case is meticulously designed to simulate real user interactions, ensuring that the
core functionalities of the Shoe Portal are working as intended. The successful completion
of these automated tests provides confidence in the stability and usability of the portal,
ultimately contributing to a higher-quality user experience.
The report elaborates on the setup process, including the configuration of the testing
environment, implementation of the test scripts, and execution of the test cases.
Additionally, it discusses the results of the tests, highlighting any issues encountered and the
steps taken to resolve them. This comprehensive testing approach underscores the
importance of automated testing in maintaining and improving the quality of web
applications.
ACKNOWLEDGMENT
The satisfaction and euphoria that accompany the successful completion of any task would be
incomplete without mentioning the people who made it possible. Success is the epitome of hard
work and perseverance, but above all it rests on encouraging guidance.
So, it is with gratitude that I acknowledge all those whose guidance and encouragement served
as beacons of light and crowned our effort with success. I would like to thank Dr. Sanjay Jain,
Principal, CMRIT, Bangalore, for providing an excellent academic environment in the college and
his never-ending support for the B.E program.
I would like to express my gratitude towards Dr. Jagadishwari V, Associate Professor and HOD,
Department of Information Science and Engineering CMRIT, Bangalore, who provided
guidance and gave valuable suggestions regarding the project.
I consider it a privilege and honor to express my sincere gratitude to our internal guides Prof.
Jayashree M and Prof. Kanika Agrawal, Assistant Professors, Department of Information Science
and Engineering, CMRIT, Bangalore, for their valuable guidance throughout the tenure of this
project work.
I would like to thank all the faculty members, who have always been very cooperative and
generous. Finally, I also thank all the non-teaching staff and everyone else who helped,
directly or indirectly, during our project.
Deepak L C
(1CR21IS046)
Contents

1. Introduction 1
   1.1 What is Software Testing 1
   1.2 Types of Testing 1
   1.3 About Automation Tool Used – Selenium WebDriver 2
   1.4 Problem Statement 3
   1.5 Objective of the Project 3
2. Literature Survey 4
   2.1 About Manual Testing 4
   2.2 About Automation Testing 4
3. System Architecture and Design 6
   3.1 Architecture/Methodology Used 6
   3.2 Flowchart/Path Flow Diagram 6
4. Implementation and Testing 7
   4.1 Test Cases Table 7
   4.2 Black Box / White Box Testing 7
   4.3 Code 8
5. Results/Output 10
6. Conclusion/Future Scope 12
   6.1 Conclusion 12
   6.2 Future Scope 12
References 14
List of Figures
Shoe Portal testing using selenium
Chapter 1
Introduction
1.1 What is Software Testing
Software testing is an indispensable process in the software development lifecycle (SDLC) that
involves the systematic evaluation of a software product or service to ascertain its adherence to
specified requirements, functionalities, and user expectations. This critical practice encompasses
a range of techniques and methodologies aimed at identifying defects, errors, or inconsistencies
within the software, thereby ensuring its reliability, usability, and overall quality. Testing acts as
a safeguard against potential malfunctions, security vulnerabilities, and performance bottlenecks
that could adversely impact the user experience. By subjecting the software to rigorous
examination through various types of tests, such as functional, non-functional, and regression
testing, organizations can significantly reduce the risk of releasing flawed software, enhance
customer satisfaction, and build a strong reputation for delivering reliable products.
1.2 Types of Testing
• Manual Testing: This traditional approach relies on human testers to manually execute test cases
by interacting with the software, emulating real-world user scenarios. While flexible and adept at
uncovering usability issues, it can be time-consuming and prone to human error.
• Automated Testing: Leveraging specialized tools and scripts, automated testing enables the
execution of predefined test cases without human intervention. This methodology boasts increased
efficiency, repeatability, and the ability to cover a wider range of test scenarios.
• Unit Testing: A fundamental building block of software testing, unit testing focuses on verifying
the correctness of individual code units (functions, methods, or classes) in isolation. This helps
pinpoint defects early in the development cycle.
• Integration Testing: As the name implies, integration testing checks the interaction between
different software modules or components, ensuring seamless collaboration and data exchange.
• System Testing: Taking a holistic approach, system testing evaluates the entire integrated system
against its requirements, verifying that it behaves as expected as a whole.
• Acceptance Testing: Often performed by end-users or stakeholders, acceptance testing
determines whether the developed system satisfies the criteria for acceptance and meets the needs
and expectations of the intended audience.
• Performance Testing: This type of testing assesses the responsiveness, stability, and scalability
of the software under different workloads and usage patterns, ensuring optimal performance under
real-world conditions.
• Security Testing: A critical aspect of software testing, security testing involves identifying
vulnerabilities, weaknesses, or potential entry points for unauthorized access, data breaches, or
other security threats.
• Usability Testing: Focused on the user's perspective, usability testing evaluates how intuitive,
user-friendly, and efficient the software is in achieving user goals.
• Regression Testing: After modifications or updates to the software, regression testing is
conducted to ensure that existing functionalities have not been adversely affected and that new
issues have not been introduced.
• Exploratory Testing: A less structured approach, exploratory testing involves testers learning
about the software through experimentation and discovery, uncovering defects that might not be
caught by scripted tests.
Each of these testing types plays a crucial role in the comprehensive evaluation of software,
ensuring that it meets stringent quality standards and delivers a seamless user experience. By
adopting a combination of manual and automated testing techniques, organizations can optimize
their testing efforts, detect defects early, and ultimately release software that is both reliable and
user-centric.
1.3 About Automation Tool Used – Selenium WebDriver
Selenium WebDriver is a powerful and versatile open-source framework widely adopted for
automating web browser interactions. It empowers developers and testers to create scripts in
various programming languages (such as Java, Python, C#, Ruby, etc.) that can simulate user
actions on web pages, including clicking buttons, filling forms, navigating through links, and
validating expected outcomes. With its cross-browser compatibility, Selenium WebDriver can
interact with major browsers like Chrome, Firefox, Safari, Edge, and Internet Explorer, enabling
comprehensive testing across different platforms and environments.
One of Selenium WebDriver's core strengths lies in its ability to locate and manipulate web
elements through a variety of strategies, including IDs, class names, XPath expressions, and CSS
selectors. This flexibility allows testers to interact with even the most complex web page
structures. Furthermore, Selenium WebDriver offers implicit and explicit waits, allowing scripts
to pause and synchronize with the application under test, ensuring that elements are loaded and
ready for interaction before proceeding.
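The locator strategies and explicit waits described above can be sketched as follows. This is a minimal illustration, not code from the project: the target URL is a placeholder, and the commented-out locators are hypothetical examples of the other strategies.

```java
import java.time.Duration;

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.support.ui.ExpectedConditions;
import org.openqa.selenium.support.ui.WebDriverWait;

public class LocatorWaitDemo {
    public static void main(String[] args) {
        WebDriver driver = new ChromeDriver();   // requires ChromeDriver on PATH
        try {
            driver.get("https://example.com");   // placeholder URL

            // Explicit wait: block up to 10 seconds until the first anchor
            // element is present and clickable before interacting with it.
            WebDriverWait wait = new WebDriverWait(driver, Duration.ofSeconds(10));
            WebElement link = wait.until(
                ExpectedConditions.elementToBeClickable(By.cssSelector("a")));
            link.click();

            // Other common locator strategies (illustrative locators only):
            // driver.findElement(By.id("search-box"));
            // driver.findElement(By.className("menu-item"));
            // driver.findElement(By.xpath("//button[@type='submit']"));
        } finally {
            driver.quit();   // always release the browser session
        }
    }
}
```

The explicit wait is generally preferred over a fixed `Thread.sleep`, since it proceeds as soon as the condition is met and fails fast with a timeout otherwise.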
Beyond its core functionality, Selenium WebDriver integrates seamlessly with popular testing
frameworks like TestNG and JUnit, providing features like test organization, reporting, and
parallelization. This integration facilitates structured test development, enhances maintainability,
and optimizes test execution time. Additionally, Selenium WebDriver's extensible architecture
allows for the integration of third-party libraries and tools, expanding its capabilities and
addressing specific testing needs.
1.4 Problem Statement
The Shoe Portal, a web-based application for browsing and purchasing shoes, requires rigorous
testing to ensure functionality and reliability. Manual testing of navigation, user authentication,
and search functionality is time-consuming, error-prone, and inefficient, especially as the
application evolves. There is a need for an automated testing solution to verify interactive
elements, secure login processes, and accurate search results while reducing the time and effort
required for repetitive tests.
1.5 Objective of the Project
The objective of this project is to develop and implement an automated testing framework for the
Shoe Portal using Selenium WebDriver, Java, TestNG, and Maven. This framework aims to:
1. Automate Navigation: Ensure seamless and consistent navigation through the Shoe
Portal's user interface.
2. Validate User Authentication: Test the login process to confirm secure and reliable
access for valid users while restricting invalid users.
3. Verify Search Functionality: Conduct searches for terms like "formal shoes" and ensure
the search results are accurate and relevant.
4. Improve Testing Efficiency: Reduce the time and effort required for repetitive testing,
thereby increasing overall testing efficiency and reducing human error.
5. Ensure Build Success: Provide immediate feedback on the application's performance and
reliability by validating key functionalities, and ensuring that new builds meet quality
standards.
By achieving these objectives, the project aims to enhance the quality, reliability, and user
experience of the Shoe Portal.
Chapter 2
Literature Survey
2.1 About Manual Testing
Manual testing is a time-tested approach in software quality assurance where human testers
meticulously execute test cases, interact with the software, and assess its functionality based on
predefined expectations. This hands-on approach involves meticulously following test scripts,
simulating user scenarios, and carefully observing the software's behavior.
Advantages of Manual Testing:
1. Exploratory Testing: Manual testing allows testers to explore the software intuitively,
uncovering unexpected issues or edge cases that might not be covered in scripted test cases.
This is particularly valuable for uncovering usability problems or design flaws.
2. Flexibility and Adaptability: Manual testers can quickly adapt to changes in
requirements, user scenarios, or software updates. They can readily modify their testing
approach based on real-time observations.
3. Cost-Effective for Small Projects: For smaller projects with limited scope and budget,
manual testing can be a more cost-effective option, as it doesn't require the initial
investment in automation tools and infrastructure.
4. User Experience Focus: Manual testers can assess the software from a user's perspective,
providing valuable insights into usability, intuitiveness, and overall user experience.
Disadvantages of Manual Testing:
1. Time-Consuming: Manual execution of test cases, especially for large and complex
applications, can be extremely time-consuming and resource-intensive.
2. Prone to Human Error: Repetitive tasks can lead to fatigue and mistakes, potentially
missing critical defects.
3. Limited Test Coverage: Due to time and resource constraints, manual testing may not be
able to cover all possible combinations of inputs and scenarios, leading to potential gaps in
test coverage.
4. Non-Repeatable: Manual tests are not easily repeatable, making it challenging to
consistently reproduce and diagnose defects.
2.2 About Automation Testing
Automation testing represents a paradigm shift in software testing, where software tools and scripts
take center stage in executing test cases, validating results, and generating comprehensive reports.
This methodology has gained significant traction due to its ability to increase efficiency, accuracy,
and test coverage.
Advantages of Automation Testing:
1. Efficiency and Speed: Automated tests can be executed rapidly, significantly reducing the
time required for testing cycles. This accelerated feedback loop enables faster identification
and resolution of defects.
2. Increased Accuracy: By eliminating human error, automated tests provide more
consistent and reliable results, ensuring that the software behaves as expected under various
conditions.
3. Expanded Test Coverage: Automation enables the execution of a vast number of test
cases across different environments, platforms, and configurations, leading to broader test
coverage and identifying issues that might be missed in manual testing.
4. Cost-Effective for Large Projects: While the initial setup of automated tests requires
investment, it can be highly cost-effective in the long run, especially for large-scale projects
with extensive test suites.
5. Reusability and Scalability: Automated test scripts can be reused for different versions
of the software, and the test suite can be easily scaled to accommodate new functionalities.
6. 24/7 Availability: Automated tests can be scheduled to run at any time, even outside of
business hours, enabling continuous testing and faster feedback.
7. Integration with CI/CD: Automated tests seamlessly integrate into CI/CD pipelines,
providing immediate feedback on code changes and ensuring that each build meets quality
standards.
Disadvantages of Automation Testing:
1. Initial Investment: Setting up and maintaining automated test scripts requires upfront
effort, skilled resources, and investment in automation tools.
2. Limited Flexibility: Automated tests are less flexible than manual tests and may not be
able to adapt to unexpected scenarios or changes in requirements as easily.
3. Maintenance Overhead: Automated test scripts need to be updated and maintained as the
software evolves, adding to the overall maintenance effort.
4. False Positives/Negatives: Automated tests can sometimes produce inaccurate results due
to scripting errors, environment issues, or unexpected application behavior.
5. Not a Replacement for Manual Testing: Automation testing cannot entirely replace
manual testing. Manual testing is still essential for exploratory testing, usability testing,
and verifying aspects that are difficult to automate.
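The CI/CD integration noted above is typically realized by letting the build tool run the test suite automatically. The following pom.xml fragment is an illustrative sketch, not taken from this project: it wires Selenium and TestNG into a Maven build so that `mvn test` (and hence any CI pipeline invoking it) executes the suite. Artifact versions and the `testng.xml` file name are assumptions.

```xml
<!-- Illustrative pom.xml fragment: Selenium + TestNG dependencies and the
     Surefire plugin, so that every CI build runs the TestNG suite. -->
<dependencies>
  <dependency>
    <groupId>org.seleniumhq.selenium</groupId>
    <artifactId>selenium-java</artifactId>
    <version>4.21.0</version>
  </dependency>
  <dependency>
    <groupId>org.testng</groupId>
    <artifactId>testng</artifactId>
    <version>7.10.2</version>
    <scope>test</scope>
  </dependency>
</dependencies>
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-surefire-plugin</artifactId>
      <version>3.2.5</version>
      <configuration>
        <suiteXmlFiles>
          <suiteXmlFile>testng.xml</suiteXmlFile>
        </suiteXmlFiles>
      </configuration>
    </plugin>
  </plugins>
</build>
```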
Chapter 3
System Architecture and Design
The system architecture for the automated testing framework of the Shoe Portal is designed using
the following components and methodologies:
3.1 Architecture/Methodology Used
1. Page Object Model (POM): Separates test code from page-specific elements and actions
to enhance maintainability.
2. Data-Driven Testing: Allows for running tests with various data inputs to ensure
comprehensive coverage.
3. Continuous Integration (CI): Integrates automated tests into the build process for
immediate feedback on code changes.
This architecture ensures efficient, maintainable, and reliable testing of the Shoe Portal.
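As an illustration of the Page Object Model described above, the portal's login page could be wrapped in a page class like the following sketch. The class name and element locators are assumptions for illustration, not the portal's actual code.

```java
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;

// Page Object Model sketch: page-specific locators and actions live in one
// class, so tests call loginAs(...) instead of touching raw selectors.
public class LoginPage {
    private final WebDriver driver;

    // Hypothetical locators; the real portal's element IDs may differ.
    private final By usernameField = By.id("username");
    private final By passwordField = By.id("password");
    private final By loginButton   = By.id("login-btn");

    public LoginPage(WebDriver driver) {
        this.driver = driver;
    }

    public void loginAs(String username, String password) {
        driver.findElement(usernameField).sendKeys(username);
        driver.findElement(passwordField).sendKeys(password);
        driver.findElement(loginButton).click();
    }
}
```

A test then needs only `new LoginPage(driver).loginAs("user", "pass")`; if the login page's markup changes, only this class is updated rather than every test that logs in, which is the maintainability benefit POM is chosen for.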
3.2 Flowchart/Path Flow Diagram
This flowchart visually represents the sequence of steps in the automated testing process and the
decision points for managing the outcomes.
Chapter 4
Implementation and Testing
4.1 Test Cases Table

Test ID | Description                                          | Expected Result                                        | Status
TC001   | Verify navigation to the login page                  | The login page is displayed                            | Pass
TC002   | Validate login with valid credentials                | The user is successfully logged in                     | Pass
TC003   | Validate login with invalid credentials              | The login fails and an error message is displayed      | Pass
TC004   | Search for "Formal Shoes" after login                | Search results display items related to "Formal Shoes" | Pass
TC005   | Verify the presence of "Formal Shoes" in search results | "Formal Shoes" is present in the search results     | Pass
4.2 Black Box / White Box Testing
The testing implemented for the Shoe Portal follows a black-box approach. This method focuses
on evaluating the functionality of the Shoe Portal by testing its user interface and interactions
without knowledge of the internal code or system architecture. The tests check whether the portal
performs as expected in scenarios such as logging in, searching for products, and verifying search
results based on the requirements provided.
4.3 Code
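The code listing in the original report was captured as screenshots and did not survive extraction. The following TestNG class is a reconstruction sketch of the scenario described in the abstract (open the overflow menu, log in, search for "formal shoes", verify the results); the URL, locators, and credentials are placeholders, not the project's actual values.

```java
import java.time.Duration;

import org.openqa.selenium.By;
import org.openqa.selenium.Keys;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.support.ui.ExpectedConditions;
import org.openqa.selenium.support.ui.WebDriverWait;
import org.testng.Assert;
import org.testng.annotations.AfterClass;
import org.testng.annotations.BeforeClass;
import org.testng.annotations.Test;

public class ShoePortalTest {
    private WebDriver driver;
    private WebDriverWait wait;

    @BeforeClass
    public void setUp() {
        driver = new ChromeDriver();   // requires ChromeDriver on PATH
        wait = new WebDriverWait(driver, Duration.ofSeconds(10));
        driver.get("http://localhost:8080/index.html");   // placeholder URL
    }

    @Test(priority = 1)
    public void loginFromOverflowMenu() {
        // Steps 1-2: open the overflow menu and choose Login (locators assumed).
        wait.until(ExpectedConditions.elementToBeClickable(By.id("overflow-menu"))).click();
        wait.until(ExpectedConditions.elementToBeClickable(By.linkText("Login"))).click();

        // Step 3: log in with predefined credentials (placeholders).
        driver.findElement(By.id("username")).sendKeys("testuser");
        driver.findElement(By.id("password")).sendKeys("testpass");
        driver.findElement(By.id("login-btn")).click();
    }

    @Test(priority = 2)
    public void searchFormalShoes() {
        // Step 4: search for "formal shoes" and submit with Enter.
        driver.findElement(By.id("search")).sendKeys("formal shoes" + Keys.ENTER);

        // Step 5: verify the term appears in the results; the assertion
        // outcome determines whether the build is marked successful.
        String results = wait.until(ExpectedConditions
                .visibilityOfElementLocated(By.id("results"))).getText();
        Assert.assertTrue(results.toLowerCase().contains("formal shoes"),
                "Search results should mention the searched term");
    }

    @AfterClass
    public void tearDown() {
        driver.quit();   // release the browser session after the class runs
    }
}
```

The `priority` attributes enforce the order login-then-search, mirroring the sequential scenario in the abstract; in a larger suite, dependencies between tests would more idiomatically be expressed with `dependsOnMethods`.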
This concludes the discussion of the code and system architecture; the following pages present
the results and the future scope of the project.
Chapter 5
Results/Output
5.1 Snapshots
Fig 4: Execution
The snapshot above shows the execution of the automated test suite, demonstrating that the
implemented test scenarios run as intended.
Chapter 6
Conclusion/Future Scope
6.1 Conclusion
The automated testing of the Shoe Portal successfully validates the core functionalities of
the system through a black-box testing approach. By focusing on the user interface and
expected outcomes, the testing ensures that key features such as login, search
functionality, and result verification meet the specified requirements. The implementation
of these tests demonstrates the portal's reliability in providing accurate search results and
user interaction.
The use of Selenium with Java enabled efficient and effective test automation,
highlighting both the system’s strengths and areas for potential improvement. Overall, the
testing process confirms that the Shoe Portal operates as intended, providing a solid
foundation for further development and refinement.
6.2 Future Scope
1. Expanded Functional Testing:
• Include Additional Test Cases: Extend testing to cover more scenarios, including
edge cases and negative test cases, to ensure comprehensive validation of all
functionalities.
• Mobile Testing: Adapt tests for mobile versions of the Shoe Portal to ensure
functionality and usability across different devices and screen sizes.
2. Integration Testing:
• API Testing: Implement tests for backend APIs to verify data integrity and response
accuracy between the frontend and backend systems.
• Integration with Payment Systems: Test the integration of payment gateways to
ensure secure and smooth transactions within the portal.
3. Performance Testing:
• Load Testing: Assess how the portal handles high traffic volumes and multiple
simultaneous users to ensure stability and performance under stress.
• Response Time Analysis: Measure and optimize the response time of critical
features such as search and login processes.
4. Continuous Testing and Maintenance:
• Automate Test Execution: Integrate automated tests into the CI/CD pipeline to
enable continuous testing and faster feedback during development.
• Regular Updates: Schedule regular updates and maintenance of test scripts to align
with new features and changes in the portal.
5. User-Centric Testing:
• Usability Testing: Conduct usability tests to gather user feedback and improve the
overall user experience and interface of the Shoe Portal.
• Accessibility Testing: Ensure the portal meets accessibility standards to provide a
better experience for users with disabilities.
These future improvements will enhance the robustness, usability, and performance of the
Shoe Portal, ensuring it meets evolving user needs and technological advancements.
References