

Through this examination, our report underscores a nuanced evolution within the QA landscape.
This evolution, marked by gradual shifts rather than abrupt changes, is closely linked to the pace
of software delivery. Thorough testing processes drive incremental improvements over time,
fostering steady progress instead of sudden transformations.

THIRD EDITION

Software Testing
& Quality Report



Contents
Introduction

Foreword

State of Quality Report

QA Process and Benchmarks

Challenges, Priorities, & KPIs

The Future of Testing

Survey Details

Conclusion

About TestRail



Introduction

Drawing on insights from thousands of quality assurance (QA) teams worldwide, we present the
findings from an in-depth exploration of current challenges, priorities, and emerging trends to
offer a comprehensive look into the steady evolution of software testing in 2023.

Since its inception in 2018, our annual user survey has played a pivotal role in our mission
to understand and address the evolving needs of QA teams worldwide. By connecting with
thousands of QA teams globally, we gain valuable insights into their current practices, challenges,
and priorities, allowing us to deliver timely solutions and drive excellence in test management
systems.

The survey for the Software Testing & Quality industry report gathered insights predominantly
from professionals directly involved in testing processes. Knowing that the respondents are
mainly hands-on testing professionals, such as QA/Test Engineers and Analysts, ensures that the
insights derived from the survey are grounded in practical experience and are directly applicable
to those in similar roles.


The geographical distribution of respondents, with a significant portion from the USA and India,
indicates that the survey results are likely influenced by the practices and industry standards
prevalent in these major tech hubs. This is vital for benchmarking global and regional practices
and adapting strategies that cater to these influential markets.

The representation of key industries like computer software, healthcare, financial services,
and game development highlights the sectors where testing and QA are most critical, guiding
industry-specific analyses and improvements. The increasing size of QA teams reported in the
survey underscores a growing recognition of the importance of quality assurance in product
development, indicating a broader industry trend towards more robust testing practices.


In line with our commitment to providing valuable insights, the third edition of the Software
Testing & Quality report continues this tradition, equipping QA teams with actionable insights to
navigate the complexities of testing in a steadily evolving landscape.

This report delves into the current state of QA, examining survey findings across three main areas:
Current trends in development, processes, and testing
QA responsibilities, challenges, and priorities
The future of testing



SECTION 01

Foreword


Now in its third edition, the Software Testing & Quality Report provides both user-sourced insights
and expert analysis on critical operational areas of software testing and quality assurance
experienced by our TestRail community. Our motivation to engage annually with our users reflects the heuristics-based approach that TestRail uses to identify and confirm test management gaps and product-led enhancements while staying aligned with industry trends.

In many industries, rapid progression and dramatic shifts in technology are both experienced and
expected. While many had expectations coming into 2023 that technologies such as machine
learning (ML) and artificial intelligence (AI) would impact software testing, the reality of the users
we surveyed indicates that these technologies still have a long way to go to live up to the hype.
Now more than ever, organizations are focusing on proven tools and technologies that will allow
teams to develop and deliver faster all while maintaining set standards of software quality. It’s
possible that one day AI and ML will revolutionize testing as we know it, but current practices
indicate more “AI assisted” approaches that still require a human touch, such as AI-enhanced test
case development.

One consistent area of both progression and evolution in the user survey is the increasing
demand for organizations to ship releases faster, with higher quality, all while leveraging reduced
engineering and testing resources. Given the global instability in the technology job market, many
teams are leveraging pooled resourcing using centralized Testing Centers of Excellence (TCOE) or
external contracting firms.

A byproduct of the lack of resources and the desire to increase agility and decrease time to market
is an ever-growing QA tech stack. In many cases, this array of tooling lacks cohesion and leads to
splinters and silos within QA teams—which continues to drive the need for a single source of truth
in a centralized test management solution.

Overall, this year’s user survey indicates that QA is in a state of evolution rather than revolution. AI and ML didn’t change the landscape of the industry overnight, but they are showing incremental promise. Organizations are focused on shipping releases faster, but even more focused on shipping releases bug-free. Technologies like test automation and CI/CD have made testing more efficient, but now the challenge is finding cohesion in an increasingly siloed tech stack.

This is the state of QA: taking small steps forward wherever we can, but laser-focused on making
the most out of the resources we have. We hope this report proves to be a useful resource itself—
here, you have 4,000 other QA professionals on-hand, ready to tell you about their wins, their
struggles, their goals, and their hopes for the future.

We hope you’ll finish this report inspired by what you’ve learned, and as optimistic about the future
of QA as we are.

Happy testing,
Simon Knight
Lead Product Manager



SECTION 02

State of Quality Report


Trends in Development and Testing


The “Trends in Development and Testing” section focuses on understanding trends in manual
versus automated testing and the dynamic tooling landscape used by development teams. It’s
divided into two parts to provide a comprehensive analysis.

In the first part, “Manual vs Automated Testing,” we analyzed survey data to understand
the trends between manual and automated testing to provide insights into their usage and
effectiveness as practices and tooling evolve.

The second part, “QA Testing Tools and Technologies,” explores the diverse tech stacks used by
testing teams and the trends in tool adoption within the industry. From managing requirements
and defects through industry-standard solutions like Jira, to embracing advancements in CI/CD
tooling, respondents share valuable insights that reveal the current landscape of development
and testing trends.

Continue exploring this section to uncover insights on the following topics:


Testing performed manually
Testing performed with automation
Automated tool tech stacks
Tools used for requirement and defect tracking
Tools used for CI/CD


Manual vs Automated Testing


Section Questions:

What kinds of testing does your organization currently do MANUALLY?

What kinds of testing does your team run with TEST AUTOMATION?

On average, how many automated tests does your organization run per day?

What percentage of your tests are automated versus manual?



What kinds of testing does your organization currently do MANUALLY?
Total Respondents: 2,426

Key Findings
Manual testing remains the dominant method across various types of testing, with an increase
compared to the previous year. Functional, regression, smoke, and end-to-end testing are
most commonly performed manually, which is to be expected as these tests often establish the
foundational success and performance of a build.

While the data shows a lift in manual testing across the board compared to previous years, two
types of testing did not see an increase: regression and integration testing.

What kinds of testing does your team run with TEST AUTOMATION?
Total Respondents: 2,234

Key Findings
The top three automated testing types are regression, functional, and unit testing, reflecting the
necessity of thoroughly validating builds before proceeding with additional testing.

Notable changes from the previous year include a decrease in integration and unit testing,
alongside an increasing emphasis on automating functional testing.

On average, how many automated tests does your organization run per day?
Total Respondents: 2,288

Key Findings
While it can be difficult to measure how many automated tests an organization is running, overall the data shows that companies are increasing their number of automated tests. Specifically, there has been a year-over-year decrease in the percentage of teams running 0–100 tests per day, with 64% of respondents reporting that they now perform more than 100 tests daily.

What percentage of your tests are automated versus manual?
Total Respondents: 2,190

Key Findings
Each year we ask respondents to share the percentage of tests they currently automate versus
the percentage of tests they desire to automate in the future. In previous years, the gap between
current and desired automation levels has been about 20%.

Slowly but surely, that gap has narrowed, with this year’s results showing just a 16-point difference. On average, respondents reported automating 40% of their tests but aim to increase that to 56% next year. This trend suggests that teams are actively working to close the gap between current and desired automation levels.


QA Testing Tools and Technologies


Section Questions:

What test automation tools, suites, or frameworks do you use?

How does your team document and track requirements?

What tool does your team use to track defects/bugs?

How many Jira users are in your organization?

Does your team use continuous integration/continuous deployment (CI/CD) in your development process? If so, what tool or platform does your team use?

What test automation tools, suites, or frameworks do you use?
Total Respondents: 2,122

Key Findings
This year’s survey found that QA professionals continue to prefer open-source frameworks and tools for test automation. 44% of respondents report using Selenium, with Cypress as the second most used tool, followed by Java frameworks leveraging TestNG.
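To make this concrete, the sketch below (not drawn from the survey) shows the kind of automated functional check teams commonly write with Selenium’s Python bindings; the URL, element locators, and expected title are hypothetical placeholders.

# Minimal Selenium (Python) sketch of an automated functional login check.
# Requires "pip install selenium" and a local browser driver; all page details are placeholders.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
try:
    driver.get("https://example.com/login")                          # hypothetical page under test
    driver.find_element(By.NAME, "username").send_keys("demo-user")
    driver.find_element(By.NAME, "password").send_keys("demo-pass")
    driver.find_element(By.ID, "submit").click()
    assert "Dashboard" in driver.title, "login did not reach the dashboard"
finally:
    driver.quit()

Equivalent checks can be written in Cypress (JavaScript) or TestNG (Java); the structure of driving the UI and then asserting on an observable outcome is the same across frameworks.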

How does your team document and track requirements?
Total Respondents: 2,122

Key Findings
QA teams were also surveyed on the tools they use to document and track requirements. Jira
maintains its popularity year-over-year with 76% of respondents reporting its use to track their
requirements.

Notably, many of the tools respondents report using for requirement tracking are work
management tools (such as Confluence, Word or Google Docs, and spreadsheets) rather than
tools designed specifically for requirement tracking.

What tool does your team use to track defects/bugs?
Total Respondents: 2,164

Key Findings
Jira is by far the most popular defect tracking tool, with the majority (82%) of respondents relying
on it to track defects and bugs during testing.

While Azure DevOps and GitHub Issues are also notable tools used by testing teams, there is a
significant gap between their usage and that of Jira, with less than 10% of respondents utilizing
these alternatives.
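Teams that automate this handoff often file defects programmatically. The sketch below uses Jira’s documented REST endpoint for creating issues (POST /rest/api/2/issue) via the Python requests library; the base URL, credentials, and project key are placeholders rather than values from the survey.

# Sketch: filing a bug in Jira through its REST API from a test hook.
# The base URL, credentials, and project key are placeholders.
import requests

def report_defect(summary: str, description: str) -> str:
    payload = {
        "fields": {
            "project": {"key": "QA"},      # hypothetical project key
            "issuetype": {"name": "Bug"},
            "summary": summary,
            "description": description,
        }
    }
    resp = requests.post(
        "https://your-domain.atlassian.net/rest/api/2/issue",
        json=payload,
        auth=("user@example.com", "api-token"),
    )
    resp.raise_for_status()
    return resp.json()["key"]              # e.g. "QA-123"

A CI job or test-framework hook can call report_defect() when an automated check fails, keeping the defect tracker in sync with test results.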

How many Jira users are in your organization?
Total Respondents: 2,159

Key Findings
49% of respondents reported having over 100 Jira users in their organization. This indicates a trend of increasing Jira users over time, as the categories for 1-10 users, 11-50 users, and 51-100 users seem to be decreasing annually.

Does your team use continuous integration/continuous deployment (CI/CD) in your development process? If so, what tool or platform does your team use?
Total Respondents: 2,059

Key Findings
CI/CD tools have become essential components of the development and testing process,
particularly with the adoption of automation within DevOps practices. Jenkins has consistently
maintained its position as the most popular tool among respondents for the past four years, with
38% still selecting it as their preferred CI/CD tool. However, GitHub Actions and GitLab CI/CD
Pipelines are the second and third most popular tools, respectively.

The data indicates a rising adoption of GitHub Actions, GitLab CI/CD Pipelines, and Azure
Pipelines among respondents. Conversely, there has been a decrease in adoption for Bitbucket
Pipelines, TeamCity, CircleCI, and Travis CI.


Section Summary
The survey data provides valuable insights into the evolving landscape of development and
testing, particularly concerning the interplay between manual and automated testing and the
utilization of testing tools and technologies.

Despite the increasing emphasis on automation, manual testing continues to play a significant
role in QA processes, with respondents indicating a rise in manual testing compared to the
previous year.

The data reflects a positive trajectory in automation adoption, with an increasing number of
companies running automated tests daily. Additionally, there is a narrowing gap between the
percentage of tests currently automated and the desired future state, indicating a concerted
effort by teams to bridge this divide and automate a larger portion of their tests.

Open-source frameworks like Selenium continue to dominate the testing tools arena, reflecting
a preference for flexible, community-driven solutions among QA professionals. Similarly, Jira
remains the go-to choice for requirement tracking and defect management.

The integration of CI/CD tools into the development and testing pipeline is on the rise, with
Jenkins retaining its position as the most popular choice. However, there is a notable uptick in
adoption for GitHub Actions, GitLab CI/CD Pipelines, and Azure Pipelines, suggesting a shift
towards more modern and integrated CI/CD solutions.

Overall, the survey data underscores a continued emphasis on automation, a preference for open-
source tools, and a gradual evolution towards more streamlined and integrated testing processes.



SECTION 03

QA Process and Benchmarks



The “QA Process and Benchmarks” section provides an in-depth analysis of current testing practices and benchmarks within the industry. It’s divided into two parts:

The first part, “QA Practices,” explores the testing landscape from test definition to release deployment, highlighting the importance of collaboration among internal and external teams across different methodologies.

The second part, “QA Benchmarks,” delves deeper into best practices, examining the tasks and strategies teams use to gauge their operational efficiency and performance.

Continue exploring this section to uncover insights on the following topics:


Who defines tests within organizations
Release cycles
Development and testing methodologies
Partnering with external organizations
Defect backlog management
Compliance strategies
Test planning in the development lifecycle
Root cause analysis (RCA)
Testing activities management

QA Practices


Section Questions:

How often do these roles define tests in your organization?

Does your team use the following development methodologies or techniques today?

How often does your organization deploy new releases or ship new products?

How many external organizations do you partner with to help with testing?

How often do these roles define tests in your organization? (1 means they never define tests and 5 means they always define tests)
Total Respondents: 2,261

Key Findings
The primary team members defining tests are QA/Tester team members, scoring an average of 4.6 out of 5. Following closely, software developers also commonly play a significant role in defining tests.

Other team members contribute to defining tests as well, albeit to a lesser extent. On average, non-QA team members define tests about half of the time, scoring 2.4 out of 5. This group includes software developers, product managers, project/implementation managers, DevOps engineers, UX/UI designers, business analysts, compliance teams, and external partners.

Does your team use any of the following development methodologies or techniques today?
Total Respondents: 2,157

Key Findings
Development teams across the industry employ a diverse range of methodologies and
techniques. The top three methodologies used by testing teams today include Agile (69%), Scrum
(55%), and CI/CD (34%). The adoption rates of Agile and Scrum are notably higher, with at least
20% more usage compared to any other methodology.

How often does your organization deploy new releases or ship new products?
Total Respondents: 2,195

Key Findings
The decision on release cadence is often determined by an organization’s specific needs, resulting in release cycles that vary from daily to quarterly or longer intervals. The largest share of survey respondents (22%) indicated that their organization deploys bi-weekly.

Weekly and monthly release deployments are also prevalent, each accounting for 18% of
responses.

Notably, 69% of organizations are deploying new releases monthly or more frequently. This
suggests that companies continue to find value in deploying frequently.

How many external organizations do you partner with to help with testing your product(s)?
Total Respondents: 2,195

Key Findings
To address the demand for accelerated testing, many teams opt to augment their resources by
partnering with an external organization like a consultant or contractor.

While 39% of respondents reported managing all testing in-house, a substantial 60% of
respondents indicated that they are using at least one external organization to help them with
testing. Additionally, 41% of respondents said they partner with at least two external organizations
to help test their product(s).


QA Benchmarks
Section Questions:

What does your defect backlog trend look like?

What strategies do you employ to ensure compliance with industry-specific regulations while maintaining efficient testing processes?

At what stage in the development lifecycle does test planning start?

Does your team conduct root cause analysis on escaped defects or issues?

How are activities for testing, quality, and automation managed as part of a backlog?

What would you consider the rate of your hotfixes versus your planned releases?

What does your defect backlog trend look like?
Total Respondents: 2,244

Key Findings
As testing teams develop and refine their processes, they also need to consider how to manage specific testing and automation activities. Often, this requires placing these activities in a backlog.

In an ideal world, backlogs would diminish over time. However, survey results show that 36% of
respondents reported an increasing backlog, while 38% indicated that their backlog remained
relatively the same, and 27% reported a decrease in their backlog.

What strategies do you employ to ensure compliance with industry-specific regulations while maintaining efficient testing processes?
Total Respondents: 2,317

Key Findings
Testing teams operating in regulated industries like finance, healthcare, and energy must ensure compliance with industry-specific standards such as the Health Insurance Portability and Accountability Act (HIPAA), the General Data Protection Regulation (GDPR), or the Sarbanes–Oxley Act (SOX). To meet these standards, teams can employ various strategies.

Over half of the survey respondents responsible for compliance maintenance indicated they
do this by reviewing requirements with the entire team during planning sessions. Additionally,
common approaches include linking test artifacts to user stories (40%), clearly defining a
‘definition of done’ (37%), and having a subject matter expert provide acceptance demos for the
project scope (35%).

At what stage in the development lifecycle does test planning start?
Total Respondents: 2,209

Key Findings
Test planning can begin at various stages within the Software Development Life Cycle (SDLC),
but initiating it earlier is generally advantageous. The survey data shows that most respondents
(51%) start test planning early in the development lifecycle, specifically during the design or
requirements phase.

Additionally, 33% of respondents said they start test planning while code is actively being
developed, while the remaining 16% initiate test planning after a feature has been coded and
developed.

Does your team conduct root cause analysis on escaped defects or issues?
Total Respondents: 2,213

Key Findings
Root cause analysis (RCA) plays a crucial role in identifying and addressing the actual cause of defects rather than just the symptoms. The survey results highlight the importance of RCA, with the majority (79%) of respondents stating that their teams perform RCA on escaped defects. The remaining 21% of respondents indicated that their teams do not conduct any kind of RCA.

How are activities for testing, quality, and automation managed as part of a backlog?
Total Respondents: 2,166

Key Findings
Testing organizations often need to employ a strategy to manage a variety of activities beyond
test execution. These activities may encompass plans for automation, addressing technical debt,
or even determining necessary actions.

According to survey data, 75% of respondents have some activities planned as part of a backlog.
Among these activities, 29% of respondents plan to automate most activities, 11% have plans to
address tech debt, and 35% are actively tracking, managing, and addressing tech debt.

What would you consider the rate of your hotfixes versus your planned releases?
Total Respondents: 2,258

Key Findings
The survey data reveals a distribution of hotfix rates among respondents. Notably, 15% reported a
high hotfix rate, suggesting frequent issues requiring immediate attention. This could potentially
signify areas for improvement in software stability or deployment processes.

Meanwhile, the largest share (47%) reported a medium hotfix rate, indicating a balance between occasional issues and overall stability—a common scenario in software maintenance.

Interestingly, a significant portion (38%) reported a low hotfix rate, suggesting a relatively stable
software environment with fewer urgent issues.


Section Summary
The survey data provides valuable insights into the state of QA processes and benchmarks
within the industry, illustrating the crucial role of QA and tester team members in test definition,
augmented by significant contributions from software developers.

The prevalence of Agile, Scrum, and CI/CD methodologies reflects the adaptability of testing
teams to modern development practices. Moreover, a diverse range of release cadences,
spanning from daily to quarterly, underscores the importance of flexibility in deployment
strategies.

To enhance testing efficiency, many teams are leveraging external resources, with a majority
partnering with at least one external organization for testing purposes. This collaborative
approach allows teams to augment their capabilities and effectively address resource constraints.

Early test planning in the development lifecycle emphasizes the importance of proactive testing
strategies, while widespread adoption of root cause analysis demonstrates a commitment to
quality improvement.

Despite the challenges in backlog management, with a significant portion experiencing an increase in backlog size, the prioritization of activities within backlogs reflects a strategic approach to managing testing-related tasks, with automation and tech debt emerging as key focus areas for many teams.

The distribution of hotfix rates among respondents suggests varying degrees of software stability
and deployment efficiency, indicating potential opportunities for improvement.

Overall, the findings underscore the paramount importance of collaboration, both internally
and externally, along with a proactive approach to the strategic management of testing-related
activities.



SECTION 04

Challenges, Priorities, and KPIs


Challenges in Testing
Section Questions:

What kinds of compliance or regulatory standards does your QA team have to abide by?

What are your/your team’s biggest challenges around testing & QA right now?

What kinds of compliance or regulatory standards does your QA team have to abide by?
Total Respondents: 1,973

Key Findings
A majority (63%) of respondents report that their teams are not governed by specific compliance
or regulatory standards. This suggests a diverse project landscape and potentially varying levels
of risk management strategies in place. Among those who are required to navigate compliance-
related complexities, ISO 9001 (quality management systems standards) emerges as the most
prevalent standard (22%), emphasizing its significance in quality management processes. Close
behind, ISO/IEC 27001 (international standard for information security management) is adhered
to by 17% of respondents, underscoring the critical nature of information security management in
today’s digital age.

Furthermore, HIPAA (12%), GDPR/CCPA/LGPD (10%), and SOC 2 (9%) reflect the increasing focus
on data protection and privacy in healthcare and general business practices. Interestingly, less
common standards like PCI DSS, FedRAMP, FDA GxP, HITRUST, FISMA, and GAMP5 demonstrate
the wide-ranging and specific compliance needs across different sectors.

This diverse regulatory landscape highlights the importance for QA teams to remain agile and
well-informed, ensuring their testing strategies effectively meet compliance requirements while
maintaining efficiency.

What are your/your team’s biggest challenges around testing & QA right now?
Total Respondents: 1,856

Key Findings
While the challenge of developing automated tests remains a concern, its prominence has slightly declined, from 48% in 2020 to 41% in 2023. This suggests that teams are adapting or that automation tools are maturing, even though the issue remains prominent.

Time constraints continue to pose a significant hurdle, with 32% of respondents in 2023
expressing difficulty finding enough time to complete testing tasks, a slight increase from 31% in
2020. This pressure underscores the constant demand for development teams to deliver quality
results swiftly.

Moreover, ensuring adequate staffing for testing roles has become more concerning, rising from
28% in 2020 to 32% in 2023. This may indicate some economic disruption in the technology
sector and a growing recognition of the importance of testing roles.

End-to-end testing and managing data/testing environments have seen some fluctuation in
concern but remain significant issues. In 2021, end-to-end testing was a concern for 35% of
teams, dropping to 23% in 2023, possibly reflecting advancements in integration testing tools and
methodologies.


Testing Priorities and KPIs


Section Questions:

What are your/your team’s top objectives around testing & QA right now?

On a scale of 1 to 5, rate how important each of the following priorities is in your QA processes.

Which metrics or KPIs does your team track and report on?

What are your/your team’s top objectives around testing & QA right now?
Total Respondents: 1,640

Key Findings
In 2023, reducing bugs in production emerged as the top priority with a score of 8.5, indicating
a continuous effort to improve product reliability. Streamlining testing processes and expanding
test automation closely followed, scoring 8.2 and 8.0 respectively, signaling a shift towards more
efficient and automated test environments.

Comprehensive test coverage retained its significance, tied with automation at 8.0, indicating a
commitment to thorough testing across all aspects of software.

Notably, incorporating testing earlier in the development cycle, a practice known as “shift left,”
scored 6.6, reflecting a growing adoption of proactive testing strategies. Other areas, such as
reducing the testing cycle for new releases and expediting bug resolution times, also received
considerable attention, emphasizing the importance of speed in the testing process.

Involving a broader range of stakeholders in testing processes was the least prioritized objective,
with a score of 4.8, pointing towards a more targeted engagement strategy in testing activities.
These insights reflect an evolving landscape where quality, efficiency, and automation are key
drivers in testing strategies.

On a scale of 1 to 5, rate how important each of the following priorities is in your QA processes. (Rank from highest [1] to lowest [12] priority.)
Total Respondents: 1,855

Key Findings
“Being more efficient” consistently emerges as the top priority, with a weighted average of 4.2,
emphasizing a continual push for streamlined processes. Completing testing promptly and
ensuring test relevance also hold significant importance, both maintaining a weighted average of
3.9 in 2023.

There’s a noted increase in the value placed on “Improving collaboration with the development
team,” rising to 3.9, indicating a heightened drive for integration between QA and development
teams.

Conversely, areas such as “Automating more of your team’s test cases” and “Maintaining
traceability between requirements, tests, and defects” saw slight decreases, suggesting evolving
challenges or shifts in both testing and overall quality assurance. “Maintaining compliance with
regulatory requirements” saw a dip to 3.4, hinting at a stable yet less critical concern compared to
other areas.
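For reference, the weighted averages cited here follow the standard Likert-style calculation: multiply each rating by the number of respondents who selected it, then divide by the total number of responses. A tiny illustration with invented response counts:

# Toy weighted-average calculation for a 1-5 importance rating.
# The response counts below are invented for illustration only.
counts = {1: 40, 2: 110, 3: 380, 4: 720, 5: 605}   # rating -> number of respondents
weighted_avg = sum(rating * n for rating, n in counts.items()) / sum(counts.values())
print(round(weighted_avg, 1))                       # 3.9 on this invented data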

Which metrics or KPIs does your team track and report on?
Total Respondents: 1,941

Key Findings
The survey results uncover insightful trends regarding the metrics and KPIs prioritized by QA
teams. A significant 50% of teams focus on test pass/fail rate, underscoring its importance in
assessing the immediate success of testing efforts.

45% of respondents prioritize tracking the number of defects reported in production, emphasizing the significance of identifying issues before they impact end-users. Operational efficiency and effectiveness are further illustrated by the 29% of teams tracking the total number of tests executed, with other crucial metrics like testing progress, test cycle time, and requirements coverage following closely behind.

Interestingly, the survey points to a balanced approach between traditional testing metrics
and the growing recognition of automation’s role in QA. This is evident in respondents tracking
automated tests executed (19%) and the percentage of automated versus manual tests (19%).
Such findings reflect an understanding of QA’s evolving landscape, blending efficiency with
comprehensive test coverage.
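For teams starting to report on these KPIs, the two most tracked metrics reduce to simple ratios over a set of test results. A minimal sketch, using an invented results structure rather than any particular tool’s export format:

# Toy calculation of two common KPIs from a list of test results.
# The result records are invented for illustration only.
results = [
    {"name": "login works",    "passed": True,  "automated": True},
    {"name": "report export",  "passed": False, "automated": True},
    {"name": "invoice layout", "passed": True,  "automated": False},
    {"name": "password reset", "passed": True,  "automated": True},
]

total = len(results)
pass_rate = 100 * sum(r["passed"] for r in results) / total
automated_share = 100 * sum(r["automated"] for r in results) / total
print(f"Test pass rate: {pass_rate:.0f}% passed")                 # 75% on this invented data
print(f"Automated vs manual: {automated_share:.0f}% automated")   # 75% on this invented data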


Section Summary
This year’s survey data provides a comprehensive overview of the challenges facing QA
teams and their evolving priorities. Concerning challenges, the diverse regulatory landscape
underscores the need for teams to remain agile and informed, ensuring their testing strategies
are both effective and compliant. Persistent time constraints and staffing issues indicate the
constant pressure QA teams face to deliver quality results swiftly and the growing awareness of
the importance of QA roles.

In terms of priorities, our respondents’ focus is squarely on enhancing software quality and operational efficiency. Top priorities include reducing bugs in production, streamlining testing processes, and expanding test automation, reflecting a shift towards more efficient and automated QA environments. Additionally, there is continued interest in proactive testing strategies like “shift left,” emphasizing a trend towards integrating testing earlier in the development cycle and making quality an ongoing process.

In terms of metrics and KPIs, there is a balanced approach between traditional testing metrics
and the growing recognition of automation’s role in the overall testing and quality assurance
strategy. Key indicators such as test pass/fail rate and the number of defects reported in
production highlight the immediate success of testing efforts and the importance of identifying
issues before they affect end-users. This nuanced understanding of QA’s evolving landscape
emphasizes efficiency alongside comprehensive test coverage.



SECTION 05

The Future of Testing


In “The Future of Testing” section, we begin exploring technological adoption and strategic
integration within QA processes.

This exploration begins by examining whether teams are considering the adoption of various
testing types in the next 12 months. This reveals emerging trends and priorities that are set to
shape the future landscape of software testing.

Following this, we delve into the role of artificial intelligence (AI) in software testing, a topic at the
forefront of innovation in the field. We uncover how teams are currently integrating AI into their
workflows, from automating routine tasks to enhancing the precision of test cases. This section
not only highlights the progressive shift towards more advanced and efficient testing strategies
but also captures the industry’s pulse on embracing AI-driven solutions to address the challenges
of modern software development.

Through this analysis, we aim to provide a comprehensive outlook on how development teams
are positioning themselves for the future, underscoring the technologies and methodologies
poised to redefine the standards of quality and efficiency in testing.

Continue exploring this section to uncover insights on the following topics:


Future adoption of testing types/methods
Use of AI in software testing and quality assurance

Is your team considering adopting any of the following kinds of testing in the next 12 months?
Total Respondents: 1,866

Key Findings
The survey results reveal an emphasis on automation across various testing types, with
automated regression testing being the top priority for 39% of respondents. This underscores a
growing recognition of the efficiency and reliability benefits that automation brings to testing and
quality assurance processes.

Following closely behind is load/performance testing, indicating a heightened awareness of the importance of ensuring software performance under varying workloads. Functional and unit testing are also prominent, suggesting a commitment to maintaining core functionality and code integrity.

Notably, there is considerable interest in automated UI testing and automated end-to-end testing, reflecting the industry’s focus on delivering seamless user experiences. However, the relatively lower percentages for manual regression testing and smoke testing indicate a shift towards automated solutions, possibly driven by the need for speed and scalability in modern development environments.

Overall, these findings reveal a concerted effort among QA teams to leverage automation
technologies to enhance testing efficiency, accuracy, and ultimately, software quality.

How do you incorporate AI into your quality assurance processes?
Total Respondents: 2,134

Key Findings
This year, we included a question on how QA teams are leveraging AI within testing and QA
processes to benchmark exploration into this evolving area. More than half (54%) of respondents
indicated that they do not currently incorporate AI technology into their quality assurance efforts,
highlighting a considerable portion of the industry still on the verge of AI adoption. However,
22% of teams are leveraging AI to write test cases or scenarios, while 19% use it to create test
automation scripts, demonstrating an open-mindedness within the industry towards AI’s potential
to streamline and enhance testing.

A smaller but noteworthy fraction of respondents (14% and 12% respectively) are employing AI
for the management of test data and debugging of test code, indicating a recognition of AI’s
capability to address more nuanced aspects of modern software testing. Additionally, evaluating
results and assisting in test environment preparation were noted uses, albeit by a smaller
fraction of respondents, highlighting emerging areas where AI could significantly impact testing
efficiency.

Notably, 10% of respondents express no desire to incorporate AI technology, pointing to resistance or a perceived lack of necessity among certain segments of the testing and quality assurance community.



SECTION 06

Survey Details

Which of the following best describes your job responsibility and title:
Total Respondents: 4,214

Key Findings
We’ve surveyed our users annually since 2018, and this year marks the third edition of the Software
Testing & Quality Report. This report compares and contrasts survey results from 2020–2023 to
better understand trends, changing work styles, and shifts in focus and priority across QA teams
globally.

Each year, we extend an invitation to every TestRail user to complete our annual survey. The most
recent survey received a total of 4,214 responses.

The largest group of survey participants identified their role as “QA/Test Engineer,” comprising
30.9% of total responses. Following closely, “QA/Test Analyst” represented the second largest
group, at 13.6%. “QA/Test Lead” and “QA Manager” accounted for 13.1% and 8.1% respectively. These
findings suggest that our feedback primarily comes from actual testers—those using TestRail to
create, execute, and analyze tests on a daily basis.

What country are you based in?
Total Respondents: 4,156

Key Findings
The responses to “Which country are you based in?” have been roughly consistent year-over-year.
The United States of America and India make up nearly half of survey participants, with 25% and
21% respectively. Other top countries include Canada, the United Kingdom, and Ukraine.

How would you rate your level of proficiency regarding software testing?
Total Respondents: 4,192

Key Findings
This year, we added a question to gauge survey participants’ self-assessed testing proficiency.
70% of respondents consider themselves to have an above average level of proficiency regarding
software testing.

Summarize your SDLC or Agile Maturity:
Total Respondents: 2,186

Key Findings
We also asked respondents to gauge their team’s maturity level within an agile software
development lifecycle. Among them, 57% reported being “mostly agile” and 21% described their
team as “seasoned” with a “high level of agility.” These responses suggest a significant shift in
the industry away from the waterfall model of development. While most teams still don’t consider
themselves agile veterans, the majority are actively adopting agile methodologies.

What industry does your organization work in?
Total Respondents: 3,477

Key Findings
As seen in prior years of our user survey, the top 4 industries represented continue to be
computer software, healthcare, financial services, and game development.

How large is the company you work for?
Total Respondents: 3,951

Key Findings
The survey results reveal a wide range of company sizes. 21% of respondents said they work for
smaller companies made up of 100 or fewer employees.

Nearly 40% of respondents reported working for large organizations with 1000 or more
employees.

Respondents working for larger companies often have more structured processes and tools while
smaller companies might face more resource constraints.

How large is your organization’s software development and engineering team?
Total Respondents: 3,571

Key Findings
82% of respondents reported working with a software development team of more than 10 people.
24% of respondents work with a development team of more than 100 but less than 1000 people.

Approximately how many individuals are dedicated to QA in your organization?
Total Respondents: 3,567

Key Findings
Very few survey respondents reported 0 employees dedicated to testing and QA at their
organization. The QA team size represented in our annual survey has been steadily increasing
year-over-year, most notably in the 51–100 employee range.



SECTION 07

Conclusion


Summary of Key Trends


The data from the third edition of this report offers valuable insights into the current landscape
and emerging trends shaping the field of QA. Teams must continue to embrace efficiency and
streamlined processes to deliver high-quality products with speed, accuracy, and confidence.

The report highlights the following:


Continuing investment in automation
Collaborating internally and externally
Identifying strategic testing objectives
Preparing for the future

A notable emphasis emerges on continuing automation investments while retaining the value of
human expertise through manual testing. As teams navigate the increasing complexities of the
software testing environment, collaboration emerges as a cornerstone, both within organizations
and through external partnerships. This collaborative effort enables the pooling of resources,
ensuring comprehensive testing coverage while addressing staffing constraints.

The report stresses the importance of defining clear testing objectives and metrics to drive quality
assurance efforts. By establishing clear goals and performance indicators rather than focusing
solely on activity metrics, teams can accurately measure progress and prioritize efforts effectively.
This proactive approach also extends to fostering a cohesive team environment.

Looking ahead, the adoption of AI in testing and quality assurance holds promise for enhancing testing methods and efficiency. There is a growing recognition of AI’s potential to streamline test automation, test case creation, and test data management. As teams explore AI integration and adopt new methodologies and techniques, they stand to redefine quality processes and testing efficiency.

The State of Quality report offers a comprehensive overview of ‘where we are and where we’re
going’. By leveraging actionable insights from thousands of development teams worldwide,
organizations can strategically shape their testing strategies, prioritize initiatives, and embrace
emerging technologies to drive excellence in software quality assurance. As the industry continues
to evolve, collaboration, innovation, and adaptability will be key drivers of success in navigating the
complexities of modern software development.



About TestRail
Gurock Software was founded in 2004 and now has offices in Frankfurt, Dublin, Austin, and
Houston. Our flagship test case management solution, TestRail, is used by more than 100,000
members of development and QA teams to build rock-solid software—including companies like
Amazon, NASA, Adobe, Sony, PayPal, and Siemens.

TestRail is the only platform that empowers QA teams to build, connect, and optimize all their
testing processes. TestRail’s Quality OS centralizes manual and automated test management
and gives you visibility into your entire quality operation so you can manage your team more
flexibly and build repeatable, scalable workflows. And, with a unified platform that integrates
with your DevOps pipelines, you can share testing timelines, data, and insights across your whole
organization.

TestRail is a leader in the G2 Grid for Test Management and Software Testing, with top ratings
year-over-year for best results, most implementable, and overall enterprise leader. For more
independently verified research and reviews, visit the TestRail page at G2 or Capterra.

Let TestRail lift your team out of chaos and toward faster, frictionless releases. Experience the
TestRail difference with a free trial today—no credit card required. Get started today and start
releasing higher-quality software, faster!

Try TestRail Get a Demo
