01 Build a Software Quality Assurance Program, Phases 1-4

The document outlines the evolution and importance of Quality Assurance (QA) in software development, emphasizing its role as a strategic partner rather than a cost center. It presents a structured approach to building a QA program, highlighting the need for integration throughout the solution delivery lifecycle and the adoption of modern practices like automation and AI. Key challenges in QA implementation are identified, along with strategies to enhance collaboration, define quality standards, and ensure continuous improvement in software quality.

Build a Software Quality Assurance Program
Build a robust strategy and ensure quality is at the core of everything you do.

Info-Tech Research Group Inc. is a global leader in providing IT research and advice. Info-Tech's products and services combine actionable insight and relevant advice with ready-to-use tools and templates that cover the full spectrum of IT concerns.
© 1997-2024 Info-Tech Research Group Inc.
Table of Contents

3 Analyst Perspective
4 Executive Summary
5 Insight Summary
23 Phase 1: Assess Your QA Process
50 Phase 2: Align on Improved QA Practices
74 Phase 3: Build Your QA Toolbox
98 Phase 4: Establish a QA Roadmap
114 Bibliography
116 Appendix


Analyst Perspective

Quality assurance (QA) has undergone a significant transformation over the years, evolving from a step in the development lifecycle to a pivotal, integrated process throughout the solution delivery cycle. This shift reflects a broader understanding that QA is not merely a gatekeeper of quality but a strategic partner in ensuring software excellence and reliability.

Today, the emphasis is on a proactive QA approach that aligns with Agile methodologies and CI/CD practices, fostering a culture where quality is everyone's responsibility. By leveraging advances in automation, AI, and machine learning (ML), modern QA practices emphasize collaboration, early defect detection, and preventive measures to ensure product robustness.

This perspective positions QA as a strategic investment, a revenue enabler, and crucial to the company's long-term value, rather than as a cost center. A modern QA strategy, therefore, must encapsulate a holistic view of quality, integrating it with business goals and leveraging automation to ensure your products consistently meet the highest standards of quality demanded in today's competitive market.

Bhavya Vora, Research Analyst, Special Projects, Info-Tech Research Group
Andrew Kum-Seun, Research Director, Application Delivery & Management, Info-Tech Research Group


Executive Summary

Your Challenge
Given the rapid change in solution delivery over the last few years, the role of quality assurance (QA) has also evolved. The latest disruptors helped evolve the role of QA:
• The widespread adoption of collaborative methodologies like Agile necessitates changes in how QA is integrated into the delivery process.
• The maturity of automated delivery practices such as CI/CD pipelines has significantly changed how QA is conducted and enforced.
• Organizations are looking to invest in AI to automate QA due to the industry hype. AI can involve significant organizational changes, but current systems, processes, and roles may not be ready or able to adopt them.

Common Obstacles
• The perception of QA as a cost center can lead to the diversion of QA investments to other value-added capabilities. This decision may come from leadership, development teams, or other stakeholders who prioritize value generation over cost savings.
• Significant focus is on the testing phase rather than the inclusion of QA practices throughout the solution delivery cycle.
• QA teams are unable to accommodate new and evolving security risks and technologies, aggressive performance and industry standards, constantly changing priorities, and misunderstood quality policies.
• The marketplace for test automation and automated testing tools is crowded and difficult to navigate.
• Test requirements and scenarios are broader and more complex. Manual testing is unable to achieve the desired test coverage.

Info-Tech's Approach
• Standardize your definition of quality. Come to an organizational agreement on what attributes define a high-quality solution. Accommodate both business and IT perspectives in your definition.
• Clarify the role of QA throughout your solution delivery lifecycle. Indicate where and how QA is involved throughout solution delivery. Instill quality-first thinking in each stage of your pipeline to catch defects and issues early and motivate cross-functional collaboration.
• Adopt good QA practices to better support your quality definition, business, IT environments, and priorities. Ensure your QA activities satisfy your criteria for a high-quality and successful solution with the right templates, technologies, and tactics in your toolbox.

Info-Tech Insight
QA is not a role but a way of working; hiring QA roles can be a start to building a practice, but it is not scalable. Automation can help but is limited by the roles who are developing and using it. QA must be delegated and ingrained in every aspect of work.
Insight Summary

Overarching Info-Tech Insight
QA is not a role but a way of working; hiring QA roles can be a start to building a practice, but it is not scalable. Automation can help, but it is limited by the roles who are developing and using it. QA must be delegated and ingrained in every aspect of work.

QA performed early and throughout the solution delivery lifecycle (SDLC) improves the accuracy and effectiveness of downstream tests and reduces the cost of fixing defects late in delivery. QA activities should be embedded in everyone's job description.

QA is not a role. It is a mindset (a way of working) that revolves around quality-first and proactive thinking. Good QA practices future-proof solution investments by ensuring they are maintainable, scalable, and transferable.

QA is a shared responsibility. Your test plans and test cases are not static documents, nor are they built in a single event. They are continuously updated and improved through feedback during the solution delivery process, in collaboration with developers and other key stakeholders.

Start small to evaluate the fit and acceptance of new and modified roles, processes, and technologies. Aggressive initiatives and timelines can jeopardize success in big-bang deployments. Gradual and thoughtful adoption of QA ways of working helps your teams focus on practice fit rather than fighting the status quo. This approach must involve change-tolerant teams, solutions, and cooperative stakeholders.
The value of QA stems from the assurance of sustainable and valuable solution delivery

What is quality and QA?
• Solution quality is the degree to which a system, feature, component, or process meets specified customer needs, customer expectations, and nonfunctional requirements. QA is a program of tasks and activities to ensure software quality priorities, standards, and policies are met throughout the SDLC.
• Do not expect a universal definition of quality. Everyone will have a different understanding of what quality is and will structure people, processes, and technologies according to that interpretation.

What is the core of QA?
• Verification is evaluating work items to determine if they meet specified business and technical requirements. Is the solution built right?
• Validation is evaluating the solution during and at the end of delivery to determine if it satisfies specified business and technical requirements. Is the right solution built?

How is QA perceived in the organization?
• Efficient and effective QA practices are vital because solutions need to readily adjust to constantly evolving business priorities and technologies without risking system stability or breaking business standards and expectations.
• However, investments in QA are often afterthoughts. QA is often viewed as a lower priority compared to other SDLC capabilities (e.g. design and coding) and is typically the first item cut.
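The verification/validation distinction above can be made concrete with a pair of tests. The sketch below is illustrative, not part of the blueprint: `apply_discount` is a hypothetical function, and the tests show how verification asks "is the solution built right?" while validation asks "is the right solution built?"

```python
def apply_discount(price: float, percent: float) -> float:
    """Apply a percentage discount to a price (hypothetical business function)."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

def test_verification():
    # Verification: does the implementation meet its technical requirements?
    assert apply_discount(100.0, 25) == 75.0   # arithmetic is correct
    try:
        apply_discount(100.0, 150)             # out-of-range input is rejected
        assert False, "expected ValueError"
    except ValueError:
        pass

def test_validation():
    # Validation: does the behavior satisfy the stated customer need?
    # Business rule (assumed here): a 100% "free" promotion must price to zero.
    assert apply_discount(19.99, 100) == 0.0

test_verification()
test_validation()
print("verification and validation checks passed")
```

A solution can pass verification (the code matches its spec) and still fail validation (the spec missed what the customer actually needed), which is why the blueprint treats both as the core of QA.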
Maximize the value you expect to gain from QA

Your QA program is not just about keeping pace with changes. QA is about setting a standard of software quality excellence aligned to stakeholder expectations and priorities while anticipating future challenges and opportunities.

Improved customer satisfaction: Solution issues are identified and addressed before they can negatively impact the customer. Preventative measures can then be implemented to maintain a consistent experience.

Enhanced security: QA enforces that the right protocols and tactics are employed during solution delivery. This standard is aligned to the organization's security risk tolerance, latest trends, and industry regulations.

Business continuity: Good QA increases stakeholder confidence that the solution can reliably operate in sunny-day and rainy-day scenarios and meet defined service level agreements.

Increased resource utilization: QA practices streamline the SDLC process by reducing the time spent on fixing issues late in the lifecycle, ramping up resources unfamiliar with the solution, and paying down technical debt.
QA remains a challenge for many organizations

These challenges highlight critical gaps in our current approach, showcasing the necessity for a shift toward more integrated and automation-driven QA processes. This focus ensures that QA ultimately drives your competitive advantage.

Top challenges in achieving your software quality goals (percentage of respondents):
• Lack of time to ensure quality: 39%
• Challenges in applying test automation: 33%
• Lack of resources: 27%
• Too short release life cycles: 25%
• Unclear quality goals and targets: 24%
• Lack of mature processes: 24%
Source: Katalon, 2023.
Increasing the QA budget does not guarantee success

Organizations are investing more into their QA practice. Forty percent of large-scale companies are spending more than 25% of their budget on testing, with nearly 10% of enterprises spending more than 50% of their budget on testing (LambdaTest, 2023). This showcases how important quality is for all organizations.

But the reception and value of software products do not justify the money invested, and on-time, on-budget solutions do not indicate successful delivery practices:
• 78%: average customer satisfaction score for the software industry (Source: Fullview, 2023).
• 90% of CIOs see at least some business frustration with IT's failure to deliver value (Source: CEO-CIO Alignment Diagnostics, Info-Tech Research Group, November 2022 to October 2023; N=115).


Design your QA approach

Stakeholders expect that the speed and responsiveness of product delivery will not come at the expense of quality. A well-structured strategy streamlines delivery processes, upholds quality standards, and bolsters the solution delivery team's reputation as a trusted partner. Quality assurance is more than just software testing. Embrace a quality-first mindset that instills product quality accountability, fosters collaboration, and sets delivery expectations across all solution delivery lifecycle (SDLC) roles.

Achieve a Quality-First Vision
• Establish clear targets to guide and measure effective QA.
• Create a unified definition of quality.
• Align to common quality standards.

Build Your Toolbox
• Compile a comprehensive set of QA tactics, tools, methods, and standardized templates.
• Architect, rationalize, and monitor your QA tools and technologies.

Delegate QA Responsibilities
• Integrate QA early in the delivery cycle to identify critical issues sooner.
• Perform quality checks throughout the SDLC.

Embrace Industry Good Practices and Leading-Edge Technology
• Adopt AI and automation capabilities in QA design, execution, and analysis.
• Incorporate iterative and collaboration practices (Agile, DevOps) into your QA practices.

Instill Cross-Functional Accountability
• Include solution request accuracy and go-live sign-offs as stakeholder accountabilities.
• Empower SDLC teams to make local quality, low-risk decisions.

Coach, Mentor & Support
• Guide SDLC teams and stakeholders with the priorities and tactics of the QA way of working.
• Facilitate knowledge sharing and collaboration across the different SDLC functions.

Components of a QA strategy: purpose of the QA strategy, QA objectives and metrics, QA standards, QA definitions, QA roles and responsibilities, QA process, communication and reporting, and QA tools and technologies.
Assess and strengthen your QA capabilities

Core capabilities to grow and mature your QA practice:

Vision and Buy-In: Clear direction and goals of the QA practice, and ensuring the practice has the appropriate funding and stakeholder buy-in.

Planning: The creation of artifacts needed to define the scope of QA activities and for teams to confidently plan and commit to that scope.

Cross-Functional Collaboration: The collaboration among SDLC roles in QA activities and the involvement of QA in other SDLC activities.

Execution Management: The tactics to manually and automatically execute the various QA activities and test types.

Reporting and Communication: Preparing and delivering the outcomes of QA activities for the consumption of decision makers, SDLC teams, and dependent roles.

Practice Management: Defined QA roles and responsibilities; processes, tools, and technology management; and tactics to support and improve the practice.

View the QA Current-State Assessment Tool.
Your solution delivery lifecycle should embrace quality at its core

Connect all phases with a solution-centric approach that goes from the first idea all the way through to maintenance, spanning continuous delivery and release cycles, intake, feedback, and improvement. Quality is anchored by the Definition of Done, product/enterprise alignment, the voice of the customer, and the voice of the teammate.

See our Evolve Your Software Development Lifecycle Into a Solution Delivery Lifecycle blueprint for more information.
Explore the trends in the QA marketplace

1. AI and ML-Embedded QA Tools
Embedding AI and ML in QA tools increased the efficiency, scope, and accuracy of test design and execution, enabling them to:
• Create test cases from functional requirements and test scripts without code.
• Analyze past testing activities to predict potential issues and defects of current delivery efforts and suggest root causes and solutions.
• Provision accurate and realistic synthetic test data using production data to train ML models.
See Leverage Gen AI to Improve Your Test Automation Strategy and Adopt Generative AI in Solution Delivery for more information.

2. Autonomous Testing
Autonomous testing involves tests and other QA activities being created, executed, and managed through intelligent algorithms without the need for human intervention. This capability enables the:
• Configuration of QA activities to new requirements, testing scenarios, and observations of solution delivery activities.
• Self-healing of test scenarios and scripts when issues occur.
• Immediate feedback during any phase of the SDLC of potential risks or conflicts with quality standards and industry frameworks and regulations.
See Tech Trends 2024 for more information on the autonomous back office.

3. Scriptless Automated Testing
Low- and no-code capabilities reduce or remove the technical skills traditionally needed to create, script, execute, and manage automated tests. This capability motivates the shifting of solution quality accountability earlier in the SDLC and enables the discovery of risks and defects before they cause negative impact and become expensive to fix. See Satisfy Digital End Users With Low- and No-Code for more information.

4. End-to-End Testing
The configuration and orchestration of automated tests to evaluate the functionality of the solution from start to finish under real user scenarios. End-to-end testing looks at:
• Testing to ensure specific software layers or components work consistently and reliably across other parts of the software and system.
• Testing to ensure specific functions work smoothly across the technical stack (from the user interface down to the infrastructure).
This approach can involve the use of automated testing solutions alongside nontraditional testing solutions like robotic process automation. See Enhance Your Solution Architecture Practices and Build a Winning Business Process Automation Playbook to identify the impacts of your changes to your systems and users.

5. QAOps
QAOps involves embedding QA procedures, automation, reporting, and technologies into the SDLC pipeline (BrowserStack, 2022). The goals are to guarantee high software quality consistently across teams and to operationalize QA-optimized CI/CD processes for broader organizational adoption. QAOps shares many of the principles, behaviors, and best practices of DevOps and Agile methodologies. See Implement DevOps Practices That Work for more information.

6. QA Tool Ecosystem
Teams' preferences for specific QA tools and technologies have been and are continually shifting away from the siloed, monolithic tooling and vendor stack that the industry standardized in the past. This demand pushed many vendors to position their solutions to build and strengthen relationships with third parties and deliver out-of-the-box plugins and customizable APIs. See Applications Priorities 2024 for more information on multisource ecosystems.
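The end-to-end testing trend above can be sketched in miniature. The Python example below is purely illustrative (the class and function names are invented, and an in-memory store stands in for real infrastructure; in practice the UI layer would be driven by a browser-automation tool): one test drives a single user scenario from the top of the stack and checks its effect at the bottom.

```python
# A miniature three-layer stack used only to illustrate end-to-end testing.
# All names (InMemoryStore, OrderService, handle_request) are hypothetical.

class InMemoryStore:                      # "infrastructure" layer
    def __init__(self):
        self.orders = {}

    def save(self, order_id, item):
        self.orders[order_id] = item

    def load(self, order_id):
        return self.orders[order_id]


class OrderService:                       # "business logic" layer
    def __init__(self, store):
        self.store = store
        self.next_id = 1

    def place_order(self, item):
        order_id = self.next_id
        self.next_id += 1
        self.store.save(order_id, item)
        return order_id


def handle_request(service, form):        # "UI" layer (request handler)
    order_id = service.place_order(form["item"])
    return {"status": 200, "body": f"order {order_id} confirmed"}


def test_end_to_end_order_flow():
    # Drive the scenario from the top of the stack, as a user would,
    # then verify the effect at the bottom (the data store).
    store = InMemoryStore()
    service = OrderService(store)
    response = handle_request(service, {"item": "license renewal"})
    assert response["status"] == 200
    assert "order 1 confirmed" in response["body"]
    assert store.load(1) == "license renewal"

test_end_to_end_order_flow()
print("end-to-end scenario passed")
```

The point of the pattern is that a defect in any layer, or in the wiring between layers, fails this one scenario-level test, which is what distinguishes end-to-end testing from testing each layer in isolation.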
Extend the QA mindset beyond testing

Shift QA left and right
An emerging trend in QA is the adoption of shift-left and shift-right testing. Shift-left testing is a software testing approach that places strong emphasis on conducting testing activities earlier in the development process by shifting all testing activities to earlier development stages rather than leaving them until the very final stages (Katalon, 2023).

Shift-right testing, on the other hand, implies extending testing activities beyond the traditional development and release phases. This involves performing testing activities in the production environment or closer to the end users after the software has been deployed.

QA involves testing across the SDLC: testing new requirements, testing new code, testing every build, testing every deployment, and testing on production.

Bridge your silos with DevOps
DevOps purposefully blurs the lines between these responsibilities, forcing collaboration. The developers start building the mindset of continually checking for errors in their code. The testers increase their responsibilities from validating the application to ensuring it is deployable at all times. They may even fix code as needed. All these pieces work together to ensure rapid delivery of features. The focus on the customer drives the work of the entire team.

Integrating QA into DevOps:
• Realign team structure
• Automate as much as possible
• Use metrics to track progress
• Run tests in parallel
• Have a common set of processes and tools
• Continuous feedback
• Increasing viability
• Sufficient training
Source: TestRail, 2022.

See Info-Tech's Implement DevOps Practices That Work blueprint for more information.
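Shift-left testing, as described above, amounts to running the cheapest checks at the earliest possible stage so defects never reach the build, deploy, or production stages. A minimal, hypothetical sketch (the gate function and file names are invented, and Python's built-in `compile()` stands in for a real static-analysis step):

```python
# A hypothetical shift-left gate: before new code is merged, run the
# cheapest possible check (does the source even parse?) and fail fast.
# Function and file names are illustrative, not part of any real pipeline.

def shift_left_gate(changed_files: dict) -> list:
    """Return a list of problems found in new code; empty means OK to merge.

    changed_files maps a file name to its new source text.
    """
    problems = []
    for name, source in changed_files.items():
        try:
            compile(source, name, "exec")   # earliest possible defect check
        except SyntaxError as err:
            problems.append(f"{name}: syntax error on line {err.lineno}")
    return problems


# A commit with one good file and one defective file:
commit = {
    "pricing.py": "def total(x, y):\n    return x + y\n",
    "report.py": "def broken(:\n    pass\n",
}

issues = shift_left_gate(commit)
print(issues)   # the defect is caught before build, deploy, or production
```

In a real pipeline the same shape repeats rightward with progressively more expensive checks (unit tests on new code, integration tests on every build, smoke tests on every deployment, monitoring in production), which is the testing-across-the-SDLC progression the section describes.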
Info-Tech's methodology for building a software QA program

Phase 1: Assess Your QA Process
Steps: 1.1 List your QA objectives and metrics; 1.2 Analyze your current QA state.
Outcomes: solution quality definition; QA objectives; metrics to gauge QA success; current QA state assessment.

Phase 2: Align on Improved QA Practices
Steps: 2.1 Define your QA guiding principles; 2.2 Define your foundational QA process.
Outcomes: QA guiding principles; target QA process and artifacts; RACI chart of QA capabilities; QA resource allocation approach and structure.

Phase 3: Build Your QA Toolbox
Steps: 3.1 Define your defect tolerance; 3.2 Align on your QA activities and tools.
Outcomes: test defect risk tolerances; test definitions and owners; test data and environment management requirements; list of QA tools currently available in your organization; list of desired QA tools to be used.

Phase 4: Establish a QA Roadmap
Steps: 4.1 Build your QA roadmap.
Outcomes: list of QA initiatives and roadmap; communication map.
Key deliverable: Blueprint deliverables

Each step of this blueprint is accompanied by supporting deliverables to help you accomplish your goals:

QA Strategy Template: A template to help you document a comprehensive description of the QA practices for your organization. It presents several activities required to validate and verify software solutions.

QA Current-State Assessment Tool: Assess the current state of QA in your organization at the team and organizational level.

Test Plan Template: A complete description of the plans and scope of tests.

Test Case Template: Specify and communicate the specific conditions and scenarios that need to be validated and verified.
Blueprint benefits

Demonstrate the value QA brings to your organization
• Level set your QA expectations with stakeholders to ensure they are achievable and aligned to their strategic goals.
• Pinpoint and optimize the QA capabilities inhibiting your team from delivering solutions your customers need.
• Measure the effectiveness of your QA practice in the language your stakeholders understand and empathize with.
• Clearly illustrate your QA vision and optimization plan through a QA strategy and good communication.

Notable Impacts
• Clear understanding of the funding needed for the QA practice.
• Executive buy-in for broader QA improvement initiatives spanning across business and IT functions.
• SDLC functions motivated to collaborate and learn from each other.

Consistent delivery of quality across your solution portfolio and teams
• Consolidate different and siloed perspectives and definitions of quality into a single interpretation and statement to rally behind.
• Leverage a common toolbox of QA tools, tactics, and templates that were optimized to meet your quality policies and standards.
• Embed and delegate QA tasks and activities throughout your SDLC roles and processes and empower team-level decision making where possible.
• Encourage applying QA practices to the artifacts supporting the SDLC (verification) alongside the solution being delivered (validation).

Notable Impacts
• Organizational accountability of solution quality before, during, and after the solution delivery process.
• Common interpretation of quality attributes to ensure consistent compliance with your organization's, industry's, and regulator's policies and standards.
Measure the value of this blueprint

Outcome: Improved software quality by reducing the number of defects. Project: Select and Use SDLC Metrics Effectively. Metrics impact: 25% reduction per quarter/year.

Outcome: Increased solution delivery throughput. Project: Select and Use SDLC Metrics Effectively. Metrics impact: 20% increase in throughput after nine months; sustainable velocity after one year.

Outcome: Reduction of rework due to defects found during the solution delivery process. Project: Select and Use SDLC Metrics Effectively. Metrics impact: 50% reduction in rework due to compliance after one year; 90% reduction after two years.

Outcome: Increased application and end-user satisfaction. Project: End User Satisfaction Diagnostic. Metrics impact: 10% increase in satisfaction in one year.

Outcome: Increased IT satisfaction. Project: CIO Business Vision Diagnostic. Metrics impact: 10% increase in satisfaction in one year.
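Targets like "25% reduction per quarter" are straightforward to track with period-over-period arithmetic. A small sketch, assuming made-up defect counts (the numbers and the `reduction` helper are illustrative only, not data from this blueprint):

```python
# Tracking a "25% reduction per quarter" defect target.
# The quarterly counts below are hypothetical, for illustration.

def reduction(previous: int, current: int) -> float:
    """Percentage reduction in defects from one period to the next."""
    return round((previous - current) / previous * 100, 1)

defects_per_quarter = [80, 58, 42, 31]   # hypothetical escaped-defect counts

for prev, curr in zip(defects_per_quarter, defects_per_quarter[1:]):
    pct = reduction(prev, curr)
    status = "met" if pct >= 25 else "missed"
    print(f"{prev} -> {curr}: {pct}% reduction ({status} 25% target)")
```

Whatever the exact targets, the same computation gives stakeholders a consistent, auditable view of whether the QA program is paying off quarter over quarter.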
Case Study
Industry: Government. Source: Info-Tech Research Group workshop, executive brief.

Government Agency
A government agency worked with Info-Tech to develop a strategy to mature and scale their QA practice. The QA team identified several key QA objectives that they wanted to achieve through their practice:
• Ensure software products meet business, functional, and nonfunctional (including security, performance, integration, and regression) requirements.
• Build a disciplined and formal QA practice.
• Increase customer and stakeholder confidence, trust, and respect.

However, they recognized key challenges standing in their way, such as:
• Low QA resource capacity.
• Low availability of business subject-matter experts.
• Lack of automated testing and test automation tools.
• Very tight project timelines resulting in the cutting of QA activities.

Results
By conducting the workshop, the organization was able to:
• Build a consensus on what QA means and list the necessary changes to be successful.
• Gauge the maturity and capability of the current QA practice to define a list of optimization initiatives and build a roadmap.
• Create the initial design of target QA roles and processes of the QA practice.
• Finalize the future state of QA roles, processes, tools, and tactics, including their implementation to other products and systems, with the existing test strategy.

See our sample workshop deliverable to learn how an Info-Tech Quality Assurance Workshop helps improve your QA practice.
Info-Tech offers various levels of support to best suit your needs

DIY Toolkit: "Our team has already made this critical project a priority, and we have the time and capability, but some guidance along the way would be helpful."

Guided Implementation: "Our team knows that we need to fix a process, but we need assistance to determine where to focus. Some check-ins along the way would help keep us on track."

Workshop: "We need to hit the ground running and get this project kicked off immediately. Our team has the ability to take this over once we get a framework and strategy in place."

Executive & Technical Counseling: "Our team and processes are maturing; however, to expedite the journey we'll need a seasoned practitioner to coach and validate approaches, deliverables, and opportunities."

Consulting: "Our team does not have the time or the knowledge to take this project on. We need assistance through the entirety of this project."

Diagnostics and consistent frameworks are used throughout all five options.
Guided Implementation

What does a typical GI on this topic look like?

A Guided Implementation (GI) is a series of calls with an Info-Tech analyst to help implement our best practices in your organization. A typical GI is 8 to 12 calls over the course of 4 to 6 months.

Phase 1
• Call #1: State your quality definition and list your QA objectives.
• Call #2: Review your QA current state.

Phase 2
• Call #3: Define your guiding principles and target QA process.
• Call #4: Build your QA RACI chart.

Phase 3
• Call #5: Define your defect tolerance.
• Call #6: Build your QA toolbox.

Phase 4
• Call #7: List your QA objectives and define your roadmap.
• Call #8: Finalize your QA strategy.
Workshop Overview

Contact your account representative for more information.
[email protected] 1-888-670-8889

Session 1: Assess Your QA Process
Activities: 1.1 Define solution quality in your context; 1.2 State your QA objectives and metrics; 1.3 Assess the current state of your QA practice.
Deliverables: solution quality definition; QA objectives; metrics to gauge QA success; current QA state assessment.

Session 2: Align on Improved QA Practices
Activities: 2.1 Define your QA guiding principles; 2.2 Define your QA target state.
Deliverables: QA guiding principles; target QA process and artifacts; RACI chart of QA capabilities; QA resource allocation approach and structure.

Session 3: Build Your QA Toolbox
Activities: 3.1 Define your defect tolerance; 3.2 Define your tests; 3.3 State your test data and environment requirements; 3.4 List your QA tools.
Deliverables: test defect risk tolerances; test definitions; test data and environment management requirements; list of QA tools currently available in your organization; list of desired QA tools to be used.

Session 4: Establish a QA Roadmap
Activities: 4.1 Build your QA roadmap.
Deliverables: list of QA initiatives and roadmap; communication map.

Session 5: Complete Your QA Artifacts (Post Workshop)
Activities: 5.1 Complete your QA strategy; 5.2 Review the workshop deliverables and discuss next steps.
Deliverables: QA strategy; next steps with Info-Tech.
Phase 1: Assess Your QA Process

Phase steps: 1.1 List your QA objectives and metrics; 1.2 Analyze your current QA state. (Subsequent phases: 2.1 Define your QA guiding principles; 2.2 Define your foundational QA process; 3.1 Define your defect tolerance; 3.2 Align on your QA activities and tools; 4.1 Build your QA roadmap.)

This phase will walk you through the following activities:
1.1.1 Define solution quality in your context
1.1.2 State your objectives
1.1.3 List the metrics that will gauge your success
1.2.1 Understand the challenges of your QA practice
1.2.2 Complete a current-state assessment

This phase involves the following participants:
• QA team
• SDLC team
• QA stakeholders and management
• IT operations

Build a Software Quality Assurance Program
Step 1.1: List your QA objectives and metrics

Activities
1.1.1 Define solution quality in your context
1.1.2 State your objectives
1.1.3 List the metrics that will gauge your success

This step involves the following participants:
• QA team
• SDLC team
• QA stakeholders and management
• IT operations

Outcomes of this step
• Solution quality definition
• QA objectives
• Metrics to gauge QA success


Build the outcomes of this phase into your QA strategy document

What is a QA strategy?
• A QA strategy document defines the approaches and practices to achieve your organization's quality objectives and product success criteria. Ideally, this document is applicable to most, if not all, teams, projects, and products.
• Common components of a QA strategy document include: role definitions, tools, artifacts, communication practices, frameworks and standards, QA process, QA guidelines, metrics, defect management, and risks and mitigations.

How is a QA strategy valuable?
• Promotes consistent and repeatable execution of QA practices across all solutions, projects, and teams.
• Assures adherence to universally accepted and defined quality standards.
• Offers insights into QA effectiveness across the organization using the right QA metrics.
• Shares the proven and IT-approved QA tools, technologies, tactics, and templates anyone can use.
• Justifies funding of optimization initiatives.
• Serves as training material for those new to QA.

Use Info-Tech's QA Strategy Template.

Info-Tech Insight
In some cases, a test plan may contain the same information as a QA strategy, which is often good enough for small organizations, projects, and products.
Clarify your definition of quality

Quality does not mean the same thing to everyone.

Do not expect a universal definition of quality. Each department, person, and industry standard will have a different interpretation of quality and will perform certain activities and enforce policies that meet those interpretations. Misunderstanding of what is defined as a high-quality solution within business and IT teams can lead to further confusion behind governance, testing priorities, and compliance.

Each interpretation of quality can lead to endless testing, guardrails, and constraints, or lack thereof. Be clear on the priority of each interpretation and the degree of effort needed to ensure each is met.

For example, the Application Owner asks: "What does an accessible application mean?"
• Developer persona: "I have access to each layer of the mobile stack, including the code and data."
• Customer persona: "I can access it on cellphones, tablets, and a web browser."
• Operations persona: "The application is accessible 24/7 with 95% uptime."
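The Operations persona's "24/7 with 95% uptime" target maps onto the standard availability ratio of uptime over total time. A minimal sketch in Python; the function name and the monthly figures are illustrative assumptions, not numbers from this blueprint:

```python
def availability_pct(uptime_hours: float, downtime_hours: float) -> float:
    """Availability = uptime / (uptime + downtime), as a percentage."""
    total = uptime_hours + downtime_hours
    return 100.0 * uptime_hours / total

# A 720-hour month with 36 hours of outages lands exactly on the
# Operations persona's 95% threshold.
print(availability_pct(684, 36))  # → 95.0
```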
Quality must include business value and the technical requirements needed to validate and verify the product

[Framework diagram] Quality spans four areas, each with its own elements:
• Stakeholder Perspectives and Scenarios: sunny and rainy-day scenarios; external relationships; user experience; nonfunctional requirements.
• Quality and Relationship Management: business and technical risks; stakeholder management; system monitoring; value generation.
• Core Quality Attributes: business functionality fulfillment; design, code, builds, and releases; stability.
• QA Activities and Artifacts: test types and tools; defect documentation; test strategy, test plan, test cases, and scripts; dashboards and reports; test environment and data.

Info-Tech Insight
It is easy to lose sight of what matters when we look at quality from a single point of view. Many organizations simply define quality as valuable, usable, and stable products and changes to end users. This definition omits the importance of technical attributes that make solutions maintainable, scalable, and reusable. Solutions are not often one-offs, and technical excellence is necessary to ensure the right decisions are made that minimize technical debt.
Enforce your quality definition through attributes in all delivery activities

Quality attributes are properties that dictate how the system should behave at runtime and how it should be designed, implemented, and maintained. These attributes capture the deeper structural characteristics of the solution architecture that enable system functionality.

Quality Attribute: Definition

Usability: The product is an intuitive solution. Usability helps define the ease with which users can perform a specific task on the system. Limited training and documentation are required.

Performance: Usability and performance are closely related. A solution that is slow is not usable. Performance represents the degree to which a product or system provides functions that meet stated and implied needs when used under specified conditions. Baseline performance metrics are defined, and changes must result in improvements. Performance is validated against peak loads.

Availability: Availability is the degree to which a software system can be accessed by the users when it is required. The application system is present, accessible, and ready to carry out its tasks when needed. The application is accessible from multiple devices and platforms, is available 24/7/365, and teams communicate planned downtimes and unplanned outages. Teams must not put undue burden on end users accessing the systems. Reasonable access requirements are published.

Security: Security refers to the degree to which a software system safeguards information or data so that users or other systems have appropriate access to these data based on their authorization level. Applications handle both private and personal data and must be able to segregate data based on permissions to protect privacy. Users want it to be secure but seamless. Vendors need to understand and implement the organization's security requirements into their products. Teams ensure access is authorized, maintain data integrity, and enforce privacy.

Reusability: Reusability defines the degree to which a system component or an asset can be utilized on several systems or in building other components or assets. This attribute minimizes the duplication of components and implementation time. Teams ensure a modular design that is flexible and usable in other applications.

Interoperability: A system's ability to communicate or exchange data seamlessly between different operating systems, databases, and protocol conditions.
Enforce your quality definition through attributes in all delivery activities (cont'd.)

Scalability: The ability of a software system to handle an increased load without decreasing its performance.
• Horizontal scalability (scaling out): Adding more logical units, such as servers or nodes, so the load is spread across them.
• Vertical scalability (scaling up): Adding more resources to a physical unit, such as adding more memory, discs, or processors to a single computer.
Ease of maintenance and enhancements are critical. Additional care is given to custom code because of the inherent difficulty of making it scale and keeping it updated.

Modifiability: Modifiability is the degree to which a software system can be effectively altered without causing defects or bugs or decreasing the quality of the existing system. Teams minimize the barriers to change and get business buy-in to keep systems current and valuable.

Testability: Testability shows the degree to which test criteria can be used for the software system and tests can be performed to determine whether these criteria have been satisfied. Applies to all applications, operating systems, and databases.

Supportability: Supportability is the degree to which a system, product, or component performs specified functions under specified conditions for a specified period. Applies to all applications and systems within the organization's portfolio, whether they are custom-developed applications or vendor-provided solutions. Resources are invested to better support the system.

Cost Efficiency: The application system is executed and maintained in such a way that each area of cost is reduced to what is critically needed. Cost efficiency is critical (e.g. printer cost per page, TCO, software, cost of downtime), and everyone must understand the financial impact of their decisions.

Self-Service: End users are empowered to make configurations, troubleshoot, and make changes to their application without the involvement of IT. The appropriate controls are in place to prevent unauthorized access to corporate systems.

Source: ISO 25000, Software and Data Quality, n.d.

Info-Tech Insight
QA performed early and throughout the SDLC improves the accuracy and effectiveness of downstream tests and reduces the costs of fixing defects late in delivery. QA activities should be embedded in one's job description.
1.1.1 Define solution quality in your context

30 minutes

Input: Industry definitions of solution quality
Output: Solution quality definition in your context

1. Review the various business (e.g. stakeholders, management, end users) and technical (e.g. development, infrastructure) perspectives of solution quality.
2. List three quality attributes that your organization sees as important or high priority (e.g. usability, security, scalability).
3. Describe each quality attribute from the perspectives of solution delivery, solution support and operations, and end users, as shown on the following slide.
4. Review each description from each persona to come to an acceptable definition that can be applied broadly across the organization for any solution, team, or technology. For distinct assets, create separate solution quality definitions.
5. Document a summary of this exercise in the "QA Context" section of the QA Strategy Template.

Download the QA Strategy Template

Materials: Whiteboard; markers
Participants: QA team; SDLC team; QA stakeholders and management; IT operations
1.1.1 Example

Usability
• Solution Delivery: Elegant; producing work and change that is accurate and meets all requirements; well-commented and documented code; structured functions and objects and no repeating code; efficient queries and algorithms; build an efficient, user-friendly system.
• Solution Support and Operations: No or few critical incidents; the extent to which a product satisfies user requirements; processes are efficient and effective; policies and procedures are followed; minimal training needed to adopt and support applications; support is efficient and helpful; well-understood error messages; training material and help guides are easily understood and applicable for all system changes; system requires minimal maintenance and support.
• End Users: Intuitive; user friendly; easy-to-navigate GUI; highlighted mandatory fields; meets or exceeds customer expectations; a reliable and functional product; supports multiple user paths; quick application and system response; users can easily understand and interact with the application or product; consistent experience across platforms; solution meets all user requirements.

Security
• Solution Delivery: Security is considered in every phase of the software development lifecycle; current software patches are applied; strong cryptographic algorithms; application hardening is part of the delivery process.
• Solution Support and Operations: Meeting information security requirements (PCI compliance, etc.); correct interpretation and application of security standards; regular user access reviews; defined roles and access rights; strong data protection protocols, mechanisms, and tactics; enabling and restricting accesses accordingly.
• End Users: Users only have access to the required interface or data based on roles; private and sensitive data cannot be shared or transferred easily; accounts are secured and cannot be hacked.

Scalability
• Solution Delivery: Large amounts of transactions can be supported; system performance is well tested; solution is able to continue to function efficiently with a growing number of users; object-oriented design; solution is structured in a way that known future development is considered.
• Solution Support and Operations: Possibility of adding new capabilities with reasonable effort; testing surge demand; identify and forecast demand range and expected performance KPIs; ability to upscale capacity over regular usage.
• End Users: Applications handle multiple users/customers without affecting the user experience; applications handle large volumes of data; the system scales well to maintain or increase its level of performance, even as it is tested by larger operational demands; system is configurable for new products, customers, data, and use cases; system performance is not compromised by sudden surges in demand.
1.1.2 Example

Stakeholder | Areas of Concern | Involvement
• CIO | Ensure business continuity; ensure stability and reliability of enterprise systems | Go/no-go approval of releases to production
• Enterprise Architect | Ensure all business capabilities and processes are sufficiently supported by products | Consulted for satisfaction of enterprise architecture standards
• End User | Ensure products accommodate high-priority use cases and are consistently available | Responsible for validating value of products

Challenges | QA Practices | Value Opportunities
• Unrealistic commitments to product updates and releases | Upstream QA planning | Clarified test coverage and effort to improve release estimates
• Lack of business involvement in test design, planning, and execution | Stakeholder involvement | Early stakeholder feedback to better align the product to expectations
• Misalignment of business and technical objectives to QA goals | Standardized quality definition | QA activities across the organization achieving the same value goals
• Defects found late in the product delivery pipeline | Continuous testing | Issues and risks revealed and addressed early, reducing rework
1.1.1 Example

Your solution quality definition:

Quality is the delivery of valuable and usable software products to end users, where value is characterized by the product's ability to meet or exceed customer expectations in a cost-effective manner. Software products must also satisfy nonfunctional requirements, including functional suitability, reliability, performance efficiency, operability, security, compatibility, maintainability, and transferability, all of which contribute to the product's overall value proposition.
Map your QA practice to value-generation opportunities

QA is designed to instill the appropriate amount of validation and verification practices to ensure alignment to business priorities, value, and technical excellence. This balance considers the operational, technical, and resource constraints of your organization. With the right practices, governance, and organizational buy-in, you can convert several key QA challenges into value-generating opportunities. Be realistic in defining the objectives for your QA practice.

Challenges | QA Practices | Value Opportunities
• Unrealistic commitments to solution updates and releases. | Upstream QA planning. | Clarify test coverage and effort to improve release estimates.
• Lack of business involvement in test design, planning, and execution. | Stakeholder involvement. | Early stakeholder feedback to better align the solution to expectations.
• Misalignment of business and technical objectives to QA goals. | Standardized quality definition. | Rally behind common QA goals across the organization.
• Defects found late in the solution delivery pipeline. | Continuous testing. | Issues and risks revealed and addressed early, reducing rework and release delays.
• Manual completion of tests and management. | Automated testing and test automation. | Expand test coverage and enable continuous delivery.
Balance your QA objectives from both business and IT priorities

Business Objectives
Use your business and IT strategies to reveal and define the strategic and operational objectives of your QA practice with your stakeholders. Your QA practice should align with long-term, strategic goals to support your organization's overall growth, viability, and stability.

Strategic Objectives: Designed to generate positive outcomes that relate to meeting organization-wide objectives. This gives insight into the strategic value gained from the proposed QA initiatives. For example:
• Improving customer satisfaction.
• Gaining competitive advantage.

Operational Objectives: Designed to highlight process efficiencies in day-to-day business and IT operations. Improvements typically include quality enhancements, workflow streamlining, and automation.

IT Objectives
Your solution delivery teams have their own QA success criteria (e.g. increase maintainability, decrease operating costs, increase scalability). IT stakeholders and managers may create these objectives as a means to be more productive and effective, achieve technical excellence, mitigate risks, and pay down technical debt. They can also be rooted in your team's culture.

Including technical requirements and insights in your QA success criteria helps ensure business objectives are realistic and technically sound. For example:
• Align QA artifacts to business requirements.
• Increase maintainability of source code.
• Minimize time to complete development and maintenance tasks.
• Align to industry standards and frameworks.
• Increase application and system development.
1.1.2 State your objectives

1 hour

Input: Understanding of QA challenges; business and IT strategy
Output: QA stakeholders; QA value; business and technical objectives

1. Identify your QA stakeholders. List the areas of concern for each stakeholder and describe how each stakeholder will be involved in QA.
2. Brainstorm the challenges and business needs your QA practice is expected to solve or alleviate, using your business and IT strategy, if available, as a starting point.
3. List the value opportunities that QA practices will bring to your organization and align these opportunities to the challenges and business needs brainstormed earlier.
4. Brainstorm the business and technical objectives of your QA practice and the value they are intended to provide to your organization.
5. Document a summary of this exercise in the "QA Context" section of the QA Strategy Template.

Download the QA Strategy Template

Materials: Whiteboard and markers; QA strategy
Participants: QA team; SDLC team; QA stakeholders and management; IT operations
1.1.2 Example

QA Practice Objectives

• Objective: Improve ROI on development initiatives.
  Value: Make better use of development resources.

• Objective: Increase communication among teams and with the business.
  Value: Business and IT stakeholders are better informed of resource availability and progress on current initiatives.

• Objective: Align development capabilities closely with the organization's strategy.
  Value: Development is directly and actively supporting critical business capabilities.

• Objective: Ensure high solution quality.
  Value: Increase user satisfaction and minimize deployment issues.
Measure the value of QA with metrics tied to your quality definition, business value, and QA practice objectives

Assign metrics to your QA objectives.

Objective Examples | Metric Examples
• Increase revenue through products | Customer lifetime value (LTV)
• Implement data monetization | Average revenue per user (ARPU)
• Reduce labor costs | Contract labor cost
• Reduce overhead | Effective cost per install (eCPI)
• Limit failure risk | Mean time to mitigate fixes
• Collaboration | Completion time relative to deadline
• Customer satisfaction | Net promoter score

The importance of measuring QA value through metrics: The better an organization is at using metrics to evaluate QA's ability to deliver business value, the more satisfied the organization is with QA's performance as a business partner. Assigning metrics to your prioritized objectives will allow you to more accurately measure a product's value to the organization and identify QA optimization opportunities.

See Info-Tech's Select and Use SDLC Metrics Effectively for more information.
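One metric example above, net promoter score, is conventionally derived from 0-10 survey responses. The calculation below follows the industry convention (promoters score 9-10, detractors 0-6), which this blueprint does not itself define; the survey data is hypothetical:

```python
def net_promoter_score(scores: list[int]) -> float:
    """NPS = percentage of promoters (9-10) minus percentage of detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100.0 * (promoters - detractors) / len(scores)

survey = [10, 9, 9, 8, 7, 6, 10, 3, 9, 8]  # ten hypothetical responses
print(net_promoter_score(survey))  # → 30.0
```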
Look at your cost of quality and performance metrics to gain a complete picture of QA effectiveness

Cost of Quality
The cost of quality is the accumulated cost of not creating a quality product. This metric involves determining the extent QA and development resources are used to prevent poor quality and assure good quality.
• Internal failure costs are caused by products or services not conforming to requirements or customer/user needs and are found before delivery of products and services to external customers.
• External failure costs are caused by deficiencies found after delivery of products and services to external customers that lead to customer dissatisfaction.
• Prevention costs are the costs of all activities designed to prevent poor quality from arising in products or services.
• Appraisal costs occur because of the need to control products and services to ensure high quality in all stages and conformity to quality standards and performance requirements.

Performance Monitoring
Performance monitoring refers to monitoring and managing the performance and service availability of software applications. It utilizes tools to detect, diagnose, resolve, and report an application's performance to ensure that it continues to meet the needs of the users and the expectations of the business. Performance monitoring involves several distinct aspects:
• Performance of every system request and transaction
• Code-level performance profiling
• Usage and performance of all application dependencies
• Detailed traces of individual requests and transactions
• Basic server monitoring and metrics
• Application framework metrics
• Application log data
• Application errors
• Real user monitoring (RUM)
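The four cost-of-quality categories sum to a total, and splitting them into conformance (prevention plus appraisal) versus nonconformance (internal plus external failure) costs shows where the quality budget actually goes. A minimal sketch with hypothetical quarterly figures, not data from this blueprint:

```python
# Hypothetical quarterly figures, in dollars
prevention = 40_000        # training, standards, tooling
appraisal = 55_000         # test execution, reviews, audits
internal_failure = 30_000  # rework on defects caught before release
external_failure = 75_000  # incidents, patches, support after release

cost_of_quality = prevention + appraisal + internal_failure + external_failure
conformance = prevention + appraisal              # cost of assuring good quality
nonconformance = internal_failure + external_failure  # cost of poor quality
print(cost_of_quality, conformance, nonconformance)  # → 200000 95000 105000
```

A nonconformance share above 50%, as here, suggests underinvestment in prevention and appraisal.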
Expand your code coverage

High code and test coverage ensures your solutions meet the business and system requirements that your organization prioritized as key value drivers.

Code Coverage
What is it? Code coverage is measured to verify the extent to which the code is executed. Code coverage can ensure quality standards are maintained so only optimal-quality code is pushed to production. It is primarily performed at the unit testing level. Be sure to negotiate the code coverage target with stakeholders, since 100% coverage is not necessarily feasible nor cost efficient.

Formula
Use the following formula to define the percentage of code you would like tested:

Code Coverage = (Lines of code executed by tests / Total lines of code) × 100%

What are the levels of code coverage? There are several levels of code coverage to consider. Which ones you decide to measure depends on the coding standards your team decides to adopt.
• Statement coverage
• Branch coverage
• Condition coverage
• Function coverage
• Loop coverage

Values of Code Coverage:
• Quantitatively indicates if there are enough tests in the unit and component test suites and if more tests are needed.
• Adhering to a high percentage of code coverage can lower the chances of escaped bugs detected later in development.
• Code coverage motivates the removal of untouched and unneeded code to improve the efficiency and size of the entire code base and ease the downstream build, deployment, and testing processes.
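The coverage percentage described above reduces to a simple ratio; in practice, tools such as coverage.py (Python) or JaCoCo (Java) report the executed and total counts per statement, branch, and function. The helper name and figures below are illustrative, not part of this blueprint:

```python
def code_coverage_pct(lines_executed: int, total_lines: int) -> float:
    """Percentage of code lines exercised by the test suite."""
    if total_lines <= 0:
        raise ValueError("total_lines must be positive")
    return 100.0 * lines_executed / total_lines

# e.g. the test suite touches 4,500 of 6,000 lines
print(code_coverage_pct(4_500, 6_000))  # → 75.0
```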
Expand your test coverage

Test Coverage
What is it? Test coverage is a black-box technique that monitors the number of tests that have been executed against the overall set of tests in your test plan and in your various requirements documents, such as functional, user, and nonfunctional requirements. Test coverage can be measured and evaluated for all tests, but its purpose is dependent on the business priorities of the testing team, the tests they perform, and the type of software under test.

What are the coverage types? Context is critical for test coverage, as its interpretation will be different for a mobile versus a web application, for example. Test coverage is designed based on the attribute we want to test, such as:
• Feature coverage
• Risk coverage
• Requirements coverage

Formula
Use the following formula to define the percentage of requirements (functional and nonfunctional) you would like tested:

Test Coverage = (Requirements covered by at least one test / Total requirements) × 100%

Values of Test Coverage:
• Helps prevent defect leakage by ensuring the right set of requirements is actually tested.
• Reveals the application components or transaction flows that were not tested.
• Motivates software delivery teams to create additional test cases to meet test coverage mandates.
• Indicates the value of test cases by identifying which test cases were not executed due to their low priority, difficulty to prepare and execute, and/or the low probability that defects will occur.

Source: BrowserStack, 2020

While code and test coverage can help gauge the effectiveness of your testing practices, they do not capture the impact testing has on the broader solution delivery pipeline. In fact, automated testing can stress existing downstream bottlenecks. Supplement your test coverage measurements with other solution quality and delivery throughput metrics to review the overall impact on the final product. See Info-Tech's Select and Use SDLC Metrics Effectively for more information.
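The requirements-based formula above can be computed directly from a requirement-to-test traceability matrix. A minimal sketch; all requirement and test-case IDs are hypothetical:

```python
# Hypothetical traceability data: requirement ID -> test cases that cover it
traceability = {
    "REQ-1": ["TC-1", "TC-2"],
    "REQ-2": ["TC-3"],
    "REQ-3": [],  # no test exercises this requirement yet
}

# A requirement counts as covered if at least one test case maps to it
covered = sum(1 for tests in traceability.values() if tests)
coverage_pct = 100.0 * covered / len(traceability)
print(f"Requirements coverage: {coverage_pct:.1f}%")  # → Requirements coverage: 66.7%
```

The empty list for "REQ-3" is exactly the kind of untested transaction flow the "Values of Test Coverage" bullets describe.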
1.1.3 List the metrics that will gauge your success

30 minutes

Input: Business and IT objectives
Output: List of metrics

1. Identify the major areas that will be targeted for monitoring by determining the metrics that will validate your QA business and IT objectives. Aim to identify at least one metric per expectation.
2. Complete your metrics definition by stating the following:
   a) Name: Name of metric
   b) Business or IT objective: Team, operational, and stakeholder expectations defined from previous exercises that relate to the metric
   c) Collection method: How and from where the data is gathered for the metric
   d) Target/benchmark: A measurement that the metric is intended to reach
3. Document a summary of this exercise in the "QA Context" section of the QA Strategy Template.

Download the QA Strategy Template

Materials: Whiteboard and markers; QA strategy
Participants: QA team; SDLC team; QA stakeholders and management; IT operations
1.1.3 Example

QA Objective | Metrics | Metrics Source
• Ensure software products meet business and functional requirements | Defect density, test and requirements coverage, test results, defect severity, traceability metrics, customer satisfaction | Power BI; Azure DevOps; BRD (test coverage); test traceability matrix
• Ensure software products meet nonfunctional requirements (including security, performance, integration, regression) | Defect density, test and requirements coverage, test results, defect severity, traceability metrics, customer satisfaction, mean time to failure | Power BI; Azure DevOps; BRD (test coverage); test traceability matrix
• Build a disciplined and formal QA practice | Repeatability of testing practices, IT satisfaction, reusability of templates, number of projects and work items completed, testing performance, percentage of passed tests | Audits, retrospectives, velocity and burndown charts, project completion, throughput metrics, continuous review of current practices, testing reports, team satisfaction
• Increase customer and stakeholder confidence, trust, and respect | IT satisfaction, net promoter score, application satisfaction | Satisfaction survey
• Expand system regression testing suite (including end-to-end testing) | Regression test coverage | Test strategy; Power BI; Azure DevOps; BRD (test coverage); test traceability matrix
• Implement automated testing | Number of tests automated | Test strategy
• Reduce system failure risks after go-live | Number of changes and incidents (particularly emergency) | Footprint; Azure DevOps
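Defect density, the first metric in the example table, is conventionally expressed as defects per thousand lines of code (KLOC); that normalization is the industry convention, not something this blueprint specifies, and the figures below are hypothetical:

```python
def defect_density(defects_found: int, size_kloc: float) -> float:
    """Defects per thousand lines of code (KLOC)."""
    if size_kloc <= 0:
        raise ValueError("size_kloc must be positive")
    return defects_found / size_kloc

# e.g. 30 defects logged against a 12,500-line release
print(defect_density(30, 12.5))  # → 2.4
```

Normalizing by size lets teams compare releases of different scope, which a raw defect count cannot do.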
Step 1.2
Analyze your current QA state

Activities
1.2.1 Understand the challenges of your QA practice
1.2.2 Complete a current-state assessment

Outcomes of this step
• SWOT analysis
• Current QA state assessment

This step involves the following participants:
• QA team
• SDLC team
• QA stakeholders and management
• IT operations

Assess Your QA Process: Step 1.1 → Step 1.2
Uncovering key challenges affecting software quality

These challenges highlight critical gaps in our current approach, showcasing the necessity for a shift toward more integrated and automation-driven QA processes. This focus ensures that QA ultimately drives your competitive advantage.

Top challenges in achieving your software quality goals (percentage of respondents):
• Lack of time to ensure quality: 39%
• Challenges in applying test automation: 33%
• Lack of resources: 27%
• Too short release lifecycles: 25%
• Unclear quality goals and targets: 24%
• Lack of mature processes: 24%

Source: Katalon, 2023.

"Quality requires more attention than ever because you are now on a massive growth path. People are moving towards more digitalization, and you must ensure what you release or what you give to customers is of the highest quality."
– Manager, Quality Engineering
Leverage SWOT to take a balanced look at your QA practices

• Use SWOT to analyze your subject's strengths, weaknesses, opportunities, and threats.
• The SWOT is structured on two key axes:
  o Internal vs. External: Whether the item originates from inside the organization or outside will impact how you decide to proceed. Remember, internal vs. external is directly related to level of control. (Can I change or simply mitigate? Can I enhance or simply encourage?)
  o Helpful vs. Harmful: Elements can either help you or hinder you. Knowing which is important.

• Strengths (internal, helpful): Internal characteristics that are favorable as they relate to your environment.
• Weaknesses (internal, harmful): Internal characteristics that are unfavorable or need improvement.
• Opportunities (external, helpful): External characteristics that you may use to your advantage.
• Threats (external, harmful): External characteristics that may be potential sources of failure or risk.

Info-Tech Insight
Some existing wisdom discourages celebrating what helps you and just focusing on the challenges. This is misguided, as giving appropriate time to your strengths lets you know what not to focus on.
1.2.1 Understand the challenges of your QA practice

1-3 hours

Input: Understanding of current QA practices
Output: SWOT analysis

1. Complete a SWOT analysis of your current QA practice.
2. Use the outcomes of this exercise to frame your discussions in the following exercises.

• Strengths: Internal characteristics that are favorable as they relate to QA.
• Weaknesses: Internal characteristics that are unfavorable or need improvement.
• Opportunities: External characteristics that you may use to your advantage.
• Threats: External characteristics that may be potential sources of failure or risk.

Materials: Whiteboard and markers; QA strategy
Participants: QA team; SDLC team; QA stakeholders and management; IT operations
Assess, build, and strengthen your QA practice

Core capabilities to grow and mature your QA practice:

• Vision and Buy-In: Clear direction and goals of the QA practice and ensuring the practice has the appropriate funding and stakeholder buy-in.
• Planning: The creation of artifacts needed to define the scope of QA activities and for teams to confidently plan and commit to that scope.
• Cross-Functional Collaboration: The collaboration among SDLC roles in QA activities and the involvement of QA in other SDLC activities.
• Execution Management: The tactics to manually and automatically execute the various QA activities and test types.
• Reporting and Communication: Preparing and delivering the outcomes of QA activities for the consumption of decision makers, SDLC teams, and dependent roles.
• Practice Management: Defined QA roles and responsibilities; processes, tools, and technology management; and tactics to support and improve the practice.

View the QA Current-State Assessment Tool
1.2.2 Complete a current-state assessment

1-3 hours

Input: SWOT analysis; experiences from SDLC and QA teams and stakeholders
Output: Current-state assessment of QA practice; gaps in the current QA practice

1. Select one QA or SDLC team employing QA practices. This will be the scope of the current-state assessment.
2. Complete the QA Current-State Assessment Tool to assess your team against various assessment factors.
3. Review the assessment results on Tab 3, Assessment Results.
4. Brainstorm solutions to address the gaps or challenges in your QA practice to at least meet the acceptable state (team level).
   a. If broader organizational QA practices and standards are necessary, brainstorm solutions to modify your team-based practice so that it is applicable to other teams.
5. Repeat this assessment for other QA or SDLC teams.
6. Document your findings in the QA Strategy Template.

Download the QA Strategy Template

Materials: QA Current-State Assessment Tool; QA strategy
Participants: QA team; SDLC team; QA stakeholders and management; IT operations
Phase 2: Align on Improved QA Practices

This phase will walk you through the following activities:
2.1.1 Define your QA guiding principles
2.2.1 Document your QA process and artifacts
2.2.2 Identify your QA roles and responsibilities
2.2.3 Select and define your QA resource structure

This phase involves the following participants:
• QA team
• SDLC team
• QA stakeholders and management
• IT operations

Build a Software Quality Assurance Program overview:
1. Assess Your QA Process: 1.1 List your QA objectives and metrics; 1.2 Analyze your current QA state
2. Align on Improved QA Practices: 2.1 Define your QA guiding principles; 2.2 Define your foundational QA process
3. Build Your QA Toolbox: 3.1 Define your defect tolerance; 3.2 Align on your QA activities and tools
4. Establish a QA Roadmap: 4.1 Build your QA roadmap
Step 2.1: Define your QA guiding principles

Activities
2.1.1 Define your QA guiding principles

This step involves the following participants:
• QA team
• SDLC team
• QA stakeholders and management
• IT operations

Outcomes of this step
• QA guiding principles

Align on Improved QA Practices: Step 2.1 → Step 2.2


Instill holistic quality accountability and understanding with collaboration, clarity, and integration

QA implements quality thinking and accountability early in delivery so that product decisions and plans abide by quality standards, a critical factor for business satisfaction. Collaboration, clarity, and integration connect SDLC teams and QA stakeholders.

Collaboration
IT, business, and customers work together to verify and validate quality through all stages of the product lifecycle, from roadmapping and delivery to maintenance and retirement. The goal is to ensure quality standards are consistently enforced.

Clarity
Create common understandings and alignment of quality standards across all teams and products by clarifying expectations. This approach validates the achievability of development work to meet quality definitions and makes it easier to …

Integration
Explore methods to integrate the workflows and toolsets among the business, IT, and customers. The goal is to become more responsive to changes in business and customer expectations, which dictate product success.


Embrace a common QA mindset with
guiding principles
Teams may have their own perspectives on and practices for how
they can deliver value. These attributes can motivate you to adopt
guiding principles that will lay out your core values to cement a
consistent organizational perspective on how to improve your
quality. Guiding principles can help you achieve the following:
• Align QA strategies with organizational objectives, prioritizing
tasks that offer the highest business value and user satisfaction.
• Foster a unified approach to quality, embedding a quality-first
mindset across all roles and departments, enhancing overall
software integrity.
• Improve collaboration and communication, ensuring that QA
efforts are in sync with development and business goals,
leading to more efficient and effective outcomes.
• Promote continuous improvement in QA processes and tools,
leveraging innovations to enhance efficiency and adapt to
evolving quality standards.



Define guiding principles to adopt a QA way of working

01 Prioritize a quality-first approach for every task
By placing quality at the forefront of every decision and process, this principle ensures that the end product consistently meets the highest standards.

02 Embed QA throughout the solution delivery lifecycle
Embed QA practices from the initial stages of solution delivery through to completion.

03 Foster cross-functional collaboration for quality
Encouraging joint accountability for quality among all team members promotes a culture of continuous improvement.

04 Provide support to ongoing learning and development in QA
Continual learning, coaching, and mentoring within QA teams cultivates knowledge sharing and builds a robust QA practice.

05 Embrace industry best practices and leading-edge technology
Ensure that your QA process is efficient by adopting AI and automation capabilities in QA design, execution, and analysis.

06 Select tools that best fit your needs
Use the right tools for specific project requirements, which can lead to more effective testing, better resource management, and higher solution quality.

Info-Tech Insight
QA is not a role. It is a mindset (way of working) that revolves around quality-first and proactive thinking. Good QA practices future-proof solution investments by ensuring they are maintainable, scalable, and transferable.
2.1.1 Define your QA guiding principles

30 mins

1. Consider what the guiding principles of your QA practice are. Your guiding principles should consider key business priorities, team and personal objectives, and team and organizational cultures, such as:
   a. Improve collaboration between business and development teams.
   b. Improve resource utilization and productivity.
   c. Improve the quality and acceptance of products.
2. Once you've identified your guiding principles, ensure they are aligned with your definition of quality by thinking through the definition, rationale, and impact (see example below).
3. Document a summary of this exercise in the "QA Overview" section of the QA Strategy Template.

Download the QA Strategy Template

Input: Stakeholder and team expectations; understanding of organization culture
Output: QA guiding principles
Materials: Whiteboard and markers; QA strategy
Participants: QA team; SDLC team; QA stakeholders and management; IT operations


2.1.1 Example

Guiding Principles
• Be Bold: Remove the Scaffold and Don't Trip
• Team Over the Individual
• Embrace the Challenge
• You Have the Obligation to Share Information
• Make the Boat Go Faster, Have the End in Mind
• Tie Your Laces to Win Races, Be Prepared
• Stronger Together
• One Force
• Test to Be the Best
• Right Communication, Correct Action
• Speak Well, Listen Better
• We Are in This Together

Name: Right Communication, Correct Action
Definition:
• Communicate quality priorities and ensure they are properly interpreted by delivery roles.
• Provide sufficient information on QA outcomes to decision makers to make the right decisions.
Rationale:
• Ensure consistent application of quality principles and requirements.
• Understand all risks and impacts from the results found in QA activities.
Impact:
• Increase user satisfaction.
• Ensure decisions are made with data-driven quality insights.
• Increase technical quality of applications.
Step 2.2: Define your foundational QA process

Activities
2.2.1 Document your QA process and artifacts
2.2.2 Identify your QA roles and responsibilities
2.2.3 Select and define your QA resource structure

This step involves the following participants:
• QA team
• SDLC team
• QA stakeholders and management
• IT operations

Outcomes of this step
• Target QA process and artifacts
• RACI chart of QA capabilities
• QA resource allocation approach and structure

Align on Improved QA Practices: Step 2.1 → Step 2.2
Your solution delivery lifecycle should embrace quality at its core

Connect all phases with a solution-centric approach that goes from the first idea all the way through to maintenance, with continuous delivery and release cycles, intake feedback, and improvement.

• Definition of Done
• Product/Enterprise Alignment
• Voice of the Customer
• Voice of the Teammate

See our Evolve Your Software Development Lifecycle Into a Solution Delivery Lifecycle blueprint for more information.
Assure your QA process is adjusted to satisfy the various products and teams

It's important to determine the project's and work item's risk levels and complexities up front, as each risk level will have a specific degree of QA control, analysis, and evaluation that will need to be completed. That being said, not all organizations will have just one approach to QA.

Level 4
• Products that result in a transformative change in the way you do business and have significant costs and/or risks. Level 4 risk affects all lines of business and multiple technology areas.
• All rigorous QA activities and controls are performed with executive sign-offs at each key point in the process and organization-wide participation.
• Example: Implement ERP

Level 3
• Products that affect multiple lines of business and have significant costs and/or risks.
• Most QA activities and controls are performed rigorously with business lead (e.g. vice president) sign-offs at all key decision points in the process. QA and SDLC teams are empowered to make some QA and delivery decisions, and department representatives must participate in QA activities.
• Example: Implement CRM

Level 2
• Products with broader exposure to the business that present a moderate level of risk to business operations.
• Certain QA activities and controls are performed rigorously with business lead sign-offs at specific decision points in the process. QA and SDLC teams are empowered to make most decisions in the delivery process. Department representatives should participate in QA activities.
• Example: Deploy Office 2013

Level 1
• Routine/straightforward product changes or development with limited exposure to the business and low risk of negative business impact.
• QA activities and controls are rigorously performed as needed with business lead sign-offs as optional. QA and SDLC teams are empowered to make all decisions in the delivery process. Department representatives are consulted if needed.
• Example: SharePoint update
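The four-level rubric above can be captured as a small decision function for triaging intake items. The following is a minimal sketch, assuming simplified inputs (the number of lines of business affected, whether the change is transformative, and whether costs or risks are significant); the function name, parameters, and thresholds are illustrative, not part of the blueprint:

```python
def qa_risk_level(lines_of_business: int, transformative: bool,
                  significant_cost_or_risk: bool) -> int:
    """Map a work item to a QA rigor level (1-4) per the rubric above.

    The parameters are simplified, illustrative stand-ins for the
    blueprint's criteria, not an official scoring model.
    """
    # Level 4: transformative change with significant cost/risk,
    # affecting all lines of business and multiple technology areas.
    if transformative and significant_cost_or_risk:
        return 4
    # Level 3: significant cost/risk across multiple lines of business.
    if lines_of_business > 1 and significant_cost_or_risk:
        return 3
    # Level 2: broader exposure to the business, moderate risk.
    if lines_of_business > 1:
        return 2
    # Level 1: routine change with limited exposure and low risk.
    return 1

# Illustrative classifications mirroring the slide's examples:
print(qa_risk_level(5, True, True))    # ERP implementation -> 4
print(qa_risk_level(3, False, True))   # CRM implementation -> 3
print(qa_risk_level(2, False, False))  # Office deployment -> 2
print(qa_risk_level(1, False, False))  # SharePoint update -> 1
```

In practice the inputs would come from your intake form, and the thresholds should be calibrated with your governance body.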
Make sure every QA artifact has the information you need for testing

An extensive suite of artifacts may be needed to describe the test planning, design, execution, results, and conclusions revealed by testing activities. These artifacts lay out the scope and processes to validate and verify value delivery in order to justify the resources and effort needed to complete these tests.

Organizations that supplement text-based test management practices with model-driven practices (such as context models, business process models, and use-case models) are able to drive efficiency through reusability and increased clarity of testing objectives.

• Test Plan: test cases, priorities, acceptance criteria
• QA Strategy: frameworks, standards, resources
• Test Dashboards and Reports: pass/fail results, conclusion and recommendation, stakeholder communication
• Test Scripts and Environment: test data and environment management, test execution
• Defect Documentation: root cause analysis, traceability to other artifacts, fix request, acceptance criteria
2.2.1 Document your QA process and artifacts

1 hour

1. Complete a suppliers, inputs, processes, outputs, and customers (SIPOC) table to ground your understanding of your SDLC process:
   a) Supplier: Who provides the artifacts needed for the process?
   b) Input: What artifact is needed to initiate and execute the process?
   c) Process: What phase is being executed and who is responsible in this phase?
   d) Output: What is produced at the end of the process?
   e) Customer: Who consumes the completed artifact?
2. Identify who is involved in each step, stage gates, and sign-off points. Verify and validate the fit of these items against different types of work items and risk levels.
3. For each SDLC phase, ask yourself the following questions and document the results:
   a) How can QA capability/thinking help?
   b) How can QA activities/teams be helped?

Download the QA Strategy Template

Input: Current understanding of QA
Output: Target QA process and artifacts
Materials: Whiteboard and markers; QA strategy
Participants: QA team; SDLC team; QA stakeholders and management; IT operations
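A SIPOC table produced in this activity can be captured in a lightweight data structure so it is easy to review, version, and reuse across teams. The following is a minimal sketch using Python dataclasses; the class and field names are our own illustration, not part of the template:

```python
from dataclasses import dataclass, field

@dataclass
class SipocRow:
    """One row of a SIPOC table for a single SDLC phase."""
    process: str                                    # phase and who executes it
    suppliers: list = field(default_factory=list)   # who provides the input artifacts
    inputs: list = field(default_factory=list)      # artifacts needed to start/execute
    outputs: list = field(default_factory=list)     # what the phase produces
    customers: list = field(default_factory=list)   # who consumes the outputs

build = SipocRow(
    process="Build (developers, QA, vendors)",
    suppliers=["Software delivery team", "Vendors", "Procurement team"],
    inputs=["Approved solution approach"],
    outputs=["Functioning solution build", "Unit testing results"],
    customers=["QA", "Service desk", "Infrastructure team"],
)

# Quick completeness check: every SIPOC column should be filled in.
missing = [col for col in ("suppliers", "inputs", "outputs", "customers")
           if not getattr(build, col)]
print(missing)  # -> []
```

A structure like this also makes it trivial to cross-check that one phase's outputs appear as a later phase's inputs.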
2.2.1 Example

Build
• Suppliers: Software delivery team; vendors; procurement team
• Inputs: Approved solution approach (scope, technical design, solution design, UI mock-ups, SME sign-off)
• Process: Build (developers, QA, vendors)
• Outputs: Functioning solution build (stable in test environment); updated source code repository; unit testing results; software documentation (e.g. support after go-live, functional design, contacts of SME, access, release notes, training documentation draft, etc.); test environment and data provisioned
• Customers: Software delivery team, especially QA; service desk; infrastructure team

Test
• Suppliers: Software delivery team, especially QA; service desk; infrastructure team
• Inputs: Functioning solution; unit testing results; software documentation (e.g. support after go-live, functional design, contacts of SME, access, release notes, etc.); functional and nonfunctional requirements
• Process: Test (QA, SME, customers)
• Outputs: Test results (e.g. defect list, root causes, test performance); decision to go live (UAT sign-off and decision-maker approval); SME and end-user feedback; test status reporting; updated project/maintenance backlog based on accepted or declined defects; updated software documentation; end-user training and onboarding documentation
• Customers: Software delivery team; vendors and procurement team; business sponsors

Deploy
• Suppliers: Software delivery team; vendors; procurement team
• Inputs: Approved and tested solution build; release and software documentation (e.g. support after go-live, functional design, contacts of SME, access, release approval to go live)
• Process: Deploy (infrastructure, release and deployment team, organizational change leaders)
• Outputs: Request for change (RFC) and CAB go/no-go meeting; solution build push to production; defined warranty period; hand-off to service desk post warranty period; updated end-user training and onboarding documentation; end-user communication of releases
• Customers: End users; customers; service desk/operational support; organizational change leader


2.2.1 Example

Build (developers, QA, vendors)

How Can QA Capability/Thinking Help?
• Updated code review guidelines and development standards
• Complete good, comprehensive unit tests and share results
• QA ability to deploy own build to test environments
• Writing good deployment plans to illustrate how builds should be configured and orchestrated in deployments to target environments
• Apply QA thinking to increase the value of unit tests (i.e. not doing it for the sake of doing it)
• Catch the right defects early with good unit tests
• Appropriate planning to ensure sufficient time for unit tests and developers are trained to design good unit tests
• Increase reusability of unit tests (e.g. through unit test frameworks) with good documentation to describe scope and depth
• Clarification of development-level testing (e.g. unit test, component test, build test, code-level regression and integration test)
• Motivate the writing of test scripts early

How Can QA Activities/Teams Be Helped?
• Delegate some testing responsibilities from QA to development
• Review and update of test plans using development lessons learned and completed work
• Demonstrations and walk-through sessions of work in progress to give insights on what is completed
• Provide quick feedback on work (e.g. paired programming)
• Teaching and training on the development approach, decisions, and code employed to help upskill resources, define testing focus, and rapidly address potential issues
• Definition of development done indicating what activities must be completed and deliverables provided to be accepted into testing (at the PMO level)

Test (QA, SME, customers)

How Can QA Capability/Thinking Help?
• Testing our tests, including test environment and data
• Breaking up tests into small test cases and orchestrating them in sequences when more complicated tests are completed
• Quickly inject feedback and defects to be fixed into the backlog for prioritization
• Defect management categorization based on severity
• Test with automation in mind
• Gaining more knowledge and training by working with business teams we are not familiar with or who have not worked with IT before
• Communicate with end users and deliver early end-user training for the developed solution
• Improve the testing of the stability and reliability of dependent systems with the solution deployed (regression testing)

How Can QA Activities/Teams Be Helped?
• Consistent documentation of defects and the code that was developed and changed
• Complete a trend analysis of defects to proactively address common defect types (e.g. via tagging)
• Business accountability for the push of tested solutions into the deployment process
• Test status meetings to identify blockers for test completion
• Definition of test done indicating what activities must be completed and deliverables provided to be accepted into deployment (at the PMO level)
• Include some security-related test cases (e.g. entering the wrong login credentials ten times)

Deploy (infrastructure, release and deployment team, organizational change leaders)

How Can QA Capability/Thinking Help?
• Verification and validation of the solution documentation
• Verification and validation of the production environment configurations
• Run automated tests
• QA involved in and providing input into infrastructure-related tests (e.g. smoke)
• End users involved in final rounds of tests
• System monitoring to gauge the fit and performance of solutions in production (pre- and post-deployment)
• Warranty period

How Can QA Activities/Teams Be Helped?
• QA input into the RFC
• Sharing and storing of solution documentation in an accessible place
• Validation that the QA tactic applied works and is valuable (retrospective)
• Ensuring that the solution developed is being used and viewed as valuable
• Opportunities to improve the solution based on immediate feedback from users and monitoring tools
• Rapidly address issues that occur immediately after deployment to production (warranty period)
• Definition of deployment done indicating what activities must be completed
Involve all relevant business and technical roles in QA

Obtain a complete 360-degree view of solution quality by including the relevant parties in QA planning, designing, and execution activities.

Without consideration of all impacted parties, your delivery decisions may omit and fail to meet critical quality objectives, encounter product rejection, or set unrealistic demands. Roles include:

• Business Stakeholders: Those expected to set the tone and direction for QA. These roles are critical in defining quality standards and acceptance criteria and in gaining buy-in and maintaining the momentum for your QA initiatives.
• End Users: Any internal business units or external customers of your products who can provide critical insights into end-user needs and challenges.
• IT Stakeholders: Internal IT teams (e.g. system architecture, infrastructure, security, data) that are directly or indirectly impacted by the actions of QA.
• Delivery Teams: Roles who are directly involved in the completion of QA planning, design, execution, and reporting activities.
Recognize the capabilities critical for QA success

Cross-Functional Collaboration: Intake and Backlog Management; Solution Architecture and UX Design; Business Analysis; Solution Development and Implementation; Operations (Including Release and Change Management); Service Desk and Maintenance.

QA Planning: QA Plan; Quality Policies, Standards, and Frameworks; QA Schedule; Test Case Design; Test Data and Environment Planning; Test Scripts; QA Resource Capacity Planning.

Execution Management: Test Environment and Data Management; Stage Gate Review and Artifact Validation; Code-Level Testing (Including Unit Testing and Code Analysis); Functional Testing; Nonfunctional Testing; User Acceptance Testing; Application Performance Monitoring.

Reporting and Analytics: QA Results and Progress Reports; Troubleshooting and Root Cause Analysis; Bug and Defect Management; QA Communication Channels; QA Feedback Loops During SDLC; QA Feedback Loops Post-Go-Live.

Vision and Buy-In: QA Practice Vision; QA Objectives and Metrics; QA Practice Fund Sourcing; Executive Sponsorship; Organizational Desire and Motivation; QA Strategy and Approach.

Practice Management: QA Processes; QA Tools; Knowledge Management and Sharing; Practice Continuous Improvement; QA Resource Management; QA Artifact Standards and Versioning; QA Guiding Principles; QA Governing or Collaborative Body; QA Practice Performance Dashboard.
2.2.2 Identify your QA roles and responsibilities

1-2 hours

1. Review your current QA capabilities and their effectiveness in sufficiently satisfying stakeholder needs and QA objectives.
2. Build a RACI chart for the various roles in your delivery team to identify who will be supporting and executing your QA capabilities, as shown on the following slide. Refer below for a definition of RACI:
   a) Responsible – A single role has the authority to execute the capability.
   b) Accountable – A role has ownership over the capability and is responsible for its success. These items can be delegated.
   c) Consulted – Roles who are asked for their input into the capability.
   d) Informed – The outputs of the capability are communicated to these roles.
3. Document a summary of this exercise in the "QA Overview" section of the QA Strategy Template.

Download the QA Strategy Template

Input: Desired QA capabilities; understanding of current and desired SDLC roles
Output: RACI chart of QA capabilities
Materials: Whiteboard and markers; QA strategy
Participants: QA team; SDLC team; QA stakeholders and management; IT operations
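The RACI definitions above lend themselves to a mechanical check. The following is a minimal sketch, assuming the chart is held as a dict of capability → role → code; it flags any capability without exactly one Accountable or a single Responsible role. The sample rows mirror the example RACI chart that follows, but the function and data structure are our own:

```python
def validate_raci(chart: dict) -> list:
    """Return problems found in a RACI chart.

    chart maps capability -> {role: code}, where code is one of
    "R", "A", "C", "I". Per the definitions above, each capability
    needs exactly one Accountable and a single Responsible role.
    """
    problems = []
    for capability, assignments in chart.items():
        codes = list(assignments.values())
        if codes.count("A") != 1:
            problems.append(f"{capability}: needs exactly one Accountable")
        if codes.count("R") != 1:
            problems.append(f"{capability}: needs a single Responsible")
    return problems

# Sample rows mirroring the example RACI chart on the next slides.
chart = {
    "QA Practice Vision": {
        "Product Owner": "I", "Solution Delivery Team": "R",
        "QA Manager": "A", "IT Operations": "I",
        "IT Management": "C", "Organization Executives": "C",
    },
    "User Acceptance Testing": {
        "Product Owner": "C", "Solution Delivery Team": "R",
        "QA Manager": "A", "Organization Executives": "C",
    },
}
print(validate_raci(chart))  # -> [] (both capabilities are well formed)
```

Running a check like this after the workshop catches capabilities that were left without a clear owner.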
2.2.2 Example

Roles (in order): Product Owner, Solution Delivery Team, QA Manager, IT Operations, IT Management, Organization Executives. Where a capability lists fewer codes, the remaining roles have no assignment.

QA Practice Vision: I, R, A, I, C, C
QA Objectives and Metrics: I, R, A, C, C, C
QA Practice Fund Sourcing: I, R, A, I, C, C
Executive Sponsorship: I, C, R, I, A, C
Organizational Desire and Motivation: C, C, R, C, A, I
QA Strategy and Approach: C, R, A, C, C
Intake and Backlog Management: A, C, C, C, C
Solution Architecture and UX Design: C, A, C, I
Business Analysis: A, R, I, I
Solution Development and Implementation: C, A, C, I
Operations (Including Release and Change Management): C, R, A, I
Service Desk and Maintenance: C, C, I, A, I, I
QA Plan: C, R, A, C, I, I
Quality Policies, Standards, and Frameworks: R, C, A (IT oriented), A (business oriented), I
QA Schedule: C, R, A, C, I
Test Case Design: C, R, A, C
Test Data and Environment Planning: I, R, A, C
Test Scripts: I, R, A, I
QA Resource Capacity Planning: I, R, A, I, I
Test Environment and Data Management: R, A, I
2.2.2 Example (continued)

Roles (in order): Product Owner, Solution Delivery Team, QA Manager, IT Operations, IT Management, Organization Executives.

Code-Level Testing (Including Unit Testing and Code Analysis): R, A, I
Functional Testing: C, R, A, C
Nonfunctional Testing: C, R, A, C
User Acceptance Testing: C, R, A, I, C
Application Performance Monitoring: C, C, C, A
QA Results and Progress Reports: I, R, A, I, I, I
Troubleshooting and Root Cause Analysis: C, R, A, C
Bug and Defect Management: C, R, A, C, I, I
QA Feedback Loops During SDLC: C, R, A, C, I
QA Feedback Loops Post-Go-Live: C, I, A, R, I, C
QA Communication Channels: I, R, A, C, A, I
QA Tools: R, A, C, I
QA Processes: C, R, A, C, I, I
Practice Continuous Improvement: C, R, A, C, I
QA Resource Management: I, R, A, I, I, I
Knowledge Management and Sharing: I, R, A, I, C
QA Artifact Standards and Versioning: I, R, A, I, C
QA Guiding Principles: C, R, A, C, I
QA Governing or Collaborative Body: C, C, R, C, A, C
QA Practice Performance Dashboard: I, R, A, I, I, I
Learn the different patterns to structure and resource QA to your product delivery teams

The primary goal of any product delivery team is to improve the delivery of value for customers and the business based on your product definition and each product's demand. Each organization will have different priorities and constraints, so your team structure may take on a combination of patterns or may take on one pattern and then transform into another.

Functional Roles
• Division of teams by functional responsibilities (e.g. developers, testers, BAs, operations, help desk), arranged according to their placement in the software development lifecycle.
• How work is allocated: Completed work is handed off from team to team sequentially as outlined in the organization's SDLC.

Shared Service and Resource Pools
• Teams are divided by functional responsibilities (e.g. developers, testers, business analysts, operations, help desk) and arranged according to their placement in the software development lifecycle (SDLC).
• How work is allocated: Resources are pulled whenever the work requires specific skills or are pushed to areas where product demand is high.

Product or System
• Teams are dedicated to the development, support, and management of specific products or systems.
• How work is allocated: Work is sent directly to the teams who are managing the product or directly supporting the requester.

Skills and Competencies
• Teams are grouped based on skills and competencies related to technology (e.g. Java, mobile, web) or familiarity with business capabilities (e.g. HR, finance).
• How work is allocated: Work is sent directly to the teams who have the IT and business skills and competencies to complete the work.

Info-Tech Note
When deciding which is the right delivery pattern for you, ask:
• Is there enough work (e.g. projects, systems) to warrant a separate team to do testing and QA?
• Will a separate QA or test group be a formal testing function, or will they come together to build a center of excellence?

Staffing models for delivery teams

Functional Roles
• Pros: Specialized resources are easier to staff; product knowledge is maintained.
• Cons: Demand on specialists can create bottlenecks; creates barriers to collaboration.
• Use case: When you lack people with cross-functional skills.

Shared Service and Resource Pools
• Pros: Flexible demand/capacity management; supports full utilization of resources.
• Cons: Unavailability of resources can lead to delays; product knowledge can be lost as resources move.
• Use case: When you have specialists, such as those skilled in security and operations, who will not have full-time work on the product.

Product or System
• Pros: Teams are invested in the full life of the product; standing teams enable continuous improvement.
• Cons: Changes in demand can lead to downtime; cross-functional skills make staffing a challenge.
• Use case: When you have people with cross-functional skills who can self-organize around the request.

Skills and Competencies
• Pros: Teams are invested in the technology; standing teams enable continuous improvement.
• Cons: Technology bias can lead to the wrong solution; resource contention when the team supports multiple solutions.
• Use case: When you have a significant investment in a specific technology stack.
QA team structure pattern examples

[Diagram: four example team structures]
• Functional Roles: sequential teams for intake, business analysis, development, testing, operations, and product release.
• Shared Service and Resource Pools: a product team draws on business analysis, development, testing, and operations resources as needed, through to product release.
• Product or System: a standing product team (e.g. Product Team: Website) owns intake through product release.
• Skills and Competencies: a standing team organized around a technology (e.g. Product Team: Java Applications) owns intake through product release.


2.2.3 Select and define your QA resource structure

30 minutes

1. Document your current staffing model for your solution delivery team. Identify all roles who are directly or indirectly involved in managing, governing, and executing QA and the roles who are dependent on the test results.
2. Evaluate the pros and cons of each model, as specified on the previous slide, relative to how you currently work, how you envision your target state, and the guiding principles you want to abide by.
   a) Discuss how QA resourcing can be improved to better support your QA capabilities.
3. Design your ideal target state of your team. Recognize that one model may not fit universally across the organization, hybrid models may be necessary, and some teams may need an intermediary state to reach your target state.
4. Document a summary of this exercise in the "QA Overview" section of the QA Strategy Template.

Download the QA Strategy Template

Input: Understanding of current QA demands
Output: QA resource allocation approach and structure
Materials: Whiteboard and markers; QA strategy
Participants: QA team; SDLC team; QA stakeholders and management; IT operations
2.2.3 Example: Resource Pool

[Diagram: resource pool staffing model]
• Project/Product Management: a Chief Product Owner with Product Owners for Business Applications, the External Website, and Development Tools, each supported by a Domain Expert.
• Director of Development: oversees a Product Development Manager (Developers 1-3), a QA Manager (QA 1-2), a BA Manager (BA 1), and an Operations Manager (Release Manager 1, Applications Support 1).

Resources are allocated to work when needed.


Phase 3: Build Your QA Toolbox

This phase will walk you through the following activities:
3.1.1 Define your defect risk tolerance
3.2.1 List your test categories
3.2.2 Define your test data and environment requirements
3.2.3 List your desired QA tools

This phase involves the following participants:
• QA team
• SDLC team
• QA stakeholders and management
• IT operations

Build a Software Quality Assurance Program overview:
1. Assess Your QA Process: 1.1 List your QA objectives and metrics; 1.2 Analyze your current QA state
2. Align on Improved QA Practices: 2.1 Define your QA guiding principles; 2.2 Define your foundational QA process
3. Build Your QA Toolbox: 3.1 Define your defect tolerance; 3.2 Align on your QA activities and tools
4. Establish a QA Roadmap: 4.1 Build your QA roadmap
Step 3.1: Define your defect tolerance

Activities
3.1.1 Define your defect risk tolerance

This step involves the following participants:
• QA team
• SDLC team
• QA stakeholders and management
• IT operations

Outcomes of this step
• Test defect risk tolerances

Build Your QA Toolbox: Step 3.1 → Step 3.2


Build a severity matrix to define tolerance levels and classify test defects

What do I need to prioritize test defects?
Prioritization rationalizes the ranking of test defects from both business and IT perspectives, justifies when or if the defect should be addressed or accepted, and determines whether testing should continue.

Severity (Priority) = Impact x Urgency

Mission-critical defects that affect a large number of people should always come first in remediation and be treated as showstoppers (i.e. Severity Level 1).

The bulk of defects within the mid-tier severity (i.e. Severity Level 3 or 4) are individual or isolated needs or problems that will require stakeholder or product owner consultation to determine whether they should be addressed or accepted.

Some questions to consider when deciding on defect severity include:
• How is productivity affected?
• How many users are affected?
• How many systems are affected?
• How critical are the affected systems to the organization?

Severity matrix (rows: impact; columns: urgency):

Impact \ Urgency   Critical       High          Medium        Low
Extensive          Critical (1)   High (2)      Medium (3)    Medium (3)
Significant        High (2)       High (2)      Medium (3)    Low (4)
Moderate           Medium (3)     Medium (3)    Medium (3)    Low (4)
Localized          Medium (3)     Low (4)       Low (4)       Low (4)

Decide how many severity levels QA needs to have to manage test defects.

Why do I need to prioritize test defects?
Prioritizing test defects is critical to ensure the most impactful issues are resolved first, aligning bug fixes with business priorities and customer needs. High-priority defects are typically blockers for release; understanding their severity and impact guides informed decision making regarding product launch or update rollouts.
Instill the right accountability in the decision to go live

The business needs to be accountable for the decision to push solutions and changes into production at the end of testing. Otherwise, they must give IT full empowerment to make that decision.

The accountability for pushing solutions with perceived tolerable test results into production often lies with IT. In other cases, the business quickly signs off on the go-live decision without fully understanding the risks and trade-offs illustrated in the test results. Both scenarios risk significant and undesired consequences, such as:
• Expectations Misalignment: Solutions do not meet stakeholder expectations due to the various decisions and changes occurring during the delivery process.
• Blame Game: When solution issues are found, IT is blamed for the release despite the sign-off from the business.
• IT-Centric Risk Tolerance: Defect tolerance levels are centered on IT's interpretation rather than what the business truly tolerates or accepts, which may conflict with what the business actually cares about.

What Should Be Done for Go-Live Decisions?
1. Instill Business Accountability – Shift the accountability for solution risks and quality concerns to the business or product owners. This approach requires QA to prepare reports so stakeholders understand the risks and trade-offs they are accepting or tolerating and can make confident decisions. IT governance will hold these stakeholders to their decisions and protect solution delivery teams.
2. Empower IT Decision Makers – Ensure IT teams have clear guidelines and autonomy to make informed decisions based on predefined and business-accepted acceptance criteria. Stakeholders must accept whatever decision IT makes with full trust that their concerns were taken to heart.

Confident go-live decision making involves accessible and consumable test results:
✓ Simplify Reporting: Develop concise and clear reports that highlight key outcomes, risks, and recommendations in a way that any decision maker (including the business) can easily understand.
✓ Visual Dashboards: Utilize visual dashboards that provide at-a-glance insights into test results, potential impacts, and decision points. These dashboards should be accessible on demand and updated live or as close to live as possible.
✓ Decision Frameworks: Offer decision frameworks that guide stakeholders through evaluating the implications of deploying code with tolerable risks, balancing business needs and technical realities.
3.1.1 Define your defect risk tolerance

1 hour

Input: Quality definition; Prioritization and triaging techniques
Output: Test defect risk tolerances
Materials: Whiteboard and markers; QA strategy; QA plan
Participants: QA team; SDLC team; QA stakeholders and management; IT operations

1. Start by identifying the indicators of high- or low-priority test defects. Once you have these sketched out, you can begin to break them into manageable levels.
2. Define each level of impact and its contributing factors considering your quality definition. Outline the impact of defects from multiple perspectives, such as business operations, end users, and enterprise systems. Provide examples for each level.
3. Define each level of urgency. Outline the factors and timelines that will dictate how soon a request needs to be addressed. Consider your quality definition in this exercise. Provide examples for each level.
4. Combine your urgency and impact levels to define the severity levels that will be used to prioritize your test cases. Indicate additional escalations if necessary. See the following slide for an example.
5. Identify exceptions to the prioritization matrix that may include specific systems, issues, roles, departments, or timing around business processes that will need to be treated as high priority.
6. Highlight the course of action to address failing tests for each severity level. Identify the suspension and resumption criteria for a failing test in a test suite.
7. Document a summary of this exercise in the "Acceptance Criteria" section of the QA Strategy Template.

Download the QA Strategy Template
3.1.1 Example

Impact rating definitions, from business and technical perspectives:

Impact Rating 1 – Extensive/Widespread
• Number of users affected: 100 or more
• Requestor: Multiple Departments
• Business criticality of application: Mission Critical
• Number of systems affected: Will impact many systems
• Security risks: Will cause organization-wide security risks
• Data affected: Will impact multiple databases or warehouses

Impact Rating 2 – Significant/Large
• Number of users affected: 50 to 99
• Requestor: Executives
• Business criticality of application: Significant
• Number of systems affected: Will impact several systems
• Security risks: Will cause department-wide security risks
• Data affected: Will impact most data entities within a database or warehouse

Impact Rating 3 – Moderate/Limited
• Number of users affected: 6 to 49
• Requestor: Department Heads
• Business criticality of application: Moderate/Limited
• Number of systems affected: Will impact multiple applications within a system
• Security risks: Will cause team-wide security risks
• Data affected: Will impact a few data entities within a database or warehouse

Impact Rating 4 – Minor/Localized
• Number of users affected: 1 to 5
• Requestor: Team Leads
• Business criticality of application: Low
• Number of systems affected: Will not impact other applications
• Security risks: Will not cause security risk
• Data affected: Will not impact any data entities


3.1.1 Example

Urgency rating definitions:

Emergency
• Regulatory: Regulatory consequences now
• Vendors: Vendor will not support the application
• Business Operations and Financials: Operations are financially impacted if the defect or risk is not addressed

High
• Regulatory: Regulatory consequences within 48 hours
• Vendors: Vendor will provide less than 50% of support for the application
• Business Operations and Financials: External operations are disrupted if the defect or risk is not addressed

Medium
• Regulatory: Regulatory consequences between 48 hours and one week
• Vendors: Vendor will provide more than 50% of support for the application
• Business Operations and Financials: Internal operations are disrupted if the defect or risk is not addressed

Low
• Regulatory: No regulatory consequences for at least one week
• Vendors: Vendor will provide full support for the application
• Business Operations and Financials: Operations can continue without disruption if the defect or risk is not addressed


3.1.1 Example

Severity Level 1
• Description: Critical system is down; little to no functionality; no workaround; many services affected; many users affected.
• Course of Action: Teams will halt work in progress and dedicate all of their time to address the defect before the next release or sprint. Fix is immediately updated to the source control with a long-term solution slated for future release.
• Minimum Pass Rate: 100% pass rate

Severity Level 2
• Description: Functionality severely restricted; no workaround; several users affected.
• Course of Action: The defect will be placed at the top of the backlog and scheduled to be addressed in the next release or sprint.
• Minimum Pass Rate: <Set by Product Owners or Stakeholders>

Severity Level 3
• Description: Basic functionality with some restrictions; workaround available; one or more users affected.
• Course of Action: The defect will be placed in the middle of the backlog and scheduled to be addressed when time is available. It will be reviewed in future refining exercises for relevance.
• Minimum Pass Rate: <Set by Product Owners or Stakeholders>

Severity Level 4
• Description: Minor problem; …
• Course of Action: The request or incident will be placed near the bottom of the backlog and scheduled …
• Minimum Pass Rate: <Set by Product …>

Suspension and Resumption Criteria
• Suspension criteria: The issue is logged and requires fixing before further testing can take place (a blocking issue).
  Resumption criteria: The issue will need to be fixed before the test item is returned to QA for testing.
• 1a) Suspension criteria: Significant differences exist between the observed behavior of the test item and that shown in the test case.
  1b) Resumption criteria: Development, QA, and PM must come to a conclusion on resolving the issue and agree on a definition of the expected behavior.
• 2a) Suspension criteria: A test item sent for testing fails more than 20% of tests.
  2b) Resumption criteria: The test item must be fixed or code …
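The suspension criteria in the example above lend themselves to a simple automated gate. This is a hypothetical sketch: the Severity 1 blocking rule and the 20% failure threshold come from the example table, and both should be tuned to your own policy.

```python
# Hypothetical sketch of the example suspension rules: a blocking Severity 1
# defect suspends testing immediately, and a test item failing more than 20%
# of its tests is suspended and returned for fixing.

def should_suspend(defect_severities, tests_run, tests_failed,
                   fail_threshold=0.20):
    """Return True if testing of the item should be suspended."""
    if 1 in defect_severities:
        return True  # blocking issue: stop all further testing
    if tests_run and tests_failed / tests_run > fail_threshold:
        return True  # too many failures: return the item for fixing
    return False
```

Encoding the gate this way makes the policy auditable: the threshold lives in one place rather than in each tester's judgment.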
Step 3.2: Align on your QA activities and tools

This step involves the following participants:
• QA team
• SDLC team
• QA stakeholders and management
• IT operations

Activities
3.2.1 List your test categories
3.2.2 Define your test data and environment requirements
3.2.3 List your desired QA tools

Outcomes of this step
• Test definitions
• Test data and environment management requirements
• List of QA tools currently available in your organization
• List of desired QA tools to be used
Build your test plan, document your test cases

WHAT IS A TEST PLAN?
• A test plan document describes the scope, approach, resources, and schedule of intended testing activities for a specific system, product, project, or team. It identifies test items, the features to be tested, the testing tasks, who will do each task, the test environment, the test design techniques, entry and exit criteria, the rationale for the choice of tests, and any risks requiring contingency planning.
• Common components of a test plan document include: list of test items and their versions, in-scope and out-of-scope features to be tested, test approach, test environment, pass/fail criteria, schedule, responsibilities, risks, and approvals.

HOW IS A TEST PLAN VALUABLE?
• Focus testing efforts on high-priority test cases and avoid test redundancies.
• Level set expectations of just enough testing and the tolerance of allowable escaped defects.
• Coordinate and schedule testing activities and resources.

Use Info-Tech's Test Plan Template to document your testing initiatives.

WHAT IS A TEST CASE?
• A test case is a set of conditions or variables under which a tester will determine whether a system under test satisfies requirements or works correctly. The process of developing test cases can also help find problems in the requirements and design of an application.
• Common components of a test case document include: prerequisites or preconditions that must be fulfilled prior to executing the test, test data and environment requirements for the test, step-by-step procedure to execute the test, expected results, and acceptance criteria.

HOW IS A TEST CASE VALUABLE?
• Clearly describes the steps and conditions to complete the tests correctly.
• Validates and verifies the satisfaction of functional, nonfunctional, and performance requirements.
• Motivates delivery roles to break up test scope to enable test parallelization.

Use Info-Tech's Test Case Template to document your test suites and cases.

Source: Software Testing Fundamentals.

Info-Tech Insight
QA is a shared responsibility. Your test plans and test cases are not static documents, nor built in a single event. They are continuously updated and improved through feedback during the solution delivery process in collaboration with developers and other key stakeholders.
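When a test case is automated, the components listed above (preconditions, test data, procedure, expected results) map naturally onto code. The sketch below is a hypothetical illustration; the `discount_total` function is invented and not from any Info-Tech template.

```python
# Hypothetical sketch mapping test case components onto a plain Python test.

def discount_total(subtotal, discount_pct):
    """System under test (invented for illustration)."""
    if not 0 <= discount_pct <= 100:
        raise ValueError("discount_pct must be between 0 and 100")
    return round(subtotal * (1 - discount_pct / 100), 2)

def test_discount_applied():
    # Precondition / test data: a known subtotal and a valid discount.
    subtotal, discount = 200.00, 15
    # Procedure: execute the function under test.
    result = discount_total(subtotal, discount)
    # Expected result / acceptance criterion.
    assert result == 170.00
```

Writing the expected result into the test itself is what lets the same case run unchanged in every cycle, which is the property regression suites depend on.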
Assure sufficient test coverage using a model-based approach

1. Understand requirements thoroughly – Begin by gaining a deep understanding of the software requirements and specifications. This ensures that the test design is aligned with what the software is intended to do.
2. Define test objectives and acceptance criteria – Each test case should have a clear objective and be linked to specific requirements. There must be a clear understanding of the acceptance criteria and assurance that the requirements are testable.
3. Model your processes to reveal various test scenarios – Appropriately model the requirements to reveal ideal test coverage.
4. Prioritize test cases against risk – Not all test cases are equally critical. Prioritize them based on factors like business impact, user needs, and likelihood of failure, focusing on high-risk areas first.
5. Incorporate different strategies while maintaining reusability – Utilize a mix of testing types (functional, regression, performance, etc.) and techniques (boundary value analysis, equivalence partitioning, etc.) to cover a wide range of scenarios. Design tests in a way that they can be reused in future projects or testing cycles.

To learn more about test design practices, refer to Appendix A: Test Design Best Practices.
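Two of the techniques named above, equivalence partitioning and boundary value analysis, can be sketched in a few lines. The 1–100 valid range below is an invented example, not from the source material.

```python
# Hypothetical sketch of equivalence partitioning and boundary value analysis
# for a field that accepts integers 1-100 (invented range).

LOW, HIGH = 1, 100

def boundary_values(low, high):
    """Classic boundary picks: just outside, on, and just inside each edge."""
    return [low - 1, low, low + 1, high - 1, high, high + 1]

def partition(value, low=LOW, high=HIGH):
    """Assign a value to its equivalence class."""
    if value < low:
        return "invalid-low"
    if value > high:
        return "invalid-high"
    return "valid"
```

One representative per partition plus the boundary values gives broad coverage with far fewer cases than exhaustive input testing.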
Define and streamline your tests

Look at the various tests that can be applied throughout the delivery pipeline and use them to standardize a list of tests to meet your quality standards and acceptance criteria, considering your constraints. Your QA practice should clearly describe the tests and discuss the benefits of completing them.

The Agile Testing Quadrants (Source: Crispin, 2012):

Business-Facing, Supporting the Team (Automated & Manual) – clarify thinking of the problem and that the code behaves as expected:
• Functional Tests
• Examples
• Story Tests
• Prototypes
• Simulations

Business-Facing, Critique Product (Manual) – uncover prior mistakes and omissions, i.e. escaped defects:
• Exploratory Testing
• Scenarios
• Usability Testing
• User Acceptance Testing
• Alpha/Beta (Canary Releases)

Technology-Facing, Supporting the Team (Automated):
• Unit Tests
• Component Tests
• Integration

Technology-Facing, Critique Product (Specialized Tools):
• Performance & Load Testing
• Security Testing
• "-ility" Testing
3.2.1 List your test categories

1 hour

Input: Acceptance criteria; Defect tolerance
Output: Test definitions
Materials: Whiteboard and markers; QA strategy; QA plan
Participants: QA team; SDLC team; QA stakeholders and management; IT operations

1. List and define the tests that QA is expected to complete, considering your acceptance criteria and defect tolerances.
2. Identify who owns each test.
3. Document a summary of this exercise in the "QA Overview" section of the QA Strategy Template.

Download the QA Strategy Template


3.2.1 Example

Test: Unit Testing
Description: Unit testing refers to tests that verify the functionality of a specific section of code, usually at the function level.
Owner: Development Team

Test: Integration Testing
Description: Integration testing verifies the interfaces between components against a software design.
Owner: Development Team

Test: Performance Testing
Description: Performance testing is generally executed to determine how a system or subsystem performs in terms of responsiveness and stability under a particular workload.
Owner: QA and Deployment Team

Test: Regression Testing
Description: Regression testing focuses on finding defects after a major code change has occurred.
Owner: Deployment Team

Test: User Acceptance Testing (UAT)
Description: This test consists of a process of verifying that a solution works for the user, with the user.
Owner: BA

Test: Functional Testing
Description: Functional testing refers to activities that verify a specific action or function of the code.
Owner: QA

Test: Nonfunctional Testing
Description: Nonfunctional testing refers to aspects of the software that may not be related to a specific function or user action, such as scalability or other performance, behavior under certain constraints, or security.
Owner: QA

Source: "Software Testing," Wikipedia.
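To make the unit vs. integration distinction in the table concrete, here is a hypothetical sketch; the inventory functions are invented examples, not part of the source material.

```python
# Hypothetical sketch contrasting two of the categories above.

def reserve(stock: dict, sku: str, qty: int) -> dict:
    """Unit under test: reserve qty of a SKU, never going below zero."""
    if stock.get(sku, 0) < qty:
        raise ValueError(f"insufficient stock for {sku}")
    return {**stock, sku: stock[sku] - qty}

def test_reserve_unit():
    # Unit test: one function, one behavior, at the function level.
    assert reserve({"A1": 5}, "A1", 2) == {"A1": 3}

def test_reserve_then_reorder_integration():
    # Integration-style test: verify two components agree at their interface,
    # here the reservation function and a simple reorder policy.
    stock = reserve({"A1": 5}, "A1", 4)
    needs_reorder = stock["A1"] < 2
    assert needs_reorder is True
```

Ownership follows the table: the development team typically owns both of these, while business-facing categories such as UAT sit with the BA and business users.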


Ensure your test environment and data meet your testing requirements

Setting up and managing good test environments and data are critical to QA success. Any flaws in these processes may generate inaccurate test results, leading to costly fixes and upset stakeholders. As organizations begin looking at automation tools to expand test coverage, having readily accessible, on-demand environments and data becomes even more important. Work with operations to get the environments and data you need to effectively complete testing.

Build good test environment and data management practices into your QA and IT operations practices:
• Use production or production-like data whenever possible to verify and validate that a solution can reasonably operate in real-world scenarios.
• Review, refresh, and reconfigure development, QA, and staging environments alongside solution release cycles, vendor updates, and maintenance activities.
• Leverage cloud, virtual, and container technologies to enable and automate rapid environment provisioning, refreshes, and teardown to alleviate the burden on IT operations teams.
• Define environment and data configurations, parameters, and other requirements early in the SDLC process to allow sufficient preparation time and weigh the risks of using currently available items.
• Actively monitor and govern shared use of test data and environments (e.g. checkout policies) to ensure alignment with quality and …

However, test environment and data management remains a challenge for many organizations:
• 44% of organizations stated that test data availability was the most impactful challenge preventing them from integrating quality engineering into the DevOps/DevSecOps pipeline (N=750).
• 34% of organizations stated that legacy systems and decisions were one of two main things holding QA teams back from doing more with automation; 11% indicated lack of environments and 13% indicated lack of data (N=615).
Source: Sogeti, 2023.

To learn more, refer to Appendix C: Test Data and Environment Management Good Practices.
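The "production-like, anonymized data" practice above can be sketched as a masking step that keeps formats and cross-table joins intact. This is a hypothetical illustration: the field names and hashing scheme are assumptions, and real anonymization must also satisfy your privacy and regulatory requirements.

```python
# Hypothetical sketch of anonymizing production-like test data: mask direct
# identifiers while preserving format and referential integrity.

import hashlib

def anonymize(record: dict) -> dict:
    """Replace identifying fields with stable, format-preserving stand-ins."""
    out = dict(record)
    # Stable pseudonym: the same input always maps to the same token, so
    # relational joins across tables keep working after masking.
    token = hashlib.sha256(record["email"].encode()).hexdigest()[:8]
    out["email"] = f"user_{token}@example.test"
    out["name"] = f"User {token}"
    return out
```

Non-identifying fields pass through untouched, which is what keeps the data "production-like" enough to exercise realistic code paths.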
3.2.2 Define your test data and environment requirements

30 minutes

Input: Test cases, scenarios, and plan; Test requirements
Output: Test data and environment management requirements; Understanding of test data and environments currently available
Materials: Whiteboard and markers; QA strategy; QA plan
Participants: QA team; SDLC team; QA stakeholders and management; IT operations

1. Review the test cases that you need to execute and other requirements and documents that describe the technical environment the solution will operate in.
2. Discuss the test data and environments that are currently available and what can be used and easily provisioned to meet your testing needs.
3. Discuss the test environment requirements, including hardware, software, and infrastructure, and the test data requirements to successfully execute your tests.
4. Document a summary of this exercise in the QA Strategy Template.

Example:

Test Environment: Application server onsite
• Version: Version 3.2
• State or Type: On-premises
• Description: The environment used will consist of an application server and a database server. The application and database servers will be installed according to the implementation notes.
• Refresh Cycle: Within two weeks of major production environment updates

Test Data Requirements
• Production-like data: "Anonymized" data based on a request from QA. As much as possible of the original production data and details are maintained.
• Applicable Test: Final sanity tests before go-live

Download the QA Strategy Template
Select your QA tools

Your QA tools should accommodate the various quality requirements from the various stages of your SDLC to ensure the product meets stakeholder expectations while abiding by nonfunctional requirements. Today's QA tools leverage graphical user interfaces and automation capabilities to provide developers, testers, and operations with the ability to easily create, manage, execute, and report tests and results. Common attributes include:
• Record and Script Tests: Ability to generate tests by recording the actions of a user and write and modify test scripts in a scripting language (e.g. VBScript, JavaScript, Python) within a coding editor.
• Reporting: A reporting suite containing canned reports with the ability to export in multiple formats.
• Requirements Traceability: Errors, queries, and defects logged and tracked throughout the testing lifecycle, test cases/scripts mapped to high-level requirements, and ability to trace progress on resolving issues connected to requirements.
• Test Planning: Build test cycles and phases, define milestones to coordinate dependencies and releases, and then assign resources to execute the planned tests.
• Test Case Generator: Automatically generate test cases directly from requirements, the source code, recordings of user actions, and visual and graphical workflows and models using a GUI interface.
• Risk Analysis: Ability to perform impact analysis based on test results on the business and technical risks of an application going live.
• Test Artifact Management: Create, template, manage, and version test artifacts (e.g. test plans, test cases, test scripts, test dashboards, defect logs, test data, test environments).
• Integration with Third-Party Tools: Easily integrate with other systems …

See SoftwareReviews Software Testing Tools for more information.

Learn the Types of QA Tools

Despite a common set of features, QA tool vendors and providers (proprietary or open-source) will differentiate themselves based on the type of technology stack they support, integration partners, degree of QA lifecycle support, and automation capabilities. Some tools only manage testing activities and do not provide the native capability to test. Therefore, your choice of tool will depend on the reason why you need to test and how you intend on using the test results.
• Test Management Solutions – These solutions provide a centralized, unified, and comprehensive platform for managing manual and automated tests, QA processes, assets, plans, activities, and results against defined quality goals and requirements. A wide range of tests (e.g. regression tests, integration tests, performance tests, function tests, security tests) can be natively executed through these platforms for applications on multiple platforms.
  o Notable vendors in this category include: Microsoft, Micro Focus, Perforce, Parasoft, IBM, and TestRail.
• Test Execution Solutions – These solutions are designed to perform a specific set of tests for applications on certain technology stacks. Some tools are aligned to particular industry testing frameworks (such as NUnit), support certain development methodologies (e.g. ATDD, BDD), or are architected in a specific way (e.g. cloud, SOA). These tools are often used to complement feature gaps in test management solutions or are plugged into ALM solutions to extend their testing capabilities.
  o Notable vendors in this category include: T-Plan, …
Review the differentiating features of
QA tools
• Traceability: Testing artifacts map to business and nonfunctional requirements and
links to code, test cases, defects, support tickets, and other development artifacts.
• API Testing: Testing for desktop framework APIs (e.g. Java, .NET), web-based APIs, and
cloud APIs.
• Mobile Testing: Performance testing (network-related, response time, memory usage),
security testing (client-side and server-side security), functionality testing (form factor,
gestures, internationalization/localization, interruptions), compatibility testing,
automated testing, record and reuse test scripts across different applications, and
testing across multiple mobile platforms (web, native, hybrid).
• Cloud Testing: Record and reuse test scripts across all applications to conduct testing
in multiple cloud test labs that are running concurrently and perform functional,
security, and performance testing of a cloud-based application.
• Test Lab, Data, and Environment Management: Creation, execution, and removal
of virtual and cloud test labs and environments. Ability to simulate the application’s
intended environment with virtual or real data or infrastructure/network images.
• Business Process Testing: Trace business processes to test cases and run business
components in business process testing.
• Unit Testing: Ability to test individual code modules and send notifications to respective stakeholders when testing for code modules fails/succeeds.


Review the differentiating features
of QA tools (cont’d.)
• Regression Testing: Ability to test the impact of code changes and send notifications
when code changes fail/succeed to respective stakeholders.
• Security Testing: Evaluate a system’s susceptibility to security threats, including
applications scanning, penetration tests, static analysis, threat modeling, and
vulnerability scanning.
• Low-Code and No-Code Test Creation: Automatic generation of test cases and
scripts using natural language, business process flows, or wireframe diagrams.
• Real User Monitoring: Record and track all user interactions of a deployed
application.
• Application Performance Monitoring (APM): Tracks an application’s use and
performance by monitoring the various system stack components supporting the
application’s execution.
• Predictive Defect Analysis and Forecasting: Trigger alerts of potential issues
based on observations made in the software development lifecycle or provide static
analysis that indicates potential issues without the execution of the code. These
potential issues can be based on historic work or vendor-provided insights.
• Static Code Analysis: Review the code structure, quality, readability, coding style, and code coverage of your applications.
Know the difference between automated testing and test automation

See Info-Tech's Automate Testing to Get More Done for more information.

What is automated testing?
Automated testing is the act of conducting specific tests via automation (e.g. a set of regression tests) as opposed to conducting them manually (Tricentis, 2017). This capability includes:
• Automated execution of functional and nonfunctional tests through predefined scripts and testing tools.
• Coordinating use of the various tools and technologies needed to perform automated tests.
• Preparation and verification of the results generated from the tests.

How is automated testing valuable?
• Volume of Tests: Automatically runs many tests across many different technology, user, and business conditions.
• Test Execution Accuracy: Executes tests consistently and reliably across all solutions and reduces errors otherwise found in manual tests.
• Speed of Test Cycles: Reduces time spent on time-consuming, trivial manual testing tasks and allows teams to focus on more complex tests.

What is test automation?
Test automation refers to automating the process of tracking and managing the different tests (Tricentis, 2017) and the artifacts supporting them. This capability includes:
• Automated creation of test cases and suites using submitted requirements, designs, and artifacts created via low- and no-code.
• Provisioning, refreshing, and teardown of test data and environments according to QA usage policies.
• Analysis of test results to suggest solution and QA process improvements.

How is test automation valuable?
• Standardized QA Planning: Enforces the use of frameworks and standards for all QA activities consistently across all teams and solutions, including test case management and prioritization.
• Enhanced Test Coverage: Defines more and broader test cases and scenarios using historic test results, industry and user trends, and regulation and standard changes.
• Stakeholder Communication: Tailors the progress and results of QA activities in a form decision makers can understand.
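The "automated testing" half of the distinction above can be as simple as a scripted suite run unattended by a runner. This hypothetical sketch uses Python's standard `unittest` module; the `slugify` function is an invented system under test.

```python
# Hypothetical sketch of automated test execution with the stdlib runner.

import unittest

def slugify(title: str) -> str:
    """System under test (invented): turn a title into a URL slug."""
    return "-".join(title.lower().split())

class RegressionSuite(unittest.TestCase):
    def test_basic(self):
        self.assertEqual(slugify("Hello World"), "hello-world")

    def test_extra_spaces(self):
        self.assertEqual(slugify("  QA   Program "), "qa-program")

if __name__ == "__main__":
    # Running the suite unattended is "automated testing"; the tooling that
    # schedules, provisions, and reports on such runs is "test automation."
    suite = unittest.defaultTestLoader.loadTestsFromTestCase(RegressionSuite)
    unittest.TextTestRunner(verbosity=2).run(suite)
```

Test automation then wraps runs like this with scheduling, environment provisioning, and result reporting rather than replacing them.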


Review our SoftwareReviews reports

Software Testing Tools
Test management helps product delivery teams design, manage, and coordinate testing and quality assurance activities among business and IT testers. This solution encompasses test planning, orchestrating the automation and execution of testing initiatives and resources, analysis and reporting of test results, and alignment of features to quality standards.

A/B Testing Software
A/B testing software is used to perform a randomized experimentation process where two versions of an asset are compared by marketers or researchers to gauge the performance of an asset to promote the best digital experiences and campaign results. The software includes split testing capabilities, server-side testing, audience targeting, and analytics.

Penetration Testing Software
Penetration testing is an active, invasive scanning activity that uncovers system or application vulnerabilities and exploits those vulnerabilities in the same manner as a hacker would. It is considered an "ethical hacking" activity that is endorsed by the target organization.

Mobile Application Testing Tools
Mobile application testing tools provide solutions that allow your team to test and fix your mobile applications before launch. A successful product will provide numerous testing environments, concurrent testing, and robust reporting.

Application Security Testing Tools
Application security testing tools identify security vulnerabilities in applications and include static application security testing (SAST), which analyzes source code; dynamic application security testing (DAST), which tests code while it executes; and software composition analysis (SCA), which identifies vulnerabilities in third-party components.
3.2.3 List your desired QA tools

1 hour

Input: QA objectives and expectations; Current state of the QA practice; Target state role definitions, processes, and tactics
Output: List of QA tools currently available in your organization; List of desired QA tools to be used
Materials: Whiteboard and markers; QA strategy; QA plan
Participants: QA team; SDLC team; QA stakeholders and management; IT operations

1. Review the key outcomes of the previous exercises to help inform the features and vendor support you require to support your QA needs:
   a) QA objectives and expectations.
   b) Current state of the QA practice.
   c) Target state role definitions, processes, and tactics.
2. For each QA capability, list the tools currently available in your organization and the use cases for those tools, as shown on the following slide. Indicate how your tools must change to meet your QA needs.
3. List the tools that you want to have, but are currently missing, in your QA and SDLC practice. Indicate the purpose of these missing tools.
   a) If time allows, begin brainstorming the features you need to fulfill the desired tooling use cases.
4. Document a summary of this exercise in the "QA Tools" section of the QA Strategy Template.

Download the QA Strategy Template
3.2.3 Example
QA Capabilities Desired Tools Tooling Use Cases
Cross-Functional Collaboration Atlassian Jira (application lifecycle management) Intake and backlog management
GitHub (source code management) Solution architecture and UX design
ServiceNow (IT service management, configuration Business analysis
management database) Solution development and implementation
Operations (including release and change management)
Service desk and maintenance
QA Planning SmartBear Zephyr (test management) QA plan, schedule, and resource capacity planning
GAP – TestComplete (automated testing) Adherence to quality policies, standards, and frameworks
Test case design
Test data and environment planning
Test scripts management
Execution and Management SmartBear Zephyr (test management) Test environment and data management
GAP – TestComplete (automated testing) Stage gate review and artifact validation
GAP – Selenium (automated testing) Code-level testing
GAP – SonarQube (source code analysis) Functional testing
New Relic APM (application performance Nonfunctional testing
monitoring) User acceptance testing
Application performance monitoring
Reporting and Analytics SmartBear Zephyr (test management) QA results and progress reports
ServiceNow (IT service management, configuration Troubleshooting and root cause analysis
management database) Bug and defect management
Atlassian Confluence (communication and Feedback loops during SDLC and post-go-live
collaboration) QA communication
Vision and Buy-In Microsoft Word and PowerPoint QA strategy
QA funding
Practice Management Microsoft Word and PowerPoint QA tooling and process management
Atlassian Jira (application lifecycle management) QA resource management
SmartBear Zephyr (test management) Knowledge management and sharing
QA artifact standards, templates, and versioning
Hit a home run with your stakeholders

Use a data-driven approach to select the right tooling vendor for your needs – fast.

Info-Tech Insight
Not all software selection projects are created equal – some are very small, and some span the entire enterprise. To ensure that IT is using the right framework, understand the cost and complexity profile of the application you're looking to select. Info-Tech's Rapid Application Selection Framework approach is best for commodity and mid-tier enterprise applications. Selecting complex applications is better handled by the methodology in Info-Tech's Implement a Proactive and Consistent Vendor Selection Process.

Investing time improving your software selection methodology has big returns.
Phase 4
Establish a QA Roadmap

This phase will walk you through the following activities:
4.1.1 Define your roadmap
4.1.2 Draw your QA communication flow

This phase involves the following participants:
• QA team
• SDLC team
• QA stakeholders and management
• IT operations

Build a Software Quality Assurance Program:
Phase 1 – Assess Your QA Process: 1.1 List your QA objectives and metrics; 1.2 Analyze your current QA state
Phase 2 – Align on Improved QA Practices: 2.1 Define your QA guiding principles; 2.2 Adopt your foundational QA process
Phase 3 – Build Your QA Toolbox: 3.1 Define your defect tolerance; 3.2 Execute your QA activities
Phase 4 – Establish a QA Roadmap: 4.1 Build your QA roadmap
Step 4.1
Build your QA roadmap

Activities
4.1.1 Define your roadmap
4.1.2 Draw your QA communication flow

This step involves the following participants:
• QA team
• SDLC team
• QA stakeholders and management
• IT operations

Outcomes of this step
• List of QA initiatives and roadmap
• Communication map
Lay out your next steps
Take gradual steps to build a QA roadmap.

A QA roadmap is a time-based plan that defines where you are, where you want to go, and how to get there (Chicago Quality Assurance Association, 2017). It serves as a strategic plan that outlines the key quality assurance activities, milestones, and objectives over a specific timeframe.

The Now, Next, Later technique is a method for prioritizing and planning improvements or tasks. It involves breaking down a list of tasks or improvements into three categories:
• Now tasks must be completed immediately. These are usually urgent or critical, and they must be completed to keep the project or organization running smoothly.
• Next tasks should be completed soon. These are not as critical as Now tasks, but they are still important and should be tackled relatively soon.
• Later tasks can be completed later. These are less critical and can be deferred without causing major problems.

Using this technique, you can prioritize and plan the most important tasks while allowing the flexibility to adjust as necessary. Plotting each task by effort against criticality also helps clarify what must be done first vs. what can wait.

[Chart: tasks plotted with effort on the vertical axis and criticality on the horizontal axis, grouped into Now, Next, and Later bands.]
Anticipate and mitigate risks in your QA roadmap delivery
Understanding these risks prepares you for potential challenges in implementing your QA roadmap.

1. Resource Constraints – Limited resources, including staffing and budget, can restrict the ability to implement new tools, technologies, and processes outlined in the roadmap.
2. Resistance to Change – Any organizational resistance, from the development team, QA team, or other stakeholders, can hinder the adoption of new QA practices and technologies.
3. Lack of Business Alignment – Misalignment between the QA roadmap and overarching business objectives may lead to efforts that do not contribute to the company's strategic goals.
4. Integration Challenges – Difficulty in integrating new tools and practices with existing systems and workflows can disrupt operations and delay benefits realization.
5. Overreliance on Automation – Overemphasis on automation without considering the importance of manual testing for certain scenarios can lead to gaps in test coverage and quality assurance.

Info-Tech Insight
Start small to evaluate the fit and acceptance of new and modified roles, processes, and technologies. Aggressive initiatives and timelines can jeopardize success in big-bang deployments. Gradual and thoughtful adoption of QA ways of working helps your teams focus on practice fit rather than fighting the status quo. This approach must involve change-tolerant teams, solutions, and cooperative stakeholders.
4.1.1 Define your roadmap
1-2 hours

Input: Gaps between the QA current and target state; QA objectives
Output: List of QA initiatives and roadmap
Materials: Whiteboard and markers; QA strategy
Participants: QA team; SDLC team; QA stakeholders and management; IT operations

1. Brainstorm and list all potential QA initiatives by reviewing your current state assessments, the gaps that were revealed, and the brainstormed solutions to fill those gaps.
2. For each initiative listed, identify and assign a responsible owner. This person will be accountable for the planning, execution, and success of the initiative.
3. Estimate the degree of effort to complete each initiative (small, medium, large) and its criticality to the overall success of your QA practice (small, medium, large).
4. Group each initiative by how soon you need to address it:
   a) Now: Let's do this ASAP.
   b) Next: Sometime very soon, let's do these things.
   c) Later: Much further off in the distance, let's consider these things.
5. Document a summary of this exercise in the "Communication" section of the QA Strategy Template.

Download the QA Strategy Template
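The effort/criticality grouping from this activity can also be captured in code. Below is a minimal Python sketch; the bucketing heuristic, initiative names, and data structure are invented for illustration and are not prescribed by this blueprint:

```python
# Sketch: bucket QA initiatives into Now / Next / Later by criticality and effort.
# The rule below (high criticality plus manageable effort -> Now) is one
# illustrative heuristic; teams should set their own thresholds in the workshop.

SCALE = {"S": 1, "M": 2, "L": 3}  # small / medium / large

def bucket(initiative):
    """Return 'Now', 'Next', or 'Later' for one initiative dict."""
    criticality = SCALE[initiative["criticality"]]
    effort = SCALE[initiative["effort"]]
    if criticality == 3 and effort <= 2:
        return "Now"    # critical and cheap enough to start immediately
    if criticality >= 2:
        return "Next"   # important, schedule soon
    return "Later"      # low criticality, defer without major risk

initiatives = [
    {"name": "Initiative A", "effort": "S", "criticality": "L"},
    {"name": "Initiative B", "effort": "M", "criticality": "M"},
    {"name": "Initiative C", "effort": "L", "criticality": "S"},
]

roadmap = {"Now": [], "Next": [], "Later": []}
for item in initiatives:
    roadmap[bucket(item)].append(item["name"])

print(roadmap)
```

Keeping the initiative list in a simple structure like this makes it easy to re-bucket items as estimates change between workshop sessions.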
4.1.1 Example

Initiatives with owner, effort (S/M/L), and criticality (S/M/L):
• As a BA, I want to include QA during the requirements elicitation to better prepare QA and to leverage the expertise from QA for better requirements documentation. – Mark Lee | Effort: S | Criticality: L
• As a QA, I would like to have a better knowledge base (ADO wiki) library for us to access as it will be more efficient for the QA to understand. – Luke Cage | Effort: S | Criticality: L
• As a QA, I would like to see more recognition at a corporate level (not so much individual level) for the great work that our QA team has accomplished, to help keep me motivated to go above and beyond what's required of me. – Reed Richards | Effort: S | Criticality: L
• As a QA, I want to implement automation testing for regression test cases, so that we can reduce manual effort and time required for repetitive testing tasks and ensure high-quality releases. – Yi Zhou | Effort: M | Criticality: L
• As a member of the LT team, I want to communicate that quality is a continuous process, not a phase, so that quality is everyone's job, not the function of just QA. (Marketing and Communication Strategy and Pitch/Communication Deck.) – Jim Brown | Effort: M | Criticality: L
• As a QA team member, I want to build quality into the mindset, culture, and way of working of each SDLC role, so that quality is viewed as a forethought and a priority. – Sue Moody | Effort: M | Criticality: L
• As a QA team member, I would like to be involved in projects as early as possible and have my risks, concerns, and insights accommodated (i.e. planning) so that I can make sufficient preparation, gain knowledge, and increase the efficiency of my test plan and test cases. – Sue Moody | Effort: M | Criticality: L
• As a part of the leadership team for the PMO team, I want a documented SDLC RACI and process with QA built in so that SDLC roles and responsibilities are defined. – Reed Richards | Effort: M | Criticality: L
• As a QA team member, I would like Business SMEs to formally adopt Azure DevOps for collaboration throughout the SDLC process, instead of emails, so that we can have better tracking of each ticket, communication, and approval. – Luke Cage | Effort: M | Criticality: L
• As QA, I want to define my toolset for all SDLC roles, so that I know what and how they are being used and how to optimize their use as a whole, ensure end-to-end traceability, and ensure that business requirements are properly translated into the work being done. – Yi Zhou | Effort: M | Criticality: L
4.1.1 Example (cont'd.)

• As a QA, I would like regular training so that we can stay up-to-date with the latest testing techniques and tools to improve our skills. – Yi Zhou | Effort: M | Criticality: L
• As SA, I want to refresh my database admin skills and gain relevant access so that I can proactively look into performance issues. – Reed Richards | Effort: M | Criticality: M/L
• As QA, I would like a mandated process for Business SMEs to review, provide feedback on, and sign off on test cases, so that any gaps will be identified and addressed before testing starts. – Luke Cage | Effort: M | Criticality: M
• As QA, I would like to gain knowledge in multiple streams of business. – Sue Moody | Effort: L | Criticality: L
• As QA, I want to integrate verification and validation activities (e.g. requirements validation, mock-up reviews, unit testing) earlier in the software development lifecycle so that we can help identify gaps and defects earlier on. – Reed Richards | Effort: L | Criticality: L
• As QA, I want clear procedures, business accountabilities, and escalation processes in place near the end of the SDLC (e.g. contingency go-live dates) so that testing is not unnecessarily the first to be cut when time and resources are running out. – Yi Zhou | Effort: L | Criticality: L
• As the SDLC team, I want to alleviate resource capacity, so that I can participate in upstream activities, deliver practice improvements, and build and develop my skills. – Sue Moody | Effort: L | Criticality: L
• As QA, I would like to have a bigger voice (in relation to time required to test) and be viewed as a trusted and respected contributor, so that my decisions and insights are respected and I don't feel like I have to modify my test plan or remove test cases unnecessarily. – Luke Cage | Effort: L | Criticality: L
• As IT, I would like to see more designated SMEs in every workstream so that it's easier to collaborate on projects and tickets with business users/SMEs. – Reed Richards | Effort: L | Criticality: L
• As a QA, I would like to introduce CI/CD in order to catch defects earlier in the development process and ensure high-quality code is being released to Production. – Yi Zhou | Effort: L | Criticality: M
4.1.1 Example (cont'd.)

Initiatives not prioritized because they duplicate items above:
• As a QA, I would like to learn more about the different workstreams and business processes of RCM so I can help create better full-coverage test cases. – Yi Zhou
• As a QA, I would like to gain more knowledge of the different workstreams within the RCM so that I don't feel like a deer in the headlights! – Sue Moody
• As a QA, I want to be involved in a project or change earlier than in previous projects/changes, so I can better understand the project/change and increase my chances of finding defects earlier. – Luke Cage
4.1.1 Example

Now (0-3 months)
• As a BA, I want to include QA during the requirements elicitation to better prepare QA and to leverage the expertise from QA for better requirements documentation.
• As a QA, I would like to have a better knowledge base (ADO wiki) library for us to access as it will be more efficient for the QA to understand.
• As a QA, I would like to see more recognition at a corporate level (not so much individual level) for the great work that our QA team has accomplished, to help keep me motivated to go above and beyond what's required of me.
• As the SDLC team, I want to alleviate resource capacity so that I can participate in upstream activities, deliver practice improvements, and build and develop my skills (e.g. creating the mechanisms or tactics to open capacity – band/blackout dates, lower WIP, rigor in requirements assessment, and delivery commitments).
• As a member of the LT team, I want to communicate that quality is a continuous process, not a phase, so that quality is everyone's job, not the function of just QA. (Marketing and Communication Strategy and Pitch/Communication Deck.)
• As a QA team member, I would like to be involved in projects as early as possible and have my risks, concerns, and insights accommodated (i.e. planning), so that I can make sufficient preparation, gain knowledge, and increase the efficiency of my test plan and test cases.
• As a part of the leadership team for the PMO team, I want a documented SDLC RACI and process with QA built in so that SDLC roles and responsibilities are defined.
• As QA, I want to define my toolset for all SDLC roles so that I know what and how they are being used and how to optimize their use as a whole, ensure end-to-end traceability, and ensure that business requirements are properly translated into the work being done.
• As IT, I would like to see more designated SMEs in every workstream so that it's easier to collaborate on projects and tickets with business users/SMEs.
• As Info-Tech, I want to have a conversation on CoE/CoP with RCM.
• As Info-Tech, I want to have a conversation on software estimation with RCM.
4.1.1 Example

Next (3-6 months)
• As QA, I would like a mandated process for Business SMEs to review, provide feedback on, and sign off on test cases, so that any gaps will be identified and addressed before testing starts.
• As a QA, I would like to gain knowledge in multiple streams of business.
• As a QA, I would like to have a bigger voice (in relation to time required to test) and be viewed as a trusted and respected contributor, so that my decisions and insights are respected and I don't feel like I have to modify my test plan or remove test cases unnecessarily.
• As QA, I would like regular training so that we can stay up-to-date with the latest testing techniques and tools to improve our skills.
• As a QA team member, I would like Business SMEs to formally adopt Azure DevOps for collaboration throughout the SDLC process, instead of emails, so that we can have better tracking of each ticket, communication, and approval.
• As QA, I want to perform and execute verification and validation activities (e.g. requirements validation, mock-up reviews, unit testing) earlier in the software development lifecycle, so that we can help identify gaps and defects earlier on.
• As QA, I want clear procedures, business accountabilities, and escalation processes in place near the end of the SDLC (e.g. contingency go-live dates) so that testing is not unnecessarily the first to be cut when time and resources are running out.

Later (6+ months)
• As QA, I want to implement automation testing for regression test cases, so that we can reduce manual effort and time required for repetitive testing tasks and ensure high-quality releases.
• As SA, I want to refresh my database admin skills and gain relevant access so that I can proactively look into performance issues.
• As a QA team member, I want to build quality into the mindset, culture, and way of working of each SDLC role, so that quality is viewed as a forethought and a priority.
Communicate your QA progress to the right delivery roles

Enhanced communication channels can help overcome the following costs:
• Wasted effort and opportunity – When a change or delivery complication detracts attention from accomplishing committed QA work or leads teams to make assumptions.
• Increased overhead expenses – When a channel is too broad and complex, requiring additional effort to coordinate dependent QA work, or when there is duplicated effort on a single QA task.
• Confusion and delays – Due to the lack of clarity and transparency around responsibility and accountability for dependent QA activities.
• Unmotivated teams – When management does not provide sufficient support to remove QA impediments (e.g. silos) or provide a suitable productive environment.

The way you design and structure your QA process affects how delivery roles complete testing on their own and with each other. Where you invest in communication and reporting channels and the way you manage your relationships with the business and other delivery roles has a significant impact on what QA tasks get done, when they get done, how they get done, and what the business and IT stakeholders think about it.

Strengthen the relationship and communication channels that exist among all of the roles who are directly or indirectly involved in QA. Begin to think about how QA will collaborate with and report to each other and to stakeholders, who will be ultimately accountable for the success of QA, and the degree of each role's involvement and dependency for all types of tests and defect fixes.
4.1.2 Draw your QA communication flow
1-3 hours

Input: QA roles and responsibilities
Output: Communication map
Materials: Whiteboard and markers; QA strategy
Participants: QA team; SDLC team; QA stakeholders and management; IT operations

1. Identify everyone who is directly or indirectly involved with QA. Include those who are:
   a) Informed of QA work progress.
   b) Subject-matter experts of business units or the product under testing.
   c) Impacted by the success of the delivered changes.
   d) Responsible for the removal of impediments of QA roles.
2. Use the QA team as the focal point. Indicate how each role interacts with the others and how frequently these interactions occur for typical QA work. Do this by drawing a diagram on a whiteboard using labeled arrows to indicate types and frequency of interactions.
3. Review the following items:
   a) For each communication medium, define what information will be communicated.
   b) Review the structure of ceremonies and status meetings.
   c) Discuss how dependent teams will be informed of progress and how frequently.
   d) Discuss the reports that will be used.
4. Describe how the various roles communicate with each other for each phase and activity of your QA process.
5. Document a summary of this exercise in the "Communication" section of the QA Strategy Template.

Download the QA Strategy Template
4.1.2 Example

[Communication flow diagram with the QA team as the focal point. Roles shown: End Users, Product Owners, Delivery Team, Release Team, Systems Architect, Operations Manager, and CAB. Labeled flows include: application issue and defect; new feature request; fix progress report; completed progress report; ticket for issue and request; defect progress report and questions; prioritization of scope; completed development work; progress report and quality of product; go-ahead for deployment; prioritized set of changes; nonfunctional requirements; inquiries of system complexity, standards, and compliance; request for test data and environments; test data and environments; and quality of product.]
Research Contributors and Experts

Alan Page
Director of Quality for Services, Unity Technologies
Alan Page has been a software tester for over 25 years and is currently the Director of Quality for Services (and self-proclaimed Community Leader) at Unity Technologies. Previous to Unity, Alan spent 22 years at Microsoft working on projects spanning the company – including a two-year position as Microsoft's Director of Test Excellence. Alan was the lead author of the book How We Test Software at Microsoft and contributed chapters for Beautiful Testing and Experiences of Test Automation: Case Studies of Software Test Automation. His latest e-book (which may or may not be updated soon) is a collection of essays on test automation called The A Word: Under the Covers of Test Automation and is available on Leanpub.

Shannon Gould
Manager, Business Analysis and QA, Mohawk College
Shannon Gould is an authentic leader and trusted advisor with over 19 years of multidisciplinary experience, organizational knowledge, and advocacy to research and right-size best practices to the twenty-first century institutional climate in higher education. The breadth and depth of practical experience and insight gained through private, public, and civil engagement has afforded Shannon the ability to construct knowledge across a spectrum of sectors, enhance expertise, and develop effective leadership and team climate skills.
Research Contributors and Experts

Benjamin Palacio
Information Systems Analyst, County of Placer
Benjamin Palacio is an accomplished Information Systems Analyst with County of Placer, CA. As an integration expert he has worked directly with executive leadership as well as elected officials in all agencies of local government to increase effectiveness between various departmentalized systems. When he is not surrounded by computer screens, he enjoys working outside and playing with his kids. He is currently researching best practices for cloud authentication and authorization to provide integrations with cloud resources and on-premises applications. He can be contacted at [email protected] or on LinkedIn: https://www.linkedin.com/in/bmptech/.

Jack Bowersox Jr.
Software Quality Assurance Supervisor, Mutual Benefit Group
Jack Bowersox Jr. is responsible for Software Quality Assurance at Mutual Benefit Group. He provides strategic leadership for the organization's software quality and requirement gathering efforts. Jack has over 15 years of experience in the software quality and business analyst industry.
Research Contributors and Experts

Shaunna Bossler
CTFL, Chief Quality Officer, Montana Department of Revenue, IT Division
Shaunna Bossler has over 25 years of experience in software testing and test management. She received her certification as a Certified Tester, Foundation Level through the International Software Testing Qualifications Board in 2008. She worked for several years in the private sector as a contractor for a small software development firm as well as the past 18 years for the Montana Department of Revenue. Throughout her career she has established several quality assurance processes to ensure the production systems she supports remain stable when defects are resolved, enhancements are migrated, or upgrades are implemented.
Bibliography

"2023 CSAT Benchmarks by Industry: What's a Good Score?" Fullview, 2023.
"Agile Methodology: The Complete Guide to Understanding Agile Testing." QASymphony, n.d.
Bass, Len, Paul Clements, and Rick Kazman. Software Architecture in Practice. 3rd ed. Pearson Education, 2003.
Benua, Melissa. "Doing continuous testing? Here's why you should use containers." TechBeacon, 30 Aug. 2019.
"Best Practices for Creating Test Scripts." Micro Focus, n.d.
Bose, Shreya. "Code Coverage vs. Test Coverage: A Detailed Guide." BrowserStack, 26 March 2020.
"Chapter 16: Quality Attributes." Microsoft Application Architecture Guide. 2nd ed. Microsoft Developer Network, 2009.
Chatterjee, Shormistha. "What Is QAOps? (with Methodologies)." BrowserStack, 11 Nov. 2022.
"Checking the Data Flow Diagrams for Errors." W3Computing.com, n.d.
Crispin, Lisa. "Agile testing quadrants: Guiding managers and teams in test strategies." TechTarget, Jan. 2012.
Dhanotiya, Neha. "Implementing Quality Assurance in the Software Development Lifecycle." FreshWorks Studio, 22 Dec. 2020.
Fowler, Martin. "Mocks Aren't Stubs." MartinFowler.com, 2 Jan. 2007.
"Future of Quality Assurance." LambdaTest, 2023.
"Glossary: Given – When – Then." Agile Alliance, n.d.
Gunja, Saif. "Shift Left vs Shift-Right: A DevOps Mystery Solved." Dynatrace News, 31 Jan. 2022.
"ISO 25000 Standards: Software and Data Quality." ISO, n.d.
Jung, June. "How to Test Software, Part I: Mocking, Stubbing, and Contract Testing." DZone, n.d.
"Parallel Testing." SmartBear Software, 10 April 2018.
Quick, Lindy. "Acceptance Criteria for User Stories: Examples and Best Practices." KnowledgeHut, 23 Oct. 2023.
"Shift-Left Testing: How to Apply Shift Left Approach to Continuous Testing." Katalon, 19 May 2023.
"Software quality." Wikipedia, 9 July 2018.
"State of Software Quality Report 2023." Katalon, 2023.
"Test Automation vs. Automated Testing: The Difference Matters." Tricentis, 11 Jan. 2017.
"Test Case." Software Testing Fundamentals, n.d.
"Test Case Design Techniques." ProfessionalQA.com, 12 March 2018.
"Test Environment Management Best Practices." Plutora, 23 Nov. 2020.
"Test Script." Software Testing Fundamentals, n.d.
"The Future up Close: World Quality Report 15th Edition 2023-24." Sogeti, 2023.
"The State of Quality Report 2022." Katalon, 2022.
Thomas, Smishad. "The Evolution of QA and Testing in the Digital Transformation Spotlight." eInfochips, 11 Oct. 2022.
"UML 2 Class Diagrams: An Agile Introduction." Agile Modeling, n.d.
"UML 2 Sequence Diagrams: An Agile Introduction." Agile Modeling, n.d.
"UML 2 State Machine Diagrams: An Agile Introduction." Agile Modeling, n.d.
"What Characteristics Make Good Agile Acceptance Criteria?" Segue Technologies, 3 Sept. 2015.
"What Is the Role of QA in DevOps?" TestRail, 26 April 2022.
Wisdom, Elizabeth. "Where Are We Going and Who's Driving? Developing and Designing a Comprehensive QA Roadmap." Chicago Quality Assurance Association, 2017.
"World Quality Report 14th Edition." Sogeti, 2022.
"World Quality Report 13th Edition." Sogeti, 2021.
Appendix A
Test Design Best Practices
Define your acceptance criteria

Functional
Specific user tasks, functions, business processes, or business capabilities that must be in place.

Nonfunctional
Specific nonfunctional conditions the implementation or change must meet, such as security, regulations, and design elements.

Performance
Specific system performance, user acceptance, or productivity or business metrics of a user story, request, or requirement. A threshold should be clearly defined.

"Acceptance criteria are the predefined requirements that must be met, taking all possible scenarios into account, to consider a user story to be finished. In other words, they specify the conditions under which a user story can be said to be 'done'. When all the criteria are met, the team can set the task aside and move on to the next story."
– KnowledgeHut, 2023

Adapted from Segue Technologies, 2015.
A.1 Define your acceptance criteria
1 hour

Input: Quality definition; defect tolerance criteria
Output: Acceptance criteria
Materials: Whiteboard; markers
Participants: Product owners and managers; development team; business analyst; QA and testers

1. Using the Given, When, Then format, write the various criteria to be met for a test case or user story to be deemed complete. Brainstorm the various scenarios, system conditions, and outcomes that must be tested for your products, considering your quality definition.
   a) Given some condition/context, When an action occurs, Then an expected result.
2. Take into consideration the business and technical requirements that ensure a product is high quality and the risk of the defects the test case is designed to validate and verify.
3. Document a summary of this exercise in the "Acceptance Criteria" section of the QA Strategy Template.
A.1 Example

Given: my bank account is in credit, and I made no withdrawals recently
When: I attempt to withdraw an amount less than my card's limit
Then: the withdrawal should complete without errors or warnings
Source: Agile Alliance.

Given: a user is logged in and looking at their profile picture
When: a user clicks on the update profile picture link below their profile picture
Then: the user is prompted to upload a new picture from their computer directory and the selected picture becomes the current profile picture
Source: getLaura, 2014.
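The first scenario above maps directly onto an automated check. A minimal Python sketch follows; the Account class, its balance, and its card limit are hypothetical stand-ins invented for illustration, not part of any real banking API:

```python
# Sketch: encoding a Given/When/Then acceptance criterion as an executable test.
# Account is a made-up stand-in used only to show the mapping from
# acceptance criteria to test code.

class Account:
    def __init__(self, balance, card_limit):
        self.balance = balance
        self.card_limit = card_limit

    def withdraw(self, amount):
        if amount > self.card_limit:
            raise ValueError("amount exceeds card limit")
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount
        return amount

def test_withdrawal_within_limit_completes_without_errors():
    # Given my bank account is in credit, and I made no withdrawals recently
    account = Account(balance=100, card_limit=50)
    # When I attempt to withdraw an amount less than my card's limit
    dispensed = account.withdraw(40)
    # Then the withdrawal should complete without errors or warnings
    assert dispensed == 40
    assert account.balance == 60

test_withdrawal_within_limit_completes_without_errors()
print("scenario passed")
```

Keeping the Given/When/Then phrases as comments inside the test preserves traceability from the acceptance criterion to the code that verifies it.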
Assess the flow of outcomes from input, process, and output perspectives

Each business unit, capability, and process requires a defined set of inputs in order to produce a set of outputs that contribute to your business value metrics. Therefore, each business unit, capability, and process must be tested to ensure they are performing as per your quality standards. Test if the outputs of one function are accepted as inputs into the dependent function.

• Do I have sufficient input information to generate the expected outputs?
[Diagram: Input → Business Unit, Capability, or Process → Output]

• Does the former business unit, capability, or process provide sufficient information to generate the expected outcome of the dependent business unit, capability, or process?
[Diagram: Input → Dependent Business Unit, Capability, or Process → Output]
A.2 Recognize what items need to be tested
1-3 hours

Input: Business units, processes, and capabilities to be tested
Output: Input-process-output assessment
Materials: Whiteboard; markers
Participants: Product owners and managers; development team; business analyst; QA and testers

1. Review the business units, processes, and business capabilities that you will be testing.
2. Using the template below, identify the various inputs and outputs for each business unit, process, and business capability.
   a) Discuss if the inputs provide the business unit, processes, or business capabilities sufficient information to generate the expected outputs.
   b) Discuss if the expected outputs produce sufficient information for the successful completion of the dependent business unit, processes, or business capabilities.
3. Discuss what can be tested to ensure the right inputs are provided and the expected output is generated.
A.2 Example
Assess flow of outcomes from input, process, and output perspectives.

Do I have sufficient input information to generate the expected outputs?
Process: Authenticate User
Input: Username; password
Output: Access to landing page; retry password or username

Does the former process provide sufficient information to generate the expected outcome of the dependent process?
Process: Edit User Profile
Input: Authentication to system; current profile details
Output: Changes to profile details: username, email address, and address
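The chained checks above can be sketched in code. In this minimal illustration, the authenticate and edit_profile functions and the in-memory user store are hypothetical stand-ins for real system calls; the point is the input-process-output handoff, where the test verifies that the output of one process is a sufficient input for its dependent process:

```python
# Sketch: verify that the output of one process (Authenticate User) carries the
# information the dependent process (Edit User Profile) needs as input.
# Both functions and the USERS store are invented for illustration.

USERS = {"jdoe": {"password": "s3cret", "email": "old@example.com"}}

def authenticate(username, password):
    user = USERS.get(username)
    if user is None or user["password"] != password:
        return {"authenticated": False, "retry": True}
    # Output: authenticated session plus current profile details for downstream steps
    return {"authenticated": True, "username": username, "profile": dict(user)}

def edit_profile(auth_result, new_email):
    # Input check: the dependent process requires authentication and profile details
    assert auth_result["authenticated"], "edit_profile requires an authenticated session"
    assert "profile" in auth_result, "edit_profile requires current profile details"
    USERS[auth_result["username"]]["email"] = new_email
    return USERS[auth_result["username"]]

auth = authenticate("jdoe", "s3cret")
updated = edit_profile(auth, "new@example.com")
print(updated["email"])  # prints new@example.com
```

If authenticate stopped returning the profile details, the assertion inside edit_profile would fail, surfacing the broken handoff exactly where the input-process-output assessment says to look.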
Understand the models to identify test scenarios and define your test cases

Significant time can be wasted by diving too far into the development and testing of products without a thorough understanding of what needs to be tested in the first place. Model-based testing reinforces the idea that QA should be involved early in the discovery, exploration, and design stage of delivery. Models enforce the criticality of testability in the product's design and the creation of QA plans and test cases through system abstractions. Test against multiple models to ensure sufficient test coverage. See Appendix B for enhanced images of the examples below.

Data Flow
• Graphical representation of the flow of data through the system or product, indicating how the data is inputted, processed, transformed, and consumed by users or other components.
Source: W3Computing.com

Business Process Flow
• Graphical representation of the activities and tasks to produce a product or provide a service for a particular consumer.
• Processes include those that govern the operations of the business unit, core business capabilities that directly create value streams, and those that support core capabilities.
Understand the models to identify test scenarios
and define your test cases (cont’d.)
Sequence Diagram
• Visually models the flow of logic within your system or product as part
of usage scenarios (how the product is used) or the execution of system
methods and services.
Source: “UML 2 Sequence Diagrams: An Agile Introduction,” Agile Modeling

State Diagram
• Describes the behavior or states a system or product can have as well
as the transitions between those states.
Source: “UML 2 State Machine Diagrams: An Agile Introduction,” Agile Modeling

Class Diagram
• Visually shows the classes of the system, their interrelationships
(including inheritance, aggregation, and association), and the
operations and attributes of the classes.
Source: “UML 2 Class Diagrams: An Agile Introduction,” Agile Modeling

Wireframes and Storyboards
• Visually shows customer journeys of the system or product using
sketches of user interfaces (e.g. wireframes and storyboards).
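To illustrate how such a model can drive test design, a state model can be turned into one test case per transition. Below is a minimal sketch in Python, assuming a hypothetical login state machine; the states, events, and the next_state helper are illustrative, not from this document:

```python
# A sketch of deriving test cases from a state model: each transition in
# the model becomes one test case. The machine below is hypothetical.

TRANSITIONS = {
    ("LoggedOut", "login_ok"): "LoggedIn",
    ("LoggedOut", "login_fail"): "LoggedOut",
    ("LoggedIn", "logout"): "LoggedOut",
    ("LoggedIn", "timeout"): "LoggedOut",
}

def next_state(state, event):
    """Return the next state, or raise if the transition is invalid."""
    try:
        return TRANSITIONS[(state, event)]
    except KeyError:
        raise ValueError(f"invalid transition: {event} from {state}")

# Each (start, event, expected) triple is one test case covering a transition.
test_cases = [(s, e, t) for (s, e), t in TRANSITIONS.items()]

for start, event, expected in test_cases:
    assert next_state(start, event) == expected
```

Enumerating transitions this way makes coverage visible: a missing pair in the table is a missing test case.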



Sidebar: Other test design models
Specification-based (black box) techniques

• Equivalence partitioning: In this method, test input data is divided or partitioned into classes of
values that the system is expected to handle in the same way; one representative value from each class
or partition is then used to derive and design test cases. This helps to significantly reduce the
number of test cases.
• Boundary value analysis: This important specification-based testing technique is used to explore
errors in the software product at the extreme ends of the input domain (i.e. at boundaries) and is used
accordingly to derive and design test cases.
• Decision tables: A systematic technique of designing test cases, decision tables use different
combinations of inputs and their corresponding outputs based on variations of conditions and scenarios
adhering to different business rules.
• Use-case testing: This technique is used to identify test cases covering end-to-end software product
evaluation. The test cases are designed to execute business scenarios and user-end functionalities. With
the assistance of use-case testing, one can easily identify test cases that cover the entire system, on a
transaction-by-transaction basis, from the start of the testing to its end.
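A minimal sketch of equivalence partitioning and boundary value analysis, assuming a hypothetical eligibility rule that accepts ages 18 through 65; the function and the range are illustrative, not from this document:

```python
def is_eligible(age):
    """Hypothetical rule under test: accept ages 18 through 65 inclusive."""
    return 18 <= age <= 65

# Boundary value analysis: test at, just inside, and just outside each boundary.
boundary_cases = {
    17: False,  # just below lower boundary
    18: True,   # lower boundary
    19: True,   # just above lower boundary
    64: True,   # just below upper boundary
    65: True,   # upper boundary
    66: False,  # just above upper boundary
}

# Equivalence partitioning: one representative value per partition is enough.
partition_cases = {10: False, 40: True, 90: False}

for age, expected in {**boundary_cases, **partition_cases}.items():
    assert is_eligible(age) == expected, f"age {age}"
```

Three partitions and six boundary values replace what could otherwise be dozens of arbitrary inputs.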



Sidebar: Other test design models
Structure-based (open-box) techniques

• Statement testing and coverage: This is the weakest criteria and least preferred metric for checking
test coverage. Here, test scripts are designed to execute code statements. The main purpose of this
technique is to calculate the percentage of executable statements that are exercised by the test suite.
• Decision testing coverage: Also known as branch testing, decision testing coverage is where the test
design techniques exercise the percentage of the outcome of the decisions. Here, the test coverage is
measured by the percentage of decision points, which are executed out of the total decision points in
the application.
• Condition testing: This type of structure-based technique aims for 100% condition coverage. Here,
test cases are designed so that each outcome (true and false) of every condition in the code is
exercised at least once.
• Multiple condition testing: Here, the main focus is on testing different combinations of condition
outcomes to get 100% coverage. To ensure this, two or more test scripts are required, which becomes a
bit exhaustive and difficult to manage.
• All path testing: All path testing is the strongest structure-based test case design technique. It
involves using the source code of a program to find every executable path to help determine all the
faults within a particular piece of code.
Source: ProfessionalQA.com.
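As a minimal sketch of decision (branch) testing, the hypothetical function below contains a single decision point, so two test cases, one per outcome, reach 100% decision coverage; the shipping rule is an assumption, not from this document:

```python
def shipping_fee(order_total):
    """Hypothetical rule: orders of $50 or more ship free."""
    if order_total >= 50:   # the single decision point in this function
        return 0.0
    return 5.99

# Decision coverage: one test per branch outcome covers 100% of decisions.
assert shipping_fee(75.00) == 0.0   # 'true' branch
assert shipping_fee(20.00) == 5.99  # 'false' branch
```

Statement coverage alone could be satisfied here by a single input; decision coverage forces both outcomes to be exercised.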
Sidebar: Other test design models
Experience-based techniques

• Exploratory testing: Usually conducted by business analysts and other business experts, exploratory
testing is used to test applications without any formal documentation of test cases, test conditions, or
test scripts. This is a hands-on testing approach, wherein testers are involved in minimum planning and
maximum test execution. In exploratory testing, test design and test execution are performed
simultaneously.
• Error guessing: A widely used technique, error guessing is highly dependent on the skills, intuition,
and experience of the testers. Here, testers have the freedom to anticipate the errors based on their
experience, availability of defect data, and their knowledge of product failure.
• Session-based testing: Session-based testing builds on exploratory testing by providing more
structure without taking away from the benefits that exploratory testing provides, such as the ability to
better mimic the user experience and get creative with testing. Testing is conducted during time-boxed,
uninterrupted sessions, testing against a charter and requiring testers to report on the testing that took
place during each session.
Sources: ProfessionalQA.com; QASymphony.



Adopt good test design tactics

Don’t just focus on the sunny day scenarios – look at the entire
plausible range of situations.

• Edge Case: Situations occurring with a maximum or minimum operating
parameter.
• Boundary Case: Situations occurring when one of its parameters is at
or just beyond the maximum or minimum limits.
• Corner Case: Situations occurring outside a normal set of operating
parameters despite each parameter being within the specified operating
range. Multiple variables are simultaneously at the extremes.

Strategically design your test cases to support parallelization.
Parallel testing means testing multiple applications or subcomponents of
one application concurrently to reduce the test time (SmartBear, 2018).
• Parallel testing requires the right tooling and test case designs to
justify its investment:
• Test suites can be performed in isolation with no interactions or
integrations with other tests (i.e. conditions outlined in test cases
and scripts are mutually exclusive).
• Test data and environments are created and managed in isolation with
no sharing of resources.

Best uses of parallel testing:
• Testing across multiple browsers simultaneously
• Testing compatibility
• Assessing versions
• Testing localization and internationalization
• Reviewing language and validating context
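A minimal sketch of these isolation requirements using Python's standard library; the three tests are illustrative placeholders that each own their own data and share no resources:

```python
# A sketch of parallel test execution, assuming mutually exclusive tests:
# each test below creates and uses only its own data (no shared state).
from concurrent.futures import ThreadPoolExecutor

def test_login_form():
    data = {"user": "a", "password": "b"}  # isolated test data
    return ("test_login_form", len(data) == 2)

def test_profile_update():
    profile = {"email": "x@example.com"}   # isolated test data
    return ("test_profile_update", "email" in profile)

def test_search():
    results = ["item1", "item2"]           # isolated test data
    return ("test_search", len(results) > 0)

tests = [test_login_form, test_profile_update, test_search]

# Because the tests share no resources, they can safely run concurrently.
with ThreadPoolExecutor(max_workers=3) as pool:
    results = dict(pool.map(lambda t: t(), tests))

assert all(results.values())
```

In practice this role is usually filled by a test runner or grid tool rather than a hand-rolled executor, but the design constraint is the same: any hidden shared resource breaks safe parallelization.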
Script your test cases against your test case
conditions
What is a test script?
A test script is a set of instructions that is performed on a system under test to verify that the
system performs as expected (Software Testing Fundamentals). Test scripts that are used in tools are
written in scripting or programming languages (such as JavaScript, Python, VBScript) against specific
frameworks. Scripts are typically written in human language for tests run manually, particularly user
acceptance testing (UAT).

Adopt Scripting Good Practices


Using test scripts is the most detailed way to document testing. Given the significant investment needed to write them,
being strategic and efficient in the scripts’ creation is critical.
• Add verification to your test scripts to test the correctness of your application under test as well as functional flow.
• When possible, avoid writing complex test scripts; instead, write multiple simple test scripts that are combinable.
• Describe your test scripts and cases as thoroughly as possible in a comment. Without a good description of the test
case in natural language, someone who needs to change the implementing code might not be able to understand what
exactly the test is doing.
• Continuously maintain and reuse your test scripts and cases to avoid redundancies by updating existing test scripts
and cases instead of adding new ones.
Source: Micro Focus.
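A minimal sketch of these practices in Python, using the login scenario from the A.3 example later in this deck; the open_site/submit_login helpers and the hard-coded credential check are illustrative stand-ins for real framework calls:

```python
# A sketch of small, combinable test-script steps with verification.
# The helper names and the credential rule are illustrative only.

def open_site(session, url):
    """Step 1: navigate to the site under test."""
    session["page"] = url
    return session

def submit_login(session, user_id, password):
    """Steps 2-4: enter credentials and submit."""
    # Hypothetical rule standing in for the real authentication call.
    session["logged_in"] = (user_id == "MSmith" and password == "pass99")
    return session

# Test case TC01: check customer login with valid data.
session = submit_login(open_site({}, "https://example.com"), "MSmith", "pass99")
assert session["logged_in"], "valid credentials should log the user in"

# Test case TC02: check customer login with invalid data.
session = submit_login(open_site({}, "https://example.com"), "MSmith", "glass99")
assert not session["logged_in"], "invalid credentials should be rejected"
```

Because each step is its own small function with a descriptive comment, the two test cases reuse the same steps instead of duplicating a monolithic script.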



A.3 Model your tests

2 hours

Input:
• Items to be tested
• Acceptance criteria
• Input-process-output assessment

Output:
• List of test cases

1. Review the various test design models and discuss how they are
applicable to your business units, processes, and business capabilities
that you will be testing.
2. Begin modeling your tests using your acceptance criteria, input-
process-output diagrams, and other upstream documentation.
3. List the test cases and revise your acceptance criteria if needed.
Group your test cases into suites (or themes) if they collectively are
critical to product success.

Materials:
• Whiteboard
• Markers

Participants:
• Product owners and managers
• Development team
• Business analyst
• QA and testers


A.3 Example
Test Suite: Log onto website

Test Case ID: TC01
Test Scenario: Check customer login with valid data
Test Script: 1. Go to website; 2. Enter valid user ID; 3. Enter valid
password; 4. Click “Submit”
Test Data: User ID=MSmith; Password=pass99
Expected Results: User should log in to web application
Acceptance Criteria: Tests part of larger suite

Test Case ID: TC02
Test Scenario: Check customer login with invalid data
Test Script: 1. Go to website; 2. Enter invalid user ID; 3. Enter valid
password; 4. Click “Submit”
Test Data: User ID=MSmith; Password=glass99
Expected Results: User should not log in to web application

Test Case ID: TC03
Test Scenario: User forgets customer login information
Test Script: 1. Go to website; 2. Click “Forgot User ID or Password”;
3. Send confirmation email
Test Data: User email address=msmith@mysite.com
Expected Results: User should not log in to application and a
confirmation email should be sent to the user. Screen instructs user to
view email to reset user ID and password.

Acceptance criteria example (Gherkin):
Given a user is logged in and looking at their profile,
When the user clicks on the “update profile picture” link below their
profile picture,
Then the user is prompted to upload a new picture from their computer
directory and the selected picture becomes the current profile picture.
A.3 Example
[Architecture diagram: use cases (Log Into System, Find a Group Number,
Explanation of Benefits, Electronic Eligibility Payment, Deliver Profile)
mapped onto system components, including a web browser, load balancer,
new and existing web services, a MEAP, an online dependency project
repository, an ETL tool with a daily Monday-Friday batch (5 days/week),
an Oracle database, data mining and warehousing tools, data center
routing, a security management application, an ETL queue, a payment
gateway, and PayPal.]


A.3 Example
Sent To Belongs To
Organization

Address
Campaign Customer
Name
FirstName FirstName
Attribute

Phone
LastName LastName
s

City
Description Role
Zip
Timeline Organization
Get customer’s
Create campaign Revenue
Entity

organization
Usag
e of

Deliver campaigns on details


Type Submits
multiple channels Orders
Target specific
customers Add customer
Invoice
Deliver information
to customer
Contains Items
Channels Date
Description Add customer
Availability order

Info-Tech Research Group | 133


Narrow the scope of your testing activities with
a risk-based approach
It is no surprise that the lack of time to test is one of the top challenges facing organizations today. On the way to
achieving quality at speed, there are two major barriers: frequent requirement changes (reported by 46% of companies)
and the lack of time (according to 39% of companies). In reality, there will never be enough time to test everything, nor
is it productive to do so. Use your acceptance criteria, quality definition, and defect tolerances to define a risk-based
testing approach so that you focus on the tests that mean the most to your business and IT stakeholders.
Source: State of Software Quality
Report, 2023.
Test Case Prioritization Methodology
[Diagram: (1) the Severity, Risk, and Impact of Issues If the Test Case
Is Not Completed (from acceptance criteria), expressed as a number (#)
using the (4) Severity, Risk, and Impact Levels, is multiplied by the
(2) Probability of the Issue Occurring (%) to produce the (3) Risk Score
(Higher Score, Higher Test Priority).]
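The scoring step can be sketched in a few lines; the risk levels and probabilities mirror the A.4 example that follows, and the function name is illustrative:

```python
# A sketch of the prioritization score: risk level multiplied by the
# probability of the issue occurring if the test case is not completed.

def prioritization_score(risk_level, probability):
    """risk_level: e.g. 4=high risk, 1=low/no risk; probability: 0.0-1.0."""
    return risk_level * probability

# Values mirror the A.4 example test cases.
test_cases = {
    "TC01": (4, 0.50),
    "TC02": (4, 0.50),
    "TC03": (2, 0.75),
}

# Higher score means higher test priority, so sort descending.
ranked = sorted(test_cases,
                key=lambda tc: prioritization_score(*test_cases[tc]),
                reverse=True)

assert prioritization_score(4, 0.50) == 2.0
assert prioritization_score(2, 0.75) == 1.5
```

The ranked list is what feeds the test schedule: when time runs out, the lowest-scoring cases are the ones deferred.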
A.4 Prioritize your tests

1 hour

Input:
• Items to be tested
• Acceptance criteria
• Input-process-output assessment

Output:
• List of test cases

1. Review your defect tolerance and acceptance criteria as insights
behind your risk definition and risk and severity levels (e.g. 4=high
risk and 1=low or no risk, or you can use monetary values).
2. Discuss how business and technical risks will factor into your
overall risk level.
3. State the probability of the issue occurring if the test is not
successfully completed.
4. Calculate your prioritization score by multiplying the risk level and
the probability of the issue occurring.
5. Prioritize your test cases in order to build your test schedule.

Materials:
• Whiteboard
• Markers

Participants:
• Product owners and managers
• Development team
• Business analyst
• QA and testers


A.4 Example

Test IDs: TC01 and TC02
Business Risk: Security and privacy risks; user dissatisfied with
accessibility; lost revenue with lack of access
Technical Risk: Unauthorized access; active directory is compromised
Risk Level: 4
Probability of Issue Occurring: 50%
Prioritization Score: 2

Test ID: TC03
Business Risk: User will not get or be able to submit time-sensitive
information; reset instructions are not accurate
Technical Risk: Email sent to wrong user; delay in email sent
Risk Level: 2
Probability of Issue Occurring: 75%
Prioritization Score: 1.5


Appendix B
Examples of Models



Example: Data Flow Diagram

Source: W3Computing.com.



Example: Business Process Flow

[Flowchart: Create a Campaign → Get Finance Approval → decision:
Above/Below $5,000 Budget? If below budget, Get Email List and Load
Emails; if above budget, Get Additional Funds first. The flow ends with
Launch Campaign on Social Media Channel.]


Example: Sequence Diagram

Source: “UML 2 Sequence Diagrams: An Agile Introduction,” Agile Modeling.



Example: State Diagram

Source: “UML 2 State Machine Diagrams: An Agile Introduction,” Agile Modeling.



Example: Class Diagram

Source: “UML 2 Class Diagrams: An Agile Introduction,” Agile Modeling.



Example: Wireframes and Storyboards

[Storyboard: an Advanced Search screen with filters (Location, Services,
Specialty, Doctor Name, Language, Gender, Accessibility, Distance) and
default settings leads, when more than one result is found, to a
system-populated Search Results list (name, address, proximity, phone
number; results 1-15 with a “Show More” option). A single search result,
or a selection from the list, leads to a system-populated Doctor Details
screen (name, address, phone number, services, specialties, language,
gender, accessibility) with a map, Get Directions, and Refer actions.]
Appendix C
Test Data and Environment Management Good Practices



Adopt effective test data management practices
Test data is a cross-functional asset.

• It involves collaboration among operations, security, developers, testers, and database analysts (DBAs) to provision,
subset, mask, and manage test data so it meets your organization’s data quality and management standards (irrespective
of where the data comes from and who uses it). The challenge is implementing just enough oversight and discipline, so
teams are neither impeded nor disempowered to pull and manipulate data as they see fit. Even though test data
management (TDM) may not require the same degree of rigor and control as formal data management
practices, some of its key principles can be leveraged to ensure proper data quality, ownership, and approvals are
followed. For example, test data owners must keep active tabs on the value and relevance of their data (i.e. the test data
lifecycle) to determine when test data sets should be tweaked, refreshed, or retired. Refer to Info-Tech’s
Create a Data Management Roadmap blueprint for more details.
• Test data can quickly become stale and irrelevant, since much can be learned and changed as your system is developed,
tested, and handled in production. While it’s ideal to refresh test data after any change, test data should be refreshed
at least after every major release or after a defined number of changes have been made to the system under
test or test configurations. This frequency is ultimately dependent on the effort to refresh the data balanced against the
value of production-like data as defined by your functional and nonfunctional requirements.
Test data needs to be protected from unauthorized access.
• Most TDM vendors abide by the General Data Protection Regulation (GDPR) and personal identity information protection
standards, such as Personal Information Protection and Electronic Documents Act (PIPEDA). However, not every vendor
goes about data security and privacy the same way (e.g. dynamic masking through an API or using GUI to mask data in
CSV files), nor abides by all industry-specific compliance requirements, such as the Health Insurance Portability and
Accountability Act (HIPAA), Gramm-Leach-Bliley Act (GLBA), the Safe Harbor Privacy Principles, and Payment Card Industry
Data Security Standard (PCI DSS).
Build a test data self-service model.

• Test data must be ready to be used when it is needed, whether testing is proactively scheduled or done on demand as part
of your continuous delivery pipeline. Ensure your test data is reset automatically as part of your test run if it’s determined
it will be used again. Otherwise, collaborate with your data stewards to ensure your repository contains the appropriate
test data.
Adopt effective test data management practices (continued)
Mask and obfuscate your test data with a plan.

• Company standards and industry regulations may limit the exposure of production data to
non-production users, including testers. The challenge here is determining how much of
the data should be altered to comply with these constraints while maintaining
the correct level of fidelity. Working with your DBAs, security team, and other
stakeholders, develop the appropriate data-masking profiles using frameworks and
transformation tactics (e.g. conditional masking and compound masking). Support your
data-masking profiles with applications (e.g. Oracle) and toolsets and tailor them against
regulations (e.g. GDPR) and determine if masking should be performed
statically or dynamically. Refer to Provision and Mask Your Test Data With the Right Tool for
more information on the tools in the test data management (TDM) space.
Develop a synthetic data generation practice.
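As a minimal sketch of the masking tactics above (the record fields, hashing rule, and compound-masking rule are assumptions, not the behavior of any particular TDM tool), static masking of a customer record might look like:

```python
# A sketch of static data masking. The record fields and masking rules
# below are hypothetical, not taken from any specific TDM product.
import hashlib

def mask_record(record):
    """Return a masked copy: hash identifiers, redact payment data,
    and preserve non-sensitive fields needed for realistic tests."""
    masked = dict(record)
    # Deterministic hashing keeps referential integrity across tables.
    masked["email"] = (hashlib.sha256(record["email"].encode())
                       .hexdigest()[:12] + "@test.invalid")
    # Compound masking: redact all but the last four digits of the card.
    masked["card"] = "**** **** **** " + record["card"][-4:]
    return masked

prod_row = {"name": "M. Smith", "email": "msmith@mysite.com",
            "card": "4111 1111 1111 1234", "city": "Toronto"}
test_row = mask_record(prod_row)

assert test_row["card"] == "**** **** **** 1234"
assert test_row["city"] == "Toronto"          # non-sensitive field kept
assert test_row["email"] != prod_row["email"]
```

The design choice to hash rather than randomize identifiers is what lets masked data remain joinable across tables, one of the fidelity concerns the paragraph above raises.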

• Production data may not be ready or the right fit for testing, or the cost to prepare and
manage it outweighs its testing value. Teams can address this by generating synthetic
test data sets that are algorithmically generated repositories. These resemble
(but do not contain) or are based on real, existing information. Noise and other
discrepancies can be artificially interjected in a controlled manner to stress your
applications. The accuracy of your synthetic data may outweigh the need for production
data; however, this requires significant investment in and maturity of your data analytics
practice.
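The bullet above can be sketched with Python's standard library: a seeded generator produces records that resemble, but do not contain, real information. The customer schema and the noise rule are assumptions for illustration:

```python
# A sketch of synthetic test data generation: records resemble production
# data but contain no real information. Schema and values are hypothetical.
import random

random.seed(42)  # reproducible data sets across test runs

FIRST = ["Ana", "Ben", "Chen", "Dara"]
CITIES = ["Toronto", "London", "Austin"]

def synth_customer(i):
    """Generate one synthetic customer record."""
    name = random.choice(FIRST)
    city = random.choice(CITIES)
    # Controlled noise: ~10% of records get a blank city to stress
    # validation and error-handling paths in the application.
    if random.random() < 0.1:
        city = ""
    return {"id": i, "name": name,
            "email": f"{name.lower()}{i}@test.invalid",  # non-real domain
            "city": city}

customers = [synth_customer(i) for i in range(100)]

assert len(customers) == 100
assert all(c["email"].endswith("@test.invalid") for c in customers)
```

Seeding the generator is the key operational choice: a failing test can be re-run against the identical data set.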
Adopt effective test environment management practices
Apply a lightweight environment management model.

• Budget, resourcing, time, security, and other IT and business risks often limit a tester’s ability to test in high-fidelity
environments. Testers must then use, manage, and monitor contained, isolated environments (e.g. containers) that are
often provided by your operations teams with the appropriate controls, mocks, stubs, and other alterations in place. Even
though test environments do not require a heavyweight management model, adopting some formal management
practices is often enough to see noticeable benefits. Key practices, done in collaboration with
operations, include the following (Plutora, 2020): package and publish test environments for easy sharing through self-service;
and track, realign, and tune preconfigured assets to keep the test environment aligned with production updates,
minimizing variations and misrepresentations. Refreshes should occur whenever a change may influence the functionality
and operations of the application.
Mock and stub your environments with a plan.

• Tests should only fail when there is a failure in the code or the components being tested, not the environment. Testers
collaborate with operations to decide which third-party code and external system integrations will be
mocked and stubbed in the test environment so tests can be executed free from dependencies. Mocking involves
creating a service or object with similar properties as real ones. Stubbing only simulates the behavior and function of a real
service or object. Live integrations can replace your mocked and stubbed objects and dependencies once the appropriate
system integration and regression tests are complete.
• While mocks and stubs can help tests run more quickly and reliably, the way they are used and the reason why they
would be used can significantly influence the tests that will be using them. Design an approach for when and
how mocks and stubs will be used, when they will be replaced with real system components, who should be creating them,
and the framework on their design and implementation. See June Jung’s
How to Test Software, Part I: Mocking, Stubbing, and Contract Testing from DZone and Mocks Aren’t Stubs by Martin Fowler
for more information.
Use containers when they make sense.
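The mock/stub distinction above can be sketched with Python's standard-library unittest.mock; the payment gateway and the charge_order function are hypothetical dependencies, not from this document:

```python
# A sketch of stubbing an external dependency so a test can run without it.
# PaymentGateway behavior and charge_order are hypothetical names.
from unittest.mock import Mock

def charge_order(gateway, amount):
    """Code under test: delegates the charge to an external gateway."""
    receipt = gateway.charge(amount)
    return receipt["status"] == "ok"

# Stub: simulates only the behavior the test needs from the real service.
stub_gateway = Mock()
stub_gateway.charge.return_value = {"status": "ok"}

assert charge_order(stub_gateway, 25.00)

# Mock-style verification: assert how the dependency was used.
stub_gateway.charge.assert_called_once_with(25.00)
```

The last line is what makes this a mock rather than a plain stub: the test verifies the interaction with the dependency, not just the return value. Once integration tests are ready, the Mock object is replaced with the live gateway client.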
• “A container is a collection of code, a configuration, and runtime dependencies all bundled together with an execution
engine on a virtual machine” (TechBeacon). The ease and quickness to provision, update, secure, and manage containers
makes them an attractive test environment option for developers to test early and can satisfy the role-delineation,
infrastructure, and security concerns of operations. Despite the benefits, containers may not provide the
environmental conditions or flexibility that some tests require, so virtual machines and other traditional
environment technologies may still be necessary. See Info-Tech’s Containers Survival Guide for Infrastructure for more
information.
