Build a Software Quality Assurance Program: Phases 1-4
Build a robust strategy and ensure quality is
at the core of everything you do.
QA performed early and throughout the solution delivery lifecycle (SDLC) improves
the accuracy and effectiveness of downstream tests and reduces the cost of fixing
defects late in delivery. QA activities should be embedded in everyone's job description.
QA is a shared responsibility. Your test plans and test cases are not static documents
nor built in a single event. They are continuously updated and improved through
feedback during the solution delivery process in collaboration with developers and
other key stakeholders.
Start small to evaluate the fit and acceptance of new and modified roles, processes,
and technologies. Aggressive initiatives and timelines can jeopardize success in big-bang
deployments. Gradual and thoughtful adoption of QA ways of working helps your teams
focus on practice fit rather than fighting the status quo. This approach must involve
change-tolerant teams, solutions, and cooperative stakeholders.
Info-Tech Research Group | 5
The value of QA stems from the assurance of sustainable and valuable solution delivery

What is quality and QA?
• Solution quality is the degree to which a system, feature, component, or process meets specified customer needs, customer expectations, and nonfunctional requirements. QA is a program of tasks and activities to ensure software quality priorities, standards, and policies are met throughout the SDLC.
• Do not expect a universal definition of quality. Everyone will have a different understanding of what quality is and will structure people, processes, and technologies according to that interpretation.

What is the core of QA?
• Verification is evaluating work items to determine if they meet specified business and technical requirements. Is the solution built right?
• Validation is evaluating the solution during and at the end of delivery to determine if it satisfies specified business and technical requirements. Is the right solution built?

How is QA perceived in the organization?
• Efficient and effective QA practices are vital because solutions need to readily adjust to constantly evolving business priorities and technologies without risking system stability or breaking business standards and expectations.
• However, investments in QA are often afterthoughts. QA is often viewed as a lower priority compared to other SDLC capabilities (e.g. design and coding) and is typically the first item cut.
Maximize the value you expect to gain from QA
Your QA program is not just about keeping pace with changes. QA is about setting a standard of
software quality excellence aligned to stakeholder expectations and priorities while anticipating
future challenges and opportunities.

• Improved customer satisfaction: Solution issues are identified and addressed before they can negatively impact the customer. Preventative measures can then be implemented to maintain a consistent experience.
• Enhanced security: QA enforces that the right protocols and tactics are employed during solution delivery. This standard is aligned to the organization's security risk tolerance, latest trends, and industry regulations.
• Business continuity: Good QA increases stakeholder confidence that the solution can reliably operate in sunny-day and rainy-day scenarios and meet defined service level agreements.
• Increased resource utilization: QA practices streamline the SDLC process by reducing the time spent fixing issues late in the lifecycle, ramping up resources unfamiliar with the solution, and paying down technical debt.
QA remains a challenge for many organizations
These challenges highlight critical gaps in our current approach, showcasing the necessity for a shift toward more integrated
and automation-driven QA processes. This focus ensures that QA ultimately drives your competitive advantage.
[Survey chart omitted: percentage of respondents citing each QA challenge.]
On-time, on-budget solutions do not indicate successful delivery practices. But the reception and value of software products do not justify the money invested.

Your solution delivery lifecycle should embrace quality at its core
• Definition of Done
• Product/Enterprise Alignment
• Voice of the Customer
• Voice of the Teammate
See our Evolve Your Software Development Lifecycle Into a Solution Delivery Lifecycle blueprint for more information.
Explore the trends in the QA marketplace

1. AI- and ML-Embedded QA Tools
Embedding AI and ML in QA tools has increased the efficiency, scope, and accuracy of test design and execution, enabling them to:
• Create test cases from functional requirements and test scripts without code.
• Analyze past testing activities to predict potential issues and defects of current delivery efforts and suggest root causes and solutions.
• Provision accurate and realistic synthetic test data using production data to train ML models.
See Leverage Gen AI to Improve Your Test Automation Strategy and Adopt Generative AI in Solution Delivery for more information.

2. Autonomous Testing
Autonomous testing involves tests and other QA activities being created, executed, and managed through intelligent algorithms and automation without the need for human intervention. This capability enables the:
• Configuration of QA activities to new requirements, testing scenarios, and observations of solution delivery activities.
• Self-healing of test scenarios and scripts when issues occur.
• Immediate feedback during any phase of the SDLC on potential risks or conflicts with quality standards and industry frameworks and regulations.
See Tech Trends 2024 for more information on the autonomous back office.

3. Scriptless Automated Testing
Low- and no-code capabilities reduce or remove the technical skills traditionally needed to create, script, execute, and manage automated tests. This capability motivates the shifting of solution quality accountability earlier in the SDLC and enables the discovery of risks and defects before they cause negative impact and become expensive to fix. See Satisfy Digital End Users With Low- and No-Code for more information.

4. End-to-End Testing
The configuration and orchestration of automated tests to evaluate the functionality of the solution from start to finish under real user scenarios. End-to-end testing looks at:
• Testing to ensure specific software layers or components work consistently and reliably across other parts of the software and system.
• Testing to ensure specific functions work smoothly across the technical stack (from the user interface down to the infrastructure).
This approach can involve the use of automated testing solutions alongside nontraditional testing solutions like robotic process automation. See Build a Winning Business Process Automation Playbook and Enhance Your Solution Architecture Practices to identify the impacts of your changes to your systems and users.

5. QAOps
QAOps involves embedding QA procedures, reporting, and technologies into the SDLC pipeline (BrowserStack, 2022). The goals are to guarantee high software quality consistently across teams and to operationalize QA-optimized CI/CD processes for broader organizational adoption. QAOps shares many of the principles, behaviors, and best practices of DevOps and Agile methodologies. See Implement DevOps Practices That Work for more information.

6. QA Tool Ecosystem
Teams' preferences for specific QA tools and technologies have been continually shifting away from the siloed, monolithic tooling and vendor stacks that the industry standardized on in the past. This demand pushed many vendors to position their solutions to build and strengthen relationships with third parties and deliver out-of-the-box plugins and customizable APIs. See Applications Priorities 2024 for more information on multisource ecosystems.
Extend the QA mindset beyond testing

Shift QA left and right
An emerging trend in QA is the adoption of shift-left and shift-right testing. Shift-left testing is a software testing approach that places strong emphasis on conducting testing activities earlier in the development process, shifting all testing activities to earlier development stages rather than leaving them until the very final stages (Katalon, 2023).
On the other hand, shift-right testing implies extending testing activities beyond the traditional development and release phases. This involves performing testing activities in the production environment or closer to the end users after the software has been deployed.
QA involves testing across the SDLC: testing new requirements, testing new code, testing every build, testing on deployment, and testing in production.

Bridge your silos with DevOps
DevOps purposefully blurs the lines between these responsibilities, forcing collaboration. The developers start building the mindset of continually checking for errors in their code. The testers increase their responsibilities from validating the application to ensuring it is deployable at all times. They may even fix code as needed. All these pieces work together to ensure rapid delivery of features. The focus on the customer drives the work of the entire team.
Integrating QA into DevOps:
• Realign team structure
• Automate as much as possible
• Use metrics to track progress
• Run tests in parallel
• Have a common set of processes and tools
• Continuous feedback
• Increasing visibility
• Sufficient training
(Source: TestRail, 2022.)
See Info-Tech's Implement DevOps Practices That Work blueprint for more information.
Info-Tech's methodology for building a software QA program

Phase 1. Assess Your QA Process
1.1 List your QA objectives and metrics
1.2 Analyze your current QA state

Phase 2. Align on Improved QA Practices
2.1 Define your QA guiding principles
2.2 Define your foundational QA process

Phase 3. Build Your QA Toolbox
3.1 Define your defect tolerance
3.2 Align on your QA activities and tools

Phase 4. Establish a QA Roadmap
4.1 Build your QA roadmap

QA Strategy Template: A template to help you document a comprehensive description of the QA practices for your organization. It presents several activities required to validate and verify software solutions.
QA Current-State Assessment Tool: Assess the current state of QA in your organization at the team and organizational level.
• SDLC functions motivated to collaborate, learn from each other, and conform to industry and regulator policies and standards.
Measure the value of this blueprint

Outcome: Improved software quality by reducing the number of defects
Project metric: Select and Use SDLC Metrics Effectively
Impact: 25% reduction per quarter/year

Outcome: Increased solution delivery throughput
Project metric: Select and Use SDLC Metrics Effectively
Impact: 20% increase in throughput after nine months; sustainable velocity after one year

Outcome: Reduction of rework due to defects found during the solution delivery process
Project metric: Select and Use SDLC Metrics Effectively
Impact: 50% reduction in rework due to compliance after one year; 90% reduction after two years

Outcome: Increased application and end-user satisfaction
Project metric: End User Satisfaction Diagnostic
Impact: 10% increase in satisfaction in one year

Outcome: Increased IT satisfaction
Project metric: CIO Business Vision Diagnostic
Impact: 10% increase in satisfaction in one year
Executive Brief Case Study
INDUSTRY: Government
SOURCE: Info-Tech Research Group Workshop

Government Agency
A government agency worked with Info-Tech to develop a strategy to mature and scale their QA practice. The QA team identified several key QA objectives that they want to achieve through their practice:
• Ensure software products meet business, functional, and nonfunctional (including security, performance, integration, and regression) requirements.
• Build a disciplined and formal QA practice.
• Increase customer and stakeholder confidence, trust, and respect.
However, they recognized key challenges standing in their way, such as:
• Low QA resource capacity.
• Low availability of business subject-matter experts.
• Lack of automated testing and test automation tools.
• Very tight project timelines resulting in the cutting of QA activities.

Results
By conducting the workshop, the organization was able to:
• Build a consensus of what QA means and list the necessary changes to be successful.
• Gauge the maturity and capability of the current QA practice to define a list of optimization initiatives and build a roadmap.
• Create the initial design of target QA roles and processes of the QA practice.
• Finalize the future state of QA roles, processes, tools, and tactics, including their implementation to other products and systems, with the existing test strategy.
See our sample workshop deliverable to know how an Info-Tech Quality Assurance Workshop helps improve your QA practice.
Info-Tech offers various levels of support to best suit your needs

DIY Toolkit: "Our team has already made this critical project a priority, and we have the time and capability, but some guidance along the way would be helpful."
Guided Implementation: "Our team knows that we need to fix a process, but we need assistance to determine where to focus. Some check-ins along the way would help keep us on track."
Workshop: "We need to hit the ground running and get this project kicked off immediately. Our team has the ability to take this over once we get a framework and strategy in place."
Executive & Technical Counseling: "Our team and processes are maturing; however, to expedite the journey we'll need a seasoned practitioner to coach and validate approaches, deliverables, and opportunities."
Consulting: "Our team does not have the time or the knowledge to take this project on. We need assistance through the entirety of this project."
Workshop Overview
Contact your account representative for more information.
[email protected] 1-888-670-8889

Module 1: 1.1 Define solution quality in your context; 1.2 State your QA objectives and metrics; 1.3 Assess the current state of your QA practice.
Deliverables: 1. Solution quality definition 2. QA objectives

Module 2: 2.1 Define your QA guiding principles; 2.2 Define your QA target state.
Deliverables: 1. QA guiding principles 2. Target QA process and artifacts

Module 3: 3.1 Define your defect tolerance; 3.2 Define your tests; 3.3 State your test data and environment requirements; 3.4 List your QA tools.
Deliverables: 1. Test defect risk tolerances 2. Test definitions

Module 4: 4.1 Build your QA roadmap.
Deliverables: 1. List of QA initiatives and roadmap 2. Communication map

Module 5: 5.1 Complete your QA strategy; 5.2 Review the workshop deliverables and discuss next steps.
Deliverables: 1. QA strategy 2. Next steps with Info-Tech

Participants:
• SDLC team
• QA stakeholders and management
• IT operations
Step 1.1 Activities
1.1.1 Define solution quality in your context
1.1.2 State your objectives
1.1.3 List the metrics that will gauge your success
Outputs:
• QA objectives
• Metrics to gauge QA success

Info-Tech Insight
It is easy to lose sight of what matters when we look at quality from a single point of view. Many organizations simply define quality as valuable, usable, and stable products and changes to end users. This definition omits the importance of technical attributes that make solutions maintainable, scalable, and reusable. Solutions are rarely one-offs, and technical excellence is necessary to ensure the right decisions are made that minimize technical debt.
Enforce your quality definition through attributes in all delivery activities
Quality attributes are properties that dictate how the system should behave at runtime and how it should be
designed, implemented, and maintained. These attributes capture the deeper structural characteristics of the
solution architecture that enable system functionality.

Usability: The product is an intuitive solution. Usability helps define the ease with which users can perform a specific task on the system. Limited training and documentation are required.

Performance: Usability and performance are closely related; a solution that is slow is not usable. Performance represents the degree to which a product or system provides functions that meet stated and implied needs when used under specified conditions. Baseline performance metrics are defined, and changes must result in improvements. Performance is validated against peak loads.

Availability: Availability is the degree to which a software system can be accessed by the users when it is required. The application system is present, accessible, and ready to carry out its tasks when needed. The application is accessible from multiple devices and platforms, is available 24/7/365, and teams communicate planned downtimes and unplanned outages. Teams must not put undue burden on end users accessing the systems. Reasonable access requirements are published.

Security: Security refers to the degree to which a software system safeguards information or data so that users or other systems have appropriate access to these data based on their authorization level. Applications handle both private and personal data and must be able to segregate data based on permissions to protect privacy. Users want it to be secure but seamless. Vendors need to understand and implement the organization's security requirements into their products. Teams ensure access is authorized, maintain data integrity, and enforce privacy.

Reusability: Reusability defines the degree to which a system component or an asset can be utilized on several systems or in building other components or assets. This attribute minimizes the duplication of components and implementation time. Teams ensure a modular design that is flexible and usable in other applications.

Interoperability: A system's ability to communicate or exchange data seamlessly between different operating systems, databases, and other components.
1.1.1 Define solution quality in your context
30 minutes
1. Review the various business (e.g. stakeholders, management, end users) and technical (e.g. development, infrastructure) perspectives of solution quality.
2. List three quality attributes that your organization sees as important or high priority (e.g. usability, security, scalability).

1.1.2 State your objectives
1 hour
Input: Business and technical objectives; business and IT challenges
Output: QA value opportunities
Example:
• Challenge: Unrealistic commitments to product updates and releases
• QA practice: Upstream QA planning
• QA value opportunity: Clarify test coverage and effort to improve release estimates
• Prevention costs are costs of all activities that are designed to prevent poor quality from arising in products or services.
• Appraisal costs are costs that occur because of the need to control products and services to ensure high quality in all stages and conformity to quality standards and performance requirements.

Application monitoring data sources:
• Detailed traces of individual requests and transactions
• Basic server monitoring and metrics
• Application framework metrics
• Application log data
• Application errors
• Real user monitoring (RUM)
Code Coverage
What is it? Code coverage is measured to verify the extent to which the code is executed by tests. Code coverage can ensure quality standards are maintained so that only optimal-quality code is pushed to production. It is primarily measured at the unit testing level. Be sure to negotiate the code coverage target with stakeholders, since 100% coverage is not necessarily feasible or cost efficient.

Formula
Use the following formula to define the percentage of code you would like tested:
Code coverage (%) = (lines of code executed by tests / total executable lines of code) × 100

What are the levels of code coverage? There are several levels of code coverage to consider. Which ones you decide to measure depends on the coding standards your team decides to adopt:
• Statement coverage
• Branch coverage
• Condition coverage
• Function coverage
• Loop coverage

Values of code coverage:
• Quantitatively indicates if there are enough tests in the unit and component test suites and if more tests are needed.
• Adhering to a high percentage of code coverage can lower the chances of escaped bugs detected later in development.
• Code coverage motivates the removal of untouched and unneeded code to improve the efficiency and size of the entire code base and ease the downstream build, deployment, and testing processes.
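To make the statement-coverage idea concrete, here is a minimal, self-contained sketch that uses Python's tracing hooks to count which lines of a function execute under a set of test inputs. The `grade` function and its inputs are hypothetical examples; in practice teams use a dedicated tool such as coverage.py rather than a hand-rolled tracer like this.

```python
import dis
import sys

def grade(score):
    """Hypothetical function under test."""
    if score >= 90:
        return "A"
    if score >= 60:
        return "pass"
    return "fail"

def statement_coverage(func, inputs):
    """Percentage of func's executable lines hit while running inputs."""
    code = func.__code__
    # Every line number that has bytecode associated with it.
    all_lines = {ln for _, ln in dis.findlinestarts(code) if ln is not None}
    all_lines.discard(code.co_firstlineno)  # drop the `def` line itself
    hit = set()

    def tracer(frame, event, arg):
        # Record each line event raised inside the target function's frame.
        if event == "line" and frame.f_code is code:
            hit.add(frame.f_lineno)
        return tracer

    sys.settrace(tracer)
    try:
        for value in inputs:
            func(value)
    finally:
        sys.settrace(None)
    return 100.0 * len(hit & all_lines) / len(all_lines)

partial = statement_coverage(grade, [95])       # exercises only the "A" path
full = statement_coverage(grade, [95, 70, 10])  # exercises all three paths
```

Running the single "A"-path input leaves the "pass" and "fail" branches uncovered, which is exactly the signal the formula above quantifies: more test cases raise the percentage.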
Expand your test coverage

Test Coverage
Formula
Use the following formula to define the percentage of requirements (functional and nonfunctional) you would like tested:
Test coverage (%) = (requirements covered by at least one test / total requirements) × 100
1.1.3 List the metrics that will gauge your success
30 minutes
1. Identify the major areas that will be targeted for monitoring by determining the metrics that will validate your QA business and IT objectives. Aim to identify at least one metric per expectation.

Step 1.2 This step involves the following participants:
• SDLC team
• QA stakeholders and management
• IT operations
Activities
1.2.1 Understand the challenges of your QA practice
1.2.2 Complete a current state assessment
The SWOT analysis is structured on two key axes:
• Internal vs. External: Whether the item originates from inside or outside the organization will impact how you decide to proceed. Remember, internal vs. external is directly related to level of control. (Can I change or simply mitigate? Can I enhance or simply encourage?)
• Helpful vs. Harmful: Elements can either help you or hinder you. Knowing which is important.

The four quadrants:
• Strengths (internal focus): Internal characteristics that are favorable as they relate to your environment.
• Weaknesses (internal focus): Internal characteristics that are unfavorable or need improvement.
• Opportunities (external focus): External characteristics that you may use to your advantage.
• Threats (external focus): External characteristics that may be potential sources of failure or risk.

Info-Tech Insight
Some existing wisdom discourages celebrating what helps you and suggests focusing only on the challenges. This is misguided, as giving appropriate time to your strengths lets you know what not to focus on.
1.2.1 Understand the challenges of your QA practice
1-3 hours
1. Complete a SWOT analysis (strengths, weaknesses, opportunities, threats) of your current QA practice.
2. Use the outcomes of this exercise to frame your discussions in the following exercises.
Output: Understanding of current QA practices; SWOT analysis
1.2.2 Complete a current state assessment
1-3 hours
1. Select one QA or SDLC team employing QA practices. This will be the focus of the assessment.
Participants: QA team; SDLC team; QA stakeholders and management; IT operations
Output: Gaps in the current QA practice

Upcoming phases:
Phase 3. Build Your QA Toolbox: 3.1 Define your defect tolerance; 3.2 Align on your QA activities and tools
Phase 4. Establish a QA Roadmap: 4.1 Build your QA roadmap
Step 2.1 This step involves the following participants:
• SDLC team
• QA stakeholders and management
• IT operations
Activities
2.1.1 Define your QA guiding principles (30 mins)
Input: Understanding of organization culture

Step 2.2 This step involves the following participants:
• SDLC team
• QA stakeholders and management
• IT operations
Activities
2.2.1 Document your QA process and artifacts
2.2.2 Identify your QA roles and responsibilities
2.2.3 Select and define your QA resource structure
Output: QA resource allocation approach and structure
Your solution delivery lifecycle should embrace quality at its core
Connect all phases with a solution-centric approach that goes from the first idea all the way through to maintenance: continuous delivery and release cycles, intake feedback, and improvement.
Level 3
• Products that affect multiple lines of business and have significant costs and/or risks.
• Most QA activities and controls are performed rigorously, with business lead (e.g. vice president) sign-offs at all key decision points in the process. QA and SDLC teams are empowered to make some QA and delivery decisions, and department representatives must participate in QA activities.
• Example: Implement CRM

Level 2
• Products with broader exposure to the business that present a moderate level of risk to business operations.
• Certain QA activities and controls are performed rigorously, with business lead sign-offs at specific decision points in the process. QA and SDLC teams are empowered to make most decisions in the delivery process. Department representatives should participate in QA activities.
• Example: Deploy Office 2013

Level 1
• Routine/straightforward product changes or development with limited exposure to the business and low risk of negative business impact.
• QA activities and controls are rigorously performed as needed, with business lead sign-offs optional. QA and SDLC teams are empowered to make all decisions in the delivery process. Department representatives are consulted if needed.
• Example: SharePoint update
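One lightweight way to operationalize tiers like these is a small rules function that maps a change's risk profile to its QA rigor level. The field names and thresholds below are hypothetical illustrations of the idea, not part of the Info-Tech model itself.

```python
from dataclasses import dataclass

@dataclass
class Change:
    lines_of_business_affected: int  # breadth of business impact
    high_cost_or_risk: bool          # significant cost and/or risk?

def qa_rigor_level(change: Change) -> int:
    """Map a change's risk profile to a QA rigor level (3 = most rigorous)."""
    if change.lines_of_business_affected > 1 and change.high_cost_or_risk:
        return 3  # e.g. implement CRM: sign-offs at all key decision points
    if change.lines_of_business_affected > 1 or change.high_cost_or_risk:
        return 2  # e.g. deploy an office suite: sign-offs at specific points
    return 1      # e.g. routine update: sign-offs optional

crm = Change(lines_of_business_affected=4, high_cost_or_risk=True)    # level 3
patch = Change(lines_of_business_affected=1, high_cost_or_risk=False) # level 1
```

Encoding the tiers as code keeps the triage consistent across teams and makes the thresholds an explicit, reviewable artifact rather than tribal knowledge.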
Make sure every QA artifact has the information you
need for testing
An extensive suite of artifacts may be needed to describe the test planning, design, execution, results, and
conclusions revealed by testing activities. These artifacts lay out the scope and processes to validate and verify
value delivery in order to justify the resources and effort needed to complete these tests.
Organizations that supplement text-based test management practices with model-driven practices (such as context
models, business process models, and use-case models) are able to drive efficiency through reusability and
increased clarity of testing objectives.
• Test Plan: test cases, priorities, acceptance criteria, stakeholder communication.
• QA Strategy: frameworks, standards, resources.
• Test Scripts and Environment: test data and environment management, test execution.
• Test Dashboards and Reports: pass/fail results, conclusion and recommendation.
• Defect Documentation: root cause analysis, traceability to other artifacts, fix request, acceptance criteria.
2.2.1 Document your QA process and artifacts
1 hour
1. Complete a suppliers, inputs, processes, outputs, and customers (SIPOC) table to ground your understanding of your SDLC process:
   a) Supplier: Who provides the artifacts needed for the process?
   b) Input: What artifact is needed to initiate and execute the process?
   c) Process: What phase is being executed and who is responsible in this phase?
   d) Output: What is produced at the end of the process?
   e) Customer: Who consumes the completed artifact?
2. Identify who is involved in each step, stage gates, and sign-off points. Verify and validate the fit of these items against different types of work items and risk levels.
3. For each SDLC phase, ask yourself the following questions and document the results:
   a) How can QA capability/thinking help?
   b) How can QA activities/teams be helped?
Materials: Whiteboard and markers; QA strategy
Participants: QA team; SDLC team; QA stakeholders and management; IT operations
Download the QA Strategy Template
2.2.1 Example

Build
• Suppliers: Software delivery team; vendors; procurement team
• Inputs: Approved solution approach (scope, technical design, solution design, UI mock-ups, SME sign-off)
• Process: Build (developers, QA, vendors)
• Outputs: Functioning solution build (stable in test environment); updated source code repository; unit testing results; software documentation (e.g. support after go-live, functional design, contacts of SME, access, release notes, training documentation draft, etc.); test environment and data provisioned
• Customers: Software delivery team, especially QA; service desk; infrastructure team

Test
• Suppliers: Software delivery team, especially QA; service desk; infrastructure team
• Inputs: Functioning solution build; unit testing results; software documentation (e.g. support after go-live, functional design, contacts of SME, access, release notes, etc.); functional and nonfunctional requirements
• Process: Test (QA, SME, customers)
• Outputs: Test results (e.g. defect list, root causes, test performance); decision to go live (UAT sign-off and decision-maker approval); SME and end-user feedback; test status reporting; updated project/maintenance backlog based on accepted or declined defects; updated software documentation; end-user training and onboarding documentation
• Customers: Software delivery team; vendors and procurement team; business sponsors

Deploy
• Suppliers: Software delivery team; vendors; procurement team
• Inputs: Approved and tested solution build; software documentation (e.g. support after go-live, functional design, contacts of SME, access, release approval to go live)
• Process: Deploy (infrastructure, release and deployment team, organizational change leaders)
• Outputs: Request for change (RFC) and CAB go/no-go meeting; solution build push to production; defined warranty period; hand off to service desk post warranty period; updated end-user training and onboarding documentation; end-user communication of releases
• Customers: End users; customers; service desk/operational support; organizational change leader
Recognize the capabilities critical for QA success

Cross-Functional Collaboration: Intake and Backlog Management; Solution Architecture and UX Design; Business Analysis; Development and Implementation; Solution Operations (Including Release and Change Management); Service Desk and Maintenance

QA Planning: QA Plan; Quality Policies, Standards, and Frameworks; QA Schedule; Test Case Design; Test Scripts; Test Data and Environment Planning; QA Resource Capacity Planning

Vision and Buy-In: QA Objectives and Metrics; QA Practice Vision; Executive Sponsorship; QA Strategy; Funding and Sourcing Approach; Organizational Desire and Motivation

Practice Management: QA Processes; Knowledge Management and Sharing; QA Tools; Continuous Improvement; QA Resource Management; QA Artifact Standards and Versioning; QA Guiding Principles; QA Governing or Collaborative Body; QA Practice Performance Dashboard
2.2.2 Identify your QA roles and responsibilities
1-2 hours
1. Review your current QA capabilities and their effectiveness in sufficiently satisfying stakeholder needs and QA objectives.
2. Build a RACI chart for the various roles in your delivery team to identify who will be supporting and executing your QA capabilities, as shown on the following slide. Refer below for a definition of RACI:
   a) Responsible: Roles that execute the capability. Responsibility can be delegated.
   b) Accountable: A single role that has ownership over the capability and answers for its success.
   c) Consulted: Roles who are asked for their input into the capability.
   d) Informed: Roles who are kept up to date on the capability's progress and outcomes.
Materials: Whiteboard and markers; QA strategy
Participants: QA team; SDLC team; QA stakeholders and management; IT operations
Output: Understanding of current and desired SDLC roles
Functional Testing C R A C
Nonfunctional Testing C R A C
User Acceptance Testing C R A I C
Application Performance Monitoring C C C A
QA Results and Progress Reports I R A I I I
Troubleshooting and Root Cause Analysis C R A C
Bug and Defect Management C R A C I I
QA Feedback Loops During SDLC C R A C I
QA Feedback Loops Post-Go-Live C I A R I C
QA Communication Channels I R A C A I
QA Tools R A C I
QA Processes C R A C I I
Practice Continuous Improvement C R A C I
QA Resource Management I R A I I I
Knowledge Management and Sharing I R A I C
QA Artifact Standards and Versioning I R A I C
QA Guiding Principles C R A C I
QA Governing or Collaborative Body C C R C A C
QA Practice Performance Dashboard I R A I I I
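A RACI chart like the one above can be kept machine-checkable. This sketch (with hypothetical role names and a two-row excerpt) validates the core RACI rules: every capability has exactly one Accountable role and at least one Responsible role.

```python
# Hypothetical RACI chart excerpt: capability -> {role: RACI letter}.
raci = {
    "Functional Testing": {"BA": "C", "Tester": "R", "QA Lead": "A", "Dev": "C"},
    "QA Tools": {"Tester": "R", "QA Lead": "A", "Dev": "C", "Ops": "I"},
}

def validate_raci(chart):
    """Return a list of violations of the basic RACI well-formedness rules."""
    problems = []
    for capability, assignments in chart.items():
        letters = list(assignments.values())
        if letters.count("A") != 1:
            problems.append(f"{capability}: needs exactly one Accountable role")
        if letters.count("R") < 1:
            problems.append(f"{capability}: needs at least one Responsible role")
    return problems

issues = validate_raci(raci)  # an empty list means the chart is well formed
```

Running such a check whenever the chart changes catches the most common drift in delegated QA responsibilities: capabilities that end up with no owner, or with two.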
Learn the different patterns to structure and
resource QA to your product delivery teams
The primary goal of any product delivery team is to improve the delivery of value for customers and the business based on your product
definition and each product’s demand. Each organization will have different priorities and constraints, so your team structure may take on a
combination of patterns or may take on one pattern and then transform into another.
Delivery Team Structure Patterns – How Are Resources and Work Allocated?

• Functional Roles – Teams are divided by functional responsibility (e.g. developers, testers, business analysts, operations, help desk) and arranged according to their placement in the software development lifecycle (SDLC). Completed work is handed off from team to team sequentially, as outlined in the organization’s SDLC.
• Shared Service and Resource Pools – Teams are divided by functional responsibility and arranged according to their placement in the SDLC. Resources are pulled whenever the work requires specific skills or are pushed to areas where product demand is high.
• Product or System – Teams are dedicated to the development, support, and management of specific products or systems. Work is sent directly to the teams who manage the product or directly support the requester.
• Skills and Competencies – Teams are grouped based on skills and competencies related to technology (e.g. Java, mobile, web) or familiarity with business capabilities (e.g. HR, finance). Work is sent directly to the teams who have the IT and business skills and competencies to complete the work.
Info-Tech Note
When deciding which is the right delivery pattern for you:
• Is there enough work (e.g. projects, systems) to warrant a separate team to do testing and QA?
• Will a separate QA or test group be a formal testing function, or will they come together to build a center of excellence?
Staffing models for delivery teams

Functional Roles
• Pros: Specialized resources are easier to staff; product knowledge is maintained.
• Cons: Demand on specialists can create bottlenecks; creates barriers to collaboration.
• Use case: When you lack people with cross-functional skills.

Shared Service and Resource Pools
• Pros: Flexible demand/capacity management; supports full utilization of resources.
• Cons: Unavailability of resources can lead to delays; product knowledge can be lost as resources move.
• Use case: When you have specialists, such as those skilled in security and operations, who will not have full-time work on the product.

Product or System
• Pros: Teams are invested in the full life of the product; standing teams enable continuous improvement.
• Cons: Changes in demand can lead to downtime; cross-functional skills make staffing a challenge.
• Use case: When you have people with cross-functional skills who can self-organize around the request.

Skills and Competencies
• Pros: Teams are invested in the technology; standing teams enable continuous improvement.
• Cons: Technology bias can lead to the wrong solution; resource contention occurs when the team supports multiple solutions.
• Use case: When you have a significant investment in a specific technology stack.
QA team structure pattern examples

[Diagrams: example flows for each pattern. Functional Roles: Intake → Business Analysis → Development → Testing → Operations → Product Release. Shared Service and Resource Pools: Intake → Product Team, with business analysis, development, testing, and operations resourced as needed → Product Release. Product or System: Intake → Product Team (e.g. Website) → Product Release. Skills and Competencies: Intake → Product Team (e.g. Java Applications) → Product Release.]
Duration: 30 minutes

1. Document your current staffing model for your solution delivery team. Identify all roles who are directly or indirectly involved in managing, governing, and executing QA and the roles who are dependent on the test results.

[Example diagram: a delivery team made up of developers, QA, application support, and domain experts, with a headcount for each role.]
Build a Software Quality Assurance Program

Phase 3: Build Your QA Toolbox
3.1 Define your defect tolerance
3.2 Align on your QA activities and tools

Phase 4: Establish a QA Roadmap
4.1 Build your QA roadmap

Participants: QA team; SDLC team; QA stakeholders and management; IT operations
Step 3.1
This step involves the following participants: SDLC team; QA stakeholders and management; IT operations.

Activities
3.1.1 Define your defect risk tolerance
Some questions to consider when deciding on defect severity include:
• How is productivity affected?
• How many users are affected?
• How many systems are affected?
• How critical are the affected systems to the organization?

Decide how many severity levels QA needs to manage test defects. For example, severity can be derived by combining impact and urgency (columns run from highest to lowest urgency; 2 = High, 3 = Medium, 4 = Low):

Impact        Severity by urgency (highest → lowest)
Significant   High (2)    High (2)    Medium (3)  Low (4)
Moderate      Medium (3)  Medium (3)  Medium (3)  Low (4)
Localized     Medium (3)  Low (4)     Low (4)     Low (4)

Why do I need to prioritize test defects? Prioritizing test defects is critical to ensure the most impactful issues are resolved first, aligning bug fixes with business priorities and customer needs. High-priority defects are typically blockers for release; understanding their severity and impact guides informed decision making regarding product launch or update rollouts.
Instill the right accountability in the decision to go live

The business needs to be accountable for the decision to push solutions and changes into production at the end of testing. Otherwise, it must give IT full empowerment to make that decision.

The accountability for pushing solutions with perceived tolerable test results into production often lies with IT. In other cases, the business quickly signs off on go-live decisions without fully understanding the risks and trade-offs illustrated in the test results. Both scenarios risk significant and undesired consequences, such as:
• Expectations Misalignment: Solutions do not meet stakeholder expectations due to the various decisions and changes occurring during the delivery process.
• Blame Game: When solution issues are found, IT is blamed for the release despite the sign-off from the business.
• IT-Centric Risk Tolerance: Defect tolerance levels are centered on IT’s interpretation rather than what the business truly tolerates or accepts, which may conflict with what the business actually cares about.

What Should Be Done for Go-Live Decisions?
1. Instill Business Accountability – Shift the accountability for solution risks and quality concerns to the business or product owners. This approach requires QA to prepare reports so stakeholders understand the risks and trade-offs they are accepting or tolerating and can make confident decisions. IT governance will hold these stakeholders to their decisions and protect solution delivery teams.
2. Empower IT Decision Makers – Ensure IT teams have clear guidelines and autonomy to make informed decisions based on predefined and business-accepted acceptance criteria. Stakeholders must accept whatever decision IT makes, with full trust that their concerns were taken to heart.

Confident go-live decision making involves accessible and consumable test results:
• Simplify Reporting: Develop concise and clear reports that highlight key outcomes, risks, and recommendations in a way that any decision maker (including the business) can easily understand.
• Visual Dashboards: Utilize visual dashboards that provide at-a-glance insights into test results, potential impacts, and decision points. These dashboards should be accessible on demand and updated live, or as close to live as possible.
• Decision Frameworks: Offer decision frameworks that guide stakeholders through evaluating the implications of deploying code with tolerable risks, balancing business needs and technical realities.
3.1.1 Define your defect risk tolerance

Input: Quality definition; prioritization and triaging techniques
Output: Test defect risk tolerances
Duration: 1 hour

1. Start by identifying the indicators of high- or low-priority test defects. Once you have these sketched out, you can begin to break them into manageable levels.
2. Define each level of impact and its contributing factors, considering your quality definition. Outline the impact of defects from multiple perspectives, such as business operations, end users, and enterprise systems. Provide examples for each level.
3. Define each level of urgency. Outline the factors and timelines that will dictate how soon a request needs to be addressed. Consider your quality definition in this exercise. Provide examples for each level.
4. Combine your urgency and impact levels to define the severity levels that will be used to prioritize your test cases. Indicate additional escalations if necessary. See the following slide for an example.
5. Identify exceptions to the prioritization matrix, which may include specific systems, issues, roles, departments, or timing around business processes that will need to be treated as high priority.
6. Highlight the course of action to address failing tests for each severity level. Identify the suspension and resumption criteria for a failing test in a test suite.
7. Document a summary of this exercise in the “Acceptance Criteria” section of the QA Strategy Template.

Materials: Whiteboard and markers; QA strategy; QA plan
Participants: QA team; SDLC team; QA stakeholders and management; IT operations

Download the QA Strategy Template
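Steps 2-4 above can be sketched as a simple lookup table. This minimal Python sketch mirrors the example severity matrix; the urgency labels are assumptions, and the numeric levels (2 = High, 3 = Medium, 4 = Low) follow that example.

```python
# Sketch of the impact x urgency severity lookup from steps 2-4. Severity
# numbers follow the example matrix (2 = High, 3 = Medium, 4 = Low); the
# urgency labels are assumptions, ordered highest to lowest.
SEVERITY_MATRIX = {
    "significant": {"critical": 2, "high": 2, "medium": 3, "low": 4},
    "moderate":    {"critical": 3, "high": 3, "medium": 3, "low": 4},
    "localized":   {"critical": 3, "high": 4, "medium": 4, "low": 4},
}
LABELS = {2: "High", 3: "Medium", 4: "Low"}

def defect_severity(impact: str, urgency: str) -> str:
    """Combine an impact level with an urgency level to get a severity label."""
    level = SEVERITY_MATRIX[impact.lower()][urgency.lower()]
    return f"{LABELS[level]} ({level})"

print(defect_severity("Significant", "critical"))  # High (2)
print(defect_severity("Localized", "high"))        # Low (4)
```

Exceptions from step 5 (specific systems, departments, or business timing) would override the lookup before the score is used for scheduling.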
3.1.1 Example

Impact rating 1 (Extensive/Widespread):
• Number of users affected: 100 or more
• Requestor: Multiple departments
• Business criticality of application: Mission critical
• Number of systems affected: Will impact many systems
• Data affected: Will impact multiple databases or warehouses
• Security risks: Will cause organization-wide security risks
Step 3.2
This step involves the following participants: SDLC team; QA stakeholders and management; IT operations.

Activities
3.2.1 List your test categories
3.2.2 Define your test data and environment requirements
3.2.3 List your desired QA tools
• Understand requirements thoroughly
• Define test objectives and acceptance criteria
• Model your processes to reveal various test scenarios
• Prioritize test cases against risk
• Incorporate different strategies while maintaining reusability
Your QA practice should clearly describe the tests and discuss the benefits of completing them.

[Figure: Agile testing quadrants, arranging tests along business-facing vs. technology-facing and supporting-the-team vs. critiquing-the-product axes. Tests named include user acceptance testing, unit tests, component tests, integration tests, performance and load testing, security testing, and “-ility” testing, with automated and specialized tooling noted. Source: Crispin, 2012.]
Duration: 1 hour

1. List and define the tests that QA is expected to complete, considering your acceptance criteria and defect tolerances.
4. Document a summary of this exercise in the “QA Tools” section of the QA Strategy Template.

Output: QA tools expectations

Download the QA Strategy Template
3.2.3 Example

Cross-Functional Collaboration
• Desired tools: Atlassian Jira (application lifecycle management); GitHub (source code management); ServiceNow (IT service management, configuration management database)
• Use cases: Intake and backlog management; solution architecture and UX design; business analysis; solution development and implementation; operations (including release and change management); service desk and maintenance

QA Planning
• Desired tools: SmartBear Zephyr (test management); GAP – TestComplete (automated testing)
• Use cases: QA plan, schedule, and resource capacity planning; adherence to quality policies, standards, and frameworks; test case design; test data and environment planning; test scripts management

Execution and Management
• Desired tools: SmartBear Zephyr (test management); GAP – TestComplete (automated testing); GAP – Selenium (automated testing); GAP – SonarQube (source code analysis); New Relic APM (application performance monitoring)
• Use cases: Test environment and data management; stage gate review and artifact validation; code-level testing; functional testing; nonfunctional testing; user acceptance testing; application performance monitoring

Reporting and Analytics
• Desired tools: SmartBear Zephyr (test management); ServiceNow (IT service management, configuration management database); Atlassian Confluence (communication and collaboration)
• Use cases: QA results and progress reports; troubleshooting and root cause analysis; bug and defect management; feedback loops during SDLC and post-go-live; QA communication

Vision and Buy-In
• Desired tools: Microsoft Word and PowerPoint
• Use cases: QA strategy; QA funding

Practice Management
• Desired tools: Microsoft Word and PowerPoint; Atlassian Jira (application lifecycle management); SmartBear Zephyr (test management)
• Use cases: QA tooling and process management; QA resource management; knowledge management and sharing; QA artifact standards, templates, and versioning
Hit a home run with your stakeholders

Use a data-driven approach to select the right tooling vendor for your needs – fast. Investing time improving your software selection methodology has big returns.

Info-Tech Insight
Not all software selection projects are created equal – some are very small, and some span the entire enterprise. To ensure that IT is using the right framework, understand the cost and complexity profile of the application you’re looking to select. Info-Tech’s Rapid Application Selection Framework approach is best for commodity and mid-tier enterprise applications. Selecting complex applications is better handled by the methodology in Info-Tech’s Implement a Proactive and Consistent Vendor Selection Process.
Phase 4: Establish a QA Roadmap

This phase will walk you through the following activities:
4.1.1 Define your roadmap

This phase involves the following participants:
• QA team
• SDLC team
• QA stakeholders and management
• IT operations

Build a Software Quality Assurance Program:
Phase 1 – Assess Your QA Process: 1.1 List your QA objectives and metrics; 1.2 Analyze your current QA state
Phase 2 – Align on Improved QA Practices: 2.1 Define your QA guiding principles; 2.2 Adopt your foundational QA process
Phase 3 – Build Your QA Toolbox: 3.1 Define your defect tolerance; 3.2 Execute your QA activities
Phase 4 – Establish a QA Roadmap: 4.1 Build your QA roadmap
Step 4.1
This step involves the following participants: SDLC team; QA stakeholders and management; IT operations.

Activities
4.1.1 Define your roadmap
4.1.2 Draw your QA communication flow

Watch for these common challenges:
1. Resource Constraints
2. Resistance to Change
3. Lack of Business Alignment
4. Integration Challenges
5. Overreliance on Automation
1. Brainstorm and list all potential QA initiatives by reviewing your current state assessments, the gaps that were revealed, and the brainstormed solutions to fill those gaps.
2. For each initiative listed, identify and assign a responsible owner. This person will be accountable for the planning, execution, and success of the initiative.

Example initiatives:
• “As QA, I would like to gain knowledge in multiple streams of business.” – Owner: Sue Moody
• “As a QA, I would like to gain more knowledge of the different workstreams within the RCM so that I don’t feel like a deer in the headlights!” – Owner: Sue Moody. Not prioritized, as it is duplication.
• “As a QA, I want to be involved in a project or change earlier than in previous projects/changes. I want to better understand the project/change and increase my chances of finding defects earlier.” – Owner: Luke Cage. Not prioritized, as it is duplication.
Duration: 1-3 hours

1. Identify everyone who is directly or indirectly involved with QA. Include those who are:
a) Informed of QA work progress.
b) Subject-matter experts of the business units or the product under testing.
c) Impacted by the success of the delivered changes.
d) Responsible for the removal of impediments to QA roles.
2. Use the QA team as the focal point. Indicate how each role interacts with the others and how frequently these interactions occur for typical QA work. Do this by drawing a diagram on a whiteboard, using labeled arrows to indicate the types and frequency of interactions.
3. Review the following items:
a) For each communication medium, define what information will be communicated.
b) Review the structure of ceremonies and status meetings.
c) Discuss how dependent teams will be informed of progress and how frequently.
d) Discuss the reports that will be used.
4. Describe how the various roles communicate with each other for each phase and activity of your QA process.
5. Document a summary of this exercise in the “Communication” section of the QA Strategy Template.

Materials: Whiteboard and markers; QA strategy
Participants: QA team; SDLC team; QA stakeholders and management; IT operations

Download the QA Strategy Template
4.1.2 Example

[Diagram: a communication flow in which end users receive the completed fix and a progress report from the QA team.]
Alan Page has been a software tester for over 25 years and is currently the Director of Quality for Services (and self-proclaimed Community Leader) at Unity Technologies. Prior to Unity, Alan spent 22 years at Microsoft working on projects spanning the company – including a two-year position as Microsoft’s Director of Test Excellence. Alan was the lead author of the book How We Test Software at Microsoft and contributed chapters for Beautiful Testing and Experiences of Test Automation: Case Studies of Software Test Automation. His latest e-book (which may or may not be updated soon) is a collection of essays on test automation called The A Word: Under the Covers of Test Automation and is available on Leanpub.

Shannon Gould is an authentic leader and trusted advisor with over 19 years of multidisciplinary experience, organizational knowledge, and advocacy for researching and right-sizing best practices to the twenty-first-century institutional climate in higher education. The breadth and depth of practical experience and insight gained through private, public, and civil engagement has afforded Shannon the ability to construct knowledge across a spectrum of sectors, enhance expertise, and develop effective leadership and team climate skills.
Research Contributors and Experts

Benjamin Palacio, Information Systems Analyst, County of Placer
Jack Bowersox Jr., Software Quality Assurance Supervisor, Mutual Benefit Group
Bibliography

“Agile Methodology: The Complete Guide to Understanding Agile Testing.” QASymphony, n.d.
Bass, Len, Paul Clements, and Rick Kazman. Software Architecture in Practice. 3rd ed. Pearson Education, 2003.
Benua, Melissa. “Doing continuous testing? Here’s why you should use containers.” TechBeacon, 30 Aug. 2019.
“Best Practices for Creating Test Scripts.” Micro Focus, n.d.
Bose, Shreya. “Code Coverage vs. Test Coverage: A Detailed Guide.” BrowserStack, 26 March 2020.
“Chapter 16: Quality Attributes.” Microsoft Application Architecture Guide, 2nd ed. Microsoft Developer Network, 2009.
Chatterjee, Shormistha. “What Is QAOps? (with Methodologies).” BrowserStack, 11 Nov. 2022.
“Checking the Data Flow Diagrams for Errors.” W3Computing.com, n.d.
Crispin, Lisa. “Agile testing quadrants: Guiding managers and teams in test strategies.” TechTarget, Jan. 2012.
Dhanotiya, Neha. “Implementing Quality Assurance in the Software Development Lifecycle.” FreshWorks Studio, 22 Dec. 2020.
Fowler, Martin. “Mocks Aren’t Stubs.” MartinFowler.com, 2 Jan. 2007.
“Future of Quality Assurance.” LambdaTest, 2023.
Gunja, Saif. “Shift Left vs Shift-Right: A DevOps Mystery Solved.” Dynatrace News, 31 Jan. 2022.
“ISO 25000 Standards: Software and Data Quality.” ISO, n.d.
Jung, June. “How to Test Software, Part I: Mocking, Stubbing, and Contract Testing.” Dzone, n.d.
“Parallel Testing.” SmartBear Software, 10 April 2018.
Quick, Lindy. “Acceptance Criteria for User Stories: Examples and Best Practices.” KnowledgeHut, 23 Oct. 2023.
“Shift-Left Testing: How to Apply Shift Left Approach to Continuous Testing.” Katalon, 19 May 2023.
“Software quality.” Wikipedia, 9 July 2018. Web.
“State of Software Quality Report 2023.” Katalon, 2023.
“Test Automation vs. Automated Testing: The Difference Matters.” Tricentis, 11 Jan. 2017.
“Test Case.” Software Testing Fundamentals, n.d.
“Test Case Design Techniques.” ProfessionalQA.com, 12 March 2018.
“Test Environment Management Best Practices.” Plutora, 23 Nov. 2020.
“The Future up Close: World Quality Report 15th Edition 2023-24.” Sogeti, 2023.
Wisdom, Elizabeth. “Where Are We Going and Who’s Driving? Developing and Designing a Comprehensive QA Roadmap.” Chicago Quality Assurance Association, 2017.
Given a user is logged in and looking at their profile,
When the user clicks on the “update profile picture” link below their profile picture,
Then the user is prompted to upload a new picture from their computer directory and the selected picture becomes the current profile picture.
[Diagram: a “Business Unit, Capability, or Process” box with its inputs and outputs feeding a “Dependent Business Unit, Capability, or Process” box with its own inputs and outputs.]

Does the former business unit, capability, or process provide sufficient information to generate the expected outcome of the dependent business unit, capability, or process?
2. Using the template below, identify the various inputs and outputs for each business unit, process, and business capability.
a) Discuss if the inputs provide the business units, processes, or business capabilities sufficient information to generate the expected outputs.
b) Discuss if the expected outputs produce sufficient information for the successful completion of the dependent business units, processes, or business capabilities.
3. Discuss what can be tested to ensure the right inputs are provided and the expected output is generated.

Materials: Whiteboard; markers
Participants: Product owners and managers; development team; business analyst; QA and testers
Example – Edit User Profile
• Input: Authentication to system; current profile details
• Output: Changes to profile details: username, email address, and address

Source: W3Computing.com
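The sufficiency question above can be made mechanical: model each business unit, capability, or process with its inputs and outputs, then check whether the upstream outputs cover the dependent inputs. A minimal Python sketch follows; only the Edit User Profile fields come from the example, while the dependent process and its fields are invented for illustration.

```python
# Hypothetical sketch of the input/output sufficiency check: does the
# upstream process emit everything its dependent process needs? Only the
# "Edit User Profile" fields come from the example; the dependent process
# and its fields are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class Process:
    name: str
    inputs: set = field(default_factory=set)
    outputs: set = field(default_factory=set)

def missing_inputs(upstream: Process, dependent: Process) -> set:
    """Inputs the dependent process needs that the upstream one never produces."""
    return dependent.inputs - upstream.outputs

edit_profile = Process(
    "Edit User Profile",
    inputs={"authentication", "current profile details"},
    outputs={"username", "email address", "address"},
)
notify_user = Process(
    "Notify User of Profile Change",  # invented dependent process
    inputs={"email address", "change summary"},
)

print(missing_inputs(edit_profile, notify_user))  # {'change summary'} -> a gap to test for
```

Each non-empty result is a candidate for step 3: a missing input the team should design a test around.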
Business Process Flow
• A graphical representation of the activities and tasks to produce a product or provide a service for a particular consumer.
• Processes include those that govern the operations of the business unit, core business capabilities that directly create value streams, and those that support core capabilities.

State Diagram

Storyboards
• Visually show customer journeys through the system or product using sketches of user interfaces (e.g. wireframes and storyboards).
• Equivalence partitioning: In this method, test input data is divided into partitions (classes) whose members are expected to be processed the same way; test cases are then derived and designed for each class or partition. This helps to significantly reduce the number of test cases.
• Boundary value analysis: This important specification-based testing technique is used to explore
errors in the software product at the extreme ends of the input domain (i.e. at boundaries) and is used
accordingly to derive and design test cases.
• Decision tables: A systematic technique of designing test cases, decision tables use different
combinations of inputs and their corresponding outputs based on variations of conditions and scenarios
adhering to different business rules.
• Use-case testing: This technique is used to identify test cases covering end-to-end software product
evaluation. The test cases are designed to execute business scenarios and user-end functionalities. With
the assistance of use-case testing, one can easily identify test cases that cover the entire system, on a
transaction-by-transaction basis, from the start of the testing to its end.
• Statement testing and coverage: This is the weakest criterion and least preferred metric for checking test coverage. Here, test scripts are designed to execute code statements. The main purpose of this technique is to calculate the percentage of executable statements that are exercised by the test suite.
• Decision testing coverage: Also known as branch testing, decision testing coverage is where the test
design techniques exercise the percentage of the outcome of the decisions. Here, the test coverage is
measured by the percentage of decision points, which are executed out of the total decision points in
the application.
• Condition testing: This structure-based technique aims for 100% condition coverage of the code: each condition outcome is executed at least once, and test cases are designed so that the condition outcomes are easily exercised.
• Multiple condition testing: Here, the main focus is on testing different combinations of condition
outcomes to get 100% coverage. To ensure this, two or more test scripts are required, which becomes a
bit exhaustive and difficult to manage.
• All path testing: All path testing is the strongest structure-based test case design technique. It involves using the source code of a program to find every executable path, helping to determine all the faults within a particular piece of code.

Source: ProfessionalQA.com.
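As a concrete illustration of equivalence partitioning and boundary value analysis, the sketch below derives test inputs for a hypothetical numeric field that accepts values from 18 to 65; the field and range are assumptions, not from the source.

```python
# Illustration of equivalence partitioning and boundary value analysis for a
# hypothetical numeric field accepting 18-65 (field and range are assumed).
def boundary_values(lo: int, hi: int) -> list:
    """Classic BVA: values at and just beyond each boundary."""
    return [lo - 1, lo, lo + 1, hi - 1, hi, hi + 1]

def partition(value: int, lo: int, hi: int) -> str:
    """Equivalence partitioning: three classes cover every possible input."""
    if value < lo:
        return "invalid-low"
    if value > hi:
        return "invalid-high"
    return "valid"

# Six boundary cases exercise all three partitions.
cases = {v: partition(v, 18, 65) for v in boundary_values(18, 65)}
print(cases)
# {17: 'invalid-low', 18: 'valid', 19: 'valid', 64: 'valid', 65: 'valid', 66: 'invalid-high'}
```

One representative per partition plus the boundary values keeps the test suite small while still probing where defects cluster.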
Sidebar: Other test design
models
Experience-based techniques
• Exploratory testing: Usually conducted by business analysts and other business experts, exploratory
testing is used to test applications without any formal documentation of test cases, test conditions, or
test scripts. This is a hands-on testing approach, wherein testers are involved in minimum planning and
maximum test execution. In exploratory testing, test design and test execution are performed
simultaneously.
• Error guessing: A widely used technique, error guessing is highly dependent on the skills, intuition,
and experience of the testers. Here, testers have the freedom to anticipate the errors based on their
experience, availability of defect data, and their knowledge of product failure.
• Session-based testing: Session-based testing builds on exploratory testing by providing more
structure without taking away from the benefits that exploratory testing provides, such as the ability to
better mimic the user experience and get creative with testing. Testing is conducted during time-boxed,
uninterrupted sessions, testing against a charter and requiring testers to report on the testing that took
place during each session. Sources: ProfessionalQA.com; QASymphony.
1. Review the various test design models and discuss how they are applicable to the business units, processes, and business capabilities that you will be testing.
3. List the test cases and revise your acceptance criteria if needed. Group your test cases into suites (or themes) if they collectively are critical to product success.
Test Case ID: TC01
• Test Scenario: Check customer login with valid data
• Test Script: 1. Go to website. 2. Enter valid user ID. 3. Enter valid password. 4. Click “Submit.”
• Test Data: User ID=MSmith; Password=pass99
• Expected Results: User should log in to the web application.
• Acceptance Criteria: Tests are part of a larger suite.

Test Case ID: TC02
• Test Scenario: Check customer login with invalid data
• Test Script: 1. Go to website. 2. Enter invalid user ID. 3. Enter valid password. 4. Click “Submit.”
• Test Data: User ID=MSmith; Password=glass99
• Expected Results: User should not log in to the web application.

Test Case ID: TC03
• Test Scenario: User forgets customer login information
• Test Script: 1. Go to website. 2. Click “Forgot User ID or Password.” 3. Send confirmation email.
• Test Data: User email address=msmith@mysite.com
• Expected Results: User should not log in to the application, and a confirmation email should be sent to the user. The screen instructs the user to view the email to reset their user ID and password.

Acceptance criteria example:
Given a user is logged in and looking at their profile,
When the user clicks on the “update profile picture” link below their profile picture,
Then the user is prompted to upload a new picture from their computer directory and the selected picture becomes the current profile picture.
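TC01 and TC02 could be automated roughly as follows. This is a hedged sketch: the `login` function is a toy stand-in for the web application's login check, and a real test would drive the site through a browser-automation tool such as Selenium.

```python
# TC01/TC02 automated as simple assertions. `login` is a toy stand-in for
# the web application's login check; a real test would drive the site
# through a browser-automation tool such as Selenium.
def login(user_id: str, password: str) -> bool:
    return user_id == "MSmith" and password == "pass99"

def test_tc01_login_with_valid_data():
    assert login("MSmith", "pass99") is True      # user should log in

def test_tc02_login_with_invalid_data():
    assert login("MSmith", "glass99") is False    # user should not log in

test_tc01_login_with_valid_data()
test_tc02_login_with_invalid_data()
print("TC01 and TC02 passed")
```

Keeping one test function per test case ID preserves the traceability between the test plan and the automated suite.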
A.3 Example

[Diagram: use cases for the system under test – Log Into System, Find a Group Number, Explanation of Benefits, Electronic Eligibility Payment, Deliver Profile – alongside entities such as Campaign (name, description, timeline, revenue) and Customer (first name, last name, address, phone, city, zip, role, organization), with relationships such as “Create campaign” and “Get customer’s organization.”]

[Chart: usage (% and #) across severity, risk, and impact levels 1-4.]
2. Discuss how business and technical risks will factor into your overall risk level.
4. Calculate your prioritization score by multiplying the risk level and the probability of the issue occurring.
5. Prioritize your test cases in order to build your test schedule.

Materials: Whiteboard; markers
Participants: Product owners and managers; development team; business analyst; QA and testers
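Step 4's prioritization score can be sketched in a few lines of Python. The risk scale (1-4) and probability values are assumptions for illustration; substitute your own severity, risk, and impact levels.

```python
# Step 4 as code: prioritization score = risk level x probability of the
# issue occurring. The risk scale (1-4) and probabilities are assumptions.
test_cases = [
    {"id": "TC01", "risk": 4, "probability": 0.9},
    {"id": "TC02", "risk": 2, "probability": 0.5},
    {"id": "TC03", "risk": 3, "probability": 0.7},
]

for tc in test_cases:
    tc["score"] = tc["risk"] * tc["probability"]

# Step 5: highest score first builds the test schedule.
schedule = sorted(test_cases, key=lambda tc: tc["score"], reverse=True)
print([tc["id"] for tc in schedule])  # ['TC01', 'TC03', 'TC02']
```

The ordering, not the absolute numbers, is what matters: the score simply makes the risk-versus-probability trade-off explicit when building the schedule.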
Source: W3Computing.com.

[Diagram: example process flows – a campaign process (Create a Campaign → Get Finance Approval for budgets above/below $5,000 → Get Email List) and a provider search (Search Services by distance, specialties, language, gender, and accessibility → a single search result populates the phone number → Get Directions or Refer).]
Appendix C
Test Data and Environment Management Good Practices
• It involves collaboration among operations, security, developers, testers, and database analysts (DBAs) to provision,
subset, mask, and manage test data so it meets your organization’s data quality and management standards (irrespective
of where the data comes from and who uses it). The challenge is implementing just enough oversight and discipline, so
teams are neither impeded nor disempowered to pull and manipulate data as they see fit. Even though test data
management (TDM) may not require the same degree of rigor and control as formal data management
practices, some of its key principles can be leveraged to ensure proper data quality, ownership, and approvals are
followed. For example, test data owners must keep active tabs on the value and relevance of their data (i.e. the test data
lifecycle) to determine when test data sets should be tweaked, refreshed, or retired. Refer to Info-Tech’s
Create a Data Management Roadmap blueprint for more details.
• Test data can quickly become stale and irrelevant, since much can be learned and changed as your system is developed, tested, and handled in production. While it’s ideal to refresh test data after any change, test data should be refreshed at least after every major release or after a defined number of changes have been made to the system under test or test configurations. This frequency is ultimately dependent on the effort to refresh the data balanced against the value of production-like data as defined by your functional and nonfunctional requirements.

Test data needs to be protected from unauthorized access.
• Most TDM vendors abide by the General Data Protection Regulation (GDPR) and personal identity information protection standards, such as the Personal Information Protection and Electronic Documents Act (PIPEDA). However, not every vendor goes about data security and privacy the same way (e.g. dynamic masking through an API versus using a GUI to mask data in CSV files), nor abides by all industry-specific compliance requirements, such as the Health Insurance Portability and Accountability Act (HIPAA), the Gramm-Leach-Bliley Act (GLBA), the Safe Harbor Privacy Principles, and the Payment Card Industry Data Security Standard (PCI DSS).
Build a test data self-service model.
• Test data must be ready to be used when it is needed, whether testing is proactively scheduled or done on demand as part
of your continuous delivery pipeline. Ensure your test data is reset automatically as part of your test run if it’s determined
it will be used again. Otherwise, collaborate with your data stewards to ensure your repository contains the appropriate
test data.
Adopt effective test data
management practices (continued)
Mask and obfuscate your test data with a plan.
• Company standards and industry regulations may limit the exposure of production data to
non-production users, including testers. The challenge here is determining how much of
the data should be altered to comply with these constraints while maintaining
the correct level of fidelity. Working with your DBAs, security team, and other
stakeholders, develop the appropriate data-masking profiles using frameworks and
transformation tactics (e.g. conditional masking and compound masking). Support your
data-masking profiles with applications (e.g. Oracle) and toolsets, tailor them against
regulations (e.g. GDPR), and determine whether masking should be performed
statically or dynamically. Refer to Provision and Mask Your Test Data With the Right Tool for
more information on the tools in the test data management (TDM) space.
Develop a synthetic data generation practice.
• Production data may not be ready or the right fit for testing, or the cost to prepare and
manage it outweighs its testing value. Teams can address this by generating synthetic
test data sets: algorithmically generated repositories that resemble (but do not contain),
or are derived from, real, existing information. Noise and other
discrepancies can be artificially injected in a controlled manner to stress your
applications. Sufficiently accurate synthetic data may remove the need for production
data altogether; however, this requires significant investment in and maturity of your data analytics
practice.
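The generation-plus-noise idea above can be sketched in a few lines of Python. The record shape, the noise rate, and the corruption tactic are all illustrative assumptions:

```python
import random
import string

def synthetic_customers(n, noise_rate=0.1, seed=42):
    """Generate customer records that resemble (but do not contain) real data,
    injecting controlled noise to exercise validation and error paths."""
    rng = random.Random(seed)  # seeded, so runs are reproducible
    records = []
    for i in range(n):
        email = f"user{i}@example.test"
        # Controlled noise: corrupt a fraction of emails to stress the app.
        if rng.random() < noise_rate:
            email = "".join(rng.choices(string.ascii_letters, k=8))  # malformed
        records.append({"id": i, "email": email, "age": rng.randint(18, 90)})
    return records

data = synthetic_customers(100)
invalid = [r for r in data if "@" not in r["email"]]
print(len(data), len(invalid))
```

Seeding the generator matters: a failed test can be reproduced exactly, which is one of the practical advantages synthetic data has over sampled production data.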
Adopt effective test environment management practices
Apply a lightweight environment management model.
• Budget, resourcing, time, security, and other IT and business risks often limit a tester’s ability to test in high-fidelity
environments. Testers must then use, manage, and monitor contained, isolated environments (e.g. containers) that are
often provided by your operations teams with the appropriate controls, mocks, stubs, and other alterations in place. Even
though test environments do not require a heavyweight management model, adopting some formal management
practices in collaboration with your operations teams is often enough to see noticeable benefits (Plutora, 2020):
package and publish test environments for easy sharing through self-service, and track, realign, and tune
preconfigured assets to keep the test environment aligned with production updates, minimizing variations and
misrepresentations. Refreshes should occur whenever a change may influence the functionality and operations
of the application.
Mock and stub your environments with a plan.
• Tests should only fail when there is a failure in the code or the components being tested, not the environment. Testers
collaborate with operations to decide which third-party code and external system integrations will be
mocked and stubbed in the test environment so tests can be executed free from dependencies. Mocking involves
creating a service or object with properties similar to the real one. Stubbing only simulates the behavior and function of a real
service or object. Live integrations can replace your mocked and stubbed objects and dependencies once the appropriate
system integration and regression tests are complete.
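The distinction above can be made concrete with Python's standard `unittest.mock` library. The payment component and gateway below are hypothetical examples, not from any specific codebase:

```python
from unittest import mock

# A hypothetical component that depends on an external payment gateway.
def checkout(gateway, amount):
    if gateway.charge(amount):
        return "paid"
    return "declined"

# Stub: a minimal object that returns canned behavior; no external call is made.
class GatewayStub:
    def charge(self, amount):
        return True  # always succeed

# Mock: also records how it was used, so the interaction itself can be verified.
gateway_mock = mock.Mock()
gateway_mock.charge.return_value = True

print(checkout(GatewayStub(), 10))  # prints "paid" (behavior via the stub)
print(checkout(gateway_mock, 10))   # prints "paid" (behavior via the mock)
gateway_mock.charge.assert_called_once_with(10)  # interaction verified
```

The stub only lets the test run free of the real dependency; the mock additionally lets the test assert that the dependency was called correctly, which is what distinguishes the two in practice.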
• While mocks and stubs can help tests run more quickly and reliably, how and why they are used can
significantly influence the tests that depend on them. Design an approach for when and
how mocks and stubs will be used, when they will be replaced with real system components, who should create them,
and the framework for their design and implementation. See June Jung’s
How to Test Software, Part I: Mocking, Stubbing, and Contract Testing from DZone and Mocks Aren’t Stubs by Martin Fowler
for more information.
Use containers when they make sense.
• “A container is a collection of code, a configuration, and runtime dependencies all bundled together with an execution
engine on a virtual machine” (TechBeacon). The ease and speed with which containers can be provisioned, updated,
secured, and managed make them an attractive test environment option for developers testing early, and can satisfy
the role-delineation, infrastructure, and security concerns of operations. Despite the benefits, containers may not provide the
environmental conditions or flexibility that some tests require so virtual machines and other traditional
environment technologies may still be necessary. See Info-Tech’s Containers Survival Guide for Infrastructure for more
information.
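As a sketch of how a test harness might provision a disposable container environment, the Python below only assembles the `docker run` invocation; actually executing it requires a container engine, and the image and settings shown are illustrative assumptions:

```python
def container_run_cmd(image, env=None, ports=None, name="test-env"):
    """Build a `docker run` command for an ephemeral test environment.
    --rm ensures the container is discarded after the test run."""
    cmd = ["docker", "run", "--rm", "--detach", "--name", name]
    for key, value in (env or {}).items():
        cmd += ["--env", f"{key}={value}"]
    for host, guest in (ports or {}).items():
        cmd += ["--publish", f"{host}:{guest}"]
    cmd.append(image)
    return cmd

# Usage: a throwaway database for an integration test run.
cmd = container_run_cmd("postgres:16",
                        env={"POSTGRES_PASSWORD": "test"},
                        ports={5432: 5432})
print(" ".join(cmd))
```

Keeping the command construction separate from execution makes the provisioning logic itself unit-testable, echoing the point above that containers suit early, developer-driven testing while heavier environments remain with operations.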