Testing an OnBase Solution
Technical Support
The information contained in this document is subject to change without notice and does not
represent a commitment on the part of Hyland Software, Inc. Please contact your first line of support
to request any updates to this documentation.
Hyland Software® and OnBase® are registered trademarks of Hyland Software, Inc. Application
Enabler™ is an unregistered trademark of Hyland Software, Inc. All other trademarks, service marks,
trade names and products of other companies are the property of their respective owners.
Intended Audience
This document is intended for OnBase administrators preparing to test a new solution deployment or
implementation of OnBase.
Abstract
This document covers a number of testing methodologies and recommendations that can be used when preparing for a new solution deployment or implementation.
Applicable Versions
This document is not specific to an OnBase version.
Table of Contents
Intended Audience
Abstract
Applicable Versions
Table of Contents
Introduction
New Solution vs. Existing Solution Testing
Upgrading
The Solution Testing Lifecycle
Quality Assurance (QA) Testing Phase
User Acceptance (UA) Testing Phase
Purpose of a Test Plan
Creating / Designing a Test Plan
Building Your Testing Team
Know Your Solution
Identify Your Testing Goals
What’s in a Proper Test Plan
Where to Start
Writing Your Test Plan
What to Test
Developing Test Cases
Pre-Implementation versus Post-Implementation Test Cases
Create Re-Usable Test Cases
Arrange Test Cases for Efficiency
Prioritize Your Test Cases
Smoke Tests
Before Executing the Test Plan
Get Leadership Buy-In
Build the Testing Team
Tester Responsibilities
Technical Testers
Introduction
Whether performing an upgrade or rolling out a new or modified solution, testing is critical to a
successful OnBase implementation. This document is designed to highlight some key strategies for
testing an OnBase solution in order to mitigate risk.
An OnBase solution can be anything within the OnBase software that is used to accomplish a task. For instance, a Workflow process to manage Human Resource Onboarding is a solution, and so is a WorkView Case Management solution to manage customer accounts through a Legal Process.
Another example of a simpler solution is rolling out a new scanning process.
It is important to note that there is no perfect test plan or way to execute it. OnBase solutions differ
greatly from one implementation to another and network environments differ in complexity. To
further complicate matters, supporting staff’s knowledge of the system is often compartmentalized.
This is why there is no single test plan that is appropriate for all customers.
This guide has been written generically for any solution and is provided to help you better understand
the Solution Testing Lifecycle, help you construct your test plan and build your supporting team. The
knowledge that you have of your system and the support of your implementation team are both vital
to devising a successful test plan that meets your specific needs.
New Solution vs. Existing Solution Testing
This document focuses on testing two types of solutions: new solutions and existing solutions. While many of the topics relate to both, there are some key differences between the two. For example, when you develop a new solution, there is typically no existing test plan or test cases for the solution. In this case, you will need to devote time to working with the end users to create the test plan and test cases. However, with an existing solution, it is likely that much of this content will already have been created.
Upgrading
Performing an upgrade to a new version of OnBase requires more extensive testing than testing the
roll out of a new solution. During an upgrade, all aspects of an OnBase system can be affected. In
most cases it’s best to use the Incremental/Parallel Upgrade (IPUP) method when upgrading the
software. Though IPUP can significantly mitigate the inherent risk of upgrading, it’s still important to
thoroughly test behavior across the solution.
NOTE: For more information on Incremental/Parallel Upgrades, review the Mitigating Risk in
OnBase Upgrades White Paper in the Technical section of OnBase Community;
https://community.hyland.com/technical/upgrades.
The Solution Testing Lifecycle
After the solution is handed off to the end users, the end users are responsible for thoroughly testing the solution according to their business and verifying that it meets all requirements. The combination of Project Team testing and End User testing should validate the solution prior to moving it into production.
This guide divides the Solution Testing Lifecycle into two distinct efforts: Quality Assurance Testing and User Acceptance Testing (UAT). The following diagram illustrates the post-development Solution Testing Lifecycle.
Quality Assurance (QA) Testing Phase
The Quality Assurance (QA) Testing Phase focuses on things such as system availability, high volume processing, system response times, user access and integration performance. This phase of testing is performed by the Project Team to ensure that the solution will work for the end users and is made up of the following:
Develop a Test Plan: The Project Team develops a test plan to be executed during the QA
Test Plan Execution phase.
Test Plan Sign-Off: All members of the Project Team (including the process owners) agree
on the test plan and documented test cases.
Test Plan Execution: Members of the Project Team execute the QA portion of the test plan.
Resolving Issues: Issues are identified, resolved and documented during the QA test plan
execution phase.
User Acceptance (UA) Testing Phase
The User Acceptance (UA) Testing Phase focuses on areas such as system usability, messaging and alerts, adherence to business process requirements and accessibility. This phase of testing is performed by the End Users to ensure that the solution meets the end user requirements and is made up of the following:
UA Test Plan Execution: The UA Test Plan Execution stage of the User Acceptance (UA)
Testing Phase focuses on the end user testing of the test plan.
Resolve Issues: The Resolve Issues stage of the User Acceptance (UA) Testing Phase
includes the documenting and resolution of any identified issues during the UA Test Plan
Execution stage.
Solution Sign-Off: The Solution Sign-Off notes the agreement between the members of the
Project Team and the end users that the solution has been tested and meets all requirements
needed for Production readiness.
Provide Solution Documentation: After the Solution Sign-Off, the Project Team will
complete the documentation for the solution.
Solution Go Live: The final step of the process after the Solution Sign-Off, implementing the solution in Production.
Purpose of a Test Plan
As you embark on the process of creating a test plan, keep in mind that the process will take time.
Don’t assume that your first attempt will result in a final, permanent document. Your test plan should
be an evolving resource that will be updated and modified as the need arises. The more time you take
when developing your test plan, the more thorough and complete it will be.
Creating / Designing a Test Plan
Your test plan should be nearly completed before you start testing. It should be a collaborative effort between the solution business analyst and the development team and should include test cases that account for testing all necessary business processes.
The following sections discuss the necessary components of the test plan and the steps for building one that meets your needs.
Building Your Testing Team
It’s important to include all stakeholders in the team who will create and execute the test plan.
Anyone who has ownership of a part of the system is a responsible party. Ownership extends beyond
the OnBase Administrator to members of your server or networking teams. Incorporating their
experiences and knowledge of the overall system will help to create a more complete test plan for both
the Quality Assurance (QA) Testing Phase and the User Acceptance (UA) Testing Phase.
The following list should help to identify individual stakeholders who should be part of the Testing
Team:
OnBase Administrator: Main point of contact to help users with issues
o Sets up computers for testing. This includes all OnBase software, third-party software, databases, hardware, etc.
o Primary tester during the Quality Assurance (QA) Testing Phase
o The first line of support during the UA Testing Phase
o Sets up scanners and other hardware for testing
o Validates software deployment packages

Subject Matter Experts (SME): Validate the solution
o Provide the success criteria for all test cases
o Typically involved with the User Acceptance (UA) Testing Phase
o May help execute test cases created by the Test Plan Coordinator

Infrastructure Resource: Main point of contact for all Infrastructure defects
o Assists with resolving any defects related to the infrastructure and customer environment

Application Specialist(s): Main point of contact for all third-party application testing and defects
o Tests the integration components with any third-party applications
o Translates any errors from the third-party application and assists with defect resolution

Test Plan Coordinator: Principal resource for generating test plans and overseeing testing efforts
o Creates test plans beginning at the end of discovery
o Ensures test plans are available for the beginning of customer testing activities
o Verifies the solution conforms to business scenarios, per the requirements documentation

Tester: Responsible for testing of the configured solution
o Executes test plans generated by the Test Plan Coordinator
o Dedicates significant time to the testing effort and is allocated by management
o Reports issues
o Retests issues that are potentially resolved
o At times, the same person as the SME(s)
Know Your Solution
Knowing your solution is a crucial factor in successful testing. Having a strong knowledge of how the solution is implemented, integrated and used helps to identify the areas of focus for your testing.
Take some time to work with the business process owners. They can describe business objectives and explain how the software has been implemented to achieve them. A good rule of thumb to keep in mind is that end users know the processes better than most administrators, and they can show you how they use the system. Understanding the end user perspective is valuable, for instance, when testing an upgrade: upgrading should not change the business process, and end users should be able to use the system in the same way as before the upgrade.
Identify Your Testing Goals
There are many aspects of your solution that can require testing. Identifying testing goals will help you to select appropriate test cases that align with those goals. Start by identifying which of the following you want to test:
Performance testing
Functionality testing
Software deployment
Installation
Another consideration as you are developing your goals is the type of project you will be testing. For
example, certain objectives may be considered differently if the testing is for a new solution versus for
a solution expansion or upgrade. This will serve to frame your perspective as you create your test
plans.
What’s in a Proper Test Plan
As you create your test plan, keep in mind that there are different types of testing methods available. Manual testing is the most common and will typically take more effort and time due to the need for human interaction. The benefit of manual testing is that it can capture the variance that comes with human interaction with the system. On the other hand, automated testing can be useful for testing performance and simulating production-level load on the system in ways that might not be possible with manual testing.
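To make the difference concrete, the following sketch shows what a single automated functional check might look like: it wraps one scripted action, records whether it passed, and captures how long it took so it can be repeated on every build. The retrieve_document function here is a hypothetical placeholder for a scripted action against your solution, not an actual OnBase API call.

```python
# Minimal sketch: one automated functional check that records its outcome and
# duration so it can be repeated on every build. retrieve_document is a
# placeholder for a scripted action against your solution, not a real OnBase call.

import time

def retrieve_document(doc_number):
    return {"doc_number": doc_number, "pages": 3}  # pretend retrieval result

def automated_check():
    start = time.perf_counter()
    result = retrieve_document("INV-2043")
    elapsed = time.perf_counter() - start
    passed = result is not None and result["pages"] > 0
    return {"case": "Retrieve a document", "passed": passed, "seconds": round(elapsed, 4)}

if __name__ == "__main__":
    print(automated_check())
```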
NOTE: Examples of a Test Plan and the parts of a Test Case can be found in Appendix A.
Where to Start
Start by reviewing the solution requirements. These are the requirements that specify what business needs the solution was designed to address and how it should address those needs. They should be included in the Solution Design document. Use these requirements to build test cases that prove the
individual requirements are being met. If you have any questions about the requirements, get
clarification from appropriate individuals. By including business process owners in your test plan
creation, you can draw on their expertise and enhance your own familiarity with the processes you
will be testing.
Writing Your Test Plan
As part of your test plan you will be writing test cases. When creating test cases, be sure to use simple language and a straightforward writing style. This is important for clarity and long-term residual value. A well-written test plan should allow anyone to execute a given test case without ambiguity. Although you may be the individual currently testing this particular case, it is very likely someone else will be the tester in the future.
The test case description should be short and should clearly and uniquely identify the test scenario.
Avoid creating multiple test cases with the same description as this will likely lead to confusion or
oversight.
Within the steps of each test case, refrain from using “if/then” conditions. Using “if/then” conditions will unnecessarily complicate the test cases and can create ambiguity because there is no single result. You should always try to address only a single test scenario in each test case. Be sure to provide sample data, where applicable, as this will help when executing the test cases.
What to Test
Thoroughly test all aspects of the solution during the execution of the test plan. Each implementation
is unique, using different features of the software and implemented differently with respect to
architecture and environmental configuration. For this reason, there is no one comprehensive list of
items that can identify what should be tested. Additionally, the way users use the system will differ
from one implementation to another, which is why knowledge of your system is so crucial.
While there is no definitive list of items to test, consider the following high-level list when building
your test plan:
OnBase Configurations
o Keywords
o Notes
o Scripts
o Workflow
   - Notifications
   - System Work
   - Ad-hoc tasks
   - Security: Life Cycles, Queues & Ad-hoc tasks
   - Timers
   - Work Folders
   - Filters
o Forms (E-Forms and Unity Forms)
OnBase Architecture
o Server Operating Systems with OnBase prerequisites
o Security outside of OnBase related to the system
o Performance benchmarking comparisons to baselines
o 3rd Party Integrations
Developing Test Cases
The test cases constitute the individual tests and outcomes to be executed during the testing phases. Not all test cases should be executed during every testing phase. For this reason, there are some additional considerations to keep in mind as you develop your test cases to help determine when and how each test case should be executed.
Pre-Implementation versus Post-Implementation Test Cases
In most cases, you will spend the majority of your time developing test cases that will be used during the comprehensive testing phase that takes place prior to the roll out of the solution or the upgrade. These test cases are used to prove that the software works as expected. This testing is done in your test environment. However, it is also important to devote time to creating validation test cases that will be used after the solution is rolled out and being used in a production environment. These test cases will be used to validate that there were no issues with the implementation of the solution.
Create Re-Usable Test Cases
Build your test cases in a way that will allow them to be easily re-used. This will save you time in the
future when you need to upgrade or implement a new solution. Taking the time and care to create
solid test plans will pay off in the end. As your system grows, you’ll update the existing plans and add
new ones.
Arrange Test Cases for Efficiency
Arrange your test cases in the order in which you want to execute them. This helps to ensure that your system will meet the conditions necessary for subsequent tests. For example, before testing that User modification and User deletion work as expected, you’d want to test that User creation works. A simple ordering might look like the following:
1. Scan a document
2. Index a document
3. Retrieve a document
4. Test Security Keywords
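As a rough illustration of this ordering principle, the following sketch executes test cases in sequence and stops once a prerequisite case fails, since later cases depend on the state created by earlier ones. The step functions are hypothetical placeholders, not actual OnBase calls.

```python
# Minimal sketch: execute ordered, dependent test cases and stop the chain
# when a prerequisite case fails. The step functions are hypothetical
# placeholders for real actions (scan, index, retrieve, security check).

def scan_document():
    return {"doc_id": 101}          # pretend a document was scanned

def index_document(doc_id):
    return True                      # pretend indexing succeeded

def retrieve_document(doc_id):
    return True                      # pretend retrieval succeeded

def check_security_keywords(doc_id):
    return True                      # pretend the security check passed

def run_ordered_cases():
    results = {}
    doc = scan_document()
    results["TC-01 Scan a document"] = doc is not None
    if not results["TC-01 Scan a document"]:
        return results               # later cases depend on a scanned document

    results["TC-02 Index a document"] = index_document(doc["doc_id"])
    results["TC-03 Retrieve a document"] = retrieve_document(doc["doc_id"])
    results["TC-04 Test Security Keywords"] = check_security_keywords(doc["doc_id"])
    return results

if __name__ == "__main__":
    for case, passed in run_ordered_cases().items():
        print(f"{case}: {'Pass' if passed else 'Fail'}")
```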
Prioritize Your Test Cases
As you develop your test cases, note the level of importance of each test case. Some will be crucial to the success of the project, while others may not be required in order to go live.
Smoke Tests
Smoke Testing is high-level testing used to validate that changes in the software do not destabilize the
overall solution or cause catastrophic errors. Smoke Tests are often used when there are changes
made to the foundation of a solution. They are not meant to be exhaustive tests geared toward
validating the entire solution.
For example, you might use a Smoke Test when deploying a newer build of the software after the
entire solution has already been tested on a new version. In this case, specific tests around the
changes between the builds are necessary as well as smoke testing general functionality.
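One lightweight way to support this is to flag individual test cases as smoke tests and run only that subset after a new build is deployed. The sketch below assumes test cases are kept as simple records with an illustrative smoke_test flag; it is not tied to any particular Test Management Software.

```python
# Minimal sketch: filter a test suite down to the cases flagged as smoke tests.
test_cases = [
    {"id": "TC-01", "description": "Scan a document", "smoke_test": True},
    {"id": "TC-02", "description": "Index a document", "smoke_test": True},
    {"id": "TC-14", "description": "Verify timer-based Workflow routing", "smoke_test": False},
]

smoke_suite = [case for case in test_cases if case["smoke_test"]]

for case in smoke_suite:
    # Execute only the high-level checks after a new build is deployed.
    print(f"Run smoke test {case['id']}: {case['description']}")
```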
Before Executing the Test Plan
Get Leadership Buy-In
Verify that you have buy-in from your leadership team. This will undoubtedly help in case you run into problems. For instance, it could turn out that you need additional resources to fully complete testing on schedule, or you may need assistance from individuals in another department. Having your organization’s leadership on board helps ensure that you’ll face fewer roadblocks.
Build the Testing Team
Consider the strengths of each individual on your testing team. The following are important skills for a good software tester to have:
Attention to Detail: Finding the obvious issues is typically easy, but finding the more
elusive issues takes considerable skill. Being a good observer and paying attention to the small
stuff can have significant impact on the quality of testing.
Think Outside the Box: A good tester has the ability to look beyond the obvious and
approach testing with a sense of curiosity. Good testers use their experience and intellect to
analyze scenarios and consider consequences. This makes them more likely to identify
problems in other areas of the software, which may not have been tested.
Time Management: Good time management skills help testers to work efficiently. This ensures that all prioritized test cases are given adequate time.
Think from a Customer’s Perspective: It’s important to be able to think from an end-user perspective. End users will have a wide range of expertise and experience, which should be considered when testing.
Tester Responsibilities
In order to provide the necessary breadth of testing, a good testing team is composed of people who
have varying knowledge of technology, the solution, and its related processes. The following is a list
of some of the types of testers you should consider for your team.
Technical Testers
Technical Testers test general configuration and functionality during the Quality Assurance phase of
testing. They are typically OnBase Administrators, Network Administrators, Database
Administrators, Server Administrators, etc. These individuals will be responsible for making sure
that the system is operational and performing as expected. In addition, your technical testers should
include those people responsible for integrations with other systems.
Power Testing Users
The Power Testing Users help test specific areas of the solution during the User Acceptance Testing
phase. These testers use their experience in specific areas of the solution to uncover less obvious
issues. They should be engaged after the Technical Testers have determined that the base solution
works as expected.
General User Base
The General User Base is composed of typical users. It can be helpful to include them towards the end of the User Acceptance Testing phase, after the Power Testing Users have completed their testing. Have these users perform their typical work with typical documents and make sure the solution works as expected. There is no need for a specific test plan or specific test cases for these users. General User Base testing is done to reveal critical system issues that may have been overlooked during discovery, implementation, and initial testing. This is not an open invitation for scope creep, but rather a more in-depth phase that may uncover mandatory changes to the system that are required prior to go-live and could potentially result in delays.
Test Environment
The test environment you create should resemble your Production Environment as closely as possible. Differences between your Production and Test environments could invalidate your testing efforts and introduce issues into your Production Environment.
NOTE: For more information on creating a test environment, review the Creating an OnBase
Test Environment White Paper in the Technical section of OnBase Community;
https://community.hyland.com/technical/testing-onbase/resources.
Warning Messages
Any warning or error message that you encounter during testing is a potential production issue and
should be noted. Be sure to investigate all messages in order to understand their root causes and
potential effects. While not all messages are problematic, it’s important to understand the cause of
the message. This understanding helps to prevent potential issues in Production and contributes to a better understanding of the overall solution.
Test Data
Test data used during the execution of a test plan should either closely resemble or be identical to the data being used in Production. Many processes within Production will be tailored to the specific data being processed. If you are not testing with similar data, it is possible that your testing results will not accurately reflect what will happen in Production.
When to Stop Testing
It’s important to know when it’s time to stop testing. Under ideal circumstances, testing would continue until there was a near certainty that the solution is functioning 100% as expected. However, budgets, time constraints, and limited resource availability usually make such a lengthy process impractical. In most cases, testing can be stopped once you have successfully completed all test cases that focus on system functional requirements. Take into account the priority level of the test cases and the following to help determine when to stop testing:
An acceptable percentage of completed test cases that cover the functional requirements of the
system has been met
The known issues rate falls below a determined acceptable level
High priority issues are resolved
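As a simple illustration of how such exit criteria might be tracked, the sketch below computes a completion percentage, a failure rate and a count of open high-priority cases from a list of test case results. The threshold values are examples only and should come from your own project criteria.

```python
# Minimal sketch: compute completion and failure rates from test case results
# to help decide whether testing can stop. Thresholds are illustrative only.

results = [
    {"id": "TC-01", "priority": "High", "status": "Pass"},
    {"id": "TC-02", "priority": "High", "status": "Pass"},
    {"id": "TC-03", "priority": "Medium", "status": "Fail"},
    {"id": "TC-04", "priority": "Low", "status": "Not Run"},
]

executed = [r for r in results if r["status"] in ("Pass", "Fail")]
completion_rate = len(executed) / len(results)
failure_rate = (sum(r["status"] == "Fail" for r in executed) / len(executed)) if executed else 0.0
open_high_priority = [r for r in results if r["priority"] == "High" and r["status"] != "Pass"]

# Example exit criteria: 95% of cases executed, under 5% failing, no open high-priority cases.
can_stop = completion_rate >= 0.95 and failure_rate < 0.05 and not open_high_priority
print(f"Completion: {completion_rate:.0%}, failure rate: {failure_rate:.0%}, stop testing: {can_stop}")
```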
Another reason to stop testing is when a significant number of test cases are failing repeatedly. There
is no reason to continue execution until the interfering obstacles can be adequately addressed.
Execution should be resumed after addressing any major issues.
Load Testing
Load Testing is the process of putting demand on a solution and measuring the outcome. It is
performed to determine a system's response behavior under both normal and anticipated peak load.
While load testing specifics are outside of the scope of this document, it’s important to consider Load
Testing in order to identify issues that you may face in production.
Automated load testing can be helpful in some scenarios. These are typically cases where there is no need for user interaction. In some cases, custom-built applications can be used to perform automated testing of the system. Within these applications, timestamps can be written to a log file in order to record benchmarks for comparison purposes.
The complexities required for Load Testing make it very hard to mimic your general users’ interaction with the system outside of leveraging the OnBase API (Application Programming Interface). For individuals who are OnBase API Certified, custom application development could provide the ability to create a Load Testing application tailored to specific aspects of your system.
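The sketch below shows the general shape such a custom load-testing application might take: several worker threads repeatedly invoke an operation, and each call is timestamped and logged to a file for later benchmark comparison. The perform_operation function is a stand-in for a real call into your solution and is not an actual OnBase API call.

```python
# Minimal sketch of an automated load test: several worker threads call a
# stand-in operation repeatedly, and each call's timestamp and duration are
# appended to a log file for benchmark comparison. perform_operation is a
# placeholder for a real call into your solution, not an actual OnBase API call.

import csv
import threading
import time
from datetime import datetime

def perform_operation():
    time.sleep(0.05)  # placeholder for real work, such as a document retrieval

def worker(name, iterations, log_path, lock):
    for i in range(iterations):
        start = time.perf_counter()
        perform_operation()
        elapsed = time.perf_counter() - start
        with lock:
            with open(log_path, "a", newline="") as log:
                csv.writer(log).writerow([datetime.now().isoformat(), name, i, f"{elapsed:.4f}"])

def run_load_test(thread_count=5, iterations=20, log_path="load_test_log.csv"):
    lock = threading.Lock()
    threads = [
        threading.Thread(target=worker, args=(f"worker-{n}", iterations, log_path, lock))
        for n in range(thread_count)
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

if __name__ == "__main__":
    run_load_test()
```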
NOTE: Further discussion of Automated Load Testing is currently outside the scope of this document. It requires considerable knowledge of the OnBase software, programming languages and the APIs.
Performance Testing
Performance testing is the process of determining the speed or effectiveness of the overall solution. Because users’ perceptions of performance can vary, it’s important to establish objective benchmarks for acceptable performance. A simple way to accurately measure performance times is to use a stopwatch to time a process.
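If you prefer to capture timings programmatically rather than with a stopwatch, a small helper such as the sketch below can record elapsed times for a repeated operation and compare the average against an agreed-upon benchmark. The sample operation and the benchmark value are placeholders.

```python
# Minimal sketch: time a repeated operation and compare the average elapsed
# time against an agreed benchmark. The operation and benchmark value are
# placeholders to be replaced with your own process and target.

import statistics
import time

def sample_operation():
    time.sleep(0.1)  # stands in for a real end-user action, e.g. opening a document

def timed_runs(operation, runs=10):
    durations = []
    for _ in range(runs):
        start = time.perf_counter()
        operation()
        durations.append(time.perf_counter() - start)
    return durations

if __name__ == "__main__":
    benchmark_seconds = 0.5  # example target agreed upon with the business
    durations = timed_runs(sample_operation)
    average = statistics.mean(durations)
    print(f"Average: {average:.3f}s (benchmark {benchmark_seconds:.3f}s) "
          f"{'PASS' if average <= benchmark_seconds else 'FAIL'}")
```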
NOTE: For more information on benchmarking your OnBase environment, review the Central
monitoring of an OnBase Solution White Paper in the Technical section of OnBase Community;
https://community.hyland.com/gallery/items/53576-monitoring-of-an-onbase-solution-18.
Issue Tracking and Resolution
Proper issue definition and reporting is critical to timely issue resolution. The following Process Overview outlines the flow of the Issue Tracking and Resolution process.
Process Overview
The following diagram provides a high level overview of the Issue Tracking and Resolution process.
Testers report issues as they work through a test case in the test plan. Issues are reviewed by Project
Lead(s), who determine if the reported issue contains a valid problem related to the solution. If this
issue does not identify a valid problem, it is closed. If the reported issue identifies a valid problem,
the Project Lead(s) assign it to an owner to determine whether or not the problem is an issue with the
software itself or with the configuration of the software. If configuration changes cannot resolve the
issue, the Project Lead(s) engages the first line of support to request a change to the software (i.e. an
Enhancement or Defect Software Change Request).
You should log all discovered issues within an Issue Log to track them during testing. Testers should
supply the following information when reporting issues:
Test Case ID
Test data used (if appropriate)
Test procedure/steps
Date performed
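However issues are captured, keeping the entries structured makes them easier to review and report on. The sketch below models an issue log entry containing the reporting fields listed above plus a status field; the field names are illustrative and can be adapted to whatever tracking mechanism you use.

```python
# Minimal sketch: a structured issue log entry capturing the reporting fields
# listed above. Field names are illustrative and can be adapted to whatever
# tracking mechanism (email, form, WorkView application) you use.

from dataclasses import dataclass
from datetime import date
from typing import List

@dataclass
class IssueLogEntry:
    test_case_id: str
    test_data: str                  # test data used, if appropriate
    steps: List[str]                # test procedure/steps that produced the issue
    date_performed: date
    reported_by: str
    status: str = "Open"            # e.g. Open, Assigned, Resolved, Closed

issue = IssueLogEntry(
    test_case_id="TC-07",
    test_data="Invoice 2043, vendor ACME",
    steps=["Scan the sample invoice", "Index with the Invoice Document Type", "Retrieve by Invoice Number"],
    date_performed=date.today(),
    reported_by="J. Tester",
)
print(issue)
```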
NOTE: The process of logging issues could be through email to the Project Lead(s) or through another implemented process. For example, a user could submit a form which then tracks the issue within Workflow and WorkView.
Most issues can be distinguished as either a Solution Design issue or a Software issue. A Solution Design issue refers to any issue found where the business processes implemented within the Software do not match the expected Solution Design requirements. A Software issue refers to any issue found where the OnBase application in which the Solution is implemented is not functioning to specification.
The issues will fall into one of the six categories shown below:
Solution Design
o Functioning to Specifications
This issue has been reported as a perceived problem with the solution, but is
actually functioning according to the agreed upon requirements. Issues
identified as functioning to spec often identify training opportunities.
o Defect
The solution is not functioning according to the solution specifications (e.g.
Solution Design document). This issue should be reviewed by the Project Team
to determine if the defect can be resolved.
o Enhancement
This is a case where there is an issue which was never documented as a
requirement of the solution specifications (e.g. Solution Design document).
Enhancements should be discussed between the Project Lead and the Business
Process Owner and a plan of action determined.
Software
o Functioning to Specifications
This issue has been reported as a perceived problem with the software, but is
actually functioning according to the software specifications (e.g., help files or
Module Reference Guide). Issues identified as functioning to spec often
identify training opportunities.
o Defect
This issue has been reported as a problem. The software is not functioning as designed according to the software specifications (e.g., help files or Module Reference Guide) and requires Hyland Software to fix the problem via a Software Change Request (SCR).
o Enhancement
This issue has been reported as a new feature to the software as there is no
current feature in place to provide the needed functionality. This will require
the engagement of Hyland Software to add the feature to the software via a
Software Change Request (SCR).
It is important to revisit the test plan shortly after the completion of the project in order to fill in any gaps discovered during the testing process. These could include individual test cases that were found to be necessary or expanded notes in an individual test case that provide additional details about the intended result.
NOTE: There are many Test Management Software applications for building, tracking and managing Test Cases for a Test Plan. The example above uses Microsoft Excel; however, this tool can be limited. If your system includes WorkView as part of your OnBase Solution, consider building an Application to manage this process.
NOTE: Additional columns could be created to reference Support Issues opened with your first line
of support or to keep track of Software Change Requests specific to the Test Case.
The test case template example can be used for any project and adapted to any Test Management Software platform.
Test Case ID
This is a unique ID to help reference the specific test case. The Test Case ID does not need to be a
complex value, but should provide easy access to the specific test case. The taxonomy used for
cataloging the Test Case IDs should also allow for removal or insertion of additional test cases as it
becomes necessary to evolve your Test Plan.
Description
The description of the test case should be concise and reveal the purpose of the test case. While some test cases will have similar descriptions, make sure that they are all unique.
Testing Prerequisites
In the Testing Prerequisites field, you should explain any activities that the tester needs to carry out
before executing the Testing Steps. They may need to add test data, perform other functions, execute
other test cases, or navigate to a particular part of the system. The Testing Prerequisites field is not
relevant to all test cases and should only be included where there is a need. Make sure to clearly
describe any prerequisites or reference previous test cases in order to provide the necessary
information to the tester so they may conduct the test accurately.
Testing Steps
Each test case will have a list of detailed execution steps. These steps are used to instruct the tester
how to recreate the test case. They should provide guidance on how to execute the test case to
someone who might not be familiar with the process. When documenting the steps of a test case,
make sure to provide enough detail so there is no ambiguity as to what is expected and how to
perform the steps.
Smoke Test
The Smoke Test field will help determine if the specific test case should be executed as part of a
Smoke Test of the solution.
Testing Phase
The Testing Phase will help to determine if the test should be completed as part of the Quality
Assurance (QA) Testing Phase or the User Acceptance (UA) Testing Phase.
Priority
Priority indicates how urgent and/or important a test case is to the solution’s success. All test cases can’t be of equally high priority, and priority can help determine the amount of time you’ll want to allocate for their testing. Priority should be based on the functional requirements of the solution being tested. By assigning a priority level to the test cases, you will have a guide to use to focus the balance of your testing should you run up against time constraints during test execution.
Test Case Tester
The Test Case Tester is the name of the tester executing the test case. The tester’s name is used for auditing the test plan.
Test Date
The Test Date is the date when the test case execution occurred.
Test Data
The Test Data field refers to any specific data that is needed in order to execute the test case.
Expected Result
The Expected Result field provides the test case author an opportunity to document the anticipated
outcome of the test case.
Actual Result
The Actual Result provides the tester a place to document the results from the test case.
Pass/Fail
The Pass/Fail field correlates to the outcome of the test case. If the Expected Result matches the
Actual Result, then the test case passed. If there are differences between the Expected Result and the
Actual Result, then the test case failed and should be reviewed for discrepancies.
Comments
The Comments field provides the Tester a location to document any additional information that helps to describe the output of the test case.
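Taken together, the fields described above can be represented as a single record in whatever tool you use. The sketch below models the test case template as a Python dataclass purely for illustration; a spreadsheet row or a record in Test Management Software would carry the same fields.

```python
# Minimal sketch: the test case template fields described above, modeled as a
# single record. This mirrors a spreadsheet row and is for illustration only.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class TestCase:
    test_case_id: str
    description: str
    testing_prerequisites: Optional[str]   # only where there is a need
    testing_steps: List[str]
    smoke_test: bool
    testing_phase: str                     # "QA" or "UA"
    priority: str                          # e.g. High / Medium / Low
    test_data: Optional[str]
    expected_result: str
    tester: Optional[str] = None
    test_date: Optional[str] = None
    actual_result: Optional[str] = None
    comments: Optional[str] = None

    @property
    def passed(self) -> Optional[bool]:
        # Pass/Fail correlates the Expected Result and the Actual Result.
        if self.actual_result is None:
            return None
        return self.actual_result == self.expected_result
```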