One Pager Template


CSC Document

One Pager
How-to: Provide information under all of the following headings and remove the light blue help texts when
your one-pager is done. If you have nothing to write, then write just that and explain why - don't just
leave a heading blank or write 'N/A'.

Finally, despite its name, a one-pager does not need to be just one page, which should be pretty
obvious by now :-). The reason for its name is to remind you to be very concise¹ in everything you
write, and to remember: any person should be able to read your one pager and fully comprehend the test
task and the reasoning behind the decisions you have made along the way. The one pager is basically
a test plan for a single test task.

The one-pager must be reviewed in:

1) a peer review by a tester colleague, and
2) a review by either the lead developer or a business analyst (or both, if deemed necessary) with
good knowledge of the work package (delivered content and customer needs).

Remember to document the two reviews in the modification history table, and make use of Microsoft
Word's own 'Review' feature (with 'track changes' turned on) during the review phase to allow for easy
review commenting and acceptance/rejection of review comments.

1 Modification History
<Always keep your modification history up to date so that everybody can see the status of the one
pager at any point in time. Remember to keep the one pager updated if, for instance, you encounter
additional or reduced test scope during the test design/execution phases, or new risks arise. Let such
things be reflected in the one pager and update the modification history accordingly.>

Date | Change comment/status | Author | Version

<18.12.2013> | <Initial version.> | <ALB> | <0.1>
<18.12.2013> | <Ready for peer review by HEP.> | <ALB> | <0.2>
<19.12.2013> | <Peer review comments inserted. Peer review done.> | <HEP> | <0.3>
<20.12.2013> | <Ready for review by business analyst, LOR.> | <ALB> | <0.4>
<23.12.2013> | <Review comments inserted. Review done.> | <LOR> | <0.5>
<23.12.2013> | <Review comments incorporated.> | <ALB> | <1.0>

2 Overall Information
Author: <Write your initials/name here>
Release: <Write the release here>
Work Package / USD: <Refer to the Work Package or USD here>
Project Description: <What is the project about - in just a few sentences? Why has it been made and
who is it for? What does it do? Which part(s) of CCS does it involve/affect and does it require e.g.
new/updated/existing integration(s) and/or HIP mappings?>

3 Q&A
During test analysis and design, important questions always need to be answered. Those questions
and their answers must be entered in the table below.

¹ Concise: giving a lot of information clearly and in a few words; brief but comprehensive. [Google]

Question | Answer | Reference | Status

Will the solution also apply for Capital Region? | Pending ULB | Early solution document | Mail sent to ULB. Waiting for his answer.

4 Customer Focus, Dependencies and Agreements


Customer Focus: <What is the customer focus; must it be tested in both OPUS Work Station and
Clinical Portal, or will one of them suffice? Give an explanation why.>
Dependencies and Agreements: <Are there test dependencies or agreements made in relation to
other test activities carried out internally in CCS or externally, e.g. in SI?>

5 Test Scope
5.1 Alpha Test (System Test Phase - Test Team India)
5.1.1 In Scope (including test types and techniques)

<Bulleted list or plain text describing the scope of the alpha test

Example:
In the system test, the two new Goal and Evaluation UIs will be tested in a functional test to ensure
firstly that they comply with the UI guidelines for CPM (tabulation, default actions, button states, right-
click menus and shortcuts), and secondly that they allow a user to perform basic CRUD actions
on goals and evaluations.

Furthermore, the new CPM outbound 'goal and evaluation' HL7 interface to SystemX will be tested in
a system integration test by sending different CRUD messages of goals and evaluations from CPM to
SystemX via the HIP (includes new mappings from SI).

Finally, AUDIT logging and functional security as well as data authorizations in organization based
security will be tested in a functional test.>
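
The system-integration part of the example above (sending goal and evaluation CRUD messages from CPM to SystemX via the HIP) is often easiest to repeat if the message sending is scripted. The Python sketch below shows one minimal way to do that, assuming plain HL7 v2 over MLLP; the host, port, facility names, message type and the 'goal' payload are made-up placeholders, not the actual CCS-608 interface definition, which must be taken from the SI interface specification.

# Minimal sketch: send one HL7 v2 message over MLLP and read back the ACK.
# Endpoint, facility names and message content are illustrative placeholders.
import socket

MLLP_START = b"\x0b"      # MLLP start-of-block byte
MLLP_END = b"\x1c\x0d"    # MLLP end-of-block byte + carriage return

def send_hl7(host: str, port: int, message: str, timeout: float = 10.0) -> bytes:
    """Wrap an HL7 v2 message in an MLLP frame, send it and return the raw ACK."""
    frame = MLLP_START + message.replace("\n", "\r").encode("utf-8") + MLLP_END
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.sendall(frame)
        ack = b""
        while not ack.endswith(MLLP_END):
            chunk = sock.recv(4096)
            if not chunk:
                break
            ack += chunk
    return ack

# Illustrative 'create goal' message (segments and fields are placeholders only).
message = "\r".join([
    "MSH|^~\\&|CPM|CPM_FAC|SYSTEMX|SYSTEMX_FAC|20131219120000||ORU^R01|MSG0001|P|2.5",
    "PID|1||1234567890^^^CPM||Doe^John",
    "OBX|1|TX|GOAL^Treatment goal||Patient walks 100 m unaided within 2 weeks",
])

ack = send_hl7("hip.example.local", 2575, message)
# MSA|AA is the standard positive application acknowledgement in HL7 v2.
assert b"MSA|AA" in ack, f"Expected positive acknowledgement, got: {ack!r}"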

5.1.2 Out of Scope

<Bulleted list or plain text describing what is not in scope for the alpha test, and why.
Example:

Goals and evaluations in print are not in scope of the test, since no modifications have been made in
that area as part of this work package. Furthermore, testing of overview satellites showing goals and
evaluations has been excluded from this test task, as it is in scope of another test task for 6.0 (CCS-
609 Clinical Overview Make Over); this has been confirmed with HEP.

Finally, data authorizations in a team based security setup will not be tested due to the overall project
decision about testing 6.0 in organization based security environments.>

5.2 Beta Test (Delivery Test Phase - Test Team Denmark)


5.2.1 In Scope (including test types and techniques)

<Bulleted list or plain text describing the scope of the beta test. See example from alpha test above>

5.2.2 Out of Scope


<Bulleted list or plain text describing what is not in scope for the beta test, and why. See example
from alpha test above >

5.2.3 Acceptance Criteria

<List the acceptance criteria which will be the foundation of the beta test. These must be testable, i.e.
it must be possible to verify them by running the associated beta test.>

6 Risk Assessment
<List risks identified during your analysis of the test task and how they are mitigated. For each risk,
make a likelihood (3-High, 2-Medium, 1-Low) and impact (3-High, 2-Medium, 1-Low) assessment.
Important project risks must be conveyed to the Test Manager. Priority is your way of making the test
design and test execution effort/focus more visible (one possible way of deriving it is sketched after
the risk table below). Keep the status of each risk updated at all times.>
Risk 1
Risk: <The test department has not been doing any static testing, e.g. no participation in formal reviews of the HLA, FDD and UI documents. Risk that several basic UI/functional errors have made their way into the system test undetected.>
Likelihood: 3-High
Impact: 2-Medium
Mitigation: <Test cases will be written to check up on these documents, especially the UI documents. Are they correct (when finishing the system test) with respect to descriptions and screenshots? In the system test cases, there will be special focus on basic functional behaviour, such as navigation, shortcuts, default buttons etc. Furthermore, a small regression test of adjacent functionality will be made.>
Priority: 2
Status: <Pending>

Risk 2
Risk: <As part of the UI make over, a lot of code has been refactored. This could introduce a wide variety of errors for goals and evaluations.>
Likelihood: 3-High
Impact: 3-High
Mitigation: (see above mitigation)
Priority: 1
Status: <In progress>

Risk 3
Risk: <Integration to SystemX via HIP means new HIP mappings never tested before. This could mean a lot of time spent on interface related errors.>
Likelihood: 2-Medium
Impact: 2-Medium
Mitigation: <Special attention in the system integration test on ensuring that what is made of CRUD actions (goals/evaluations) in CPM appears correctly in SystemX.>
Priority: 3
Status: <Done>

Risk 4
Risk: <Late delivery of HIP mappings has been announced by SI. Risk that the test won't finish in time.>
Likelihood: 1-Low
Impact: 2-Medium
Mitigation: <The Test Manager has been notified about this project risk, which compromises efficient test execution and quality assessment of 'CCS-608 Goal and Evaluation'.>
Priority: 3
Status: <Done>
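
Priority in the table is assigned by judgement, but if you prefer a repeatable rule, the small Python sketch below ranks risks by their likelihood times impact product ('risk exposure'), highest exposure first. Both the scoring rule and the shortened risk names are an assumption made for the sketch, not something this template prescribes.

# Sketch of one way to derive the Priority column: rank risks by exposure
# (likelihood x impact). The rule is an assumption, not prescribed by the template.
risks = [
    ("No static testing of HLA/FDD/UI documents", 3, 2),   # (name, likelihood, impact)
    ("Large UI refactoring of goals and evaluations", 3, 3),
    ("New, previously untested HIP mappings to SystemX", 2, 2),
    ("Late delivery of HIP mappings from SI", 1, 2),
]

ranked = sorted(risks, key=lambda r: r[1] * r[2], reverse=True)
for priority, (name, likelihood, impact) in enumerate(ranked, start=1):
    print(f"Priority {priority}: {name} (exposure {likelihood * impact})")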


7 References
<All documents/references you have used as part of the analysis to produce the one pager must be
listed below.>

Title: <HLA Goal and Evaluation 6.0>
Location: <CCS - Product and development\Module Clinical Process R 6.0\Clinical Process Functionality\CCS-608 Goal and Evaluation\HLA\HLA Goal and Evaluation 6.0.docx>
Version: <VSS version 4>

Title: <FDD and TDD Manage Standard Goals>
Location: <CCS - Product and development\Module Clinical Process R 6.0\Clinical Process Functionality\CCS-608 Goal and Evaluation\FDD & TDD\FDD and TDD Manage Standard Goals.docx>
Version: <VSS version 6>

Title: <UI documents>
Location: <CCS - Product and development\Module Clinical Process R 6.0\Clinical Process Functionality\CCS-608 Goal and Evaluation\UI>
Version: <Versions on 19.12.2013>


8 Appendix - Glossary of Test Terms


This appendix is merely a glossary of terms often used in the one pager. It is to be used by the author
to ensure a consistent use of terms. For instance, what do we mean when we use terms like 'white
box test', 'black box test', 'non-functional testing' etc.?
See the full ISTQB Glossary of Terms at http://www.istqb.org/downloads/viewcategory/20.html

Term: Definition (ISTQB)

Component testing: A test level from the V-model where individual software components are tested. Also known as unit testing. May include testing of functionality and specific non-functional characteristics (e.g. resource behaviour, performance, robustness) as well as structural testing (such as decision coverage). Test cases are derived from work products such as the software design or data model.

Integration testing: A test level from the V-model used to expose defects in the interfaces and in the interactions between integrated components or systems.

System testing: A test level from the V-model where an integrated system is tested to verify that it meets the specified requirements. Can be based on risks and/or requirement specifications, business processes, use cases or other high-level descriptions of system behaviour (in CCS: typically RR, HLA, FDD and UI documents).

System integration testing: Testing the integration of systems and packages; testing interfaces to external organizations.

Functional testing: Testing based on an analysis of the specification of the functionality of a component or system (see also black box testing). Is often done focusing on suitability, interoperability, security, accuracy and compliance. The techniques used for functional testing are often specification based, but experience based techniques can also be used.

Non-functional testing: Testing the attributes of a component or system that do not relate to functionality, e.g. reliability, efficiency, usability, maintainability, performance and portability. It is the testing of 'how well' the system works.

Black box testing: Testing, either functional or non-functional, without reference to the internal structure of the component or system.

White box testing: Testing based on an analysis of the internal structure of the component or system.

Static test: Testing of a software development artifact, e.g. requirements, design or code, without execution of these artifacts, e.g. reviews or static analysis.

Dynamic test: Testing that involves the execution of the software of a component or system. Can make use of either white box testing techniques such as 'structure-based' (statement, decision, condition, multiple condition) and 'experience-based' (error guessing, exploratory testing), or black box testing techniques such as 'specification-based' (equivalence partitioning, boundary value analysis, decision tables, state transition, use case testing).

Regression test: Testing of a previously tested program following modification, to ensure that defects have not been introduced or uncovered in unchanged areas of the software as a result of the changes made. It is performed when the software or its environment is changed.
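
As a concrete illustration of the specification-based black box techniques named above (equivalence partitioning and boundary value analysis), here is a small, hypothetical Python unit-test sketch. The validate_goal_length function and its 1-200 character rule are invented purely for the example and are not part of CCS.

# Hypothetical example of boundary value analysis and equivalence partitioning.
# Assumed specification (invented for illustration): a goal description is
# valid when it is between 1 and 200 characters long.
import unittest

def validate_goal_length(description: str) -> bool:
    """Return True when the goal description length is within 1..200 characters."""
    return 1 <= len(description) <= 200

class GoalLengthBoundaryTest(unittest.TestCase):
    def test_boundary_values(self):
        # Boundary value analysis: test on and just around each boundary.
        self.assertFalse(validate_goal_length(""))         # 0   - just below lower bound
        self.assertTrue(validate_goal_length("a"))         # 1   - lower bound
        self.assertTrue(validate_goal_length("a" * 200))   # 200 - upper bound
        self.assertFalse(validate_goal_length("a" * 201))  # 201 - just above upper bound

    def test_equivalence_partitions(self):
        # Equivalence partitioning: one representative value per partition.
        self.assertTrue(validate_goal_length("a" * 50))    # valid partition
        self.assertFalse(validate_goal_length("a" * 500))  # invalid partition (too long)

if __name__ == "__main__":
    unittest.main()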
