4 Process of Interaction Design, Prototyping, Construction

[Figure: The interaction design process - requirements ('what is there' vs. 'what is wanted', gathered through interviews and ethnography), analysis (guidelines, principles), design (dialog notations, precise specification), iterative prototyping, and implementation and deployment (architectures, documentation), with evaluation (heuristics) informing each stage.]
• The essential characteristics of good design are often summarized through 'golden rules' or heuristics.
• Design patterns provide a potentially generative approach to capturing and reusing design knowledge.
• We need to record our design choices in some way, and there are various notations and methods to do this, including those used to record the existing situation. It is at this stage that input from theoretical work is most helpful, including cognitive models, organizational issues and the understanding of communication.
• Iteration and prototyping: Humans are complex and we cannot expect to get designs right first time. We therefore need to evaluate a design to see how well it is working and where there can be improvements.
• Evaluation tests the usability, functionality and acceptability of an interactive system. Evaluation may take place:
o In the laboratory
o In the field
• Some approaches are based on expert evaluation:
o Analytic methods
o Review methods
o Model-based methods
• Some approaches involve users:
o Experimental methods
o Observational methods
o Query methods
• An evaluation method must be chosen carefully and must be suitable for the job.
• Some forms of evaluation can be done using the design on paper, but it is hard to get real feedback without trying the design out. Most user interface design therefore involves some form of prototyping.
Implementation and deployment
• Finally, when we are happy with our design, we need to create it and deploy it. This will involve writing code, perhaps making hardware, and writing documentation and manuals.
• Prototypes differ according to the amount of functionality and performance they provide relative to the final product. The importance of a prototype lies in its projected realism, since prototypes are tested on real users.
• Since providing realism in prototypes is costly, there are several problems on the management side:
o Time: Prototyping costs time, which is taken away from the real design. Therefore, there are rapid-prototyping techniques.
o Non-functional features: Some of the most important features, such as safety and reliability, cannot be tested using a prototype.
o Contracts: Prototyping cannot form the basis for a legal contract and must be supported with documentation.
• Prototypes can be categorized as low fidelity prototypes (e.g. paper prototypes) and high fidelity prototypes. Low fidelity prototypes do not look much like the final device. They are not made from the same materials as the final device and do not have all the functionality of the final device.
• A low fidelity prototype can simulate some of the interactions, but perhaps not all the subtleties of the interaction.
• High fidelity prototypes look more like the final device. Your final design is an example of a high fidelity prototype. They may have some of the functions of the final product. They can test more of the subtleties of the interactions. However, they take more time to make.
Advantages and disadvantages of fidelity levels:

Fidelity : Low
Advantages :
• Lower development cost
• Evaluate multiple designs
• Useful communication device
• Addresses screen layout issues
• Useful for proof-of-concepts
Disadvantages :
• Limited error checking
• Poor detailed specification for coding
• Facilitator-driven usability testing
• Limited utility after requirements are established
• Navigation and flow limitations
• Good concept design is based on three perspectives:
o Activities: Exploring and browsing, passive instruction or instrumentation (the paradigms we added)
o Objects: The product of the interface, or the objects used in the interface
o Metaphor: An analogy to real world objects or processes
• This can be illustrated by an example.
• One of the most influential pieces of software is the spreadsheet. The first spreadsheet, VisiCalc, was designed by Dan Bricklin. Before spreadsheet software, accountants kept track of accounts using ledger books and performed calculations by hand. Accountants try to make forecasts of profits by modelling a spread of scenarios. Bricklin understood what activities accountants needed to perform and the problems they had with existing tools. Bricklin also realized that computers could make the process interactive. The concept design of the product was:
1. A spreadsheet that was analogous to a ledger book
2. Interactivity, by allowing users to edit the cells of the spreadsheet
3. The computer performs the calculations for a range of cells.
• His concept design considered the activities, considered the objects (generic entries, cells and columns), and had a good metaphor.
• Another example is the Star interface for the office, which was the precursor to desktops. What are the activities, objects, and metaphor for its concept design?
• Metaphors are very useful concepts for designing. They relate a new product to an old product. They can make the product easier to learn and use. They can assist the designer by making the interface design more consistent and by suggesting design alternatives. Because the metaphor is a strong concept, it can also be dangerous. Users may believe that the system should perform identically to the analogous system that they are familiar with and become baffled when it does not. In addition, designers may adhere too closely to the metaphor, which can cause bad design. An example will illustrate both. I recall when I first used a text editor. The metaphor with a typewriter was clear: the cursor, an underline or gray square, was the typewriter's carriage or key location. I had a lot of experience with typewriters, so I could quickly learn to use the text editor. However, I had problems typing characters into blank spaces. I expected that positioning the cursor on a blank space and typing should type the character over the blank space; that is what the typewriter would do. I did not realize that a blank space is a character just like any other character. In addition, the cursor, an underline, reinforced my notion. The designers had adhered too strongly to the analogy. Later, cursors became vertical lines, which made clear the difference between the metaphor and the application. The designers could have done worse: they could have had the page move as the user typed. The notion that the page should remain fixed was a good one and made the page easier to follow while entering content. Note that this is not how a typewriter functions. There are two morals:
1. Look for a good metaphor, but do not adhere to it too strongly; make clear the differences.
2. Try to find ways that the design can improve on the old ways of doing things.
• Concept design is about ideas. How do you come up with ideas? This can be an individual endeavour or a group activity. I suggest:
o Always keep an open mind; do not initially critique any idea that may arise.
o Become involved: constantly consider all that you know about the project and the world around you.
o Periodically try to force ideas by doodling or making up songs.
o Keep the ideas. Later, mix and match them.
• The standard technique for a group to generate ideas is called brainstorming. Brainstorming is a process that ensures that the points above are maintained. This is especially important in a group, where egos can arise. The brainstorming process typically goes as follows:
1. Gather around a table in a room that does not have distractions.
2. One person briefly describes the goals of the brainstorming session.
3. Each person at the table takes a turn voicing ideas, no matter how ridiculous; the ideas are not critiqued or rejected, only listed briefly.
4. Members of the group are initially not allowed to opt out of voicing an idea.
5. Only after there are no more ideas, evaluate the ideas: individual ideas and groups of ideas.
6. Write a summary of the alternative valuable ideas.
Problems with interface metaphors (Nelson, 1990)
• Break conventional and cultural rules, e.g., a recycle bin placed on the desktop.
• Can constrain designers in the way they conceptualize a problem space.
• Conflict with design principles.
• Force users to understand the system only in terms of the metaphor.
• Designers can inadvertently use bad existing designs and transfer the bad parts over.
• Limit designers' imagination in coming up with new conceptual models.

Web browser metaphors
• Activity: Examine a web browser you use and describe the metaphors that have been incorporated into its design.
• Many aspects of a web browser are based on metaphors, including: toolbars (button bar, navigation bar, favourites bar, history bar); tabs, menus, organisers; search engines, guides; bookmarks, favourites; and icons for familiar objects (home, stop, etc.).
• Interface metaphors are commonly used as part of a conceptual model.
1. What is an Analogy?
• Merriam-Webster defines an analogy as "a comparison of two things based on their being alike in some way." The Stanford Encyclopedia of Philosophy describes an analogical argument as citing "accepted similarities between two systems to support the conclusion that some further similarity exists."
2. What Purposes do Analogies Serve?
• Analogies help us understand new concepts, teach new concepts to others, and see the familiar in a new light, which in turn enables us to generate novel solutions to problems.
• Analogies are an important part of how we make sense of new concepts and experiences. Our minds constantly and unconsciously compare new concepts to things we already know, as a way of understanding them. We look for similarities between our experiences and any new situation to help us understand the new and unfamiliar.
3. Solving Problems
• Analogies are also useful in problem solving. When we face an unfamiliar problem, we can think about similar problems or situations we have encountered in the past, compare their similarities, contrast their differences, and apply the lessons we learn to our current situation.
• Noticing and comparing similarities between different domains can help us generate novel solutions to problems. For example, the idea for an automobile assembly line came to Ford employee Bill Klann after observing a slaughterhouse in which a trolley system moved animal carcasses to multiple butchers, each performing a specialized task.
4. Seeing the Familiar in a Different Light
• Analogies also provide a technique for generating new ideas, in which we compare something familiar to something else that is seemingly unrelated. Asking questions such as "What else is this like?" or "Where else have I seen something like this before?" can generate analogies that enable us to see something familiar in a new light.
• There seem to be three main types of articles about UX analogies, although some articles combine two or more classes of analogies.
• ... field if we did not bring this variety of experiences we have had in other fields to our discussions.
9. Our Analytical Minds Naturally See Comparisons
• The analytical and problem-solving skills we use in user research and design are the same skills that naturally lead to making analogical comparisons. To conduct user research or usability testing, you have to be observant and perceptive. When you analyze the findings of such research, you notice patterns, make comparisons, draw conclusions, and generate solutions to the problems you identify. It is understandable that these skills would cause us to see similarities between user experience and other fields or personal interests.
Review Questions
Q. 1 Explain Interaction Design Process. (10 Marks)
Q. 2 Explain Prototyping and Conceptual Design. (10 Marks)
Q. 3 Write Advantages and Disadvantages on Fidelity. (5 Marks)
Q. 4 Explain the conceptual design steps. (10 Marks)
b) Specify the user requirements
c) Produce design solutions to meet user requirements
d) Evaluate the designs against requirements

ISO TR 18529 (2000), Ergonomics of human-system interaction - Human-centered lifecycle process descriptions
1) This Technical Report contains a structured and formalized definition of the human-centered processes described in ISO 13407:
a) HCD.1 Ensure HCD content in system strategy
b) HCD.2 Plan and manage the HCD process
c) HCD.3 Specify the user and organizational requirements
d) HCD.4 Understand and specify the context of use
e) HCD.5 Produce design solutions
f) HCD.6 Evaluate designs against requirements
g) HCD.7 Introduce and operate the system

Review Questions
Q. 1 Write a short note on Design Principles. (5 Marks)
Q. 2 Explain Usability in details. (10 Marks)
Q. 3 Write a short note on standards. (6 Marks)
Q. 4 Explain ISO standards. (5 Marks)

6 Evaluation Techniques and Framework

Why, What, Where and When of Evaluation
• Evaluation outcomes are a valuable tool for program managers who are seeking to strengthen the quality of their programs and improve outcomes for the children and youth they serve. Evaluation answers basic questions about a program's effectiveness, and evaluation data can be used to improve program services.
• In this brief, we define evaluation as a systematic method to study a program, practice, intervention, or initiative to understand how well it achieves its goals. Evaluations help determine what works well and what could be improved in a program or initiative.

6.1.1 Why Evaluation?
• To evaluate means simply to assess the value of something. In this book, it means helping those who are involved in many different kinds of development programmes to assess the value of what they are doing.
• Many of them are already monitoring their own work and may have taken part in evaluating it in a systematic way.
• When you ask people why they evaluate their work, different people give different answers. Here are some of the actual answers people have given:
[Figure: 'Why did you evaluate?' - typical answers include 'To help us see where we are going and if we need to change direction for the future', 'To help us make better plans', 'To measure progress', and 'To compare this programme with others like it'.]

• These reasons are easy to understand; some are more difficult. When you are working with people at community level and helping them to participate in evaluation, it is important to use words and meanings that are even more simple and clear than those given above. One way you can start to do this is by telling a story, such as the following, which reflects people's actual experience.
• Some community development workers compared their evaluation to taking a bus ride down a country road. While they could see through the glass windows they were happy; they could all see that they were making progress. Then rain forced them to put wooden shutters over the windows, and they could no longer assess their progress. They knew they were moving forward, but could not tell along which road, how fast, or even whether they were nearing their destination.
• This has happened, for example, where a programme had powerful individuals or powerful groups of people who did not agree with the activities or the objectives of the particular programme being carried out.
• The best development projects conduct different types of evaluations, constantly looking to streamline their project or program at different stages and using different metrics.
• For better evaluation, techniques like objective tests, essay tests, observational techniques, etc. should be used, so that a complete picture of the pupil's achievement and development can be assessed.
3. An evaluator should know the limitations of different evaluation techniques
• Evaluation can be done with the help of simple observation or highly developed standardized tests. But whatever the instrument or technique may be, it has its own limitations.
• There may be measurement errors. Sampling error is a common factor in educational and psychological measurements.
• An achievement test may not include the whole course content.
• Error in measurement can also be found due to students guessing on objective tests.
• Error is also found due to incorrect interpretation of test scores.
4. The technique of evaluation must be appropriate for the characteristics or performance to be measured
• Every evaluation technique is appropriate for some uses and inappropriate for others.
• Therefore, while selecting an evaluation technique, one must be well aware of the strengths and limitations of the techniques.
5. Evaluation is a means to an end but not an end in itself
• The evaluation technique is used to take decisions about the learner.
• It is not merely gathering data about the learner, because blind collection of data is a wastage of both time and effort; evaluation is meant for some useful purpose.

6.1.5 Functions of Evaluation
The main aim of the teaching-learning process is to enable the pupil to achieve intended learning outcomes. In this process the learning objectives are fixed; then, after the instruction, learning progress is periodically evaluated by tests and other evaluation devices. The functions of the evaluation process can be summarized as follows:
1. Evaluation helps in preparing instructional objectives
• Learning outcomes expected from classroom discussion can be fixed by using evaluation results. This is only possible when we identify the instructional objectives and state them clearly in terms of intended learning outcomes. Only a good evaluation process helps us to fix up a suitable set of instructional objectives.
2. Evaluation process helps in assessing the learner's needs
• In the teaching-learning process it is very necessary to know the needs of the learners. The instructor must know the knowledge and skills to be mastered by the students. Evaluation helps to know whether the students possess the required knowledge and skills to proceed with the instruction.
3. Evaluation helps in providing feedback to the students
• The evaluation process helps the teacher to know the learning difficulties of the students. It helps to bring about an improvement in different school practices. It also ensures an appropriate follow-up service.
4. Evaluation helps in preparing programmed materials
• Programmed instruction is a continuous series of learning sequences. First the instructional material is presented in a limited amount, then a test is given on the instructional material, and feedback is provided on the basis of the correctness of the response made. Without an effective evaluation process, programmed learning is not possible.
5. Evaluation helps in curriculum development
• Curriculum development is an important aspect of the instructional process. Evaluation data enable curriculum developers to determine the effectiveness of new procedures and to identify areas where revision is needed. Evaluation also helps to determine to what extent an existing curriculum is effective. Thus evaluation data are helpful in constructing a new curriculum and in evaluating the existing curriculum.
6. Evaluation helps in reporting the pupil's progress to parents
• A systematic evaluation procedure provides an objective and comprehensive picture of each pupil's progress. This comprehensive nature of the evaluation process helps the teacher to report on the total development of the pupil to the parents. This type of objective information about the pupil provides the foundation for the most effective co-operation between parents and teachers.
7. Evaluation helps in guidance and counselling
• In order to assist pupils in solving their problems in the educational, vocational and personal fields, the counsellor must have objective knowledge of the pupils' abilities, interests, attitudes and other personal characteristics.
• An effective evaluation procedure helps in getting a comprehensive picture of the pupil, which leads to effective guidance and counselling.
8. Evaluation helps in effective school administration
• Evaluation data help the administrators to judge the extent to which the objectives of the school are being achieved, to find out strengths and weaknesses of the curriculum, and to arrange special school programmes.
• It also helps in decisions concerning admission, grouping and promotion of the students.
9. Evaluation data are helpful in school research

• The process of evaluation should be circular, like a wheel: all the parts should fit together so that the wheel can move along smoothly. If one part of the wheel is missing, it is not useful. Participants need to be involved in all parts of the evaluation process; if they are left out of one part, it is like breaking the wheel.

[Figure: The evaluation wheel - planning the evaluation, carrying out the evaluation, and using the results to improve the programme.]

• Using the results gives an understanding of programme progress, strengths and weaknesses. Those involved can see where and why changes are needed, and can plan how to put them into practice.
When planning an evaluation, timing raises many practical questions:
2. In this case, evaluation usually takes place when the programme ends. What kind of monitoring methods are already used?
3. Do records need to be gathered from places far away before the evaluation starts? Are there going to be many records to look at?
4. Will extra time be required to do this?
5. Are external evaluators to be involved in the evaluation?
6. If so, this will affect timing. When are they able to come, and how long will they take to get to the programme area?
7. How long can they stay? What about climate and seasons?
8. In the rainy season, is it possible to reach isolated communities? In the dry season, rivers may dry up and communities cannot be reached by boat. In a city, people may find it harder to concentrate on evaluation in the hot season. What about people's busy times?
9. During harvest time, will people have time and interest to spare for evaluation? At certain times of the year people have less food and money.
10. Is it possible to choose a time when people may be more relaxed and willing to give time and attention to evaluation? What about the time of the programme staff?
11. They also have particularly busy times. Which is the best time for them? What about ministries and outside agencies (such as departments, funding agencies and organisations)? They will also have specific ideas about timing.

6.2 Types of Evaluation
The best development projects conduct different types of evaluations, constantly looking to streamline their project or program at different stages and using different metrics.

1. Formative Evaluation
(Also known as 'evaluability assessment')
Formative evaluation is used before program design or implementation. It generates data on the need for the program and develops the baseline for subsequent monitoring. It also identifies areas of improvement and can give insights on what the program's priorities should be. This helps project managers determine their areas of concern and focus, and increases awareness of your program among the target population prior to launch.
• When
o New program development
o Program expansion
• What
o Determines a timeline of relevant indicators for your project among the potential beneficiaries
• Why
o Helps make early improvements to the program
o Allows project managers to refine or improve the program
• How
o Conduct surveys and focus group discussions among the target population, focused on what they are likely to need, understand, and accept in the program's elements.

2. Process Evaluation
(Also known as 'program monitoring')
Process evaluation occurs once program implementation has begun, and it measures how effective your program's procedures are. The data it generates is useful in identifying inefficiencies and streamlining processes, and it portrays the program's status to external parties.
• When
o When program implementation begins
o During the operation of an existing program
• What
o Whether program goals and strategies are working as they should
o Whether the program is reaching its target population, and what they think about it
• Why
o Provides an opportunity to avoid problems by spotting them early
o Allows program administrators to determine how well the program is working
• How
o Conduct a review of internal reports and a survey of program managers and a sample of the target population. The aim should be to measure the number of participants, how long they have to wait to receive benefits, and what their experience has been.

3. Outcome Evaluation
(Also known as 'objective-based evaluation')
Outcome evaluation is conventionally used during program implementation. It generates data on the program's outcomes and on the degree to which those outcomes are attributable to the program itself. It is useful in measuring how effective your program has been, and it helps make the program more effective in terms of delivering the intended benefits.
• When
o After the program has run for some time period
• What
o How much the program has affected the target population
o Clearly establishes the degree of benefit provided by the program
• Why
o Helps program administrators tell whether a program is meeting its objectives
o Insights from outcome-focused feedback can help increase effectiveness
• How
o A randomized controlled trial, comparing the status of beneficiaries before and during the program, or comparing beneficiaries to similar people outside of the program. This can be done through a survey or a focus group discussion.

4. Economic Evaluation
(Also known as 'cost analysis', 'cost-effectiveness evaluation', 'cost-benefit analysis', and 'cost-utility analysis')
Economic evaluation is used during the program's implementation and looks to measure the benefits of the program against its costs. Doing so generates useful quantitative data that measures the efficiency of the program. This data is like an audit, and it provides useful information to sponsors and backers, who often want to see what benefits their money would bring to beneficiaries.
• When
o At the beginning of a program, to remove potential leakages
o During the operation of a program, to find and remove inefficiencies
• What
o What resources are being spent, and where
o How these costs are translating into outcomes
• Why
o Program managers and funders can justify or streamline costs
o The program can be modified to deliver more results at lower costs

5. Impact Evaluation
Impact evaluation is a systematic analysis of the program, collecting data from beginning to end (or at whatever stage the program is at), and looks to quantify whether or not the program has been successful. Focused on the long term, impact evaluation is useful for measuring sustained changes brought about by the program, or for making policy changes or modifications to the program.
• When
o At the end of the program
o At pre-selected intervals in the program
• What
o Assesses the change in the target population's well-being
o Accounts for what would have happened if there had been no program
• Why
o To show proof of impact by comparing beneficiaries with control groups
o Provides insights to help in making policy and funding decisions
• How
o A macroscopic review of the program, coupled with an extensive survey of program participants, to determine the effort involved and the impact achieved. Insights from program officers and suggestions from program participants are also useful, and a control group of non-participants for comparison is helpful.

6. Summative Evaluation
Summative evaluation is conducted after the program's completion, or at the end of a program cycle. It generates data about how well the project delivered benefits to the target population. It is useful for program administrators to justify the project, show what they have achieved, and lobby for project continuation or expansion.
• When
o At the end of a program
o At the end of a program cycle
• What
o Whether the desired change happened
o How effectively the program made the desired change happen
o How the program changed the lives of program participants
• Why
o Provides data to justify continuing the program
o Generates insights into the effectiveness and efficiency of the program
• How
o Conduct a review of internal reports and a survey of program managers and target populations. The aim should be to measure the change that the project has brought about and compare that change to the costs.

7. Goals-Based Evaluation
(Also known as 'objectively set evaluation')
Goals-based evaluation is usually done towards the end of the program or at previously agreed-upon intervals. Development programs often set 'SMART' targets - Specific, Measurable, Attainable, Relevant, and Timely - and goals-based evaluation measures progress towards these targets. The evaluation is useful in presenting reports to program administrators and backers, as it provides them the information that was agreed upon at the start of the program.
• When
o At the end of the program
o At pre-decided milestones
• What
o How the program has performed on initial metrics
o Whether the program has achieved its goals
• Why
o To show that the program is meeting its initial benchmarks
o To review the program and its progress
• How
o Conduct a review of internal reports and a survey of program managers and target populations. The aim should be to measure the change that the project has brought about and compare that change to the costs.
• Development programs with effective monitoring and evaluation frameworks use different types of evaluation at different points of time. Some programs might even run two different types of evaluation at the same time for entirely different purposes.

6.3 Case Studies of Evaluation

Case Study 1: The cognitive walkthrough
• We can illustrate how the walkthrough method works using a simple example. Imagine we are designing a remote control for a video cassette recorder (VCR), and we want to evaluate the design for the task of programming the VCR to do timed recordings.
• Our initial design is shown in Fig. 6.3.1. The picture on the left illustrates the default display; the picture on the right shows the display after the 'timed record' button has been pressed. The VCR allows the user to program up to three timed recordings in different streams; the next available stream number is automatically assigned.

[Fig. 6.3.1: The remote control design - the default display (time: 21:45, channel: 3) and the timed-record display (start:, end:, channel:, date:), each with a numeric keypad.]

• We want to know whether our design supports the user's task. We begin by identifying a representative task: program the video to time-record a program starting at 18.00 and finishing at 19.15, on channel 4, in February 2005.
• The next step in the walkthrough is to identify the action sequence for this task. We specify this in terms of the user's action (UA) and the system's display or response (SD). The initial display is the left-hand picture in Fig. 6.3.1:
o UA 1: Press the 'timed record' button
o SD 1: Display moves to timer mode. Flashing cursor appears after 'start:'
o UA 2: Press digits 1 8 0 0
o SD 2: Each digit is displayed as typed, and the flashing cursor moves to the next position
o UA 3: Press the 'timed record' button
o SD 3: Flashing cursor moves to 'end:'
o UA 4: Press digits 1 9 1 5
o SD 4: Each digit is displayed as typed, and the flashing cursor moves to the next position
o UA 5: Press the 'timed record' button
o SD 5: Flashing cursor moves to 'channel:'
o UA 6: Press digit 4
o SD 6: Digit is displayed as typed, and the flashing cursor moves to the next position
o UA 7: Press the 'timed record' button
o SD 7: Flashing cursor moves to 'date:'
• Having identified the action sequence, we walk through it, asking for each action whether the user will know what to do, see how to do it, and understand the feedback. Already at UA 1 a problem emerges: it is not clear which button is the 'timed record' button. The icon of a clock (fourth button down on the right) is a possible candidate, but this could be interpreted as a button to change the time. Other possible candidates might be the fourth button down on the left or the filled circle (associated with record). In fact, the icon of the clock is the correct choice, but it is quite possible that the user would fail at this point. This identifies a potential usability problem.
• Question 4: After the action is taken, will users understand the feedback they get? Once the action is taken, the display changes to the timed record mode and shows familiar headings (start, end, channel, date).

Case Study 2: Icon designs
• Imagine you are designing a new interface to a document-processing package, and you wish to know which of two styles of icon design is easier for users to remember. One set of icons uses naturalistic images (based on a paper metaphor), the other uses abstract images (see Fig. 6.3.2).

[Fig. 6.3.2: Abstract and concrete icons for operations]

• Our hypothesis is that users will remember the natural icons more easily than the abstract ones. The null hypothesis in this case is that there will be no difference between recall of the two icon types. This hypothesis clearly identifies the independent variable for our test: the style of icon. The independent variable has two levels: natural and abstract. However, when we come to consider the dependent variable, we find that we have expressed our hypothesis in terms that are not directly measurable.
• How can we measure this? First we need to clarify exactly what we mean by the phrase 'more easily': are we concerned with the user's performance in terms of accurate recall or in terms of speed, for example, or are we looking at more subjective measures like user preference?
• In this example, we will assume that the speed at which a user can accurately select an icon is an indication of how easily it is remembered. Our dependent variables are therefore the number of mistakes in selection and the time taken to select an icon.
• Of course, we need to control the experiment so that any differences we observe are clearly attributable to the independent variable, and so that our measurements of the dependent variables are comparable. To do this, we provide an interface that is identical in every way except for the icon design, and a selection task that can be repeated for each condition.
• The latter could be either a naturalistic task (such as producing a document) or a more artificial task in which the user has to select the appropriate icon for a given prompt. The second task has the advantage that it is more controlled (there is little variation between users as to how they will perform the task) and it can be varied to avoid transfer of learning.
• Before performing the selection task, the users will be allowed to learn the icons in controlled conditions: for example, they may be given a fixed amount of time to learn the icon meanings.
• The next stage is to decide upon an experimental method. This may depend on the participants that are available, but in this case we will assume that we have sufficient participants from the intended user group.
6.4 DECIDE Framework
DECIDE is a framework that is used to guide evaluation:
D: Determine the goals the evaluation addresses
E: Explore the specific questions to be answered
C: Choose the evaluation paradigm and techniques to answer the questions
I: Identify the practical issues
D: Decide how to deal with the ethical issues
E: Evaluate, interpret and present the data

[Fig. 6.4.1: The DECIDE framework for evaluation]

1. D - Determine the goals of evaluation
• There are four goals of evaluation:
o To understand the real world, i.e. understand how users employ technology in the real world and how designs can be improved to fit the work environment better
o To enable choosing the best alternative among the various designs
o To check whether the product being made meets the standards required by the users
o To determine if the target of our project is good enough
• Some examples of goals:
o Identify the best metaphor on which to base the design
o Check to ensure that the final interface is consistent
o Investigate how technology affects working practices
o Improve the usability of an existing product
2. E - Explore the specific questions to be answered
(a) What is the purpose of the product? The purpose of this product is to bring together several bookshops under one site, so that it is easy to search for books and buy the books on the same site.
(b) Who will be the users/stakeholders? The users will be the customers, the suppliers, the bookshop owners, the advertisers and also the owners of the site.
(c) How is it better than previous interfaces? The project is better than previous interfaces because we came to capture all the user needs and wants.
(d) What are the users' attitudes towards the website? The users were very responsive to the website because they saw that the site was saving time and also money.
(e) For example, the goal of finding out why many customers prefer to purchase paper airline tickets rather than e-tickets can be broken down into sub-questions: What are customers' attitudes to these new tickets? Are they concerned about security? Is the interface for obtaining them poor?
3. C - Choose the evaluation paradigm and techniques to answer the questions
• This looks at how the data is analyzed and presented:
(a) Quick and dirty: Designers informally get feedback from users or consultants to confirm that their ideas are in line with users' needs; this can be done at any time.
(b) Predictive evaluation: Experts apply their knowledge of typical users to predict usability problems. Users need not be present.
(c) Usability testing: The performance of typical users is recorded, the users are watched on video, their performance is evaluated, and lastly user satisfaction is assessed.
(d) Field studies: This is done in natural settings to understand what users do naturally and how technology impacts them.
• The paradigm that was used was quick and dirty, because the users were involved in every step of the design process and it can be done at any time. (Field studies, for example, do not involve testing or modelling.)
4. I - Identify the practical issues
• You are going to look at things like:
o How to select users
o How to stay on budget
o How to stay on schedule
o How to find evaluators
o How to select equipment
5. D - Decide how to deal with the ethical issues
(a) The users will have privacy of their personal information, and it will not be disclosed to anyone without the consent of the user.
(b) The users will be able to leave the site at any time they wish, i.e. they can discontinue their membership at their own wish.
(c) The users will be treated politely, but they have to follow the rules and regulations of the website.
(d) The users will have the right to know the goals of the study.
6. E - Evaluate, interpret and present the data
• The data is analyzed and interpreted using the quick and dirty paradigm and technique. The findings can be generalized; on our part the evaluation was both formative and summative. The following also need to be considered:
o Reliability: Can the study be replicated?
o Validity: Is it measuring what you thought?
o Biases: Is the process creating biases?
o Scope: Can the findings be generalized?
o Ecological validity: Is the environment of the study influencing it?
Pilot studies
• A small trial run of the main study. The aim is to make sure your plan is viable.
• Pilot studies check:
o That you can conduct the procedure
o That interview scripts, questionnaires, experiments, etc. work appropriately
• It is worth doing several to iron out problems before doing the main study. Ask colleagues if you can't spare real users.
Note:
• An evaluation paradigm is an approach that is influenced by particular theories and philosophies.
• Five categories of techniques were identified: observing users, asking users, asking experts, user testing, and modeling users.
• The DECIDE framework has six parts:
o Determine the overall goals
o Explore the questions that satisfy the goals
o Choose the paradigm and techniques
o Identify the practical issues
o Decide on the ethical issues
o Evaluate ways to analyze and present data
• Do a pilot study.
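Because DECIDE is essentially a structured checklist, it can be captured as a small data structure when planning a study. The following is a minimal Python sketch, not from the text; the DecidePlan class and all example entries are hypothetical illustrations.

```python
# Minimal sketch: the DECIDE framework as a reusable planning checklist.
from dataclasses import dataclass, field

@dataclass
class DecidePlan:
    goals: list[str] = field(default_factory=list)            # D: determine goals
    questions: list[str] = field(default_factory=list)        # E: explore questions
    paradigm: str = ""                                        # C: choose paradigm/techniques
    practical_issues: list[str] = field(default_factory=list) # I: identify practical issues
    ethical_issues: list[str] = field(default_factory=list)   # D: decide ethical issues
    analysis_notes: str = ""                                  # E: evaluate/interpret/present

    def is_complete(self) -> bool:
        """A plan is ready when every part of DECIDE has an entry."""
        return all([self.goals, self.questions, self.paradigm,
                    self.practical_issues, self.ethical_issues,
                    self.analysis_notes])

plan = DecidePlan(
    goals=["Improve the usability of an existing product"],
    questions=["What is the purpose of the product?",
               "Who are the users/stakeholders?"],
    paradigm="quick and dirty",
    practical_issues=["How to select users", "How to stay on budget"],
    ethical_issues=["Obtain informed consent", "Protect private information"],
    analysis_notes="Check reliability, validity, biases, scope, "
                   "ecological validity",
)
print("Plan complete?", plan.is_complete())
```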
6.5 Usability Testing
• Usability evaluation focuses on how well users can learn and use a product to achieve their goals. It also refers to how satisfied users are with that process. To gather this information, practitioners use a variety of methods that gather feedback from users about an existing site or about plans related to a new site.
What is Usability?
• Usability refers to the quality of a user's experience when interacting with products or systems, including websites, software, devices, or applications. Usability is about effectiveness, efficiency and the overall satisfaction of the user.
• It is important to realize that usability is not a single, one-dimensional property of a product, system, or user interface. 'Usability' is a combination of factors including:
o Intuitive design: A nearly effortless understanding of the architecture and navigation of the site
o Ease of learning: How fast a user who has never seen the user interface before can accomplish basic tasks
o Efficiency of use: How fast an experienced user can accomplish tasks
o Memorability: After visiting the site, whether a user can remember enough to use it effectively in future visits
o Error frequency and severity: How often users make errors while using the system, how serious the errors are, and how users recover from the errors
o Subjective satisfaction: Whether the user likes using the system
What are the Evaluation Methods and When Should I Implement Them?
• The key to developing highly usable sites is employing user-centered design. The expression "test early and often" is particularly appropriate when it comes to usability testing. As part of UCD you can and should test as early as possible in the process, and the variety of methods available allows you to assist in the development of content, information architecture, visual design, interaction design and general user satisfaction. Opportunities for testing include:
o Baseline usability testing on an existing site
o Focus groups, surveys or interviews to establish user goals
o Card sort testing to assist with IA development
o Wireframe testing to evaluate navigation
o First click testing to make sure your users go down the right path
o Usability testing to gauge the user's experience
o Satisfaction surveys to see how the site performs in the real world
• Usability evaluations can capture two types of data: qualitative data and quantitative data. Quantitative data notes what actually happened; qualitative data describes what participants thought or said. Once you have gathered your data, use it to:
1. Evaluate the usability of your web site
2. Recommend improvements
3. Implement the recommendations
4. Re-test the site to measure the effectiveness of your changes.

6.5.1 Usability Testing and Evaluation Methods
Traditional usability testing and evaluation methods
• There are many traditional usability testing and evaluation methods, such as user testing, heuristic evaluation, cognitive walkthrough, behavioral analysis, structured and unstructured interviews, questionnaires, and GOMS (Goals, Operators, Methods, Selection rules) with probability rules grammar for interface usability assessment.
• These methods are now widely used in a variety of interface evaluation processes, suitable for different user interface design and development stages. Each has its own advantages and disadvantages.
Usability of cognitive physiology assessment methods
1. Eye tracking technology
o Vision is the most direct way for users to interact with the interface.
o Visual awareness (eye-tracking) physiological assessment is a very effective method for evaluating the pros and cons of an interface, using line-of-sight tracking as an assessment technique to evaluate a range of website usability levels.
2. EEG technology
o EEG can accurately analyze neuropsychological data in a small real-time window, with high sensitivity to task complexity.
o It is a very attractive way to measure cognitive workloads in the man-machine interface.
6.5.2 How to Evaluate Usability Testing?
• The process of turning a mass of interview data, transcripts, and observations into a usable report on usability issues can seem overwhelming at times, but it is simply a matter of organizing your findings and looking for patterns and recurring issues in the data.
1. Define what you're looking for
[Fig. 6.5.1]
• Before you start analyzing the results, review your original goals for testing. Remind yourself of the problem areas of your website, or pain points, that you wanted to evaluate.
• Once you begin reviewing the testing data, you will be presented with hundreds, or even thousands, of user insights. Identifying your main areas of interest will help you stay focused on the most relevant feedback.
• Use those areas of focus to create overarching categories of interest. Most likely, each category will correspond to one of the tasks that you asked users to complete during testing. They may be things like: logging in, searching for an item, or going through the payment process, etc.
2. Organize the data
[Fig. 6.5.2]
• Review your testing sessions one by one. Watch the recordings, read the transcripts, and carefully go over your notes. For each session, record:
o Issues the user encountered while performing tasks
o Actions (both positive and negative) they took
o Comments they made
o Problems the user discovered, or unexpected paths they took
• For each observation, record the task the user was attempting to complete, and create specific categories and tags (for example, 'login issue'). It is best to do this digitally, with a tool, so that you can filter, apply tags and sort the data.
• When recording the data, make sure your statements are concise and exactly describe the issue.
o Bad example: The user clicked on the wrong link.
o Good example: The user clicked on the link for Discount Codes instead of Student Discounts.
• When you're done, your data might look similar to this:

User | Category | Task | Problem | Tag 1 | Tag 2
1 | Search | Find a red item | User unable to locate filter features | Filter | Confusion
1 | Shopping cart | Saving an item for later | 'Save for later' button did not work - it simply removed the item from the cart | Broken element | -
1 | Checkout process | Entering payment details | Accidentally exited payment process by clicking on shipping options button | Icons | Confusion
2 | Checkout process | Entering payment details | User expressed disappointment that PayPal wasn't a payment option | Payment | Disappointment
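Tagged records like these lend themselves to simple programmatic analysis. Below is a minimal Python sketch, not part of the original text, showing one way to store such observations and count recurring tags; the field names and records mirror the hypothetical table above.

```python
# Minimal sketch: storing tagged usability observations and surfacing
# recurring issues. Records mirror the hypothetical table above.
from collections import Counter

observations = [
    {"user": 1, "category": "Search", "task": "Find a red item",
     "problem": "User unable to locate filter features",
     "tags": ["Filter", "Confusion"]},
    {"user": 1, "category": "Shopping cart", "task": "Saving an item for later",
     "problem": "'Save for later' button removed the item from the cart",
     "tags": ["Broken element"]},
    {"user": 1, "category": "Checkout process", "task": "Entering payment details",
     "problem": "Accidentally exited payment process",
     "tags": ["Icons", "Confusion"]},
    {"user": 2, "category": "Checkout process", "task": "Entering payment details",
     "problem": "PayPal wasn't a payment option",
     "tags": ["Payment", "Disappointment"]},
]

# Count how often each tag occurs across all sessions.
tag_counts = Counter(tag for obs in observations for tag in obs["tags"])
print(tag_counts.most_common())  # e.g. [('Confusion', 2), ('Filter', 1), ...]

# Filter observations for one category, like sorting a spreadsheet column.
checkout = [o for o in observations if o["category"] == "Checkout process"]
print(len(checkout), "checkout-related observations")
```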
3. Analyze the data
[Fig. 6.5.3]
• Assess your data with both qualitative and quantitative measures:
o Quantitative analysis will give you statistics that can be used to identify the presence and severity of issues.
o Qualitative analysis will give you an insight into why the issues exist, and how to fix them.
• In most usability studies, your focus and the bulk of your findings will be qualitative, but calculating some key numbers can give your findings credibility and provide baseline metrics for evaluating future iterations of the website.
1. Quantitative data analysis: Extract hard numbers from the data to employ quantitative data analysis. Figures like rankings and statistics will help you determine where the most common issues are on your website and their severity.
2. Quantitative data metrics for user testing include:
o Success rate: The percentage of users in the testing group who ultimately completed the assigned task
o Error rate: The percentage of users that made or encountered the same error
o Time to complete task: The average time it took to complete a given task
o Satisfaction rankings: An average of users' self-reported satisfaction measured on a numbered scale
3. Qualitative data analysis: Qualitative data is just as, if not more, important than quantitative analysis, because it helps to illustrate why certain problems are happening and how they can be fixed. Such anecdotes and insights will help you come up with solutions to increase usability.
o Sort the data in your spreadsheet so that issues involving the same tasks are grouped together. This will give you an idea of how many users experienced problems with a certain step (e.g., checkout) and the overlap of these problems. Look for patterns and repetitions in the data to help identify recurring issues.
o Similar observations from several users can be grouped together; for example, a set of individual findings might support the overall conclusion that contact details for the company were difficult to find.
4. Prioritize the issues
[Fig. 6.5.4]
• Now that you have a list of problems, rank them based on their impact if solved. Consider how global the problem is throughout the site, and how severe it is; acknowledge the implications of specific problems when extended site-wide (e.g., if one page is full of typos, you should probably get the rest of the site proofread as well).
• Categorize the problems into:
o Critical: impossible for users to complete tasks
o Serious: frustrating for many users
o Minor: annoying, but not going to drive users away
• For example, being unable to complete payments is a more urgent issue than disliking the site's color scheme. The first is a critical issue that should be corrected immediately, while the second is a minor issue that can be put on the back burner for some time in the future.
:tion oesign (MU) 6-29
- ---- - - \
~r9
&ram g1c Model
Step 2 : Define Purpose and Scope
Step 3 : Budgetfor an Evaluation
Step 4 : Selectan Evaluator
Step 5 : Developan EvaluationPlan
Step 6 : CollectData
8(i]
tation
1eJllet1 · Step 7 : Manage the Evaluation
lillP
j\ e orting Step 8: AnalyzeData
/Jlalysisand R p • Step 9 : CommunicateFindings
Q
Fig. 6.5.5
nt Step 10: Apply Findingsand Feedbackfor ProgramImprovement
In s'ome cases, you may have the power to just make the changes yourself.In other situations, .,,.,,,1mP'°""''
you may need to make your case to higher-upsat your company • and when that happens,
you'll likely need to draft a report that explains the problems you discovered and your
proposedsolutions.
Qualitiesof an effectiveusabilityreport
It's not enough to simply present the raw data to decision-makersand hope it inspires·change.
O~
A good report should :
• Showcase the highest priority Issues : Don't just present a laundry list of everythingthat
8
program
logic model
Develop an Define
evaluation ./'I purpose and r,.re How many,
Are activities
plan C:---V scope resources how much Change In
delivered as
Planning adequateto intended? was i\n~.ieoge,
1rnp1e111ent prO<Juced? attiludes,
I
proflrar11? skills?
Budget for tj? Select an
an evaluator
evaluation
o
o
Information to collect
Fig. 6.6.2 : Steps of Plannlng
• How to read a logic model:
o Logic models read from left to right.
o There are two 'sides' to a logic model - a process side and an outcomes side.

Step 3: Budget for an evaluation
• Common cost categories are as follows:
o Information to collect
o Evaluation timeframe
o Study design: the type of study design and the size of the study

Step 4: Select an evaluator
• Experience that matches the design, methods, and/or approach of your planned evaluation
• Capacity to handle the scale of your planned evaluation
• Personal style that fits your program staff or organization
• Evaluator's independence: no conflicts of interest related to the evaluation, and the ability to provide an unbiased assessment of the program's outcomes/impacts
Step 5: Develop an evaluation plan
• Decide which methods will best address those questions. Here is a brief overview of some common evaluation methods and what they work best for.
• Why develop an evaluation plan?
o It clarifies what direction the evaluation should take, based on priorities, resources, time, and skills.
o It creates shared understanding of the purpose and use of evaluation results.
o It fosters program transparency to stakeholders and decision makers.
o It helps identify whether there are sufficient program resources to carry out the evaluation.
• Monitoring and feedback system: This method of evaluation has three main elements:
o Process measures: These tell you what you did to implement your initiative.
o Outcome measures: These tell you what the results were.
o Observational system: This is whatever you do to keep track of the initiative while it is under way.

Step 7: Manage the evaluation
[Fig. 6.6.6: Manage the evaluation - for each indicator or outcome of interest, decide what is collected and how, who collects and analyzes the data, and what support and monitoring are needed.]
..
V ru1 11c1 11a1s
__,I
__ . n oesign (MU)
V User Interaction Design (MU) 6-36 Evaluation Techniques and Framework Impact Evaluat1 ------ --•alll:l ti ... _ -
l
relative to a
o ·1 conclusive, but show prom
compari son woup? Present results that are not necessan Y
examination
0
Be careful not to overstate your findings V!~~::a~~
_,
J
--..
6-38 Evaluation Techniques and Frarnewor11
,.., Usor Interaction Design (MU)
-4.a;
Apply Findings
and Feedback
for Program
---- --
Improvement
Review Questions
Q. 1 Write a short note on: Evaluation. (5 Marks)
Q. 2 Explain the types of Evaluation. (10 Marks)
Q. 3 Write a short note on: Case studies of Evaluation. (10 Marks)
Q. 4 Explain DECIDE Framework of Evaluation. (10 Marks)
Q. 5 Write a short note on: Usability Testing. (5 Marks)
Q. 6 Explain steps of conducting an evaluation. (10 Marks)
Q. 7 Write a short note on: Conducting an Evaluation. (5 Marks)