Module 4 : Process of Interaction Design, Prototyping and Construction

Interaction Design Process, Prototyping and Conceptual Design, Interface Metaphors and Analogies.
4.1 Interaction Design Process

(Fig. 4.1.1 shows the overall interaction design process : requirements, "what is wanted", are established from "what is there" using interviews, ethnography and task analysis; analysis produces scenarios and draws on guidelines and principles; design leads to a precise specification expressed with dialog notations; a prototype is evaluated using heuristics and is then implemented and deployed, supported by architectures, documentation and help.)

Fig. 4.1.1 : Interaction Design Process

• Requirements (what is wanted) : The first stage is establishing what exactly is needed. As a precursor to this, it is usually necessary to find out what is currently happening.
• For example, how do people currently watch movies? What sort of personal appliances do they currently use? There are a number of techniques used for this in HCI : interviewing people, videotaping them, looking at the documents and objects that they work with, and observing them directly.
• In particular, ethnography, a form of observation deriving from anthropology, has become very influential.
• Analysis : The results of observation and interview need to be ordered in some way to bring out key issues and communicate with later stages of design.
• Task models are a means to capture how people carry out the various tasks that are part of their work and life.
• Task analysis is the study of the way people perform tasks with existing systems.
• Techniques for task analysis :
o Decomposition of tasks into subtasks
o Taxonomic classification of task knowledge
o Things used and actions performed.
• These techniques can be used both to represent the situation as it is and the desired situation.
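As a rough illustration of task decomposition (not an example from the text), the following Python sketch represents a hypothetical "watch a movie" task as a tree of subtasks; the task names and structure are invented purely to show the idea.

# A minimal hierarchical task decomposition, using a nested dict
# (task name -> list of subtask dicts). Illustrative only.
watch_movie = {
    "Watch a movie": [
        {"Choose a movie": [{"Browse catalogue": []}, {"Read reviews": []}]},
        {"Prepare the player": [{"Switch on the device": []}, {"Select the input source": []}]},
        {"Control playback": [{"Start playback": []}, {"Pause / resume": []}]},
    ]
}

def print_tasks(node, depth=0):
    """Print the task hierarchy, indenting each level of subtasks."""
    for name, subtasks in node.items():
        print("  " * depth + "- " + name)
        for sub in subtasks:
            print_tasks(sub, depth + 1)

print_tasks(watch_movie)

The same structure could equally describe the desired (redesigned) situation, which is why such decompositions are useful on both sides of the design.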
• Design : This is all about design, but there is a central stage when you move from what you want to how to do it. There are numerous rules, guidelines and design principles that can be used to help with this.
o Designing for maximum usability is the goal of interactive systems design.
o Abstract principles offer a way of understanding usability in a more general sense, especially if we can express them within some coherent catalog.
o Design rules in the form of standards and guidelines provide direction for design, in both general and more concrete terms, in order to enhance the interactive properties of the system.
o The essential characteristics of good design are often summarized through 'golden rules' or heuristics.
o Design patterns provide a potentially generative approach to capturing and reusing design knowledge.
• We need to record our design choices in some way, and there are various notations and methods to do this, including those used to record the existing situation. It is at this stage also that input from theoretical work is most helpful, including cognitive models, organizational issues and understanding communication.
• Iteration and prototyping : Humans are complex and we cannot expect to get designs right first time. We therefore need to evaluate a design to see how well it is working and where there can be improvements.
• Evaluation tests the usability, functionality and acceptability of an interactive system.
• Evaluation may take place
o In the laboratory
o In the field
• Some approaches are based on expert evaluation
o Analytic methods
o Review methods
o Model-based methods
• Some approaches involve users
o Experimental methods
o Observational methods
o Query methods
• An evaluation method must be chosen carefully and must be suitable for the job.
• Some forms of evaluation can be done using the design on paper, but it is hard to get real feedback without trying it out. Most user interface design therefore involves some form of prototyping : producing early versions of systems to try out with real users.
• Implementation and deployment : Finally, when we are happy with our design, we need to create it and deploy it. This will involve writing code, perhaps making hardware, and writing documentation and manuals - everything that goes into a real system.
• Time is limited : there is a trade-off between the length of the design period and the quality of the final design. This means one sometimes has to accept a design as final even if it is not perfect. It is often better to have a product that is acceptable but on time and to cost than it is to have one that has perfect interaction but is late and over budget.

4.2 Prototyping and Conceptual Design

• Iterative design : A purposeful design process which tries to overcome the inherent problems of incomplete requirements by cycling through several designs, incrementally improving upon the final product with each pass. On the technical side, this is described by the use of prototypes.


There are 3 main approaches of prototyping :
• Throwaway : The knowledge gained from the prototype is used in the final design, but the prototype is discarded.
• Incremental : The final product is released as a series of components that have been prototyped separately.
• Evolutionary : The prototype is not discarded but serves as a basis for the next iteration of the design.
• Prototypes differ according to the amount of functionality and performance they provide relative to the final product. Their importance lies in their projected realism, since they are tested on real users.
• Since providing realism in prototypes is costly, there are several problems on the management side :
o Time : Prototyping costs time, which is taken away from the real design. Therefore, there are rapid-prototyping techniques.
o Non-functional features : Some of the most important features, such as safety and reliability, cannot be tested using a prototype.
o Contracts : Prototyping cannot form the basis for a legal contract and must be supported with documentation.
4.2.1 Techniques for Prototyping

• Storyboards : A graphical depiction of the outward appearance of the intended system, without any accompanying system functionality.

• Limited functionality simulations : Programming support for simulations means a designer can rapidly build graphical and textual interaction objects and attach some behavior to those objects, which mimics the system's functionality. There are many techniques to build these prototypes. A special one is the Wizard of Oz technique, in which the system is controlled by human intervention.
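To make the Wizard of Oz idea concrete, here is a minimal sketch (an assumption-laden illustration, not a prescribed implementation) of how such a simulation might be wired up : the participant types requests to what looks like an automated system, while a hidden human operator supplies every reply. All names are invented; in a real study the two roles would sit at separate terminals.

# Minimal Wizard of Oz prototype : the "system" is really a human operator.
def wizard_of_oz_session(turns=3):
    log = []                                  # transcript kept for later analysis
    for _ in range(turns):
        request = input("USER> ")             # what the participant asks the "system"
        reply = input("WIZARD (hidden)> ")    # the hidden operator types the response
        print("SYSTEM:", reply)               # the participant only ever sees this line
        log.append((request, reply))
    return log

if __name__ == "__main__":
    transcript = wizard_of_oz_session()
    print("Collected", len(transcript), "interaction turns for evaluation.")

The value of such a session is the transcript : it shows what users actually ask for before any real functionality has been built.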

• High-level programming support : High-level programming languages allow the programmer to abstract away from the hardware specifics and think in terms that are closer to the way the input and output devices are perceived as interaction devices. This technique can also be provided by a user interface management system, in which features of the interface can be designed apart from the underlying functionality.
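As a rough illustration of this kind of support, the sketch below uses Python's standard tkinter toolkit to build interaction objects and attach stub behavior in a few lines. The widget layout and the mocked "search" behavior are assumptions made for the example only; they are not part of the text.

# Rapid GUI prototype : the button's behavior is a stub that mimics
# the eventual system functionality without implementing it.
import tkinter as tk

def on_search():
    # Placeholder standing in for the real back end.
    results.config(text="Pretending to search for: " + query.get())

root = tk.Tk()
root.title("Prototype")
query = tk.Entry(root, width=30)
query.pack(padx=10, pady=5)
tk.Button(root, text="Search", command=on_search).pack(pady=5)
results = tk.Label(root, text="(results will appear here)")
results.pack(padx=10, pady=5)
root.mainloop()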

• Prototypes can be categorized as low fidelity prototypes (e.g. paper prototypes) and high fidelity prototypes. Low fidelity prototypes do not look much like the final device. They are not made from the same materials as the final device and do not have all the functionality of the final device.
• Low fidelity prototypes can simulate some of the interactions, but perhaps not all the subtleties of the interaction.
• High fidelity prototypes look more like the final device. Your final design is an example of a high fidelity prototype. They may have some of the functions of the final product. They can test more of the subtleties of the interactions. However, they take more time to make.

• Frequently, after making a high fidelity functioning prototype, the design team or management is reluctant to throw away the prototype, and attempts to evolve the final device from the prototype. This is not always the way to achieve the best product.

Fidelity : Low
Advantages :
• Lower development cost
• Evaluate multiple designs
• Useful communication device
• Address screen layout issues
• Useful for identifying market requirements
• Proof of concepts
Disadvantages :
• Limited error checking
• Poor detailed specification for coding
• Facilitator-driven usability testing
• Limited utility after requirements are established
• Limited usefulness for usability tests
• Navigation and flow limitations

Fidelity : High
Advantages :
• Complete functionality
• Fully interactive
• User-driven usability testing
• Clearly defines navigational scheme
• Use for exploration and test
• Look and feel of final product
• Serves as a living specification
• Marketing and sales tool
Disadvantages :
• More expensive to develop
• Time consuming to create
• Inefficient for proof of concepts
• Not effective for requirements gathering

4.3 Conceptual Design


Concept design is understood in industry as a specific document. It is the integrated ideas
describing the system : appearance, function and how the user interacts with the
system. However, designers can understand concept design as a process of design, which I
consider similar to idea generation. In other words, the concept design is an idea of the total
system.
Preece, Rogers, and Sharp in Interaction Design propose that the concept can be based on
three perspectives :
• Activities: Activities that users do most, in terms of 4 interaction paradigms :
o Instructing
o Conversing
o Manipulating and navigating
o Exploring and browsing
o Passive instruction or instrumentation (the paradigms we added)
• Objects : The product of the interface or objects used in the interface
• Metaphor : Is an analogy to real world objects or processes
• A good concept design is based on these three perspectives. This can be illustrated by an example.
• Example : One of the most influential pieces of software is the spreadsheet. The first spreadsheet, VisiCalc, was designed by Dan Bricklin. Before spreadsheet software, accountants kept track of accounts using ledger books and performed calculations by hand. Accountants try to make forecasts of profits by modelling a spread of scenarios. Bricklin understood what activities accountants needed to perform and the problems they had with existing tools. Bricklin also realized that computers could make the process interactive. The concept design of the product was :
1. A spreadsheet that was analogous to a ledger book,
2. Interactivity, by allowing users to edit the cells of the spreadsheet,
3. The computer performs the calculations for a range of cells.
His concept design considered the activities, objects (generic entries, cells and columns), and had a good metaphor.
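To make the third point concrete, here is a small sketch (illustrative only, not Bricklin's actual design) of a spreadsheet-like model in Python, in which editing a cell causes a dependent formula to be recalculated over a range of cells :

# Toy spreadsheet : cells hold numbers, formulas are functions of the sheet,
# and every edit is followed by recalculation of the formula cells.
cells = {"A1": 10, "A2": 20, "A3": 30}
formulas = {"A4": lambda c: c["A1"] + c["A2"] + c["A3"]}   # behaves like SUM(A1:A3)

def recalculate():
    return {name: rule(cells) for name, rule in formulas.items()}

print(recalculate())          # {'A4': 60}
cells["A2"] = 25              # the user edits one cell interactively...
print(recalculate())          # ...and the dependent total updates: {'A4': 65}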
• Another example is the Star interface for the office, which was the precursor to desktops. What are the activities, objects, and metaphor for its concept design?
• Metaphors are very useful concepts for designing. They relate a new product to an old product. They can make the product easier to learn and use. They can assist the designer by making the interface design more consistent and by choosing design alternatives. Because the metaphor is a strong concept, it can be dangerous. Users may believe that the system should perform identically to the analogous system that they are familiar with and become baffled when it does not. In addition, designers may adhere too much to the metaphor, which can cause bad design. An example will illustrate both : I recall when I first used a text editor. The metaphor with a typewriter was clear; the cursor, an underline or gray square, was the typewriter's carriage or key location. I had a lot of experience with typewriters, so I could quickly learn to use the text editor. However, I had problems typing characters into blank spaces. I expected that positioning the cursor on a blank space should type the character over the blank space. That is what the typewriter would do. I did not realize that a blank space is a character just like any other character. In addition, the cursor, an underline, reinforced my notion. Therefore, the designers adhered too strongly to the analogy. Later cursors became vertical lines. This made clear the difference between the metaphor and the application. The designers could have done worse. They could have had the page move as the user typed. The notion that the page should remain fixed was a good one and allowed, for example, easier reading while entering content. Note that this is not how a typewriter functions. There are two morals :
1. Look for a good metaphor, but do not adhere to it too strongly; make clear the differences.
2. Try to find ways that the design can improve on the old ways of doing things.
• Concept design is about ideas; how do you come up with ideas? This can be an individual endeavour or a group activity. I suggest :
o Always keep an open mind; do not initially critique any idea that may arise
o Become involved : constantly consider all that you know about the project and the world around you
o Periodically try to force ideas by doodling or making up songs
o Keep the ideas; later mix and match them
• The standard technique for a group to generate ideas is called brainstorming. Brainstorming is a process that assures that the points above are maintained. This is especially important in a group where egos can arise. The brainstorming process typically goes as follows :
1. Gather around a table in a room that does not have distractions
2. One person briefly describes the goals of the brainstorming session
3. Each person at the table takes a turn voicing ideas, no matter how ridiculous; the ideas are not critiqued or rejected, only listed briefly
4. Members of the group are initially not allowed to opt out of voicing an idea
5. Only after there are no more ideas, evaluate the ideas : individual ideas and groups of ideas
6. Write a summary of the alternative valuable ideas.

4.4 Interface Metaphors and Analogies


• Interface metaphors : Designed to be similar to a physical entity but also have their own properties, e.g. the desktop metaphor, search engines. They exploit users' familiar knowledge, helping them to understand 'the unfamiliar'. A metaphor conjures up the essence of the unfamiliar activity, enabling users to leverage this to understand more aspects of the unfamiliar functionality. People find it easier to learn and talk about what they are doing at the computer interface in terms familiar to them.
• Benefits of interface metaphors : They make learning new systems easier, help users understand the underlying conceptual model, and can be innovative, enabling the realm of computers and their applications to be made more accessible to a greater diversity of users.
• Problems with interface metaphors (Nelson, 1990) : They break conventional and cultural rules (e.g., a recycle bin placed on the desktop); they can constrain designers in the way they conceptualize a problem space; they conflict with design principles; they force users to understand the system only in terms of the metaphor; designers can inadvertently use bad existing designs and transfer the bad parts over; and they limit designers' imagination in coming up with new conceptual models.
• Web browser metaphors (activity) : Examine a web browser you use and describe the metaphors that have been incorporated into its design. Many aspects of a web browser are based on metaphors, including : toolbars, button bar, navigation bar, favourites bar, history bar; tabs, menus, organisers; search engines, guides; bookmarks, favourites; and icons for familiar objects such as home, stop etc.
• Interface metaphors are commonly used as part of a conceptual model.

1. What is an Analogy ?
• Merriam-Webster defines an analogy as "a comparison of two things based on their being alike in some way." The Stanford Encyclopedia of Philosophy describes an analogical argument as citing "accepted similarities between two systems to support the conclusion that some further similarity exists."
2. What Purposes do Analogies Serve ?

• Analogies help us understand new concepts, teach new concepts to others, and see the familiar in a new light, which in turn enables us to generate novel solutions to problems.
• Analogies are an important part of how we make sense of new concepts and experiences. Our
minds constantly unconsciously compare new concepts to things we already know, as a way
of understanding them. We look for similarities between our experiences and any new
situation to help us understand the new and unfamiliar.
3. Solving Problems

• Analogies are also useful in problem solving. When we face an unfamiliar problem, we can think about similar problems or situations we have encountered in the past, compare their similarities and contrast their differences, and apply the lessons we learn to our current situation.
• Noticing and comparing similarities between different domains can help us generate novel solutions to problems. For example, the idea for an automobile assembly line came to Ford employee Bill Klann after observing a slaughterhouse in which a trolley system moved animal carcasses to multiple butchers, each performing a specialized task.

4. Seeing the Familiar in a Different Light
• Analogies also provide a technique for generating new ideas, in which we compare something familiar to something else that is seemingly unrelated. Asking questions such as "What else is this like?" or "Where else have I seen something like this before?" can generate analogies that enable us to see something familiar in a new light.
• There seem to be three main types of articles about UX analogies, although some articles combine two or more classes of analogies.

5. Learning from a Related Field
• Some articles compare user experience to another related field such as architecture, game design, print design, or Disney Imagineering. It's easy to see the connections and similarities between user experience and these fields. Each of them focuses on designing experiences. Examining how professionals in other domains handle similar situations can lead us to new ideas. So such articles seem more legitimate and respectable.
6. Taking a Fresh Look at User Experience
• Articles that compare user experience to something that isn't so obviously connected to it let us take a step back and see user experience from a new perspective. These comparisons enable us to re-examine elements of user experience that are so common and familiar, we don't even consciously think about them anymore.
7. Attracting Attention Through Humor
• Some comparisons aim at being humorous, and it's a stretch making a connection to user experience. Such analogies may also make some good points, but their purpose is to attract our attention with what may seem like a ridiculous comparison. For example, if I were to start out with the idea, "What User Experience Can Learn from the Simpsons," I could probably think of some connections. Presto! I'd have a unique, humorous UX article that would attract attention. But it's this type of pop-culture comparison that critics of articles based on UX analogies love to hate.
• People come to user experience from a variety of different backgrounds. Many UX professionals have transitioned into user experience after gaining their education and experience in other fields, although user experience isn't the only profession that is prone to making such analogies.
8. UX is a Multidisciplinary Field
• People come to user experience from a variety of different backgrounds. Many UX professionals have transitioned into user experience after gaining education and experience in other fields. So, naturally, we notice parallels and compare user experience with our previous fields and other topics of interest. User experience might become a boring, stagnant field if we did not bring this variety of experiences we have had in other fields to our discussions.
9. Our Analytical Minds Naturally See Comparisons
• The analytical and problem-solving skills we use in user research and design are the same skills that naturally lead to making analogical comparisons. To conduct user research or usability testing, you have to be observant and perceptive. When you analyze the findings of such research, you notice patterns, make comparisons, draw conclusions, and generate solutions to the problems you identify. It is understandable that these skills would cause us to see similarities between user experience and other fields or personal interests.

Review Questions

Q. 1 Explain Interaction Design Process. (10 Marks)
Q. 2 Explain Prototyping and Conceptual Design. (10 Marks)
Q. 3 Write Advantages and Disadvantages on Fidelity. (5 Marks)
Q. 4 Explain the conceptual design steps. (10 Marks)
Q. 5 Explain Interface Metaphors and Analogies. (10 Marks)


Design Rules and Industry Standards

3. Standards dealing with the product development process

ISO 9241-210 (2010) Human-centered design for interactive systems
1) This standard provides guidance on human-centered design activities throughout the development life cycle of interactive computer-based systems.
2) It is a tool for those managing design processes and provides guidance on sources of information and standards relevant to the human-centered approach.
3) There are four essential user-centered design activities which should be planned for and undertaken in order to incorporate usability requirements into the development process. These are as follows :
a) Understand and specify the context of use
b) Specify the user requirements
c) Produce design solutions to meet user requirements
d) Evaluate the designs against requirements

ISO TR 18529 (2000) Ergonomics of human-system interaction - Human-centered lifecycle process descriptions
1) This Technical Report contains a structured and formalized definition of the human-centered processes described in ISO 13407 :
a) HCD.1 Ensure HCD content in system strategy
b) HCD.2 Plan and manage the HCD process
c) HCD.3 Specify the user and organizational requirements
d) HCD.4 Understand and specify the context of use
e) HCD.5 Produce design solutions
f) HCD.6 Evaluate designs against requirements
g) HCD.7 Introduce and operate the system

Review Questions

Q. 1 Write a short note on Design Principles. (5 Marks)
Q. 2 Explain Usability in details. (10 Marks)
Q. 3 Write a short note on standards. (6 Marks)
Q. 4 Explain ISO standards. (5 Marks)

Evaluation Techniques and Framework

Why, What, Where and When of Evaluation, Types of Evaluation, case studies, DECIDE framework, Usability Testing, conducting experiments, Field studies, Heuristic Evaluation and walkthroughs, predictive models.

6.1 Why, What, Where and When of Evaluation

• Outcomes evaluation is a valuable tool for program managers who are seeking to strengthen the quality of their programs and improve outcomes for the children and youth they serve.
• Evaluation answers basic questions about a program's effectiveness, and evaluation data can be used to improve program services.
• In this brief, we define evaluation as a systematic method to study a program, practice, intervention, or initiative to understand how well it achieves its goals. Evaluations help determine what works well and what could be improved in a program or initiative.

6.1.1 Why Evaluation ?

• It means simply to assess the value of something. In this book, it means helping those who are involved in many different kinds of development programmes to assess the value of what they are doing.
• Many of them are already monitoring their own work and may have taken part in evaluating it in a systematic way.
• When you ask people why they evaluate their work, different people give different answers. Here are some of the actual answers people have given:
(Fig. 6.1.1 shows people giving typical answers to the question "Why did you evaluate?" : "To help us see where we are going and if we need to change direction", "To help us make better plans for the future", "To measure progress", "To see if our work is effective", and "To compare our programme with others like it".)

Fig. 6.1.1 : Why evaluation needed

6.1.2 Why did you Evaluate ?

These were the reasons given for doing evaluation :
• Achievement (seeing what has been achieved)
• Measuring progress (in accordance with the objectives of the programme)
• Improving monitoring (for better management)
• Identifying strengths and weaknesses (to strengthen the programme)
• Seeing if effort was effective (what difference has the programme made ?)
• Cost benefit (were the costs reasonable ?)
• Collecting information (to plan and manage programme activities better)
• Sharing experience (to prevent others making similar mistakes, or to encourage them to use similar methods)
• Improving effectiveness (to have more impact)
• Allowing for better planning (more in line with the needs of people, especially at community level)

• Some of these key reasons are easy to understand; some are more difficult. When you are involved with those at community level and helping them to participate in evaluation, it is useful to use words and meanings that are even more clear than those given above.
• One way you can start to do this is by telling a story about people's actual experiences, such as this one : some community development workers from a country area were evaluating a bus journey along a road. While they could see through the glass windows they were happy, because they could see that they were making progress. Then rain forced them to put wooden shutters over the windows, and they could no longer assess their progress. They knew they were moving forward, but could not tell along which road, how fast, or even whether they were nearing their destination.

Fig. 6.1.2 : Example - a truck

• Evaluation is like looking to see where and how fast you are going, and then estimating when you are likely to reach your destination.
• So, from the answers that people gave it is clear that evaluation has been carried out mainly as a way of looking at programme activities, that is, examining and assessing facts and figures, in order to : monitor progress and effectiveness, consider costs and efficiency, show where changes were needed, and help to plan more effectively for the future.
• However, there is also another group of reasons for evaluating, and these are a little different. Here are some of those reasons, which become clearer if we look at the answers that people have actually given.
o 'Because our funding agency asked for it.'
o 'Because the ministry asked for it.'
o 'Because our sponsors wanted to see whether they wished to go on supporting our programme.'
o 'Because the researchers wanted to try out new evaluation techniques.'
o 'Because new material was needed for publicity purposes.'
• From this group of answers it is clear that evaluation has been carried out for another group of reasons.
• However, these reasons may not have been clear to all those who were actually involved in the evaluation.
• For example, evaluations have been carried out with some programme participants believing that the results would be used to make decisions about the further funding of a programme, when in reality the decisions on further funding had already been taken by the programme funders before the evaluation began. So the evaluation results did not really make any difference to the decisions that had to be made on funding.
• Evaluation has also been used in some cases as a way of justifying a weak or unsuccessful programme, or as a way of trying to cover up areas of programme failure.
• For example, some evaluations have looked only at those parts of a programme that were successful, not at the whole programme.
• If only the successful parts of a programme are being looked at, there is less chance that the weak parts will be noticed.
• A few evaluations have even resulted in the destruction of programmes.
• This has happened, for example, where a programme had powerful individuals or powerful groups of people who did not agree with the activities or the objectives of the particular programme being carried out.
• Fortunately, the vast majority of programme evaluations are not carried out for these kinds of reasons. However, they are worth remembering.

6.1.3 What Evaluation ?

• Evaluation is a systematic method for collecting, analyzing, and using information to answer basic questions about a program.
• To evaluate is defined as to judge the value or worth of someone or something. An example of evaluating is when a teacher reviews a paper in order to give it a grade. It will take several years to evaluate the material gathered in the survey.
• While there are many different types of program evaluations, and many terms are used to describe them, evaluations are typically divided into two major categories : process evaluations and outcome evaluations.
• Process evaluations assess whether an intervention or program model was implemented as planned, whether the intended target population was reached, and the major challenges and successful strategies associated with program implementation.
• Outcome evaluations determine whether, and to what extent, the expected changes in child or youth outcomes occur and whether these changes can be attributed to the program or program activities.
• Each evaluation can help you make better decisions by giving you the right kind of data at the right time.

Stage of project : Conceptualization Phase
Purpose : Helps prevent waste and identify potential areas of concern while increasing chances of success.
Evaluation : Formative Evaluation

Stage of project : Implementation Phase
Purpose : Optimizes the project, measures its ability to meet targets, and suggests improvements for improving efficiency.
Evaluation : Process Evaluation, Outcome Evaluation, Economic Evaluation

Stage of project : Project Closure Phase
Purpose : Insights into the project's success and impact, and highlights improvements for subsequent projects.
Evaluation : Impact Evaluation, Summative Evaluation, Goals-Based Evaluation

• The best development projects conduct different types of evaluations, constantly looking to streamline their project or program at different stages and using different metrics.

6.1.4 Principles of Evaluation

Evaluation is a systematic process of determining to what extent instructional objectives have been achieved. Therefore the evaluation process must be carried out with effective techniques. The following principles will help to make the evaluation process an effective one :

1. It must be clearly stated what is to be evaluated
• A teacher must be clear about the purpose of evaluation.
• He must formulate the instructional objectives and define them clearly in terms of the student's observable behaviours.
• Before selecting the achievement measures, the intended learning outcomes must be specified clearly.

2. A variety of evaluation techniques should be used for a comprehensive evaluation
• It is not possible to evaluate all the aspects of achievement with the help of a single technique.
• For better evaluation, techniques like objective tests, essay tests, observational techniques etc. should be used, so that a complete picture of the pupil's achievement and development can be assessed.

3. An evaluator should know the limitations of different evaluation techniques
• Evaluation can be done with the help of simple observation or highly developed standardized tests. But whatever the instrument or technique may be, it has its own limitations.
• There may be measurement errors. Sampling error is a common factor in educational and psychological measurements.
• An achievement test may not include the whole course content.
• Error in measurement can also be found due to students guessing on objective tests.
• Error is also found due to incorrect interpretation of test scores.

4. The technique of evaluation must be appropriate for the characteristics or performance to be measured
• Every evaluation technique is appropriate for some uses and inappropriate for others.
• Therefore, while selecting an evaluation technique, one must be well aware of the strengths and limitations of the techniques.

5. Evaluation is a means to an end but not an end in itself
• The evaluation technique is used to take decisions about the learner.
• It is not merely gathering data about the learner, because blind collection of data is a wastage of both time and effort; the evaluation is meant for some useful purpose.

6.1.5 Functions of Evaluation

The main aim of the teaching learning process is to enable the pupil to achieve intended learning outcomes. In this process the learning objectives are fixed; then, after the instruction, learning progress is periodically evaluated by tests and other evaluation devices. The functions of the evaluation process can be summarized as follows :

1. Evaluation helps in preparing instructional objectives
• Learning outcomes expected from class-room discussion can be fixed by using evaluation results.
• This can only be possible when we identify the instructional objectives and state them clearly in terms of intended learning outcomes. Only a good evaluation process helps us to fix up a set of perfect instructional objectives.

2. Evaluation process helps in assessing the learner's needs
• In the teaching learning process it is very much necessary to know the needs of the learners. The instructor must know the knowledge and skills to be mastered by the students.
• Evaluation helps to know whether the students possess the required knowledge and skills to proceed with the instruction.

3. Evaluation helps in providing feedback to the students
• The evaluation process helps the teacher to know the learning difficulties of the students.
• It helps to bring about an improvement in different school practices. It also ensures an appropriate follow-up service.

4. Evaluation helps in preparing programmed materials
• Programmed instruction is a continuous series of learning sequences. First the instructional material is presented in a limited amount, then a test is given on the instructional material. Feedback is provided on the basis of the correctness of the response made. So without an effective evaluation process, programmed learning is not possible.

5. Evaluation helps in curriculum development
• Curriculum development is an important aspect of the instructional process. Evaluation data enable curriculum developers to determine the effectiveness of new procedures and identify areas where revision is needed.
• Evaluation also helps to determine the degree to which an existing curriculum is effective. Thus evaluation data are helpful in constructing the new curriculum and evaluating the existing curriculum.

6. Evaluation helps in reporting pupil's progress to parents
• A systematic evaluation procedure provides an objective and comprehensive picture of each pupil's progress.
• This comprehensive nature of the evaluation process helps the teacher to report on the total development of the pupil to the parents. This type of objective information about the pupil provides the foundation for the most effective co-operation between the parents and teachers.
7. Evaluation data are very much useful in guidance and counselling
• Evaluation procedures are very much necessary for educational, vocational and personal guidance.
• In order to assist the pupils to solve their problems in the educational, vocational and personal fields, the counsellor must have an objective knowledge of the pupils' abilities, interests, attitudes and other personal characteristics.
• An effective evaluation procedure helps in getting a comprehensive picture of the pupil, which leads to effective guidance and counselling.

8. Evaluation helps in effective school administration
• Evaluation data help the administrators to judge the extent to which the objectives of the school are being achieved, to find out strengths and weaknesses of the curriculum, and to arrange special school programmes.
• It also helps in decisions concerning admission, grouping and promotion of the students.

9. Evaluation data are helpful in school research
• In order to make school programmes more effective, researches are necessary.
• Evaluation data help in research areas like comparative study of different curricula, effectiveness of different methods, effectiveness of different organizational plans, etc.

6.1.6 Where and When of Evaluation ?

6.1.6(A) When of Evaluation does ?

• An evaluation is often planned and prepared outside the area where it is to take place.
• For example, an external evaluator may arrive carrying a plan, and with a clear idea of which methods should be used in the evaluation. Questionnaires may already have been prepared, even though these will be tested first at community level, before they are used on a large scale.
• At the end of an evaluation the information collected is often taken away to be analysed and reported, usually in the form of a written report. This is often the case where a computer is used to analyse the information.
• Sometimes the evaluation results are taken away, not only out of the area, but out of the country.
• For example, an external evaluator may not have enough time to write up the results before leaving for home. Where this happens, people who have participated in planning and carrying out the evaluation are not able to take part in one of its most important and interesting stages - that of producing its results.
• By taking part in analysing and reporting the results of evaluation, participants gain a deeper understanding of programme progress, strengths and weaknesses. They can see where and why changes are needed, and can plan how to put them into practice.
• The process of evaluation should be circular, like a wheel. All the parts should fit together so that the wheel can move along smoothly. If one part of the wheel is missing, it is not useful. Participants need to be involved in all parts of the evaluation process. If they are left out of one part, it is like breaking the wheel.

(Fig. 6.1.3 shows the evaluation cycle as a wheel, with stages such as carrying out the evaluation and using the results to improve the programme fitting together.)

Fig. 6.1.3 : Evaluation cycle

• Evaluation is like a wheel turning. Don't let the wheel be broken.

Using materials and resources from 'outside'
• It is likely that some of the documents and materials used in the evaluation will come from outside the evaluation area. The greater the use of outside materials and resources, the harder it will be to develop self-reliance.

6.1.6(B) Where Evaluation does ?

• With regular monitoring, which is built into an ongoing programme, there is a need for evaluation at regular intervals to prevent the piling up of information and also to obtain a clearer picture of programme progress and impact.
• For some programmes a two-to-three year gap between evaluations works well.
• However, the decision about when to evaluate will depend, in each case, on many factors affecting a particular programme.

Example
1. Has the programme got long-term objectives ? Perhaps less than two years is too soon; it will be useless to try and evaluate progress too soon. Has the programme got short-term objectives (like a two-month vaccination programme) ?
o , ~e ell
monitoring methods are already used 7 1 '
,, ~· ' '"' p' ow '''"" ""
3. Do records need to be gathered from places far away before the evaluation starts 7 Are there o arly I.mprovements to the program
5
going to be many records to look at? \~y~elP Jllake
roject
e ma nagers to refine or improve the program
4. Will extra time be required to do this? o ~lov/sP
0
\
5. Are extern al evaluators to be Involved in the evaluation 7
6. If so, this will affect timing. When are they able to come and how Jong will they take to get to
the programme area i
,o'il
5
'ionduct theY
aJIIPare
wer
rveys and focus group discussionsamong the tar t
\e SUlikely to need, understand,and accept programelements.
ge popu\at1on focused on \
;tie 1uat1on
7: How long can they stay? What about climate and seasons 7 sE"a . g') \1
8. In the rainy season is it possible to reach isolated communities.? r•·..,,i:eS
n as , prog ram monitonn .
\ Mso •on oc \\
\
In the dry season rivers may dry up and communities cannot be reached by boat. In a city I !{llOW curs once program 1rnplementation has begun, and it measures how
people may find it harder to concentrate on evaluation in the hot season. What about people's ,,,,,:ess eva\uatl rogram 's procedures are. The data it generates 1s useful in 1denti[ymg
ffectlve . nd
P stream lining processes, and portrays the program's status to external
times? p• - . your
~efficiencies a IL
9. During haP\'.esttime will people have time and interest to spare for evaluation? \
parnes. i'
At certain times of the year people have less food and money. \II
,..,en . plementatiqn begins
10. Is it possible to choose a time when people may be more relaxed and willing to give time and \ "" ram im
When prog . f an existing program
attention to evaluation ? What about the time of the programme staff? o . operation o
11. They also have particularly busy times. Which is the best time for them ? What about
ministries and outside agencies ? (such as departments, funding agencies and organisations)
0
ounng
What
\\
\I
Whether program goals and strategiesare workingas they should
They will also have specific ideas about timing. 0
Whether the pr o~ram /s reaching Its targetpopulation, and whattheythink aboutit \I,I
0
I
6.2 Types of Evaluation Why
Providesan opportunity to avoid problemsby spottingthem early \1
0
Different types of evaluations, constantly looking to streamline their project or program at o Allows program administrators to determinehow well the programis working
different stages and using different metrics .
How
1. Formative Evaluation Conduct a review of int ernal reports and a survey of program managers and a sample of the \I
\ ~rget population. Th e aim should be to measure the number of participants, how longthey
(Also known as 'evaluability assessment')
Formative evaluation is used before program design or implementation. It generates data on haveto wait to receive benefits, and what their experiencehas been. \I
I
the need for the program and develops the baseline for subsequ ent monitoring. It also
\ Outcome Evaluation
identifies areas of improvement and can give insights on what the program's priorities should
be. This helps project managers determine their areas of concern and focus, and increases I tAlsoknownas 'objective-based evaluation') . ram implementation. It generates data
awareness of your program among the target population prior to launch. Outcome evaluation is conventionally used during prog ttributab\e to the
o h d those outcomes are a
n t e program's outco mes and to what egree h sbeenandhe\psmake
When · your program a
ff
program itself. It is use ful in measuring how e ective
o New program development It m h • t nded benefits.

..
ore effective in terms of delivering t em e ~~!i ,
o Program expansion
i, T1<hbewtd1i
r n
C 11 : ~\I ~ I
.,,,.1.,.;,,,.,
~r U ser Int eractio n Design (MU)

When
6 - 12
Evaluation Techn iques and F
rarneworl(
.I 1~---------..
·on Design (MU)
1n1eract,"'.. """"

E:valuatklni .
0
After the program h as run for so m e time period ,., a=lysls of "• '""'"• by , _ect,n~u•sa~F .
" 5'/ rnatic 011ecting data - rarnewor1c
- '<\l

ste rs of work. It Will also requir on Progra


o At an appropriate time to measure outcomes against se t targets - usually benchmarked n-hOU e a surve costs Incl d
time periods d ma determine potential areas of Wast Y of Progra 111 om ' u ing capft,f
iJ1
pePulatl.Evaluation
on to · e. 111 leers •nd th e targ,
What
..act h .
0 1
!ll'r·ct tion studies t e entire Program from b . .

-~
How much the program ha s affected the target population
evafua eg1nn1ng to end (
o Clearly establish the degree of ben e fit provided by the program 1111P3 • at) and looks to quantify Whether or not it h b or at whateverstage the
• Why rllgrarn is. act,
impact evaluation is useful for meas .
as een suecessful. Focused on the
P rn 1mp ' unng sustainedcha
0 Jong-ter m or making policy changes or modificationsto th _nges broughtabout
progra e Program. •
Helps program administrators tell wheth e r a program is meeting its objectives . .

o Insights from outcome-focus e d feedback can help increase effectiveness Wbentthe en d of the program
How A re-se Iec ted intervals in the program
o , AtP
0

A randomized controlled trial, comparing the status of beneficiaries before and during the WJiat
program or comparing beneficiaries to similar people outside of the program. This can be Assesses the change in the target populati , '
o · on swell-being
done through a survey or a focus group discussion. Accounts for what would have happened ifth h
ere ad been no program
4. Economic Evaluation WbY
To show proof of impact by comparing benefician· .th
(also known as 'cost analysis' , 'cost-effectiveness evaluation', 'cost-benefit analysis', and o es wt controlgroups
0 Provides insights to help in making policy and funding decisions.
'cost-utility analysis')
How
Economic evaluation is used during the program's implementation and looks to measure the
A macroscopic review of the program, coupled with an extensive survey of program
benefits of the programs against the costs . Doing so generati:s useful quantitative data that
participants, to determine the effort involved and the impactachieved.lnsigh~from ;rogram
measures the efficiency of the program. This data is like an audit, and provides useful officers and suggestions from program participantsare also useful, and a control group of
information to sponsors and backers who often want to see wlpt ben efi ts their money would non-particip;mts for comparison is helpful.
bring to beneficiaries. l Summatlve Evaluation
When . · th , mpletionor at the end of a program
Summative evaluation is conducted after e program5 co uJ ·tlon.
· ct deliveredbenefitsto the target pop a
0 At-the beginning of a program, to remove potential leakages cycle. It generates data about how well the proJe . h hat they have achieved,
. useful for program admm1strators
·11 1s . . . stify the proJect. s ow w .
to JU
0 During the operation of a program ; to find and remove in e fficienci es.
an_d lobby for project continuation or expansion.
What
;, When
0 What resources are being spent and where
o At the end of a program
0 How th ese costs are translating into outcom es
0 At the end of a program cycle
Why
What , d . ed change happen
o Program manag e rs and funders can justify or streamline costs 0 de the esir .
How effectively the program ma m participants
o The program can be modif'ied to deliver more results at lower costs d he uves of progra
~ Tec:IIK.. ml1ll1i How th e pro gra m chang~
P II D I I <• I :I n1
oeslg_ri_{MU)
V User InteractionDesign (MU)

--1es of Ev11111.
6-14
Why --:
1
o Provides data to Justify continuingthe program
o Generates Insights Into the effectivenessand efficiencyof the program
l l"dl(lllngarideo mon!,t by..,.,....
rr'1graJ!l
• How ('.
11
Conduct a review of Internal reports and a survey for program managers and target "'I • •" 111,,.,.wd-log • ""°""'""fu, • "'•
.......
how the walkthrougbrnelhOd \Vu>L
..,1....,_
· ·

populations. The aim should be to measure the change that the projecthas broughtabout and
compare the change to the costs.
1
~I ,,,• e are mmlaglb,VCR to d
,.. r'"'"
'••~"'°"'" , ,,,..,.._
recorder "'c•1
•..,,.,
7. Goats-Based Evaluation ,,, " . !,show, In ~, 6J.t lo, ''""' .
• ,,b It"'•~!rates the~
. I d•"'"lcture on the right after the timed record
1<inill3 . •
(Also known as 'objectivelyset evaluation)

Goals,based evaluation Is usually done towards the end of the program or at previously
'"' althuse,
,,
...,
program up to three tltned recordings in d'" ,
to stream number is autornattcanyassigned."''"•
"......
P1'sstd.Th, vat111.
io1~s • '"' bl W """ ·
agreed-upon Intervals. Development programs often set 'SMART' targets · Specific, . 1
, _.
• ,,.11, •
orts oho .,.,, task W, ....
. ldo,°"~, -._,"' ....
••,, ..........
Measurable, Attainable, Relevant,and Timely• and goals-basedevaluationmeasuresprogress
towards these targets. The evaluation is useful in presenting reports to program
to
" ti••·'"'
ndeOign SOPP 2005. nl , program '""'"' •t
11febf1J3l)'
"'° "' ""'".• , 1'ts. ·-
...
'•
administrators and backers, as it provides them the informationthat was agreed upon at the
start of the program.
Ume: 21:45 stan:
• When
111\d:
channel: 3 chan11e1:
o At the end of the program
date:
o At pre-decided milestones
• What i' [D0[]1 m001
0 # How the program has performed on Initial metrics 000 000
0 Whether the program has achievedits goals 000@1 000@]
• Why [C<l[IJa; ~[I]m.
o To show that the program is meetingits initial benchmarks ~[El~ ~[£][!!
0 To review the program and its progress

• How Rg. 8.3.1: An lnltllll'IIIIOtlconlnlldlllgn


o This depends entirely on the goals that were agreed upon. Usually,goals-based evaluation •
We will assume tliat the user ts famlllarwith VCRs but not with this partiallar 1'111
would involve some survey of the participants to measure impact, as well as a review of
1111 step In the walktbr,JJupts ta liientllytile action sequeJKtfor this Wt,..
Input costs and efficiency. llnn.!of the user'sac:thJit(U~),and'tliepjdllpl,1.ot,apoase(SDJn, ··
o Development programs with effective monitoring and evaluation libeleft-hindplcturetnPig. 6;J.1: . " . .
use different types of evaluation at different points of time. Some programs m
DA 1: Press the''ttllt
even run two different types of evaluation at the same time for entirely dlffi
SD.t I Display~
purposes.
'(r Use r Interaction Des ign (MU) ction Design (MU)
( -
• Why 6-14
Evaluation Techniques and Frarnew r11
0
-- es of E
~•atuau
o Pt·ovtdes data to Justify conttnulng the program on'!(:h .
Sl~dY 1 n,q,~ ano,
• o Generates Insights Into the effectiveness and efficiency of the program
How ' ..t11rnrn1ng
ffl'&'
, a v1'deo record .
er by relllote •arne••ori.
con..401

Cooduct a <eview of lot,cnal ,ep 0 ru; aod a ,u~,y fo, pn,grnm maoag, cs aod ~"'"
ie te how the Walkthroughlllethod
Popula•oo,
compare the. The aimtoshould
change be to me,su,e <he ehaoge <hat the pn,Ject hu bn,ught abo"' '"'
the costs. 1Uustra Works Usin
[:,,e can re designing a remote ~ontro1 for a '"d gaslll!p\eexaillPle
7. Goals-Based Evaluation wea V ·ieorec d .
a~ne ogramming the CR to do tlrned record\n or er (VCR) and
(Also known as 'objectively set evaluation) ' "ofp•
,, , I des
. .
,eta 3 · ign is shown in Fig. 6,3.1, The Pl,..,,
·•ore on the
" '", ,,... 1,

. .....
r init1 h picture on the right after the titned r 1eft Illustrates the h
Goals-based evaluat100 1, usually dooe
agreed-upon Intervals.
tow.,,, <he '"' of the prog~m °' a, ,..,,,
Development programs often set 'SMART' targets - Specific,
0 .
,,,, 5
'• "''' " • te,to progc•m UPto
~o'/1
1
. .........,
'"•db,,,,
'••d.....,." dff has been Pressed. The VCR
ii ble stream number is automaticallyassi"" d ,., ·
"'u'
Measurable, Attainable, Relevant, and Timely - and goals-based evaluation measures progress ava a ...e · vve want to k
.,, neXt rts the user's task. We begin by identifyinga r now whether our
towards these targets. The evaluation Is useful In presenting reports to program "' ,pPo . ... ... ..,, ... "" \1
,...
i,deo
• . -reoocd. pro--., •t1aoo.., • .,•••• .,,15 gr,IOthe
administrators and backers, as it provides them the Information that was agreed upon at the iifebruary 2005. --
start of the program. • ,ro, . • ''""'' •
• When
time: 21:45
0
At the end of the program
0
At pre-decided milestones
channel: 3 \
• What

o • How the program has performed on initial metrics


0J00B
00ill DJ0ms j.
o Whether the program has achieved its goals 000
• Why
[?]00@] 00mm I

o To show that the program is meeting its initial benchmarks


[<)CIJll
~ITJ[ffi
~oom J1
0 To review the program and its progress
~[TI~
• How
Fig. 6.3.1 : An lnltlal remotecontroldesign 1
o This depends entirely on the goals that were agreed upon. Usually,goals-based evaluation
1
would involve some survey of the participants to measure impact, as well as a review of We Will assume that the user is familiar with VCRs but not with this particulardesign. The

0
input costs and efficiency.
Development programs with effective monitoring and evaluation frameworks
next 5tep in the walkthrough is to identifythe action sequencefor this task. Wespecify!hisin
~nns of the user's action (UA) and the system'sdisplayor response(SD). The initialdisplayIs
1I
use different types of evaluation at different points of time. Some programs might ij lhe left-handpicture in Fig. 6.3.1:
0
even run two different types of evaluation at the same time for entirely different UA 1: Press the 'timed record' button safter'start:' \ '1
purposes . SD.1: Display moves to timer mo_de. Flashing
· cursorappear
0
UA 2: Press digits 1 8 0 0
I.
V :~.~~
·.':'~~!~
-~u ~•
I .iser ""u·· ' ' '·r' -~ ', ''

-
assume that the'~- ---~ , ;.~"!!>,., ". . .. _,~
6-16
EvaluationTechniques and Framework v ble to '· - ,•-~--~ ~""i;;;,·c -=~~ ........,
V User Interaction Design (MU)

o SD 2 : Each digit is displayed as typed and flashing cursor moves to next position
o UA 3 : Press the 'timed record' button
o SD 3 : Flashing cursor moves to 'end:'
o UA 4 : Press digits 1 9 1 5
o SD 4 : Each digit is displayed as typed and flashing cursor moves to next position
o UA 5 : Press the 'timed record' button
o SD 5 : Flashing cursor moves to 'channel:'
o UA 6 : Press digit 4
o SD 6 : Digit is displayed as typed and flashing cursor moves to next position
o UA 7 : Press the 'timed record' button
o SD 7 : Flashing cursor moves to 'date:'
o UA 8 : Press digits 2 4 0 2 0 5
o SD 8 : Each digit is displayed as typed and flashing cursor moves to next position
o UA 9 : Press the 'timed record' button
o SD 9 : Stream number in top right-hand corner of display flashes
o UA 10 : Press the 'transmit' button
o SD 10 : Details are transmitted to video player and display returns to normal mode.
• Having determined our action list we are in a position to proceed with the walkthrough. For each action (1-10) we must answer the four questions and tell a story about the usability of the system. Beginning with UA 1 : Press the 'timed record' button :
• Question 1 : Is the effect of the action the same as the user's goal at that point ?
  The 'timed record' button initiates timer programming. It is reasonable to assume that a user familiar with VCRs would be trying to do this as his first goal.
• Question 2 : Will users see that the action is available ?
  The 'timed record' button is visible on the remote control.
• Question 3 : Once users have found the correct action, will they know it is the one they need ?
  It is not clear which button is the 'timed record' button. The icon of a clock (fourth button down on the right) is a possible candidate, but this could be interpreted as a button to change the time. Other possible candidates might be the fourth button down on the left or the filled circle (associated with record). In fact, the icon of the clock is the correct choice, but it is quite possible that the user would fail at this point. This identifies a potential usability problem.
• Question 4 : After the action is taken, will users understand the feedback they get ?
  Once the action is taken, the display changes to the timed record mode and shows familiar headings (start, end, channel, date).
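The action list and the four questions can also be kept in a small script while the walkthrough is carried out, so that every 'no' answer is captured as a usability problem. The following Python sketch is only an illustration (it is not part of the walkthrough method itself), and all names in it - Action, FOUR_QUESTIONS, sequence - are invented for the example.

```python
# Illustrative sketch only: recording a cognitive walkthrough so that each
# user action (UA) is checked against the four walkthrough questions.
from dataclasses import dataclass, field

FOUR_QUESTIONS = (
    "Is the effect of the action the same as the user's goal at that point?",
    "Will users see that the action is available?",
    "Once users have found the correct action, will they know it is the one they need?",
    "After the action is taken, will users understand the feedback they get?",
)

@dataclass
class Action:
    user_action: str                 # UA - what the user does
    system_display: str              # SD - how the system responds
    answers: dict = field(default_factory=dict)   # question number -> (yes/no, note)

sequence = [
    Action("Press the 'timed record' button",
           "Display moves to timer mode, flashing cursor after 'start:'"),
    Action("Press digits 1 8 0 0",
           "Each digit is displayed as typed; cursor moves to next position"),
    # ... remaining UA/SD pairs from the task ...
]

# Answers for UA 1, based on the discussion above: question 3 reveals a problem.
sequence[0].answers = {
    1: ("yes", "Timer programming is the user's first goal"),
    2: ("yes", "Button is visible on the remote control"),
    3: ("no",  "Not clear which button is 'timed record' - the clock icon is ambiguous"),
    4: ("yes", "Display shows the familiar start/end/channel/date headings"),
}

# Every 'no' answer becomes a potential usability problem for the report.
problems = [(ua, FOUR_QUESTIONS[q - 1], note)
            for ua, act in enumerate(sequence, start=1)
            for q, (ans, note) in act.answers.items() if ans == "no"]
print(problems)   # -> [(1, "Once users have found ...", "Not clear which button ...")]
```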
Case Study : Designing an experiment

• Imagine you are designing a new interface to a multimedia application, and you wish to know which of two styles of icon design is easier for users to remember. One set of icons uses naturalistic images (based on a paper document metaphor), the other uses abstract images (see Fig. 6.3.2).

Fig. 6.3.2 : Abstract and concrete icons

• You need to design an experiment to help you decide which style to use. The first thing to do is form a hypothesis. You might expect the natural icons to be easier to recall, since they are more familiar, so the hypothesis is : users will remember the natural icons more easily than the abstract ones.
• The null hypothesis in this case is that there is no difference between recall of the two icon types. This hypothesis clearly identifies the independent variable for our test : we are varying the style of icon.
• The independent variable has two levels : natural and abstract.
• However, when we come to consider the dependent variable, it is not so clearly expressed in our hypothesis. How can we measure this ?
• First we need to clarify exactly what we mean by the phrase 'more easily' : are we concerned with the user's performance in terms of accurate recall or in terms of speed, for example, or are we looking at more subjective measures like user preference ?
• In this example, we will assume that the speed at which a user can accurately select an icon is an indication of how easily it is remembered.
• Our dependent variables are therefore the number of mistakes in selection and the time taken to select an icon.
• Of course, we need to control the experiment so that any differences we observe are clearly attributable to the independent variable, and so that our measurements of the dependent variables are comparable.
• To do this, we provide an interface that is identical in every way except for the icon design, and a selection task that can be repeated for each condition.
• The latter could be either a naturalistic task (such as producing a document) or a more artificial task in which the user has to select the appropriate icon to a given prompt.
• The second task has the advantage that it is more controlled (there is little variation between users as to how they will perform the task) and it can be varied to avoid transfer of learning.
• Before performing the selection task, the users will be allowed to learn the icons in controlled conditions : for example, they may be given a fixed amount of time to learn the icon meanings.
• The next stage is to decide upon an experimental method. This may depend on the participants that are available, but in this case we will assume that we have sufficient participants from the intended user group.
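Once the selection data has been collected, the hypothesis can be tested statistically. The chapter does not prescribe a particular test, but an independent-samples t-test on the mean selection times is one common choice for a two-level independent variable. The Python sketch below is only an illustration with invented data; it assumes SciPy is available.

```python
# Illustrative sketch only: comparing mean icon-selection times for the two icon
# styles with an independent-samples t-test. The data values are invented, and
# the choice of test is an assumption - the text leaves the method open.
from scipy import stats

# Time (seconds) to correctly select an icon, one value per participant.
natural_times  = [1.9, 2.1, 1.7, 2.4, 2.0, 1.8, 2.2, 1.6]
abstract_times = [2.6, 2.9, 2.4, 3.1, 2.7, 2.5, 3.0, 2.8]

t_stat, p_value = stats.ttest_ind(natural_times, abstract_times)

# If p_value is below the chosen significance level (e.g. 0.05), we reject the
# null hypothesis that the two icon styles are remembered equally easily.
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```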
6.4 DECIDE Framework

• DECIDE is a framework that is used to guide evaluation :
   D : Determine the goals the evaluation addresses
   E : Explore the specific questions to be answered
   C : Choose the evaluation paradigm and techniques to answer the questions
   I : Identify the practical issues
   D : Decide how to deal with the ethical issues
   E : Evaluate, interpret and present the data

Fig. 6.4.1 : The DECIDE framework

1. D - Determine the goals of evaluation

• There are four goals of evaluation :
   o To understand the real world, i.e. understand how users employ technology in the real world and how designs can be improved to fit the work environment better
   o To enable choosing the best alternative among the various designs
   o To check whether the product being made is of the standard required by the users
   o To determine if the target of our project is good enough
• Some examples of goals :
   o Identify the best metaphor on which to base the design
   o Check to ensure that the final interface is consistent
   o Investigate how technology affects working practices
   o Improve the usability of an existing product

2. E - Explore the specific questions to be answered

(a) What is the purpose of the product ? The purpose of this product is to bring together several bookshops under one site so that it is easy to search for books and buy books on the same site.
(b) Who will be the users/stakeholders ? The users will be the customers, the suppliers, the bookshop owners, the advertisers and also the owners of the site.
(c) How is it better than previous interfaces ? The project is better than previous interfaces because we aimed to capture all the users' needs and wants.
(d) What are the users' attitudes towards the website ? The users were very responsive to the website because they saw that the site was saving them time and also money.
(e) For example, the goal of finding out why many customers prefer to purchase paper airline tickets rather than e-tickets can be broken down into sub-questions :
   • What are customers' attitudes to these new tickets ?
   • Are they concerned about security ?
   • Is the interface for obtaining them poor ?

3. C - Choose the evaluation paradigm and techniques to answer the questions

• This looks at how the data is analyzed and presented.
(a) Quick and dirty : Designers informally get feedback from users or consultants to confirm that their ideas are in line with users' needs; this can be done at any time.
(b) Predictive evaluation : Experts apply their knowledge of typical users to predict usability problems. Users need not be present.
(c) Usability testing : The performance of typical users is recorded, the users are watched on video, the users' work is evaluated, and lastly user satisfaction is measured.
(d) Field studies : This is done in natural settings to understand what users do naturally and how technology impacts on them.
• The paradigm that was used was quick and dirty, because the users were involved in every step of the design process and it can be done at any time.
• E.g. field studies do not involve testing or modelling.

4. I - Identify the practical issues

• You are going to look at some of the things like :
   o How to select users
   o How to stay on budget
   o How to stay on schedule
   o How to find evaluators
   o How to select equipment

5. D - Decide how to deal with the ethical issues

(a) The users will have the privacy of their private information, and it will not be disclosed to anyone unless with the consent of the user.
(b) The users will be able to leave the site at any time that they wish, i.e. they can discontinue their membership at their own wish.
(c) The users will be treated politely, but they have to follow the rules and regulations of the website.
(d) The users will have the right to know the goals of the study.

6. E - Evaluate, interpret and present the data

• The data is analyzed and interpreted using the quick and dirty paradigm and technique. The findings can be generalized on our part. The evaluation was both formative and summative.
• The following also need to be considered :
   o Reliability : Can the study be replicated ?
   o Validity : Is it measuring what you thought ?
   o Biases : Is the process creating biases ?
   o Scope : Can the findings be generalized ?
   o Ecological validity : Is the environment of the study influencing it ?

Pilot studies

• A small trial run of the main study.
• The aim is to make sure your plan is viable.
• Pilot studies check :
   o That you can conduct the procedure
   o That interview scripts, questionnaires, experiments, etc. work appropriately
• It is worth doing several to iron out problems before doing the main study.
• Ask colleagues if you can't spare real users.

Note :

• An evaluation paradigm is an approach that is influenced by particular theories and philosophies.
• Five categories of techniques were identified : observing users, asking users, asking experts, user testing, and modeling users.
• The DECIDE framework has six parts :
   o Determine the overall goals
   o Explore the questions that satisfy the goals
   o Choose the paradigm and techniques
   o Identify the practical issues
   o Decide on the ethical issues
   o Evaluate ways to analyze and present data
• Do a pilot study.
' Do a Pilot study
"~~l~~•J~l1
I!!~
~ -
Y User lnt11ractlonDesign (MU)
6-22
·r interactionDesign (MU)
'f us;~e~="'=;a.;;;;;;~;,._
EvaluationTechnlques and F 6·23
6.5 Usability Testing

• Usability evaluation focuses on how well users can learn and use a product to achieve their goals. It also refers to how satisfied users are with that process. To gather this information, practitioners use a variety of methods that gather feedback from users about an existing site or plans related to a new site.

What is Usability ?

• Usability refers to the quality of a user's experience when interacting with products or systems, including websites, software, devices, or applications. Usability is about effectiveness, efficiency and the overall satisfaction of the user.
• It is important to realize that usability is not a single, one-dimensional property of a product, system, or user interface. 'Usability' is a combination of factors including :
   o Intuitive design : A nearly effortless understanding of the architecture and navigation of the site
   o Ease of learning : How fast a user who has never seen the user interface before can accomplish basic tasks
   o Efficiency of use : How fast an experienced user can accomplish tasks
   o Memorability : After visiting the site, whether a user can remember enough to use it effectively in future visits
   o Error frequency and severity : How often users make errors while using the system, how serious the errors are, and how users recover from the errors
   o Subjective satisfaction : Whether the user likes using the system

What are the Evaluation Methods and When Should I Implement Them ?

• The key to developing highly usable sites is employing user-centered design. The expression "test early and often" is particularly appropriate when it comes to usability testing. As part of UCD you can and should test as early as possible in the process, and the variety of methods available allows you to assist in the development of content, information architecture, visual design, interaction design and general user satisfaction.
• Opportunities for testing include :
   o Baseline usability testing on an existing site
   o Focus groups, surveys or interviews to establish user goals
   o Card sort testing to assist with IA development
   o Wireframe testing to evaluate navigation
   o First click testing to make sure your users go down the right path
   o Usability testing to gauge the user's interaction with the system or application
   o Satisfaction surveys to see how the site fares in the end-to-end user experience
• Performing one or a combination of these tests will radically improve the usability of your site.

Working with Data from Testing

• Usability evaluations can capture two types of data : qualitative and quantitative. Quantitative data notes what actually happened; qualitative data describes what participants thought or said.
• Once you have gathered your data, use it to :
   1. Evaluate the usability of your website
   2. Recommend improvements
   3. Implement the recommendations
   4. Re-test the site to measure the effectiveness of your changes.

6.5.1 Usability Testing and Evaluation Methods

Traditional usability testing and evaluation methods

• There are many traditional usability testing and evaluation methods, such as user testing, heuristic evaluation, cognitive walkthrough, behavioral analysis, structured and unstructured interviews, questionnaires, GOMS (Goals, Operators, Methods, Selection rules) and probability rules grammar for interface usability assessment.
• These methods are now widely used in a variety of interface evaluation processes suitable for different user interface design and development stages.
• Each has its own advantages and disadvantages.

Usability of cognitive physiology assessment methods

1. Eye tracking technology
   o Vision is the most direct way for users to interact with the interface.
   o Visual awareness (eye-tracking) physiological assessment is a very effective method for evaluating the pros and cons of an interface, using line-of-sight tracking as an assessment technique to evaluate a range of website usability levels.
2. EEG technology
   o EEG can accurately analyze neuropsychological data in a small real-time window, with high sensitivity to task complexity.
   o It is a very attractive way to measure cognitive workloads in the man-machine interface.
6.5.2 How to Evaluate Usability Testing ?

• The process of turning a mass of testing data, transcripts and observations into a usable report on usability issues can seem overwhelming at first - but it is simply a matter of organizing your findings and looking for patterns and recurring issues in the data.

1. Define what you're looking for

Fig. 6.5.1

• To start analyzing the results, review your original goals for testing. Remind yourself of the problem areas of your website, or pain points, that you wanted to evaluate.
• Once you begin reviewing the testing data, you will be presented with hundreds, or even thousands, of user insights. Identifying your main areas of interest will help you stay focused on the most relevant feedback.
• Use those areas of focus to create overarching categories of interest. Most likely, each category will correspond to one of the tasks that you asked users to complete during testing. They may be things like : logging in, searching for an item, or going through the payment process, etc.

2. Organize the data

Fig. 6.5.2

• Review your testing sessions one by one. Watch the recordings, read the transcripts, and carefully go over your notes.
• For each issue the user encountered while performing tasks, record :
   o Actions (both positive and negative) they took
   o Comments they made
   o Problems the user discovered, or unexpected steps they took
• For each task the user was attempting to complete, make a note of specific categories and tags (for example 'login', 'search', 'payment') and add experience-related tags such as 'confusion'.
• This is best done digitally, with a tool like Excel or Airtable, so you can apply tags and sort or filter the data by category.
• When recording the data, make sure your statements are concise and exactly describe the issue :
   o Bad example : The user clicked on the wrong link.
   o Good example : The user clicked on the link for Discount Codes instead of the link for Payment Info.
• When you're done, your data might look similar to this :

User | Category         | Task                     | Problem                                                                     | Tag 1          | Tag 2
1    | Search           | Find a red item          | User unable to locate filter features                                       | Filter         | Confusion
1    | Shopping cart    | Saving an item for later | 'Save for later' button did not work - simply removed item from cart        | Broken element |
1    | Checkout process | Entering payment details | Accidentally exited payment process by clicking on shipping options button  | Icons          | Confusion
2    | Checkout process | Entering payment details | User expressed disappointment that Paypal wasn't a payment option           | Payment        | Disappointment

• Sort the data in your spreadsheet so that issues involving the same tasks are grouped together. This will give you an idea of how many users experienced problems with a certain step (e.g., checkout) and the overlap of these problems. Look for patterns and repetitions in the data to help identify recurring issues.
• Keep a running tally of each issue, and how common it was.
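If a spreadsheet tool is not to hand, the same tagging and tallying can be sketched with a few lines of Python. The example below is only an illustration: the records loosely mirror the table above, and the field names are assumptions rather than a required schema.

```python
# Illustrative sketch only: tallying tagged usability issues with the standard
# library instead of a spreadsheet. The records and field names are invented.
from collections import Counter

issues = [
    {"user": 1, "category": "Search",   "task": "Find a red item",
     "problem": "User unable to locate filter features", "tags": ["Filter", "Confusion"]},
    {"user": 1, "category": "Checkout", "task": "Entering payment details",
     "problem": "Accidentally exited payment process",   "tags": ["Icons", "Confusion"]},
    {"user": 2, "category": "Checkout", "task": "Entering payment details",
     "problem": "PayPal not offered as a payment option", "tags": ["Payment", "Disappointment"]},
]

# Running tally of how often each category and tag occurs across sessions.
by_category = Counter(issue["category"] for issue in issues)
by_tag = Counter(tag for issue in issues for tag in issue["tags"])

print(by_category)   # e.g. Counter({'Checkout': 2, 'Search': 1})
print(by_tag)        # e.g. Counter({'Confusion': 2, 'Filter': 1, ...})
```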
3. Draw conclusions

Fig. 6.5.3

• You are creating a list of problems with the website. For example, you may find that several users had issues with entering their payment details on the checkout page. If they all encountered the same problem, then conclude that there is an issue that needs to be resolved.
• Broaden the insight if it isn't exactly identical with another, but is still strongly related - for example, a user who could not find a support phone number to call and another who couldn't find an email address should be grouped together with the overall conclusion that contact details for the company were difficult to find.
• Assess your data with both qualitative and quantitative measures :
   o Quantitative analysis will give you statistics that can be used to identify the presence and severity of issues
   o Qualitative analysis will give you an insight into why the issues exist, and how to fix them.
• In most usability studies, your focus and the bulk of your findings will be qualitative, but calculating some key numbers can give your findings credibility and provide baseline metrics for evaluating future iterations of the website.
1. Quantitative data analysis : Extract hard numbers from the data to employ quantitative data analysis. Figures like rankings and statistics will help you determine where the most common issues are on your website and their severity.
2. Quantitative data metrics for user testing include :
   o Success rate : The percentage of users in the testing group who ultimately completed the assigned task
   o Error rate : The percentage of users that made or encountered the same error
   o Time to complete task : The average time it took to complete a given task
   o Satisfaction rankings : An average of users' self-reported satisfaction measured on a numbered scale
3. Qualitative data analysis : Qualitative data is just as, if not more, important than quantitative analysis because it helps to illustrate why certain problems are happening, and how they can be fixed. Such anecdotes and insights will help you come up with solutions to increase usability.

4. Prioritize the issues

Fig. 6.5.4

• Now that you have a list of problems, rank them based on their impact if solved. Consider how global the problem is throughout the site, and how severe it is; acknowledge the implications of specific problems when extended site-wide (e.g., if one page is full of typos, you should probably get the rest of the site proofread as well).
• Categorize the problems into :
   o Critical : impossible for users to complete tasks
   o Serious : frustrating for many users
   o Minor : annoying, but not going to drive users away
• For example : being unable to complete payments is a more urgent issue than disliking the site's color scheme. The first is a critical issue that should be corrected immediately, while the second is a minor issue that can be put on the back burner for some time in the future.

5. Compile a report of your results

• To benefit from website usability testing, you must ultimately use the results to improve your site. Once you've evaluated the data and prioritized the most common issues, leverage those insights to encourage positive changes to your site's usability.
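The quantitative metrics listed above, and the critical/serious/minor ranking, can be computed directly from the session records before the report is written. The Python sketch below is only an illustration with invented numbers; none of the values come from a real study.

```python
# Illustrative sketch only: computing the quantitative metrics named above from
# a handful of invented test sessions, then ordering issues for the report by
# severity and by how many users were affected.
from statistics import mean

sessions = [
    {"completed": True,  "errors": 1, "time_s": 94,  "satisfaction": 4},
    {"completed": True,  "errors": 0, "time_s": 71,  "satisfaction": 5},
    {"completed": False, "errors": 3, "time_s": 180, "satisfaction": 2},
    {"completed": True,  "errors": 2, "time_s": 120, "satisfaction": 3},
]

success_rate = sum(s["completed"] for s in sessions) / len(sessions) * 100
error_rate   = sum(s["errors"] > 0 for s in sessions) / len(sessions) * 100
avg_time     = mean(s["time_s"] for s in sessions)
avg_satisfaction = mean(s["satisfaction"] for s in sessions)
print(f"success {success_rate:.0f}%  errors {error_rate:.0f}%  "
      f"time {avg_time:.0f}s  satisfaction {avg_satisfaction:.1f}/5")

# Rank issues: critical problems first, then by how many users hit them.
severity_order = {"critical": 0, "serious": 1, "minor": 2}
issues = [
    {"problem": "Cannot complete payment", "severity": "critical", "users_affected": 3},
    {"problem": "Dislikes colour scheme",  "severity": "minor",    "users_affected": 2},
    {"problem": "Filter hard to find",     "severity": "serious",  "users_affected": 4},
]
issues.sort(key=lambda i: (severity_order[i["severity"]], -i["users_affected"]))
for issue in issues:
    print(issue["severity"], "-", issue["problem"])
```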
"fl:'# Ttchl<n~
V ru ~i iti t•~~ s
)
:tion oesign (MU) 6-29
- ---- - - \

(Steps for Conduc


EvaluationTechniquesand Framework
6-28
i.J Usor InteractionDesign (MU) Step
= Step 1: Build (or Review)a p;- Lo ,

~r9
&ram g1c Model
Step 2 : Define Purpose and Scope
Step 3 : Budgetfor an Evaluation
Step 4 : Selectan Evaluator
Step 5 : Developan EvaluationPlan
Step 6 : CollectData

8(i]
tation
1eJllet1 · Step 7 : Manage the Evaluation
lillP
j\ e orting Step 8: AnalyzeData
/Jlalysisand R p • Step 9 : CommunicateFindings

Q
Fig. 6.5.5
nt Step 10: Apply Findingsand Feedbackfor ProgramImprovement
In s'ome cases, you may have the power to just make the changes yourself.In other situations, .,,.,,,1mP'°""''
you may need to make your case to higher-upsat your company • and when that happens,
you'll likely need to draft a report that explains the problems you discovered and your

proposedsolutions.
Qualitiesof an effectiveusabilityreport
It's not enough to simply present the raw data to decision-makersand hope it inspires·change.
O~
A good report should :
• Showcase the highest priority Issues : Don't just present a laundry list of everythingthat

went wrong. Focus on the most pressingissues. ·


• Be specific : It's not enough to simply say "users bad difficulty with entering payment
information." Identifythe specificarea of design, interaction,or flow that caused the problem.
• Include evidence : Snippets of videos, screenshots, or transcripts from actual tests can help
make your point (for certain stakeholders,actuallyseeing someone struggle is more effective
Fig. 6.6.1 : Steps101 conductingan evaluation
than ~imply bearing about it second-hand).Consider presenting your report in slideshow

form, instead of as a written document,for this reason.


Present solutions : Brainstorm solutions for the highest priority issues. There are usually
I Phase I : Planning
many ways to attack any one problem. For example: if your problem is that users don't · uIId a programlogic model
Stepl ·B
understand the shippingoptions that could be a design issue or a copywritingissue. It will be
up to you and your team to figure out the most efficientchange to shift user behavior in the -
\ Ale,i,model ''°
mve " ' "'"'"'"'""'''"'wrl""••'"''°"¢w
' It can he\P you focus your evaluationby identl
. •fy'mg .·
directionyou desire.
• Include positive findings In addition to the problems you've identified;· include any 0 Questionswant/need answered
meaningful positivefeedbackyou received.This nelps the team know what is working well so 0 Aspects of program to evaluate -------- V ,, , . '
----........ . n Design (MU)
nteract10 -
V User Interaction Design (MU) 6-30 EvaluationTechniquesand Frarn-.
fU~ .Process -"\
""Ork
Activities
Build a ........, ~.

8
program
logic model

Develop an Define
evaluation ./'I purpose and r,.re How many,
Are activities
plan C:---V scope resources how much Change In
delivered as
Planning adequateto intended? was i\n~.ieoge,
1rnp1e111ent prO<Juced? attiludes,

I
proflrar11? skills?
Budget for tj? Select an
an evaluator
evaluation

o
o
Information to collect
Fig. 6.6.2 : Steps of Planning

Measures and data collectionmethods


What will be measured?· What data
----------

, sudgetfor an evaluation
-
areav~labieloreva~a,~n'

Fig. 6.6.4 : Purpose and Scope


.

o Evaluationtimeframe SttP 3•
common cost categoriesare as follows :
• How to read a logic model
, studY Design
0 Logic models read from left to right
Type of study design
0
0
There are two "sides" to a logic model-a processside and an outcomesside
The size of the study

I Process ] q~mes I o The l~vel of expertiseand experienceof the evaluator


o Data colle~tion expenses
If If , Stafftime
, Materials,equipment,and supplies
Then Then Then
Then Then • Travel
Fig. 6.6.3 : Logic model ' Data collection
' An evaluatoris an individualor team of people responsibleforleading theevaluation.
Step 2 : Define purpose and scope
Each evaluation should have a primary purpose around which it can be designedand planned. ' Potentialoptions for an evaluatorinclude: •versity personnel, independent
I
0
An external source (e.g., consultingfirm, col ege or um
• Why is the evaluation being done? What do you want to learn ?
consultant)
• How will the results be used? By whom? An internal source - program.staffmember(s] ember or to relyonanexterna\evaluator.
' 0
sta
• Additionalthings to consider: Akey decision is whether to use an internal ff ro
o Specificprogram requirements ' Factorsto con~iderwhen making this decision :
o Resourcesavailableto carry out the evaluation 0 Purpose of the evaluation
0 staff workloadand expertise
raction oes1y11 \IVIUJ
1018
user rveys
Y User Interaction Design (MU) 6-32 EvaluationTechniques and Frarnewor11
..... ~~, .
6
l~~o,al ,u , help yo, find •,t •hat•••""
h•h .
ethn,quesanoF1amewo11<.
o Program resources (e.g., financial,necessarycomputersoftwar~, etc.)
...
0
,v,y
.,; ral su . h they're doing so. For exarnp\e,if Your coa\·ti
"°".... •
areta ing Part Inand
o Specificprogram requirements ' "'"
~• I"' w
h><
· Y'"""'•""'•~......,, d
' •°'"%g ..
• " ''"" '
Step 4: Select an evaluator
redUce car
••'"'' aeddoo~m
wswlth key participants • ... •..,•• .. 00 •••~<o,,,
Considerwhether your potential evaluatorhas :
Formal training In evaluationstudies 10 •pants · · th
,<'"' .
ake ul,adee• lo Y'"«•-•lty, ""'• • y., '-'• etc "• '"• • •• ' "'
• Experienceevaluatingsimilar programs/interventions
deYP
•"'"
n re ur lnit1 .
I
., of. loteMewtog em• Ot...,..._.•
.
olt;~ ,.., , "

• Experiencethat matches the design, methods, and/or approachof your pla~nedeva)uation " .UY '" ..,ti., Qo help yo, "• "°".._.,'"'"••""'••••. ldoo;
• Cap~cityto handle the scale of your planned evaluation IY of yo d give""""'"""'
•(actors m1,~• - - ,...,,.. .,. , ,-., "'""'
• Personal style that fits your program staffor organization "" 0 ""' ,ffecte
you
do<ght - '°""°""'"'"'""'°"""'"'~°"'
ini. ·ative,an level ln dlcators of Impact
• Evaluator's independence : No conflictsof interest related to the evaluation- Able to provide
· cot11tll d-and-true

--
an unbiased assessment of the program's outcomes/impacts are teste
These unity· .
m,rtrei, fint help "" """ the •'"- "''"' • ,~
Step 5 : Develop an evaluationplan . tive. 1
Decide which methods will best address those questions. Here is a brief overview of some
P
common evaluation methods and what they work best for. V{IIY develo anhat
Clarifies w
. the evaluations ou
direction
.evalua
. an
. h Id take based on priorities,resources, time, ' l

\
\

• Monitoringand feedbacksystem
si.i\lS
This method of evaluationhas three main elements:
, createsshared understanding of the purposeand use of evaluationresults
o Process measures : These tell you about what you did to implementyour Initiative;
o Outcomemeasures: These tell you about what the results were; and
Fostersprogram transparency to stakeholdersand decisionmakers I
, Helps Identifywhether there are sufficientprogramresourcesto carry out the evaluation
I
o . Observationalsystem : This is whatever you do to keep track of the initiativewhile it's

happening. , Facilitatessmoother transition when there is staffturnover.


\
• Membersurveys about the Initiative,
It might seem like an overly simple approach, but sometimes the best thing you can do to find
Phase II : Implementation
\
out if you're doing a good job is to ask your members. This is best done through member
surveys. There are three kinds of member surveys you're most likely to need to use at some \mplernentati?n
point:
o Membersurvey of goals: Done before the initiative begins - how do your members
Manage the
evaluation
\ \
think you're going to do?
Fig. 6.6.5 : Implementation of Evaluation Plan
o Membersurvey of process: Done during the initiative - how are you doing so far?
o Member survey of outcomes : Done after the initiative is finished - how did you do?
Step 6 : Collectdata
• Go_al attainment report ' Where to find data?
If you want• to know whether your proposed community changes were truly
o Existing
accomplished-and we assume you do-your best bet may be to do a goal attainment report.
o · New or
Have your staff keep track of the date each time a community change mentioned in your
o both
action plan takes place. Later on, someone compiles this information
V Toclll,,..;..J
P Ill It I t I~~,
)
y User InteractionDesign (MU) interactionDesign (MU)
6-34 EvaluationTechniquesand F l user s.
• What type of data? rarnewor11 p1untcatlon: Maintainc 3S
cofll . on11nun1 Eva1u
o Quantitative , projectkick-offmeeting . tatton lhto auon r!!Clin·
0 • • UghOUt ~Ues a
o Qualitative Regular, ongomg meetingsI thePr lid Framew
o • . okeep th O/ect or11
o orboth Ad hoc eevaluatto
0 nllloVj
• What type of data meets your evaluationneeds 7 ~o nttor : Continuallymonlto
r Progress
nginatirn
elyandefli
• Existingdata (I.e., secondarydata) o
Review and provide feedb
ack
on the eva1
uau0n
clent manner
instrllments,
. reports) . on deliverables and the el'aluator
o Internal program data (e.g., participantrecords,program logs, performancemeasurement Enforce the schedulefor com I . [e.g., evalu sWork:
data) o p eting tasks atlon plan
0 A5sess the evaluator'sskills and lllak . ' design,
External datasets / administrativedata (e.g., student records test scores medi'cal . O· . and perfor ead1usnn
' • records Keep up with invoicing/pa""' mancethrough entsasneeded
test scores, Census data, unemploymentinsuranceclaims) ' o , ..,ents out theeval .
• New data (I.e., primarydata) · ,,,,.v1desupportand feedbackas
rov needed
Uation
0
Data from surveys,assessments,interviews,and observations offer advice and guidanceto hel
0 , ptroubleshoo
• Quantitativedata
0
Ensure the evaluatorhas acces
. . .
. tlssues, asneed d
s to the informatl e
ProVIde continuousinputand i db on required
o Numerical infonnation that can be counted, quantified, and mathematicallyanalyzed 0 ee ackon th
(e.g., test scores, ratings) 1pi,ase Ill : Analysis and reporting eevaluatorswork
o Quantitativedata are systematicallycollected; recorded, and analyzed
• Qualitativedata
o Narrative infonnation that describesthe study subject(s)and context (e.g., transcriptsof
interviews and focus groups,field notes from observationof certainactivities)
o Qualitativedata are systematicallycollected, recorded,and analyzed
Fig. 6.6.7: Analysisand reporting
• Individual anecdotes and testimonialsare _not qualitativedata unless systematicallycollected,
recorded, and analyzed Step8:Analyzedata
• Quantitativedata analysis
Step 7 : Manage the evaluation
o · Statisticalanalysis(mean, median,chi-square, t-tes~ ANOVA. regression,etc)
• Communicate
Qualitativedata analysis
• Monitor
° Contentanalysis(cross-site analysis,theme identification,case studydescriptions)
• Support
Communicate , ~-. 0 Example data collectionand analysiscrosswalk.
-r;, Processor ImpactEvaluationofyour Program
. ~/
i~ .L.- When IHow will you 11

Support
. '
Monitor
I
Indicatorsor What is collected and analyzethe data 1
Outcomeof collectedand bywhom?
interest how?
'
Fig. 6.6.6 : Manage the evaluation

.T
..
V ru1 11c1 11a1s
__,I
__ . n oesign (MU)

V User Interaction Design (MU) 6-36 Evaluation Techniques and Framework Impact Evaluat1 ------ --•alll:l ti ... _ -

Process or Impact Evaluation of your Program -- outcome of What t


Interest collect,
Research Indicators What ls From When How wilt you -
question collected and collected and ho
whom/ analyze the data
how? data and by . 7
\\1llat i!IIPact uywhorn7 -- -,.., me data1
sources 7 whom? statusloyment I EmploYment Participating The evaluator Calculatethe
Is the job a) Member a-c)M ember a-c) External a-c) External a-c) Generate
does the job
rtadioess
Emp "'"" 1,
measured
1, Iii,'"""•...,...,. """'""
servesas the the surveyat averageoutcome
1,
readiness use of report details evaluator evaluator frequencies on iJ!terveotion
program program about workshops collects the collects the with a intervention two time In the Intervention
use of curriculum ~3~e on
being curriculum In logs with
survey &roup points: outcome In the
workshop workshop ; average duration AbilltY to receiVing no
Implemented during pre-defined logs logs quarterly of workshops ; and secure and job .ssistance Beforethe Job Intervention
as designed workshops categories of sefVicesserve readiness group minus the
reporting
quarterly a) Quarterly average rate of inaiotain
e111ployment· as the program difference In \
b) Duration a) Quarterly observations workshop begins
of
workshops
a-b) observations
of workshops .
observations bytl_le attendance .,rtlatiVeto a
co111parison
comparison \ 1 year after
average outcome
in the comparison
.\
by the evaluator( s) c) Generate group the job
group? group before and
c) evaluator (s) using frequencies .and readiness after treatment
Participant using structured averages on programis (difference in
workshop
rates
structured
observation
protocols
observation
protocols
quantitative data
(e.g. ratings
scales, frequency
lli\1 9 : communicate findings
implemented differences
method) \I
, Who are the potential target auc!.iences7
scales) and
thematically code Program staff , agency personnel, stakeholders,beneficiaries, funders,etc.
and analyze What are potential tools for communicatingfindings1·
open-ended Formal report, shorter m; mos, PowerPoint briefings,etc.
co.mments/notes

Impact Evaluation of a Job Readiness Program


, What ls an evaluation report 7
o Key product resulting from evaluation \
Research question I Outcome of
interest
I What Is
collected and
From When How will you o A written document that objectivelydescribes: \
whom/ collected analyze the a) Program background
how? data and by data? b) Evaluation purpose, methods,procedures,and limitations
sources 7 whom?
What impact does Employment I Employment c) Evaluation results
the job readiness
intervention have
on
status status is
measured with a
survey
d) Conclusions and recommendations
e) Lessons learned ,
\
Ability to secure f) Questions for future research
and maintain ' When reporting findings, lt ls lmportantto:
employment 0
Report positive, as well as negativefi ndings ise and warrantfurther

l
relative to a
o ·1 conclusive, but show prom
compari son woup? Present results that are not necessan Y
examination
0
Be careful not to overstate your findings V!~~::a~~
_,
J
--..
6-38 Evaluation Techniques and Frarnewor11
,.., Usor Interaction Design (MU)
-4.a;

• Other useful products for communicationare as follows:


o Executivesummary of final report (5-10 pages)
o Short research briefs (2-4 pages)
a) Graphics and pictures
b) Bulletedinformation
\
c) Non-technicalmemos
Phase IV : Action and Improvement steps

Apply Findings
and Feedback
for Program

---- --
Improvement

Fig. 6.6.8 : Action and Improvement

Step 10 : Apply findings and feedback for program improvement


• Evaluationfindings can support these decisions and actions:
• Program improvement or assessing fit
1I
• Buildingthe evidence base
o Implementation to impact
'-
\
• Evaluation findings can support these decisions and actions:
\
o Scaling
o Implementing change \
Review Questions

Q. 1 Write a short note on : Evaluation. (5 Marks)
Q. 2 Explain the types of Evaluation. (10 Marks)
Q. 3 Write a short note on : Case studies of Evaluation. (10 Marks)
Q. 4 Explain DECIDE Framework of Evaluation. (10 Marks)
Q. 5 Write a short note on : Usability Testing. (5 Marks)
Q. 6 Explain steps of conducting an evaluation. (10 Marks)
Q. 7 Write a short note on : conducting. (5 Marks)

