
10 important products Steve Jobs helped bring to the world

Ref: VentureBeat, http://venturebeat.com/2011/10/06/10-important-products-steve-jobs-helped-bring-to-the-world/

Steve Jobs, co-founder of Apple
 Apple changed the way people interact with computers.

 One of Jobs’ biggest notions was the idea of building a


device focused on the needs of a user and not just
building technology for the sake of it.

User Interface Design - UTCN 2


 On effective design: "My gift consists of two elements: focus and simplicity. Simple can be harder than complex. You have to practice a lot to learn to think cleanly and simply. But all the effort pays off, because once you get there, you can move mountains." (BusinessWeek, May 25, 1998)

User Interface Design - UTCN 3


 On design: "It is hard to create a design with the help of brainstorming groups. Often, people don't know what they want until you show it to them." (BusinessWeek, May 25, 1998)

User Interface Design - UTCN 4


 On work: "Work is going to fill a large part of your life, and the only way to be truly satisfied is to believe in what you do. If you haven't found your calling yet, keep looking. Don't settle. You will know when you have found it; your heart will tell you. And, like any great relationship, it only grows closer as the years go by. Keep looking until you find it. Don't settle." (Stanford commencement address, June 2005).

User Interface Design - UTCN 5


Apple II

 1-MHz processor
 4KB of RAM
 Audio cassette interface for programs and data storage
 External 5.25-inch floppy disk drive

User Interface Design - UTCN 6


Apple II
 Launched in June 1977, the Apple II was the first
successful mass-market PC.

 Jobs and Apple cofounder Steve Wozniak designed the


Apple II, and it changed computing around the world.

 The first Apple II had specs that were quite good for the
time:
 1-MHz processor
 4KB of RAM
 Audio cassette interface for programs and data storage
 External 5.25-inch floppy disk drive

User Interface Design - UTCN 7


Lisa

User Interface Design - UTCN 8


Lisa
 While Apple’s 1983 Lisa computer was a failure of sorts
because of its $10,000 price tag.

 It did introduce many computing features that continue


to drive computing innovation.

 The Lisa was one of the first computers to offer:


 Multitasking
 Document-based graphical user interface
 Optional hard drive
 Bundled office software

User Interface Design - UTCN 9


Macintosh

User Interface Design - UTCN 10


Macintosh
 The original Macintosh computer was advertised during the Super Bowl in 1984 and famously decried the status quo of personal computing with imagery inspired by George Orwell's Nineteen Eighty-Four.

 Macintosh redefined PCs and was the first commercially


successful personal computer to feature a graphical user
interface and a mouse.

 The Macintosh line faltered in the early 90s but began to


regain steam again with the iMac.

User Interface Design - UTCN 11


iMac

User Interface Design - UTCN 12


iMac
 A year after Jobs returned to the helm at Apple in 1997,
the company launched the distinctive (and divisive) first-
generation iMac.

 The design was a radical departure from the Macs of old


and helped Apple regain its footing with high-minded
consumers.

 Designer Jonathan Ive, with oversight from Jobs, led the design team in the creation of the iMac, and he later helped design most of Apple's subsequent products.

User Interface Design - UTCN 13


iPod

User Interface Design - UTCN 14


iPod
 The iPod MP3 player looked a little wacky when it first hit
the scene in 2001.

 Aside from the Walkman, no single portable music device had changed music so drastically.

 The first iPod retailed for $400 with 5GB of storage, but
now there’s a host of iPod devices ranging from the tiny
iPod shuffle to the feature-filled iPod touch, each with its
own purpose.

 The iPod line has had the best-selling music players in


the world for several years.

User Interface Design - UTCN 15


iTunes

User Interface Design - UTCN 16


iTunes
 It wasn’t enough that Jobs revolutionized the MP3
player; he also needed to give people the software to
manage the content.

 iTunes started as an interface for playing your music


files, but now it is one of the largest music stores on the
planet.

 iTunes accounts for more than a fourth of music sales


happening today, and the trend will likely continue in its
favor as the iPod continues its reign as the most popular
music player.

 Apple also recently introduced iCloud, which will interact


with iTunes and Apple products so users can store music
in the cloud rather than solely on their devices.

User Interface Design - UTCN 17


MacBook Pro

User Interface Design - UTCN 18


MacBook Pro
 The MacBook Pro’s launch in January 2006 showed that
Apple was once again getting serious about innovating in
the laptop space with high-end parts and aluminum
bodies.

 The Pro’s design largely took cues from PowerBook G4


but included Intel Core Duo processors rather than
PowerPC chips, a move that opened up a lot more
potential for Apple’s machine and showed the “Wintel”
alliance wasn’t going to last.

 The MacBook Pro paved the way for the MacBook Air a
few years later.

User Interface Design - UTCN 19


iPhone

User Interface Design - UTCN 20


iPhone
 iPhone changed the smartphone landscape as we know it
when it landed in June 2007.

 Steve Jobs’ dedication to a strong user interface showed


with his focus on a simple mobile operating system
paired with a 3.5-inch touch screen.

 iPhone now has more than 500,000 apps available for it


(Oct. 2011), and the phone is the best-selling
smartphone in the world.

User Interface Design - UTCN 21


MacBook Air

User Interface Design - UTCN 22


MacBook Air
 The first MacBook Air didn’t seem as important as it is
now, but that just shows how Jobs was thinking ahead
yet again.

 When Apple launched the MacBook Air in January 2008,


it seemed like a stripped-down laptop that ditched the
CD-ROM a little too soon.

 Now that we’re in the age of cloud computing and


streaming media, the need for physical media is
essentially gone.

 The MacBook Air and Intel’s “Ultrabook” followers will


continue to change how we look at laptops and personal
computing.

User Interface Design - UTCN 23


iPad

User Interface Design - UTCN 24


iPad
 The January 2010 launch of the iPad tablet showed that
Jobs yet again was ahead of the curve by bringing back
tablet computing.

 Tablets were first shown off by Microsoft in 2001, but


tablet PCs didn’t take off with consumers until Jobs
paired a tablet with the simple iOS mobile operating
system and a variety of compelling apps.

 The iPad is by far the best-selling tablet in the world and


many analysts believe it will stay that way, even with
competitors like Amazon Kindle Fire and Samsung
Galaxy Tab.

User Interface Design - UTCN 25


User Interface Design - UTCN 26
Apple is still Steve Jobs

User Interface Design - UTCN 27


User Interface Design - UTCN 28
Graphical User Interface Timeline

Selections from Nathan's Toasty Technology page:

http://pla-netx.com/linebackn/guis/guitimeline.html

Selection from the Computer History Museum:

http://www.computerhistory.org/
1973

o April 1973, the first


operational Alto computer
is completed at Xerox
PARC.
o The Alto is the first
system to pull together
all of the elements of the
modern Graphical User
Interface.
o Features:
o 3-button mouse
o Bit-mapped display
o Graphical windows
o Ethernet network

User Interface Design - UTCN 2


1980

o 1980: Three Rivers


Computer
Corporation
introduces the Perq
graphical workstation.

User Interface Design - UTCN 3


1981

o 1981 June: Xerox


introduces the Star,
the commercial
successor to the Alto.
o Features:
o Double-clickable
icons
o Overlapping
windows
o Dialog boxes
o 1024*768
monochrome display.

User Interface Design - UTCN 4


1983
o 1983 January:
Apple
introduces the
Lisa.
o Features:
n Pull down
menus
n Menu bars

User Interface Design - UTCN 5


1983

o Visi Corp releases Visi On, the first integrated graphical software
environment for IBM PCs.

User Interface Design - UTCN 6


1983

o Microsoft announces their new "Windows" program for the IBM PC


but does not release it until 1985.
o Notable features:
n Overlapping / resizable windows.
User Interface Design - UTCN 7
1984

o January 1984: Apple introduces the Macintosh.

User Interface Design - UTCN 8


1984

o September 1984: Digital Research announces its GEM icon/desktop user


interface for 8086- and DOS-based computers. It also was later ported to the
Atari ST.
User Interface Design - UTCN 9
1984
o June 1984: "window system X" announced at MIT. Versions 1-6 were monochrome only, and ran on DEC VS100 displays connected to VAXen and VAXstations 1 and 2.
o Versions 8-10 dealt with color, for the VAXstation II/GPX. X10 is the first version that saw widespread availability and use on many vendors' systems.
o Version 11 was a redesign for higher performance, more window management styles, extensibility and better graphics capabilities.

User Interface Design - UTCN 10


1985

o 1985: GEOS is released for the Commodore 64 and later the Apple II.
User Interface Design - UTCN 11
1985

o July 1985: Commodore introduces the Amiga 1000 with the Amiga
Workbench Version 1.0.
User Interface Design - UTCN 12
1985

o August 1985: Microsoft finally releases the first version of Windows.


o Features:
n Windows cannot be overlapped, but are instead "tiled".
n Windows are not allowed to cover an area at the bottom of the screen that is
reserved for "iconized" programs.
User Interface Design - UTCN 13
1986

o 1986: Digital Research had developed the GEM desktop to look like Apple's Macintosh; following legal pressure from Apple, the new GEM desktop now has just two unmovable, non-resizable windows for file browsing.

User Interface Design - UTCN 14


1987

o March 1987 - Apple introduces the Apple Macintosh II, the first color Macintosh.
o Features: 640*480 display with 256 colors; a 24-bit color card is available.

User Interface Design - UTCN 15


1987

o Microsoft releases the second version of Windows, version 2.03.


o Features:
n Finally has resizable / overlapping windows and new windowing controls.

User Interface Design - UTCN 16


1987

o Acorn releases "Arthur" for the Acorn computer; it is the basis for RISC OS. RISC OS 2 and 3 have a similar look, but an improved feel.
User Interface Design - UTCN 17
1988

o September 1988: Apple releases GS/OS, a 16-bit operating system with a


Macintosh-like GUI for the Apple IIGS.
User Interface Design - UTCN 18
1988

o October 1988: IBM releases OS/2 1.10 Standard Edition (SE) which added
a graphical user interface called Presentation Manager. (OS/2 1.0 was text
mode only!) The 1.10 GUI was written by Microsoft and looked like
Windows 2.

User Interface Design - UTCN 19


1988

o October 1988: The NeXT Computer is released. It includes a 25 MHz processor, 8 MB RAM, 250 MB optical disk drive, math coprocessor, digital signal processor for real-time sound, fax modem, and a 17" monitor.
o NeXT Inc. had been founded by Steve Jobs in 1985 after he left Apple.
User Interface Design - UTCN 20
1990

o 1990: Commodore releases Amiga Workbench 2 for the A3000.


o Features: New 3d effects, a revised menu system and many other
improvements.

User Interface Design - UTCN 21


1990

o May 1990: Windows 3.0 released by Microsoft


o Features: Program Manager shell.

User Interface Design - UTCN 22


1990

o November 1990: GeoWorks released PC/GEOS for IBM PC-compatible systems.

User Interface Design - UTCN 23


1992

o Spring of 1992: IBM releases OS/2 Version 2.0, a true 32-bit OS.
o Features a new "Workplace Shell", an object oriented user interface that is
heavily integrated with the rest of the OS.
User Interface Design - UTCN 24
1992

o March 1992: Microsoft introduces Windows 3.1. The user interface is


basically the same as Windows 3.0 but now includes their "multimedia"
enhancements.

User Interface Design - UTCN 25


1992

o September: Amiga Workbench 3 released for AGA Amigas.


o Features: Images for backgrounds, color palette remapping.

User Interface Design - UTCN 26


1993

o May 1993 Microsoft releases the first version of Windows NT, their 32-bit
OS. They give it the version number "3.1" and use the same user interface
they do for regular Windows 3.1. Made available for Intel, Power PC, Alpha,
and MIPS systems.

User Interface Design - UTCN 27


1994

o 1994: QNX Software Systems releases the first embeddable microkernel


windowing system, the Photon microGUI.

User Interface Design - UTCN 28


1995

o 1995: Microsoft introduces Windows 95 on August 24th.

User Interface Design - UTCN 29


1995

o October 1995: Be introduced BeOS at Agenda 96. The first version was designed to run on a custom multiprocessor system known as the "BeBox". Later made available for PowerPC and Intel systems.

User Interface Design - UTCN 30


1996

o 1996: New Deal releases New Deal Office 2.5, which was formerly PC-
GEOS.
User Interface Design - UTCN 31
1996

o IBM Releases OS/2 Warp 4 with a significant facelift for the Workplace
Shell.

User Interface Design - UTCN 32


1996

o Microsoft releases Windows NT 4.0 with the same user interface as


Windows 95.

User Interface Design - UTCN 33


1997

o July 1997: Mac OS 8 is finally released. Selling 1.25 million copies in less
than 2 weeks, it becomes the best-selling software in that period.

User Interface Design - UTCN 34


1998

o June 25, 1998: Microsoft releases Windows 98.


o Features: Internet Explorer Web browser application takes over the role of
the Windows shell, advertising right on the desktop, entire help system
replaced by Internet Explorer.

User Interface Design - UTCN 35


1998

o November 22, 1998: Shane Brooks releases 98Lite, an installer that removes or prevents the installation of Internet Explorer with Windows 98.
o Features: No Internet Explorer or advertising, all the hardware support of Windows 98, faster boot time, and the more responsive Windows 95 shell.

User Interface Design - UTCN 36


1999

o March 1999 - Apple releases Mac OS X Server, a Unix-based OS with their Macintosh GUI.

User Interface Design - UTCN 37


2000

o January 5, 2000: Apple announces Aqua, the new look for their upcoming
MacOS X client.

User Interface Design - UTCN 38


2000

o February 17, 2000: Microsoft Windows 2000 (i.e., Windows NT 5) becomes


available in stores.
o Features: The Internet Explorer web browser application finally takes over
the Windows NT UI.

User Interface Design - UTCN 39


Graphical User Interface Timeline
– Part 2
Contents

o Compiz
o Mac OS X
o Vista

User Interface Design - UTCN 2


Compiz
o One of the first compositing window managers for the X Window
System that uses 3D graphics hardware to create fast compositing
desktop effects for window management
o A minimization effect and a cube workspace are implemented as loadable plugins.
o Conforms to the Inter-Client Communication Conventions Manual
standard
o Can substitute for the default Metacity in GNOME or KWin in KDE
o Inspired by Exposé from Apple's Mac OS X; includes an Alt-Tab application switcher that uses live previews instead of just icons

References:
http://en.wikipedia.org/wiki/Compiz
http://www.compiz.org/

User Interface Design - UTCN 3


Compiz
o Released as free software by Novell (SUSE) in January 2006, in the wake of the then-new Xgl
o Software packages:
1. Compiz, (also Compiz-core) which contains only the core
functionality of compiz and base plugins
2. Compiz Fusion, consisting of the plugins, decorators, settings tools
and related applications from the Beryl and Compiz communities

User Interface Design - UTCN 4


Features - Included plug-in
A few examples:
n Annotate: draw things on top of all windows
n Cube: each virtual desktop becomes a face on a cube
n Fade: windows fade in and out
n Minimize: windows minimize (and maximize/restore) with an
animation effect
n Move: window moving
n Place: placement of new windows
n Resize: window resizing
n Rotate: the desktop cube can be rotated
n Scale: an overview of all open windows (similar to Mac OS X's
Exposé)
n Svg: allows plugin developers to load svg files as textures. Other
image plugins can be added so that extra image types will be
seamlessly supported
n Water: ripples trailing mouse and rain effect
n Zoom: magnifies a part of screen

User Interface Design - UTCN 5


Features - Community plug-in
A few examples:
n Animation: animation effects for window events
n Bs: brightness and saturation control
n Cube Gears: 3D animated gears in the center of the cube
n Cube Reflection: Draws a reflection of the cube
n Group And Tabs: group windows and access them through a tab
bar similar to the well-known feature in browsers
n Negative: inverts color of a window or screen
n Reflection: watermarks window decorations, similar to Aero-Glass
by default
n Screenshot: mode to capture screen regions with the mouse
n Shift Switcher: Provides Flip 3d and Cover Switching of windows
n State: set default opacity and other options for types of windows
n Mousegestures: advanced mouse gestures to control effect

User Interface Design - UTCN 6


Compiz

User Interface Design - UTCN 7


Compiz

User Interface Design - UTCN 8


Compiz

User Interface Design - UTCN 9


Compiz

User Interface Design - UTCN 10


User Interface Design - UTCN 11
User Interface Design - UTCN 12
Compiz

User Interface Design - UTCN 13


User Interface Design - UTCN 14
User Interface Design - UTCN 15
Compiz

User Interface Design - UTCN 16


Compiz

User Interface Design - UTCN 17


Compiz

User Interface Design - UTCN 18


Compiz

User Interface Design - UTCN 19


Compiz

User Interface Design - UTCN 20


Graphical User Interface Timeline
– Part 3
Contents

 Compiz
 Mac OS X
 Vista

User Interface Design - UTCN 2


Mac OS X
 Mac OS X is the tenth major version of Apple's operating system for
Macintosh computers
 Since 2002, it has been included with all new Macintosh computer systems. It is the successor to Mac OS 9, the final release of the "classic" Mac OS, which had been Apple's primary operating system since 1984
 A Unix-based operating system, built on technologies developed at NeXT between 1985 and Apple's purchase of the company in early 1997

References:
http://en.wikipedia.org/wiki/Mac_os_x
http://www.apple.com/macosx/

User Interface Design - UTCN 3


User Interface Design - UTCN 4
Mac OS X

User Interface Design - UTCN 5


Mac OS X

User Interface Design - UTCN 6


Mac OS X

User Interface Design - UTCN 7


Graphical User Interface Timeline
– Part 4
Contents

o Compiz
o Mac OS X
o Vista

User Interface Design - UTCN 2


Vista
o Windows Vista is an operating system that Microsoft released in January 2007, after more than five years of development.
o One of the most obvious changes in Windows Vista is a completely
restyled user interface.
o Aero, as the new theme is called, uses a glass-like user interface which
adds a new and much more enjoyable feel to using Windows. Aero
uses the computer’s graphics power to add a crisp and smooth feel to
the desktop, and gives the Windows desktop something to rival Apple
Macs when it comes to the user experience.
o Gone is the sluggish menu animation included with XP.
o Vista requires more resources than previous versions. Whereas Windows XP would run quite well with 512MB of memory (or even 256MB), Windows Vista needs around 1GB to run anything approaching smoothly. Also, the Aero user interface only runs on a PC with a sufficiently powerful graphics card.

References:
http://en.wikipedia.org/wiki/Windows_Vista
NVIDIA's GPUs that support the features of Windows Vista,
http://www.nvidia.com/page/technology_vista_home.html

User Interface Design - UTCN 3


Vista

User Interface Design - UTCN 4


Vista

User Interface Design - UTCN 5


Vista

User Interface Design - UTCN 6


Vista

User Interface Design - UTCN 7


Vista

User Interface Design - UTCN 8


Vista

User Interface Design - UTCN 9


Vista

User Interface Design - UTCN 10


Vista

User Interface Design - UTCN 11


Vista

User Interface Design - UTCN 12


Vista

User Interface Design - UTCN 13


Vista

User Interface Design - UTCN 14


New technologies
 A set of pens?
 Pens with video cameras?
 Computing devices?
 A computerized miniature based on Bluetooth
These "pen-shaped instruments" produce both the monitor and the keyboard on any flat surface, where you can do what you usually do on your computer desktop.
GUI Development Concepts

User Interface Design


HCI, UI, GUI Design
“Human Computer Interaction is a discipline concerned with
the design, evaluation and implementation of interactive
computing systems for human use and with the study of
the major phenomena surrounding them.”

n As defined by the Special Interest Group on Human-


Computer Interaction (SIGCHI) of the Association for
Computing Machinery (ACM)

User Interface Design - UTCN 2


Motivation
o Computer system configuration
o New software and hardware technologies
o Complex interactivity
o Interface layout and functionality
o Large application domains
o Software engineering efficiency

User Interface Design - UTCN 3


Objectives
o User control on application entities
o Active entity based structure
o Visual programming techniques
o End user programming and development
o Virtual space navigation
o Photorealistic presentation
o Distributed functionality
o Parallel, distributed and cooperative processing
o Compatibility with new technologies
o Efficient user interface design methodologies

User Interface Design - UTCN 4


Applications
o Interactive systems
o Software engineering
o Database systems
o Scientific visualization
o Animation
o Intelligent user interfaces
o Authoring tools
o Cooperative work
o Computer aided education
o Distributed interfaces
o Web applications

User Interface Design - UTCN 5


Computing context
o Trend
n smaller, cheaper, faster, more intimate, intelligent objects
o Computers need to become invisible
n hide the computer in the real world: Ubiquitous / Tangible Computing
n put the user inside the computer: Virtual Reality

User Interface Design - UTCN 6


User interfaces

[Figure: taxonomy of user interfaces relating C (Computer) and R (Real world)]

Jun Rekimoto and Katashi Nagao. The world through the computer: computer augmented interaction
with real world environments. UIST ’95, pp. 29–36 (1995).

User Interface Design - UTCN 7


GUI - Graphical User Interfaces
o Separation between real and digital worlds
Windows, Icons, Menus, Pointer (WIMP), metaphors, direct manipulation

User Interface Design - UTCN 8


Ubiquitous computing

Kim M., Chae K., DMP: Detouring Using Multiple Paths against Jamming Attack for Ubiquitous Networking System. Sensors Journal, Vol. 10(4), pp. 3626-3640 (2010).

User Interface Design - UTCN 9


Virtual reality
o Immersive VR
n Head mounted display, gloves
n Separation from the real world

Ervin R., Owen T., Innovative virtual reality technology revolutionizes stroke therapy. Medill Reports -
Chicago, Northwestern University (2012).

User Interface Design - UTCN 10


Augmented reality - Definition
o Combines Real and Virtual Images
n Both can be seen at the same time
o Interactive in real-time
n The virtual content can be interacted with

o Registered in 3D
n Virtual objects appear fixed in space

[Figure: the User simultaneously perceives Real objects and Virtual objects]

Azuma, R. T., A survey of augmented reality. Presence, 6(4), 355-385 (1997).

User Interface Design - UTCN 11


Augmented reality - Examples

User Interface Design - UTCN 12


Virtual reality vs Augmented reality
o Virtual Reality: Replaces Reality
n Scene Generation: requires realistic images
n Display Device: fully immersive
n Tracking and Sensing: low accuracy is enough
o Augmented Reality: Enhances Reality
n Scene Generation: minimal rendering is enough
n Display Device: non-immersive
n Tracking and Sensing: high accuracy needed

User Interface Design - UTCN 13


Milgram’s Reality-Virtuality continuum

P. Milgram and F. Kishino, A Taxonomy of Mixed Reality Visual Displays. IEICE Transactions on Information and Systems, E77-D(12), pp. 1321-1329 (1994).

User Interface Design - UTCN 14


Interactive applications

[Diagram: the Functional Component (App. Objects, App. Operations) exchanges the internal dialogue with the Interactive Component (Interface Objects, Interface Operations, Interaction Techniques), which carries the external dialogue with the User (outputs to the screen, inputs from the mouse, ...).]

User Interface Design - UTCN 15


GUI example

User Interface Design - UTCN 16


GUI example

User Interface Design - UTCN 17


GUI example

User Interface Design - UTCN 18


GUI examples

User Interface Design - UTCN 19


GUI examples

User Interface Design - UTCN 20


GUI examples

User Interface Design - UTCN 21


GUI examples

User Interface Design - UTCN 22


GUI examples

Complex Interaction
Techniques

User Interface Design - UTCN 23


GUI examples

User Interface Design - UTCN 24


GUI examples

User Interface Design - UTCN 25


GUI examples

User Interface Design - UTCN 26


GUI examples

User Interface Design - UTCN 27


GUI examples

User Interface Design - UTCN 28


Structure of interactive Web apps

o Static HTML files


o Dynamic HTML pages
o Java assisted editing
o Dynamic Java

Static HTML files:

[Diagram: the Web Browser requests HTML Files from the Web Server over HTTP / TCP/IP.]

User Interface Design - UTCN 29


Web Appl: dynamic HTML pages

[Diagram: the Web Browser talks HTTP / TCP/IP to the Web Server, which serves HTML Files and invokes Scripts (CGI, NSAPI/ISAPI) on an Application Server that accesses the Database over the client-server network.]

User Interface Design - UTCN 30


Web Appl: dynamic java

[Diagram: the Client downloads HTML Files and Applets from the Web Server over HTTP; the Applets communicate over RPC / TCP/IP with Java Programs on a Java Server, which access the Database through JDBC.]

User Interface Design - UTCN 31


GUI development concepts
1. Dialogue independence
2. Structure modeling
3. Representation techniques
4. Interactive techniques
5. Rapid prototyping
6. Development methodology
7. Control structures

User Interface Design - UTCN 32


GUI development concepts
1. Dialogue independence
Internal/external, Data structures, Knowledge, Semantics/Functionality
2. Structure modelling
Task analysis (GOMS, CLG, TAG, SSOA)
Structure description (linguistic and nonlinguistic models, event oriented)
3. Representation techniques
BNF, Regular expressions, Context-free grammars, State transition diagrams, Petri Nets, Event based, UAN, …
4. Interactive techniques
Gestures (click, press-down, release, press-timer, range, drag)
Simple (radio, command and push buttons, check boxes, slider, edit box, …)
Complex (menu, dialogue box, combo box, grid, panel, marks, drag and drop, …)
5. Rapid prototyping
Revolutionary/evolutionary, Interface/whole system, Intermittent/continuous
6. Development methodology
UI generators, scenarios, scripts, UIMS, object oriented, active object based, multi agent based, model based, by demonstration, …
7. Control structures
Sequential/asynchronous dialogue, Local/global, Functional and interface dominant control, Mixed/balanced

User Interface Design - UTCN 33


1. Dialogue independence
o Separate the functional and interactive components
o Internal/external
o Data structures
o Knowledge modeling
o Semantics/Functionality

[Diagram: the Functional Component (App. Objects, App. Operations) and the Interactive Component (Interface Objects, Interface Operations, Interaction Techniques) are separated; they exchange the internal dialogue, while the Interactive Component carries the external dialogue with the User (screen outputs, mouse inputs, ...).]
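A minimal Python sketch of the dialogue-independence idea described above (all names hypothetical, not the course's API): the functional component exposes application objects and operations, and the interactive component reaches them only through that narrow interface, so either side can be replaced independently.

# Minimal sketch of dialogue independence (hypothetical names).

class FunctionalComponent:
    """Application objects and operations; knows nothing about widgets or screens."""
    def __init__(self):
        self.speed = 0            # an application object (e.g. vehicle speed in km/h)

    def set_speed(self, value):   # an application operation
        self.speed = value

class InteractiveComponent:
    """Interface objects and operations; owns the external dialogue with the user."""
    def __init__(self, functional):
        self.functional = functional          # internal dialogue goes through this reference only

    def on_user_input(self, text):            # external dialogue: decode a user input
        self.functional.set_speed(int(text))  # internal dialogue: invoke an application operation

    def render(self):                         # external dialogue: present the state
        print(f"Speed: {self.functional.speed} km/h")

ui = InteractiveComponent(FunctionalComponent())
ui.on_user_input("75")
ui.render()   # -> Speed: 75 km/h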

User Interface Design - UTCN 34


2. Structure modeling
o Task models
o Goals, Operators, Methods, and Selection (GOMS)
o Command Language Grammar (CLG)
o Task Action Grammar (TAG)
o Syntactic-Semantic Object-Action (SSOA)
o Structure description
o Linguistic model
o Nonlinguistic model
o Architectural abstractions

User Interface Design - UTCN 35


GOMS
o GOMS is an acronym that stands for GOALS, OPERATORS, METHODS, and
SELECTION RULES.
o Developed by Card, Moran and Newell (1980)
o A GOMS model is composed of METHODS that are used to achieve specific
GOALS. The METHODS are then composed of OPERATORS at the lowest level.
The OPERATORS are specific steps that a user performs and are assigned a
specific execution time. If a GOAL can be achieved by more than one METHOD,
then SELECTION RULES are used to determine the proper METHOD.
o GOALS are a specific state that a user wants to occur. Goals are intentions, what
you would like to be true. Example: selecting a file through a graphical user
interface. Tasks – are actions, how to achieve it.
o OPERATORS are the low level acts (perceptual, cognitive, and motor) that bring
about changes. OPERATOR execution times are often determined from average
execution times taken from a number of users.
o METHODS are the sequence of OPERATORS used to achieve a GOAL. GOMS models
assume an expert user. There are more than one way to achieve a goal.
o SELECTION RULES are IF THEN type operations that are used to select between
multiple METHODS to achieve a single GOAL.

Ref: http://ei.cs.vt.edu/~cs5724/g2/index.html

User Interface Design - UTCN 36


GOMS example
GOAL: CLOSE-WINDOW
. [select GOAL: USE-MENU-METHOD
. MOVE-MOUSE-TO-FILE-MENU
. PULL-DOWN-FILE-MENU
. CLICK-OVER-CLOSE-OPTION
GOAL: USE-CTRL-W-METHOD
. PRESS-CONTROL-W-KEYS]

For a particular user:

Rule 1: Select USE-MENU-METHOD unless another


rule applies
Rule 2: If the application is GAME,
select CTRL-W-METHOD

Ref: Alan Dix et al.: "Human Computer Interaction". Third edition.

User Interface Design - UTCN 37


CLG
o Command Language Grammar (CLG)
o There are three components and six levels of the Command Language Grammar
(Moran, 1981)

Conceptual component: Task level, Semantic level
Communication component: Syntactic level, Interaction level
Physical component: (Spatial layout level), (Device level)

User Interface Design - UTCN 38


CLG
o Task level: The user comes to the system with a set of tasks that he wants to accomplish.
The purpose of the Task level is to analyze the user’s needs, and to structure his task domain
in a way that is available to an interactive system. The output of this level is a structure of
specific tasks that the user will set for himself with the aid of the system.

o Semantic level: A system is built around a set of objects and manipulations of those objects.
To the system these are data structures and procedures; to the user they are conceptual
entities and conceptual operations on these entities. The Semantic level lays out these entities
and operations. They are intended to be useful for accomplishing the user’s tasks, since they
represent the systems functional capability. Thus, the Semantic level also specifies methods
for accomplishing the tasks in terms of these conceptual entities and operations.
o Syntactic level: The conceptual model of a system is embedded in a language structure, the
command language, for users to communicate with the system. All command languages are
built out of a few syntactic elements; commands, arguments, contexts and state variables. The
Syntactic level lays out these elements. The “meaning” of each command of the system is
defined in terms of operations at the Semantic level, and the methods at the Semantic level
are recorded in terms of Syntactic level commands.
o Interactional level: The dialogue conventions for the user-system interaction must ultimately
be resolved as a sequence of physical actions-key presses and other primitive device
manipulations by the user and display actions by the system. The Interaction level specifies
the physical actions associated with each of the Syntactic level elements, as well as the rules
governing the dialogue.

User Interface Design - UTCN 39


TAG
o Payne, S.J. and Green, T.R.G. (1986) Task-action grammars: a model of the
mental representation of task languages. Human-Computer Interaction 2 (2) 93-
133.
o Describe the consistency of an interaction language as a Task-Action Grammar or
TAG, using a feature-based representation of tasks and a form of attribute
grammar to generate the actions necessary to accomplish those tasks.
o Understanding the user's behavior and cognitive difficulty based on analysis of
language between user and system.
o Very similar to the BNF (Backus-Naur Form) notation.

User Interface Design - UTCN 40


BNF and TAG examples
o In BNF, three UNIX commands would be described as:
copy ::= cp + filename + filename | cp + filenames + directory
move ::= mv + filename + filename | mv + filenames + directory
link ::= ln + filename + filename | ln + filenames + directory

o TAG example
n consistency of argument order made explicit using a parameter, or semantic
feature for file operations

n Feature Possible values


Op = copy; move; link

n Rules
file-op[Op] ::= command[Op] + filename + filename
| command[Op] + filenames + directory
command[Op = copy] ::= cp
command[Op = move] ::= mv
command[Op = link] ::= ln

Ref: Alan Dix et al.: "Human Computer Interaction". Third edition.

User Interface Design - UTCN 41


Linguistic model
o Language based dialogue
o Sequential dialogue modeled by grammar
o Interaction language – dialogue
o Metalanguage – describes the interaction language
o Abstract linguistic levels:
n conceptual
n semantic
n syntactic
n lexical

User Interface Design - UTCN 42


Dialogue transaction model
[Diagram: dialogue transaction model - a sequence of dialogue transactions is processed at the semantic level; each transaction (statement) is composed of interactions (tokens) at the syntactic level (prompter, input, acknowledge); each token is composed of actions (lexemes) at the lexical level (action prompter, action input, feedback).]

User Interface Design - UTCN 43


Nonlinguistic model
o Dialogue cell
o Interaction cycle: Prompter, Input, Action, Flow control
o Dialogue scenarios
o Scenario interpreter

Interaction cycle of a dialogue cell:
1. Prompter
2. Enter (implicit input or user data)
3. Escape: if input = "escape" then
   3.1. activate the "next event" flag
   3.2. end cycle
4. Help: if input = "help" then
   4.1. display more information
   4.2. end cycle
5. Clock: check input; if errors then
   5.1. specify the errors
   5.2. end cycle
6. Action: call the appropriate process
7. FlowControl: activate the "next event" flag

[Figure: dialogue cell - PROMPTER, ECHO, START, VALUE, SYMBOL, FINISH]
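A small Python sketch of the interaction cycle listed above, assuming a console prompt; the step names follow the slide (Prompter, Enter, Escape, Help, Clock/validation, Action, FlowControl), everything else is hypothetical.

# Sketch of one dialogue cell's interaction cycle (console version, hypothetical details).

def dialogue_cell(prompt, validate, action, help_text="No further information."):
    while True:
        value = input(prompt + " ")          # 1. Prompter + 2. Enter
        if value == "escape":                # 3. Escape: leave the cell, signal "next event"
            return None
        if value == "help":                  # 4. Help: display more information, repeat the cycle
            print(help_text)
            continue
        errors = validate(value)             # 5. Clock: check the input
        if errors:
            print(errors)                    # 5.1 specify the errors, 5.2 end (repeat) the cycle
            continue
        action(value)                        # 6. Action: call the appropriate process
        return value                         # 7. FlowControl: activate the "next event" flag

# Example use: a cell that reads a value between 0 and 100.
if __name__ == "__main__":
    dialogue_cell("Speed (0-100)?",
                  validate=lambda v: "" if v.isdigit() and 0 <= int(v) <= 100 else "Type a number from 0 to 100.",
                  action=lambda v: print("Speed set to", v))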

User Interface Design - UTCN 44


Nonlinguistic model

[Figure: dialogue cell example - PROMPTER, ECHO, START, VALUE, SYMBOL, FINISH]

User Interface Design - UTCN 45


Dialogue description language - HTML
<HTML>
<HEAD>
<TITLE>Fundamentals on Computer Graphics - Menu</TITLE>
<SCRIPT LANGUAGE="JavaScript">
function changeImage(name,btn,state){
document.images[name].src="../btn"+btn+state+".jpg"
}
</SCRIPT>
</HEAD>

<BODY LINK="#0000FF" VLINK="#800080" BACKGROUND="../bgmenu.JPG">


...
<CENTER><TABLE BORDER=0 CELLSPACING=0 CELLPADDING=2 WIDTH="110" >
<FORM>
...
<TR>
<TD ALIGN=CENTER VALIGN=CENTER HEIGHT="25">
<CENTER>
<A HREF="ass.html"
onMouseOver="changeImage('BtnAss','Ass','D')"
onMouseOut="changeImage('BtnAss','Ass','U')" TARGET="main">
<IMG SRC="../btnAssU.jpg" NAME="BtnAss" BORDER=0 width="110" height="27">
</A>
</CENTER>
</TD>
</TR>
</FORM>
</TABLE></CENTER>

</BODY>
</HTML>

User Interface Design - UTCN 46


Dialogue description language - SMIL
o SMIL (Synchronized Multimedia Integration Language), W3C standard,
XML language
o Multimedia presentation (i.e. audio, video, text, and graphics) and
interaction on the client site
o Example:

<par>
<a href="#Story"> <img src="button1.jpg"/> </a>
<a href="#Weather"> <img src="button2.jpg"/></a>
<excl>
<par id="Story" begin="0s">
<video src="video1.mpg"/>
<text src="captions.html"/>
</par>

<par id="Weather">
<img src="weather.jpg"/>
<audio src="weather_rpt.mp3"/>
</par>
</excl>
</par>

User Interface Design - UTCN 47


BNF and diagramatic representation
<command> ::= <create> │ <polyline> │ <delete> │ <move> │ STOP
<create> ::= CREATE + <type> + <position>
<type> ::= SQUARE │ TRIANGLE
<position> ::= NUMBER + NUMBER
<polyline> ::= POLYLINE + <vertex_list> + END_POLY
<vertex_list> ::= <position> │ <vertex_list> + <position>
<delete> ::= DELETE + OBJECT_ID
<move> ::= MOVEA + OBJECT_ID + <position>

User Interface Design - UTCN 48


Multi-party grammar representation
o Sequential dialogue:

< Command > ::= < H: Open >< C: Open-ACK >


< H: Open > ::= OPEN < H: FileName >
< C: Open-ACK > ::= [< H: Filename >] IS NOW OPEN

User Interface Design - UTCN 49


State transition diagram
o Sequential dialogue:
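The diagram itself is not reproduced here; as an illustration, a sequential dialogue can be written down as a state transition table. A minimal Python sketch (hypothetical states and tokens), loosely following the OPEN command from the multi-party grammar above:

# Sketch of a sequential dialogue driven by a state transition diagram (hypothetical states/tokens).

TRANSITIONS = {
    ("start",     "OPEN"):     "want_name",   # user typed the command keyword
    ("want_name", "FILENAME"): "opened",      # user supplied a file name
    ("opened",    "CLOSE"):    "start",       # dialogue returns to the initial state
}

def run_dialogue(tokens):
    state = "start"
    for token in tokens:
        kind = "FILENAME" if state == "want_name" else token
        state = TRANSITIONS.get((state, kind))
        if state is None:
            raise ValueError(f"illegal input {token!r}")
        if state == "opened":
            print(f"{token} IS NOW OPEN")     # computer's acknowledgement (C: Open-ACK)
    return state

run_dialogue(["OPEN", "report.txt", "CLOSE"])  # prints: report.txt IS NOW OPEN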

User Interface Design - UTCN 50


State transition diagram
o RAPID/USE System (1985, Univ. of California)
o User Software Engineering (USE) methodology
o The dialogue is developed through a graphical editor, TDE (Transition Diagram Editor)
o The dialogue is described by the RAPID/USE programming language
o The user interface is created during the execution of the dialogue description

User Interface Design - UTCN 51


RAPID/USE dialogue program
DIAGRAM irg entry start exit quit

NODE start
    cs, r2, rv, c_ 'Interactive Power Network Guide', sv,
    r6, c5, 'Please make a choice:',
    r+2, c10, '1: Add a new electric station to database',
    r+2, c10, '2: Give information about a station',
    r+2, c10, '3: Read data for a given station',
    r+2, c10, '4: Help',
    r+2, c10, '5: Quit',
    r+3, c5, 'Your choice:', mark_A
NODE help
    cs, r5, c0, 'This program stores and retrieves information about',
    r+1, c0, 'stations, with emphasis on Cluj-Napoca.',
    r+1, c0, 'You can add or update information about stations',
    r+1, c0, 'already in the database, or obtain information about',
    r+1, c0, 'stations, including information of others.',
    r+2, c0, 'To continue, type RETURN.'
NODE error
    r$-1, rv, 'Illegal command.', sv, 'Please type a number from 1 to 5.',
    r$, 'Press RETURN to continue.'
NODE clear
    r$-1, cl, r$, cl
NODE wakeup
    r$, cl, rv, 'Please make a choice', sv, tomark_A
NODE quit
    cs, 'Thank you much. Please try this program again',
    nl, 'and continue to add information on stations.'

ARC start single_key
    ON '1' TO (addinf)
    ON '2' TO (givedata)
    ON '3' TO (readinf)
    ON '4', '?' TO help
    ON '5' TO quit
    ALARM 30 TO wakeup
    ELSE TO error
ARC error
    ELSE TO start
ARC help
    SKIP TO clean
ARC clean
    ELSE TO start
ARC (addinf)
    SKIP TO start
ARC (readinf)
    SKIP TO start
ARC (givedata)
    SKIP TO start

User Interface Design - UTCN 52


User Action Notation (UAN)
o Asynchronous dialogue

o TASK: select icon

  User action     Interface echo
  ~[icon] Mv      icon!
  M^

o TASK: delete file

  User action     Interface echo                    Interface status
  ~[file] Mv      file!, forall(file!): file-!      selected = file
  ~[x,y,]*        outline(file) > ~
  ~[trash]        outline(file) > ~, trash!
  M^              erase(file), trash!!              selected = null

User Interface Design - UTCN 53


Event based representation
o Asynchronous dialogue
o Skeleton of an event handler declaration in University of Alberta UIMS
(1985):

Eventhandler event_handler_name Is
Token /* input and output tokens */
token_name event_name; /* the handler can process */
...
Var /* declaration of local variables */
type variable_name=initial_value;
...
Event event_name:type{ /* event declarations */
statements
}
...
Event event_name:type{
statements
}
end event_handler_name
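The skeleton above is specific to the University of Alberta UIMS; the same event-handler idea can be sketched in Python (hypothetical names, not the UIMS's actual API): tokens/events are registered with handler routines that update local state.

# Hypothetical sketch of an event handler for asynchronous dialogue (not the Alberta UIMS API).

class EventHandler:
    def __init__(self, name):
        self.name = name
        self.handlers = {}            # event name -> callable ("Event ... { statements }")
        self.state = {}               # local variables ("Var" section)

    def on(self, event_name, fn):     # declare which tokens/events this handler can process
        self.handlers[event_name] = fn

    def dispatch(self, event_name, *args):
        fn = self.handlers.get(event_name)
        if fn:
            fn(self.state, *args)

speed_handler = EventHandler("speed_handler")
speed_handler.state["value"] = 0
speed_handler.on("increment", lambda st, step: st.update(value=st["value"] + step))
speed_handler.dispatch("increment", 5)
print(speed_handler.state["value"])   # -> 5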

User Interface Design - UTCN 54


4. Interactive techniques
o Gestures
n click, press-down, release, press-timer, range, drag
o Simple interaction techniques
n radio, command and push buttons, check boxes, slider, edit
box, …
o Complex interaction techniques
n menu, dialogue box, combo box, grid, panel, marks, drag
and drop, …

[Diagram: the Functional Component and the Interactive Component exchange the internal dialogue; the Interactive Component (Interface Objects, Interface Operations, Interaction Techniques) carries the external dialogue with the User (screen outputs, mouse inputs, ...).]

User Interface Design - UTCN 55


5. Rapid prototyping
o Revolutionary / evolutionary
o Interface / whole system
o Intermittent / continuous

[Diagram: Functional Component / Interactive Component / User, connected through the internal and external dialogue, as on the previous slide.]

User Interface Design - UTCN 56


6. Development methodology
o UI generators
o Scenarios
o Scripts
o UIMS
o Object oriented
o Active object based
o Multi agent based
o Model based
o By demonstration
o …

User Interface Design - UTCN 57


User Interface Management Systems-UIMS

[Diagram: UIMS roles - the End User carries the external dialogue with the Dialogue Component, which carries the internal dialogue with the Computational Component; the Dialogue Developer uses the Dialogue Development Tools, the Application Programmer uses the Programming Environment, an Evaluator provides feedback for iterative dialogue refinement, and the two roles share inter-role communication.]

User Interface Design - UTCN 58


Active Object Model

[Diagram: layered architecture - User, Graphical User Interface, AOM Model, AOM Platform, Computer Resources and the Internet.]

User Interface Design - UTCN 59


Active Object Model GUI

User Interface Design - UTCN 60


Active Object Modeling Language
AGM "CheckBox";
variable V1{
value FALSE;
}, V2{
value FALSE;
}, V3{
value FALSE;
};
agent Controls{
position (4,4);
state PLAY;
visibility 1;
interactor I1{
position (10,10);
mouse_event Buton1{ }, Buton2{
}, Buton3{ };
};
behavior Behe{
position (10,10);
type CYCLE;
direction FORWARD;
steps 30;
trajectory T1{
position (79,45);
trjposition ETP1{
position (10,10);
type UNCOND;
};
};
};
presentation pres{
position (0,0);
graphics G1{
position (26,29);
drawtext("",5,5,195,24,DT_LEFT);
};
};
};

User Interface Design - UTCN 61


Active Object Modeling Language
mouse_event Buton1{
type LEFTBDOWN;
zone Z1 {
visibility 1;
graphics G1{
select brush,br1{
color RGB(255,255,255);
};
rectangle(11,11,21,21);
textout(30,8,"Monitor");
};
};
do {
type UNCOND;
rule R1{
position(4,4);
action A1{
set variable(V1).value,!variable(V1).value;
};
}, R2{
position(4,4);
condition C1{variable(V1).value};
action A1{
set
agent(Controls).interactor(I1).mouse_event(Buton1).zone(Z1).graphics(G1).brush(br1).color_red,0;
}, A2{
set
agent(Controls).interactor(I1).mouse_event(Buton1).zone(Z1).graphics(G1).brush(br1).color_green,0;
};
}, R3{

};
};
},

User Interface Design - UTCN 62


7. Control structures

o Sequential/asynchronous dialogue
o Local/global
o Functional and interface dominant control
o Mixed/balanced

[Diagram: Functional Component (App. Objects, App. Operations) / Interactive Component (Interface Objects, Interface Operations, Interaction Techniques) / User, connected through the internal and external dialogue.]

User Interface Design - UTCN 63


Functional and interface dominant control

User Interface Design - UTCN 64


User event oriented application

[Diagram: in a user event oriented application the Interactive Component (Interface Objects, Interface Operations, Interaction Techniques) receives the external dialogue from the User and drives the Functional Component (App. Objects, App. Operations) through the internal dialogue.]

User Interface Design - UTCN 65


User event based control

[Diagram: user actions (MouseMove, MouseOver, PushLeftButton, push button, ...) generate user events that are placed as messages in a message queue; the message processing loop takes each message and dispatches it to the appropriate procedure P1, P2, ..., Pn.]
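A compact Python sketch of the message-processing loop pictured above (hypothetical event names and procedures): user events are queued as messages and the loop dispatches each one to the corresponding procedure.

# Sketch of user-event based control: a message queue and a processing loop (hypothetical names).
from collections import deque

procedures = {                       # P1, P2, ..., Pn from the figure
    "PushLeftButton": lambda msg: print("button pressed at", msg["pos"]),
    "MouseMove":      lambda msg: print("pointer moved to", msg["pos"]),
}

message_queue = deque()

def post(event, **data):             # user actions post messages into the queue
    message_queue.append({"event": event, **data})

def processing_loop():               # the message-processing loop dispatches messages
    while message_queue:
        msg = message_queue.popleft()
        handler = procedures.get(msg["event"])
        if handler:
            handler(msg)

post("MouseMove", pos=(10, 20))
post("PushLeftButton", pos=(10, 20))
processing_loop()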

User Interface Design - UTCN 66


Questions and problems
1. Explain the main difference between the HCI and CHI domains.
2. Explain the notion “visual programming techniques”
3. Explain the notion “end user programming and development”
and highlight some main issues.
4. Exemplify the conceptual architecture of the interactive
application. Detail the Functional and Interactive components.
5. What are the main issues in the achievement of the dialogue
independence for an interactive application? Give examples for
application and knowledge model.
6. Define shortly the concept of structure modeling. Give
examples for linguistic and graphics user interactions.
7. What is the conceptual difference between task and goal?

User Interface Design - UTCN 67


Questions and problems
8. Identify the particularities and explain the use of task models
such as GOMS to the development of the interactive
applications.
9. What is the difference between an interaction language and a
metalanguage?
10. Exemplify the notion of Dialogue transaction model for
linguistic, graphical, and voice based user interaction. Highlight
the lexical, syntactic and semantic levels.
11. Exemplify the Dialogue cell concept for various levels of
interaction: lexical (identifier), syntactic (command) and
semantic (dialogue).
12. Explain the main contribution of TDE (Transition Diagram
Editor) in the RAPID/USE project.
13. Exemplify the description of the aspect, behavior, and
interaction by the User Action Notation (UAN) language.

User Interface Design - UTCN 68


Questions and problems
14. Explain by comparison the Event based architecture and
interaction.
15. Define shortly and exemplify the concept of user interaction
technique.
16. Define shortly and exemplify the concept of rapid prototyping.

17. Describe and exemplify the User Interface Management


Systems (UIMS). Identify and explain the roles.
18. Define the concept of Control structure. What are the main
advantages and disadvantages of the interface dominant
control?
19. Give an example of the mixed and balanced control structure.

User Interface Design - UTCN 69


Input and output communication
concepts

User Interface Design


Contents
o Communication concepts
o Output communication concepts
o Presenters
o Output communication models
o Input communication concepts

User Interface Design - UTCN 2


Interactive applications

[Diagram: Functional Component (App. Objects, App. Operations) / internal dialogue / Interactive Component (Interface Objects, Interface Operations, Interaction Techniques) / external dialogue / User (screen outputs, mouse inputs, ...).]

User Interface Design - UTCN 3


Communication concepts

[Diagram: the Functional Component (Objects, Operations) communicates with the User through the Interactive Component: output communication concepts pass through an Encoder toward the user outputs (screen, ...), and input communication concepts coming from the user inputs (mouse, ...) pass through an Interpreter/Decoder.]

User Interface Design - UTCN 4


Speed meter

User Interface Design - UTCN 5


Speed meter

User Interface Design - UTCN 6


Speed meter

User Interface Design - UTCN 7


Communication concepts
o Categories of information about objects and operations which
user and program communicate through the interaction
language
o Interaction language
o Semantics: the communication concepts
o Syntax: encoding rules for the communication concepts
o Lexis: atomic encoding of the communication concepts
o Example
o communication concepts = the set of possible interface objects that
could be user inputs of a given command.

[Figure: speedometer example - operations Op1, Op2, Op3; interface objects O1, O2, O3, O4; a dial from 0 to 100 km/h.]

User Interface Design - UTCN 8


Communication concept types

1. Output communication concepts


o Content
o Change

2. Input communication concepts


o Input gathering
o Invocation
o Execute
o Terminate

[Figure: speedometer example - operations Op1, Op2, Op3; objects O1, O2, O3, O4; dial from 0 to 100 km/h.]

User Interface Design - UTCN 9


Output communication concepts
1. Content communication concepts
2. Change communication concepts

o Example:
Speed - application variable
1. Content: Value = 75 Km/h
Change: Set the speed
2. Content: Increment value = 0.5 km/h
Change: Increment the speed

[Figure: speedometer dial from 0 to 100 km/h]

User Interface Design - UTCN 10


Presenters
o Transform output communication concept into image
o Image = abstract representation of the output
communication concept
o If the presenter includes the software device driver the
image is a physical image

[Diagram: output communication concepts enter the Presenter, which uses its Presenter Model to produce an Image.]
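A minimal sketch of a presenter, assuming the speedometer example used throughout this chapter (all names hypothetical): the presenter holds its own model (format knowledge such as range and units) and maps an output communication concept, a speed value, to an abstract image.

# Sketch of a presenter turning an output communication concept into an (abstract) image.

class SpeedometerPresenter:
    def __init__(self, minimum=0, maximum=100, units="km/h"):
        self.minimum, self.maximum, self.units = minimum, maximum, units   # presenter model / format knowledge

    def present(self, speed):
        """Map the content communication concept (a speed value) to an abstract image."""
        clipped = max(self.minimum, min(self.maximum, speed))
        angle = 180.0 * (clipped - self.minimum) / (self.maximum - self.minimum)  # needle position
        return {"needle_angle_deg": angle, "label": f"{clipped} {self.units}"}

print(SpeedometerPresenter().present(75))   # -> {'needle_angle_deg': 135.0, 'label': '75 km/h'}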

User Interface Design - UTCN 11


Presenter – concept to image

[Diagram: output communication concepts pass through an Interpreter and Encoder into an abstract representation of the image (e.g. 0 1 0 1), and then through a software device driver (e.g. GKS, PHIGS, DECwindows, Windows, Delphi) into the physical image - for example a voltmeter dial from 0 to 10 volts with markings at 2.5, 5.0 and 7.5, drawn from buttons and lines.]

User Interface Design - UTCN 12


Presenter implementation
o Procedures
o Functions
o Hierarchy of classes

[Class hierarchy diagram: Button is specialized into RectButton, RadioButton, ..., SpinnerButton; further specializations include PushButton and ToggleButton, down to concrete presenters such as a Red/Blue RotateTglBtn.]

User Interface Design - UTCN 13


Domain of the presenter
o The set of output
communication
concepts for which the
presenter provides
images

o The set of parameters


describes a particular
presenter of the
domain

User Interface Design - UTCN 14


Knowledge required by a presenter

o Model knowledge (state knowledge, change knowledge)
o Domain knowledge
o Format knowledge
o Environment knowledge

[Diagram: the presenter's state is built from model knowledge (state, change), domain knowledge, format knowledge and environment knowledge.]

User Interface Design - UTCN 15


Presenter’s knowledge
o Source of the format knowledge
1. User interface programmer
2. Inherited from the ancestor presenter
3. User options
4. Database

o Example:
o Programmer encodes the fonts of a list box.
o Push command and press buttons have a rectangular shape
inherited from the button class.
o The end user selects through the Option menu the filling
pattern.
o The graphical attributes are extracted from a database according to the interaction style or a global parameter.

User Interface Design - UTCN 16


State, format and domain knowledge
o Parameterized presenters
o Different parameters

User Interface Design - UTCN 17


Output communication models
1. Continuous refresh model
2. Message model

[Diagram: the Application Model (Functional Component, internal dialogue) is connected to the Presenters (Interactive Component, external dialogue) and to the User; speedometer example, 0-100 km/h.]

o Information passed through the connection:
1. Image
2. Format
3. Current state of the presentation
(*!!* Web applications, three-tier applications)
o Issues:
o Avoid unnecessary updating of the display
o Delay between the model change and the presentation update
o Event/state presentation

User Interface Design - UTCN 18


Message output model

[Diagram: message output model - the Application Model sends messages over a low-speed channel to a partial model replication in the interactive part; events are identified from the messages.]

User Interface Design - UTCN 19


Message model
o Example: modify the speed with an implicit increment

[Figure: an increment message is sent to the Presenter (speedometer, 0-100 km/h); the presenter's state holds the increment value.]

o Issues:
o Different communication protocol
o Presenters get the changes
o Uses a set of change communication concepts
o Difficult change communication:
n to avoid unnecessary displays
n to supply enough knowledge to the presenters
n to present the events
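A small sketch of the message output model using the slide's speed example (hypothetical names): the presenter keeps the increment value in its own state and only the change message travels over the low-speed channel.

# Sketch of the message output model: only change messages reach the presenter (hypothetical names).

class IncrementingSpeedPresenter:
    def __init__(self, increment=0.5):
        self.increment = increment        # presenter state: the implicit increment value
        self.shown_speed = 0.0            # partial replication of the application model

    def on_message(self, message):
        if message == "increment":        # change communication concept
            self.shown_speed += self.increment
            self.redraw()

    def redraw(self):
        print(f"speedometer shows {self.shown_speed} km/h")

p = IncrementingSpeedPresenter()
p.on_message("increment")                 # -> speedometer shows 0.5 km/h
p.on_message("increment")                 # -> speedometer shows 1.0 km/h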

User Interface Design - UTCN 20


Continuous refresh model

[Diagram: the Application Model is connected over a high-speed channel to the interactive part, which is continuously refreshed.]

User Interface Design - UTCN 21


Continuous refresh model

[Diagram: the whole graphical model mirrors the Application Model and is continuously refreshed.]

User Interface Design - UTCN 22


Continuous refresh model
o Simplicity
o No extra communication concepts
o Complex event communication to the user
o Presenters infer the event that caused a change
o Unnecessary displays
o Significant delay caused by unnecessary displays
o Avoiding unnecessary displays involves a complex implementation

[Diagram: a continuous connection links the Application Model to the Presenter, which keeps its own presenter state.]

User Interface Design - UTCN 23


Mixed output model

[Diagram: mixed output model - the Application Model sends messages to a partial model replication and also continuously refreshes part of the presentation.]

User Interface Design - UTCN 24


Model-View-Controller (MVC)
o The Model-View-Controller (MVC) pattern is a way of supporting
multiple presentations of data.

Image taken from Ch. 16 - User Interface Design, in "Software Engineering", Ian Sommerville, 7th ed., 2004
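A minimal sketch of the MVC pattern described above (hypothetical class names): one model, several registered views, and a controller that translates user input into model updates.

# Minimal MVC sketch: the model notifies every registered view of each change (hypothetical names).

class Model:
    def __init__(self):
        self.value, self.views = 0, []

    def attach(self, view):
        self.views.append(view)

    def set_value(self, value):
        self.value = value
        for view in self.views:          # each view renders its own presentation of the same data
            view.render(self)

class TextView:
    def render(self, model):
        print("text view:", model.value)

class BarView:
    def render(self, model):
        print("bar view: ", "#" * model.value)

class Controller:
    def __init__(self, model):
        self.model = model

    def user_typed(self, text):          # interprets user input and updates the model
        self.model.set_value(int(text))

m = Model()
m.attach(TextView()); m.attach(BarView())
Controller(m).user_typed("5")            # both views are refreshed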

User Interface Design - UTCN 25


Model-View-Controller

Image taken from Ch. 16 - User Interface Design, in "Software Engineering", Ian Sommerville, 7th ed., 2004

User Interface Design - UTCN 26


MVC in interactive applications

User Interface Design - UTCN 27


Hand gesture language

User Interface Design - UTCN 28


Eye movement

User Interface Design - UTCN 29


User input correctness
o How do you differentiate between useful user inputs and involuntary inputs (noise)?
o Control information
o Input communication concepts

User Interface Design - UTCN 30


Input communication concepts
*Set Table Color to Red^

user inputs + control information

[Figure: building a command from interface objects - operations (Move, Set, Delete, Round, ...), objects (Table, Chair, Book, Door, Wall, ...) and attributes (Name, Weight, Ok, Color, Price, ...).]

User Interface Design - UTCN 31


Input communication concepts
o Input gathering communication concepts
o Invocation communication concepts
o Execute communication concepts
o Terminate communication concepts

[Diagram: the role of commands and recognisers - user Events are handled by Recognisers in the Interactive Component and turned into Communication concepts; Commands assemble them and invoke Operations in the Functional component.]

User Interface Design - UTCN 32


Input gathering communication concepts
o The role of the commands and recognizers:
1. Check the input correctness
2. Decode the user inputs
3. Gather the inputs toward an operation
4. Transform the inputs into a form available to the operations
5. Send the inputs to the operation

o Type of object:
n Conceptual
n Generated by operational execution

[Diagram: gathering the inputs - the Command collects user inputs from the objects pool and sends them to the Operations.]
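A small sketch of a command gathering inputs for an operation, following the five roles listed above (all names hypothetical):

# Sketch of a command object that checks, decodes, gathers and forwards user inputs (hypothetical names).

class SetColorCommand:
    REQUESTED_INPUTS = ("object", "attribute", "value")     # knowledge about the operation

    def __init__(self, operation):
        self.operation = operation
        self.inputs = {}

    def accept(self, name, token):
        if name not in self.REQUESTED_INPUTS:                # 1. check input correctness
            raise ValueError(f"unexpected input {name!r}")
        self.inputs[name] = token.strip().lower()            # 2. decode, 3. gather
        if len(self.inputs) == len(self.REQUESTED_INPUTS):   # syntactic form is complete
            args = tuple(self.inputs[n] for n in self.REQUESTED_INPUTS)  # 4. transform
            self.operation(*args)                            # 5. send the inputs to the operation

cmd = SetColorCommand(lambda obj, attr, val: print(f"set {obj}.{attr} = {val}"))
cmd.accept("object", "Table")
cmd.accept("attribute", "Color")
cmd.accept("value", "Red")       # -> set table.color = red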

User Interface Design - UTCN 33


Knowledge for inputs gathering
1. Knowledge requested by commands
o About operations
o About user inputs
2. Knowledge requested by presenters
Syntactical forms:
Prefix: Op1, O1, O2, ...^
Infix: *O1, ..., Op1, ... On^
Postfix: *O1, ..., On, Op1

Examples: Rotate, O1, x, y, θ^ ; *O1, Aggregate, O3^ ; *O1, O3, Round^

* explicit specification of the beginning
^ explicit specification of the end

[Figure: operations Op1, Op2, Op3 and objects O1, O2, O3, O4]

User Interface Design - UTCN 34


Syntactic form building up
*Set Table Color to Red^

o User builds the syntactic form by:


n Graphical user interface (GUI): direct selection of interface
objects -> no lexical errors
n Voice interface: pronunciation
n Linguistic interface: writing the words

[Figure: operations (Move, Set, Delete, Round, ...), objects (Table, Chair, Book, Door, Wall, ...) and attributes (Name, Weight, Ok, Color, Price, ...), as on the earlier slide.]

User Interface Design - UTCN 35


Syntactic form in GUI

User Interface Design - UTCN 36


Syntactical form of user command
Postfix: *O1,…,On, Op1

Example: Obj1, Obj2, Obj3, …, Group

User Interface Design - UTCN 37


Knowledge requested by commands
o Knowledge requested by commands about operations:
1. Input Identifiers (II)
2. Choice Input Set (CIS)
3. Requested Inputs (RI)

o Knowledge requested by commands about user inputs:


1. Basic Set (BS)
2. Basic Predicate (BP) (semantic constraint)
3. Inputs Gathering Predicate (IGP)
4. Transformer (T)
5. Generator (G)

User Interface Design - UTCN 38


Knowledge requested by commands
o Content communication concepts that may be used by commands:
o Possible inputs
o Correctness
o Alternatives

[Figure: operations Op1, Op2, Op3 and objects O1, O2, O3, O4]

o Content communication concepts provide for:
o controlling the mouse sensitivity
o highlighting errors and error explanations
o alternative menus
o a flexible order for specifying the user inputs
o different options for checking input correctness

User Interface Design - UTCN 39


Possible user inputs
*Set Table Color to Red^

o Syntactic and semantic knowledge enable only the possible user inputs
o Avoid lexical, syntactic and some semantic errors

[Figure: operations (Move, Set, Delete, Round, ...), objects (Table, Chair, Book, Door, Wall, ...) and attributes (Name, Weight, Ok, Color, Price, ...).]

User Interface Design - UTCN 40


Knowledge embedded - GUI features
o Possible inputs, Correctness, Alternatives

User Interface Design - UTCN 41


Invocation communication concepts
o Knowledge to:
1. control the invocation,
2. interpret the invocation
o Input communication concepts to invoke the operations
1. Command state
active / inactive
execution / waiting
2. Variables that control the command state
active
execute

o Knowledge requested to interpret the invocation


communication concepts
execute
activate
preview

User Interface Design - UTCN 42


Preview operation
o Simulated operation -> temporary replicated model
o Expensive operation - requires resources (computation time, simulation algorithm, replicated model, memory capacity, execution control, etc.)

[Diagram: the Preview operation runs on a temporary, simulated copy of the application model, which is removed afterwards.]
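A short sketch of the preview mechanism described above (hypothetical names): the operation is executed on a temporary copy of the application model, the result is shown, and the copy is then discarded.

# Sketch of a Preview operation executed on a temporary replicated model (hypothetical names).
import copy

def preview(model, operation, *args):
    temp_model = copy.deepcopy(model)       # temporary, simulated application model
    operation(temp_model, *args)            # run the operation on the copy only
    print("preview:", temp_model)           # present the simulated result to the user
    # temp_model is discarded here; the real application model is untouched

model = {"speed": 50}
preview(model, lambda m, inc: m.update(speed=m["speed"] + inc), 25)   # preview: {'speed': 75}
print("actual: ", model)                                              # actual:  {'speed': 50}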

User Interface Design - UTCN 43


Execute and terminate communication concepts

o Execute Communication Concepts


Control the operation execution:
suspend / resume
cancel
o Terminate Communication Concepts
undo
redo
commit

User Interface Design - UTCN 44


Undo operations
o Operations, reversible operations, irreversible operations
o Reversible operations: op (par) -> stack
o Irreversible operations: replicate the application model
[Diagram: for reversible operations, Op(par) and its inverse Op^-1(par) are kept on a stack next to the application model; for irreversible operations Op 1, Op 2, ..., Op k-1, replicated application models Model 1, Model 2, ..., Model k are kept and removed when no longer needed.]
remove

User Interface Design - UTCN 45
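The two undo strategies from the slide can be sketched as follows (illustrative Python, hypothetical names): reversible operations push their inverse onto a stack, while irreversible operations first store a replicated snapshot of the application model.

# Illustrative sketch: undo via an inverse-operation stack and via model snapshots.

import copy

class UndoManager:
    def __init__(self, model):
        self.model = model
        self.stack = []                              # inverse operations or snapshots

    def do_reversible(self, op, inverse, *par):
        op(self.model, *par)
        self.stack.append(("inverse", inverse, par))  # op(par) -> stack

    def do_irreversible(self, op, *par):
        self.stack.append(("snapshot", copy.deepcopy(self.model), None))
        op(self.model, *par)                         # replicate the model, then apply

    def undo(self):
        kind, payload, par = self.stack.pop()
        if kind == "inverse":
            payload(self.model, *par)                # apply op-1(par)
        else:
            self.model.clear()
            self.model.update(payload)               # restore the previous replica

model = {"O1": 0}
mgr = UndoManager(model)
mgr.do_reversible(lambda m, d: m.update(O1=m["O1"] + d),
                  lambda m, d: m.update(O1=m["O1"] - d), 10)
mgr.do_irreversible(lambda m: m.pop("O1"))
mgr.undo(); mgr.undo()
print(model)                                         # back to {'O1': 0}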


Questions and problems
1. Define and exemplify the communication concept in the
interactive application.
2. What is the difference between the metaphorical presentation
and the input/output communication concept? Give some
examples.
3. Explain the role of Decoder and Encoder in the structure of an
interactive application.
4. Exemplify the notion of Content and Change communication
concepts in a graphics editor within the specification and
execution of the operation “edit a circle”.
5. Explain the components and knowledge of a presenter for the
example of speedometer on the board of a car.
6. Explain and exemplify the Continuous refresh output
communication model.

User Interface Design - UTCN 46


Questions and problems
7. Explain and exemplify the Message output communication
model.
8. Exemplify and explain an interactive application architecture
that combines both output communication models.
9. Explain and exemplify the Input gathering communication
concept.
10. Explain and exemplify the Invocation communication concept.

11. Explain and exemplify the Execute communication concept.

12. Explain and exemplify the Terminate communication concept.

13. Explain the role of the Recognizers module throughout a user
interaction process.
14. Explain the role of the Commands module throughout a user
interaction process.

User Interface Design - UTCN 47


Questions and problems
15. Explain and exemplify the notions of Knowledge requested by
commands about operations and about user inputs.
16. Exemplify and explain a knowledge based implementation in
GUI (e.g. Chess game) of the feature Possible inputs.
17. Exemplify and explain a knowledge based implementation in
GUI (e.g. Chess game) of the feature Correctness inputs.
18. Exemplify and explain a knowledge based implementation in
GUI (e.g. Chess game) of the feature Alternatives inputs.
19. Describe a technical solution to implement the Preview
operation in a graphical interactive application.
20. Describe a technical solution to implement the Undo operation
in a graphical interactive application.

User Interface Design - UTCN 48


Interaction Design Process

User Interface Design


Contents
o Interaction Design

o Practical issues

o User centered development methodology

o Lifecycle models from software engineering

o Lifecycle models from HCI

User Interface Design - UTCN 2


Interaction design
o It is a process:
n a goal-directed problem solving activity informed by
o intended use
o target domain
o materials
o cost
o feasibility
n a creative activity
n a decision-making activity to balance trade-offs

o It is a representation:
n a plan for development
n a set of alternatives and successive elaborations

User Interface Design - UTCN 3


Basic activities and characteristics
o There are four basic activities in Interaction Design:

1. Identify needs and establish requirements


2. Develop alternative designs
3. Build interactive versions of the designs
4. Evaluate designs

o Three key characteristics permeate these four activities:


1. Users are involved early in the design and evaluation
2. Identify specific usability and user experience goals
3. Iterative process.

User Interface Design - UTCN 4


Practical issues
o Who are the users?
o What are the ‘needs’?
o What are the alternatives and where do they come from?
o How do you choose among alternatives?

User Interface Design - UTCN 5


User-Centered Development Methodology
o Traditional software engineering methods arose in 1960s
and 1970s
− Systems were not highly interactive
− End-users were computer specialists
− Issues concerning end-users and usability were not at all
important
− User interface design was not considered explicitly

o Now:
− Most end-users are not computer specialists
− Usability vital for success

User Interface Design - UTCN 6


System vs User centered design
o Traditional System-Centered design:
n Emphasis on the functionality,
n UI is added at the end
n Emphasis on correct software rather than on ease of use
n User has to adapt himself to the system

o User-Centered design
n UI more important
n Emphasis on end-users’ tasks,
n Early end-user participation: in analysis and design
n Evaluation by end-users
n Consequences: more work for UI-designer and UI-programmer

User Interface Design - UTCN 7


Lifecycle models
o Show how activities are related to each other
o Lifecycle models
— management tools
— simplified versions of reality
o Example:
— from software engineering: waterfall, spiral, JAD/RAD,
Microsoft
— from HCI: Star, usability engineering

User Interface Design - UTCN 8


A simple interaction design model
o Exemplifies a user-centered design approach

[Figure: iterative cycle – Identify needs/establish requirements → (Re)Design → Build an interactive version → Evaluate → (back to redesign) → Final product]

User Interface Design - UTCN 9


Traditional Waterfall lifecycle

Requirements analysis:  requirements document, prepare use cases
Design:                 software architecture, map the stakeholders
Implementation:         construct the software, data storage and retrieval
Verification:           install, test and debug
Maintenance:            check errors, optimize capabilities

User Interface Design - UTCN 10


RAD (Rapid Applications Development)

Project set-up

JAD workshops

Iterative design
and build

Engineer and
test final prototype

Implementation
review

User Interface Design - UTCN 11


Spiral model (Barry Boehm)
o Important features:
n Risk analysis
n Prototyping
n Iterative framework allowing ideas to be checked and
evaluated
n Explicitly encourages alternatives to be considered

o Good for large and complex projects but not simple ones

User Interface Design - UTCN 12


Spiral model (Barry Boehm)

[Figure: the four quadrants (1–4) of the spiral model]

User Interface Design - UTCN 13


Star lifecycle model
o Suggested by Hartson and Hix (1989)
Hartson, H. R., & Hix, D. (1989). "Toward Empirically Derived Methodologies and Tools for
Human-Computer Interface Development." Int. J. Man-Machine Studies, 31, 477-494.

o Important features:
n Evaluation at the center of activities
n No particular ordering of activities. Development may start
in any one
n Derived from empirical studies of interface designers

User Interface Design - UTCN 14


Star lifecycle model

[Figure: star lifecycle – Evaluation at the centre, linked to Task/functional analysis, Requirements specification, Conceptual/formal design, Prototyping, and Implementation]

User Interface Design - UTCN 15


Game Development Methodology

Phases:  Concept Development → Design → Implementation → Testing → Deployment
People:  concept/design – Game Designers, Subject Matter Expert, Instructional Designer;
         implementation – Artists, Programmers, Technical Directors;
         testing – Professional Testers, Beta Testers, Educational Testers;
         deployment – Sales, Marketing
Stages:  Pre-Production → Production → Post-Production

User Interface Design - UTCN 16


Scenario-Based Usability Engineering
o M.B. Rosson and J.M. Carroll (2002)
ANALYZE:   analysis of stakeholders, field studies → Problem scenarios → claims about current practice
DESIGN:    metaphors, information technology, HCI theory, guidelines → Activity scenarios →
           Information scenarios → Interaction scenarios, with iterative analysis of
           usability claims and redesign
PROTOTYPE & EVALUATE:  Usability specifications → formative evaluation → summative evaluation

User Interface Design - UTCN 17


VUB model
Course “User Aspects of Software Systems”, Prof.dr. Olga De Troyer, Vrije Universiteit Brussel, (2004)

Define Users and Usability Requirements → User Classes & Usability Requirements
User Task Analysis & Modeling → Task models, Task scenarios
User Object Modeling → User Object Models
Define Style Guide → Style Guide
Design UI → Prototype UI → Evaluate UI (against the Usability Requirements;
produces Usability Problems) → UI design

User Interface Design - UTCN 18


Univ Calgary’s UID Methodology
Interface Design and Usability Engineering (http://www.cpsc.ucalgary.ca/~saul/481/)

Stages:   Articulate (who the users are, their key tasks) → Brainstorm designs →
          Refined designs → Completed designs
Approaches spanning the stages: task-centered system design, participatory design,
          user-centered design, evaluation, user involvement, representation & metaphors
Methods:  Brainstorm – psychology of everyday things, participatory interaction,
          task scenario walkthrough, low-fidelity prototyping methods;
          Refined – graphical screen design, interface guidelines, style guides,
          high-fidelity prototyping methods;
          Completed – usability testing, heuristic evaluation, field testing
Products: user and task descriptions → throw-away paper prototypes → testable
          prototypes → alpha/beta systems or complete specification

User Interface Design - UTCN 19


User oriented GUI Design Methodology
o User involved in all the development phases
o User and beneficiary oriented communication language
o Less formal description
o Based on: visual presentation (i.e. GUI), low fidelity prototyping,
scenarios, heuristic evaluation, user evaluation, iterative development,
etc.
o Issues in the GUI description and representation:
n User actions
n Graphical appearance
n User interactions
n Metaphors
n Interaction style
n Interaction techniques
n Navigation
n …

User Interface Design - UTCN 20


Software Development Methodology
o User requirements
o Specifications
o Analysis
o Design
o Implementation
o Testing
o Deployment
o Maintenance

User Interface Design - UTCN 21


GUI Development Phases
o Project proposals
o User visit plan
o Task analysis and task description
o Low fidelity prototype
o Scenarios
o Walkthrough evaluation
o Heuristic evaluation
o High fidelity prototype

User Interface Design - UTCN 22


Questions and problems
1. Explain why identifying both the user requirements and the user's
needs is important in the development of interactive
applications.
2. Explain how the alternative solutions are developed and
evaluated within the analysis phase of the software
development methodology.
3. Explain the concept of System-centered, User-centered, User-
oriented software development methodology.
4. Explain the notion of user participation and user involvement in
the software development methodology.
5. What are the main differences between general software
methodology and the interactive application development
methodology? Identify and explain the particular techniques.
6. Explain and exemplify the Simple interaction design model.
Explain the main advantages and disadvantages.

User Interface Design - UTCN 23


Questions and problems
7. Identify the main concept used in the Simple interaction design
model. Specify what phases the model covers throughout the
software development methodology.
8. Explain and exemplify the Waterfall model. Explain the main
advantages and disadvantages.
9. Identify the main concept used in the Waterfall model. Specify
what phases the model covers throughout the software
development methodology.
10. Explain and exemplify the RAD (Rapid Applications
Development) design model. Explain the main advantages and
disadvantages.
11. Identify the main concept used in the RAD model. Specify what
phases the model covers throughout the software development
methodology.

User Interface Design - UTCN 24


Questions and problems
12. Explain and exemplify the Spiral model. Explain the main
advantages and disadvantages.
13. Identify the main concept used in the Spiral model. Specify
what phases the model covers throughout the software
development methodology.
14. Explain and exemplify the Star model. Explain the main
advantages and disadvantages.
15. Identify the main concept used in the Star model. Specify what
phases the model covers throughout the software development
methodology.
16. Explain and exemplify the Scenario-Based Usability
Engineering model. Explain the main advantages and
disadvantages.
17. Identify the main concept used in the Scenario-Based Usability
Engineering model. Specify what phases the model covers
throughout the software development methodology.
User Interface Design - UTCN 25
Questions and problems
18. Explain and exemplify the VUB model. Explain the main
advantages and disadvantages.
19. Identify the main concept used in the VUB model. Specify what
phases the model covers throughout the software development
methodology.
20. Explain and exemplify the Calgary model. Explain the main
advantages and disadvantages.
21. Identify the main concept used in the Calgary model. Specify
what phases the model covers throughout the software
development methodology.

User Interface Design - UTCN 26


User Interface
Design Methodology

User Interface Design


Contents
 User oriented GUI design methodology

 Software development methodology

 GUI development phases

User Interface Design - UTCN 2


User oriented GUI Design Methodology
 User involved in all the development phases
 User and beneficiary oriented communication language
 Less formal description
 Based on: visual presentation (i.e. GUI), low fidelity prototyping,
scenarios, heuristic evaluation, user evaluation, iterative development,
etc.
 Issues in the GUI description and representation:
 User actions
 Graphical appearance
 User interactions
 Metaphors
 Interaction style
 Interaction techniques
 Navigation
 …

User Interface Design - UTCN 3


Software Development Methodology
 User requirements
 Specifications
 Analysis
 Design
 Implementation
 Testing
 Deployment
 Maintenance

User Interface Design - UTCN 4


GUI Development Phases
 Project proposals
 User visit plan
 Task analysis
 Low fidelity prototype
 Scenarios
 Walkthrough evaluation
 Heuristic evaluation
 High fidelity prototype

User Interface Design - UTCN 5


Usability

User Interface Design


Contents
1. Usability Definition
2. Measuring Usability
3. Usability Paradigms and Principles
4. Usability Engineering
5. Usability Engineering Lifecycle

User Interface Design - UTCN 2


System-Centered Development Methodology

o Traditional software engineering methods arose in 1960s


and 1970s
n Systems were not highly interactive
n End-users were computer specialists
n Issues concerning end-users and usability were not at all important
n User interface design was not considered explicitly

o Now:
– Most end-users are not computer specialists
– Usability vital for success

User Interface Design - UTCN 3


System vs User centered design
o Traditional System-Centered design:
n Emphasis on the functionality,
n UI is added at the end
n Emphasis on correct software rather than on ease of use
n User has to adapt himself to the system

o User-Centered design
n UI more important
n Emphasis on end-users’ tasks,
n Early end-user participation: in analysis and design
n Evaluation by end-users
n Consequences: more work for UI-designer and UI-programmer

User Interface Design - UTCN 4


1. Usability definition
o Usability definition
n “a measure of the ease with which a system can be learned and
used, its safety, effectiveness and efficiency, and attitude of its
users towards it” (Preece et al., 1994)
n “the extent to which a product can be used by specified users to
achieve specified goals with effectiveness, efficiency and
satisfaction in a specified context of use” (ISO 9241-11)
n Usability is a quality attribute that assesses how easy user
interfaces are to use. The word "usability" also refers to methods
for improving ease-of-use during the design process.
o Goal of UI Design → maximum usability
o Problems:
n How can an interactive system be developed to ensure its
usability?
n How can the usability of an interactive system be demonstrated or
measured?

User Interface Design - UTCN 5


Usability attributes
Key usability attributes (Nielsen, 1993):
1. Learnability
How easy is it for users to accomplish basic tasks the first time they
encounter the design?
2. Efficiency
Once users have learned the design, how quickly can they perform
tasks?
3. Memorability
When users return to the design after a period of not using it, how
easily can they reestablish proficiency?
4. Errors
How many errors do users make, how severe are these errors, and
how easily can they recover from the errors?
5. Satisfaction
How pleasant is it to use the design?

User Interface Design - UTCN 6


Why Usability is important
o People leave a website:
n If a website is difficult to use
n If the homepage fails to clearly state what a company
offers and what users can do on the site
n If users get lost on a website
n If a website's information is hard to read or doesn't answer
users' key questions
o Because: there is no such thing as a user reading a
website manual or otherwise spending much time trying
to figure out an interface
o Result: there are plenty of other websites available;
leaving is the first line of defense when users encounter
a difficulty

User Interface Design - UTCN 7


Costs and benefits
o Costs: about 10% of a design project's budget on
usability
o Benefits: roughly double the website's desired quality metrics
o Quality metrics gained by doubling the usability:
n For internal design projects:
o cutting training budgets in half
o doubling the number of transactions employees perform
per hour
n For external designs:
o doubling sales
o doubling the number of registered users or customer
leads
o doubling whatever other key performance indicators

User Interface Design - UTCN 8


Learnability
o Principles affecting learnability:
o Predictability: to be able to predict the result of an interaction
o Feedback: the system provides feedback about the effect of
the interaction
o Familiarity: correlation between the user’s existing knowledge
and the knowledge required to use the interaction
o Generalization: e.g., drawing rectangle will be the same as
drawing square; close/open window will be the same as in
other MS word application
o Consistency: in naming, color use, command invocation, …
o Advantages
o reduces training time and costs
o enables more flexible staffing practices (staff become effective
more quickly)

User Interface Design - UTCN 9


User Interface Design - UTCN 10
[Figure: Selecting the operations in Rhinoceros]

[Figure: Selecting the operations in 3D Studio Max]

User Interface Design - UTCN 11


User Interface Design - UTCN 12
Efficiency
o Always for a specified range of tasks and group of users in a
particular environment.
o Principles affecting efficiency:
o Observe the internal state: The user may visualize the system
states
o Corrective actions: The system recovers the recognized user
errors
o Response time: The time in which the system replies to the
user actions
o Task completeness: The system supports all the user tasks
o Task adequacy: The system supports the task as they are
understood by the user
o Advantages:
n higher productivity

User Interface Design - UTCN 13


Learnability and efficiency
Learning Curves:
o Some systems are designed to focus on learnability (easy to learn, but less
efficient to use)
o Others emphasizes efficiency for proficient users (harder to learn, but then
highly efficient)
o Some systems support both ease of learning and an expert mode (attempt to
ride the top of the curves)

User Interface Design - UTCN 14


2. Measuring Usability
Usability is measurable by its attributes:

1. Learnability
consider novice users of system, measure time to perform certain tasks;
distinguish between no/some general computer experience
2. Efficiency
decide definition of expertise, get sample expert users (difficult), measure
time to perform typical tasks
3. Memorability
get sample casual users (away from system for certain time), measure time
to perform typical tasks
4. Errors
count minor and catastrophic errors made by users while performing some
specified task
5. Satisfaction
ask users' opinion (questionnaire), after trying system for real task

User Interface Design - UTCN 15
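As a small illustration of how such measurements might be recorded and summarised during a test session (the numbers below are made-up sample values, not real data):

# Illustrative sketch: summarising usability measurements per user group.

from statistics import mean

# (group, task_time_seconds, error_count) collected during test sessions
observations = [
    ("novice", 95, 3), ("novice", 120, 5), ("novice", 88, 2),
    ("expert", 40, 0), ("expert", 52, 1),
]

def summarise(group):
    rows = [(t, e) for g, t, e in observations if g == group]
    return {"mean_time_s": mean(t for t, _ in rows),
            "mean_errors": mean(e for _, e in rows)}

for g in ("novice", "expert"):
    print(g, summarise(g))
# learnability ~ novice times on first encounter; efficiency ~ expert times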


Defining the usability measures
o Interface evaluation requires measuring the usability. This needs
goals to be set:
1. Identify desired interface attributes
2. Set specific usability goals
3. Make them measurable

o Example for a customer on-line banking system:


o what usability attributes needed?
o What specific, measurable goals?
o Is it important to improve error avoidance?
o To perform bank operations, does the user need to memorize long
sequences?
o Is high efficiency within a short time needed?

User Interface Design - UTCN 16


Scope of user interfaces
o In traditional software engineering, the interface was added
at the end. Usability was rarely considered, and therefore poor.
o Usability depends on:
n Characteristics of users
n Context of use
n Information architecture
o i.e., the Web applications require appropriate usability. Interface
design is about more than just the screen displays

User Interface Design - UTCN 17


Characteristics of users
o Analyse the target user population
o Personal characteristics, technical abilities, prior experience in this area,
jobs, goals and tasks, motivation, etc.

o Cognitive and physical abilities


o Problem-solving, visual and language literacy, memory capacity, age,
disability, etc.

o Novice/intermediate/expert users need different design


considerations

User Interface Design - UTCN 18


Context of use
o Requires design for very different environments.
Examples
n ATMs on the street
n Database system for estate agents
n Flightboard controller for passenger jets
n CAD tools
n GIS applications
n Wireless terminals
o Four main aspects of context of use (Preece 2002) :
1. Physical environment
2. Social environment
3. Organisational environment
4. Technical environment

User Interface Design - UTCN 19


Information architecture
o This is the way data is organised, labelled and displayed for
users
o Application design
o Designers need to:
o Group information into rational categories
o Label categories appropriately
o Present data so it can be easily read
o Provide alternative methods of accessing the data

User Interface Design - UTCN 20


Disciplines contributing to HCI
o AI (artificial intelligence)
o Anthropology
o Cognitive psychology
o Computer science
o Design
o Ergonomics
o Information architecture/science
o Linguistics
o Organisational psychology
o Philosophy
o Social psychology
o Sociology

User Interface Design - UTCN 21


3. Usability Paradigms and Principles
Usability Paradigms (Conceptual rules):
n Establish the philosophy of user-centered design
n Point out the general direction
n Usually based on new technology
n Access, efficacy, progression, support, and context

Usability Principles (Practical guide):


n Offer specific directions for solving practical problems
n Map the route to take
n Independent of technology
n Structure, simplicity, visibility, feedback, tolerance, and reuse

The success of designing for usability requires both creative insight (new
paradigms) and purposeful principled practice.

User Interface Design - UTCN 22


3.1 Usability Paradigms
o Access
The system should be usable, without help or instruction, by a user who has
knowledge and experience in the application domain but no prior experience
with the system.
o Efficacy
The system should not interfere with or impede efficient use by a skilled
user who has substantial experience with the system.
o Progression
The system should facilitate continuous advancement in knowledge, skill,
and facility and accommodate progressive change in usage as the user gains
experience with the system.
o Support
The system should support the real work that users are trying to accomplish
by making it easier, simpler, faster, or more fun or by making new things
possible.
o Context
The system should be suited to the real conditions and actual environment
of the operational context within which it will be deployed and used.

User Interface Design - UTCN 23


3.2 Usability Principles
Structure
Organize the UI in meaningful and useful ways based on clear, consistent
models that are apparent and recognizable to users, putting related things
together and separating unrelated things (as well as in the user mental model).
Simplicity
Make simple common tasks simple to do. Communicate clearly and simply in the
user’s own language. Provide good shortcuts for long procedures.
Visibility
Keep all needed options and materials for a given task visible, without
distracting the user with extraneous or redundant information.
Feedback
Keep users informed of actions or interpretations, changes of state or condition,
and errors or exceptions. Communicate by a clear and concise language.
Tolerance
Be flexible and tolerant, reducing the cost of mistakes and misuse by allowing
undoing and redoing. Wherever possible tolerate varied inputs and sequences by
interpreting all reasonable actions reasonably.
Reuse
Reuse internal and external components and behaviors, maintaining consistency
with purpose rather than arbitrary consistency, thus reducing the need for users
to rethink and remember.

User Interface Design - UTCN 24


When to Work on Usability
o Usability plays a role in each stage of the development
process:
1. Analysis
2. Design
3. Development
4. Testing
5. Evaluation and validation
6. Production

User Interface Design - UTCN 25


Analysing process
o Check list for analysing the provided features
o Various features impose technical and technological solutions
o Solutions for solving practical problems
o Therefore, providing the main principles/features is a
very important design decision

[Figure: the check list relates the required features and specifications to candidate solutions]

User Interface Design - UTCN 26


Usability Principles Classification
o Learnable
n Predictable
n Synthesize
n Familiar
n Generalize
n Consistent
o Flexible
n Dialogue initiative
n Multimodal
n Task migration
n Substitution
n Customization
o Robust
n Observable
n Recoverable
n Responsiveness
n Stable
n Task conformance

User Interface Design - UTCN 27


Usability Principles Classification
o Seeing/pointing versus remembering/typing
o Consistency (same thing, same way)
o Timely and accurate feedback
o Salient repertoire of actions
o Forgiveness (reversible actions)
o Familiar user conceptual model
o Feedback (acknowledgement of input)
o Prevention of errors
o Easily discriminated action alternatives
o Modeless interaction
o Speaking the user’s language
o Aesthetic integrity (simple design)
o Shortcuts and accelerators
o Real-world conventions
o Help with error recognition and recovery

User Interface Design - UTCN 28


Usability Principles Classification
o Visibility of system status
keeping the user informed
o Match between system and real world
user language and real-world conventions
o User control and freedom
easy exits, undo, and redo
o Consistency and standards
o Error prevention
o Recognition rather than recall
reduced remembering with visible options, actions, and instructions
o Flexibility and efficiency of use
customization and support for advanced users
o Aesthetic and minimalist design
reduced irrelevant or rarely needed information
o Help in recognition, diagnosing, and recovering from errors
o Good help and documentation
User Interface Design - UTCN 29
4. Usability Engineering
o The ultimate test of usability based on measurement of user
experiments
o Demands that specific usability measures be made explicit
in accordance with the requirements
o Usability requirements
o Usability specification
o Measuring concept
o Measuring method
o Criteria for judging

o Usability Engineering Lifecycle

User Interface Design - UTCN 30


Usability requirements
o Requirements that when satisfied will contribute to the
usability of the system
o May be part of or derived from the general system
requirements
o Should be used to measure the usability of the system →
usability specifications
o Examples:
n Functional requirement: “allow to enter a new member”
o Associated usability requirement:
n Entering a new member must be possible in less than 30 sec.

n Functional requirement: “provide an overview of all members”


o Associated usability requirement:
n the overview of all members must be possible
§ Alphabetically
§ By date of enrollment

n No more than 3 errors should be made when entering information

User Interface Design - UTCN 31


Usability specification
o Specifies how to measure that the usability requirement is
achieved. Specification includes:

1. Measuring Concept: what will be measured


o E.g., quality of task performance, user satisfaction, learnability, etc.

2. Measuring Method: how will it be measured


o e.g., via task scenario, or a questionnaire or an interview
o what result does the measure produce? e.g. time to complete task,
number of error, proportion of users able to complete the task, etc.

3. Criteria for judging


o what is the current level
o worst acceptable level,
o planned target level,
o best possible level

User Interface Design - UTCN 32


Example: usability specification

1. Usability Requirement
Ex. allow backward recoverability

2. Measuring concept
Ex. undo an erroneous programming sequence

3. Measuring method
Ex. task scenario. Result: number of explicit user actions to undo
current program

4. Current level
Ex. not supported

5. Worst level
Ex. as many actions as it took to get the program into the erroneous state

6. Planned level
EX. a maximum of two explicit user actions

7. Best case
Ex. one explicit cancel action

User Interface Design - UTCN 33
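The same kind of specification can be kept as a small, checkable record; the sketch below uses hypothetical field names and shows how a measured value would be judged against the worst/planned/best levels (here, fewer actions is better).

# Illustrative sketch: a usability specification with its judging criteria.

from dataclasses import dataclass

@dataclass
class UsabilitySpec:
    requirement: str          # e.g. "allow backward recoverability"
    measuring_concept: str    # e.g. "undo an erroneous programming sequence"
    measuring_method: str     # e.g. "task scenario: count explicit undo actions"
    worst_acceptable: float
    planned_target: float
    best_possible: float

    def judge(self, measured):
        if measured <= self.best_possible:    return "best case"
        if measured <= self.planned_target:   return "meets planned target"
        if measured <= self.worst_acceptable: return "acceptable"
        return "fails the specification"

spec = UsabilitySpec("allow backward recoverability",
                     "undo an erroneous programming sequence",
                     "task scenario: number of explicit user actions to undo",
                     worst_acceptable=10, planned_target=2, best_possible=1)
print(spec.judge(2))   # meets planned target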


Performance measures - examples
o Time to locate a book at the library
o Time to fill in customer information and place order
o Number of times the Back Button is used, indicating that user cannot find desired
information
o Number of clicks to find the time of a TV show
o Percentage of tasks completed correctly
o Number of calls to support line
o Number of complaints, negative facial expressions, or regressive behaviors
(screaming at monitor, etc.)

User Interface Design - UTCN 34


Performance measures - examples
o Time to complete a task
o Per cent of task completed
o Per cent of task completed per unit time
o Ratio of successes to failure
o Time spent in errors
o Per cent or number of errors
o Per cent or number of competitors better than it
o Number of commands used
o Frequency of help and documentation use
o Per cent of favorable/unfavorable user comments
o Number of repetitions of failed commands
o Number of runs of successes and of failures
o Number of times interface misleads the user
o Number of good and bad features recalled by users
o Number of available commands not invoked
o Number of regressive behaviors
o Number of users preferring the system
o Number of times users need to work around a problem
o Number of times the user is disrupted from a work task
o Number of times user loses control of the system
o Number of times user expresses frustration or satisfaction

User Interface Design - UTCN 35


Possible ways to set measurement levels
Set levels with respect to information on
n An existing system or previous version
n Competitive systems
n Carrying out the task without use of a computer system
n An absolute scale
n Your own prototype
n User’s own earlier performance
n Each component of a system separately
n A successive split of the difference between best and worst values
observed in user tests

User Interface Design - UTCN 36


Issues
o Usually, it is not possible to specify all usability requirements.
Even if it were, there are probably not the resources to evaluate them all.

o If the measures of the usability attributes are too difficult or


time-consuming, they will not be measured in practice.

o Satisfying usability specifications does not guarantee better


usability
e.g. ‘foresee an undo’, but maybe ‘prevent errors’ would have been better

o Usability specifications may already require specific design


information
e.g. to count keystrokes we have to know that a keyboard will be used

User Interface Design - UTCN 37


Finding usability requirements
o Identify the areas of usability which are critical to the success
of the GUI:
Examples
o rate of performance for high-volume tasks,
o learning time for discretionary users,
o error rate for intermittent users (ex. monthly time sheets)

o The key issue is the benefit to the organization:


Examples:
o increased productivity
o reduce overhead
o wider availability of information

User Interface Design - UTCN 38


5. Usability Engineering Lifecycle
1. Know the User

2. Competitive Analysis

3. Set Usability Targets

4. Goal-Oriented Interaction Design

5. Iterative Design:

a. Prototyping

b. Usability Evaluation (Inspection and Testing)

User Interface Design - UTCN 39


5.1 Know the user
o Observe Users in Working Environment
o Site visits, unobtrusive observation. Don't believe their superiors!

o Individual User Characteristics


o Classify users by experience, educational level, age, amount of prior
training, etc.

o Task Analysis
o Users' overall goals, current approach, model of task, prerequisite
information, exceptions to normal work flow

o Functional Analysis
o Functional reason for task: what really needs to be done, and what are
supporting procedures

o Categories of User Experience:


o Experience of computers in general, understanding of the task domain,
expertise in using the specific system

User Interface Design - UTCN 40


5.2 Competitive analysis
o Competitive analysis of competing systems:
o Analyze competing products heuristically or empirically
o Take ideas from other systems

User Interface Design - UTCN 41


5.3 Set usability targets
o Decide in advance on usability metrics and desired level of
measured usability (usability targets)
o Financial impact analysis - estimate savings based on loaded
cost of users, compared to cost of usability effort

User Interface Design - UTCN 42


5.4 Goal-oriented interaction design
o Programming is such a difficult and absorbing task that it
dominates all other considerations, including the concerns of
the user

User Interface Design - UTCN 43


5.5 Iterative design
o Design, Test, Redesign
o Build Prototypes
o Verbal description
o Paper prototype
o Working prototype
o Implementation of final design (of prototype)
o Usability Evaluation:
o Usability Inspection: Inspection of interface design using
heuristics and judgment (no user tests)
o Usability Testing: Empirical testing of interface design with real
users

User Interface Design - UTCN 44


Questions and problems
1. Explain and exemplify how the usability is used in the two
phases of the software development methodology: analysis
and evaluation.
2. Exemplify and explain the usability specification in the analysis
phase of the software development methodology.
3. Exemplify and explain the usability evaluation in the testing
and evaluation phase of the software development
methodology.
4. Describe and exemplify the five attributes of the usability.
5. Describe and exemplify the way of implementing and
enhancing the learnability.
6. Describe and exemplify the way of implementing and
enhancing the efficiency.

User Interface Design - UTCN 45


Questions and problems
7. Exemplify and explain an approach focused on measuring one
attribute of the usability: 1. learnability; 2. efficiency; 3.
memorability; 4. errors; 5. satisfaction. Describe the details of
each experiment.
8. Exemplify the characterization of the generic user by detailing
the target population, cognitive and physical abilities, and
professional level.
9. Exemplify the context of use, considering the classification
given by Preece.
10. Exemplify and explain the following usability paradigms:
access, progression, and context.
11. Exemplify and explain the following usability principles:
structure and simplicity, feedback, and tolerance.
12. Exemplify and explain the following usability principles:
consistent, multimodal, substitution, recoverable, and task
conformance.
User Interface Design - UTCN 46
Questions and problems
13. Exemplify and explain the following usability principles:
visibility of system status, recognition rather than recall,
aesthetic and minimalist design.
14. Define and explain the concept of Usability engineering. How is it
mapped onto the general software development
methodology?
15. Highlight the difference between functional and usability
requirements.
16. Exemplify and explain the concept of Usability specifications.
What is the difference between usability requirements and
usability specifications?

User Interface Design - UTCN 47


Measuring the usability of graphical annotation techniques
Teodor Ştefănuţ, Dorian Gorgan
Technical University of Cluj-Napoca
Str. C. Daicoviciu 15, 400020 Cluj-Napoca
[email protected], [email protected]

ABSTRACT
Graphical annotation on documents gives the user greater creative freedom of
expression. The literature mainly reports graphical annotations made in image
space on documents in various formats. This paper presents experiments on
graphical annotations in the 2D document space, in the context of lessons from
eLearning applications. Comparative usability measurements of the annotation
techniques using the mouse and the graphical pen are presented.

Keywords
Graphical annotation, graphical user interfaces, interaction techniques,
usability, usability measurement, e-Learning applications.

ACM Classification
H.5.2 User Interfaces - Graphical user interfaces (GUI), H.5.3 Group and
Organization Interfaces - Evaluation/methodology, K.3.1 Computer Uses in Education.

INTRODUCTION
Most interactive applications give the user access to the application entities
only through the controls of the graphical user interface. The access follows
the command interaction style, in which the user builds a syntactic form by
selecting or editing its terms. The direct manipulation interaction technique
introduces a more flexible and more natural control over the entities of an
application (e.g. objects, operations). Direct manipulation may take place in
image space (e.g. the control objects of the GUI) or in object space. Direct
manipulation in object space, unlike direct manipulation in image space, is
much more natural, but it requires access to, and expensive operations on, the
graphical model of the application. This paper presents examples of graphical
annotations on 3D objects with applications in e-Learning, and a comparative
usability evaluation study of 2D annotation performed with the mouse and with
the graphical pen.

RELATED WORK
The literature reports evaluations of interaction techniques based on the
graphical pen. The graphical pen combined with a graphics tablet allows very
good control of drawing, as well as specifying the pressure or the tilt of the
pen. Mackenzie et al. demonstrated in [1] the superior speed and precision of
the graphical pen compared with the mouse. User interfaces based on pen
interaction allow greater precision and skill in handwriting and drawing.
Wuthrich [2] and Aliakseyeu et al. [3] identified four types of elementary
actions achievable with the graphical pen: selection/grabbing, positioning with
n degrees of freedom, deformation, and drawing/writing. Hinckley et al. [4]
classified pen-based gestures as simple techniques that do not require an
interpretation, or as techniques that require knowledge from the application
model.

Mizobuchi and Yasumura reported in [5] an experiment that compared two
pen-based selection techniques: tapping and circling. The experiments showed
that circling is less precise and slower than tapping. Circling is, however,
faster when the objects are grouped, but slower than tapping when the objects
are spatially dispersed. Circling takes longer for objects grouped in complex
shapes, whereas the time needed for selection by tapping does not depend on the
complexity of the shape.

In [6] eTrace is used for experimenting with 2D and 3D graphical annotation
techniques on various documents and scenes of 3D objects. In [7] applications
of graphical annotations in medical lessons are presented.

Figure 1. 2D graphical annotation on a document page.
GRAPHICAL ANNOTATIONS
Graphical annotations can be characterised by the space in which the annotation
is made and the space in which the annotated subject is described. The
annotated objects may be defined in 2D space as images (e.g. jpeg, gif, bmp,
etc.), text pages, or document pages in various formats (e.g. doc, pdf, xls,
ppt, etc.). 3D objects are defined in the virtual 3D space. The annotation, in
turn, may be in 2D or 3D space. 2D annotation is made in a plane parallel to
and above the document page, or in a projection plane that can be placed around
3D objects. 3D annotation is made directly on the surface of the 3D objects.
Figure 1 and Figure 2 show examples of 2D and 3D annotations created in
eTrace [8].

Figure 2. 3D annotation on the surface of 3D objects.
Figure 3. 2D annotation for 3D objects.

USABILITY EVALUATION
The usability evaluation was carried out for the 2D graphical annotation
techniques with eLearning applications in mind. Several lessons were developed
in the eTrace environment in order to compare the annotation techniques using
the mouse and the graphical pen. The usability evaluation [9] followed
specifications that refer to:
1. The usability concept – describes the objectives of the experiments, that
is, the usability attributes under study. For example, the efficiency of the
pen annotation technique, or the number of errors when drawing a square with
the mouse;
2. The scenario – describes the user actions and the objects and texts on which
the annotations are made;
3. The measuring method – specifies the parameters to be measured, and how the
values of these parameters are measured and recorded;
4. The evaluation criterion – defines the parameter values for the normal cases
of accepted average values, as well as for the extreme cases, the best and the
worst values.

Figure 4. Individual and group selection of elements from a given set.
(Exercise: "Circle each object in the figure with a contour, using a different
contour colour for each type of object, i.e. objects of the same shape and colour.")

GRAPHICAL ANNOTATION EXPERIMENTS
To test the usability of the graphical annotation techniques in eLearning
applications we created a set of
exercises grouped into two lessons included in the eTrace application. These
exercises were solved by the users first with the mouse and then with the
graphical pen, the results being saved on the server for later processing.

During the tests the following parameters were measured:
- the time in which the user completely performs a certain exercise or a part
of it;
- the precision with which the user accomplishes the task;
- the number of errors that occur while carrying out a certain task.

The exercises making up the tests were carefully selected to be representative
in highlighting the usability attributes. Each experiment is rigorously
described by: the usability concept, the scenario, the measuring method and the
evaluation criterion. The main user interaction techniques that were evaluated
are: selection, drawing and handwriting.

Selection
This is the main action performed within graphical annotations. Through this
type of interaction the user can specify one or more GUI objects (e.g. option
buttons, check boxes, etc.) or application objects (e.g. units or graphical
areas on a map). The type of exercise used to evaluate this interaction
technique is shown in Figure 4 and Figure 5. The objective of this exercise is
to evaluate the skill with which the user can carry out complex operations that
involve the conditional selection of individual or grouped objects. The time
needed to complete the task was measured, together with the accuracy of the
result (e.g. how many figures were circled only partially) and the number of
errors (e.g. how many times the UNDO operation was used, how many marking
mistakes, identical objects marked with different colours, etc.).

Figure 5. Individual and group selection of elements from a given set.
(Exercise: "Enclose in a single contour all the red squares, and only these.
The contour is to be traced for each group of figures separately and
continuously, without interruption and without crossing the edges.")

Drawing
This is the complex annotation made in 2D or 3D space. It may consist of
graphical primitives (e.g. polylines, triangles, squares, circles, etc.) with
various characteristics (e.g. thickness, colour, type, etc.) or of gestures –
free-hand annotations that follow a certain model and have a clearly
established meaning. To evaluate this type of interaction, the type of exercise
shown in Figure 6 was used. This task aimed at evaluating the users' ability to
follow a given graphical template when describing a certain symbol with the
mouse or with the graphical pen. Using this interaction technique, applications
can implement methods of controlling and manipulating objects by interpreting
simple user gestures (e.g. X – close the application, & – save the changes,
etc.). The following were measured and interpreted:
- the time in which the user managed to perform the gesture;
- the number of errors: how many times the user used UNDO before obtaining the
final result;
- the precision with which the user managed to describe the gesture.

Figure 6. Testing the ability to perform gestures. (Exercise: "Draw, inside the
hatched area, the shape given as an example. All the graphical signs, except
the first one, are to be drawn each with one continuous movement.")
Figure 7. Testing the handwriting ability. (Exercise: "Imitate the specified
handwriting as precisely and as quickly as possible.")
Figure 8. Individual selection of elements from a given set.
Handwriting
This is an annotation modality through which labels or remarks can be attached
to certain graphical objects or to texts. To evaluate this type of interaction,
exercises of the type shown in Figure 7 were used. The users' ability to write
comments with the graphical pen or with the mouse was measured through:
- the time in which the user managed to finish the task;
- the number of errors: how many times the user used UNDO before obtaining the
final result;
- the clarity and legibility of the handwritten text.

ANALYSIS OF THE EXPERIMENTAL RESULTS
Tests were carried out with 12 users of medium level in computer operation and
novices in the use of the graphical pen. Let us analyse the experimental
results obtained with respect to the selection, drawing and handwriting
abilities. The graphs plot, for each user, the difference between the mouse
time (Tm) and the graphical pen time (Tp), divided by the sum of the two time
values: (Tm-Tp)/(Tm+Tp). The curve in each graph expresses the polynomial mean
of the trend traced by the plotted values.
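As a quick reading aid for this normalised measure (with made-up numbers, not the measured data): if a task took Tm = 30 s with the mouse and Tp = 20 s with the pen, the plotted value is (30 - 20)/(30 + 20) = 0.2; positive values mean the pen was faster, negative values mean the mouse was faster, and the measure is always bounded between -1 and +1.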

Selection
As can be seen in Figure 8, in the first experiment the selection with the
graphical pen is faster than the one performed with the mouse. The error rate
and the accuracy of the annotations are also better than those obtained with
the mouse. Figure 11 presents the results obtained by the users numbered 6
(large time difference between the graphical pen and the mouse) and 10 (similar
time results); the differences in the precision and appearance of the
annotations are evident. In the case of the second exercise, the selection by
contour of a group of elements from a given set, it turned out that users need
quite a long time to control the movements of the graphical pen, so carrying
out a complex task becomes much more difficult. The graph presented in
Figure 10 shows that no systematic time difference can be identified between
the two interaction devices – mouse and graphical pen. Figure 9 presents the
measurements obtained for two users, 9 and 11, whose annotation times do not
differ much; in contrast, the quality and precision of their annotations differ
significantly.

Figure 9. Annotation examples for the selection by contour of a group of
elements from a given set.
Figure 10. Selection by contour of a group of elements from a given set.
Figure 11. Annotation examples for the individual selection of elements from a
given set.
Figure 12. Annotation time for gestures.
4
Figura 14. Exemple de desenare a gesturilor cu mouse şi
creion grafic după un model de formă dată.

adnotării diferă semnificativ. Utilizatorul 9 este abil în


folosirea mouse, dar mai puţin obişnuit cu creionul grafic.
Utilizatorul 11 foloseşte greu şi imprecis dispozitivul
mouse, dar foarte bine creionul grafic.
Pentru o abilitate asemănătoare de utilizare mouse şi
creion grafic, diferenţele de timp sunt determinate, în
principal de complexitatea grupului de obiecte.
Desenarea
Realizarea gesturilor predefinite cu ajutorul adnotărilor
poate oferi dezvoltatorilor de software o nouă şi variată
gamă de interacţiuni cu utilizatorul, prin introducerea unor
simboluri distincte cu semnificaţie precisă. Conform
rezultatelor obţinute nu există diferenţe sistematice între
timpul de adnotare cu cele două dispozitive (Figura 12). În
schimb gestica prin intermediul creionului grafic este mult
mai precisă decât cea prin folosirea mouse, în condiţiile în
care valorile de timp sunt foarte apropiate (Figura 14).
Numărul erorilor a fost de asemenea semnificativ mai
mare în cazul mouse decât în cazul creionului grafic.
Scrierea de mână
Rezultatele prezentate în Figura 13 sunt evident în
favoarea utilizării creionului grafic faţă de utilizarea
dispozitivului mouse.
CONCLUZII ŞI RECOMANDĂRI
Lucrarea prezintă experimente şi măsurători asupra Figura 13. Rezultatele experimentale ale adnotării prin
utilizabilităţii tehnicilor de adnotare 2D în contextul unei scriere de mână.
lecţii. Pe lângă aspectul tehnic, contextul lecţiei implică
unele constrângeri şi solicitări suplimentare, cum ar fi:
controlul adnotării şi al documentelor sau obiectelor din sarcinii utilizând mouse şi a clarităţii scăzute a scrisului în
scenă, interpretarea întrebării şi conceperea răspunsului acest caz.
grafic, trasarea soluţiei grafice a răspunsului, analiza şi Adnotarea grafică introduce noi forme de exprimare şi
verificarea de către utilizator a calităţii răspunsului etc. astfel noi tipuri de întrebări şi răspunsuri grafice în
Unul din obiectivele principale ale experimentelor a fost aplicaţiile eLearning. Prin aceste tehnici de interacţiune se
verificarea rezultatelor raportate în literatura de pot aborda răspunsuri care solicită creativitatea, imaginaţia
specialitate, în noul context al documentelor dintr-o lecţie. şi abilităţile artistice ale participanţilor la test.
Un al doilea obiectiv a fost extinderea experimentelor la
adnotările 2D şi 3D pentru obiecte 3D, însă nefinalizate Direcţiile viitoare de cercetare cuprind evaluări de
încă şi neraportate în această lucrare. utilizabilitate ale tehnicilor de adnotare 3D, utilizarea
adnotărilor în contextul aplicaţiilor eLearning în diferite
În realizarea aplicaţiilor, în special a celor care se domenii, precum şi metode şi tehnici de măsurare şi
adresează unui număr foarte mare de utilizatori, trebuie să evaluare automată a parametrilor de adnotare.
se ţină cont de dispozitivele pe care aceştia le pot folosi
pentru a interacţiona cu aplicaţia. Tehnicile de adnotare MULŢUMIRI
trebuie concepute şi proiectate în raport cu dispozitivele Activitatea de cercetare prezentată în această lucrare a fost
de interacţiune disponibile. Utilizarea gesticii în astfel de realizată prin Proiectul I-Trace, finanţat de către
aplicaţii trebuie făcută cu atenţie datorită acurateţei destul Comunitatea Europeană prin Contractul 223434-CP-I-
de scăzute a desenării semnelor cu ajutorul mouse. 2005-IT-Minerva-M.
De asemenea, utilizarea scrierii de mână trebuie făcută cu
precauţie din cauza timpului mare necesar realizării
5
REFERENCES
1. Mackenzie, I. S., Sellen, A., and Buxton, W. A. S.: A comparison of input
devices in elemental pointing and dragging tasks. In Proceedings of the
Conference on Human Factors in Computing Systems: Reaching through Technology
(CHI '91, New Orleans, LA, Apr. 27 – May 2), S. P. Robertson, G. M. Olson, and
J. S. Olson, Eds. ACM Press, New York, NY, 1991, pp. 161–166.
2. Wuthrich, C. A.: An Analysis and a Model of 3D Interaction Methods and
Devices for Virtual Reality. Proceedings of the Eurographics Workshop,
pp. 18–29, 1999.
3. Aliakseyeu, D., Martens, J.-B., Subramanian, S., Rauterberg, M.: Interaction
Techniques for Navigation through and Manipulation of 2D and 3D Data. In Proc.
of the Eighth Eurographics Workshop on Virtual Environments, Barcelona,
pp. 179–188, May 2002.
4. Hinckley, K., Jacob, R., Ware, C.: Input/Output Devices and Interaction
Techniques. CRC Computer Science and Engineering Handbook, CRC Press,
http://www.cs.tufts.edu/~jacob/papers/crc2.pdf
5. Mizobuchi, S., Yasumura, M.: Tapping vs. Circling Selections on Pen-based
Devices: Evidence for Different Performance-Shaping Factors. CHI 2004,
April 24–29, 2004, Vienna, Austria, pp. 607–614.
6. Gorgan, D., Baciu, B., Pop, T., Ştefănuţ, T., Boitor, R.: Pen Based
Annotation in e-Learning Applications. Research Report in the I-Trace Project.
http://www.itrace.ing.unict.it/i-trace
7. Gorgan, D., Ştefănuţ, T., Găvrea, B.: Pen Based Graphical Annotation in
Medical Education. CBMS 2007 Conference, Maribor, Slovenia, 20–22 June 2007.
8. eTrace – an e-Learning system based on graphical annotation,
http://dataserver.mediogrid.utcluj.ro/adnotare
9. Usability Guide, http://www.usability.gov/
Usability evaluation
of 3D graphical annotation techniques
Graphical annotations
} representation space
} 2D/2D
} 2D/3D
} 3D/3D
} context and purpose
} presentation of materials
} communication between users
} analysis of information
} evaluation of knowledge
} evaluation structure
} answer description

[Figure: examples a), b), c) of graphical annotations]
2
Difficulties in tracing 3D graphical
annotations
} continuous annotation around convex objects
} annotation of concavities

[Figure: panels a)–d) illustrating the tracing difficulties]
3
Difficulties in tracing 3D graphical
annotations

[Figure]
4
Algorithms for tracing 3D graphical
annotations

5
Algorithms for tracing 3D graphical
annotations

6
Usability evaluation of 3D graphical
annotations
} definition
} usability is the degree to which a software product can be
understood, learned, used and found attractive by the users, when
it is used under clearly specified conditions.
(ISO DIS 9126-2:2001)

} approaches
} heuristic evaluation (performed by experts)
} Jacob Nielsen's 10 heuristics
} recommendations for developing user interfaces that implement
user interaction techniques based on 3D graphical annotation
} experimental evaluation of graphical annotations (performed with the
help of users)
} comparative evaluation of the input devices: mouse and graphical pen
} evaluation of user interaction through 3D graphical annotation

7
Experimental evaluation of 3D
annotations
} evaluated aspects
} effectiveness
} efficiency
} ease of learning
} degree of naturalness
} user satisfaction

} evaluation methodology
} developing and carrying out dedicated experiments
} applying a questionnaire
} analysing and interpreting the results

8
Experimental evaluation of the
input devices
} tests carried out through 2D annotation
} 17 exercises in the form of a lesson in the eTrace e-Learning application
} test group of 20 users
} men and women, 22-52 years old, 8 experienced users
} measurements:
} execution time, appearance, number of errors

9
Experimental evaluation of the
input devices
} Examples of exercises

10
Experimental evaluation of the
input devices
} Examples of solutions

11
Experimental evaluation of the
input devices
} Usability measurements for individual selection by
contour.
Usability ← 1 / (Time x Number_of_Errors)

} Conclusion: for individual selection by contour, the usability of annotating with
the graphical pen is higher than that obtained with the mouse.

12
Experimental evaluation of the
input devices
} Usability measurements for drawing gestures
Usability ← Aspect / (Time x Number_of_Errors)
} Conclusion: for drawing gestures, taking the appearance into account, the
usability of annotating with the graphical pen is higher than that
obtained with the mouse.

13
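As a rough illustration of how the two metrics on these slides behave (sample numbers only, not the experimental data), the sketch below compares a pen trial with a mouse trial; the error count is clamped to at least 1 only to avoid division by zero, which is an assumption, not part of the original formulas.

# Illustrative computation of the two usability metrics from the slides.

def usability_selection(time_s, errors):
    return 1.0 / (time_s * max(errors, 1))          # Usability = 1 / (Time x Errors)

def usability_gesture(aspect, time_s, errors):
    return aspect / (time_s * max(errors, 1))       # Usability = Aspect / (Time x Errors)

# e.g. pen: 18 s, 1 error; mouse: 25 s, 3 errors (hypothetical values)
print(usability_selection(18, 1) > usability_selection(25, 3))    # True: pen scores higher
print(usability_gesture(0.9, 20, 1), usability_gesture(0.6, 21, 4))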
Experimental evaluation of 3D
annotations
} test group of 50 users
} men and women, 19-21 years old
} medium to high experience in using the computer
} minimal to medium experience in 3D visualization
} minimal experience in 3D modelling
} 3 dedicated experiments for interaction through 3D graphical annotation
} a questionnaire of 14 questions for identifying
} the user's characteristics
} the degree of satisfaction
} the perception of task difficulty
} total test time per user: 35 - 40 minutes

14
Experimental evaluation of
3D annotations: Experiment I
} hypothesis
} user interaction techniques based on three-dimensional graphical
annotation are easy to learn
} two groups of users
} Group 1 – presentation and hands-on trial session with 3D graphical annotations
before carrying out the experiment (5 minutes)
} Group 2 – carrying out the experiment without preparation and learning
to use the techniques along the way
} measurements
} number of tasks completed fully and correctly
} task completion time
} number of errors recorded (presses of the "undo" button)
} correctness of the drawn annotation

15
Experimental evaluation of
3D annotations: Experiment I
} user tasks

16
Experimental evaluation of
3D annotations: Experiment I
} usability = quality / time

} conclusions
} users in Group 1 performed 8.95% better, after 5 minutes of
supervised familiarisation
} based on the questionnaire, 96% of the users consider that
interaction through 3D graphical annotation is easy to learn

17
Experimental evaluation of
3D annotations: Experiment II
} hypothesis
} the visibility of three-dimensional annotations placed on arbitrary 3D surfaces
depends on the complexity of the surface in terms of the number and type of concavities
} measurements
} number of annotations noticed by the users
} task completion time

18
Experimental evaluation of
3D annotations: Experiment II
} usability = quality / time
} identification of graphical annotations on surfaces of different complexity

} identification of a variable number of annotations on a spherical surface

19
Experimental evaluation of
3D annotations: Experiment II
} 84.31% of the participants rated the annotations, through the
questionnaire, as easy or very easy to identify

} conclusions
} the identification of graphical annotations depends on the characteristics
of the surfaces
} user interfaces must provide alternative methods for locating
the annotations within a scene of objects
} the identification of individual annotations depends on their shape
and on the landmarks existing on the surface

20
Experimental evaluation of
3D annotations: Experiment III
} hypothesis
} interaction techniques based on three-dimensional graphical annotations
have a high degree of naturalness
} two groups of users
} Group A – performs the tasks first on a real object
} Group B – performs the tasks first on a virtual object
} measurements
} number of tasks completed fully and correctly
} task completion time in the real and in the virtual environment
} number of errors recorded
¨ presses of the "undo" button in the virtual environment
¨ incorrect strokes in the real environment
} correctness of the drawn annotation

21
Experimental evaluation of
3D annotations: Experiment III

22
Experimental evaluation of
3D annotations: Experiment III

23
Experimental evaluation of
3D annotations: Experiment III
} usability for each user in each working environment

E – errors normalised against the maximum value recorded over all users,
regardless of the working environment
} real environment: drawing errors = lines that do not connect two consecutive points
} virtual environment: drawing errors = number of calls to the undo function
C – quality of the drawn stroke, expressed as a percentage
} real environment: percentage of segments with a deviation smaller than 1 mm
} virtual environment: percentage of annotation points placed at a distance smaller
than 1 mm from the example annotation
T – time normalised against the maximum individual duration recorded over all
users, regardless of the working environment (see the computation sketch below)
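A small sketch of how the per-user usability score could be derived from the three measures above. The slide does not spell out the exact combination, so the formula used here (U = C / (E × T), by analogy with the usability = quality/time definition used in the earlier experiments) and all the numbers are assumptions for illustration only.

# Per-user usability in each environment, from normalised errors (E),
# drawing quality in percent (C) and normalised time (T).
# Assumed combination: U = C / (E * T); the slide leaves the formula implicit.

raw = [
    # (user, environment, errors, quality_percent, time_seconds) -- invented data
    ("u01", "real",    2, 92.0, 48.0),
    ("u01", "virtual", 4, 85.0, 71.0),
    ("u02", "real",    1, 95.0, 40.0),
    ("u02", "virtual", 3, 88.0, 66.0),
]

max_errors = max(r[2] for r in raw)          # normalisation bases over all users,
max_time   = max(r[4] for r in raw)          # regardless of environment

for user, env, errors, quality, time_s in raw:
    E = max(errors, 1) / max_errors          # avoid a zero error count
    C = quality / 100.0
    T = time_s / max_time
    U = C / (E * T)
    print(f"{user} ({env}): E={E:.2f} C={C:.2f} T={T:.2f} -> usability={U:.2f}")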

24
Experimental evaluation of
3D annotations: Experiment III
} average values obtained in carrying out the tasks

} observations
} performance in the real environment is better at first contact
} performance in the real environment differs very little between the two
groups of users
} the differences between the groups in the virtual environment are due
mostly to the time needed to carry out the tasks

25
Experimental evaluation of
3D annotations: Experiment III
} conclusions
} the main difficulties encountered by users in the virtual environment are
caused by the mechanisms for visualising the scene of objects, not by
the interaction through graphical annotation
} interaction techniques based on 3D graphical annotation have a high degree
of naturalness
} questionnaire item: How do you rate the difficulty of drawing on a real
object compared with drawing on a virtual object?

26
Experimental evaluation of
3D annotations: The questionnaire

27
User Experience

User Interface Design


User Experience definition
o User experience includes all aspects of the end user’s
interaction with a product or service
o Usability is an important aspect of the overall user
experience, but it is not the only one. If only usability is
considered, many other aspects of the experience are
overlooked and users' needs can't be fully met
o Unlike usability, user experience goes beyond the user
interface. There are other critical aspects of the product
or service that need to be explored in order to get a full
picture of the product or service

User Interface Design - UTCN 2


User Experience definition (cont)
o During the evaluation process, we go beyond simply
testing the usability of the design to understand the
entire user experience
o It’s not only about asking users if they can efficiently
complete a task; we want to know whether our design
would add value to their lives, provide the functionality
they need, and constitute a more delightful experience
than what they’re used to.

User Interface Design - UTCN 3


UX characteristics
1. Functionality
Does the product or service meet the end user’s needs? Can
a user use it to do what they need to do? Are the tasks and
information a user would want made available?
2. Findability
Are users able to find what they’re looking for? Can users
find the product itself using external search engines? Are
navigational elements clear and informative?
3. Trust
Does the product, service, and/or company seem credible?
Is it easy to contact a real person or find support
information? Does the content seem genuine and up to
date?

User Interface Design - UTCN 4


UX characteristics
4. Value
Is the product or service desirable to the end user? Can a
user easily describe its value? Does it improve customer
satisfaction?
5. Accessibility
Can it be accessed via all channels and devices? Are all
potential end users, including those with disabilities, being
considered?
6. Delight
Are users’ expectations not only met, but exceeded? Are
there differentiators from other similar experiences, or is
anything being provided that is pleasantly unexpected?
Does it bring a user joy?

User Interface Design - UTCN 5


ISO Definition - Usability vs UX
o ISO Definition
o Usability is concerned with the “effectiveness, efficiency
and satisfaction with which specified users achieve
specified goals in particular environments” (ISO 9241-
11)
o User Experience is concerned with “all aspects of the
user’s experience when interacting with the product,
service, environment or facility” (ISO 9241-210)

Reference:
The Difference (And Relationship) Between Usability And User Experience, By Justin Mifsud,
https://usabilitygeek.com/the-difference-between-usability-and-user-experience/.

User Interface Design - UTCN 6


Aim - Usability vs UX
o Aim on a website
o The aim of usability is to make that web site easy to use
o The aim of user experience is to make the user happy
before, during and after using that web site
o Thus, usability relates to the ease with which users can
achieve their goals while interacting with a web site,
while user experience is concerned with the way users
perceive their interaction with that web site

User Interface Design - UTCN 7


Process - Usability vs UX
o Defined as a Process
o “User experience (UX) design is the process of creating
products that provide meaningful and relevant
experiences to users. This involves the design of the
entire process of acquiring and integrating the product,
including aspects of branding, design, usability, and
function.”

User Interface Design - UTCN 8


Question - Usability vs UX
o Defined as a Question
o Usability can be modeled as the question “Can the user
accomplish their goal?” whilst user experience can be
phrased as “Did the user have as delightful an
experience as possible?”

User Interface Design - UTCN 9


Metaphor - Usability vs UX
o Defined as a Metaphor
o Metaphorical comparison: freeway (usability) vs. a
twisting mountain road (user experience)
n In essence, this metaphorical representation of these two
terms focuses on defining something that is usable as
functional, simple and requires less mental effort to use.
Thus, a freeway is usable since it has no oncoming traffic,
enables you to get from point A to point B in a fast manner
and has consistent signage, hence requiring little
learnability. In terms of usability, a freeway is highly
usable but it is boring when assessed in terms of user
experience.
n In contrast, something that focuses on user experience is
depicted as highly emotional. Thus, a twisting mountain
road is less usable but, because of its scenery, the smell of
nature and the excitement of the climb, it conveys a
pleasant user experience.

User Interface Design - UTCN 10


Resources - Usability vs UX
o Resources Required
o Usability involves those employees who influence the
user interface design of a web site whilst user experience
requires the collective and seamless effort of employees
from various departments including engineering,
marketing, graphical and industrial design and interface
design

User Interface Design - UTCN 11


Impact - Usability vs UX
o Impact
o Although user experience requires more effort to do well,
its results have a better impact.
o When done properly, user experience effectively
enhances the relationship between the user and the
brand. This is because “true user experience goes far
beyond giving customers what they say they want, or
providing checklist features”

User Interface Design - UTCN 12


Effect on UI - Usability vs UX
o Effect on User Interface
o A usable user interface is one which is typically intuitive,
simple or extremely learnable.
o A user interface whose aim is to create a positive user
experience is one which is pleasing to the user. This does
not mean that when the focus is on user experience, the
user interface is not usable.
o To the contrary, user experience professionals typically
hand over their designs to usability professionals so that
they can validate them

User Interface Design - UTCN 13


Usability vs User Experience
o Usability is a narrower concept than user experience
since it only focuses on goal achievement when using a
web site.
o By contrast, user experience is a “consequence of the
presentation, functionality, system performance,
interactive behaviour, and assistive capabilities of the
interactive system”.
o User experience includes aspects such as human factors,
design, ergonomics, HCI, accessibility, marketing as well
as usability.

User Interface Design - UTCN 14


Usability vs User Experience
o User experience includes utility, usability, desirability
and brand experience

User Interface Design - UTCN 15


Development Methodology Phases

User Interface Design


Outline

o User requirements
o Task description and analysis
o Prototyping

User Interface Design - UTCN 2


User Requirements
o The importance of requirements
o Different types of requirements
o Data gathering

User Interface Design - UTCN 3


Aims, approaches, outcome
o Aims:
1. Understand as much as possible about users, task, context
2. Produce a stable set of requirements

o Approaches
n Data gathering activities
n Data analysis activities
n Expression as “requirements”
n All of this is iterative

o Outcome:
n Requirements definition

User Interface Design - UTCN 4


Iterative requirements definition
o Main questions:
1. What do users want?
2. What do users ‘need’?

o Requirements need clarification, refinement, completion,


re-scoping

o Input: requirements document

o Output: stable requirements

User Interface Design - UTCN 5


Types of requirements
o Functional:
o What the system should do
o Historically the main focus of requirements activities
o Non-functional:
o memory size, response time, software and hardware platform,
programming language, development tools
o Data:
o What kinds of data need to be stored?
o How will they be stored (e.g. database)?
o Environment or context of use:
o physical:
dusty, noisy, vibration, light, heat, humidity, etc. (e.g. ATM)
o social:
sharing of files, of displays, in paper, across great distances, work
individually, privacy for clients
o organisational:
hierarchy, IT departments, user support, communications structure
and infrastructure, availability of training

User Interface Design - UTCN 6


Types of requirements
o Users:
o Characteristics:
ability, background, attitude to computers
o System use:
novice, expert, casual, frequent
o Novice:
step-by-step (prompted), constrained, clear information
o Expert:
flexibility, access/power
o Frequent use:
short cuts
o Casual/infrequent:
clear instructions, e.g. menu paths
o Usability:
o learnability, flexibility, attitude
o learnability, efficiency, memorability, errors, satisfaction

User Interface Design - UTCN 7


Data gathering techniques (DGT)
o Questionnaires
o Interviews
o Workshops or focus groups
o Naturalistic observation
o Studying documentation

User Interface Design - UTCN 8


DGT - Questionnaires
o A series of questions designed to draw out specific information
o Questions may require different types of answers:
1. simple YES/NO
2. choice of pre-supplied answers
3. comment

o Often used in conjunction with other techniques


o Can give quantitative or qualitative data
o Good for answering specific questions from a large, dispersed
group of people

User Interface Design - UTCN 9


DGT - Interviews
o Forum for talking to people
o Structured, unstructured or semi-structured
o Scenarios of use and prototypes can be used as props in interviews
o Good for exploring issues
o Time consuming and may be difficult to visit everyone

User Interface Design - UTCN 10


DGT - Workshops or focus groups
o Group interviews
o Brainstorming
o Good at gaining a consensus view and/or highlighting areas of
conflict

User Interface Design - UTCN 11


DGT - Naturalistic observation
o Spend time with stakeholders in their day-to-day tasks,
observing work as it happens
o Gain insights into the tasks of the stakeholders
o Good for understanding the nature and context of the tasks
o Requires time and commitment from a member of the design
team, and it can result in a huge amount of data
o Ethnography is one form
scientific description of individual culture

User Interface Design - UTCN 12


DGT - Studying documentation
o Procedures and rules are often written down in manuals
Document flow inside an organization, internal procedures, operational rules

o Good source of data about the steps involved in an activity,


and any regulations governing a task
o Not to be used in isolation
o Good for understanding legislation, and getting background
information
o No stakeholder time, which is a limiting factor on the other
techniques

User Interface Design - UTCN 13


Appropriate DG techniques
o Data gathering techniques differ in two ways:
1. Amount of time, level of detail and risk associated with the
findings
2. Knowledge the analyst requires

o Select the DG techniques by the kind of task to be


studied:
o Sequential steps or overlapping series of subtasks
o High or low, complex or simple information
o Task for an amateur or a skilled practitioner

User Interface Design - UTCN 14


Data gathering - issues
o Identifying and involving stakeholders:
users, managers, developers, customer representatives, shareholders

o Involving stakeholders:
workshops, interviews, workplace studies, include stakeholders into the development
team

o Need real users, not managers:


traditionally a problem in software engineering

o Requirements management:
version control, ownership

o Communication between parties:


o within development team
o with customer, user
o between users: different parts of an organisation use different terminology

o Domain knowledge distributed and implicit:


o difficult to understand, terminology

o Availability of key people

User Interface Design - UTCN 15


Data interpretation and analysis
o Start soon after data gathering session

o Initial interpretation before deeper analysis

o Different approaches emphasize different elements


e.g. Class diagrams for object-oriented systems,
Entity-relationship diagrams for data intensive systems
Use cases
Scenarios
Low fidelity prototyping
Sketches

User Interface Design - UTCN 16


Task Description and Analysis
o Task descriptions:
o Scenarios
o Use Cases
o Essential use cases
o Task analysis: HTA

User Interface Design - UTCN 17


Task descriptions
o Scenarios
o an informal narrative story, simple, natural, personal, not
generalisable

o Use cases
o assume interaction with a system
o assume detailed understanding of the interaction

o Essential use cases


o abstract away from the details
o does not have the same assumptions as use cases

User Interface Design - UTCN 18


Scenario
Scenario for shared calendar:

“The user types in all the names of the meeting participants together with some
constraints such as the length of the meeting, roughly when the meeting needs
to take place, and possibly where it needs to take place.

The system then checks against the individuals’ calendars and the central
departmental calendar and presents the user with a series of dates on which
everyone is free all at the same time. Then the meeting could be confirmed and
written into people’s calendars. Some people, though, will want to be asked
before the calendar entry is made. Perhaps the system could email them
automatically and ask that it be confirmed before it is written in.”

User Interface Design - UTCN 19


Use case
Use case for shared calendar:
1. The user chooses the option to arrange a meeting.

2. The system prompts user for the names of attendees.


3. The user types in a list of names.
4. The system checks that the list is valid.
5. The system prompts the user for meeting constraints.

6. The user types in meeting constraints.


7. The system searches the calendars for a date that satisfies the constraints.
8. The system displays a list of potential dates.
9. The user chooses one of the dates.
10. The system writes the meeting into the calendar.
11. The system emails all the meeting participants informing them of the
appointment.

User Interface Design - UTCN 20


Alternative courses
o Some alternative courses for shared calendar:

5. If the list of people is invalid,

5.1 The system displays an error message.

5.2 The system returns to step 2.

8. If no potential dates are found,

8.1 The system displays a suitable message.

8.2 The system returns to step 5.

User Interface Design - UTCN 21


Use case diagram
o Example use case diagram for shared calendar

(Diagram: actors "Administrator" and "Departmental member" linked to the use cases
"Arrange a meeting", "Retrieve contact details" and "Update calendar entry".)

User Interface Design - UTCN 22


Essential use case
o Example essential use case for shared calendar:
arrangeMeeting

USER INTENTION                                 SYSTEM RESPONSIBILITY

arrange a meeting
                                               request meeting attendees and constraints
identify meeting attendees and constraints
                                               search calendars for suitable dates
                                               suggest potential dates
choose preferred date
                                               book meeting

User Interface Design - UTCN 23


Task analysis
o Task descriptions are often used to envision new systems
or devices

o Task analysis is used mainly to investigate an existing


situation

o Do not focus on superficial activities and highlight:


o what are people trying to achieve
o why are they trying to achieve it
o how are they going about it

o The most popular technique:


Hierarchical Task Analysis (HTA)

User Interface Design - UTCN 24


Hierarchical Task Analysis (HTA)
o Breaks down a task into subtasks, then sub-sub-tasks and so
on. These are grouped as plans which specify how the tasks
might be performed in practice

o HTA focuses on physical and observable actions, and includes


looking at actions not related to software or an interaction
device

o Start with a user goal which is examined and the main tasks
for achieving it are identified

o Tasks are sub-divided into sub-tasks

User Interface Design - UTCN 25


HTA - example
In order to borrow a book from the library
1. go to the library

2. find the required book


2.1 access library catalogue
2.2 access the search screen
2.3 enter search criteria
2.4 identify required book
2.5 note location

3. go to correct shelf and retrieve book

4. take book to checkout counter

plan 0: do 1-3-4. If the book isn't on the expected shelf, do 2-3-4.

plan 2: do 2.1-2.4-2.5. If the book is not identified from the information available,
do 2.2-2.3-2.4-2.5.

User Interface Design - UTCN 26


HTA – graphical representation

0. Borrow a book from the library
   plan 0: do 1-3-4. If the book isn't on the expected shelf, do 2-3-4.
   1. go to the library
   2. find the required book
      plan 2: do 2.1-2.4-2.5. If the book is not identified from the information
      available, do 2.2-2.3-2.4-2.5.
      2.1 access catalogue
      2.2 access search screen
      2.3 enter search criteria
      2.4 identify required book
      2.5 note location
   3. retrieve book from shelf
   4. take book to counter
(an encoding sketch follows below)
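An HTA such as the one above can also be captured as a small data structure when the analysis has to feed later design steps. The sketch below is only an illustration; the Task class and its fields are not part of the slides.

# Illustrative encoding of the HTA above as a nested task tree with plans.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Task:
    tid: str                      # hierarchical identifier, e.g. "2.3"
    name: str
    plan: str = ""                # plan describing how subtasks are combined
    subtasks: List["Task"] = field(default_factory=list)

    def show(self, indent: int = 0) -> None:
        pad = "  " * indent
        print(f"{pad}{self.tid} {self.name}")
        if self.plan:
            print(f"{pad}  plan: {self.plan}")
        for st in self.subtasks:
            st.show(indent + 1)

borrow_book = Task("0", "borrow a book from the library",
    plan="do 1-3-4; if the book isn't on the expected shelf, do 2-3-4",
    subtasks=[
        Task("1", "go to the library"),
        Task("2", "find the required book",
             plan="do 2.1-2.4-2.5; if not identified, do 2.2-2.3-2.4-2.5",
             subtasks=[
                 Task("2.1", "access library catalogue"),
                 Task("2.2", "access the search screen"),
                 Task("2.3", "enter search criteria"),
                 Task("2.4", "identify required book"),
                 Task("2.5", "note location"),
             ]),
        Task("3", "go to correct shelf and retrieve book"),
        Task("4", "take book to checkout counter"),
    ])

borrow_book.show()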

User Interface Design - UTCN 27


Prototyping
o Prototype
definition
purpose
subjects

o Types of prototyping
o Methods for prototyping

User Interface Design - UTCN 28


Prototype definition
o Prototype is a small-scale model
o Examples in interaction design:
o screen sketches
o storyboard, i.e. a cartoon-like series of scenes
o powerpoint slide show
o video simulating the use of a system
o cardboard mock-up
o piece of software with limited functionality written in the target
language or in another language

User Interface Design - UTCN 29


Prototype purpose
o Solution domain exploration approach
o Answer questions, and support designers in choosing between
alternatives
o Test out ideas for yourself
o Support evaluation and feedback
n central approaches in interaction design
o Encourage reflection
o Communication approach
o Stakeholders can see, hold, interact with a prototype more easily than a
document or a drawing
o Identify the expectations of the customer
o Team members can communicate effectively
o Identify usability problems and suggestions for improvement
o Provide feedback on the validity of the task model, user model and
style guide
o Reduced risk of unsuitable UI

User Interface Design - UTCN 30


Prototyping subjects
o Technical issues

o Work flow, task design

o Screen layouts and information display

o Difficult, controversial, critical areas

User Interface Design - UTCN 31


Types of prototypes
o Evolutionary
the prototype eventually becomes the product
o Revolutionary (throwaway)
the prototype is used to get the specifications right, then discarded

o Low fidelity
just rough sketch - not close to final
o High fidelity
resembles final product

o Horizontal prototype
broad but only top-level
o Vertical prototype
deep, but only some functions

User Interface Design - UTCN 32


Prototype type - evolutionary prototype
o Actual system evolves from a very limited initial version
o Requires other tools
n the ability to construct production-quality systems
n flexibility

Requirements Definition → Build Prototype → Evaluate Prototype (and iterate)

User Interface Design - UTCN 33


Prototype type - throwaway prototype
n Throwaway prototype
o Inexpensive—in materials cost, people time, and schedule time
o No risk of being mistaken for the final product

o Simple and fast to repeat as lessons are learned

Preliminary requirements → Build Prototype → Evaluate Prototype → Adequate?
(No: build and evaluate again; Yes: Final requirements)

User Interface Design - UTCN 34


Prototype type - incremental prototype
o Overall design
o System partitioned into independent components

Identify components → Prototype component → System complete?
(No: prototype the next component; Yes: done)

User Interface Design - UTCN 35


Horizontal prototype

User Interface Design - UTCN 36


Vertical prototype

User Interface Design - UTCN 37


Low-fidelity prototyping
o Uses a medium which is unlike the final medium
e.g. paper, cardboard

o Is quick, cheap and easily changed

o Examples:
Sketches of screens, task sequences, etc
Card notes
Storyboards
‘Wizard-of-Oz’

User Interface Design - UTCN 38


Low-fidelity prototyping

User Interface Design - UTCN 39


Sketches
o Drawing on the paper
o Interface objects
n Symbol
n Location
n Shape
n Dimension
o Interaction metaphors
o Graphical aspect

o Metaphor (symbolical presentation of a real case):


1. Visual presentation +
2. Scenario +
3. Sequence of user actions +
4. Interaction device

User Interface Design - UTCN 40


Prototype type – paper prototype
(Paper prototype sketch with labelled regions: menu bar, scroll bar, opening
contents area, secondary menu.)
User Interface Design - UTCN 41


Storyboards
o It is a series of sketches showing how a user might
progress through a task using the device

o Often used with scenarios


bring more details
support role playing

o Used early in design

User Interface Design - UTCN 42


Storyboards - example
Prof. James Landay CS 160, HCI course, Univ. of Berkeley

User Interface Design - UTCN 43


Prof. James Landay
CS 160, HCI course,
Univ. of Berkeley

User Interface Design - UTCN 44


Card notes
o Index cards

o Each card represents one screen

o Usually in website development

User Interface Design - UTCN 45


‘Wizard-of-Oz’ prototyping
o The user believes they are interacting with a computer

o The responses are actually produced by the developer, not by the system

o Usually done early in design to understand users’ expectations

‘System’ (developer)                 User
>Select the user name
                                     >Name: ‘Maria’
>Specify the age
                                     >Years: ’25’
>

User Interface Design - UTCN 46


High-fidelity prototyping
o Prototype looks more like the final system than a low-fidelity version

o Uses techniques, styles, languages, graphics, information organization,


and navigation that you would expect to be in the final product

o It is not at all a full system


o All prototypes usually involve some form of compromise
1. ‘horizontal’: provide a wide range of functions, but with little detail
2. ‘vertical’: provide a lot of detail for only a few functions

o In software-based prototypes the compromises show up as slow responses, sketched icons, limited
functionality, etc.

o Final product needs engineering along a development methodology

o Development environments
E.g. Flash, Visual Basic, Visual C++, .Net, HTML tools, etc.

User Interface Design - UTCN 47


High-fidelity prototyping

User Interface Design - UTCN 48


Method for prototyping
1. Define prototype objectives
o define the objectives of each prototype iteration

2. Choose the prototyping tool


o flipchart, whiteboard and pens
o presentation package
o high level programming language
o specialized prototyping package (e.g. Flash)

3. Build prototype
4. Investigate the prototype
o For each problem/change make a note:
n what is the problem
n what solution is proposed
n what decision is made (try in current iteration, in next iteration, ..)
n what is the rationale for the decision.

User Interface Design - UTCN 49


Questions and problems
1. Explain the difference between user requirements and users'
needs in the interactive application development
methodology.
2. Classify and exemplify the types of requirements.
3. Detail and exemplify the user requirements and the usability
requirements.
4. Explain and exemplify the data gathering technique based on
questionnaires. Highlight the main issues.
5. Explain and exemplify the data gathering technique based on
interview. Highlight the main issues.
6. Explain and exemplify the data gathering technique based on
workshops or focus groups. Highlight the main issues.
7. Explain and exemplify the data gathering technique based on
naturalistic observations. Highlight the main issues.

User Interface Design - UTCN 50


Questions and problems
8. Explain and exemplify the data gathering technique based on
studying documentation. Highlight the main issues.
9. Give an example of data gathering based on all these five
techniques.
10. What is the main difference between task description and task
analysis? Why is task analysis alone not enough in the
development of interactive applications?
11. Why are scenarios used in the development of
interactive applications even though they are not a very rigorous
technical approach?
12. What are the main goals of the use case diagrams?

13. Define and exemplify shortly the concept of prototype.

14. Exemplify and explain the purpose of the prototyping as


solution domain exploration approach.

User Interface Design - UTCN 51


Questions and problems
15. Exemplify and explain the purpose of the prototyping as
communication approach.
16. Exemplify and explain the purpose of the prototyping as risk
identification and reduction.
17. Exemplify and explain the purpose of the prototyping as
correctness and usability of the tasks.
18. Exemplify and explain the evolutionary prototyping approach
through the development of a graphics editor.
19. Exemplify and explain the revolutionary prototyping approach
through the development of a graphics editor.
20. Exemplify and explain, by comparison, the horizontal and
vertical prototyping approaches through the development of a
graphics editor.

User Interface Design - UTCN 52


Questions and problems
21. Explain how incremental prototyping can be used in the
analysis phase and in the development phase of the interactive
application development methodology.

User Interface Design - UTCN 53


Formative and Summative Evaluation

User Interface Design


Contents
o Objectives
o Formative and Summative Evaluation
o Types of Evaluation
o Formative Evaluation Techniques
o Summative Evaluation Techniques

User Interface Design - UTCN 2


Objectives
o To assess whether the UI (i.e. UI design) satisfies the
requirements (i.e. usability)

o To identify problems (i.e. usability)

o Evaluation should occur throughout the design life cycle


providing feedback for modifications

User Interface Design - UTCN 3


Formative and summative evaluation
A. Formative
q during development
B. Summative
q at completion

“When the cook tastes the soup in the kitchen, that’s formative
evaluation; when the guests taste the soup at the dinner table, that’s
summative evaluation.” (Prof. O. De Troyer)

User Interface Design - UTCN 4


Types of evaluation
o Laboratory studies
Advantages
n Sophisticated audio/visual recording facilities
n Interruption-free environment
n Context can be controlled

Disadvantages
n Lack of real context - unnatural situation

o Field studies
Advantages
n Real context: user is in its natural environment
E.g. user interrupted during the execution of a task

Disadvantages
n Constant interruptions can make observation difficult

User Interface Design - UTCN 5


A. Formative evaluation
o Methods are largely analytic
o Methods do not rely on an implementation

o Methods can also be used to evaluate an implementation


(summative evaluation)
o Methods are not mutually exclusive
o Formative evaluation types:
o Cognitive walkthrough
o Heuristic
o Review based
o Model based

User Interface Design - UTCN 6


A1. Cognitive walkthrough
o Definition:
o Evaluators step through the action sequence a user has to perform to
accomplish a certain task
o Objective:
o Evaluate how easy a system is to learn through exploration
o Requirements
o A description of a prototype
o A representative task
o A scenario to complete the task
o The description of the user class
o Technique:
o The task is completed through a sequence of actions (scenario)
o For each action
1. Does this action support the completion of the proposed task?
2. Will the user notice that the action is available?
3. Once the action is found within the GUI, does the user know that this
action is the right one?
4. Will the users understand the feedback given by the action?
o Results:
o Problems should be documented in usability problem report
sheet

User Interface Design - UTCN 7


A2. Heuristic evaluation
o Definition:
o Structuring the critique of a system using a set of relatively simple and
general heuristics

o Objectives:
o Heuristics are used to find the usability problems
o Classification of the problem is less important

o Technique
o Several evaluators are needed
o Evaluations must be done independently

User Interface Design - UTCN 8


Ten usability heuristics by Jakob Nielsen
o "heuristics“ – ways to discover the specific usability issues
o Ten general principles for user interface design (updated in 2004)
1. Visibility of system status
2. Match between system and the real world
3. User control and freedom
4. Consistency and standards
5. Error prevention
6. Recognition rather than recall
7. Flexibility and efficiency of use
8. Aesthetic and minimalist design
9. Help users recognize, diagnose, and recover from errors
10. Help and documentation
Reference: Nielsen, J., Heuristic evaluation. In Nielsen, J., and Mack, R.L. (Eds.),
Usability Inspection Methods, John Wiley & Sons (1994).

User Interface Design - UTCN 9


Ten Usability Heuristics
1. Visibility of system status
n The system should always keep users informed about what is going on,
through appropriate feedback within reasonable time.
2. Match between system and the real world
n The system should speak the users' language, with words, phrases and
concepts familiar to the user, rather than system-oriented terms. Follow
real-world conventions, making information appear in a natural and logical
order.
3. User control and freedom
n Users often choose system functions by mistake and will need a clearly
marked "emergency exit" to leave the unwanted state without having to go
through an extended dialogue. Support undo and redo.
4. Consistency and standards
n Users should not have to wonder whether different words, situations, or
actions mean the same thing. Follow platform conventions.
5. Error prevention
n Even better than good error messages is a careful design which prevents a
problem from occurring in the first place.

User Interface Design - UTCN 10


Ten Usability Heuristics
6. Recognition rather than recall
n Make objects, actions, and options visible. The user should not have to
remember information from one part of the dialogue to another. Instructions
for use of the system should be visible or easily retrievable whenever
appropriate.
7. Flexibility and efficiency of use
n Accelerators - unseen by the novice user - may often speed up the
interaction for the expert user such that the system can cater to both
inexperienced and experienced users. Allow users to tailor frequent actions.
8. Aesthetic and minimalist design
n Dialogues should not contain information which is irrelevant or rarely
needed. Every extra unit of information in a dialogue competes with the
relevant units of information and diminishes their relative visibility.
9. Help users recognize, diagnose, and recover from errors
n Error messages should be expressed in plain language (no codes), precisely
indicate the problem, and constructively suggest a solution.
10. Help and documentation
n Even though it is better if the system can be used without documentation, it
may be necessary to provide help and documentation. Any such information
should be easy to search, focused on the user's task, list concrete steps to
be carried out, and not be too large.

User Interface Design - UTCN 11


Heuristics in Grid Computing
o A set of 12 Grid Computing usability heuristics were
grouped in three categories:
1. Design and Aesthetics (5)
2. Navigation (4)
3. Errors and Help (3)

Reference: Rusu C., Roncagliolo S., Tapia G., Hayvar D., Rusu V., Gorgan D., Usability
Heuristics for Grid Computing Applications. ACHI 2011 - The Fourth International
Conference on Advances in Computer-Human Interactions, 23-28 Febr., 2011,
Gosier, Guadeloupe, France. Published by IARIA, 2011, Eds. Miller I., Roncagliolo
S., pp.53-58 (2011).

User Interface Design - UTCN 12


Design and Aesthetics Heuristics
(H1) Clarity: A Grid Computing application interface should be easy to
understand, using clear graphic elements, text and language.
(H2) Metaphors: A Grid Computing application should use appropriate
metaphors, making the possible actions easy to understand, through
images and familiar objects.
(H3) Simplicity: A Grid Computing application should provide the
necessary information in order to complete a task in a concise (yet
clear) manner.
(H4) Feedback: A Grid Computing application should keep users informed
on the jobs’ progress, indicating both the global and the detailed state
of the system. The application should deliver appropriate feedback on
users’ actions.
(H5) Consistency: A Grid Computing application should be consistent in
using language and concepts. The forms of data entry and visualization
of results should be consistent.

User Interface Design - UTCN 13


Navigation Heuristics
(H6) Shortcuts: A Grid Computing application should provide shortcuts,
abbreviations, accessibility keys or command lines for expert users.
(H7) Low memory load: A Grid Computing application should maintain
the main commands always available. It should offer easy to find
elements, functions and options.
(H8) Explorability: A Grid Computing application should minimize
navigation and should provide easy, clear, and natural ways to perform
tasks.
(H9) Control over actions: A Grid Computing application should offer
ways to cancel a running task or process. It should allow undo and/or
changes of actions.

User Interface Design - UTCN 14


Errors and Help Heuristics
(H10) Error prevention: A Grid Computing application should prevent
users from performing actions that could lead to errors, and should
avoid confusions that could lead to mistakes.
(H11) Recovering from errors: A Grid Computing application should
provide clear messages, hopefully indicating causes and solutions of
errors.
(H12) Help and documentation: A Grid Computing application should
provide an easy to find, easy to understand, and complete online
documentation. It should provide contextual help and glossary of
terms for novice users.

User Interface Design - UTCN 15


Extended Heuristics - Checklist
(H1) Clarity
(H1.1) The purpose of the application is clear to all users.
(H1.2) The interaction techniques and the graphic elements are easily understood;
their purpose is clear.
(H1.3) The interface elements are familiar to the users.
(H1.4) The language used in the user interface is clear and easy to understand.

(H2) Metaphors
(H2.1) The metaphors help the user to better understand the meaning of the
application.
(H2.2) Metaphors are used only for simple concepts and tasks.
(H2.3) Metaphors for complex, hard-to-understand concepts and tasks are
avoided.
(H2.4) The metaphors are simple to understand.

User Interface Design - UTCN 16


Extended Heuristics – Checklist (2)
(H3) Simplicity
(H3.1) The user interface is simple and uncluttered.
(H3.2) There are no redundant elements such as graphic symbols, controls,
menus, static or dynamic graphic elements, or text.
(H3.3) The interface contains no purely ornamental elements.

(H4) Feedback
(H4.1) There are clear indicators of the state of the system.
(H4.2) There are clear indicators of the state of the tasks.
(H4.3) The system clearly shows its reactions to the user's actions.
(H4.4) There are clear dynamic indicators of the progress of a process.

User Interface Design - UTCN 17


Extended Heuristics – Checklist (3)
(H5) Consistency
(H5.1) The interface is consistent in its use of vocabulary, interaction techniques,
graphic elements, and interaction style.
(H5.2) Similar user data is entered in a similar way.
(H5.3) Similar data is displayed in a similar way.

(H6) Shortcuts
(H6.1) There are accelerated ways of invoking the basic modules and
functionality of the application.
(H6.2) A command line is available for the most common and frequent
operations.
(H6.3) There are alternative ways of carrying out user operations.

User Interface Design - UTCN 18


Extended Heuristics – Checklist (4)
(H7) Low memory load
(H7.1) The main functionality is always available.
(H7.2) Functions and options are easy to find.
(H7.3) Forms that have already been filled in are always saved.
(H7.4) Input data not explicitly specified by the user always receives default
initial values.

User Interface Design - UTCN 19


Extended Heuristics – Checklist (5)
(H8) Explorability
(H8.1) Explicit sequences of steps/actions are given for each task.
(H8.2) Navigation is intuitive and easy to understand.
(H8.3) Menus are consistent and selecting their options has predictable effects.
(H8.4) Graphical exploration of the spatial data.
(H8.5) Textual exploration of the spatial input or output data.
(H8.6) Graphical visualisation of the processing algorithms.
(H8.7) Textual visualisation of the processing algorithms.
(H8.8) Exploration of the intermediate results of the distributed processing.

User Interface Design - UTCN 20


Extended Heuristics – Checklist (6)
(H9) Control over actions
(H9.1) Actions can easily be cancelled.
(H9.2) It is easy to undo an action and return to the previous state.
(H9.3) Cancelled actions stop immediately, with appropriate feedback.
(H9.4) The execution of the processing jobs is monitored in real time.
(H9.5) Information about the processes that have been run is archived.
(H9.6) Distributed processing jobs are launched only after the correctness of
the input data has been checked.
(H9.7) The user can intervene in optimising the execution of the distributed
processes.
(H9.8) The user has control over the allocation and use of the distributed
computing resources.
(H9.10) Control over the simultaneous/parallel execution of several distributed
processes.

User Interface Design - UTCN 21


Extended Heuristics – Checklist (7)
(H10) Error prevention
(H10.1) When a file needs to be loaded, the file type is clearly specified.
(H10.2) For all input data, the range of values and the allowed type are
specified.
(H10.3) All input data is validated.
(H10.4) Warning messages are easy to understand.
(H10.5) The user is assisted by the system in common actions.
(H10.6) User input is specified by selection rather than by editing.
(H10.7) Syntactic forms are built through direct manipulation.
(H10.8) Semantically incorrect forms are avoided.

User Interface Design - UTCN 22


Extended Heuristics – Checklist (8)
(H11) Recovering from errors
(H11.1) Error messages are simple and easy to understand.
(H11.2) Error messages are oriented towards solving the problem.
(H11.3) The user is assisted by the system in complex tasks.
(H11.4) A process can be recovered from a previous stable and correct state.

User Interface Design - UTCN 23


Extended Heuristics – Checklist (9)
(H12) Help and documentation
(H12.1) Online documentation.
(H12.2) User manual.
(H12.3) Guidance for the common/basic tasks.
(H12.4) A complete reference manual with detailed explanations.
(H12.5) Contextual guidance available in a systematic way.
(H12.6) Detailed information about the processes used by the user.

User Interface Design - UTCN 24


A3. Review-based evaluation
o Definition:
o Make use of existing experimental results about different menu
types, recall of command names, choices of icons, shotcuts,
etc.

o Technique:
o Search the literature: conference papers, technical reports,
etc.
o Take into consideration the context, user classes and tasks
used during the experiments

User Interface Design - UTCN 25


A4. Model-based evaluation
o Theoretical models used to predict usability
E.g. GOMS, keystroke-level model
GUI design evaluation

o Design rationale provides a framework to evaluate


design options
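As a concrete illustration of model-based evaluation, the sketch below estimates a task time with the keystroke-level model (a simplified member of the GOMS family). The operator durations are the commonly quoted textbook averages, and the operator sequence is an assumed example, not one taken from these slides.

# Keystroke-Level Model (KLM) sketch: predicted time for issuing a command
# as a sum of primitive operator times (commonly quoted average values).
OPERATORS = {
    "K": 0.20,   # keystroke or button press (average skilled typist), seconds
    "P": 1.10,   # point with the mouse at a target
    "H": 0.40,   # home hands between keyboard and mouse
    "M": 1.35,   # mental preparation
}

def klm_time(sequence):
    """Total predicted time for a sequence of KLM operators, e.g. 'MPK'."""
    return sum(OPERATORS[op] for op in sequence)

# Assumed example: think, point at a menu, click, think, point at an item, click
menu_selection = "MPKMPK"
print(f"Predicted time for '{menu_selection}': {klm_time(menu_selection):.2f} s")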

User Interface Design - UTCN 26


GUI Design Evaluation
Fitts’s Law
Used to model physical motion from a starting point to a target
e.g. move the mouse from a starting point to a destination
e.g. draw a line from here to there
Based on certain parameters
Distance, D: from the starting point to the target
Target width, W: the target has a certain width along the direction of motion
Based on certain assumptions
Developed for 1D tasks
Extensions to 2D or 3D are possible
Motion must be along a straight line in these cases
The law that predicts the time of the motion is:
T = K log2(2D/W)
K is a constant that represents human processing time. For mice it is
consistently around 100 ms. Sometimes a constant is added to account for
variation in the input device.
Examples:
Given a distance of 10 cm, we can compare the predicted times for targets
1.5 cm and 1.2 cm wide:
100 log2(2·10/1.5) ≈ 374 ms
100 log2(2·10/1.2) ≈ 406 ms
(see the computation sketch below)
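A short sketch that evaluates the expression above for the two example targets; K = 100 ms is the value quoted in the slide and the helper function name is only illustrative.

# Fitts's law as given above: T = K * log2(2D / W), with K ~ 100 ms for mice.
import math

def fitts_time_ms(distance_cm, width_cm, k_ms=100.0):
    """Predicted movement time in milliseconds."""
    return k_ms * math.log2(2.0 * distance_cm / width_cm)

for width in (1.5, 1.2):
    t = fitts_time_ms(10.0, width)
    print(f"D=10 cm, W={width} cm -> T = {t:.0f} ms")
# D=10 cm, W=1.5 cm -> T = 374 ms
# D=10 cm, W=1.2 cm -> T = 406 ms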

User Interface Design - UTCN 27


Fitts’s law based evaluation - example

User Interface Design - UTCN 28


B. Summative evaluation of usability

1. Expert-based evaluation

2. Experimental evaluation

3. Observational techniques

4. Query techniques

User Interface Design - UTCN 29


B1. Expert-based evaluation

o Usability experts look at the UI and identify problems


E.g. WEBQEM, course uid-08-Evaluation(Part2-3)_WebQEM_ICI

o Disadvantages
o usually too late
o the expert does not have the characteristics of the real users

User Interface Design - UTCN 30


B2. Experimental evaluation
Experiment pattern
1. Hypothesis, prediction
2. Experiment description
Tasks, scenario

3. Subjects
4. Configuration
5. Variables
a. Independent variables
b. Measured variables
6. Data analysis
7. Conclusions on hypothesis

Example:
Graphical annotation in eTrace, Course uid-06-Usability(Part2-5)

User Interface Design - UTCN 31


Experiment pattern
o The evaluator chooses a hypothesis to test, which can be determined by
measuring some attributes of subject behaviour
o The aim of the experiment is to test whether the prediction holds.
o Hypotheses
n prediction of the outcome of the experiment, stating that a variation in the independent
variables will cause a difference in the dependent variables

o Subjects (participants)
n Should be representative for the expected user classes
n Number should be large enough to be representative (recommended 10)

o Variables
n Independent variables:
Set to change the conditions of the experiment to derive different situations. E.g. number of nested
menus, number of icons
n Dependent variables:
Variables which are measured. E.g. time to complete a task, number of errors

o Conclusions on hypothesis
n The initial hypothesis is confirmed totally or partially
n Conclusions describe the conditions for each type of confirmation

User Interface Design - UTCN 32


Experiment – data analysis

(Plot: dependent variable y versus independent variable x, with one curve per
experimental condition: V1 – case 1, V2 – case 2, V3 – case 3. An analysis
sketch follows below.)
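A minimal sketch of how such measurements might be analysed for a between-groups comparison of two conditions; the task times are invented, and the independent-samples t-test is one reasonable choice rather than something prescribed by the slides.

# Illustrative analysis: compare task completion time (dependent variable)
# between two conditions of an independent variable (between-groups design).
from statistics import mean, stdev
from scipy import stats   # only needed for the significance test

condition_a = [41.2, 38.5, 44.0, 39.8, 42.3, 40.1]   # seconds, invented data
condition_b = [47.9, 45.2, 50.4, 46.8, 48.3, 49.0]

print(f"A: mean={mean(condition_a):.1f}s sd={stdev(condition_a):.1f}")
print(f"B: mean={mean(condition_b):.1f}s sd={stdev(condition_b):.1f}")

t, p = stats.ttest_ind(condition_a, condition_b)
print(f"independent-samples t-test: t={t:.2f}, p={p:.4f}")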

User Interface Design - UTCN 33


Experimental methods
o Between-groups
o Each subject experiences only one condition
o At least two conditions
n The experimental condition (with manipulated variable)
n The control condition (without manipulation)
o Advantage: no learning effect
o Disadvantage:
n more subjects needed
n Individual differences may influence the result

o Within-groups
o Each subject performs under each condition
o Advantages
n Less costly: less subjects
n Less effects from individual differences
o Disadvantage: learning effect

o Mixed method
o If more than one independent variable
n One between-groups and one within-groups

User Interface Design - UTCN 34


B3. Observational techniques
o Think aloud and co-operative evaluation
o Observation of users while performing tasks
o Ask them to describe what they are doing, why and what they think is
happening
o The process of thinking aloud can change the way the user performs the task
o Co-operative evaluation:
n Variation of think aloud
n User is encouraged to criticise the system

o Automatic protocol analysis tools


o Analysing protocols by hand is time consuming
o Different tools available
o Post-task walkthroughs
o Subject is invited to comment on the protocol
o Evaluator asks clarifying questions
o Advantage:
n Evaluator can prepare questions
n Can be used to replace think aloud
o Disadvantage
n Loss of freshness

User Interface Design - UTCN 35


Protocols
o Methods for recording the evaluation session
o Paper and pencil
Limited by writing speed
Forms may help
o Audio recording
Difficult to record the exact actions
o Video recording
Two cameras: one for the screen, one for the user
o Computer logging
Log can be replayed
May result in large volume of data
o User notebooks
User logs usual or infrequent tasks and problems

User Interface Design - UTCN 36


B4. Query techniques
o Interviews
o Effective for high-level evaluation
Eliciting information about user preferences, impressions, attitudes
o Purpose of interview must be clear
o Central questions must be planned
o Questionnaires
o Effective for high-level evaluation
o Take less time
o Can be analysed more rigorously
o Purpose must be clear: what information is needed

User Interface Design - UTCN 37


Type of questions
o General
o Background / characteristics of the user
E.g. age, sex, occupation
o Open ended
E.g. “Can you suggest any improvements to the Interface?”
o Gathers subjective information
o Difficult to analyse
o Scalar
o Judgement of a statement on a numeric scale
E.g. “It is easy to recover from mistakes”
disagree 1 2 3 4 5 agree
o Coarse scale (1-3): users tend to give neutral answers
o Fine scale (1-10): numbers become difficult to interpret
o Multiple-choice
E.g. “how do you most often get help (tick one)?”
o On-line manual
o Contextual help system
o Command prompt
o Ask a colleague
o Ranked
Place an ordering on items in a list
Useful to find user preferences
E.g. “Please rank the usefulness of these methods of issuing a command (1 most useful,
2 next, 0 if not used)”
o Menu selection
o Command line
o Control key accelerator

User Interface Design - UTCN 38


Choosing an evaluation method
o The stage in the cycle
o Formative vs. summative

o Style of evaluation
o Laboratory vs. field

o Level of subjectivity or objectivity required


o Type of measures provided
o Qualitative vs. quantitative

o Information provided
o Low level vs. high level

o The immediacy of the response


o The level of interference implied
o measurements may influence behaviour of user

o Resources required
o Equipment
o Time
o Money
o Subjects
o Evaluators (expertise)
o Context

User Interface Design - UTCN 39


Questions and problems
1. Explain why both types of evaluation are necessary in the
interactive application evaluation process.
2. Exemplify and explain why laboratory and field based
studies are complementary.
3. Explain and exemplify the cognitive walkthrough as a formative
evaluation. Who performs this type of evaluation?
4. Explain and exemplify the heuristic evaluation as a formative
evaluation. Who performs this type of evaluation?
5. Exemplify the ten usability heuristics defined by Jakob Nielsen.
6. Why is the set of ten usability heuristics not enough in
particular cases of some interactive applications? Explain and
exemplify.
7. What is the main result of a usability heuristics based
evaluation? Does it resolve the software issues? Explain who
resolves these issues and when.

User Interface Design - UTCN 40


Questions and problems
8. Exemplify and explain the use of Fitts’s law to evaluate and
compare the GUI supporting the following command:
MOVE CHAIR to POS(40,75)
9. Describe the main components of the definition of an
experiment.
10. Describe the main issues of selecting the group of evaluators.

11. Describe the main issues of selecting the group and sequence
of task executions.

User Interface Design - UTCN 41


Chapter 4

Evaluating the Quality of Web Sites


Alexandru Dan Donciulescu, Ioana Costache, Cornelia Lepădatu
Institutul Naţional de Cercetare-Dezvoltare în Informatică – ICI Bucureşti
Bd. Mareşal Averescu, Nr. 8-10, 011455 Bucureşti, România
Tel.: +40(0)21 2240736/157
e-mail: [email protected]

Abstract This chapter describes the experiment carried out to evaluate the quality of web
sites by applying the WebQEM method. The first section presents the application phases,
the procedures and the algorithms used within the WebQEM method. The following
sections describe the experiment in detail: the objectives pursued and the conditions under
which the experiment took place, the experimental methodology and the results obtained. The
final part presents the findings and the conclusions drawn from the experiment.

Keywords: web site quality, WebQEM, ISO 9126, ISO 14598.

4.1 INTRODUCTION
As the volume of information made available through the Internet keeps growing, it
becomes increasingly difficult to select the sites that meet users' requirements.
The literature is relatively rich in presentations and analyses of methods for
evaluating the quality of web sites (see [6], [8] for examples).
A series of recent research activities carried out at ICI has been oriented towards
quality analysis and towards producing recommendations on methods for evaluating the
quality of web sites in various fields of activity.
In order to produce recommendations for the design and implementation of web
sites in the cultural domain, a team of ICI specialists experimented with the WebQEM
method on a number of Romanian web sites dedicated to representative Romanian
museums that also offer virtual presentations on the Internet.

 

4.2 OVERVIEW OF THE METHOD


WebQEM (Web Quality Evaluation Method) is a method for the quantitative
evaluation and comparison of web site quality, developed between 1998 and 2000 by
a group of researchers at the National University of La Pampa, Argentina,
led by Prof. Luis Olsina [8].
The WebQEM method is based on the international and professional standards
related to quality, as well as on the concepts, principles and techniques specific
to the field of software quality evaluation, such as:
− the software quality evaluation methodology promoted by the IEEE
Std 1061-1992 standard [2];
− the software quality model of the ISO 9126 standard [3];
− the quality evaluation process model of the ISO 14598 standard [4].
In addition, the method was extended and refined based on the following
concepts and standards:
− the LSP (Logic Scoring of Preference) approach developed by Dujmovic [7]
for the evaluation and selection of complex hardware-software systems;
− the IEEE recommendations on the development of web pages [1];
− the W3C Consortium recommendations on web accessibility [5].
The WebQEM method has been the subject of numerous experiments on sites from
various domains, such as e-commerce, e-learning and e-government. Some of the
method's components are supported by the WebQEM-Tool software instrument.
Overall, it can be said that the WebQEM method is an adaptation of the
ISO 9126 quality model and of the ISO 14598 evaluation process to web
sites [10], [11].
For the quantitative evaluation and comparison of web sites, the WebQEM method
comprises four closely correlated major phases:
− definition and specification of the quality requirements
− elementary evaluation
− global evaluation
− analysis and documentation of the results, formulation of conclusions.

(1) Definition and specification of the quality requirements


This phase consists of the following main activities:
− defining the application domain of the site: establishing the domain addressed
by the site or sites to be selected and evaluated (e.g. e-commerce,
financial information, entertainment, education, health etc.);
− defining the objectives of the evaluation. The objectives depend on the purpose
for which the results will be used (e.g. understanding quality, comparing the
quality of different web sites, improving the quality of an existing site etc.);
− defining the perspective from which the evaluation is performed: defining the
explicit and implicit needs of the user from whose perspective the evaluation is performed;
the user may be, for example, the site visitor (regular or expert), the site
developer, the site administrator, etc.;
− specifying the quality characteristics and the measurable attributes: selecting the
quality model and the quality characteristics specific to the application domain of
the web site, and establishing the importance level of the characteristics.
The result of this phase is a document entitled the Quality Requirements Specification,
which defines the tree structure of the characteristics, subcharacteristics and
measurable attributes.
In the process of specifying the tree structure, the method recommends the use of
the IEEE 1061 [2] and ISO 9126 [3] standards.

(2) Elementary evaluation


This phase comprises two main activities: designing the evaluation and implementing
the evaluation (cf. the ISO 14598 standard [4]).
Designing the elementary evaluation consists in selecting a set of metrics, taking
into account the objectives of the evaluation and the descriptions of the website
(architecture, characteristics, functions, etc.).
For each identified measurable attribute Ai, a variable Xi is associated, whose
numerical value is obtained by applying a direct or indirect metric. In order to
evaluate the quality level, preference scales are established by defining the
elementary quality preference criteria (EQi), as follows:

Value of the variable Xi     Value of the elementary preference criterion (EQi)
Xi = 0                       EQi = 1 (i.e. 100%)
Xi ≥ Xmax                    EQi = 0 (i.e. 0%)
0 < Xi < Xmax                EQi = (Xmax − Xi) / Xmax
(Xmax is an upper threshold value, e.g. 0.06)

The elementary quality preference criterion (EQi) is interpreted as the percentage
to which the requirement for a given attribute is satisfied.
To make the preferences easy to interpret, the preference scale is divided into
three acceptability levels (illustrated by the short sketch below):
− unsatisfactory: from 0% to 40%;
− medium: from 40% to 60%;
− satisfactory: from 60% to 100%.
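As a minimal illustration (a Python sketch, not part of the WebQEM tooling; the
0.06 threshold and the measured value are only examples), the decreasing
elementary criterion and the acceptability bands above can be written as:

def elementary_preference(x, x_max):
    # Decreasing elementary criterion from the table above: X = 0 fully
    # satisfies the requirement, values at or above the threshold Xmax
    # do not satisfy it at all, intermediate values scale linearly.
    if x <= 0:
        return 1.0
    if x >= x_max:
        return 0.0
    return (x_max - x) / x_max

def acceptability(eq):
    # Three-level interpretation of an elementary preference.
    percent = eq * 100
    if percent < 40:
        return "unsatisfactory"
    if percent < 60:
        return "medium"
    return "satisfactory"

# Example: a broken-link ratio of 0.015 against the 0.06 threshold
eq = elementary_preference(0.015, x_max=0.06)
print(round(eq, 2), acceptability(eq))   # 0.75 satisfactory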
Implementing the elementary evaluation consists of a series of actions for
measuring the attributes of the website (defined and specified previously). The
measurements can be carried out both by manual methods (e.g. inspecting the
appearance of the site, observing various visual characteristics, etc.) and by
computer-assisted methods using specialised software tools.
Based on the actually measured values (Xi) and taking into account the way in
which the elementary preference criteria (EQi) were defined, the results of the
elementary evaluation are obtained. For n measured attributes (Ai, i = 1..n), n
elementary preferences (EQi, i = 1..n) are obtained. This yields the “partial
quality” of the website (in WebQEM terminology).

(3) Global evaluation


The elementary preferences (EQi) give an incomplete picture of the quality of the
website and only partially answer the user's needs. An overall perspective on the
quality of the website is needed, one that fully covers the quality requirements
(the “global quality” in WebQEM terminology).
This phase comprises two main activities: designing the evaluation and
implementing the evaluation (cf. the ISO 14598 standard [4]).
Designing the global evaluation consists in selecting one or more criteria for
aggregating the elementary preferences, as well as a scoring scale.
The WebQEM method suggests using at least two models: additive linear models
and non-linear multi-criteria models. Both models use weights to express the
importance of the preferences (or indicators).
Implementing the global evaluation consists in actually applying the previously
defined aggregation and scoring criteria. The aggregation is performed bottom-up,
in several steps, over the tree structure of characteristics and attributes.
The final result of the aggregation is a global schema that allows the partial and
global quality indicators to be computed.
The last indicator (the last global preference) obtained represents the overall
degree of satisfaction of the quality requirements.

(4) Analysis and documentation of the results, formulation of conclusions


Throughout the previous phases, all the results of the calculations and
evaluations are recorded, including the information relevant to the user. This
phase is carried out in accordance with the requirements on documenting the
evaluation process specified in the ISO 14598 standard [4].

4.3 OBJECTIVES OF THE EXPERIMENT


The objectives of experimenting with the WebQEM method are the following:
− verifying the applicability of the method to the measurement and evaluation of
websites in the field of culture;
− verifying and validating the website evaluation criteria specified within the
WebQEM method;
− formulating new requirements concerning the evaluation criteria for websites
in general and for websites in the field of culture in particular.

4.4 CONDITIONS OF THE EXPERIMENT


The evaluation process was applied to 7 operational websites of well-known
museums in Romania, listed below.

Museum name                                Website address
Muzeul Brukenthal                          http://www.brukenthalmuseum.ro/
Muzeul Naţional de Istorie a României      http://www.mnir.ro/
Muzeul Ţăranului Român                     http://www.itcnet.ro/mtr/
Muzeul Literaturii Române                  http://www.mlr.ro/
Muzeul Satului                             http://www.muzeul-satului.ro/
Muzeul Naţional de Artă                    http://art.museum.ro/
Muzeul “Grigore Antipa”                    http://www.antipa.ro/
National Gallery of Art                    http://www.nga.gov/

Care was taken to cover domains that are as diverse as possible and representative
of the Romanian cultural landscape (history, art, literature, ethnography, natural
sciences). In addition, the website of the National Gallery of Art in the USA was
analysed in order to compare the results obtained in this experiment with those of
the case study reported in [8] and [9].
The experiment lasted 2 months and involved 4 specialists, developers of
applications and websites in the field of culture.
For data collection, each evaluator received a list of the web addresses of the
8 sites and an electronic questionnaire accompanied by filling-in instructions and
by explanatory comments on the meaning of the criteria.
The evaluators' answers consisted, for each criterion, of a score (on a scale from
0 to 100), a yes/no answer, or a value chosen from an attached list. The evaluators
were asked to answer all the criteria included in the questionnaire.
To count the valid, invalid and unimplemented links, the evaluators used the
XENU software tool (freeware).

4.5 EXPERIMENTAL METHODOLOGY


In preparation for the experiment, the research team carried out a number of
preliminary activities, such as:
− studying the specialist literature on similar subjects (experiments with the
WebQEM method). As examples we mention references [8] and [9], in which
4 operational websites of well-known museums, located in four cities in three
different countries, are evaluated: the Louvre, the Prado, the Metropolitan and
the National Gallery of Art;
− preparing / training the personnel involved in the experiment.
The members of the evaluation team received the questionnaires containing the
evaluation criteria and were instructed on how to fill them in. The training
consisted in presenting the objectives of the experiment and the WebQEM
concepts and procedures.
The content of the questionnaire was also presented and discussed, together
with the expected way of rating the questions and the means of investigation the
evaluators were expected to use.

4.5.1 Data collection


The evaluation form contains the tree structure of characteristics,
subcharacteristics and measurable attributes, together with the type of answer
expected from the evaluator:
− a score on a scale from 0 to 100, representing the evaluator's assessment of the
respective characteristic / subcharacteristic / attribute;
− yes/no, when the characteristic satisfies / does not satisfy the requirement;
− a value (chosen from a list given in the explanatory note accompanying the
questionnaire) representing the degree to which the requirement is fulfilled.
The evaluation was carried out from the perspective of the ordinary, occasional
visitor, with minimal knowledge of or a general interest in the museum domain.
Each evaluator inspected the web pages under evaluation independently of the
other evaluators and filled in the questionnaire according to his or her own opinion.
Data collection was performed manually, semi-automatically or automatically.
Most of the attribute values were collected manually, because there is no other
way of obtaining them. One example is the yes/no attributes, which require
checking for the existence of a table of contents or of a site map, or the attributes
referring to guided tours. Likewise, for determining the level of foreign-language
support or the transaction facilities, the data can easily be collected and checked
manually. Moreover, for all the attributes measured through a direct preference
criterion, expert judgement is the only way of obtaining an assessment.
On the other hand, automatic data collection is in some cases the only, and the
simplest and most adequate, mechanism for collecting the data for a given
attribute. This is the case for counting invalid links and for detecting the number
of nodes in the site structure.
The data collected in the 4 forms filled in by the evaluators were centralised in
an Excel document and analysed statistically. It was agreed that the data to which
the WebQEM method would be applied would be the averages of the primary data
collected from the 4 evaluators, as sketched below.
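A trivial sketch of this averaging step (Python; the raw scores below are
hypothetical, not values from the experiment):

# Hypothetical raw scores given by the four evaluators to one attribute
raw = {"1.3.3 aesthetic preferences": [90, 100, 95, 95]}

# Average per attribute, as used as input for the WebQEM computations
averaged = {attr: sum(v) / len(v) for attr, v in raw.items()}
print(averaged["1.3.3 aesthetic preferences"])   # 95.0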

4.5.2 WebQEM procedures and algorithms


Phase 1: Definition and specification of the quality requirements
The following quality characteristics were considered: functionality, usability,
reliability and efficiency. In selecting the quality subcharacteristics, the
measurable attributes and the indicators, the ISO 9126 standard [3] and the
recommendations [1] and [5] were used. After several iterations, the final
structure shown in Fig. 4.1 was obtained.
1. USABILITY
  1.1. Overall site presentation
    1.1.1. Global organization scheme
      1.1.1.1. Site map
      1.1.1.2. Global index (by subject or alphabetical)
      1.1.1.3. Table of contents
    1.1.2. Quality of the labelling system
      1.1.2.1. Text labels
      1.1.2.2. Icon labels
    1.1.3. Guided tours
      1.1.3.1. Conventional tours
      1.1.3.2. Virtual tours
    1.1.4. Floor and room plans
  1.2. Feedback and help
    1.2.1. Quality of the help components
      1.2.1.1. Explanatory help
      1.2.1.2. Search help
    1.2.2. Last-update indicator for the site
      1.2.2.1. Global
      1.2.2.2. Local (per component or page)
    1.2.3. Address directory
      1.2.3.1. E-mail directory
      1.2.3.2. Telephone / fax directory
      1.2.3.3. Postal address directory
    1.2.4. FAQ component
    1.2.5. Questionnaire component
  1.3. Interface and aesthetic aspects
    1.3.1. Coherence of the grouping of the main controls
    1.3.2. Permanence and stability of the presentation of the main controls
      1.3.2.1. Permanence of direct controls
      1.3.2.2. Permanence of indirect controls
      1.3.2.3. Stability
    1.3.3. Aesthetic preferences
    1.3.4. Style uniformity
  1.4. Miscellaneous aspects
    1.4.1. Foreign-language support
    1.4.2. Download facilities
2. FUNCTIONALITY
  2.1. Search facilities
    2.1.1. Site search mechanisms
      2.1.1.1. Scoped (collection) search
      2.1.1.2. Global search
  2.2. Navigation (browsing) capabilities
    2.2.1. Scoped navigability
      2.2.1.1. Level of scoped interconnection
      2.2.1.2. Orientation
        2.2.1.2.1. Path indicator
        2.2.1.2.2. Position indicator
    2.2.2. Global navigability
      2.2.2.1. Links between subsites
    2.2.3. Navigation controls
      2.2.3.1. Permanence and stability of the presentation of contextual controls
        2.2.3.1.1. Permanence of the controls
        2.2.3.1.2. Stability of the controls
      2.2.3.2. Scrolling component
        2.2.3.2.1. Vertical scrolling
        2.2.3.2.2. Horizontal scrolling
    2.2.4. Navigational prediction
      2.2.4.1. Link title (link with explanatory text)
      2.2.4.2. Quality of the link phrase
  2.3. Domain-specific and miscellaneous functions
    2.3.1. Content relevance
    2.3.2. Link relevance
    2.3.3. E-commerce
      2.3.3.1. Purchasing facilities
        2.3.3.1.1. Shopping-cart purchasing
        2.3.3.1.2. Quality of the product catalogue
      2.3.3.2. Transaction security
    2.3.4. Image facilities
      2.3.4.1. Image size indicator
      2.3.4.2. Image zoom
3. RELIABILITY
  3.1. Non-technical (context) errors
    3.1.1. Link errors
      3.1.1.1. Orphan (isolated) links
      3.1.1.2. Invalid links
      3.1.1.3. Unimplemented links
    3.1.2. Miscellaneous errors and drawbacks
      3.1.2.1. Number of deficiencies or missing features due to browsers
      3.1.2.2. Number of site deficiencies or unexpected results, browser-independent
      3.1.2.3. Number of unresponsive web nodes
      3.1.2.4. Number of destination nodes under construction
4. EFFICIENCY
  4.1. Information accessibility
    4.1.1. Support for a text-only version
    4.1.2. Readability of the information with images disabled in the browser
      4.1.2.1. Image title
      4.1.2.2. Global readability
Fig. 4.1 The tree structure of the quality characteristics and attributes

Phase 2: Elementary evaluation


2a) Establishing the elementary criteria
For each quantifiable attribute Ai, a variable Xi is associated, which takes a real
value through an elementary criterion function. The final result is the mapping of
the value of this function onto an elementary quality preference EQi with values
in the interval [0, 1]. The EQi values are interpreted as follows:
EQi = 0 means that Xi does not satisfy the requirement
EQi = 1 means that Xi satisfies the requirement
0 < EQi < 1 means that Xi partially satisfies the requirement.
There are two important categories of elementary criteria: absolute criteria and
relative criteria. In turn, the absolute elementary criteria can be divided into
continuous variables and discrete variables.
For each quality characteristic, Tables 4.1 to 4.4 give the elementary preference
functions and the computation formulas for the variables used in the experiment.
Table 4.1 Criteria established for Usability
Code       Elementary preference function                  Formula
1.1.1.1.   D = 0 (no), D = 1 (yes)                         X = 100 * D
1.1.1.2.   D = 0 (no), D = 1 (yes)                         X = 100 * D
1.1.1.3.   D = 0 (no), D = 1 (yes)                         X = 100 * D
1.1.2.1.   D ∈ [0,100]                                     X = D
1.1.2.2.   D ∈ [0,100]                                     X = D
1.1.3.1.   D = 0 (no), D = 1 (yes)                         X = 100 * D
1.1.3.2.   D = 0 (no), D = 1 (yes)                         X = 100 * D
1.1.4.     D = 0 – no plans available                      X = 100 * D
           D = 0.8 – plans partially available
           D = 1 – plans fully available
1.2.1.1.   D = 0 (no), D = 1 (yes)                         X = 100 * D
1.2.1.2.   D = 0 (no), D = 1 (yes)                         X = 100 * D
1.2.2.1.   D = 0 (no), D = 1 (yes)                         X = 100 * D
1.2.2.2.   D = 0 (no), D = 1 (yes)                         X = 100 * D
1.2.3.1.   D = 0 (no), D = 1 (yes)                         X = 100 * D
1.2.3.2.   D = 0 (no), D = 1 (yes)                         X = 100 * D
1.2.3.3.   D = 0 (no), D = 1 (yes)                         X = 100 * D
1.2.4.     D = 0 (no), D = 1 (yes)                         X = 100 * D
1.2.5.     D = 0 (no), D = 1 (yes)                         X = 100 * D
1.3.1.     D ∈ [0,100]                                     X = D
1.3.2.1.   D ∈ [0,100]                                     X = D
1.3.2.2.   D ∈ [0,100]                                     X = D
1.3.2.3.   D ∈ [0,100]                                     X = D
1.3.3.     D ∈ [0,100]                                     X = D
1.3.4.     D ∈ [0,100]                                     X = D
1.4.1.     Ni – number of foreign languages supported      X = 30 * Σ Si * Ni
           Si – level of support for the language          if X > 100, then X = 100
           Si = 0.2 – minimal support
           Si = 1 – medium support
           Si = 2 – full support
1.4.2.     D = 0 (no), D = 1 (yes)                         X = 100 * D

Table 4.2 Criteria established for Functionality


Code         Elementary preference function    Formula
2.1.1.1.     D = 0 (no), D = 1 (yes)           X = 100 * D
2.1.1.2.     D = 0 (no), D = 1 (yes)           X = 100 * D
2.2.1.1.     D = 0 (no), D = 1 (yes)           X = 100 * D
2.2.1.2.1.   D ∈ [0,100]                       X = D
2.2.1.2.2.   D ∈ [0,100]                       X = D
2.2.2.1.     D ∈ [0,100]                       X = D
2.2.3.1.1.   D ∈ [0,100]                       X = D
2.2.3.1.2.   D ∈ [0,100]                       X = D
2.2.3.2.1.   D = 0 (no), D = 1 (yes)           X = 100 * D
2.2.3.2.2.   D = 0 (no), D = 1 (yes)           X = 100 * D
2.2.4.1.     D ∈ [0,100]                       X = D
2.2.4.2.     D ∈ [0,100]                       X = D
2.3.1.       D ∈ [0,100]                       X = D
2.3.2.       D ∈ [0,100]                       X = D
2.3.3.1.1.   D = 0 (no), D = 1 (yes)           X = 100 * D
2.3.3.1.2.   D ∈ [0,100]                       X = D
2.3.3.2.     D ∈ [0,100]                       X = D
2.3.4.1.     D = 0 (no), D = 1 (yes)           X = 100 * D
2.3.4.2.     D = 0 (no), D = 1 (yes)           X = 100 * D

Table 4.3 Criteria established for Reliability


Code       Elementary preference function                Formula
3.1.1.1.   BL = number of orphan (isolated) links        X = 100 – (BL*100/TL) * 10
           (TL = total number of links)
3.1.1.2.   CL = number of invalid links                  X = 100 – (CL*100/TL) * 10
3.1.1.3.   DL = number of unimplemented links            X = 100 – (DL*100/TL) * 10
3.1.2.1.   D ∈ [0,100]                                   X = D
3.1.2.2.   D ∈ [0,100]                                   X = D
3.1.2.3.   D ∈ [0,100]                                   X = D
3.1.2.4.   D ∈ [0,100]                                   X = D
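As a quick check of the link-error formulas in Table 4.3, the short sketch below
(Python; the link counts are hypothetical, not taken from the experiment) converts
the counts reported by a link checker such as XENU into scores:

def link_error_score(error_count, total_links):
    # Formulas of Table 4.3: each percentage point of defective links
    # costs 10 points; the floor at 0 is an added safeguard, not part
    # of the original formula.
    penalty = (error_count * 100.0 / total_links) * 10.0
    return max(0.0, 100.0 - penalty)

total = 400   # hypothetical total number of links (TL)
print(link_error_score(3, total))   # 3 orphan links (BL)      -> 92.5
print(link_error_score(2, total))   # 2 invalid links (CL)     -> 95.0
print(link_error_score(0, total))   # no unimplemented (DL)    -> 100.0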

Table 4.4 Criteria established for Efficiency


Code       Elementary preference function    Formula
4.1.1.     D = 0 (no), D = 1 (yes)           X = 100 * D
4.1.2.1.   D ∈ [0,100]                       X = D
4.1.2.2.   D ∈ [0,100]                       X = D

2b) Applying the computation formulas for the measurable attributes


Tables 4.5 to 4.8 present the computed elementary preferences associated with the
Usability, Functionality, Reliability and Efficiency characteristics.

Table 4.5 Partial results of the elementary preferences for the Usability characteristic
Usability   Museum1 Museum2 Museum3 Museum4 Museum5 Museum6 Museum7 NGA
1.1.1.1. 100 0 100 0 0 0 0 100
1.1.1.2. 0 0 0 0 0 0 0 100
1.1.1.3. 100 100 100 100 100 100 100 100
1.1.2.1. 100 100 100 100 100 100 100 100
1.1.2.2. 50 50 50 50 50 50 50 100
1.1.3.1. 0 100 0 0 0 0 100 100
1.1.3.2. 100 0 100 100 100 100 100 100
1.1.4. 0 0 100 100 0 0 100 100
1.2.1.1. 0 0 0 0 0 0 0 100
1.2.1.2. 0 0 0 0 0 0 0 100
1.2.2.1. 0 0 0 0 0 0 0 0
1.2.2.2. 0 0 0 0 0 0 0 0
1.2.3.1. 100 100 100 100 100 100 100 100
1.2.3.2. 100 100 100 100 100 100 100 100
1.2.3.3. 100 100 100 100 100 100 100 100
1.2.4. 0 0 0 0 100 0 0 100
1.2.5. 100 0 100 100 100 100 0 100
1.3.1. 83 58 80 86 86 55 91 98
1.3.2.1. 89 70 86 85 84 61 91 100
1.3.2.2. 89 70 88 88 83 56 89 95
1.3.2.3. 89 70 89 63 59 60 93 99
1.3.3. 95 65 84 83 76 48 83 95
1.3.4. 81 83 93 95 80 49 85 100
1.4.1. 100 39 60 60 3 12 3 26
1.4.2. 0 0 0 0 0 0 0 100

Table 4.6 Partial results of the elementary preferences for the Functionality characteristic
Functionality   Museum1 Museum2 Museum3 Museum4 Museum5 Museum6 Museum7 NGA
2.1.1.1. 100 0 0 0 100 0 0 100
2.1.1.2. 0 0 0 0 100 0 0 100
2.2.1.1. 0 0 0 0 0 0 0 0
2.2.1.2.1. 25 25 25 25 23 23 28 98
2.2.1.2.2. 25 25 25 16 19 0 25 88
2.2.2. 78 0 0 0 0 0 0 50
2.2.3.1.1. 94 75 86 89 85 58 85 100
2.2.3.1.2. 94 76 89 81 84 55 85 98
2.2.3.2.1. 100 100 100 100 100 100 100 100
2.2.3.2.2. 0 0 0 0 0 0 0 0
2.2.4.1. 99 91 91 88 70 50 85 100
2.2.4.2. 95 90 93 84 71 48 88 100
2.3.1. 100 84 90 93 91 69 89 100
2.3.2. 96 83 90 73 60 51 88 100
2.3.3.1.1. 0 0 0 100 0 0 0 100
2.3.3.1.2. 0 23 0 75 0 0 0 98
2.3.3.2. 0 50 0 0 0 0 0 80
2.3.4.1. 0 0 0 0 0 0 0 0
2.3.4.2. 100 100 100 100 100 100 100 100
Table 4.7 Partial results of the elementary preferences for the Reliability characteristic
Reliability   Museum1 Museum2 Museum3 Museum4 Museum5 Museum6 Museum7 NGA
3.1.1.1. 100 100 100 85 100 100 100 100
3.1.1.2. 99 99 100 58 100 97 100 100
3.1.1.3. 100 905 100 65 67 99 100 100
3.1.2.1. 100 100 100 100 100 100 100 100
3.1.2.2. 100 100 100 100 80 100 100 100
3.1.2.3. 93 95 100 45 100 100 100 100
3.1.2.4. 98 90 100 65 100 100 100 100
Table 4.8 Partial results of the elementary preferences for the Efficiency characteristic
Efficiency   Museum1 Museum2 Museum3 Museum4 Museum5 Museum6 Museum7 NGA
4.1.1. 0 0 0 0 100 100 0 100
4.1.2.1. 18 6 36 25 39 60 25 93
4.1.2.2. 75 53 88 74 73 75 80 100

Phase 3: Global evaluation


3a) Logical aggregation of the elementary preferences
In the experiment, the LSP (Logic Scoring of Preference) model together with the
CLP (Continuous Logic Preference) operators [7] was used as the aggregation
method. The structure obtained with these models is shown in Fig. 4.2 and
illustrated by the short sketch that follows it.

[Fig. 4.2 shows the aggregation structure: the partial preferences for Usability
(weight 0.3), Functionality (weight 0.3), Reliability (weight 0.25) and Efficiency
(weight 0.15) are combined through a C-+ quasi-conjunction operator into the
global preference.]

Fig. 4.2 The aggregation structure of the global and partial preferences using the LSP model:
a) the Usability characteristic; b) the Reliability characteristic; c) the global aggregation of the preferences
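To make the aggregation step concrete, the following sketch (Python, illustrative
only, not the WebQEM-Tool implementation) aggregates the four characteristic
scores of Museum 1 from Table 4.9 with the Fig. 4.2 weights through the LSP
weighted power mean; the exponent r for the C-+ quasi-conjunction comes from
Dujmovic's operator tables and is only approximated here by an illustrative value.
The reported global preference of 58.99 lies below the neutral weighted mean of
about 63.9, which is consistent with the penalising, quasi-conjunctive aggregation.

def lsp_aggregate(preferences, weights, r):
    # Weighted power mean used by LSP/CLP: r = 1 is the neutral
    # (weighted arithmetic mean) operator, r < 1 behaves
    # quasi-conjunctively and penalises low inputs.
    return sum(w * p ** r for w, p in zip(weights, preferences)) ** (1.0 / r)

# Characteristic scores of Museum 1 (Table 4.9) and the Fig. 4.2 weights
scores = [51.08, 63.24, 98.76, 32.75]   # usability, functionality, reliability, efficiency
weights = [0.30, 0.30, 0.25, 0.15]

print(round(lsp_aggregate(scores, weights, r=1.0), 2))   # neutral mean: 63.9
print(round(lsp_aggregate(scores, weights, r=0.5), 2))   # illustrative quasi-conjunction: about 62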

3b) Computing the partial and global quality preferences


After completing the aggregation process and obtaining the global schema, the
partial and global quality indicators were computed for each website considered.
Table 4.9 presents the results obtained.

Table 4.9 Detailed results of the partial and global quality preferences

Characteristics /
Subcharacteristics                      Museum1 Museum2 Museum3 Museum4 Museum5 Museum6 Museum7 NGA
1. Usability                            51.08   38.53   62.99   55.19   45.36   34.61   52.91   84.61
1.1. Overall site presentation          26.75   25.75   64.25   43.75   25.75   23.25   58.25   93.00
1.1.1. Global organization scheme       30.00   20.00   30.00   20.00   20.00   20.00   20.00   80.00
1.1.2. Quality of the labelling system  75.00   75.00   75.00   75.00   75.00   75.00   75.00   100.00
1.1.3. Guided tours                     25.00   38.00   63.00   38.00   38.00   25.00   50.00   100.00
1.1.4. Floor and room plans             0.00    0.00    100.00  60.00   0.00    0.00    100.00  100.00
1.2. Feedback and help components       37.50   25.00   43.75   43.75   51.25   40.00   25.00   85.00
1.2.1. Quality of the help components   0.00    0.00    0.00    0.00    0.00    0.00    0.00    100.00
1.2.2. Last-update indicator            0.00    0.00    0.00    0.00    0.00    13.00   0.00    25.00
1.2.3. Address directory                100.00  100.00  100.00  100.00  100.00  100.00  100.00  100.00
1.2.4. FAQ component                    0.00    0.00    0.00    0.00    75.00   0.00    0.00    100.00
1.2.5. Questionnaire component          50.00   0.00    75.00   75.00   75.00   50.00   0.00    100.00
1.3. Interface and aesthetic aspects    87.81   69.75   86.54   84.46   77.83   52.46   86.99   97.51
1.4. Miscellaneous aspects              60.00   23.40   36.00   36.00   1.90    7.30    1.80    15.60
2. Functionality                        63.24   27.81   29.10   28.38   55.53   19.24   26.38   82.86
2.1. Search facilities                  62.50   0.00    0.00    0.00    75.00   0.00    0.00    100.00
2.2. Navigation capabilities            63.56   40.97   43.00   40.72   37.63   27.63   41.69   67.68
2.2.1. Scoped navigability              13.00   13.00   13.00   10.00   10.00   6.00    13.00   44.00
2.2.2. Global navigability              78.00   0.00    0.00    0.00    0.00    0.00    0.00    50.00
2.2.3. Navigation controls              84.25   76.38   83.50   82.00   81.63   64.75   82.00   87.48
2.2.4. Navigational prediction          96.88   90.63   91.88   85.63   70.63   48.75   86.25   100.00
2.3. Domain-specific functions          64.06   57.10   59.50   60.47   52.38   40.94   48.50   78.94
2.3.1. Content relevance                100.00  84.00   90.00   93.00   91.00   69.00   89.00   100.00
2.3.2. Link relevance                   96.00   83.00   90.00   73.00   60.00   51.00   88.00   100.00
2.3.3. E-commerce                       0.00    25.40   0.00    18.36   0.00    0.00    0.00    55.76
2.3.4. Image facilities                 50.00   25.00   50.00   50.00   50.00   37.50   0.00    50.00
3. Reliability                          98.76   97.06   100.00  73.00   94.04   99.04   100.00  100.00
3.1.1. Link errors                      99.60   97.60   100.00  70.00   93.40   98.40   100.00  100.00
3.1.2. Misc. errors and drawbacks       97.50   96.25   100.00  77.50   95.00   100.00  100.00  100.00
4. Efficiency                           32.75   12.28   28.38   22.25   51.13   70.50   23.50   85.25
Global preferences                      58.99   53.01   60.58   52.41   61.64   50.16   56.31   83.69

Table 4.10 and Fig. 4.3 present the final results for the quality characteristics
obtained in the evaluation process, while Table 4.11 gives the results obtained by
each evaluator.

Table 4.10 Quality characteristics and global preferences for the museum websites
Characteristics      Museum1 Museum2 Museum3 Museum4 Museum5 Museum6 Museum7 NGA
1. Usability         51.08   38.53   62.99   55.19   45.36   34.61   52.91   84.61
2. Functionality     63.24   27.81   29.10   28.38   55.53   19.24   26.38   82.86
3. Reliability       98.76   97.06   100.00  73.00   94.04   99.04   100.00  100.00
4. Efficiency        32.75   12.28   28.38   22.25   51.13   70.50   23.50   85.25
Global preferences   58.99   53.01   60.58   52.41   61.64   50.16   56.31   83.69

[Fig. 4.3: horizontal bar chart of the global preferences (%) for each of the 8 museum
sites, with the values from Table 4.10.]

Fig. 4.3 Graphical representation of the final results of the evaluation process

Table 4.11 Final results obtained by each evaluator


              Museum1 Museum2 Museum3 Museum4 Museum5 Museum6 Museum7 NGA
Evaluator 1   54.91   51.27   60.69   52.70   63.80   54.16   55.11   84.25
Evaluator 2   65.87   56.33   60.52   50.33   61.61   49.22   59.08   82.26
Evaluator 3   60.14   52.82   60.94   55.33   57.87   43.38   54.95   83.25
Evaluator 4   55.02   51.63   60.16   51.28   63.27   53.88   56.10   84.99

4.6 ANALYSIS AND INTERPRETATION OF THE RESULTS


The analysis and comparison of the final results of the experiment allow a number
of observations and conclusions, presented below.

(1) Findings on the quality characteristics


The Usability characteristic
The score obtained for the Usability characteristic by each museum is shown
in Fig. 4.4.
− a rather small percentage of the sites (37%) provide a site map (1.1.1.1.) or a
global index (1.1.1.2.);
[Fig. 4.4: bar chart of the Usability score for each of the 8 museum sites.]

Fig. 4.4 The Usability scores

− all the sites provide a table of contents (1.1.1.3.) and text labels (1.1.2.1.);
− only 37% of them offer conventional tours (1.1.3.1.), whereas 87% provide
virtual tours (1.1.3.2.);
− floor and room plans (1.1.4.) are present in only 50% of the cases;
− none of the Romanian museums analysed provides explanatory help for the site
(1.2.1.1.) or search help (1.2.1.2.), whereas the National Gallery of Art site does
offer these attributes;
− the global (1.2.2.1.) and local (1.2.2.2.) update indicators are missing from all 8
sites studied;
− the address directory (1.2.3.) is complete in all cases;
− the FAQ component (1.2.4.) is present in only 25% of the cases;
− the questionnaire component (1.2.5.) exists in 75% of the cases;
− the aesthetic preferences (1.3.3.) score above 80% in 62% of the cases;
− style uniformity (1.3.4.) can be observed in 87% of the sites;
− foreign-language support (1.4.1.) varies greatly from one case to another (full
support for 3 foreign languages in 12% of the cases, partial support for 3 foreign
languages in 12% of the cases, full support for 2 foreign languages in 25% of the
cases, and extremely limited support in 37% of the cases);
− download facilities (1.4.2.) are missing from all 7 Romanian sites.

The Functionality characteristic
The score obtained for the Functionality characteristic by each museum is shown
in Fig. 4.5.
− scoped search mechanisms (2.1.1.1.) are available in 37% of the cases, while
global search mechanisms (2.1.1.2.) exist in only 25% of the sites;
− all 8 sites received satisfactory scores (over 60%) for the Navigation controls
(2.2.3.) and Navigational prediction (2.2.4.) attributes;
− regarding the level of scoped interconnection (2.2.1.1.), all 8 sites received the
minimum score, and for orientation (2.2.1.2.) 87% of the sites are around 25%,
the exception being the National Gallery of Art site, which received 90%;
[Fig. 4.5: bar chart of the Functionality score for each of the 8 museum sites.]

Fig. 4.5 The Functionality scores


− content relevance (2.3.1.) and link relevance (2.3.2.) received scores generally
in the 70–100% range;
− e-commerce facilities (2.3.3.) are available in only 25% of the cases;
− none of the 8 sites in the experiment has an image size indicator (2.3.4.1.),
whereas the image zoom facility is present on all of them.

The Reliability characteristic
The score obtained for the Reliability characteristic by each museum is shown in
Fig. 4.6.

[Fig. 4.6: bar chart of the Reliability score for each of the 8 museum sites.]

Fig. 4.6 The Reliability scores


− a relatively small number of link errors was recorded: 87% of the sites received
the maximum score for the Orphan links attribute (3.1.1.1.), 87% of the sites
obtained scores between 97% and 100% for the Invalid links attribute (3.1.1.2.),
and 75% of them received scores between 95% and 100% for the
Unimplemented links attribute (3.1.1.3.);
− for Miscellaneous errors and drawbacks (3.1.2.), 50% of the sites received the
maximum score and 37% a score between 80% and 100%; only one site was
rated low (45% and 65% respectively) for the attributes Number of unresponsive
web nodes (3.1.2.3.) and Number of destination nodes under construction (3.1.2.4.).

The Efficiency characteristic
The score obtained for the Efficiency characteristic by each museum is shown in
Fig. 4.7.
[Fig. 4.7: bar chart of the Efficiency score for each of the 8 museum sites.]

Fig. 4.7 The Efficiency scores

− support for a text-only version (4.1.1.) is present in only 37% of the cases;
− the Image title attribute (4.1.2.1.) received low scores (6–39%) in 75% of the
cases; the only site with a high value for this attribute (93%) is that of the
National Gallery of Art;
− readability (4.1.2.2.) received ratings between 73% and 100% in 87% of the cases.

(2) Findings on the evaluated sites


The evaluators were asked to rank the analysed websites according to their
personal opinion, so that this ranking could be compared with the one produced
by WebQEM. The results were centralised and an overall ranking was computed
by awarding a number of points (from 7 down to 0) for each position occupied
(from 1 to 8) in order of preference (see Tables 4.12 to 4.15; a short sketch after
Table 4.12 reproduces the point totals):

Table 4.12 Points computed from the evaluators' preferences


           Evaluator 1   Evaluator 2   Evaluator 3   Evaluator 4   Points
Museum 1   5             2             3             5             17
Museum 2   7             4             7             7             7
Museum 3   3             6             5             3             15
Museum 4   8             7             2             2             13
Museum 5   2             3             4             8             15
Museum 6   6             5             8             4             9
Museum 7   4             8             6             6             8
NGA        1             1             1             1             28
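The point totals in Table 4.12 follow the rule stated above (7 points for first place
down to 0 for eighth, i.e. points = 8 − position); a small sketch in Python, using
the positions of Museum 1 and of the NGA copied from the table, reproduces their
totals of 17 and 28 points:

def ranking_points(positions, n_sites=8):
    # Points for one museum: each evaluator's position (1 = best) is
    # worth n_sites - position points (7 for 1st place, 0 for 8th).
    return sum(n_sites - p for p in positions)

print(ranking_points([5, 2, 3, 5]))   # Museum 1 -> 17
print(ranking_points([1, 1, 1, 1]))   # NGA      -> 28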

Table 4.13 Ranking based on the evaluators' preferences


Museum     Points obtained
NGA        28
Museum 1   17
Museum 3   15
Museum 5   15
Museum 4   13
Museum 6   9
Museum 7   8
Museum 2   7

Table 4.14 Points obtained using the WebQEM method


           Evaluator 1   Evaluator 2   Evaluator 3   Evaluator 4   Points
Museum 1   5             2             3             5             17
Museum 2   8             6             7             7             4
Museum 3   3             4             2             3             20
Museum 4   7             7             5             8             5
Museum 5   2             3             4             2             21
Museum 6   6             8             8             6             4
Museum 7   4             5             6             4             13
NGA        1             1             1             1             28

Table 4.15 Ranking obtained using the WebQEM method


Museum     Score obtained with WebQEM   Points from position ranking
NGA        83.69                        28
Museum 5   61.64                        21
Museum 3   60.58                        20
Museum 1   58.99                        17
Museum 7   56.31                        13
Museum 2   53.01                        4
Museum 4   52.41                        5
Museum 6   50.16                        4

The analysis of the evaluation results leads to the following observations:


The National Gallery of Art site received satisfactory scores (between 80% and
100%) for all 4 characteristics considered.
The Museum 1 site obtained variable scores (between 32% and 99%), ranking
4th according to WebQEM and 3rd in the evaluators' preferences.
This site needs to be improved, especially for the attributes related to the
Efficiency characteristic (32.75%). For example, according to the data
presented in Table 4.8, support for a text-only version (attribute 4.1.1.)
should be added.
The Usability score (51.08%) also calls for improvement, as does
Functionality (detailed in Table 4.6): a global search function (2.1.1.2.)
should be added, the navigation facilities (2.2.1.1.) improved, and an
e-commerce function (2.3.3.) and an image size indicator (2.3.4.1.)
added.
The Museum 2 site obtained highly variable scores (between 12% and 97%); it
ranks 6th both in the WebQEM hierarchy and in the evaluators' preferences.
The Efficiency characteristic recorded the lowest score (12.28%),
followed by Functionality (rated 27.81%) and Usability (38.53%).
The only characteristic with a high value (97.06%) is Reliability
(Table 4.7), because the numbers of orphan, invalid or unimplemented
links, as well as of browser-related deficiencies, are extremely small.
For Efficiency (Table 4.8), on the other hand, support for a text-only
version (4.1.1.) should be added and the readability with images
disabled in the browser (4.1.2.) improved.
As far as Functionality is concerned (Table 4.6), site search
mechanisms (2.1.1.) should be implemented and the navigation
capabilities (2.2.) improved.
The Museum 3 site likewise obtained highly variable scores (between 28% and
100%); its position in both rankings is 3rd.
There is a large gap between the low scores obtained for the Efficiency
and Functionality characteristics (28.38% and 29.10% respectively) and
the 62.99% for Usability and 100% for Reliability. The low values are
mainly due to the fact that this site offers no search facilities (2.1.1.)
and no e-commerce facilities (2.3.3.).
The Museum 4 site ranks 7th according to WebQEM and 5th in the evaluators'
ranking. It obtained variable scores (between 22% and 73%).
The problem characteristics in this case are Efficiency (22.25%) and
Functionality (28.38%), while the best rated characteristic is Reliability
(73.00%). The lack of support for a text-only version (4.1.1.), the low
readability of image titles (4.1.2.1.), the absence of a site map (1.1.1.1.),
of a global index (1.1.1.2.) and of a search mechanism (2.1.1.), and the
deficient security of online transactions (2.3.3.2.) are aspects on which
the site's developers should focus their attention.
The Museum 5 site obtained variable scores (between 45% and 94%); it ranks
2nd in the WebQEM hierarchy and 3rd–4th, tied with Museum 3, in the
evaluators' preferences.
The weakest characteristic is Usability (45.36%), followed closely by
Efficiency (51.13%), while the Reliability characteristic receives 94.04%.
The site has no site map (1.1.1.1.), no global index (1.1.1.2.) and no
e-commerce facilities (2.3.3.).
The Museum 6 site obtained highly variable scores (between 19% and 99%);
it occupies the last position in the WebQEM ranking and the 6th position in the
evaluators' ranking.
There is a large gap between the extremely low score obtained for the
Functionality characteristic (19.24%) and the score for Reliability
(99.04%). This is due to the absence of site search mechanisms (2.1.1.),
of a site map (1.1.1.1.), of a global index (1.1.1.2.) and of e-commerce
facilities (2.3.3.), as well as to the poor level of scoped interconnection
(2.2.1.1.).
The site has high reliability (99.04%), the link errors and the
browser-related deficiencies being very few.
The Museum 7 site obtained highly variable scores (between 26% and 100%);
it ranks 5th according to the WebQEM method and 7th in the evaluators'
preferences.
The Efficiency and Functionality characteristics recorded the lowest
scores (23.50% and 26.38%), because the site has no support for a
text-only version (4.1.1.), no search mechanisms (2.1.1.) and no
e-commerce facilities (2.3.3.). On the other hand, the site obtained the
maximum score for the Reliability characteristic.

(3) Findings on the rankings obtained with the WebQEM method and from the
evaluators' preferences

The ranking of the 8 sites produced by the WebQEM method (Table 4.15) places
the National Gallery of Art first, with a score of 83.69% (28 points); it occupies
the same position, with the same number of points, in the ranking of the
evaluators' preferences (Table 4.13), at a considerable distance from the
runner-up.
Positions 2, 3 and 4 are occupied in both rankings by Museums 1, 3 and 5, with
similar point totals (17, 15 and 15 in the evaluators' ranking, and 17, 20 and 21 in
the WebQEM ranking); the difference is that the evaluators place them in the
order Museum 1, Museum 3, Museum 5, whereas WebQEM rates them on
positions 4, 3 and 2 respectively.
In the evaluators' view, position 5 is occupied by Museum 4 (with 13 points),
whereas from the WebQEM perspective this position is held by Museum 7 (with
13 points).
The last 3 positions have similar point totals and are distributed as follows:
Museums 6, 7 and 2 (with 9, 8 and 7 points) in Table 4.13, and Museums 2, 4 and
6 (with 4, 5 and 4 points) in Table 4.15.
The differences between the ranking of the evaluators' preferences and the
ranking based on the WebQEM method are relatively small: no site differs by
more than 2 positions between the two rankings.

4.7 CONCLUSIONS
The purpose of experimenting with the WebQEM method on a sample of 8 museums
was to demonstrate the applicability of this analysis method to sites in the field of
culture. A further aim was to draw conclusions and recommendations on how to
conduct an experimental evaluation of quality analysis methods.
In addition to the 7 sites selected from the Romanian cultural landscape, the
research team also analysed the site of the National Gallery of Art, in order to
compare the results obtained by the Romanian specialists with those reported in
[10] and [11]; the comparison is presented in Table 4.16.
There is a difference of 4.43% between the final scores obtained in the two
evaluations. Analysing the data in Table 4.16, it can be seen that this difference
comes mainly from the values computed for the Usability characteristic.

Table 4.16 Quality characteristics and global preferences for the National Gallery of Art site
Characteristics      Evaluation carried out in this experiment   Evaluation reported in [10]
1. Usability         84.61                                       70.39
2. Functionality     82.86                                       80.41
3. Reliability       100.00                                      89.67
4. Efficiency        85.25                                       80.00
Global preferences   83.69                                       79.26

If the data in Table 4.5 are compared with the corresponding values in [8] and


[10], it follows that the Romanian evaluators gave higher scores to the attributes
(1.1.1.1.), (1.1.2.2.), (1.3) and (1.4.2.). This is because, between the date of the
original evaluation (the year 2000, in [10]) and the experiment carried out by the
Romanian specialists, the developers of that site made changes that improved its
performance (for example, a site map and download facilities were added).
The results obtained are an encouraging start in supporting the difficult process
of building high-performance websites that satisfy, to as high a degree as possible,
the “ordinary visitor's” requirements of usability, functionality, reliability and
efficiency.

Bibliografie
1. *** IEEE Web Publishing Guide, http://www.ieee.org/web/developers/style/
2. *** IEEE Std 1061-1992, IEEE Standard for a Software Quality Metrics Methodology.
3. *** ISO/IEC 9126-1991 Information technology – Software product evaluation –
Quality characteristics and guidelines for their use.
4. *** ISO/IEC 14598-5:1998 Information Technology – Software product Evaluation –
Part 5: Process for evaluators.
5. *** W3C, 1998, W3C Working Draft, “WAI Accessibility Guidelines: Page
Authoring”, http://www.w3c.org/TR/1998/WD-WAI-PAGEAUTH-19980918/
6. Balog, Al. şi colectiv. Specificarea sistemului de metode şi proceduri de măsurare şi
evaluare a calităţii produselor / serviciilor. Raport de cercetare, Programul CALIST,
Contract C3505, Faza 2, iunie 2003.
7. Dujmovic, J.J. A Method for Evaluation and Selection of Complex Hardware and
Software Systems, The 22nd Int'l Conference for the Resource Management and
Performance Evaluation of Enterprise CS. CMG 96 Proceedings, Vol. 1, pp. 368-378.
8. Olsina, L. Quantitative Methodology for Evaluation and Comparison of Web Site
Quality, Doctoral Thesis (in Spanish), UNLP, La Plata, Argentina, 2000.
9. Olsina, L. Web site Quantitative Evaluation and Comparison: a Case Study on
Museum, ICSE `99, Workshop on Software Engineering over the Internet, Los
Angeles, SUA, 1999.
10. Olsina, L. Strategies for Quality Assessment of WebApps. ICWE’02, Second Ibero-
American Conference on Web Engineering 10-12 September 2002; Santa Fe de la Vera
Cruz, Santa Fe.
11. Olsina, L., Rossi, G. A Quantitative Method for Quality Evaluation of Web Sites and
Applications. IEEE Multimedia Magazine, Oct. 2002, pp. 20-29.
WEBQEM

Web Quality Evaluation Method


Introduction
o WebQEM is a method of quantitative evaluation and
comparison of the quality of websites
o Developed between 1998-2000 by a group of
researchers at the National University La Pampa in
Argentina, led by Prof. Luis Olsina
o Based on international and professional quality
standards, as well as the concepts, principles and
techniques specific to the field of software quality
assessment:
n Software quality assessment methodology, IEEE Standard
1061-1992
n Software quality model, Std. ISO 9126
n Model of the quality evaluation process, Std. ISO 14598

User Interface Design - UTCN 2


Introduction
o WebQEM method has been extended by:
n Logic Scoring of Preference (LSP) for evaluation and
selection of complex hardware-software systems,
developed by Dujmovic
n IEEE recommendations regarding the development of web
pages
n W3C recommendations on web accessibility

User Interface Design - UTCN 3


WebQEM phases
o The method consists of four major correlated phases:
1. Defining and specifying quality requirements
2. Elementary evaluation
3. Global evaluation
4. Analysis and documentation of results, conclusions, and
recommendations

User Interface Design - UTCN 4


Phase 1: Quality requirements definition
o Main activities:
1. Defining the domain of the site
e.g. e-commerce, financial information, entertainment,
education, health, etc.

2. Defining the objectives of the evaluation


e.g. understanding the quality, comparing the quality,
improving the quality, etc.

3. Defining the perspective from which the evaluation is made


e.g. regular or expert visitor, administrator, designer

4. Specification of quality characteristics and measurable


attributes
e.g. quality model, quality characteristics, level of importance
of the characteristics

o Result: document on Quality Requirements Specification

User Interface Design - UTCN 5


Phase 2: Elementary evaluation
o Main activities:
1. Designing the elementary evaluation
Standard ISO 14598
Select a set of metrics

2. Implementing the elementary evaluation


Consists of a set of actions for measuring site attributes
(defined and specified above)

User Interface Design - UTCN 6


Phase 2: Design elementary evaluation
Ai - Measurable attribute
Xi - Variable
EQi - Elementary quality criteria

Variable Xi Quality criteria EQi


Xi = 0             EQi = 1 (i.e. 100%)
Xi ≥ Xmax          EQi = 0 (i.e. 0%)
0 < Xi < Xmax      EQi = (Xmax - Xi)/Xmax

Level of acceptability:
Unsatisfactory 0% - 40%
Medium 40% - 60%
Satisfactorily 60%-100%

User Interface Design - UTCN 7


Phase 3: Global evaluation
o Main activities:
1. Designing the global evaluation
Standard ISO 14598
Selects one or more criteria for aggregating elementary
preferences, as well as scoring scales.
Two models: additive linear models and non-linear
multi-criteria models.
Weights for expressing the importance of preferences.

2. Implementing the global evaluation


Apply the aggregation of the elementary preferences.

User Interface Design - UTCN 8


Phase 4: Documentation, conclusions
o Main activities:
1. During the previous phases, all the results of calculations
and evaluations are recorded, including the relevant
information for the user

User Interface Design - UTCN 9


WebQEM Experiments on Web Sites

User Interface Design - UTCN 10


WebQEM evaluation objectives
o Test the applicability of the method to measuring and
evaluating websites in the field of culture
o Test and validate the criteria for evaluating websites
specified in the WebQEM method
o Formulate new requirements regarding the website
evaluation criteria in general, and for websites in the
field of culture in particular

User Interface Design - UTCN 11


WebQEM evaluation conditions
o 7 museums (Romania) + Gallery of Art (USA)

o Duration: 2 months
o Evaluation team: 4 specialists in the domain of Web site
development, and software quality evaluation

User Interface Design - UTCN 12


Phase 1: Quality characteristics
o Functionality, usability, reliability, efficiency
o Standard ISO 9126

User Interface Design - UTCN 13


Phase 1: Quality characteristics

User Interface Design - UTCN 14


Phase 1: Quality characteristics

User Interface Design - UTCN 15


Phase 2: Elementary evaluation
Usability: Design the variable computation

User Interface Design - UTCN 16


Phase 2: Elementary evaluation
Functionality : Design the variable computation

User Interface Design - UTCN 17


Phase 2: Elementary evaluation
Reliability: Design the variable computation

User Interface Design - UTCN 18


Phase 2: Elementary evaluation
Efficiency: Design the variable computation

User Interface Design - UTCN 19


Phase 2: Elementary evaluation
Usability: Implement the variable computation

User Interface Design - UTCN 20


Phase 2: Elementary evaluation
Functionality : Implement the variable computation

User Interface Design - UTCN 21


Phase 2: Elementary evaluation
Reliability: Implement the variable computation

User Interface Design - UTCN 22


Phase 2: Elementary evaluation
Efficiency: Implement the variable computation

User Interface Design - UTCN 23


Phase 3: Global evaluation


User Interface Design - UTCN 24


Phase 3: Global evaluation

User Interface Design - UTCN 25


Phase 3: Global evaluation

User Interface Design - UTCN 26


Phase 3: Global evaluation
o Computation of the global quality characteristics

Museums

Global preferences [%]

User Interface Design - UTCN 27


Phase 3: Global evaluation
o Computation of the global quality characteristics given
by each evaluator

User Interface Design - UTCN 28


Phase 4: Documentation, conclusions
1. Conclusions on the quality characteristics
2. Conclusions on the evaluated sites
3. Conclusions regarding the hierarchy obtained by the
WEBQEM method and by the preferences of the
evaluators

User Interface Design - UTCN 29


Phase 4.1: Quality characteristics
o Usability

Sample of conclusions:
o There is a fairly small percentage of sites that provide a map
(1.1.1.1) or a global index (1.1.1.2), 37%
o All sites have a table of contents (1.1.1.3) and text labels (1.1.2.1)
o Only 37% of the sites have facilities for viewing conventional
tours (1.1.3.1), while 87% offer virtual tours (1.1.3.2)

User Interface Design - UTCN 30


Phase 4.1: Quality characteristics
o Floor and room plans (1.1.4) are present in only 50% of cases
o None of the analyzed Romanian museums offers explanatory
help for the site (1.2.1.1) or search help (1.2.1.2), but the
NGA museum site has these features
o Global (1.2.2.1) and local (1.2.2.2) update indicators are
missing in all 8 sites analyzed
o The list of addresses (1.2.3) is complete in all cases
o The questionnaire component (1.2.5) exists in 75% of the
cases

User Interface Design - UTCN 31


Phase 4.1: Quality characteristics
o Functionality

Sample of conclusions:
o Punctual search mechanisms (2.1.1.1) are available in 37% of
cases, while global search mechanisms (2.1.1.2) exist in only
25% of sites
o All 8 sites were evaluated with satisfactory values (over 60%)
for the attributes Navigation Controls (2.2.3) and navigation
anticipation (2.2.4)
User Interface Design - UTCN 32
Phase 4.1: Quality characteristics
o From the point of view of the level of punctual interconnectivity
(2.2.1.1), all the 8 sites received a minimum score, and
regarding the orientation (2.2.1.2) 87% of the sites are around
25%, with the exception of the NGA that received 90%
o E-commerce functions (2.3.3) are only available in 25% of
cases
o None of the 8 sites have an indicator for the size of the image
(2.3.4.1), instead the zoom facility for the image is present in
all these sites

User Interface Design - UTCN 33


Phase 4.1: Quality characteristics
o Reliability

Sample of conclusions:
o There was a reduced number of link errors: 87% of the sites
received maximum score for the Isolated links attribute
(3.1.1.1), 87% of the sites were evaluated with scores
between 97-100%, for the Invalid links attribute (3.1.1.2), and
75% of the total received scores between 95-100% for the
attribute of Unimplemented links (3.1.1.3)

User Interface Design - UTCN 34


Phase 4.1: Quality characteristics
o Efficiency

Sample of conclusions:
o Text version support (4.1.1) is present in only 37% of cases
o The Image title attribute (4.1.2.1) was evaluated with low
scores in 75% of cases; the only site that obtained a high
value for this attribute is the NGA museum
o Readability (4.1.2.2) received ratings from 73-100% in 87% of
cases
User Interface Design - UTCN 35
Phase 4.2: Evaluated sites
o Score based on the evaluators' preferences

User Interface Design - UTCN 36


Phase 4.2: Evaluated sites
o The score obtained using the WEBQEM method

User Interface Design - UTCN 37


Phase 4.2: Evaluated sites
o Museum 1

Sample of conclusions:
o The site has obtained variable scores (32-99%), ranking 4th
place according to WebQEM and 3rd place among the
evaluators' preferences
o This site should be improved, especially with the attributes that
refer to the Efficiency feature (32.75%). For instance,
according to the data presented in table Elementary evaluation
on Efficiency, it is necessary to add the support for the text
version (4.1.1)
o Regarding the Usability (51.08%) and Functionality scores,
the site needs a global search function (2.1.1.2), improved
navigation (2.2.1.1), an e-commerce function (2.3.3), and
an image size indicator (2.3.4.1).

User Interface Design - UTCN 38


Phase 4.2: Evaluated sites
o Museum 2

Sample of conclusions:
o The site has obtained variable scores (12-97%), ranking 6th
place according to WebQEM and among the evaluators'
preferences
o The Efficiency feature has the lowest score (12.28%), followed
by the Functionality feature (27.81%) and the Usability feature
(38.53%).
o ...

User Interface Design - UTCN 39


Phase 4.3: Regarding the hierarchy
Conclusions regarding the hierarchy obtained by the
WEBQEM method and by the preferences of the evaluators
o The evaluation ranks the Gallery of Art museum in the
first position, with 28 points and a score of 83.69%
o Positions 2, 3 and 4 are occupied by Museums 1, 3 and 5,
with 17, 15 and 15 points in the evaluators' preference
ranking; the WebQEM method places the same museums
on positions 4, 3 and 2, with 17, 20 and 21 points
o On position 5 there is Museum 4 (13 points) by the
evaluators' preferences, versus Museum 7 (13 points) by
the WebQEM method
o On positions 6, 7 and 8 there are Museums 6, 7 and 2 (9,
8 and 7 points) by the evaluators' preferences, versus
Museums 2, 4 and 6 (4, 5 and 4 points) by the WebQEM method
o There are no big differences between the two rankings

User Interface Design - UTCN 40


Conclusions
o The Gallery of Art site was evaluated in two different
years (2002 and 2012), with different results
o The site has been improved since 2002

Characteristics        Evaluation in 2012   Evaluation in 2002
1. Usability           84.61                70.39
2. Functionality       82.86                80.41
3. Reliability         100.00               89.67
4. Efficiency          85.25                80.00
Global preferences     83.69                79.26

User Interface Design - UTCN 41


Logic Scoring of Preference (LSP)
o Logic Scoring of Preference (LSP) - a general
quantitative decision method for evaluation, comparison
and selection of complex hardware and software
systems
o LSP method is a generalization and an extension of
various scoring techniques
o Mathematical background is a continuous preference
logic

User Interface Design - UTCN 42


LSP example

User Interface Design - UTCN 43


Logic Scoring of Preference (LSP)

o k elementary variables -> k elementary preferences (ei)


-> global preference (e0)
o significance of elementary preferences by weights (Wi)
o d describes the position of e0 between emin and emax. It is
called disjunction degree
o r is real number describing the logical properties of the
aggregation function, r=p(d,k). The computation of r for
k=2 is by:

User Interface Design - UTCN 44


Logic Scoring of Preference (LSP)

o k elementary variables -> k elementary preferences (ei)


-> global preference (e0)
o significance of elementary preferences by weights (Wi)
o d describes the position of e0 between emin and emax. It is
called disjunction degree
o r is real number describing the logical properties of the
aggregation function, r=p(d,k). The computation of r for
k=2 is by:

User Interface Design - UTCN 45


Continuous Preference Logic (CLP)

o andor - generalized conjunction/disjunction


o 0 < d < 0.5: quasi-conjunction (QC), which has properties
similar to the pure conjunction.
o QC models situations where we need a certain degree of
simultaneity in satisfying the input elementary criteria
and want to penalize systems that cannot simultaneously
satisfy the input criteria.
o Penalty effect: strong (0 < d < 0.25), weak (d close to 0.5)

User Interface Design - UTCN 46


ANDOR function

User Interface Design - UTCN 47


Example of QC function C-+

input preferences: x1, x2, x3


global preference z
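A sketch of such an example (Python, illustrative only): the andor operator can be
written as a weighted power mean of the input preferences; the exponent values
used below are placeholders and do not reproduce Dujmovic's tabulated
parameters for C-+, A or D-+.

def andor(prefs, weights, r):
    # Generalized conjunction/disjunction: a weighted power mean of
    # the input preferences (each in (0, 1]; r <= 0 needs positive inputs).
    return sum(w * e ** r for w, e in zip(weights, prefs)) ** (1.0 / r)

x = [0.9, 0.6, 0.8]     # input preferences x1, x2, x3
w = [0.4, 0.3, 0.3]     # their relative weights (sum to 1)

# Sweeping r moves the operator from quasi-conjunction (low r, penalises
# the weakest input) through neutrality (r = 1, the weighted arithmetic
# mean) towards quasi-disjunction (large r, rewards the strongest input).
for r in (-2.0, 0.4, 1.0, 3.0):
    print(r, round(andor(x, w, r), 3))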

User Interface Design - UTCN 48


Example of QC function C-+

User Interface Design - UTCN 49


QD – andor function
o 0.5<d<1, quasi disjunction (QD) because it has
properties similar to the pure disjunction.
o QD is used to model situations where we have a certain
degree of replaceability in satisfying the input
elementary criteria and want to penalize only those
systems that cannot satisfy any of the input criteria.
o High level of replaceability is expressed by large values
of d, 0.75<d<1
o A low level of replaceability corresponds to values of d
slightly above 0.5

User Interface Design - UTCN 50


Arithmetic mean - A andor function
o For d=0.5 andor is called the neutrality function because
it is located right in the middle between the conjunction
and the disjunction.
o It is used when we want to produce a criterion function
having a perfectly balanced mix of conjunctive and
disjunctive properties.
o This is the traditional weighted arithmetic mean
o A combination of various andor functions can be used to
create more complex aggregation structures.

User Interface Design - UTCN 51


References
1. Dujmovic, J. 1996. A Method of Evaluation and Selection of Complex
Hardware and Software Systems. 22nd International Conference for
the Resource Management and Performance Evaluation of Enterprises.
368-378.
2. Olsina L., Godoy D., Lafuente G., and Rossi G. 2001. Specifying Quality
Characteristics and Attributes for Websites. Lecture Notes in Computer
Science. 2016(1): 266-278.
3. Olsina L. 1998. Website Quantitative Evaluation and Comparison:
Workshop on Software Engineering Over the Internet: International
Conference on Software Engineering.

User Interface Design - UTCN 52


Feature Article

Measuring Web Application Quality with WebQEM

Luis Olsina, La Pampa National University, Argentina
Gustavo Rossi, La Plata National University, Argentina

This article discusses using WebQEM, a quantitative evaluation strategy to assess Web site and application quality. Defining and measuring quality indicators can help stakeholders understand and improve Web products. An e-commerce case study illustrates the methodology's utility in systematically assessing attributes that influence product quality.

The Web plays a central role in such diverse application domains as business, education, industry, and entertainment. Its growing importance heightens concerns about Web application development methods and argues for systematic, disciplined use of engineering methods and tools.1 In particular, we need sound evaluation methods for obtaining reliable information about product quality. These methods should identify attributes and characteristics that can serve as meaningful indicators for specific evaluation goals given a user viewpoint.
This article discusses the Web Quality Evaluation Method2 and some aspects of its supporting tool, WebQEM_Tool.3 Using WebQEM to assess Web sites and applications supports efforts to meet quality requirements in new Web development projects and evaluate requirements in operational phases. It also helps us discover absent features or poorly implemented requirements such as interface-related design and implementation drawbacks or problems with navigation, accessibility, search mechanisms, content, reliability, and performance.
We follow common practice in describing software quality in terms of quality characteristics as defined in the ISO/IEC 9126-1 standard.4 The literature often characterizes quality, cost, or productivity requirements as nonfunctional, and measuring these less tangible characteristics directly isn't practical, but we can assess them by measuring the product's "lower abstraction attributes."5 We see attributes as measurable properties of an entity—here, a Web application—and propose using a quality model (in the form of a quality requirement tree) to specify them.
In this context, stakeholders must focus on characteristics and attributes that influence product quality and "quality in use."4 (Ensuring high product quality doesn't always suffice to guarantee quality in use, but such discussion exceeds this article's scope.) Specifically, characteristics that influence product quality as prescribed in the ISO 9126-1 standard include usability, functionality, reliability, efficiency, portability, and maintainability. To specify the quality requirement tree for a given assessment goal and user viewpoint, we should consider such diverse attributes as broken links, orphan pages, quick access pages, table of contents, site map, link color uniformity, and main control permanence. Of course, we recognize how difficult it is to design a rigorous nonfunctional requirement model that provides a strong correlation between attributes and characteristics.
Though our method works for assessing all aspects of Web sites and applications, we focus on user-perceptible product features such as navigation, interface, and reliability rather than product attributes such as code quality or design. That is, we consider Web site characteristics and attributes from a general visitor viewpoint.

The WebQEM evaluation process
The WebQEM process includes four major technical phases:
1. Quality requirements definition and specification
2. Elementary evaluation (design and implementation stages)
3. Global evaluation (design and implementation stages)
4. Conclusion (recommendations)
Figure 1 shows the evaluation process underlying the methodology, including the phases, stages, main steps, inputs, and outputs. This model follows the ISO's process model for evaluators.5



Figure 1. The evaluation processes underlying the WebQEM methodology: from the Web audience's needs and the nonfunctional requirements (quality requirements definition and specification), through elementary and partial/global evaluation (elementary and global preference criteria definition, metric selection, aggregation schema, measurement and preference implementation), to the conclusion and documentation of the evaluation, yielding recommendations.

Quality requirements definition and specification
In this phase, evaluators clarify the evaluation goals and the intended user viewpoint. They select a quality model, for instance, the ISO-prescribed characteristics in addition to attributes customized to the Web domain. They then identify these components' relative importance to the intended Web audience and the extent of coverage required.
The user profile may entail three abstract evaluation categories—visitor, developer, and manager—that we can break into subcategories. For example, the visitor category can include general and expert visitor subcategories. Once we've defined the domain and product descriptions, agreed goals, and selected user view (that is, the explicit and implicit user needs), we can specify characteristics, subcharacteristics, and attributes in a quality requirement tree. This phase yields a quality requirement specification document.

Elementary evaluation
This phase defines the two major stages that Figure 1 depicts: elementary evaluation design and implementation.
For each measurable attribute Ai from the requirement tree, we can associate a variable Xi, which will take a numerical value from a direct or indirect metric. However, because this metric's value won't represent the elementary requirement's satisfaction level, we need to define an elementary criterion function that will yield an elementary indicator or preference value.
For instance, consider the Broken Links attribute, which measures (counts) links that lead to missing destination pages. A possible indirect metric is X = #Broken_Links / #Total_Links_of_Site. Now, how do we interpret the measured value, and what are the best, worst, and intermediate preferred values? We can represent a possible criterion function to determine the elementary quality preference EP as such:

EP = 1 (100 percent) if X = 0; EP = 0 (0 percent) if X ≥ Xmax;
otherwise EP = (Xmax – X) / Xmax if 0 < X < Xmax,
where Xmax is some agreed upper threshold such as 0.04.

So the elementary quality preference EP is frequently interpreted as the percentage of a satisfied requirement for a given attribute and is defined in the range between 0 and 100 percent (so the scale type and the unit of metrics become normalized6). To simplify preference interpretation, we define three acceptability levels: unsatisfactory (0 to 40 percent), marginal (40 to 60 percent), and satisfactory (60 to 100 percent).
The implementation stage applies the selected metrics to the Web application as Figure 1 shows. We can measure some values observationally and obtain others automatically using computerized tools.

Global evaluation
This phase also has two major stages: design and implementation of the partial and global quality evaluation. We select aggregation criteria and a scoring model in the design stage. The quantitative aggregation and scoring models aim to make the evaluation process well structured, accurate, and comprehensible by evaluators. At least two types of models exist: those based on linear additive scoring models7 and those based on nonlinear multicriteria scoring models8 where different attribute and characteristic relationships can be designed. Both use weights to consider indicators' relative importance. For example, if our procedure is based on a linear additive scoring model, the aggregation and computation of partial/global indicators or preferences (P/GP), considering relative weights (W), is based on:

P/GP = (W1 EP1 + W2 EP2 + ... + Wm EPm)   (1)

such that if the elementary preference (EP) is in the unitary interval range, the following holds: 0 ≤ EPi ≤ 1, or, given a percentage scale, 0 ≤ EPi ≤ 100; and the sum of weights must fulfill (W1 + W2 + ... + Wm) = 1, with Wi > 0 for i = 1 ... m.
The basic arithmetic aggregation operator for inputs is the plus (+ or A) connector. We can't use Equation 1 to model input simultaneity or replaceability, among other limitations, as we discuss later.
Therefore, once we've selected a scoring model, the aggregation process follows the hierarchical structure as defined in the nonfunctional requirement tree (see Figure 2), from bottom to top. Applying a stepwise aggregation mechanism, we obtain a global schema. This model lets us compute partial and global indicators in the implementation stage. The global quality preference ultimately represents the global degree of satisfaction in meeting the stated requirements.

Concluding the evaluation
This phase documents Web product components, quality requirements, metrics, and criteria, and records elementary and final results as well. Requesters and evaluators can then analyze and understand the assessed product's strengths and weaknesses with regard to established goals and user viewpoint, and suggest and justify recommendations.

Automating the process using WebQEM_Tool
The evaluation and comparison processes require both methodological and technological support. We developed a Web-based tool3 to support the administration of evaluation projects. It permits editing and relating nonfunctional requirements.
For instance, in our e-commerce case study (which we discuss in the next section), we defined more than 90 attributes.2 Then, by automatically or manually editing elementary indicators, WebQEM_Tool aggregates the elements to yield a schema and calculates a global quality indicator for each site. This lets evaluators assess and compare Web product quality.
WebQEM_Tool relies on a Web-based hyperdocument model that supports traceability of evaluation aspects. It shows evaluation results using linked pages with textual, tabular, and graphical information, and dynamically generates pages with these results from tables stored in the data layer.
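As a rough illustration of the two pieces just described, here is a minimal Python sketch (not from the article; the function names are ours) of the Broken Links elementary criterion and of the linear additive scoring model of Equation 1.

# Sketch of the elementary preference criterion for the Broken Links attribute
# and of the linear additive scoring model (Equation 1).

def elementary_preference(x, x_max=0.04):
    """Map the metric X = broken_links / total_links to a preference in [0, 1]."""
    if x <= 0:
        return 1.0              # no broken links: fully satisfied
    if x >= x_max:
        return 0.0              # at or above the agreed threshold: unsatisfied
    return (x_max - x) / x_max  # linear interpolation in between

def additive_score(elementary_prefs, weights):
    """P/GP = W1*EP1 + ... + Wm*EPm, with weights summing to 1 (Equation 1)."""
    return sum(w * ep for w, ep in zip(weights, elementary_prefs))

# Example: 3 broken links out of 2,000 site links
ep_broken = elementary_preference(3 / 2000)       # about 0.9625, i.e. 96.25 percent
print(additive_score([ep_broken, 0.8, 0.55], [0.5, 0.3, 0.2]))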
Putting WebQEM to work
We've used WebQEM to evaluate sites in several domains9 and discuss here its application in an e-bookstore case study.2

About quality requirements
Many potential attributes, both general and domain specific, contribute to Web application quality. Figure 3 (next page) shows an e-store homepage (http://www.cuspide.com.ar) and highlights several attributes generally available on such sites. Figure 2 documents a wider list of tailorable quality requirements assuming a general visitor profile.

1. Usability
1.1 Global site understandability
1.1.1 Global organization scheme
1.1.1.1 Table of contents
1.1.1.2 Site map
1.1.1.3 Global indexes
1.1.1.3.1 Subject index
1.1.1.3.2 Alphabetical index
1.1.1.3.3 Chronological index
1.1.1.3.4 Geographical index
1.1.1.3.5 Other indexes (by audience, by format, or hybrid such as alphabetical and subject-oriented)
1.1.2 Quality of labeling system
1.1.3 Audience-oriented guided tour
1.1.3.1 Conventional tour
1.1.3.2 Virtual reality tour
1.1.4 Image map (metaphorical, building, campus, floor and room imagemaps)
1.2 Feedback and help features
1.2.1 Quality of help features
1.2.1.1 Global help (for first-time visitors)
1.2.1.2 Specific help (for searching, checking out)
1.2.2 Addresses directory
1.2.2.1 Email directory
1.2.2.2 Phone and fax directory
1.2.2.3 Post mail directory
1.2.3 Link-based feedback
1.2.3.1 FAQ feature
1.2.3.2 What's New feature
1.2.4 Form-based feedback
1.2.4.1 Questionnaire feature
1.2.4.2 Comments and suggestions
1.2.4.3 Subject-oriented feedback
1.2.4.4 Guest book
1.3 Interface and aesthetic features
1.3.1 Cohesiveness by grouping main control objects
1.3.2 Presentation permanence and stability of main controls
1.3.2.1 Direct controls permanence (main, search, browse controls)
1.3.2.2 Indirect controls permanence
1.3.2.3 Stability
1.3.3 Style issues
1.3.3.1 Links color style uniformity
1.3.3.2 Global style uniformity
1.3.4 Aesthetic preference
1.4 Miscellaneous features
1.4.1 Foreign language support
1.4.2 Web site last update indicator
1.4.2.1 Global
1.4.2.2 Scoped (per subsite or page)
1.4.3 Screen resolution indicator

2. Functionality
2.1 Searching and retrieving issues
2.1.1 Web site search mechanisms
2.1.1.1 Global search
2.1.1.2 Scoped search (such as museum collections, books, academic personnel)
2.1.2 Retrieve mechanisms
2.1.2.1 Level of retrieving customization
2.1.2.2 Level of retrieving feedback
2.2 Navigation and browsing issues
2.2.1 Navigability
2.2.1.1 Orientation
2.2.1.1.1 Indicator of path
2.2.1.1.2 Label of current position
2.2.1.2 Level of links per page
2.2.2 Navigational control objects
2.2.2.1 Presentation permanence and stability of contextual (subsite) controls
2.2.2.1.1 Contextual controls permanence
2.2.2.1.2 Contextual controls stability
2.2.2.2 Level of scrolling
2.2.2.2.1 Vertical scrolling
2.2.2.2.2 Horizontal scrolling
2.2.3 Navigational prediction
2.2.3.1 Link title (link with explanatory help)
2.2.3.2 Quality of link phrase
2.2.4 Browse mechanisms
2.2.4.1 Quick browse controls
2.3 Domain-specific functionality and content
Note: See, for example, the specification to e-bookstores in Figure 4.

3. Reliability
3.1 Nondeficiency (or Maturity)
3.1.1 Link errors
3.1.1.1 Broken links
3.1.1.2 Invalid links
3.1.1.3 Unimplemented links
3.1.2 Spelling errors
3.1.3 Miscellaneous errors or drawbacks
3.1.3.1 Deficiencies or absent features due to different browsers
3.1.3.2 Deficiencies or unexpected results (such as nontrapped search errors, frame problems) independent of browsers
3.1.3.3 Orphan pages
3.1.3.4 Destination nodes (unexpectedly) under construction

4. Efficiency
4.1 Performance
4.1.1 Quick static pages
4.2 Accessibility
4.2.1 Information accessibility
4.2.1.1 Support for text-only version
4.2.1.2 Readability by deactivating the browser image feature
4.2.1.2.1 Image title
4.2.1.2.2 Global readability
4.2.2 Window accessibility
4.2.2.1 Number of panes regarding frames
4.2.2.2 Nonframe version

Figure 2. Tailorable quality requirement tree for a general visitor standpoint. Italics in the original represent direct or indirect measurable attributes.
We developed the requirement tree shown in Figure 2 to be reusable among domains. For instance, the Usability characteristic splits into subcharacteristics such as global site understandability, feedback and help features, and interface and aesthetic features. The Functionality characteristic decomposes into searching and retrieving issues, navigation and browsing issues, and domain-specific functionality and content. Because this last tree component (where Functionality is the supercharacteristic) should be customized among domains, we don't intend it for wholesale reuse.
Figure 4 outlines the schema we used in the e-bookstore study. We identified five main e-store components:10 product information (2.3.1), purchase features (2.3.2), customer features (2.3.3), store features (2.3.4), and promotion policies (2.3.5).

Figure 3. A screen shot of Cúspide's homepage, with several attributes highlighted.

2.3 Domain-specific functionality and content (for e-bookstores)
2.3.1 Product information
2.3.1.1 Product description
2.3.1.1.1 Basic book description
2.3.1.1.2 Book content and structure
2.3.1.1.2.1 Book's table of contents
2.3.1.1.2.2 Content description
2.3.1.1.3 Product image
2.3.1.1.3.1 Image availability
2.3.1.1.3.2 Zooming
2.3.1.2 Price evaluation
2.3.1.2.1 Price comparison availability
2.3.1.3 Product rating availability
2.3.1.4 Related titles and authors' recommendations
2.3.1.5 Catalog download facility
2.3.2 Purchase features
2.3.2.1 Purchase mode
2.3.2.1.1 Online
2.3.2.1.1.1 Shopping basket
2.3.2.1.1.1.1 Shopping basket availability
2.3.2.1.1.1.2 Continue buying feedback
2.3.2.1.1.1.3 Edit and recalculate feature
2.3.2.1.1.2 Quick purchase (1-click or similar)
2.3.2.1.1.3 Checkout features
2.3.2.1.1.3.1 Checkout security
2.3.2.1.1.3.2 Canceling feedback
2.3.2.1.2 Offline
2.3.2.1.2.1 Printable checkout form
2.3.2.1.2.2 Fax, phone, or email purchase
2.3.2.2 Purchase policies
2.3.2.2.1 Purchase cancellation policy
2.3.2.2.2 Return policy information
2.3.2.2.3 Shipping and handling information
2.3.2.2.4 Payment policy information
2.3.2.2.5 Resent purchase (gift service)
2.3.3 Customer features
2.3.3.1 E-subscriptions
2.3.3.2 Customized recommendations
2.3.3.3 Account facility
2.3.3.3.1 Account availability
2.3.3.3.2 Account security
2.3.3.3.3 Account configuration
2.3.3.3.3.1 Order history and status
2.3.3.3.3.2 Account settings
2.3.3.3.3.3 Address book
2.3.3.4 Customer revision of a book
2.3.4 Store features
2.3.4.1 Title availability rate
2.3.4.2 Store ranking
2.3.4.2.1 The top books
2.3.4.2.2 The best-selling books
2.3.5 Promotion policies
2.3.5.1 With-sale promotion availability
2.3.5.2 Appetizer promotion availability (such as contests, frequent-purchase points)

Figure 4. Domain-specific functionality and content subcharacteristics for e-bookstore sites. Italics in the original represent direct or indirect measurable attributes.

Though we've specified the Figure 4 subtree for the e-bookstore field, we could easily reuse many of its parts for a more general e-commerce domain. Examples include the purchase features (2.3.2) and its subfactors purchase mode (2.3.2.1) and purchase policies (2.3.2.2). For the purchase mode subcharacteristic, we characterize online and offline modes, though the former is becoming more popular as confidence in security increases.11 For online purchases, we model the shopping basket, quick purchase, and checkout features.
As noted elsewhere,12 developers generally use the shopping basket mechanism to decouple product or service selection from checkout. We find it interesting to compare many of these criteria with existing navigation and interface patterns. We believe that recording and reusing design experience yields valuable information for specifying quality attributes or subcharacteristics.

Designing and implementing the elementary evaluation
As mentioned earlier, the evaluators should define, for each quantifiable attribute, the basis for the elementary evaluation criterion and perform measurement and preference mapping.
To record the information needed during evaluation, we defined a descriptive specification framework as Tables 1 and 2 (next page) show. This framework includes specific information about attribute, subcharacteristic, and characteristic definition as well as metrics, elementary preference criteria, scoring model components, and calculations. (Tables 1 and 2 template codes correspond to those shown in the requirement tree in Figure 2.)

Table 1. Template and example with the characteristic items. WebQEM_Tool uses this information.
Title (code): Reliability (3)
Type: Characteristic
Factor: Quality
Subcharacteristic (code): Nondeficiency (3.1)
Definition and comments: "The capability of the software product to maintain a specified level of performance when used under specified conditions"4
Model to determine the global or partial computation: Nonlinear multicriteria scoring model, specifically, the Logic Scoring of Preferences model8
Employed tools: WebQEM_Tool
Arithmetic or logic operator: C*
Relative weight: 0.2
Calculated preference values: A set of values for Reliability, as shown in Table 3
* We explain the arithmetic or logic operator item for the subcharacteristic and characteristic aggregation later.

Once evaluators have designed and implemented the elementary evaluation, they should be able to model attribute, subcharacteristic, and characteristic relationships. They should consider not only each attribute's relative importance but also whether the attribute (or subcharacteristic) is mandatory, alternative, or neutral. For this task, we need a robust aggregation and scoring model, described next.

Designing and implementing the partial/global evaluation
This is where we select and apply an aggregation and scoring model (see Figure 1). Arithmetic or logic operators will then relate the hierarchically grouped attributes, subcharacteristics, and characteristics accordingly.
As mentioned earlier, we can use a linear additive or a nonlinear multicriteria scoring model. We can't use the additive scoring model to model input simultaneity or replaceability, however, because it can't express for example simultaneous satisfaction of several requirements as inputs. Additivity assumes that insufficient presence of a specific attribute (input) can always be compensated by sufficient presence of any other attribute. Furthermore, additive models can't model mandatory requirements; that is, a necessary attribute's or subcharacteristic's total absence can't be compensated by others' presence.
A nonlinear multicriteria scoring model lets us deal with simultaneity, neutrality, replaceability, and other input relationships using aggregation operators based on the weighted power means mathematical model. This model, called Logic Scoring of Preferences8 (LSP), is a generalization of the additive-scoring model and can be expressed as follows:

P/GP(r) = (W1 EP1^r + W2 EP2^r + ... + Wm EPm^r)^(1/r)   (2)

where –∞ ≤ r ≤ +∞, P/GP(–∞) = min(EP1, EP2, ..., EPm), and P/GP(+∞) = max(EP1, EP2, ..., EPm).
Table 2. Template and example with the attribute items.
Title (code): Broken links (3.1.1.1)
Type: Attribute
Highest-level characteristic (code): Reliability (3)
Supercharacteristic (code): Link errors (3.1.1)
Definition and comments: It represents found links that lead to missing destinations, both internal and external static pages (also known as dangling links). "Users get irritated when they attempt to go somewhere, only to get their reward snatched away at the last moment by a 404 or other incomprehensible error message." (See http://www.useit.com/alertbox/980614.html.)
Template of metric and parameters: The metric and parameters item links another template with information of the selected metric criterion, the expected and planned values, measurement dates, and other fields.13 For instance, the metric criterion is X = #Broken_Links / #Total_Links_of_Site. For each e-store in the field study, we got the respective X value.
Data collection type: The data collection type item records whether the data are gathered manually or automatically and what tool is employed (if done automatically, as for the broken links attribute).
Employed tools: Our Web site metrics automation tool, among others.
Elementary preference function: EP = 1 (or 100 percent) if X = 0; EP = 0 (or 0 percent) if X ≥ Xmax; otherwise EP = (Xmax – X) / Xmax if 0 < X < Xmax, with Xmax = 0.04.
Relative weight: 0.5
Elementary preference values: Cúspide's site yielded an elementary preference of 99.83 percent; Amazon, 98.40 percent; Barnes and Noble, 97.45 percent; Borders, 76.34 percent; and Díaz de Santos, 60.07 percent.

The power r is a parameter (a real number) selected to achieve the desired logical relationship and polarization intensity of the aggregation function. If P/GP(r) is closer to the minimum, such a criterion specifies the requirement for input simultaneity. If it is closer to the maximum, it specifies the requirement for input replaceability.
Equation 2 is additive when r = 1, which models the neutrality relationship; that is, the formula remains the same as in the first additive model. Equation 2 is supra-additive for r > 1, which models input disjunction or replaceability. And it's subadditive for r < 1 (with r != 0), which models input conjunction or simultaneity.
For our case study, we selected this last model and used a 17-level approach of conjunction-disjunction operators, as defined by Dujmovic.8 Each operator in the model corresponds to a particular value of the r parameter. When r = 1 the operator is tagged with A (or the + sign). The C or conjunctive operators range from weak (C–) to strong (C+) quasiconjunction functions, that is, from decreasing r values, starting from r < 1.
In general, the conjunctive operators imply that low-quality input preferences can never be well compensated by a high quality of some other input to output a high-quality preference (in other words, a chain is as strong as its weakest link). Conversely, disjunctive operators (D operators) imply that low-quality input preferences can always be compensated by a high quality of some other input.
Designing the LSP aggregation schema requires answering the following key basic questions (which are part of the Global Preference Criteria Definition task in Figure 1):
❚ What's the relationship among this group of related attributes and subcharacteristics: conjunctive, disjunctive, or neutral?
❚ What's the level of intensity of the logic operator, from a weak to strong conjunctive or disjunctive polarization?
❚ What's the relative importance or weight of each element in the group?
The WebQEM_Tool lets evaluators select the aggregation and scoring model. When using the additive scoring model, the aggregation operator
is A for all tree composites (subcharacteristics and characteristics). If evaluators select the LSP model, they must indicate the operator for each subcharacteristic and characteristic. Figure 5 shows a partial view of the enacted schema for Amazon.com as generated by our tool.

Figure 5. Once the weights and operators were defined and the schema checked, the WebQEM_Tool could yield the partial and global preferences as shown in the right-side pane.
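A minimal Python sketch (not from the article) of the stepwise, bottom-up LSP aggregation described above; it assumes a simplified two-level slice of the Reliability subtree, and the r values are illustrative placeholders for operators such as A or C– from Dujmovic's 17-level operator table.

# Bottom-up LSP aggregation over a small, simplified requirement subtree.

def lsp(prefs, weights, r):
    """Weighted power mean of elementary preferences (Equation 2)."""
    return sum(w * p ** r for w, p in zip(weights, prefs)) ** (1.0 / r)

# Elementary preferences for three Reliability-related attributes (values in [0, 1])
broken_links, invalid_links, orphan_pages = 0.98, 0.90, 0.85

# Link errors: simultaneity required, so a quasi-conjunctive operator (r < 1)
link_errors = lsp([broken_links, invalid_links], [0.5, 0.5], r=0.26)

# Reliability: a neutral mix of its inputs, so the arithmetic mean operator A (r = 1)
reliability = lsp([link_errors, orphan_pages], [0.7, 0.3], r=1.0)

print(round(100 * reliability, 2))  # partial preference for Reliability, in percent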

Analyzing and recommending


Once we've performed the final execution of the evaluation, decision-makers can analyze the results and draw conclusions. Table 3 shows the final values for the usability, functionality, reliability, and efficiency characteristics, and the global quality indicator in our e-commerce case study. It also shows the domain-specific functionality and content subcharacteristic (2.3 in Figures 2 and 4) for the functionality characteristic.
The colored quality bars at the right side of Figure 6 (next page) indicate the acceptability levels and clearly show the quality level each e-bookstore has reached. For instance, a score within a gray bar indicates a need for improvement actions. An unsatisfactory rating means change actions must take high priority. A score within a green bar indicates satisfactory quality of the analyzed feature.
Looking at the product information (2.3.1) subcharacteristic in the "best" and "worst" qualified sites, we observed 63.72 percent satisfaction for Amazon versus 10.20 percent for Díaz de Santos. The WebQEM tool lets us follow the anchored codes in tables and navigate backward and forward to see the partial and elementary indicators (as shown on the left side of Figure 6) that further clarify these measurements. As such, we can easily see which site features need improvement and which are satisfactory.

Table 3. Summary of partial and global preferences for e-bookstores assessed for quality.

Characteristics and subcharacteristics: Amazon / Barnes & Noble / Cúspide / Díaz de Santos / Borders (all values in percent)
1. Usability 76.16 82.62 75.93 56.09 72.67
2. Functionality 83.15 80.12 61.69 28.64 61.45
2.1 Searching and retrieving issues 100 100 91 42.67 72.06
2.2 Navigation and browsing issues 70.71 69.85 73.25 64.12 51.95
2.3 Domain-specific functionality and content 81.99 76.53 45.81 14.42 61.55


2.3.1 Product information 63.72 42.20 40.64 10.20 15.98
2.3.2 Purchase features 91.76 84.84 67.72 17.11 81.92
2.3.3 Customer features 100 85 20 28.08 65
2.3.4 Store features 100 96.80 71.20 33.60 93.57
2.3.5 Promotion policies 60 100 40 0 100
3. Reliability 99.44 99.11 90.97 78.51 91.66
4. Efficiency 96.88 74.54 90.17 86.01 90.90
Global quality preference 86.81 82.95 75.52 50.37 74.86

Figure 6. The Web_QEM tool yields diverse information types. The graph at right shows final e-bookstore rankings.

We see similar score differences for the purchase feature subcharacteristic. The Díaz de Santos application doesn't have checkout security, quick purchase, or checkout canceling feedback, among other mandatory or desirable attributes. Conversely, Amazon has a superb online purchase preference of 100 percent. Overall, four of five evaluated sites received satisfactory ratings for the purchase feature subcharacteristic. Finally, Table 3 shows the partial preferences for the customer features (2.3.3), store features (2.3.4), and promotion policies (2.3.5) subcharacteristics.

Conclusions and future work
Quantitative evaluation of Web applications remains scarce despite the publication of many style guides, design principles, and techniques.14,15 Guidelines with prioritized checkpoints for designers to make sites more efficient and accessible16 have shed light on essential characteristics and attributes and might help improve the Web design and authoring process but don't constitute formal evaluation methods by themselves. Quantitative surveys and Web domain-specific evaluations10,11,17 offer important usability evaluation information using, in some cases, subjective user-based questionnaires, a strategy with its own strengths and weaknesses.17
We've been developing the WebQEM methodology since the late 1990s. Because the underlying strategy is evaluator-driven by domain experts rather than user-driven, the method is more objective than subjective and is quantitative and model-centered rather than qualitative and intuition-centered. Of course, a global quality evaluation (and eventually comparison) of complex products can't entirely avoid subjectivity. The evaluation process starts with specifying goals that are to some extent subjective, and we derive the nonfunctional requirements subjectively based on human expertise and occasional field studies. Moreover, we must sometimes subjectively assess how well requirements are satisfied (such as quality of help features or a Web site's aesthetic preference). However, we can minimize subjectivity in the evaluation process by focusing on objectively measurable attributes such as broken links, orphan pages, and quick access pages. A robust and flexible evaluation methodology must properly aggregate both subjective and objective components controlled by experts.
WebQEM works well for assessing and comparing quality requirements for operative Web sites and applications as well as in early phases of Web development projects. The tool can be useful in assessing diverse application domains according to different user views and evaluation goals. The evaluation process must start with defining and specifying quality requirements. For example, to assess the developer viewpoint rather than a visitor viewpoint, we must plan additional internal and external attributes and evaluation criteria, and also consider the ISO-prescribed maintainability and portability characteristics. The manager view, meanwhile, may have different constraints, requiring that evaluators consider management factors such as cost or productivity to optimize quality within cost, resource, and time constraints.
Planned WebQEM_Tool improvements include support for collaborative evaluations because we have seen that in many assessment projects, domain experts aren't colocated yet must interact during the design and implementation of elementary and global evaluation processes, or at the evaluation's conclusion. Groupware mechanisms will let evaluators assume different roles, with appropriate access rights, to share workspaces and trigger data visualizers, multiparty chats, and whiteboards, among other facilities. We're also cataloging Web metrics, specifically those where data gathering can be automated. We've already cataloged up to 150 direct and indirect automated Web metrics, and hope this catalogue13 will generate a framework for evaluation criteria and procedure reuse.

Acknowledgment
This research is partially supported by the La Pampa National University UNLPam-09/F022 research project and by the CYTED (Science and Technology for Development) Program in the VII.18 West (Web-Based Software Technology) Iberoamerican Project.

References
1. S. Murugesan et al., "Web Engineering: A New Discipline for Development of Web-Based Systems," Lecture Notes in Computer Science 2016, Web Engineering: Managing Diversity and Complexity of Web Application Development, S. Murugesan and Y. Deshpande, eds., Springer-Verlag, Heidelberg, 2001, pp. 3-13.
2. L. Olsina, G.J. Lafuente, and G. Rossi, "E-Commerce Site Evaluation: A Case Study," Lecture Notes in Computer Science 1875, Proc. 1st Int'l Conf. Electronic Commerce and Web Technologies (EC-Web 2000), Springer-Verlag, Heidelberg, 2000, pp. 239-252.
3. L. Olsina et al., "Providing Automated Support for the Web Quality Evaluation Methodology," Proc. Fourth Workshop on Web Eng./10th Int'l WWW Conf., 2001, pp. 1-11.
4. ISO/IEC 9126-1:2001, Software Engineering—Product Quality—Part 1: Quality Model, Int'l Org. for Standardization, Geneva, 2001.
5. ISO/IEC 14598-5:1998 International Standard, Information Technology—Software Product Evaluation—Part 5: Process for Evaluators, Int'l Org. for Standardization, Geneva, 1998.
6. H. Zuse, A Framework of Software Measurement, Walter de Gruyter, Berlin, N.Y., 1998.
7. T. Gilb, Software Metrics, Chartwell-Bratt, Cambridge, Mass., 1976.
8. J.J. Dujmovic, "A Method for Evaluation and Selection of Complex Hardware and Software Systems," Proc. 22nd Int'l Conf. Resource Management and Performance Evaluation of Enterprise Computer Systems, vol. 1, Computer Measurement Group, Turnersville, N.J., 1996, pp. 368-378.
9. L. Olsina et al., "Assessing the Quality of Academic Websites: A Case Study," New Review of Hypermedia and Multimedia (NRHM) J., vol. 5, 1999, pp. 81-103.
10. G. Lohse and P. Spiller, "Electronic Shopping," Comm. ACM, vol. 41, no. 7, July 1998, pp. 81-86.
11. C. Kehoe et al., Results of GVU's 10th World Wide Web User Survey, Graphics Visualization and Usability Center, College of Computing, Georgia Inst. of Technology, Atlanta, Ga., http://www.gvu.gatech.edu/user_surveys/survey-1998-10/tenthreport.html.
12. G. Rossi, D. Schwabe, and F. Lyardet, "Improving Web Information Systems with Navigational Patterns," Proc. WWW8 Congress, Elsevier Science, Amsterdam, 1999, pp. 589-600.
13. L. Olsina et al., "Designing a Catalogue for Metrics," Proc. 2nd Ibero-American Conf. Web Eng. (ICWE'02), 2002, pp. 108-122.
14. IEEE Web Publishing Guide, http://www.ieee.org/web/developers/style/.
15. J. Nielsen, The Alertbox, 1995-2002, http://www.useit.com/alertbox.
16. WWW Consortium, Web Content Accessibility Guidelines 1.0, W3C Recommendation, 1999, http://www.w3c.org/TR/WD-WAI-PAGEAUTH/.
17. J. Kirakowski et al., "Human Centred Measures of Success in Web Site Design," Proc. 4th Conf. Human Factors and the Web, AT&T, Basking Ridge, N.J., 1998, http://www.research.att.com/conf/hfweb/proceedings/kirakowski/index.html.

Luis Olsina is an associate professor of object-oriented technology at La Pampa National University, Argentina, and heads the Software Engineering R&D group (GIDIS). His research interests include Web engineering, particularly Web metrics, cataloging, and quantitative evaluation issues. He authored the WebQEM methodology. He received a PhD in software engineering and an MSE from La Plata National University, Argentina. He is a member of the IEEE Computer Society.

Gustavo Rossi is a professor of object-oriented technology at La Plata National University, Argentina, and heads LIFIA, a computer science research lab in La Plata, Argentina. His research interests include hypermedia design patterns and frameworks. He coauthored the Object-Oriented Hypermedia Design Method (OOHDM) and is currently working on the application of design patterns in Web applications. He earned a PhD in computer science from Catholic University of Rio de Janeiro (PUC-Rio), Brazil. He is an ACM member and an IEEE member.

Readers may contact Luis Olsina at GIDIS, Dept. of Informatics, Faculty of Engineering School, UNLPam, Calle 9 y 110, (6360) General Pico, La Pampa, Argentina, [email protected].

For further information on this or any other computing topic, please visit our Digital Library at http://computer.org/publications/dlib.

WEBQEM
WebQEM methodology

User Interface Design - UTCN 2


Domain-specific functionality

Domain-specific functionality
and content subcharacteristics
for e-bookstore sites. Italics
represent direct or indirect
measurable attributes.

User Interface Design - UTCN 3


Elementary evaluation

User Interface Design - UTCN 4


Elementary evaluation

User Interface Design - UTCN 5


Elementary evaluation

User Interface Design - UTCN 6


Global evaluation

User Interface Design - UTCN 7


User Interface Design - UTCN 8
Heuristic Evaluation of the EvoGlimpse Video Game
Bianca-Cerasela-Zelia Blaga1, Selma Evelyn Cătălina Goga2,
Al-doori Rami Watheq Yaseen3, Dorian Gorgan4
Technical University of Cluj-Napoca
Computer Science Department
Cluj-Napoca, Romania
[email protected], [email protected],
[email protected], [email protected]

ABSTRACT
The evaluation of the interface of a video game is essential
for its development. In this paper, a heuristic evaluation is
proposed, from the perspective of an interactive application.
The goal is to estimate the level of usability. The game is
tested by evaluators who follow a series of scenarios and
relevant actions, with the purpose of answering questions
that can determine if it respects the usability requirements.
Specific evaluation criteria are established, and solutions
are proposed for the found problems.

Author Keywords
evaluation criteria; heuristic evaluation; interactive
applications; video games; usability.

Figure 1. The game interface of EvoGlimpse, whose usability will be evaluated.

ACM Classification Keywords
empirical studies; HCI design and evaluation methods; interactive games.

INTRODUCTION
Video games are a very popular type of interactive application, with a large number of objectives. For example, they can be used for educational purposes [1], to offer useful information to the user in an enjoyable way. Games are also used because they are recreational and a preferred pastime for children and teenagers. They can help develop fast problem-solving skills, with applicability in real life too. This can be observed especially in strategy games, where the player has to use the existing environment, resources, and characters to win the game efficiently.
The main motivation behind the concept of usability of an interactive application, and so implicitly of video games, lies in its capacity to establish the success or failure rate of a software product. Therefore, numerous companies have strict evaluation criteria, some of which are presented in this paper. The evaluation of usability can be done during the implementation and development stages of a game, which is highly recommended: evaluation is an iterative process that can be intercalated in the stages of developing a video game, and it has the advantage of highlighting the design and implementation flaws, errors and specific deficiencies, which can only be observed during testing. There are various evaluation methods, but in general they are done by testing some scenarios with the necessary actions that need to be performed when executing the project, establishing what outputs are expected, what needs to be avoided, the execution speed, and also the number of errors. These methods of estimating usability are preferred by developers and are carried out by experts in the domain, who are familiar with such systems.
The evaluators focus on finding the problems and creating thorough and helpful reports. Usability can be established by cognitive or pluralist evaluations, inspecting the consistency, standards, and characteristics of the system, or by heuristic evaluation [3]. In the current paper, these are combined in order to reach solutions to the problems which are found.
The interactive application that will be evaluated in this paper is the game called EvoGlimpse. It started with the aim to give players a glimpse into evolution from the perspective of an exterior observer, who can travel to different points in time of Earth's existence. This game is heavily inspired by the movie and the book "2001: A Space Odyssey" [2], in which a civilization of advanced beings helps humans that are in different stages of evolution by presenting to them ways that can aid in their survival.
A series of worlds would be available, starting from the first appearance of life – the fusion between RNA and an enzyme – then different stages of the evolution of species – underwater life, the transition to land and dinosaurs – moving on to human history, from our ancestors until today, and, for a plus of entertainment, continuing with a science fiction view of humankind – the union of human and machine and the exploration of the universe. The player would be able to travel in these worlds in different specific shapes: atoms, energy, swimming, walking, riding animals,
driving cars, flying with the flying cars, and exploring space in spaceships.
Each stage has as its objective finding the knowledge source, represented by the monolith, which has an imposing shape – tall, black, created by a superior entity – and which holds superior information about the current state of the world. For example, in the stone age, it can offer to the monkeys the idea of creating weapons that represent an advantage in the fight for survival.
As a world is explored, different obstacles appear, and the player must overcome them with the current set of skills, which is enhanced each time the monolith is found. Once the world has been completely observed and the enemies are defeated, the monolith appears to present the way of going from the past to the future. Using visual and auditory information, the player will know if he/she is close to the location of the monolith, and when it is found, an educational video about evolution will be presented. The player will also be able to see all finished phases and all the discovered videos in a library, to which he/she can return at any time.
For the actual game implementation, the goal was to create only one world, a futuristic one, on a planet covered by water, in a developed society, with modern architecture and flying cars. The main enemies will be planes guided by artificial intelligence. The player will have to protect itself from them by shooting, for example with bullets, plasma or laser.
The main plot of the game follows 3 stages. In the first one, the player will have some time to get used to the planet and the controls, being able to peacefully explore and observe the world scene. In the second stage, the player will have to protect the planet from some invaders; as the game advances, the abilities of the player increase. In the last stage, since an advanced technology state has been reached, the monolith will appear in an unknown location and will have to be found by following its sound signals. An in-game image can be seen in Figure 1. Here there can be observed the game scene composed of water, buildings and a separator ring, the player's vehicle, and the dynamic objects with which the player will interact (enemies and power-up boxes).
We want to heuristically evaluate this game; heuristic evaluation is a technique that helps determine the usability problems of a user interface. This is done by a small number of evaluators (two), using a specific set of heuristics, proposed by the developer. Afterward, the evaluation results are centralized, the noticed problems are marked out, and solutions are proposed. The chosen criteria come from the 10 heuristics of Nielsen [4]: visibility of system status, match between system and the real world, user control and freedom, consistency and standards, error prevention, recognition rather than recall, flexibility and efficiency of use, aesthetic and minimalist design, help users recognize, diagnose, and recover from errors, and help and documentation.
This paper is structured as follows: in Section Related Work, a literature review of this domain is presented, together with some evaluation methods. In Section Theoretical Considerations, the exact methodology that was taken into account for the heuristic evaluation is explained. In Section Experimental Considerations, the stages of the evaluation are discussed and the observations are explained; there are presented the requirements, the evaluators, the heuristic evaluation details, the scenarios and the tasks that need to be followed for testing, and the means of recording the results. Then, in Section Result Analysis, the outcomes of the independent and group evaluations are analyzed, the errors discovered are highlighted, and solutions are proposed. The final observations are written in Section Conclusions.

RELATED WORK
For the evaluation of the usability of an interactive application there are various methods, each one specific to the type of application and the main goals of its developers. In general, usability questionnaires like SUMI [5] or QUIS [6] are used, from which standard information from the domain of usability can be extracted.
In [7], a series of steps are defined for evaluating usability: data gathering – collecting information related to how the application should be used – and data analysis – summarizing the statistics that were computed, pointing out the flaws, and coming up with ways of improving them.
In virtual reality applications, for example, there are 6 stages [8]: the exploratory one – where similar applications are analyzed and bibliographic material, related to the domain and the evaluation heuristics, is collected; the descriptive one – where the conclusions from the first stage are synthesized and specific evaluations are formalized; the correlative one – where the principal characteristics of the usability heuristics are identified and representative case studies are presented; the explanatory one – where the heuristics are established following five characteristics (identifiers, explanation, example, benefits, and problems); the validation one – where the evaluators inspect the application based on the previously mentioned heuristics; and the refinement one – after which three types of problems are found and need to be solved.
The developer is the one who proposes the game scenarios, and he/she describes how these can be done by the evaluators, as the execution of specific actions.
Table 1. The developer and the usability evaluators
Developer: Artificial Vision and Intelligence, 1st year master's student; researcher in the image processing group; medium experience with video games.
Eval1: Artificial Vision and Intelligence, 1st year master's student; researcher in the image processing group; little experience with video games.
Eval2: Artificial Vision and Intelligence, 1st year master's student; experience in designing interactive applications; medium experience with video games.
Thus, it can be observed how these actions can be performed, how easily they are understood, and their difficulty level, and the differences between the expectations and the actual implementation can be seen. An evaluator has to test the game while keeping in mind the requirements and will write reports which point out the discovered flaws.
Another evaluation method is AOP (Aspect Oriented Programming) [9], a recent technique with satisfying results, which is easy to use. In [10], the same authors propose the use of agents that can automatically do the evaluation. Based on an initial set of knowledge, they have the capacity to learn how to use the environment in which they are placed and know what tasks to execute.
The heuristic evaluation proposed by Nielsen [4] asks the evaluators to establish the usability level based on 10 criteria. This is done by a small number of evaluators, based on a detailed set of scenarios and materials. The tasks have to be executed twice on the application's interface, with each element being inspected (button, object, control element etc.), followed by the evaluation of the implementation techniques and the interaction with them. The main goal is to find design and implementation errors and solutions to them.

THEORETICAL CONSIDERATIONS
Usability is defined by Shackel [11, 12] as the capacity of a system to be easily understood and efficient to use by a specific category of users, who received instructions and assistance in the usage of the application, by executing tasks defined for a system. The emphasis is on efficiency, ease of learning, flexibility, and attitude. A similar way of defining usability is the one devised by Preece [13], which measures it as the ease with which a system can be used, together with its efficiency and security.
In the ISO 9241-11 standard [14], usability is defined as: "the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction in a specified context of use". Efficiency is the ratio between the used resources and the accuracy with which they can be used, efficacy is the accuracy and correctness of the system, while satisfaction is a more subjective measure that refers to the user's comfort.
From Dix's perspective [15], usability depends on three factors: the ease of learning – how fast new users can use the system correctly and at a high level of performance; flexibility – how easy it is to use the controls, together with their correctness; and robustness – the help that the user has to fulfill the specific actions. These represent a starting point for creating evaluation tools.
Usability evaluation has three main objectives, which are highly correlated to the previously mentioned factors:
1. establishing the degree of functionality of the interactive application;
2. assessing the suitability of the interaction of the user with the interface;
3. identifying the system's problems.
Functionality refers to the degree of correctness the implementation of the application has, while the interface is what the user sees and a way of sending inputs and getting outputs. The interface has a big impact, especially in video games, because the interaction is more visual and based on metaphors specific to the game genre. On it depend the ease of learning and the usage flexibility, but also the ability to recognize rather than recall, which doesn't load the memory of the user with too much information.
The planning is done together with the evaluators, after an implementation phase of the system. The used concepts are defined in order to avoid misunderstandings. Afterward, a series of criteria are defined, which are clear and specific to the evaluated interactive application. Then the evaluation is done based on them, highlighting the errors, and finally the results are evaluated and solutions are proposed to improve the system. In the next section, these steps are shown on the game EvoGlimpse.

EXPERIMENTAL CONSIDERATIONS
In the heuristic evaluation done on the proposed video game, the main goal was to find the implementation errors of the proposed scenarios. First, the evaluators were chosen, whose information can be seen in Table 1. Next, six usage scenarios were set, which cover the navigation of the game scene by controlling the player's vehicle using the mouse, the interaction with the enemies by attacking them, collecting the power-up boxes, etc. Each scenario can be executed by following a set of tasks, which result in feedback from the system that can be instantly seen by the user. This information is contained in Table 2.

Table 2. Scenarios and actions that will be executed by the evaluators to test the game
S1. Navigation in the 3D scene: T1. controlling the vehicle using the mouse movements; T2. increase speed by pressing space; T3. zoom in and out using the scroll wheel.
S2. Attacking and avoiding enemies: T1. observing the enemies; T2. flying towards an enemy; T3. the player attacks by pressing the left button of the mouse; T4. the enemies attack when the player gets in a certain range and in a certain field of view; T5. observing the enemies' reaction; T6. avoiding enemies.
S3. Monolith: T1. the player should understand the objective, by reading the message shown on the screen; T2. successfully navigating in the scene; T3. observe the monolith; T4. fly towards the objective; T5. message of winning the game.
S4. Repair power-up box: T1. recognizing the object; T2. flight towards the objective; T3. collision with the object; T4. object destroyed; T5. life health increased.
S5. Immunity power-up box: T1. recognizing the object; T2. flight towards the objective; T3. collision with the object; T4. object destroyed; T5. enemy attack canceled for 20 seconds.
S6. Display relevant messages: T1. message with the game objectives; T2. toggle help option; T3. quit button; T4. player health information; T5. message of collecting repair power-up box; T6. message of collecting immunity power-up box; T7. message of destroying enemy; T8. message of losing the game; T9. message of winning the game.

Afterward, the evaluation criteria were established. Nielsen's 10 usability heuristics for user interface design were chosen [4], with supplementary explanations that will aid the evaluators to focus on the desired elements.

Table 3. Evaluation criteria


Nb. Questions and requirements
1. Visibility of system status
• Is the state of the system visible at all times?
• Is the feedback offered by the system suitable?
• Is the response time appropriate, without unacceptable delays?
• The game scene will be observed, as well as the interaction with the objects and elements specific to each game scenario; attention will be paid to the movement of the vehicle, attack, collection of the power-ups, the display of messages, and particle effects.
2. Match between system and the real world
• Does the game correspond to the mental model that the user has from a real-world game? Is it what you expected
or similar to other games?
• Are the language, words, and phrases used familiar to the user?
• Is there a natural way in displaying the information?
• Is this a suitable shooter game? Is the game scene realistic?
• Are there any uncertainties?
3. User control and freedom
• Can the user execute the necessary actions to fulfill the scenarios? Is their functioning correct?
• Can the user exit an unwanted state? For example, is there a need for an undo/ redo button?
• How does the vehicle control, attack, collection, and buttons feel?
4. Consistency and standards
• Is the user surprised by different words, situations or actions that have the same meaning?
• Is there consistency in the use of colors and symbols?
• Is the meaning of the objects from the scene understood?
5. Error prevention
• What is the functional correctness level of the game?
• Are the errors eliminated or are there methods to prevent situations that favor the apparition of errors?
• For example, notice what happens if the player tries to get too close to the water, at the collision with different
objects etc.
6. Recognition rather than recall
• Can the player recognize the objects and their usage?
• Are there elements that require storage in the memory of the user?
7. Flexibility and efficiency of use
• What is the level of flexibility and efficiency of the game usage?
• Is the user bothered by certain aspects? Or are some of them missing?
8. Aesthetic and minimalist design
• What is the quantity of relevant information?
• Is there any redundant information?
• Is the information presented clear and easily accessible?
• Is the field of view of the player cluttered with too many elements or is it suitable?
9. Help users recognize, diagnose, and recover from errors
• Are the messages clear and helpful for the player?
• Should there be any additional error prevention cases?
10. Help and documentation
• Is the help menu complete?
• Does it contain clear, simple, and easily accessible information?
• Is the documentation clear, does it contain sufficient information for the player? If not, what should be added?
Table 4. Evaluation stages
Nb. Name Evaluation technique
1. Individual evaluation • done independently by the 2 evaluators, by filling in separate tables for each scenario
• a mark between 0 and 100 is assigned to each evaluation criterion, and at the end the average is taken
• at the end of testing, reports are written with the encountered problems
• the developer proposes solutions to solve the errors
2. Group evaluation • done by the 2 evaluators together with the game developer
• tables with the most important questions are written, together with the found answers
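As a rough sketch of the bookkeeping behind the individual evaluation stage described in Table 4 (the helper below and its flagging threshold are hypothetical, not part of the paper), the two evaluators' per-criterion marks are averaged and low-scoring criteria are flagged for the reports:

# Hypothetical sketch of the individual-evaluation bookkeeping from Table 4:
# each evaluator assigns a 0-100 mark per criterion and the averages are kept.

def average_marks(eval1, eval2, threshold=75):
    """Return per-criterion averages and the criteria that need attention."""
    averages = {c: (eval1[c] + eval2[c]) / 2 for c in eval1}
    flagged = [c for c, avg in averages.items() if avg < threshold]
    return averages, flagged

# Marks for scenario S1 (subset of Table 5)
eval1 = {"User control and freedom": 70, "Error prevention": 50, "Help users recover from errors": 50}
eval2 = {"User control and freedom": 80, "Error prevention": 90, "Help users recover from errors": 95}
print(average_marks(eval1, eval2))  # averages 75, 70, 72.5 -> the last two are flagged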

Table 5. Individual heuristic evaluation results of the first scenario


Scenario Criteria Eval1 Eval2 Average
S1. Navigation in the 1. Visibility of system status 100 100 100
3D scene 2. Match between system and the real world 90 90 90
3. User control and freedom 70 80 75
4. Consistency and standards 100 90 95
5. Error prevention 50 90 70
6. Recognition rather than recall 90 100 95
7. Flexibility and efficiency of use 100 90 95
8. Aesthetic and minimalist design 100 100 100
9. Help users recognize, diagnose, and recover from errors 50 95 72.5
10. Help and documentation 80 100 90
Eval1’s report The collision with objects such as buildings is an enormous problem, as you have probably observed.
After colliding with a building, I was simply floating in space, without being able to reposition
myself. I know why this is happening, I have the same issue in my game, but an inexperienced user
will not understand a thing.
Eval2’s report The game starts abruptly, without a start menu, but the movements of the vehicle are very smooth. It
is easier to move left-right than up-down.
Solutions There is a problem at the level of materials that are attached to the objects, in particular to the vehicle
and the buildings. This can be solved by changing the bounce value in the physics property of the
materials. The creation of a menu will be taken into consideration for the next implementation
iteration.

Table 6. Individual heuristic evaluation results of the second scenario


Scenario Criteria Eval1 Eval2 Average
S2. Attacking and 1. Visibility of system status 70 100 85
avoiding enemies 2. Match between system and the real world 20 95 57.5
3. User control and freedom 90 80 85
4. Consistency and standards 100 90 95
5. Error prevention 90 80 85
6. Recognition rather than recall 100 100 100
7. Flexibility and efficiency of use 80 90 85
8. Aesthetic and minimalist design 100 100 100
9. Help users recognize, diagnose, and recover from errors 90 80 85
10. Help and documentation 100 100 100
Eval1’s report The match between the virtual world of the game and the real world has such a small mark because it
is not very easy to see when someone is shooting you or when you are attacking someone. I was
expecting to see a laser or a bullet that would appear. Instinctively I want to get close to attack
objects because I know that a bullet shot at a closer distance is more accurate than one shot at a
greater distance. Here it does not matter. Another severe issue is that it is too easy to destroy an
enemy. It would have been useful to add a life-bar on top of each one, and to require at least 2-3
shots to take down an object. When an enemy shoots you, there is not enough information. You
expect to see a particle effect on the car or at least to hear a specific sound. That is why I gave it only
20 points.
Eval2’s report The enemies are easy to attack and avoid, but their response is too slow.
Solutions The enemies have attack particle effects, but those cannot be observed since they are behind the
player. This can be changed by adding effects on the car, and adding sounds that would help the
player know if he / she is shot. Also, the user attacks in the center of the screen, where the crosshair
is displayed. The player should experiment with the attacks, and thus it can be seen that the enemies
can be shot only at a certain distance. The enemies do not die instantly, as can be seen from the
particles displayed on the player’s vehicle; multiple shots are needed. A health bar should be added
to the enemies to aid in this problem. Also, the enemies only attack if the user is at a certain distance
from them, and in a certain field of view. To make their response faster, I can increase their
movement speed.
Table 7. Individual heuristic evaluation results of the third scenario
Scenario Criteria Eval1 Eval2 Average
S3. Monolith 1. Visibility of system status 90 100 95
2. Match between system and the real world 100 90 95
3. User control and freedom 100 100 100
4. Consistency and standards 90 80 85
5. Error prevention 100 90 95
6. Recognition rather than recall 60 100 80
7. Flexibility and efficiency of use 100 100 100
8. Aesthetic and minimalist design 90 100 95
9. Help users recognize, diagnose, and recover from errors 100 100 100
10. Help and documentation 100 100 100
Eval1’s report Being just a prototype version of the game, it is alright to put the monolith always in the same place,
but I admit it would have been more fun to compute its position randomly at each run of the game so
I wouldn't know where it is when a new game begins.
Eval2’s report At the first run of the game, I destroyed all the enemies and I collected all the power-up boxes, and
afterwards I found the monolith and the game stopped. A little too repetitive.
Solutions I chose the option of fixing the position of the monolith because the game scene is small and the
objective would have been too easy to find. If the scene was bigger, then yes, the position of the
monolith would be randomly computed at each run of the game. The same holds for the
enemies and the power-up boxes - if the game scene is bigger, more objects can be inserted, thus
making the game more entertaining.

Table 8. Individual heuristic evaluation results of the fourth scenario


Scenario Criteria Eval1 Eval2 Average
S4. Repair power-up 1. Visibility of system status 100 100 100
box 2. Match between system and the real world 90 100 95
3. User control and freedom 100 90 95
4. Consistency and standards 100 100 100
5. Error prevention 100 90 95
6. Recognition rather than recall 80 100 90
7. Flexibility and efficiency of use 100 100 100
8. Aesthetic and minimalist design 100 100 100
9. Help users recognize, diagnose, and recover from errors 100 90 95
10. Help and documentation 100 100 100
Eval1’s report These look nice, but I expected them to disappear before passing through them. I do not notice if they
disappear from the game scene, for example, because it is difficult to turn the vehicle around.
Eval2’s report It is a very good game object, but not always necessary, especially because the enemies do not
represent a big threat.
Solutions I chose the option of making the boxes disappear after the collision because it would have been
confusing otherwise. It can be noticed that they do disappear instantly after we touch them, and the
interaction with them is correct since their effect is immediately observed and feedback in the form
of a system message is displayed. Their necessity can be increased by adding different abilities to the
enemies or making them smarter.

Table 9. Individual heuristic evaluation results of the fifth scenario


Scenario Criteria Eval1 Eval2 Average
S5. Immunity power- 1. Visibility of system status 100 100 100
up box 2. Match between system and the real world 90 100 95
3. User control and freedom 100 90 95
4. Consistency and standards 100 100 100
5. Error prevention 100 90 95
6. Recognition rather than recall 80 100 90
7. Flexibility and efficiency of use 100 100 100
8. Aesthetic and minimalist design 100 100 100
9. Help users recognize, diagnose, and recover from errors 100 90 95
10. Help and documentation 100 100 100
Eval1’s report Same observation as above. These look really nice, I love the graphics. The concentric circles look
really good.
Eval2’s report I think more enemies are needed in order to increase the game difficulty.
Solutions I admit that I focused more on the functional correctness of the game rather than on the level of
entertainment. This can be changed by increasing the game scene and adding variety to the enemies.
Table 10. Individual heuristic evaluation results of the sixth scenario
Scenario Criteria Eval1 Eval2 Average
S6. Display relevant 1. Visibility of system status 100 100 100
messages 2. Match between system and the real world 80 90 85
3. User control and freedom 100 90 95
4. Consistency and standards 90 80 85
5. Error prevention 50 90 70
6. Recognition rather than recall 70 100 85
7. Flexibility and efficiency of use 100 90 95
8. Aesthetic and minimalist design 100 100 100
9. Help users recognize, diagnose, and recover from errors 100 100 100
10. Help and documentation 100 100 100
Eval1’s report I deducted points for error prevention because the user should receive feedback when he / she is
colliding with the buildings, or a message should be displayed explaining how to solve this issue, or
something similar.
Eval2’s report The messages are clear and correctly displayed
Solutions I think the best solution is to do error prevention at the collision with the buildings, so the user won't
have to worry about it.

Table 11. Group heuristic evaluation results – first part


S1
Eval1:
Q1: The problem of the collision with the building is pretty serious. Did you think how you could solve it?
A1: Yes, by altering the parameters of the physics materials from which the objects are made. These have a bounce value that determines how they react when the collision takes place.
Q2: The navigation using the mouse is very good, that's all I'm going to say. Much better than before, it is even fun to play.
A2: Indeed, I took into consideration your early evaluations, where you observed that controlling the vehicle by the keys W, A, S, D is not intuitive, so I worked to improve it.
Eval2:
Q1: The vehicle now moves with the help of the mouse, not with the keys W, A, S, D?
A1: Indeed, I modified this interaction to increase the usability of the game.

S2
Eval1:
Q1: I have noticed that there is no indicator saying that you hit an enemy or if you were hit, other than a message that I do not always have enough time to read. Is there a minimum distance needed to shoot an enemy? Did you think of using bullets, sounds and a life bar?
A1: There is a certain distance within which both the player's and the enemies' attacks can take place. If you are too far, the attack won't have any effect. I want to add power-up boxes with new types of attacks (laser, plasma etc.) and to add more useful effects to them, like particles, rays and sounds. And also the enemies should receive a health bar to aid the player.
Eval2:
Q1: It is too easy to take down enemies, and they rarely attack you. How can you improve these aspects?
A1: I can change the life amount of the enemies, the damage of both the player and the enemies, and the speed for the latter, so they would move faster.

S3
Eval1:
Q1: Did you think to compute the monolith's position randomly with each new game? Then I could really say from the beginning of the game that it is true that you "find the monolith" and not "recall where you last saw the monolith".
A1: Yes, I did think about it, but because the game scene is too small, I had to position it at the furthest and most difficult to see position. Otherwise, there would have been the risk of it appearing right next to the player and the game would have ended instantly.
Eval2:
Q1: I was expecting the monolith to be more colourful and smaller, now it is just a black object. We can not differentiate between it and the buildings from far away. What is its purpose?
A1: Actually, I am really happy that it is harder to notice, and I am glad that it is a correct representation of the monolith from „2001: A Space Odyssey” [2], it even respects the 1:4:9 proportions. I left it bigger to help in testing, but I can scale it to a smaller size. In the broader picture of the game, it has the purpose of displaying an educational video to the player, so it would cease to be just a big black box.
Q2: The monolith is in the same place all the time and I can just avoid the enemies and fly directly towards it to win the game. How can you change this?
A2: For the first issue, see the answer to S3, Q1, Eval1. For the second one, I can make the monolith appear only after the player has destroyed all the enemies in the game, by setting it as active at a random position on the scene.
Table 12. Group heuristic evaluation results – second part
S4
Eval1:
Q1: Do you think it is possible to identify the moment when a player entered too much in collision with a building and lost control, and therefore restart the game or display a relevant message?
A1: Yes, I can write a method that would check when it enters in collision with an object and have a certain functionality - restart or display a message. But I believe that in this case it is better to prevent this error.
Eval2:
Q1: Why is it so big and easy to obtain?
A1: So it can be noticed from far away. I do not think it should be difficult to obtain. Indeed, right now we can't see its true importance, but if more enemies were present, and you had a limited amount of resources (e.g. ammo), the player would have to pick the right time to use each power-up box. This would bring the game closer to the strategy genre.

S5
Eval1: -
Eval2:
Q1: Why is it so big and easy to obtain? Also, did you think of informing the player when it expires?
A1: Same as before, with the addition that, for now, the player has no information when the power-up expires, so a suitable message should be added when this happens.

S6
Eval1: -
Eval2: -

aid the evaluators to focus on the desired elements. Each person who tests the game will complete a table for each scenario, in which he/she will give a mark between 0 and 100 for each of the 10 heuristics, and then will write a report with the found problems. These aspects are detailed in Table 3 and Table 4. There are two stages, one where the evaluation is done independently by each person, and one where it is done together with the developer to discuss the found issues directly on the game and to assess how suitable the proposed solutions are.

RESULT ANALYSIS
In this section, tables that contain the usability evaluation will be presented. The results of the evaluation were gathered in tables to keep track of the scenario that is tested, together with the problems it contains, and specific solutions that are proposed to remove the errors. Table 5 to Table 10 contain the results of the individual evaluation for each proposed scenario, together with a report on the discovered errors, and also with the solutions found by the developer. In Table 11 and Table 12, the most important questions that were asked during the group evaluation are taken apart and answered. Thus, we have successfully identified the drawbacks of the user interface design and implementation, and we were able to find solutions to correct them.
After the two evaluations, both evaluators found different types of errors and problems. Mostly, these had to do with functionality flaws, related to the interaction between the player’s vehicle and the buildings. It was also noted that the user is not fully satisfied with the game, because it is repetitive and does not bring a lot of excitement, given the low variety of tasks to carry out and of objects to interact with. After the heuristic evaluation, we reached the conclusion that the game has a level of usability of 92.6%.

CONCLUSIONS
This paper focused on heuristically evaluating a video game. The two evaluators had to execute a set of specific scenarios, to follow a group of tasks, and to write down reports with the problems that they found while testing the interactive application. Two main types of tables resulted, one containing the individual evaluations, and one with the group evaluation discussions. For each mentioned error or deficiency of the game, solutions were proposed by the game developer. It has been found that the game has a high level of usability. This evaluation represents a big help for a creator of interactive applications because it is an extremely helpful way of finding quickly and efficiently what the problems are, which speeds up the process of improving the system.

REFERENCES
[1] C. Pribeanu, D. D. Iordache, V. Lamanauskas, R. Vilkonis, "Evaluarea utilizabilităţii şi eficacităţii pedagogice a unui scenariu de învăţare bazat pe realitate îmbogăţită," presented at the Conferinta Nationala de Interactiune Om-Calculator - RoCHI, 2008.
[2] A. C. Clarke, S. Kubrick, 2001: A Space Odyssey, 1968.
[3] E. d. Kock, J. v. Biljon, and M. Pretorius, "Usability evaluation methods: mind the gaps," presented at the Proceedings of the 2009 Annual Research Conference of the South African Institute of Computer Scientists and Information Technologists, 2009.
[4] J. Nielsen, Usability Engineering, Morgan Kaufmann Publishers Inc., 1993.
[5] K. Jurek and C. Mary, "SUMI: the Software Usability Measurement Inventory," British Journal of Educational Technology, vol. 24, pp. 210-212, 1993.
[6] J. P. Chin, V. A. Diehl, and K. L. Norman, "Development of an instrument measuring user satisfaction of the human-computer interface," presented at the Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 1988.
[7] M. Y. Ivory and M. A. Hearst, "The state of the art in automating usability evaluation of user interfaces," ACM Comput. Surv., vol. 33, pp. 470-516, 2001.
[8] D. Gorgan, C. Rusu, D. Mihon, V. Colceriu, S. Roncagliolo, V. Rusu, "Euristici specifice de utilizabilitate pentru aplicaţiile paralele şi distribuite," presented at the Revista Română de Interacţiune Om-Calculator, Vol.4, Nr.2, 2011.
[9] A. M. Tarta and G. S. Moldovan, "Automatic Usability Evaluation Using AOP," 2006 IEEE International Conference on Automation, Quality and Testing, Robotics, vol. 2, pp. 84-89, 2006.
[10] A. M. Tarta, G. S. Moldovan, G. Serban, "An Agent Based User Interface Evaluation Using Aspect Oriented Programming Techniques," presented at the ICAM5, 2006.
[11] B. Shackel, "Usability - context, framework, definition, design and evaluation," in Human factors for informatics usability, Cambridge University Press, 1991, pp. 21-37.
[12] B. Shackel, "Usability - Context, framework, definition, design and evaluation," Interact. Comput., vol. 21, pp. 339-346, 2009.
[13] B. D. Preece J., Davies G., Keller G., Rogers Y., "A Guide to Usability," 1990.
[14] ISO9214-11, "Ergonomic Requirements for Office Work with VDTs – Guidance on Usability," 1991.
[15] A. Dix, J. E. Finlay, G. D. Abowd, and R. Beale, Human-Computer Interaction (3rd Edition), Prentice-Hall, Inc., 2003.
Heuristic Evaluation of the EvoGlimpse Video Game
Bianca-Cerasela-Zelia Blaga
Selma Evelyn Cătălina Goga
Al-doori Rami Watheq Yaseen
Dorian Gorgan
Introduction
• Digital games are recreational and a preferred pastime for children and teenagers [1]
• Evaluation is a key step in the development methodology of any human-computer interactive application
Ø iterative process, being intercalated in the stages of developing a video game [2]
• Advantage: highlights the design and implementation flaws, errors and specific deficiencies
• Establishes the success or failure rate of a software product
Ø Final Fantasy XIV was a massively multiplayer online role-playing game in
Square Enix's Final Fantasy series, developed as a spiritual successor to
Final Fantasy XI.
Ø The game was released for Microsoft Windows on September 30, 2010.
Ø The initial release of the game was met with poor reviews, with critics
describing grind-heavy gameplay, poor controls, and a confusing user
interface.
Ø Shortly after release, then-CEO of Square Enix Yoichi Wada issued an
official apology for the quality of the game at the 2011 Tokyo Game Show
in December 2011, saying that "the Final Fantasy brand [had] been greatly
damaged". [3]
INTERNATIONAL CONFERENCE ON
Heuristic Evaluation of the EvoGlimpse Video Game HUMAN-COMPUTER INTERACTION 2
3-4 September, Cluj-Napoca, Romania
Motivation
• Case study: EvoGlimpse video game [4]
• Focus: finding the errors and problems, creating thorough and helpful reports, and finding solutions
• Use: heuristic evaluation, which is a technique that helps determine the usability problems of a user interface
Ø (ISO 9241-11 standard) the extent to which a product can be used by specified users to achieve specified goals with
effectiveness, efficiency, and satisfaction in a specified context of use [5]
Ø (Shackel) the capacity of a system to be easily understood and efficient to use by a specific category of users, which
received instructions and assistance in the usage of the application, by executing tasks defined for a system [6, 7]
Ø The emphasis is on:
ü Efficiency – ratio between the used resources and the accuracy with which they can be used
ü Efficacy – the accuracy and correctness of the system
ü Ease of learning – how fast can the new users use the system correctly and at a high level of performance
ü Flexibility – how easy it is to use the controls, together with their correctness
ü Robustness – the help that the user has to fulfill the specific actions
ü Satisfaction – a more subjective measure that refers to the user’s comfort
INTERNATIONAL CONFERENCE ON
Heuristic Evaluation of the EvoGlimpse Video Game HUMAN-COMPUTER INTERACTION 3
3-4 September, Cluj-Napoca, Romania
Theoretical Considerations
• In virtual reality applications, for example, there are 6 stages [8]:
1. exploratory – where similar applications are analyzed and bibliographic material, related to the domain and the
evaluation heuristics, is collected;
2. descriptive – where the conclusions from the first stage are synthesized, and specific evaluations are formalized;
3. correlative – where the principal characteristics of the usability heuristics are identified, and representative case
studies are presented;
4. explanatory – where the heuristics are established following five characteristics (identifiers, explanation, example,
benefits, and problems);
5. validation – where the evaluators inspect the application based on the previously mentioned heuristics;
6. refinement – after which three types of problems are found and need to be solved.

INTERNATIONAL CONFERENCE ON
Heuristic Evaluation of the EvoGlimpse Video Game HUMAN-COMPUTER INTERACTION 4
3-4 September, Cluj-Napoca, Romania
Case Scenario:
EvoGlimpse
EvoGlimpse aims to give the players a glimpse into
evolution from the perspective of an exterior observer,
who can travel at different points in time of Earth’s
existence. This game is heavily inspired by the movie and
the book „2001: A Space Odyssey” [9], in which a
civilization of advanced beings helps humans that are in
different stages of evolution by presenting them ways
that can aid in their survival.
For the actual game implementation, the goal was to
create only one world, a futuristic one, on a planet covered
by water, in a developed society, with modern
architecture and flying cars. The main enemies will be
planes guided by artificial intelligence. The player will
have to protect himself / herself from them by shooting, for example
with bullets, plasma, or laser. After defeating all enemies,
since an advanced technology state has been reached,
the monolith will appear in an unknown location and will
have to be found by following its sound signals.

INTERNATIONAL CONFERENCE ON
Heuristic Evaluation of the EvoGlimpse Video Game HUMAN-COMPUTER INTERACTION 5
3-4 September, Cluj-Napoca, Romania
Scenarios
Scenario | Actions
1. Navigation in the 3D scene
   1. controlling the vehicle using the mouse movements
   2. increase speed by pressing space
   3. zoom in and out using the scroll wheel
2. Attacking and avoiding enemies
   1. observing the enemies
   2. flying towards the enemy
   3. player attacks by pressing the left button of the mouse
   4. the enemies attack when the player gets in a certain range and in a certain field of view
   5. observing the enemies' reaction
   6. avoiding enemies
3. Monolith
   1. the player should understand the objective, by reading the message shown on the screen
   2. successfully navigating in the scene
   3. observe the monolith
   4. fly towards the objective
   5. message of winning the game
4. Repair power-up box
   1. recognizing the object
   2. flight towards the objective
   3. collision with the object
   4. object destroyed
   5. life health increased
5. Immunity power-up box
   1. recognizing the object
   2. flight towards the objective
   3. collision with the object
   4. object destroyed
   5. enemy attack canceled for 20 seconds
6. Display relevant messages
   1. message with the game objectives
   2. toggle help option
   3. quit button
   4. player health information
   5. message of collecting repair power-up box
   6. message of collecting immunity power-up box
   7. message of destroying enemy
   8. message of losing the game
   9. message of winning the game
Heuristic Evaluation of the EvoGlimpse Video Game INTERNATIONAL CONFERENCE ON HUMAN-COMPUTER INTERACTION 6
3-4 September, Cluj-Napoca, Romania
Team and Evaluation Stages
• Method: Nielsen's 10 usability heuristics for user interface design [10]

Role | Specialization and year of study | Domain
Developer | Artificial Vision and Intelligence, 1st year master's student | Researcher in the image processing group; medium experience with video games
Evaluator 1 | Artificial Vision and Intelligence, 1st year master's student | Researcher in the image processing group; little experience with video games
Evaluator 2 | Artificial Vision and Intelligence, 1st year master's student | Experience in designing interactive applications; medium experience with video games

Nb. | Name | Evaluation technique
1. | Individual evaluation | • done independently by the 2 evaluators, by filling in separate tables for each scenario • a mark between 0 and 100 is assigned to each evaluation criterion, and at the end the average is taken • at the end of testing, reports are written with the encountered problems • the developer proposes solutions to solve the errors
2. | Group evaluation | • done by the 2 evaluators together with the game developer • tables with the most important questions are written, together with the found answers

INTERNATIONAL CONFERENCE ON
Heuristic Evaluation of the EvoGlimpse Video Game HUMAN-COMPUTER INTERACTION 7
3-4 September, Cluj-Napoca, Romania
Nielsen's 10 Usability Heuristics
1. Visibility of system status
Ø Is the state of the system visible at all times?
Ø Is the feedback offered by the system suitable?
Ø Is the response time appropriate, without unacceptable delays?
Ø The game scene will be observed, as well as the interaction with the objects and elements specific to each game
scenario; attention will be paid to the movement of the vehicle, attack, collection of the power-ups, the display of
messages
2. Match between system and the real world
Ø Does the game correspond to the mental model that the user has from a real-world game? Is it what you expected
or similar to other games?
Ø Are the language, words, and phrases used familiar to the user?
Ø Is there a natural way in displaying the information?
Ø Is this a suitable shooter game? Is the game scene realistic?
Ø Are there any uncertainties?

INTERNATIONAL CONFERENCE ON
Heuristic Evaluation of the EvoGlimpse Video Game HUMAN-COMPUTER INTERACTION 8
3-4 September, Cluj-Napoca, Romania
Nielsen's 10 Usability Heuristics
3. User control and freedom
Ø Can the user execute the necessary actions to fulfill the scenarios? Is their functioning correct?
Ø Can the user exit an unwanted state? For example, is there a need for an undo/ redo button?
Ø How does the vehicle control, attack, collection, and buttons feel?
4. Consistency and standards
Ø Is the user surprised by different words, situations or actions that have the same meaning?
Ø Is there consistency in the use of colors and symbols?
Ø Is the meaning of the objects from the scene understood?
5. Error prevention
Ø What is the functional correctness level of the game?
Ø Are the errors eliminated or are there methods to prevent situations that favor the occurrence of errors?
Ø For example, notice what happens if the player tries to get too close to the water, at the collision with different
objects etc.

INTERNATIONAL CONFERENCE ON
Heuristic Evaluation of the EvoGlimpse Video Game HUMAN-COMPUTER INTERACTION 9
3-4 September, Cluj-Napoca, Romania
Nielsen's 10 Usability Heuristics
6. Recognition rather than recall
Ø Can the player recognize the objects and their usage?
Ø Are there elements that require storage in the memory of the user?
7. Flexibility and efficiency of use
Ø What is the level of flexibility and efficiency of the game usage?
Ø Is the user bothered by certain aspects? Or are some of them missing?
8. Aesthetic and minimalist design
Ø What is the quantity of relevant information?
Ø Is there any redundant information?
Ø Is the information presented clear and easily accessible?
Ø Is the field of view of the player cluttered with too many elements or is it suitable?

INTERNATIONAL CONFERENCE ON
Heuristic Evaluation of the EvoGlimpse Video Game HUMAN-COMPUTER INTERACTION 10
3-4 September, Cluj-Napoca, Romania
Nielsen's 10 Usability Heuristics
9. Help users recognize, diagnose, and recover from errors
Ø Are the messages clear and helpful for the player?
Ø Should there be any additional error prevention cases?
10. Help and documentation
Ø Is the help menu complete?
Ø Does it contain clear, simple, and easily accessible information?
Ø Is the documentation clear, does it contain sufficient information for the player? If not, what should be added?

INTERNATIONAL CONFERENCE ON
Heuristic Evaluation of the EvoGlimpse Video Game HUMAN-COMPUTER INTERACTION 11
3-4 September, Cluj-Napoca, Romania
Results

Criteria | Scenario 1 | Scenario 2 | Scenario 3 | Scenario 4 | Scenario 5 | Scenario 6
Visibility of system status | 100 | 85 | 95 | 100 | 100 | 100
Match between system and the real world | 90 | 57.5 | 95 | 95 | 95 | 85
User control and freedom | 75 | 85 | 100 | 95 | 95 | 95
Consistency and standards | 95 | 95 | 85 | 100 | 100 | 85
Error prevention | 70 | 85 | 95 | 95 | 95 | 70
Recognition rather than recall | 95 | 100 | 80 | 90 | 90 | 85
Flexibility and efficiency of use | 95 | 85 | 100 | 100 | 100 | 95
Aesthetic and minimalist design | 100 | 100 | 95 | 100 | 100 | 100
Help users recognize, diagnose, and recover from errors | 72.5 | 85 | 100 | 95 | 95 | 100
Help and documentation | 90 | 100 | 100 | 100 | 100 | 100

Legend (cell colouring in the slide): 100 / 90 - 99 / 50 - 89. Overall: 92.6

Heuristic Evaluation of the EvoGlimpse Video Game INTERNATIONAL CONFERENCE ON HUMAN-COMPUTER INTERACTION 12
3-4 September, Cluj-Napoca, Romania
Summary of Detected Problems

1. Level of entertainment
   Problem: small map, and easy to win
   Solution: make the game scene bigger, and add more enemies, with more variety to their behavior
2. Errors
   Problem: building collision
   Solution: alter the bounce parameter of the physics materials
3. Difficulties
   Problem: not enough sounds and visual effects
   Solution: add a menu, new sounds and particle effects, and a life bar for enemies

INTERNATIONAL CONFERENCE ON
Heuristic Evaluation of the EvoGlimpse Video Game HUMAN-COMPUTER INTERACTION 13
3-4 September, Cluj-Napoca, Romania
Additional Questions and Observations
1. The mouse controlled movement of the vehicle is a great improvement
2. How can the collision problem be solved?
3. Insufficient indicators for the enemies' attacks (sounds, effects or health bar)
4. Too easy to take enemies down
5. The monolith causes confusion – its position does not change and it is difficult to spot
6. The power-up boxes are redundant and too easy to obtain

INTERNATIONAL CONFERENCE ON
Heuristic Evaluation of the EvoGlimpse Video Game HUMAN-COMPUTER INTERACTION 14
3-4 September, Cluj-Napoca, Romania
Conclusions
• After the two evaluations, both evaluators found different types of errors and problems;
• Mostly, it had to do with functionality flaws, related to the interaction between the player’s vehicle and the buildings;
• It was also noted that the user is not fully satisfied with the game, because it is repetitive and does not bring a lot of
excitement with the low variety of tasks it had to do, and with the objects it has to interact with;
• After the heuristic evaluation, we reached the conclusion that the game has a level of usability of 92.6%.

• As future work, use other methods such as Heuristics for Evaluating Playability [11] for each of the categories:
Ø game play – the set of problems and challenges a user must face to win a game;
Ø game story – plot and character development;
Ø game mechanics – the programming that provides the structure by which units interact with the environment;
Ø game usability – the interface; encompasses the elements the user utilizes to interact with the game (e.g. mouse,
keyboard, controller, game shell, heads-up display).

INTERNATIONAL CONFERENCE ON
Heuristic Evaluation of the EvoGlimpse Video Game HUMAN-COMPUTER INTERACTION 15
3-4 September, Cluj-Napoca, Romania
Bibliography
1. C. Pribeanu, D. D. Iordache, V. Lamanauskas, R. Vilkonis, "Evaluarea utilizabilităţii şi eficacităţii pedagogice a unui scenariu de învăţare bazat pe
realitate îmbogăţită," presented at the Conferinta Nationala de Interactiune Om-Calculator - RoCHI, 2008.
2. E. d. Kock, J. v. Biljon, and M. Pretorius, "Usability evaluation methods: mind the gaps," presented at the Proceedings of the 2009 Annual
Research Conference of the South African Institute of Computer Scientists and Information Technologists, 2009.
3. Senior, Tom (October 18, 2010). "Final Fantasy XIV review". PC Gamer. Archived from the original on August 10, 2015. Retrieved October 20,
2010.
4. B. C. Z. Blaga, D. Gorgan, " Game Development Methodology Mapped on the EvoGlimpse Video Game Experiment “, RoCHI 2018.
5. ISO9214-11, "Ergonomic Requirements for office Work with VDT’s – Guidance on Usability," 1991.
6. B. Shackel, "Usability - context, framework, definition, design and evaluation," in Human factors for informatics usability, Cambridge University
Press, 1991, pp. 21-37.
7. B. Shackel, "Usability - Context, framework, definition, design and evaluation," Interact. Comput., vol. 21, pp. 339-346, 2009.
8. D. Gorgan, C. Rusu, D. Mihon, V. Colceriu, S. Roncagliolo, V. Rusu, "Euristici specifice de utilizabilitate pentru aplicaţiile paralele şi distribuite,"
presented at the Revista Română de Interacţiune Om-Calculator, Vol.4, Nr.2, 2011.
9. A. C. Clarke, S. Kubrick, "2001: a space odyssey", 1968.
10. J. Nielsen, "Usability Engineering", Morgan Kaufmann Publishers Inc., 1993.
11. H. Desurvire, M. Caplan, J. A. Toth, "Using heuristics to evaluate the playability of games", CHI '04 Extended Abstracts on Human Factors in
Computing Systems, Vienna, Austria, pp. 1509 – 1512, 2004.

INTERNATIONAL CONFERENCE ON
Heuristic Evaluation of the EvoGlimpse Video Game HUMAN-COMPUTER INTERACTION 16
3-4 September, Cluj-Napoca, Romania
Thank you!

INTERNATIONAL CONFERENCE ON
Heuristic Evaluation of the EvoGlimpse Video Game HUMAN-COMPUTER INTERACTION 17
3-4 September, Cluj-Napoca, Romania
Interaction Techniques

User Interface Design


Contents

o Interaction basics

o Types of interaction techniques

o Classification
n Gestures
n Simple interaction techniques
n Complex interaction techniques
n Other interaction techniques

User Interface Design - UTCN 2


Human-Computer Interaction
o Interaction devices
o Interaction technologies
o Interaction techniques

User Interface Design - UTCN 3


Interaction cycle
o We want to reuse good, basic methods of interaction
o Techniques are collected into higher level interactions called dialogs
o These interaction techniques are collected into most UI toolkits
o The components of a technique are:
1. Prompter: initial output giving context
2. Symbol: user input, what the user does with the input device
3. Echo: output in response to user actions, telling the user how the actions are interpreted
4. Value: final result of the technique, the value passed on to the system

[Diagram: the interaction cycle runs from Start through Prompter, Symbol and Echo to the final Value at Finish.]

User Interface Design - UTCN 4
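As an illustration of the four components above, the following is a minimal, library-free Python sketch (not from the lecture; the simulated key events are hypothetical) of a small text-entry technique organised as prompter, symbol, echo and value.

```python
# Minimal sketch of the interaction cycle: Prompter -> Symbol -> Echo -> Value.
# The input device is simulated by an iterator of key events, so the example is
# self-contained and runnable without a GUI toolkit.

def text_entry_technique(key_events):
    """A toy STRING technique: collects characters until Enter is received."""
    # 1. Prompter: initial output that gives the user context.
    print("Enter a name and press <Enter>: ", end="", flush=True)

    buffer = []
    for key in key_events:                 # 2. Symbol: user input from the device.
        if key == "\n":
            break
        buffer.append(key)
        print(key, end="", flush=True)     # 3. Echo: show how the input was interpreted.

    value = "".join(buffer)
    print()                                # finish the echo line
    return value                           # 4. Value: final result passed to the system.

# Simulated keyboard input (would normally come from the physical device).
result = text_entry_technique(iter("Ana\n"))
print("Value passed to the application:", result)
```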


Interaction technique
o Metaphor (symbolic representation of a real case):
1. Visual presentation +
2. Scenario +
3. Sequence of user actions +
4. Interaction device

User Interface Design - UTCN 5


Interaction architecture

[Diagram: interaction architecture. User events (Mousemove, Mouseover, Push_left_button (MLB)) reach the visual UI object (the UI object with its visual presentation and interactive object), which processes the user messages and raises object events (e.g., PushBttn) that are dispatched to the application procedures P1, P2, ..., Pn.]

User Interface Design - UTCN 6


Physical interaction devices
o Input devices
n Indirect and direct pointing devices, keyboards,
microphones, video cameras, body sensors, position/
orientation sensors: accelerometers, magnetometers,
gyroscopes, Wii, Kinect, leap motion, robots, etc.

o Output devices
n Monitors, earphones, loudspeakers, haptic devices (force
feedback, tactile, temperature), robots, dolls, etc.
o Input-output devices

User Interface Design - UTCN 7


Input devices
o Indirect pointing
n Mouse, trackballs, joysticks, etc.
o Direct pointing devices
n Touchscreens, touchpads, multi-touch devices

User Interface Design - UTCN 8


Interaction techniques in GUI
o Classification by operation/logical device:

1. Select, choice

2. Quantify, valuator

3. Text, string

4. Position, locator

5. Object selection, pick

6. Point sequence, sweep, stroke

User Interface Design - UTCN 9


Interaction techniques in GUI

Logical input device/function =


metaphor + interaction cycle + physical device

User Interface Design - UTCN 10


Select, choice
o Choose one or more integers from a known set
o Examples: menus, checkboxes
o Physical device: keyboard, set of buttons (function keys)

CHOICE input function = metaphor (labeled circles, squares) +


interaction cycle + physical device (e.g., mouse)
Single choice -> radio button
Multiple choices -> check boxes

A set of option buttons A set of check boxes

User Interface Design - UTCN 11
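A minimal sketch of the CHOICE logical function (illustrative Python; the option labels, row height and click coordinate are hypothetical) showing how a single-choice radio-button metaphor maps a click to an integer from the known set.

```python
# Sketch of a CHOICE logical input function: the metaphor is a column of
# labeled circles (radio buttons); the value returned is an integer index
# from a known set. A click inside an option's row selects exactly one option.

OPTIONS = ["Small", "Medium", "Large"]   # the known set (single choice)
ROW_HEIGHT = 20                          # hypothetical layout: one option per row

def radio_group_choice(click_y):
    """Map the vertical click position onto the index of the chosen option."""
    index = click_y // ROW_HEIGHT
    if 0 <= index < len(OPTIONS):
        return index                     # CHOICE value delivered to the application
    return None                          # click outside the group: no choice made

# A click 45 pixels below the top of the group lands on the third option.
choice = radio_group_choice(45)
print(choice, OPTIONS[choice])           # 2 Large
```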


Quantify, valuator
o Input a real value
o Examples: scroll bars, dials (potentiometers)
o Physical device: analog to digital converter, potentiometers

VALUATOR input function = metaphor (index on the scale) +


interaction cycle + physical device (e.g., mouse)

A slider

A static text label for a slider

User Interface Design - UTCN 12


Text, string
o Input some characters
o Examples: text box, text field
o Physical device: keyboard

STRING input function = metaphor (edit box) + interaction cycle +


physical device (e.g., keyboard)

A standard text box

User Interface Design - UTCN 13


Position, locator
o Input a 2D location
o Examples: cursor / mouse, rollerball
o Physical device: graphic tablet, joystick

LOCATOR input function = metaphor (screen area) + interaction


cycle + physical device (e.g., mouse)

(x,y)

User Interface Design - UTCN 14


Object selection, pick
o Choose an object from a known set
o Examples: cursor / mouse, keyboard input
o Physical device: light-pen

PICK input function = metaphor (application objects) + interaction


cycle + physical device (e.g., mouse)

User Interface Design - UTCN 15


Point sequence, sweep, stroke
o Input a series of 2D points
o Examples: cursor / mouse, pen
o Physical device: mouse, track-ball

o STROKE input function = metaphor (screen area) + interaction


cycle + physical device (e.g., mouse)

(dx,dy)

User Interface Design - UTCN 16


Classification of interaction techniques
o Classification by interaction complexity:

1. Gestures
n an event sequence that has a well-defined significance

2. Simple interaction techniques


n process the input value of a single variable

3. Complex interaction techniques


n tools that provide for the definition and operation on complex
information

User Interface Design - UTCN 17


Gestures

1. Click
2. Press-down

3. Release

4. Press-timer

5. Range
6. Drag

Reference: Classification given by P. Szekely in “Separating the User Interface from the Functionality
of Application’s Program”, PhD Thesis, Carnegie-Mellon Univ., 1988.

User Interface Design - UTCN 18


Click gesture
o Formal description:
press: vtemp = value
release: var = vtemp
output (cursor leaves the object): vtemp = null
input (cursor re-enters the object): vtemp = value
output, release: none

o Example: value
identifier of a UI control such as radio button, check box,
ID of an application object,
ID of a menu item.
o Use case:
Click on application object (i.e. rectangle) to select it.
Click on a menu item to select it.

User Interface Design - UTCN 19
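A minimal Python sketch (illustrative; the event names and object identifier are assumptions) of the click gesture's transition rules above: the temporary value is latched on press and delivered only if the release happens over the same target.

```python
# Sketch of the click gesture state machine described above.
# Events: "press" (with a value), "release", "output" = cursor leaves the
# target area, "input" (with a value) = cursor re-enters the target area.

class ClickGesture:
    def __init__(self):
        self.vtemp = None     # temporary value latched on press
        self.var = None       # final value delivered on a valid release

    def handle(self, event, value=None):
        if event == "press":
            self.vtemp = value            # press: vtemp = value
        elif event == "output":
            self.vtemp = None             # output: vtemp = null
        elif event == "input":
            self.vtemp = value            # input: vtemp = value
        elif event == "release":
            if self.vtemp is not None:
                self.var = self.vtemp     # release: var = vtemp
            # release after output: none (nothing is delivered)

# A click completed over object 7 delivers the value; moving off first cancels it.
g = ClickGesture()
g.handle("press", 7); g.handle("release")
print(g.var)          # 7

g2 = ClickGesture()
g2.handle("press", 7); g2.handle("output"); g2.handle("release")
print(g2.var)         # None (released outside the object)
```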


Press-down gesture
o Formal description:
input: vtemp = value
press: var = vtemp
output: vtemp = null

o Example: value
identifier of a UI control such as radio button, check box,
ID of an application object,
ID of a menu item.
o Use case:
Press on application object (i.e. rectangle) to select and drag it.
Press on a file icon and drag it into a directory or a garbage collector.

User Interface Design - UTCN 20


Release gesture
o Formal description:
input: vtemp = value
release: var = vtemp
output: vtemp = null

o Example: value
ID of an application object,
Position within a working area.
o Use case:
Release an application object onto a specific position.
Release the file icon onto the icon of a garbage collector.
Specifies by direct manipulation the second endpoint of a line segment.

User Interface Design - UTCN 21


Press-timer gesture
o Formal description:
input: vtemp = value
press: var = vtemp; vtemp = new_value; start_timer(delay)
time tick: var = vtemp; vtemp = new_value
output: vtemp = null
output, tick: none
release: stop_timer

o Example:
value - ID of a slider,
new_value – ID of the index within a slider.
o Use case: The user moves the mouse cursor inside the slider area. The slider
has associated a timer that give the rate to scroll the working area of a graphic
editor. When the user pres the left button of the mouse on the upper arrow of the
slider the object scene scrolls up until the user releases the button. Leaving the
slider area the timer stops.

User Interface Design - UTCN 22
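The use case above (scrolling while the button is held on the slider arrow) can be sketched with a simulated clock; the delay and scroll step below are hypothetical values, not taken from the lecture.

```python
# Sketch of the press-timer gesture: while the button is held on a scroll arrow,
# every timer tick delivers a new value (scroll one more line). Time is given in
# integer milliseconds so the example runs without a real timer or GUI.

DELAY_MS = 200       # hypothetical auto-repeat period (start_timer(delay))
SCROLL_STEP = 1      # lines scrolled per delivered value

def auto_repeat_scroll(press_ms, release_ms):
    """Total number of lines scrolled between press and release."""
    scrolled = SCROLL_STEP                # press: the first value is delivered at once
    t = press_ms + DELAY_MS
    while t <= release_ms:                # time tick: deliver a new value each period
        scrolled += SCROLL_STEP
        t += DELAY_MS
    return scrolled                       # release: stop_timer

# Holding the button for one second scrolls 6 lines (1 on press + 5 ticks).
print(auto_repeat_scroll(0, 1000))
```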


Range gesture
o Formal description:
press: var_1 = value_1
move: v2temp = value_2
release: var_2 = v2temp
output, move (outside the range area): none
o Example:
value_1 - first end point of a line segment,
value_2 – second end point of the line segment.
var_1 and var_2 – values sent to the application, i.e. graphics editor

o Use case:
Pressing the mouse button, the user specifies the first corner of a rubber
band. Moving the cursor, the user specifies the temporary second
corner. The editor draws the intermediate rubber band rectangle. When
the user releases the button the cursor position is sent to the editor
and the application draws the final rubber band rectangle.

User Interface Design - UTCN 23


Drag gestures
o Formal description:
press: vtemp = value
move: vtemp = value
release: var = vtemp
output (cursor leaves the dragging area): vtemp = null
input (cursor re-enters the dragging area): vtemp = value
output, release: none

o Example:
value - current position of the moving cursor,
var – value sent to the application program.
o Use case:
The user clicks on the iconic presentation of an application object. He drags
the icon around the dragging area. The application drops the icon on
the position where the user releases the button.
If the user releases the button outside the dragging area, the icon jumps
onto the initial position.

User Interface Design - UTCN 24
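Following the same notation, a minimal Python sketch of the drag gesture (the dragging area coordinates and move positions are illustrative): the dragged value tracks the cursor, and releasing outside the dragging area delivers nothing, so the icon jumps back.

```python
# Sketch of the drag gesture described above: the temporary value follows the
# cursor while the button is held; releasing inside the dragging area delivers
# the final position, releasing outside it delivers nothing (icon snaps back).

DRAG_AREA = (0, 0, 100, 100)     # hypothetical dragging area (x0, y0, x1, y1)

def inside(pos, area):
    x, y = pos
    x0, y0, x1, y1 = area
    return x0 <= x <= x1 and y0 <= y <= y1

def drag(start, moves):
    """Return the drop position, or None if the release happened outside."""
    vtemp = start                                       # press: vtemp = value
    for pos in moves:                                   # move: vtemp = value
        vtemp = pos if inside(pos, DRAG_AREA) else None  # output: vtemp = null
    return vtemp                                        # release: var = vtemp (or none)

print(drag((10, 10), [(30, 40), (60, 70)]))    # (60, 70): dropped inside the area
print(drag((10, 10), [(30, 40), (150, 70)]))   # None: released outside, icon jumps back
```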


Simple interaction techniques
o The user enters a single value
into the application.

o Example:

n Radio buttons

n Check boxes

n Command button

n Push button

n Scroll bars, slider

n Edit box

n List box

User Interface Design - UTCN 25


Buttons
o Buttons
n Command buttons
n Option buttons
n Check boxes

User Interface Design - UTCN 26


Command buttons

Command button states:
Normal appearance
Pressed appearance
Input focus appearance
Default appearance
Unavailable appearance

[Figures: command button states; a button that provides access to additional information; a menu button (closed and opened states)]

User Interface Design - UTCN 27


Radio button
o Radio button definition
1. General presentation: label
2. Not selected: label
3. Selected: label
4. Selection preview: label
5. Invalidated: label

User Interface Design - UTCN 28


Option buttons and check boxes

[Figures: a set of option buttons; an option button used to label another control; a set of check boxes; option buttons with mixed-value appearance; a check box label used to label another control; a check box setting used to filter the contents of a list]

User Interface Design - UTCN 29


Text boxes

[Figures: a standard text box; a rich-text box; a drop-down combo box (closed and opened states); a combo box; a static text field; a spin box]

User Interface Design - UTCN 30


List boxes

A drop-down list box (closed and opened states)

A single-selection list box

A multiple-selection list box

User Interface Design - UTCN 31


Complex interaction techniques
o The user enters more than one input value
o Examples:

1. Menus

2. Valuators: dials (potentiometers), sliders

3. Object position: mouse

4. Object selection: mouse

5. Object shape: mouse

6. Drag and drop

7. Drag and guess

8. Bubble cursor

User Interface Design - UTCN 32


Menus
o Menu items: text or icons
o Text is understandable but verbose
o Icons are often difficult to interpret, but once understood, they are very
easily perceived
o Item order
o Alphabetical, functional, frequency of use
o Trend is to functional (task grouping) order
o Choose an ordering scheme
o Do not change the order during use
o If item goes inactive, gray it out
o Item shape, size
o Rectangular
n Bigger is easier to select, but takes up more screen space
n Various sizes in a single application can be unpleasant
n Horizontal does not work well with textual items
o Pie
n Farther mouse moves make errors less likely
n Explicit control of the time/accuracy tradeoff
n Distance to each choice is the same

User Interface Design - UTCN 33


Example of menus

Menu bar

Accelerator

Cascading
menu and
multiple
selections

User Interface Design - UTCN 34


Example of pie menu

Example given by Prof. Olga


de Troyer, VUB, HCI course

User Interface Design - UTCN 35


Menus

User Interface Design - UTCN 36


User Interface Design - UTCN 37
Menu - examples

User Interface Design - UTCN 38


Menu – toolbars and status bars

User Interface Design - UTCN 39


Menus
o Multiple (sub-) menus
n Linear
n Hierarchy
o Advantage of context
o Can group tasks based on the task analysis
o Hierarchy also requires changes in mouse motion
o Decision: breadth vs. depth
o Research says depth should be more limited
o Choice time a function of depth
n Networks
o Allow more than one path to a choice
o Makes sense for large applications with many subsystems
o Disadvantage: Easier to get lost

User Interface Design - UTCN 40


Menus

a. Singular menu b. Linear sequence

c. Tree structure d. Acyclic network

e. Cyclic network

User Interface Design - UTCN 41


Menus
o Shortcutting
Large menu sequences need shortcuts for frequent actions
n Typeahead
o Text commands map onto menu
o Users need not wait for display to type ahead
o Example, phone systems, commercial devices
n Menu names
o Skip parent menus by naming submenus
o Example: “go MGraphOps”
o Useful when users need only few submenus
n Macros
o Can record sequence of actions, keystrokes to hotkey
n Toolbars
o Iconic menus (usually horizontal) allow quick access to choices
o Common in word processors, graphic applications

User Interface Design - UTCN 42


Menus
o Temporal properties
On web, menu response not instant
n Response time
o Time until submenu appears
o Depth needs more time
o Have more width
n Display rate
o Time until all choices appear
o Large width needs more time
o Have more depth
o Requires frequency based ordering

User Interface Design - UTCN 43


Menus - examples

User Interface Design - UTCN 44


Valuators
o Valuators: dials (potentiometers), sliders
n Text
o Very precise
o Disadvantage: can be annoying in direct manipulation style
o Can be too precise
n Dials/sliders
o Basic issues:
n Decide precision vs. range
n To get needed range of numbers
n Provide required resolution
o Solutions:
1. Multiple widgets
n Each has different gain
2. Multiple buttons:
n Each button has different increase
3. nth order control:
n Zeros: position to position, e.g. mouse
n First: position to velocity, e.g. steering wheel
n Second: position to acceleration, e.g. space
4. Map velocity to precision:
n Slow: high precision, low range
n Fast: low precision, high range

User Interface Design - UTCN 45
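One way to read the "map velocity to precision" solution above is a gain that grows with the speed of the physical movement; the sketch below (illustrative Python with hypothetical gains and threshold) applies a low gain to slow, precise motions and a high gain to fast, coarse ones.

```python
# Sketch of the "map velocity to precision" solution for valuators:
# slow physical movement -> high precision (low gain, small value change),
# fast physical movement -> high range (high gain, large value change).

SLOW_GAIN = 0.2        # hypothetical gains and speed threshold
FAST_GAIN = 2.0
SPEED_THRESHOLD = 50   # device units per second

def value_delta(device_delta, speed):
    """Translate a physical movement into a change of the controlled value."""
    gain = SLOW_GAIN if speed < SPEED_THRESHOLD else FAST_GAIN
    return device_delta * gain

print(value_delta(10, speed=10))    # 2.0  -> fine adjustment
print(value_delta(10, speed=200))   # 20.0 -> coarse, wide-range adjustment
```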


Object position: mouse
o Example: where does object go
o Feedback
n Dragging object, cursor follows
n Display cursor coordinates
n Display ruler lines
o Accuracy
n Gridding
o Truncating coordinates to a certain precision
o Easier to line things up
n Gravity
o Cursor jumps to objects of interest
o Counterpart is area cursors, with more than one point for selection
n Control and display ratio
o Movement of physical mouse results in certain cursor motion on display
o The degree of movement is controlled by a ratio, the Control/Display
ratio
o The Control/Display ratio is not constant, it changes with velocity

User Interface Design - UTCN 46
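The two accuracy aids mentioned above, gridding and gravity, can be sketched as small coordinate transformations; the grid spacing, gravity radius and object positions below are illustrative values.

```python
# Sketch of two accuracy aids for positioning with the mouse:
# gridding (truncate coordinates to a grid) and gravity (the cursor position
# snaps to a nearby object of interest).

import math

GRID = 10            # hypothetical grid spacing in pixels
GRAVITY_RADIUS = 15  # hypothetical snap distance in pixels

def snap_to_grid(x, y):
    """Truncate the coordinates to the grid precision."""
    return (x // GRID) * GRID, (y // GRID) * GRID

def snap_to_object(x, y, objects):
    """Jump to the closest object of interest if it is within the gravity radius."""
    best, best_d = None, GRAVITY_RADIUS
    for ox, oy in objects:
        d = math.hypot(ox - x, oy - y)
        if d <= best_d:
            best, best_d = (ox, oy), d
    return best if best is not None else (x, y)

print(snap_to_grid(37, 53))                          # (30, 50)
print(snap_to_object(37, 53, [(40, 50), (90, 90)]))  # (40, 50): within the gravity radius
```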


Object selection: mouse
o "picking up“ the object

o Issues:
1. If an object consists of subobjects, does the user want the object
or the subobject?
2. If several objects overlap, which one does the user want?
3. Can differentiate points with different selection meaning
4. Can also have different actions for different selection levels
5. The notion of front and back

User Interface Design - UTCN 47


Object selection: Classification
o Selection space
n Object (3D)
n Image (2D)

o Selected item
n Text identifier
n Object name
n Integer identifier
n Direct manipulation
n Graphics primitive: point, line, rectangle, etc.
n Graphics label
n Object area
n Contour based selection
n Active area, context

User Interface Design - UTCN 48


Object selection: Classification
o Access type
n Sequential
n Direct
n Indirect

o Object type
n Icon
n Bitmap
n Graphics primitive
n Complex objects (aggregate)
n Raster
n Vector based object

User Interface Design - UTCN 49


Object selection: Classification
o Logical input device
n Locator: mouse, trackball, joystick
n Stroke: mouse
n Pick: light-pen
n Valuator: potentiometer, slider
n Choice: buttons, function keys
n Text: keyboard

o Interaction techniques
n Gestures: press, release, etc.
n Simple: buttons, box, slider, etc.
n Complex: dialog box, menu, palette, etc.

o Selection semantics
n One context
n Sequence of contexts

User Interface Design - UTCN 50


Object selection: examples

User Interface Design - UTCN 51


Object selection: examples

User Interface Design - UTCN 52


Object selection: examples

User Interface Design - UTCN 53


Object selection: examples

User Interface Design - UTCN 54



Other general controls

A progress indicator A slider

A static text label for a slider

A date picker control

A tab control

Balloon tips in a dialog box


A taskbar notification balloon tip

User Interface Design - UTCN 55


Object shape: mouse

o Shape: line, polygon, etc.

o Feedback: control the shape as the points are entered

o Rubber banding

User Interface Design - UTCN 56
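Rubber banding means the shape is continuously recomputed from the anchor point and the current cursor position; a minimal Python sketch for a rubber-band rectangle follows (the event stream is simulated and the coordinates are illustrative).

```python
# Sketch of rubber banding for a rectangle: the first corner is fixed on press,
# the opposite corner follows the cursor, and the intermediate rectangle is
# redrawn (here: printed) as feedback until the button is released.

def rubber_band_rectangle(anchor, moves):
    """Return the final rectangle; print the intermediate ones as feedback."""
    rect = None
    ax, ay = anchor                       # press: first corner of the rubber band
    for cx, cy in moves:                  # move: temporary opposite corner
        rect = (min(ax, cx), min(ay, cy), max(ax, cx), max(ay, cy))
        print("echo:", rect)              # feedback while the shape is entered
    return rect                           # release: final rectangle sent to the editor

print("final:", rubber_band_rectangle((10, 10), [(20, 15), (40, 5), (60, 50)]))
```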


Interaction Styles
Contents

o Interaction style

o Command based interaction styles


n Key-modal
n Linguistic
n Direct manipulation

o Continuous interaction styles

User Interface Design - UTCN 2


Interaction mode and style
o Conceptual models: from interaction mode to style

o Interaction mode:
n what the user is doing when interacting with a system
e.g. instructing, talking, browsing or other

o Interaction style:
n the kind of interface used to support the interaction mode
e.g. speech, menu-based, gesture
n interaction techniques are organized into higher level structures,
called interaction styles or dialogue styles
n usually the interfaces provide more than one style, but one is
predominant
n appropriate style depends on user and task

User Interface Design - UTCN 3


Types of interaction styles

A. Command based
o Interactions can be broken down into discrete units of interaction
o Example: operator + a sequence of operands

B. Continuous
o This is a new type of style evolving in simulation, games, virtual reality.
No commands
o Examples: the control of visualization in VRML browsers, user
movement through the virtual space in 3D graphical editors

User Interface Design - UTCN 4


A. Command based interaction style
o Interactions can be broken down into discrete units of
interaction
o This is the most common (and older) style

o Interaction takes a cyclic form:
1. user makes request
2. system processes request
3. user processes system results

[Diagram: the User sends a Command to the System and receives the Results back]

o Most forms of this style use


o operators: the commands, describing action
o operands: the parameters for the command

User Interface Design - UTCN 5


Basic types of command style

A1. Key-modal
o Command set depends on and is limited by system state, or
mode
o Easy to learn, ideal for “walk-up”

A2. Linguistic
o Keyboard is used as input
o There is a language syntax that governs input

A3. Direct manipulation


o Operands are displayed graphically, chosen by user with
mouse
o Operators chosen with commands or menus
o Operands show result of operation graphically

User Interface Design - UTCN 6


User Interface Design - UTCN 7
Command syntax specification
o Operator to operand ordering
o Prefix
o Postfix
o Infix

o Operand ordering
o Fixed order means easier to check, explain
o Free order means no need to learn ordering
o Free order possible only if types or entry methods are
different

User Interface Design - UTCN 8


Prefix - operator to operand ordering

o Syntax: the operator comes first


op opnd1 opnd2...
o +: Seems natural in English, where the verb comes first
o +: Sets context (syntax) for operands
o -: Which is the last operand? It needs a "do it" signal, e.g. return
(invocation input communication concept)
o Implicit invocation
E.g. Rotate obj1 point angle
o Explicit invocation
E.g. Aggregate agg obj1 obj2 … objn ^

User Interface Design - UTCN 9


Postfix - operator to operand ordering

o Syntax: the operator comes last:


opnd1 opnd2 ... op
o +: end is clear. Implicit invocation
o +: operands provide some limited context for operator choice
o +: supports use of many operands
o -: no operator context for operands

o E.g. obj1 obj2 … objn Group

User Interface Design - UTCN 10


Infix - operator to operand ordering

o Syntax: the operator comes somewhere in the middle:


opnd1 ... op opnd2 ...
o -: limited operator context
o -: end uncertain
o +: flexibility in entry

o E.g. agg Add obj obj2 … objn ^


obj1 obj2 … objn Add agg

User Interface Design - UTCN 11
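The practical difference between the orderings shows up in how a command line is parsed; the toy Python sketch below (hypothetical commands, not a real command language) contrasts prefix with explicit invocation and postfix with implicit invocation, using the slide's own examples.

```python
# Toy sketch contrasting prefix and postfix command syntax.

def parse_prefix(tokens, terminator="^"):
    """Prefix with explicit invocation: op opnd1 opnd2 ... ^"""
    op = tokens[0]                                   # operator first: sets the context
    operands = tokens[1:tokens.index(terminator)]    # the "do it" signal marks the end
    return op, operands

def parse_postfix(tokens):
    """Postfix with implicit invocation: opnd1 opnd2 ... op"""
    return tokens[-1], tokens[:-1]                   # the operator itself ends the command

print(parse_prefix(["Aggregate", "agg", "obj1", "obj2", "^"]))
print(parse_postfix(["obj1", "obj2", "obj3", "Group"]))
```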


Feedback in command style
o Informs the user how the user interface is interpreting the
user's actions

o Feedback types:
1. Lexical
n After each physical action,
n e.g. echo key press, mouse move
2. Syntactic
n Is syntax of command correct?
n Usually after command complete (earlier if prefix)
n No feedback if the command is ok, error message if bad
3. Semantic
n Shows results of executed command
n If command destructive, confirmation is requested

User Interface Design - UTCN 12


Default in command style

o Enable user efficiency by avoiding repetitive entry


o Can have default operators, or operands

o Selection of default
o Explicitly chosen
o Last used
o Most frequently used

User Interface Design - UTCN 13


Basic types of command style

A1. Key-modal
o Command set depends on and is limited by system state, or
mode
o Easy to learn, ideal for “walk-up”

A2. Linguistic
o Keyboard is used as input
o There is a language syntax that governs input

A3. Direct manipulation


o Operands are displayed graphically, chosen by user with
mouse
o Operators chosen with commands or menus
o Operands show result of operation graphically

User Interface Design - UTCN 14


A1. Key-modal
o Command set depends on and is limited by system state, or
mode
o Easy to learn, ideal for “walk-up”
o Examples:
o Menu based interaction
o Question and answer
o Function key interaction
o Voice based interaction
o Real world examples:
o ATMs, on-line banking system
o Voice based phone interface
o Configuration tools
o Characteristics:
o Used in complex applications
o Modes in a well determined sequence: sequential dialogue
o Prevalent in key-modal, but mixed with other styles

User Interface Design - UTCN 15


Key-modal - example

User Interface Design - UTCN 16


Key-modal - example

User Interface Design - UTCN 17


A2. Linguistic
o Keyboard is used as input
o There is a language syntax that governs input
o Steep learning curve, but experts are efficient

o Examples:
o Command line
o Natural language
o Real world examples:
o UNIX
o Database query languages (e.g. SQL)
o Organization schemes:
o Simple sets: one command per operator
o Operator with operand
o Operator, option, operand
o Hierarchical

User Interface Design - UTCN 18


Command line
o Way of expressing instructions directly to the computer
o Provides direct access to the system functionality
o Suitable for repetitive tasks, expert users. Experienced user
has feeling of control
o Difficult for beginners. Requires to memorize the command
language
o Command names should be meaningful

Example: Query dialogue


n Question/answer interfaces
o the user is led through the interaction via a series of questions
o suitable for beginners, but restricted functionality
n Query languages (e.g. SQL)
o used to retrieve information from a database
o requires understanding of the database structure and
language syntax, hence some expertise

User Interface Design - UTCN 19


Natural language

o Familiar to user
o Use speech recognition or typed natural language
o Problems: vague, ambiguous

User Interface Design - UTCN 20


A3. Direct manipulation
o Operands are displayed graphically, chosen by user with
mouse
o Operators chosen with commands or menus
o Operands show result of operation graphically
o A compromise in ease of learning and efficiency between key-
modal and linguistic.

o Examples:
o graphical direct manipulation
o form fill-in

o Real-world examples:
o GUIs in today’s applications
o MS Windows
o Databases with form entry
(a drag-and-drop sketch follows this slide)

User Interface Design - UTCN 21
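The sketch below uses the standard tkinter toolkit (a choice made for this example, not prescribed by the lecture) to show direct manipulation: the operand is a rectangle picked with the mouse, and the result of dragging it is shown graphically and immediately, with no command syntax involved.

```python
# Direct-manipulation sketch: drag a rectangle on a tkinter canvas.

import tkinter as tk

root = tk.Tk()
canvas = tk.Canvas(root, width=300, height=200, bg="white")
canvas.pack()
rect = canvas.create_rectangle(40, 40, 100, 90, fill="steelblue")

drag = {"x": 0, "y": 0}

def on_press(event):
    drag["x"], drag["y"] = event.x, event.y     # remember the grab point

def on_drag(event):
    dx, dy = event.x - drag["x"], event.y - drag["y"]
    canvas.move(rect, dx, dy)                   # effect is visible immediately
    drag["x"], drag["y"] = event.x, event.y

canvas.tag_bind(rect, "<ButtonPress-1>", on_press)
canvas.tag_bind(rect, "<B1-Motion>", on_drag)

root.mainloop()
```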


Interaction style classification

1. Key-modal
n Menu selection

2. Linguistic
n Command line
n Natural language

3. Direct manipulation
n WIMP (Windows, Icons, Menu, Pointers)
n Form fill-in and spreadsheets

User Interface Design - UTCN 22


Interaction style characteristics

Interaction style: Direct manipulation (WIMP)
  Main advantages: Fast and intuitive interaction. Easy to learn.
  Main disadvantages: May be hard to implement. Only suitable where there is a visual metaphor for tasks and objects.
  Application examples: Video games, CAD systems.

Interaction style: Menu selection
  Main advantages: Avoids user error. Little typing required.
  Main disadvantages: Slow for experienced users. Can become complex if there are many menu options.
  Application examples: Most general-purpose systems.

Interaction style: Form fill-in
  Main advantages: Simple data entry. Easy to learn. Checkable.
  Main disadvantages: Takes up a lot of screen space. Causes problems where user options don’t match the form fields.
  Application examples: Stock control, personal loan processing.

Interaction style: Command line
  Main advantages: Powerful and flexible.
  Main disadvantages: Hard to learn. Poor error management.
  Application examples: Operating systems, command and control systems.

Interaction style: Natural language
  Main advantages: Accessible to casual users. Easily extended.
  Main disadvantages: Requires more typing. Natural language understanding systems are unreliable.
  Application examples: Information retrieval systems.

Table from Ch. 16, “User Interface Design”, in Ian Sommerville, “Software Engineering”, 7th ed., 2004

User Interface Design - UTCN 23


Other kinds of interaction styles
o Command
o Speech
o Data-entry
o Form fill-in
o Query
o Graphical
o Web
o Pen
o Augmented reality
o Gesture
o …

User Interface Design - UTCN 24


Direct manipulation - principles

Principles of direct manipulation:

n Continuous representation of the objects and actions of interest,
with meaningful visual metaphors

n Physical actions or presses of labeled buttons, instead of complex
syntax

n Rapid, incremental, reversible operations whose effect on the
object of interest is visible immediately
(WYSIWYG – what you see is what you get)

User Interface Design - UTCN 25


Direct manipulation
o Default style for majority of PCs

o Based on
o windows
o icons
o cursors
o menus

o WIMP interface - Windows, Icons, Menu, Pointers


o Widgets: the elements of the WIMP interface,
a toolkit for interaction between user and system,
e.g. buttons, toolbars, palettes, dialog boxes

Appearance + Behavior = Look and Feel
(a small widget sketch follows this slide)

User Interface Design - UTCN 26
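A small widget sketch with tkinter (again an assumed toolkit, not named by the lecture): a window with a title bar, a minimal toolbar with a button, and a dialog box; together their appearance and behavior give the look and feel.

```python
# WIMP widget sketch: window, toolbar, button, label and a dialog box.

import tkinter as tk
from tkinter import messagebox

root = tk.Tk()
root.title("Widget demo")                         # text shown in the title bar

def greet():
    # A modal dialog box opened by the button press
    messagebox.showinfo("Dialog box", "Button pressed")

toolbar = tk.Frame(root, bd=1, relief="raised")   # a minimal toolbar
tk.Button(toolbar, text="Greet", command=greet).pack(side="left", padx=2, pady=2)
toolbar.pack(side="top", fill="x")

tk.Label(root, text="A window containing widgets").pack(padx=20, pady=20)

root.mainloop()
```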


Direct manipulation - example

User Interface Design - UTCN 27


Direct manipulation - windows

Windows - areas of the screen that behave as if they were
independent terminals

n can contain text or graphics


n can be moved or resized
n can overlap and obscure each other,
or can be laid out next to one another (tiled)
n scrollbars allow the user to move the contents
of the window up and down or from side to side
n title bars describe the name of the window

User Interface Design - UTCN 28


Direct manipulation - icons

Icons - small picture or image

n represents some object in the interface, often a window or action


n windows can be closed down (iconised)
n small representation ⇒ many accessible windows
n icons can be many and various
n highly stylized or realistic representations

User Interface Design - UTCN 29


Direct manipulation - cursors

Cursors – pointers

n usually achieved with mouse


n and also joystick, trackball, cursor keys or keyboard shortcuts
n wide variety of graphical images

User Interface Design - UTCN 30


Direct manipulation - menus

Menus - choice of operations or services offered on the screen

Types of menu:
1. Cascading menu
an item calls another menu
2. Pop-up and pull-down menus
pull-down menu usually attached to the menu bar (top of the screen or
window)
pop-up menu more efficient (appears anywhere on the screen, usually
attached to an object - context menu)
3. Pin-up menu
4. Multiple selection
select/deselect items
needs a “done” function
5. Pie menu
selection time for each item is the same
(a menu sketch in code follows this slide)

User Interface Design - UTCN 31
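The sketch below (tkinter, an assumed toolkit) builds three of the menu types above: a pull-down menu on the menu bar with an accelerator, a cascading submenu, and a pop-up (context) menu bound to right-click.

```python
# Menu sketch: menu bar, pull-down menu, cascading submenu, pop-up menu.

import tkinter as tk

root = tk.Tk()

menubar = tk.Menu(root)
filemenu = tk.Menu(menubar, tearoff=0)            # pull-down menu
export = tk.Menu(filemenu, tearoff=0)             # cascading submenu
export.add_command(label="As PDF", command=lambda: print("export pdf"))
export.add_command(label="As PNG", command=lambda: print("export png"))
filemenu.add_cascade(label="Export", menu=export)
filemenu.add_command(label="Quit", accelerator="Ctrl+Q", command=root.quit)
menubar.add_cascade(label="File", menu=filemenu)
root.config(menu=menubar)
root.bind("<Control-q>", lambda e: root.quit())   # the accelerator itself

popup = tk.Menu(root, tearoff=0)                  # pop-up (context) menu
popup.add_command(label="Paste here", command=lambda: print("paste"))
root.bind("<Button-3>", lambda e: popup.tk_popup(e.x_root, e.y_root))

root.mainloop()
```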


Example of menus

(Screenshot: menu bar, accelerator, cascading menu with multiple selections)

User Interface Design - UTCN 32


Example of pie menu

Example given by Prof. Olga de Troyer, VUB, HCI course

User Interface Design - UTCN 33


Form fill-in

o Primarily for data entry or data retrieval


o Screen laid out like a paper form
o Data entered in the relevant place
o Requires:
n good design
n obvious correction facilities
(a small form sketch follows this slide)

User Interface Design - UTCN 34
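A minimal form fill-in sketch in tkinter (assumed toolkit, and the form and field names are invented): the screen is laid out like a paper form, and an obvious correction facility flags missing required fields instead of silently accepting them.

```python
# Form fill-in sketch: labeled entry fields with a simple correction facility.

import tkinter as tk

root = tk.Tk()
root.title("Loan application")                 # illustrative form name

fields = {}
for row, name in enumerate(("Name", "Amount")):
    tk.Label(root, text=name + ":").grid(row=row, column=0, sticky="e")
    entry = tk.Entry(root)
    entry.grid(row=row, column=1, padx=5, pady=2)
    fields[name] = entry

status = tk.Label(root, text="", fg="red")
status.grid(row=3, column=0, columnspan=2)

def submit():
    missing = [n for n, e in fields.items() if not e.get().strip()]
    if missing:                                # obvious correction feedback
        status.config(text="Please fill in: " + ", ".join(missing), fg="red")
        fields[missing[0]].focus_set()
    else:
        status.config(text="Submitted", fg="green")

tk.Button(root, text="Submit", command=submit).grid(row=2, column=1, sticky="e")
root.mainloop()
```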


Form fill-in - example

User Interface Design - UTCN 35


Form fill-in - example

User Interface Design - UTCN 36


B. Continuous based interaction style
o This is a new type of style evolving in:
o simulation
o games
o virtual reality
o 3D graphics

o Characterized by:

1. No commands, input and output is continuous


n Or the command cycle is so fast as to seem so

2. Interaction techniques more “natural”, intuitive


n Pointing, grasping, head turning, arm movement
n Fewer menus, less typing

3. Environment is 3D
n Not a desktop, but a room or a 3D working space

User Interface Design - UTCN 37


Continuous based interaction style
o Command syntax sample:
op object to direction by speed
move avatar to forward by fast
o Continuous command inputs and output results:
op: move, jump, . . .
object: avatar, . . .
direction: forward, left, right, . . .
speed: slow, fast, . . .
(a minimal game-loop sketch follows this slide)

User Interface Design - UTCN 38
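A hedged sketch of continuous interaction: there is no discrete command, just a loop that samples input and updates the avatar position every frame. The avatar, speed and direction vocabularies mirror the sample above, but the loop itself, its names and the scripted input are invented for illustration; a real game would read a controller instead.

```python
# Continuous interaction sketch: per-frame input sampling and output update.

import time

SPEED = {"slow": 1.0, "fast": 3.0}
DIRECTION = {"forward": (0, 1), "left": (-1, 0), "right": (1, 0)}

def game_loop(frames, read_input, dt=0.05):
    x, y = 0.0, 0.0
    for _ in range(frames):
        direction, speed = read_input()            # input sampled every frame
        dx, dy = DIRECTION[direction]
        x += dx * SPEED[speed] * dt                # continuous update
        y += dy * SPEED[speed] * dt
        print(f"avatar at ({x:.2f}, {y:.2f})")     # continuous output
        time.sleep(dt)

# "move avatar to forward by fast", expressed as a constant input stream
game_loop(frames=10, read_input=lambda: ("forward", "fast"))
```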


Video games

User Interface Design - UTCN 39


User interaction devices in video games

User Interface Design - UTCN 40


Continuous interaction style - example

User Interface Design - UTCN 41
