Unit III. Implementation Support
OVERVIEW
Programming tools for interactive systems provide a means of effectively translating abstract
designs and usability principles into an executable form. These tools offer different levels of
service to the programmer.
Windowing systems are a central environment for both the programmer and user of an
interactive system, allowing a single workstation to support separate user–system threads of
action simultaneously.
Interaction toolkits abstract away from the physical separation of input and output devices,
allowing the programmer to describe behaviors of objects at a level similar to how the user
perceives them.
User interface management systems are the final level of programming support tools, allowing
the designer and programmer to control the relationship between the presentation objects of a
toolkit and their functional semantics in the actual application.
INTRODUCTION
In this chapter, we will discuss the programming support that is provided for the
implementation of an interactive system. We have spent much effort up to this point considering design
and analysis of interactive systems from a relatively abstract perspective. We did this because it was not
necessary to consider the specific details of the devices used in the interaction. Furthermore,
consideration of that detail was an obstacle to understanding the interaction from the user’s
perspective. But we cannot forever ignore the specifics of the device. It is now time to devote some
attention to understanding just how the task of coding the interactive application is structured. The
detailed specification tells the programmer what the interactive application must do; the
programmer must translate that into machine-executable instructions which determine how it will be
achieved on the available hardware devices. The programmer's objective, then, is to translate the
design down to the level of the software that runs the hardware devices. At its crudest level, this software provides
the ability to do things like read events from various input devices and write primitive graphics
commands to a display. While it is possible in that crude language to produce highly interactive
systems, the job is tedious and highly error prone; it suits programmers who relish the
intricacy and challenge, but not those whose main concern is the design of very usable
interactive systems. The programming support tools we describe in this chapter aim to move that
executable language up from the crudely expressive level to a higher level in which the programmer can
code more directly in terms of the interaction objects of the application. The emphasis here is on how
building levels of abstraction on top of the essential hardware and software services allows the
programmer to build the system in terms of its desired interaction techniques, a term we use to indicate
the intimate relationship between input and output. Though there is a fundamental separation between
input and output in the hardware and at the lowest software level, the distinction can
be removed at the programming level with the right abstractions and hiding of detail. In the remainder
of this chapter, we will address the various layers which constitute the move from the low-level
hardware up to the more abstract programming concepts for interaction. We begin in Section 8.2 with
the elements of a windowing system, which provide for device independence and resource sharing at
the programming level. Programming in a window system frees the programmer from some of the
worry about the input and output primitives of the machines the application will run on, and allows her
to program the application under the assumption that it will receive a stream of event requests from the
window manager. In Section 8.3 we describe the two fundamental ways this stream of events can be
processed to link the interface with the application functionality: by means of a read–evaluation control
loop internal to the application program or by a centralized notification-based technique external to it.
In Section 8.4, we describe the use of toolkits as mechanisms to link input and output at the
programming level. In Section 8.5, we discuss the large class of development tools lumped under the
categories of user interface management systems, or UIMS, and user interface development systems,
UIDS.
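To give an early flavor of the difference between the two event-processing styles mentioned above, the sketch below contrasts them in Java. It is illustrative only: the Event record, the event queue and nextEvent() are hypothetical stand-ins for what a window system would deliver, while the notification-based half uses the real Swing toolkit.

```java
// A minimal sketch, not a real window system API: Event, nextEvent() and
// the queue are hypothetical; the notification half uses real Swing calls.
import java.util.ArrayDeque;
import java.util.Queue;

public class EventStyles {

    enum Type { KEY_PRESS, MOUSE_CLICK, QUIT }

    record Event(Type type, int x, int y) {}

    // Stand-in for the stream of events delivered by the window manager.
    static final Queue<Event> queue = new ArrayDeque<>();

    static Event nextEvent() { return queue.poll(); }

    // Style 1: a read-evaluation loop internal to the application program.
    static void readEvaluationLoop() {
        for (Event e = nextEvent(); e != null && e.type() != Type.QUIT; e = nextEvent()) {
            switch (e.type()) {
                case KEY_PRESS   -> System.out.println("key at " + e.x() + "," + e.y());
                case MOUSE_CLICK -> System.out.println("click at " + e.x() + "," + e.y());
                default          -> { }
            }
        }
    }

    // Style 2: notification-based; the application only registers callbacks
    // and a centralized dispatcher (here the Swing toolkit) invokes them.
    static void notificationStyle() {
        javax.swing.JButton button = new javax.swing.JButton("Press me");
        button.addActionListener(e -> System.out.println("button pressed"));
        javax.swing.JFrame frame = new javax.swing.JFrame("Demo");
        frame.add(button);
        frame.pack();
        frame.setVisible(true);   // control now rests with the toolkit's loop
    }

    public static void main(String[] args) {
        queue.add(new Event(Type.MOUSE_CLICK, 10, 20));
        queue.add(new Event(Type.QUIT, 0, 0));
        readEvaluationLoop();
        javax.swing.SwingUtilities.invokeLater(EventStyles::notificationStyle);
    }
}
```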
ELEMENTS OF WINDOWING SYSTEMS
In earlier chapters we have discussed the elements of the WIMP interface, but only with respect to how
they enhance the interaction with the end-user. Here we will describe more details of windowing
systems used to build the WIMP interface. The first important feature of a windowing system is its
ability to provide programmer independence from the specifics of the hardware devices. A typical
workstation will involve some visual display screen, a keyboard and some pointing device, such as a
mouse. Any variety of these hardware devices can be used in any interactive system and they are all
different in terms of the data they communicate and the commands that are used to instruct them. It is
imperative to be able to program an application that will run on a wide range of devices. To do this, the
programmer directs commands to an abstract terminal, which understands a generic
language that can be translated into the language of each specific device. Besides making the
programming task easier, the abstract terminal makes portability of application programs possible. Only
one translation program – or device driver – needs to be written for a particular hardware device and
then any application program can access it. A minimal sketch of this abstract terminal idea is given below.
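All of the names in this sketch are illustrative assumptions rather than the API of any real windowing system; the point is only that application code depends on the generic interface, while each device driver encapsulates one device's specifics.

```java
// A minimal sketch of the abstract terminal idea, under assumed names:
// neither AbstractTerminal nor the classes below come from a real system.
public interface AbstractTerminal {
    void drawLine(int x1, int y1, int x2, int y2);
    void drawText(int x, int y, String text);
}

// A device driver: the only code that must know this device's commands.
class RasterDisplayDriver implements AbstractTerminal {
    public void drawLine(int x1, int y1, int x2, int y2) {
        // translate to pixel-setting commands for this particular display
    }
    public void drawText(int x, int y, String text) {
        // render the string using this display's font machinery
    }
}

// Application code is written once, against the abstract terminal, and
// is therefore portable to any device for which a driver exists.
class Application {
    static void drawBox(AbstractTerminal t, int x, int y, int w, int h) {
        t.drawLine(x, y, x + w, y);
        t.drawLine(x + w, y, x + w, y + h);
        t.drawLine(x + w, y + h, x, y + h);
        t.drawLine(x, y + h, x, y);
    }
}
```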
A given windowing system will have a fixed generic language for its abstract terminal, which is called its imaging model. Imaging models are general enough to describe arbitrary images, but for efficiency reasons specific primitives are usually provided to handle text, either as fixed pixel images or as more generic font definitions. Common imaging models include the following.
Pixels. The display screen is represented as a series of columns and rows of points (pixels) which can be explicitly turned on or off, or given a color. This is a common imaging model for personal computers and is also the one used by the X windowing system.

Graphical Kernel System (GKS). An international standard which models the screen as a collection of connected segments, each of which is a macro of elementary graphics commands.

Programmer's Hierarchical Interface to Graphics (PHIGS). Another international standard, based on GKS but with an extension to model the screen as editable segments.

PostScript. A programming language developed by Adobe Corporation which models the screen as a collection of paths which serve as infinitely thin boundaries or stencils which can be filled in with various colors or textured patterns and images.
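As a concrete illustration of the first of these, the pixel model amounts to nothing more than a grid of addressable points. The sketch below is an illustrative assumption (including its dimensions and packed-RGB color encoding), not any real system's framebuffer interface.

```java
// A minimal sketch of the pixel imaging model: the screen is a grid of
// addressable points, each of which can be given a color.
public class FrameBuffer {
    private final int[][] pixels;   // pixels[row][column], packed RGB

    public FrameBuffer(int rows, int cols) { pixels = new int[rows][cols]; }

    public void setPixel(int row, int col, int rgb) { pixels[row][col] = rgb; }

    // Everything richer (lines, text, images) is ultimately reduced to
    // setPixel calls under this model.
    public void horizontalLine(int row, int fromCol, int toCol, int rgb) {
        for (int c = fromCol; c <= toCol; c++) setPixel(row, c, rgb);
    }
}
```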
Though these imaging models were initially defined to provide abstract languages for output only, they can serve at least a limited role for input as well. For example, the pixel model can be used to interpret input from a mouse in terms of the pixel coordinate system; it is then the job of the application to process the input event further once it knows where in the image it occurred, as in the hit-testing sketch below. The other models can provide even more expressiveness for the input language, because they can relate input events to structures that are identifiable by the application program. Both PHIGS and PostScript have been augmented to include a more explicit model of input.
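Pixel-level interpretation of input can be sketched as a simple hit test: the application keeps its own record of where objects lie on the screen and searches it with the event's coordinates. Rect and the object list below are hypothetical application structures, not part of any imaging model.

```java
// A minimal hit-testing sketch: map a mouse event's (x, y) pixel
// coordinates to the application object that was hit, if any.
import java.util.List;

public class HitTest {
    record Rect(String name, int x, int y, int width, int height) {
        boolean contains(int px, int py) {
            return px >= x && px < x + width && py >= y && py < y + height;
        }
    }

    // Objects are listed front-most first, so the first match wins
    // when regions overlap.
    static Rect hit(List<Rect> objects, int px, int py) {
        for (Rect r : objects) {
            if (r.contains(px, py)) return r;
        }
        return null;   // the click fell on the background
    }

    public static void main(String[] args) {
        List<Rect> screen = List.of(new Rect("button", 10, 10, 80, 20),
                                    new Rect("canvas", 0, 0, 640, 480));
        System.out.println(hit(screen, 15, 15).name());   // prints "button"
    }
}
```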
When we discussed the WIMP interface as an interaction paradigm in Chapter 4, we pointed out its ability to support several separate user tasks
simultaneously. Windowing systems provide this capability by sharing the resources of a single hardware
configuration with several copies of an abstract terminal. Each abstract terminal will behave as an
independent process and the windowing system will coordinate the control of the concurrent processes.
To ease the programming task again, this coordination of simultaneously active processes can be
factored out of the individual applications, so that they can be programmed as if they were to operate in
isolation. The window system must also provide a means of displaying the separate applications, and
this is accomplished by dedicating a region of the display screen to each active abstract terminal. The
coordination task then involves resolving display conflicts when the visible screen regions of two abstract terminals overlap; one such resolution scheme is sketched below. In summary, we can see the role of a windowing system, depicted in Figure 8.1, as providing:
- independence from the specifics of programming separate hardware devices;
- management of multiple, independent but simultaneously active applications.
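A common way of resolving overlap conflicts is to keep window regions in a back-to-front order and repaint them in that order, so that a front window overwrites the overlapping parts of those behind it. The following sketch, with illustrative names only, shows the idea.

```java
// A minimal sketch of z-order conflict resolution between window regions.
import java.util.ArrayList;
import java.util.List;

public class ZOrderCompositor {
    interface Window { void paint(); }

    private final List<Window> backToFront = new ArrayList<>();

    void add(Window w) { backToFront.add(w); }   // new windows appear on top

    // Bringing a window to the front is just a reordering plus a repaint.
    void raise(Window w) {
        backToFront.remove(w);
        backToFront.add(w);
        redisplay();
    }

    // Repaint every window in back-to-front order; the display ends up
    // showing the front-most content wherever regions overlap.
    void redisplay() {
        for (Window w : backToFront) {
            w.paint();
        }
    }
}
```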
Next, we discuss the possible architectures of a windowing system to achieve these two tasks.
Bass and Coutaz [29] identify three possible architectures for the software to implement the roles of a
windowing system. All of them assume that device drivers are separate from the application programs.
The first option is to implement and replicate the management of the multiple processes within each of
the separate applications. This is not a very satisfactory architecture because it forces each application
to consider the difficult problems of resolving synchronization conflicts with the shared hardware
devices. It also reduces the portability of the separate applications. The second option is to implement
the management role within the kernel of the operating system, centralizing the management task by
freeing it from the individual applications. Applications must still be developed with the specifics of the
particular operating system in mind. The third option provides the most portability, as the management
function is written as a separate application in its own right and so can provide an interface to other
application programs that is generic across all operating systems. This final option is referred to as the
client–server architecture, and is depicted in Figure 8.2. A minimal sketch of this split is given below.
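In the sketch that follows, the Client interface and the string-based "protocol" are illustrative assumptions, not the X protocol or any other real one; the point is only that the server alone owns the shared display and input devices, while clients are ordinary applications that send it requests and receive events back.

```java
// A minimal sketch of the client-server architecture, under assumed names.
import java.util.ArrayList;
import java.util.List;

public class WindowServer {

    // How a client appears to the server; a real system would use a
    // network connection so clients can even run on other machines.
    interface Client {
        void deliverEvent(String event);   // server -> client
    }

    private final List<Client> clients = new ArrayList<>();
    private Client focus;                  // who currently receives input

    void connect(Client c) { clients.add(c); focus = c; }

    // Client -> server: a request to draw on the shared display.
    void handleRequest(Client from, String request) {
        System.out.println("drawing for client: " + request);
    }

    // Input from the shared hardware devices goes to the focused client.
    void hardwareInput(String event) {
        if (focus != null) focus.deliverEvent(event);
    }

    // Some events, such as a change of screen configuration, concern
    // every connected client.
    void broadcast(String event) {
        for (Client c : clients) c.deliverEvent(event);
    }
}
```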
In practice, the divide among these proposed architectures is not so clear, and any actual interactive application or set of applications operating within a window system may share features with any one of these three conceptual architectures. For example, it may have one component that is a separate application or process together with some built-in operating system support and hand-tuned application