Lesson 1
The early history of computing can be traced back to the narrow aims of
mathematicians, logicians and astronomers who had particular calculations that
needed to be performed. The Persian astrologer Al-Kashi (1393–1449) built a device
to calculate the conjunction of the planets. Records of this work survived and were
transported to Europe, although the device itself was lost. The German
mathematician Wilhelm Schickard (1592–1635) developed a much less sophisticated
tool to perform simple addition and subtraction. The Schickard machine was
destroyed during the Thirty Years’ War. The French mathematician Blaise Pascal
(1623–1662) replicated much of Schickard’s work but succeeded only in building an
even more simplified version of that machine.
There was no gradual improvement in our knowledge over time. War, famine and the
plague interrupted the development of mechanical computing devices. This, combined
with the primitive nature of the hardware, meant that user interfaces were almost
non-existent. The systems were used by the people who built them. There was little
or no incentive to improve HCI.
The agricultural and industrial revolutions in Western Europe created the need for
external markets and external sources of raw materials. This greatly increased the
level of trade that was already conducted in commodities such as spices, gold and
slaves. This, in turn, led to a rapid expansion in the merchant navies maintained
by many countries. In the past the captains of these ships relied upon local
knowledge and expertise. They always followed the same sea routes. As trade
developed, this expertise became less important. As a result, there was an
increasing need to produce accurate maps and navigation charts. These involved the
calculation of precise distances and longitudes needed for navigation.
The demand for navigational aids fuelled the development of computing devices.
Charles Babbage (1791–1871) was a British mathematician and inventor whose early
attempts were funded by the Navy Board. As in previous centuries, his Difference
Engine was designed to calculate a specific kind of function (6th-degree
polynomials): a + bN + cN² + dN³ + eN⁴ + fN⁵ + gN⁶.
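The Difference Engine evaluated such polynomials by the method of finite
differences, so that each new table entry could be produced by addition alone. The
following Python fragment is a minimal sketch of that idea (the function names and
the example polynomial are illustrative, not Babbage's notation or design):

# A minimal sketch (in Python, not Babbage's mechanism) of the method of finite
# differences that the Difference Engine mechanised: once the initial differences
# of a polynomial are known, every further table value is produced by addition alone.

def initial_differences(values):
    """Given p(0), p(1), ..., p(d) for a degree-d polynomial,
    return the seed row [p(0), Δp(0), Δ²p(0), ..., Δᵈp(0)]."""
    diffs = []
    row = list(values)
    while row:
        diffs.append(row[0])
        row = [b - a for a, b in zip(row, row[1:])]
    return diffs

def tabulate(diffs, count):
    """Produce `count` successive polynomial values using additions only."""
    diffs = list(diffs)
    table = []
    for _ in range(count):
        table.append(diffs[0])
        # bump each difference by the one to its right (pure addition)
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return table

# Illustrative example: p(N) = 2 + 3N + N² (a 2nd-degree polynomial)
p = lambda N: 2 + 3 * N + N ** 2
seed = [p(N) for N in range(3)]                # p(0), p(1), p(2)
print(tabulate(initial_differences(seed), 6))  # [2, 6, 12, 20, 30, 42]

The point is that a machine built only from adding mechanisms can tabulate any
polynomial of a fixed degree, which is exactly the narrow calculating task the
Difference Engine was designed for.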
This machine was never completed. Babbage’s second machine, called the Analytical
Engine, was a more general computer. This created the problem of how to supply the
machine with its program. Punched cards were used and became perhaps the first
solution to a user interface problem. The idea was so popular that this style of
interaction dominated computer use for the next century.
With the rise of mass production techniques on the east coast of the United States,
economic pressures for trade increased. This had the effect of drawing migrants who
were fleeing famine in Ireland and Scandinavia. The rapid influx of people caused
severe problems for the United States government, which wanted to monitor this flow
in order to avoid the introduction of epidemics from certain parts of the world and
to build a profile of the population for tax reasons. This demand for large-scale
record keeping was met by punched-card tabulating machines, such as those Herman
Hollerith developed for the 1890 United States census.
The Second World War created another set of narrow applications for computing
devices. Alan Turing, an English logician and a founder of computer science, was
employed to break the German encryption techniques. This led to the development of
the Colossus (1943), perhaps the first truly interactive computer. The operator
could type input through a keyboard and gain output via a teleprinter.
Figure 1.2: ENIAC (1943). Two early programmers (standing: Marlyn Wescoff;
crouching: Ruth Lichterman) at work on the ENIAC. US Army photo from the archives of
the ARL Library (US Army Research Laboratory). Source:
http://ftp.arl.army.mil/~mike/comphist/
Many of the Colossus techniques were applied in the ENIAC machine (see figure 1.2),
the first all-electronic digital computer, produced around 1946 by JW Mauchly
and JP Eckert in the United States. As with Colossus, the impetus for this work
came from the military, who were interested in ballistic calculations. To program
the machine, operators had to physically manipulate 200 plugs and 100 to 200 relays.
The Manchester Mark I computer also dates from about this period.
In 1945 Vannevar Bush, an electrical engineer in the USA, published his “As we may
think” article in Atlantic Monthly. In this article Bush set out his idea for the
Memex system. The Memex was a device in which individuals could
store all personal books, records and communications, and from which items could be
retrieved rapidly through indexing, keywords and cross-references. The user could
annotate text with comments, construct a trail (a chain of links) through the
material, and save it. Although the system was never implemented, and although the
device was based on microfilm records rather than computers, it anticipated the idea
of hypertext and the World Wide Web (WWW) as we know them today.
By this time the first programming languages began to appear. These systems were
intended to hide the details of the underlying hardware from programmers. In
previous approaches, one was required to understand the physical machine. In 1957
IBM launched FORTRAN, one of the first high-level programming languages, which
created a new class of novice users: people who wanted to learn how to program but
who did not want a detailed understanding of the underlying mechanisms. FORTRAN was
based on algebra, grammar and syntax rules, and became the most widely used
computer language for technical work.
In the early 1950s some of the earliest electronic computers, such as MIT’s
Whirlwind and the SAGE air-defence command and control system, had displays as
integral components. By the middle of the 1950s it became obvious that the computer
could be used to manipulate pictures as well as numbers and text. Probably the most
successful in this area was Ivan Sutherland who, in 1963, developed the SketchPad
system at the MIT Lincoln Laboratory. It was a sophisticated drawing package which
introduced many of the concepts found in today’s interfaces, such as the
manipulation of objects using a light-pen (including grabbing objects, moving them
and changing their size), and the use of constraints and icons. Hardware developments
that took place during the same period include “low-cost” graphics terminals, input
devices such as data tablets, and display processors capable of real-time
manipulation of images.
Two of the most dominant influences in suggesting the potential of the technology
of this era have been Doug Engelbart and Ted Nelson. They both took the concept of
the Memex system and elaborated on it in various ways. Whereas Nelson focused on
links and interconnections (which he named ‘hypertext’ and implemented as the
Xanadu system), Engelbart concentrated primarily on the hierarchic structure of
documents. In 1963 he published an article entitled “A conceptual framework for
augmenting human intellect”, in which he viewed the computer as an instrument for
augmenting man’s intellect by increasing his capability to approach complex problem
situations.
The turning points in the development of computers that would allow them to become
available to the man in the street occurred in the mid-1970s ‒ also the period
which saw the rise of two major American role players in today’s computer industry:
Microsoft and Apple Computers. Initial attempts to support the “desktop metaphor”
pushed graphical facilities and processor speeds to their limits. Below we follow
the development towards where we are today.
The Apple Company was founded by Steve Jobs and Steven Wozniak in 1976. Initially,
they produced a series of kit machines similar to those that led to the development
of the IBM PC a few years later. They hit upon the idea of pushing the code needed
to represent the desktop into hardware. Graphics and device handling were burned
into ROM (read-only memory). This led to a higher degree of consistency because it
became more difficult to change the look and feel of the interface. (The Apple
history website http://www.apple-history.com/ provides greater detail about their
early history.)
Apple I (Figure 1.3) was Steven Wozniak’s first personal computer. It made its
first public appearance in April 1976 at the Homebrew Computer Club in Palo Alto,
but few took it seriously. It was sold as a kit one had to assemble. At $666.66, it
was an expensive piece of machinery even by today’s standards, considering that the
price only included the circuit board. Users even had to build the computer case
themselves. It was based on the MOS Technology 6502 chip (most other kit computers
used the Intel 8080 chip).
Apple I From: http://en.wikipedia.org/w/index.php?title=Apple_I&oldid=499416018
Before the 1980s, personal computers were used only by enthusiasts. They were sold
as kits and were distributed through magazines and electronics shops. This meant
that their user population consisted almost entirely of experts. They understood
the underlying hardware and software mechanisms because they had built most of it.
Many people thought that they were mere toys. In the late seventies this attitude
began to change as the demand for low-end systems began to increase.
In 1981 IBM introduced their first PC (see figure 1.4) together with DOS (Disk
Operating System). Little has changed in the underlying architecture of this system
since its introduction. The relatively low cost and ease with which small-scale
clusters could be built (even if they were not networked) vastly expanded the user
population. A cycle commenced in which more people were introduced to computers.
Increasing amounts of work were transferred to these systems and this forced yet
more people to use the applications. As a result, casual users began to appear for
the first time. They were people whose work occasionally required the use of a
computer but who spent most of their working lives away from a terminal. This user
group found PCs hard to use. In particular, the textual language required to
operate DOS was perceived to be complex and obscure.
1.2.5.3 The Xerox Star (1982)
Although the graphical user interface (GUI) had its roots in the 1950s, it was not
developed until the 1970s when a group at the Xerox Palo Alto Research Center
(PARC) developed the Alto, a GUI-based computer. The Alto was the size of a large
desk, and Xerox believed it to be unmarketable. In 1982 Xerox introduced their STAR
user interface. This marks what many people believe to be the beginning of HCI as a
conscious design activity by software companies. In response to the increasing use
of PCs by casual users and in office environments, Xerox began to explore more
intuitive means of presenting the files, directories and devices that were
represented by obscure pieces of text in DOS. Files were represented by icons and
were deleted by dragging them over a wastebasket. Other features or principles
included a small set of generic commands that could be used throughout the system,
a high degree of consistency and simplicity, a limited amount of user tailorability,
what you see is what you get (WYSIWYG), and the promotion of recognising/pointing
rather than remembering/typing. It was the first system based
upon usability engineering. Ben Shneiderman of the University of Maryland coined
the term “direct manipulation” in 1982 and introduced the psychological foundations
of computer use (Myers 1998). In the lessons that follow we will come back to
the importance of some of these principles in interface design.
Steve Jobs of Apple Computers took a tour of PARC in 1979 and saw the future of
personal computing in the Alto. Although much of the interface of both the Apple
Lisa (1983) and the Apple MacIntosh (Mac) (1984) was based (at least
intellectually) on the work done at PARC, much of the Mac OS (operating system) was
written before Jobs’ visit to PARC. Many of the engineers from PARC later left to
join Apple. When Jobs accused Bill Gates of Microsoft of stealing the GUI from
Apple and using it in Windows 1.0, Gates fired back: “No, Steve, I think it’s more
like we both have a rich neighbour named Xerox, and you broke in to steal the TV
set, and you found out I’d been there first, and you said, ‘Hey, that’s not fair! I
wanted to steal the TV set!’”
The fact that both Apple and Microsoft got the idea of the GUI from Xerox put a
major dent in Apple’s lawsuit against Microsoft over the GUI several years later.
Although much of the Mac OS was original, it was similar enough to the old Alto GUI
to make a look-and-feel suit against Microsoft doubtful. Today the look and feel of
the Microsoft Windows environment and the Mac are very similar, although both have
retained some of their original unique features and identities (also in the naming
of features). As far as hardware is concerned, the Apple and the PC have developed
in more or less the same direction; the main difference is that Apple has
experimented beyond pure functionality with the aesthetics of its machines. The
Macintosh was the first popular computer to use a mouse and a GUI, and it was
initially used mainly as a desktop publishing tool.