A New History of Modern Computing
Ebook · 1,012 pages · 12 hours

About this ebook

How the computer became universal.

Over the past fifty years, the computer has been transformed from a hulking scientific supertool and data processing workhorse, remote from the experiences of ordinary people, to a diverse family of devices that billions rely on to play games, shop, stream music and movies, communicate, and count their steps. In A New History of Modern Computing, Thomas Haigh and Paul Ceruzzi trace these changes. A comprehensive reimagining of Ceruzzi's A History of Modern Computing, this new volume uses each chapter to recount one such transformation, describing how a particular community of users and producers remade the computer into something new.

Haigh and Ceruzzi ground their accounts of these computing revolutions in the longer and deeper history of computing technology. They begin with the story of the 1945 ENIAC computer, which introduced the vocabulary of "programs" and "programming," and proceed through email, pocket calculators, personal computers, the World Wide Web, videogames, smart phones, and our current world of computers everywhere--in phones, cars, appliances, watches, and more. Finally, they consider the Tesla Model S as an object that simultaneously embodies many strands of computing.
Language: English
Publisher: The MIT Press
Release date: Sep 14, 2021
ISBN: 9780262366472

    Book preview

    A New History of Modern Computing - Thomas Haigh

    HISTORY OF COMPUTING

    William Aspray and Thomas J. Misa, editors

    A complete list of the titles in this series appears in the back of this book.

    A NEW HISTORY OF MODERN COMPUTING

    THOMAS HAIGH AND PAUL E. CERUZZI

    The MIT Press

    Cambridge, Massachusetts

    London, England

    © 2021 Smithsonian Institution

    All rights reserved. No part of this book may be reproduced in any form by any electronic or mechanical means (including photocopying, recording, or information storage and retrieval) without permission in writing from the publisher.

    The MIT Press would like to thank the anonymous peer reviewers who provided comments on drafts of this book. The generous work of academic experts is essential for establishing the authority and quality of our publications. We acknowledge with gratitude the contributions of these otherwise uncredited readers.

    Library of Congress Cataloging-in-Publication Data

    Names: Haigh, Thomas, 1972– author. | Ceruzzi, Paul E., author.

    Title: A new history of modern computing / Thomas Haigh and Paul E. Ceruzzi.

    Description: Cambridge, Massachusetts : The MIT Press, [2021] | Series: History of computing | Includes bibliographical references and index.

    Identifiers: LCCN 2020048457 | ISBN 9780262542906 (paperback)

    Subjects: LCSH: Computer science—History. | Electronic digital computers—History.

    Classification: LCC QA76.17 .H34 2021 | DDC 004.09—dc23

    LC record available at https://lccn.loc.gov/2020048457


    To Prof. Dr. Erhard Schüttpelz and to the SIGCIS community, both vital sources of support for our work

    CONTENTS

    Acknowledgments

    BECOMING UNIVERSAL: INTRODUCING A NEW HISTORY OF COMPUTING

    1. INVENTING THE COMPUTER

    2. THE COMPUTER BECOMES A SCIENTIFIC SUPERTOOL

    3. THE COMPUTER BECOMES A DATA PROCESSING DEVICE

    4. THE COMPUTER BECOMES A REAL-TIME CONTROL SYSTEM

    5. THE COMPUTER BECOMES AN INTERACTIVE TOOL

    6. THE COMPUTER BECOMES A COMMUNICATIONS PLATFORM

    7. THE COMPUTER BECOMES A PERSONAL PLAYTHING

    8. THE COMPUTER BECOMES OFFICE EQUIPMENT

    9. THE COMPUTER BECOMES A GRAPHICAL TOOL

    10. THE PC BECOMES A MINICOMPUTER

    11. THE COMPUTER BECOMES A UNIVERSAL MEDIA DEVICE

    12. THE COMPUTER BECOMES A PUBLISHING PLATFORM

    13. THE COMPUTER BECOMES A NETWORK

    14. THE COMPUTER IS EVERYWHERE AND NOWHERE

    15. EPILOGUE: A TESLA IN THE VALLEY

    Bibliography

    Index

    List of Figures

    Figure 1.1

    ENIAC as installed at the University of Pennsylvania, in a US Army photograph used by the New York Times in its 1946 report. This image defined public ideas of what an electronic computer looked like. The machine was configured by setting switches and wiring connections between its many panels, which collectively established a room within a room in which its operators and associated punched card machinery worked. Corporal Irwin Goldstein, an Army maintenance technician, is in the foreground setting data on a portable function table later used to hold encoded program instructions. Technician Homer Spence and two operators, Frances Bilas and Betty Jean Jennings (later Jean Bartik), work in the background.

    Figure 1.2

    The logical architecture of EDVAC as described in the First Draft. Reproduced from Thomas Haigh, Mark Priestley, and Crispin Rope, ENIAC in Action (Cambridge, MA: MIT Press, 2016, p. 145).

    Figure 1.3

    The staff of the Eckert-Mauchly Computer Company gather in 1948 at their Philadelphia factory, in front of their first computer, the custom BINAC created for military aircraft developer Northrop. Front row (left to right): J. Presper Eckert (cofounder and ENIAC chief engineer), Frazier Welsh, James Wiener, Bradford Sheppard, and John Mauchly (cofounder and ENIAC project initiator). Back row: Albert Auerbach, Jean Bartik (former ENIAC operator and programming manager), Marvin Jacoby, John Sims, Louis Wilson, Robert Shaw (former ENIAC engineer), and Gerald Smoliar. Courtesy Unisys Corp.

    Figure 2.1

    Magnetic core memory. ©1953 IEEE. Reprinted, with permission, from Jan A. Rajchman, A Myriabit Magnetic-Core Matrix Memory, IRE Proceedings (October 1953): 1408.

    Figure 2.2

    Advertisement for ERA Drum memory, 1953. Small electronics firms used ERA drums to enter the computer business. Note the simplified diagram of a von Neumann architecture, which was novel at the time.

    Source: Electronics magazine, April 1953, p. 397. Courtesy Unisys.

    Figure 2.3

    Grace Hopper, with students Donald Cropper, K. C. Krishnan, and Norman Rothberg, at a Univac I console, c. 1960. Photo courtesy Division of Medicine and Science, National Museum of American History, Smithsonian Institution.

    Figure 2.4

    The transistorized IBM 7094 was the most versatile of IBM’s early mainframes. Note the four additional index register readouts, mounted on top of the standard IBM 7090 console.

    Source: IBM.

    Figure 2.5

    An IBM 7090 Direct Couple System in use at North American Aviation, c. 1962. Note the rows of tape drives—twenty-one visible in this photo. Magnetic tape was the primary mass storage medium for second-generation mainframe computers. Photo by Robert W. Kelley/The LIFE Picture Collection via Getty Images.

    Figure 2.6

    The Cray 1’s cooling units, several of which are exposed here, were usually covered with upholstery so that the supercomputer, which was for six years the world’s fastest, also offered a comfortable place to sit. Photo by Eric Long, National Air and Space Museum, Smithsonian Institution (NASM 2006–937).

    Figure 3.1

    Computer installations on college campuses often created images composed of text characters to show off the capabilities of the IBM Chain Printer. Such pictures were later shared online. This widely shared example of what was later called ASCII art was reportedly created by David Wright in 1978.¹⁶

    Figure 3.2

    An IBM RAMAC disk unit is loaded onto a Pan American Airways aircraft, for its installation at the Brussels World’s Fair, 1957. The drive had a capacity of five million characters and required two additional cabinets holding power supplies and controller electronics to function. Courtesy of International Business Machines Corporation, © International Business Machines Corporation.

    Figure 3.3

    An IBM System 360 installation, c. 1965. Well into the third generation, mainframe computers still relied heavily on magnetic tape for mass storage. Note the raised floor, punched card reader, the typewriter-like console printer, and next to it, an IBM 3270 video terminal and the distinctive slope of the main console. At the lower left is a suite of fixed disk drives; at the lower right, a set of chain printers housed in soundproofed cabinets. Courtesy of International Business Machines Corporation, © International Business Machines Corporation.

    Figure 3.4

    System/360 was designed for technical computing as well as business data processing. In both markets, IBM’s fast and rugged chain printers were a crucial selling point. This page, simulating a lunar trajectory, was printed at the MIT lab responsible for the Apollo guidance computer on October 23, 1968, a few months prior to the Apollo 8 mission. Smithsonian National Air and Space Museum (NASM 9A12593–45506-A).

    Figure 3.5

    Consultant W. Robert Widener promoted the concept of management decision rooms based around computerized screens with armrest controls in his 1968 article New Management Concepts.

    Figure 3.6

    By the mid-1960s, Univac promoted its computers as total management information systems with almost magical powers to centralize and simplify the control of large diversified corporations. Fortune, October 1965, pages 32–33. Courtesy Unisys.

    Figure 4.1

    This M9 gun was controlled by a director built at Bell Labs: an analog electronic computer.

    Source: Bell Labs, AT&T. © Alcatel-Lucent

    Figure 4.2

    A SAGE console and part of the processor. Note the light gun in the protective box next to the display scope. Courtesy of the Computer History Museum.

    Figure 4.3

    A Digital Equipment Corporation PDP-8. The original PDP-8 was the size of a refrigerator, earning it the minicomputer tag. Its electronics were spread over many small and easily removed circuit boards, visible here without the usual smoked glass cover. Division of Medicine and Science, National Museum of American History, Smithsonian Institution.

    Figure 4.4

    The PDP-8e, a miniaturized rack-mount version of the PDP-8, was released in 1970. As one of the smallest and cheapest computers of its day, it helped to establish the minicomputer as an embedded control device. It is the box with the switches on the center left, above two DEC Tape drives. This one was integrated with neurosurgery equipment at Massachusetts General Hospital, but the picture is posed as part of an exhibit at the former Computer Museum in Boston.

    Figure 4.5

    A page from Robert Noyce’s 1959 patent, which covered a crucial process for the creation of what later became known as integrated circuits.

    Figure 4.6

    Margaret Hamilton, working at her desk at MIT’s Charles Stark Draper Lab in a photograph taken for its 1973 Annual Report. Hamilton played a crucial role in producing reliable software for the Apollo missions, for which she won the Presidential Medal of Freedom in 2016. After Apollo she founded a business to extend similar methods to other software projects. Courtesy of Draper.

    Figure 5.1

    Dan Edwards and Peter Samson, two creators of Spacewar, play the game on MIT’s PDP-1. Courtesy of the Computer History Museum.

    Figure 5.2

    Ivan Sutherland demonstrates his Sketchpad system running on MIT’s TX-2. Shapes could be drawn using either a light pen applied to the screen or a digitizing tablet, as shown here. Courtesy MIT Museum.

    Figure 5.3

    Corbató poses with an IBM 7090 used to run the CTSS timesharing system. Courtesy MIT Museum.

    Figure 5.4

    Teletype ASR-33, the principal I/O device for many minicomputers and early personal computers. It allowed upper case only, with a few special characters. Many of its control commands were later adopted by early personal computer operating systems. The @ sign (Shift-P) was chosen by Ray Tomlinson, an engineer at Bolt Beranek and Newman, to specify the destination host in electronic mail addresses. Charles Babbage Institute, University of Minnesota.

    Figure 5.5

    Teletype terminals used to access the BASIC programming language at Dartmouth College, c. 1964. Professor Kemeny, the codeveloper of BASIC, is standing at left. Courtesy of Dartmouth College Library.

    Figure 5.6

    DEC engineers seated in front of a PDP-6, DEC’s first 36-bit mainframe scale computer. Gordon Bell, who led the development, is standing third from left, wearing a jacket. Its successor, the PDP-10, was a favorite basis for timesharing systems. Computer History Photographs, Archives Center, National Museum of American History, Smithsonian Institution.

    Figure 5.7

    Ken Thompson (seated) and Dennis Ritchie (standing), at a PDP-11 installation at Bell Labs, Murray Hill, NJ, working on the Unix operating system. Credit: AT&T.

    Figure 5.8

    Armando Stettner was a leading advocate for Unix within DEC who persuaded the company to distribute thousands of these replica license plates at the USENIX conference. He had the genuine plate on his own car. UNIX worked well with the New Hampshire state motto, Live free or die, a sentiment somewhat undermined by the insistence of AT&T lawyers on the trademark acknowledgment. Photograph: Paul Ceruzzi.

    Figure 6.1

    A PLATO terminal. Behind the user is a brace, subtly suggesting that the system would allow people with disabilities to be full participants in the Information Age. In the back of the room is a Radio Shack TRS-80 personal computer. Its presence inadvertently suggests why PLATO was unsuccessful: steady improvements to personal computers allowed PLATO-like applications at a much lower cost.

    Source: Charles Babbage Institute.

    Figure 6.2

    Commemorative plaque, 1401 Wilson Blvd. Arlington, Virginia. The ARPANET was initially conceived in the Pentagon, but the Information Processing Techniques Office (IPTO) moved to a nearby office building. Photo by Paul Ceruzzi.

    Figure 6.3

    Logical map of ARPANET, c. 1978. DEC PDP-11 computers dominated the network at the time, with PDP-10, CDC, and IBM systems also well represented. Note also three satellite links: to Norway, the UK, and Hawaii.

    Figure 6.4

    Geographical map of ARPANET nodes, c. 1980. This view of the network highlights its concentration in four areas (each highly magnified): Silicon Valley, the Los Angeles basin, Cambridge, Massachusetts, and northern Virginia.

    Figure 6.5

    Although little remembered today, by 1986 Usenet had built dense connections in North America and Western Europe, with tendrils extending to Australasia and Japan. Reproduced courtesy of Brian Reid.

    Figure 6.6

    In 1983, CompuServe was promoting its proprietary chat (CB simulator) and electronic mail (Email) systems to home computer users across North America.

    Figure 7.1

    Shuttle astronaut Sally Ride aboard the Challenger during mission STS-7 in 1983. Note the three Hewlett-Packard programmable calculators floating next to her. NASA purchased the calculators at a Houston department store and made only minimal modifications. Although the Shuttle carried a suite of IBM 4-pi computers for guidance, navigation, and control, the HP calculators saw heavy use by the crew.

    Source: NASA.

    Figure 7.2

    The Altair 8800’s exterior was modeled after minicomputers, but because it was based on a microprocessor the cost was much lower. Unless additional expansion cards were fitted, these lights were its only means of communication. Photo courtesy Division of Medicine and Science, National Museum of American History, Smithsonian Institution.

    Figure 7.3

    Paper tape containing Microsoft BASIC, for the Altair. The ease with which the tape could be copied and distributed without payment to Microsoft led to Bill Gates’ famous Open Letter to hobbyists. Photo courtesy Division of Medicine and Science, National Museum of American History, Smithsonian Institution.

    Figure 7.4

    Microsoft employees gather for a group picture in 1978, just before their move from Albuquerque to Seattle. Front row from left: Bill Gates (cofounder), Andrea Lewis (technical writer), Maria Wood (bookkeeper), Paul Allen (cofounder). Middle row: Bob O’Rear (mathematical programmer), Bob Greenberg (programmer), Marc McDonald (first salaried employee), Gordon Letwin (programmer). Back row: Steve Wood, Bob Wallace, Jim Lane (project manager). Used with permission from Microsoft.³⁹

    Figure 7.5

    Three pre-assembled microcomputers established the consumer computing market in 1977. Each had a plastic case with built-in keyboard, at least 4 KB of RAM, and BASIC burned into a ROM chip for instant access. From left to right: the Commodore Pet 2001; the Apple II (shown with the Disk II floppy drives introduced the next year); and the original TRS-80 Micro Computer System (later called the Model 1) connected to the optional expansion interface (the box under the monitor, introduced in 1978). The TRS-80 was initially the most successful product. Image by Timothy Colegrove.

    Figure 7.6

    The Woz, Steve Wozniak, holding an Apple II and two floppy disk drives, at the American Computer Museum, 2002. Photo by Paul Ceruzzi.

    Figure 7.7

    This Asteroids arcade cabinet sold three lives for a quarter. Early cabinets relied on bold, colorful graphics on the cabinet and around the edge of the screen to supplement their limited graphics. Asteroids offered a crisp, but monochrome, vector display inspired by Spacewar.

    Figure 7.8

    The Combat game (1977), bundled with many VCS consoles, used its hardware as designed: a symmetrical blocky playfield, two sprites controlled by players, and two missiles to fire at each other. Later games used the same ingredients to draw surprisingly complex screens.

    Figure 7.9

    Advertisements like this one, for the Texas Instruments home computer, attempted to showcase the new machine as a gathering place for the nuclear family. The application areas highlighted were programming, personal finance, education, and entertainment. Courtesy Texas Instruments, scan by Bryan Roppolo.

    Figure 7.10

    The Magic Machine was promoted as a fun and educational coloring book to introduce your home computer to the youngest members of the family. The image on the top left depicts the text Mum laughed and said the Magic Machine can start by cooking dinner. From Byte, January 1983, p. 444. Used courtesy of Informa.

    Figure 7.11

    The ZX81 sold very well in the United Kingdom, where its extraordinarily low price more than offset limitations such as a memory capacity of just one kilobyte. W.H. Smith, a nationwide British chain, was advertising not just the computer but also programs, magazines, and blank tapes to use with it. Computer & Video Games 1, November 1981, p. 38, used courtesy of W. H. Smith.

    Figure 7.12

    The programmers of home computers displayed great creativity. The ZX81 had a text-based display with special characters to represent block shapes and shading. 3D Monster Maze, programmed by Malcolm Evans in 1981, used these unpromising ingredients to animate a 3D maze, home to a rampaging Tyrannosaurus rex.

    Figure 7.13

    Eventually priced at less than $200, the Commodore 64 became the bestselling desktop computer model in history. Commodore stressed the savings its computers offered against better-built rivals with similar memory capacities. Notice the family members crowded around a television set.

    Figure 7.14

    Elite was one of the most complex games of the 8-bit era, beloved for the open universe of trading, exploration, and space combat it squeezed into 32 KB of memory. The original BBC Micro version, shown here, made use of a unique split-resolution feature, combining higher resolution monochrome graphics in the upper part of the screen, used for its smoothly animated 3D wireframe view of space, with colorful but clunky graphics for the radar scope and instruments at the bottom.

    Figure 8.1

    A Wang Word Processing System in use. Its screen could display a full eighty-column line of text. Photo: Charles Babbage Institute, University of Minnesota.

    Figure 8.2

    Portability is a relative concept. The Osborne 1, an early and affordable portable computer, weighed around 24 pounds and had a tiny five-inch screen. Image created by Wikimedia user Biby, reproduced under Creative Commons Attribution 3.0 Unported license.

    Figure 8.3

    VisiCalc (1979), the original spreadsheet program, running on an Apple IIe (1983) with monochrome monitor and dual Disk II drives. VisiCalc was ideally suited to the Apple II’s combination of a low-resolution, forty-column display with rapid scrolling. The / key triggered the rather cryptic main command menu, BCDEFGIMPRSTUW, seen here in the editing area at the top of the screen. Photograph: Thomas Haigh.

    Figure 8.4

    Launched in 1981, the IBM PC set the standard for an entire industry. Disk drives and expansion cards fitted inside its beige box. Photo courtesy Division of Medicine and Science, National Museum of American History, Smithsonian Institution.

    Figure 8.5

    To redraw its 3D graphics rapidly, Flight Simulator displaced MS-DOS and bypassed the PC BIOS to work directly with the innards of IBM’s color graphics hardware. This made it a powerful test of complete IBM compatibility.

    Figure 8.6

    Amstrad’s IBM PC clones used large production runs and a highly integrated design to reduce manufacturing costs. Note the slogan: Compatible with you know who. Priced as only we know how. The PC 1512 sold for little more than a home computer, packaged with a mouse, monitor, and the GEM graphical environment.

    Figure 8.7

    RadioShack’s TRS-80 Model 100 was the most successful portable computer of the early 1980s, despite its tiny screen. It was particularly popular with journalists. Courtesy of Computing History Photographs, Archives Center, National Museum of American History, Smithsonian Institution.

    Figure 8.8

    The GRiD Compass laptop pioneered the clamshell design. It used an orange electroluminescent display and bubble memory, with no moving parts. Photo by Eric Long, Smithsonian National Air and Space Museum (TMS A19890006000_PS01).

    Figure 8.9

    By the end of the 1980s, most PC companies purchased standard parts and screwed them together. This page, taken from a four-page Dell advertisement run on the inside cover of Byte’s November 1989 issue, consisted almost entirely of technical specifications and prices in small print. Potential customers would compare the components and pricing offered by Dell with those of its many rivals. By the December issue, the base price of Dell’s flagship System 325 had fallen another $800.

    Figure 9.1

    Xerox researchers pioneered the graphical user interface. This originated not as an operating system feature but as part of the Smalltalk programming environment, which included its own tools for drawing (bottom) and for editing event-driven code that defined the action taken when a user clicked on a graphical object. Image from Wikimedia user SUMIM.ST used under the Creative Commons Attribution-Share Alike 4.0 International license.

    Figure 9.2

    The Xerox Star was the first computer with a standardized graphical user interface based on a desktop metaphor (the printer and folder icons in the lower right). This image flaunted its unique ability to mix graphics and text in multiple fonts and languages, displaying the document on screen as it would look when printed. It took many years for Microsoft and Apple products to match these capabilities. Courtesy Xerox, image scanned by Digibarn.

    Figure 9.3

    Although its user interface won many fans, the original Apple Macintosh had few compelling applications. Once the almost essential second floppy disk was added, this approachable little computer had a list price of nearly $3,000. Photo courtesy Division of Medicine and Science, National Museum of American History, Smithsonian Institution.

    Figure 9.4

    Aldus PageMaker was the first compelling application for the Macintosh, creating a new market for desktop publishing. Coupled with a laser printer, it could publish crisp text and graphics at a fraction of the cost of traditional typesetting technology. The Macintosh standardized on menu titles at the top of the screen, which when clicked, dropped down to present control options.

    Figure 10.1

    Windows 3 could multitask DOS applications such as Lotus 1-2-3 and WordPerfect, but the bundled Solitaire game was a threat to office productivity. Its interface split the capabilities of the Macintosh desktop between the Program Manager (used to launch programs) and the File Manager (to manipulate files via directory structures). Minimized windows appeared as icons on the desktop. Unlike the Macintosh, Windows placed the pull-down menus for each application at the top of its window, not at the top of the screen. In 1990, the 1024×768 resolution shown here was usable only with an expensive monitor connected to a recent graphics card.

    Figure 10.2

    Word for Windows, based on a long-established Macintosh package, dominated the growing market for Windows-based word processing software created by the abrupt success of Windows 3.0. The broad resemblance of Windows to the Macintosh interface is clear here, including the rows of control icons, the pull-down menus, and the scroll bar.

    Figure 10.3

    Apple’s PowerBooks, offered from 1991, were the first laptops to provide a practical mobile experience using a graphical user interface. They captured a large part of the laptop market. Note the mechanical trackball used as a mobile alternative to the mouse. The PowerBook 180c shown here, from 1993, was a top-of-the-range model with an 8.4-inch color screen. Courtesy diskdepot.co.uk, used under license Creative Commons Attribution-Share Alike 3.0 Unported.

    Figure 11.1

    Left: Andrew Lippman (project leader) and John Boren (who designed the rig) balance on top of a truck to adjust the four-way camera assembly used by a precursor of MIT’s Media Lab in fall 1978 to capture images at ten-foot intervals for the Aspen Movie Map. Courtesy Andrew Lippman. Right: A similar camera assemblage on a Street View car, photographed on the Google campus in 2010 by Wikimedia user Kowloonese.

    Figure 11.2

    An image from the 1982 Disney movie Tron, which broke new ground by including more than fifteen minutes of purely computer-generated animation and, as here, by mixing filmed and generated elements in the same shots. Computer scientist Alan Kay advised Disney on the film, which required cutting-edge computer power and expertise. (Kay later married writer Bonnie MacBird, who edited her original script for Tron on an Alto at Xerox PARC.) The Solar Sailer visible through the window was animated by Information International, Inc., founded in 1962 by Ed Fredkin, an early adopter of DEC computers and later the director of MIT’s Laboratory for Computer Science. Information International had the most advanced capabilities of the four companies hired to produce graphics for the movie. It relied on the unique Foonly F1, the most powerful PDP-10 compatible computer, custom built in the mid-1970s by former members of the Stanford AI Lab. Photo: Moviestore Collection Ltd / Alamy Stock Photo.

    Figure 11.3

    The Roland TR-808 drum machine flaunted its digital sequencing ability in the label Rhythm Composer. Computer Controlled. (Image by Wikimedia user Brandon Daniel used under Creative Commons Attribution-Share Alike 2.0 Generic.)

    Figure 11.4

    A SIGSALY installation, c. 1943. The device was used to encrypt voice communications between the US and UK during World War II. The US National Security Agency claims that this was the beginning of the digital revolution in sound. Note the twin turntables, suggestive of the hip-hop decks of the 1980s.

    Source: National Security Agency.

    Figure 11.5

    The Fairlight Computer Musical Instrument, 1979, was a hybrid of an electronic organ and a computer workstation, with dual processors, a screen, two disk drives, and both computer and musical keyboards. Courtesy Blackmagic Design.

    Figure 11.6

    The Napster client running in 2001. Search results (top panel) came from Napster’s own servers, but the seven simultaneous downloads taking place in the bottom window were copying songs directly from the hard drives of other Napster users. The process often slowed down or failed, which may be why the user is downloading three copies of You Can Call Me Al. Image from Wikimedia user Njahnke, shared under a Creative Commons Attribution-Share Alike 4.0 International license.

    Figure 11.7

    Apple’s original iPod was controlled by four buttons, and a mechanical control wheel turned to move through menus and song lists. It held up to three thousand songs on a miniature hard disk drive with more than a thousand times the capacity of IBM’s original RAMAC unit. Image created by Wikimedia user Miguelon756–5303, used under the Creative Commons Attribution-Share Alike 4.0 International license.

    Figure 11.8

    NASA’s Ames Research Center in Silicon Valley was the most important venue for early work on virtual reality in the mid-1980s. Note the data gloves, stereo headset, and head-mounted motion sensors. NASA photograph.

    Figure 11.9

    Under its lid, the Xbox was built mostly with standard desktop PC components, including this Toshiba DVD-ROM and Western Digital hard disk drive. Its price was initially subsidized by Microsoft, making it a tempting target for hackers looking for a cheap but powerful computer to repurpose. Image by Wikimedia user Evan-Amos.

    Figure 12.1

    The website NoMoreAOLCDs.com. In 2001, as AOL continued to distribute vast numbers of discs holding software and trial offers, Californians Jim McKenna and John Lieberman began to collect unwanted CDs with the aim of eventually dumping a million of them in front of AOL’s head office. The tally had reached 410,176 when the campaign ended in 2007.

    Figure 12.2

    Tim Berners-Lee (right), with email pioneer Ray Tomlinson, at the American Computer Museum, April 2000. Photo by Paul Ceruzzi.

    Figure 12.3

    The first widely used graphical Web browser was xMosaic in early 1993, seen here visiting CERN’s website. It ran on Unix workstations using the X Window system. Features such as the forward and backward buttons, the URL bar, and the underlining of hyperlinks remain common in today’s browsers. The large globe spun when data was being received.

    Figure 12.4

    Yahoo’s homepage in October 1996, when the site was still primarily a hierarchical catalog of the Web. Clicking one of the top-level headings on the front page brought up a page of subheadings, and so on. Shown running in Windows 95 on version 2.0 of the Netscape Navigator browser. The broken key in the bottom left reflects the addition of encryption to support credit card transactions.

    Figure 12.5

    Richard Stallman, MIT hacker and originator of the GNU project. Taken from the cover of the O’Reilly book Free as in Freedom: Richard Stallman’s Crusade for Free Software under the Creative Commons Attribution-Share Alike 3.0 Unported license.

    Figure 13.1

    Google corkboard server, 1999, one of thirty servers built by Larry Page and Sergey Brin early in the company’s history, establishing a tradition of cheap homebrew hardware. Each row in the rack held four motherboards and eight hard drives resting on a piece of cork. Courtesy of Google, Inc. Image provided by National Museum of American History, Smithsonian Institution.

    Figure 13.2

    Top: Data center, Ashburn, Virginia. Cloud servers are located around the world. The data centers in Ashburn, just north of Dulles Airport in Loudoun County, Virginia, may be the epicenter of cloud storage, but their exteriors give little away. Photo by Paul Ceruzzi. Bottom: inside a T-Systems (Deutsche Telekom) data center in Biere, Germany, in 2014. A technician removes a standard-sized rack mount unit from a chassis into which servers, network switches, backup power supplies, and storage arrays all fit. These rack-mounted servers placed a PC motherboard, drives, and expansion cards into a compact, easily swappable case. Photo: Thomas Trutschel via Getty Images.

    Figure 13.3

    A home-built gaming PC. The limited-edition case (2019) was designed by iBuyPower to celebrate theFalloutseries of role-playing videogames. Its glass side shows off the components within, lit by two Corsair fans cycling through a range of colors. The compact Micro ATX scale motherboard, filling barely half the available area and decorated with racing-inspired red and black trim, is dominated by the lightly glowing Sapphire Nitro+RX Vega 64 video card (2017) which blocks three of its four expansion slots and weighs three and a half pounds. Modern PCs are built for heat dissipation: a liquid cooling system pumps heat from the quad core, 4 GHz Intel processor (a sixth generation Core i7 variety from 2015), hidden under the Nuka Cola cap, to a radiator at the rear of the machine. The graphics card’s three large fans (bottom) and integrated radiator draw away heat generated by thousands of graphics processing units. Other functions that would once have needed cards are built onto the motherboard, including Ethernet and sound. The case is close to the dimensions of the original IBM PC, but the space at the front where earlier PCs housed floppy, hard, or optical drives is filled with an ornamental bobble head. Instead a tiny Samsung Evo solid state drive screwed to the motherboard provides a terabyte of ultrafast storage. The power supply, much of the cabling, and a traditional hard drive are hidden behind partitions. Photo: Thomas Haigh.

    Figure 14.1

    These three stylus-controlled personal digital assistants sought to replace the Filofax ring-binder organizer (front; far left, in slimline version). The 1995 Apple Newton MessagePad 100 (right, with tutorial video) was the largest and most ambitious. The pocket-sized Palm series (center, budget-priced Palm IIIe, 1999) was smaller, cheaper, and far more commercially successful. The Dell Axim 50v from 2004 (left) represents the last major generation of PDAs. It was far more powerful, with a bright color screen, Wi-Fi, and a 624 MHz ARM family processor (versus 20 MHz for the Message Pad). Its Microsoft operating system attempted to transplant aspects of Windows, including the start menu and Excel spreadsheet program, to the tiny screen.

    Figure 14.2

    Three generations of Motorola Phones. The original 1983 DynaTAC brick phone (left) weighed 28 ounces. It was soon replaced by smaller alternatives, culminating in the 1996 StarTAC (top right) which weighed just three ounces. It was so compact that it had to be unfolded to reach both mouth and ear. The RAZR V3 (bottom right), a hugely popular second-generation phone launched in 2004, offered SMS messaging, mobile email, and even Web access, but the main draw was its slim metal case. DynaTAC courtesy Cooper Hewitt, Smithsonian Design Museum; StarTAC by Wikimedia user Nkp911m500, shared under Creative Commons Attribution-Share Alike 3.0 Unported license; RAZR by Thomas Haigh.

    Figure 14.3

    An iPhone 4 circuit board from 2010. The large A4 system on chip (SOC), also used in the first iPad, integrates an ARM-based microprocessor with a graphics processor and, sandwiched into a second layer above the processor, 512 MB of RAM. Other chips integrate up to 32 GB of flash memory, radio receivers and transmitters, a GPS receiver, an accelerometer, and a magnetic compass. The entire board is less than 10 cm long. Photo: Paul Ceruzzi.

    Figure 14.4

    Dockless scooters owned by the unicorn start-up Bird and local rival Skip obstruct a Washington, DC, sidewalk in 2018. A computer and cellular modem in the box of electronics mounted to the handlebar reported the position of the scooter to cloud servers and unlocked it when rented via a smartphone app. Photograph: Paul Ceruzzi.

    Figure 14.5

    2020 editions of the Apple iPad Pro 12.9-inch (left, with optional Apple Pencil and Logitech Slim Folio Pro detachable keyboard sleeve) and the Lenovo X1 Carbon (right), a premium business-oriented laptop. With keyboard, the iPad (3.0 pounds) was thicker and heavier than the X1 (2.6 pounds). In an example of convergent evolution, the former grew from a smartphone and the latter shrank from a desktop PC. They met in the middle with comparable pricing and hardware: large vivid touch screens, powerful processors (a 6-core Intel i7 versus an 8-core custom Apple ARM architecture chip) and flash memory storage. With five cameras and four microphones the iPad was ideal for video conferencing, and its graphics chip gave it an edge for video games. Although both could be connected to full-size peripherals and used to run office applications, Lenovo’s Windows 10 operating system retained a significant edge for most tasks. For example, the iPad could not display more than two applications at once. Photograph: Thomas Haigh.

    Figure 15.1

    Robert Tinney’s cover for Byte’s January 1977 issue positioned an Altair 8800 computer, symbol of the emerging personal computer industry, with floppy disks and paper tape in front of a depressing and polluted cityscape. Its video terminal seems to promise a computer utopia. Courtesy Robert Tinney.

    Figure 15.2

    Stanley, a self-driving Volkswagen, won the 2005 DARPA Grand Challenge and its $2 million prize for a Stanford University team. GPS receivers, roof-mounted LIDAR units, and video cameras helped it navigate a 132-mile desert track without human intervention. Photo by Mark Avino, Smithsonian National Air and Space Museum (NASM 2012–01952).

    List of Tables

    Table 1.1

    Univac Installations, 1951–1954

    ACKNOWLEDGMENTS

    We are particularly grateful to David Hemmendinger, Gerardo Con Diaz, Marc Weber, Michael J. Halvorson, Alan Staiti, Eugene Miya, and the MIT Press reviewers, who all read through the entire manuscript offering innumerable useful suggestions. Paul McJones, Clem Cole, David Brock, Troy Astarte, Tom Lean, Bradley Fidler, Tom Van Vleck, Henry Lowood, Jerome Coonen, Forrest Park, Gordon Bell, and Donald B. Wagner provided important input on specific topics and sections. Series editor Tom Misa had an eagle’s eye for typos. Much of the material in this book is adapted from Ceruzzi’s earlier A History of Modern Computing, and so we also remain in debt to those previously acknowledged there. In particular we appreciate William Aspray’s work as longtime series editor, which benefited this book and many of the other books we drew on to create it.

    The book’s creation was generously supported by Siegen University’s Media of Cooperation CRC. This included three periods spent working together in Siegen, Germany, to review the potential for a new overview history, outline the new structure of the book, review existing material in the second edition, and assemble two draft chapters as a test of our new approach. Additional support from Siegen allowed Haigh to focus on writing. We happily acknowledge Erhard Schüttpelz, Sebastian Giessmann, and Tristan Thielman for arranging this support and for their collegial contributions to our project. We talked through the new structure with participants at three Early Digital workshops and benefited from the suggestions of Laine Nooney, David Brock, Stephanie Dick, William Aspray, Martin Campbell-Kelly, Doron Swade, Matthew Kirschenbaum, Len Shustek, Rebecca Slayton, and many others. Parts of the book were later discussed in a Siegen workshop on the history of database management systems with the participation of Moritz Feichtinger and Francis Hunger.

    For help with image permissions and scans we are grateful to Debbie Douglas, Janice Hussain, Angela Schad, Katherine Taylor, Brian Daly, Erik Rau, Stephanie Hueter, Andrea Wescott, Ingrid Crete, Lindsi Wyner, Bryan Roppolo, Amanda Wick, and various contributors to Wikimedia Commons.

    Haigh happily acknowledges the support of his family members Maria, Peter, and Paul during the five years of sometimes intense work it took to produce this book. Their love and sacrifice made the book possible. Both authors appreciate the efforts of the MIT Press editorial and production teams. Acquisitions editor Katie Helkie advocated early for the need to update Ceruzzi’s book, while Laura Keeler gave prompt answers to all our queries on manuscript preparation. We are particularly grateful for the work of Helen Wheeler and Virginia A. Schaefer at Westchester Publication Services. Stefan Swanson worked heroically on proofreading and reference checking, catching dozens of errors.

    Fragments of chapters 6 and 12 were adapted from Haigh’s contribution to The Internet and American Business (edited by William Aspray and Paul Ceruzzi) and Ceruzzi’s contribution to Social Media Archaeology and Poetics (edited by Judy Malloy), both published by MIT Press. Portions of chapter 3 are modeled after Haigh’s “The Chromium-Plated Tabulator: Institutionalizing an Electronic Revolution, 1954–1958,” IEEE Annals of the History of Computing 23, no. 4 (October–December 2001): 75–104, and “How Data Got Its Base: Information Storage Software in the 1950s and 60s,” IEEE Annals of the History of Computing 31, no. 4 (October–December 2009): 6–25.

    Funded by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation)—Project-ID 262513311-SFB 1187 Media of Cooperation.

    BECOMING UNIVERSAL: INTRODUCING A NEW HISTORY OF COMPUTING

    This book is a comprehensive reimagining of A History of Modern Computing, first published in 1998 and expanded with a new chapter in 2003. A lot has changed since 1998, when the Web was a novelty, iPhones didn’t exist, and the founders of Google and Facebook were in graduate school and high school, respectively. Doing justice to those changes required more than just adding a few more chapters at the end of the book. For example, as the first edition was being conceived and written, the Internet was still quite an obscure system. Today we view the development of computer communications as a central thread in the history of computing, not just in the 1990s but also in the 1960s and 1970s. The wholesale shift of video and music reproduction to digital technologies likewise challenges us to integrate media history into the long history of computing. Since the original book was written, the computer has become something new, which meant that the book also had to become something new.

    The unmistakable importance of the Internet, digital media devices, and video games to modern life has driven public interest in their stories. Yet this discussion is rarely grounded in the longer and deeper history of computer technology. For example, as we finalized our revisions to this book, one of us chanced upon How the Internet Happened: From Netscape to the iPhone by Brian McCullough, a tech industry insider.¹ It is readable, admirably tight, and solidly researched—based on two hundred interview podcasts McCullough recorded with company founders. We recommend it to you. Yet we were also struck by how little engagement such approaches to history have with the larger story of computing. As his title suggests, McCullough starts the story of the Internet in 1994 with the first commercial Web browser, giving only occasional flashbacks to the first twenty-five years of the Internet (and its precursor, the ARPANET). He says little about where the core technologies, protocols, or algorithms of the Web came from, or about the evolving technologies of personal computing, such as new processors and operating systems, that made the rapid spread of Web browsers possible. He says nothing about Web server technology, or the programming languages and practices that evolved alongside Web browsers. Similar observations can be, and have been, made about popular histories of video games and personal computing. Our aim here is to integrate Internet and Web history into the core narrative of the history of computing, along with the history of iPods, video game consoles, home computers, digital cameras, and smartphone apps.

    To write the history of a technology only as a series of models, inventors, and refinements is to miss the point. Thomas J. Misa once suggested that the great challenge facing historians of computing is to explain How Computing Has Changed the World.² Doing that might not seem so hard. The computer has a relatively short history, which for our purposes begins in the 1940s. Set against technologies such as agriculture, Arabic numerals, or alphabets, its span looks like the blink of an eye. Despite its ever-growing importance, its influence on our lives has so far been less fundamental than that of industrial age technologies such as electric light or power, automobiles, or antibiotics.

    Important technologies have complex histories. The automobile, for example, was made possible by the development of earlier technologies such as coaches and bicycles. Its mass adoption with the Ford Model T took place decades after its invention, and was made possible by the development of big business and the invention of mass production as a way to build complex machines cheaply in huge quantities. The Model T’s users discovered new uses for it, building new body work or adapting it as a portable power source for agricultural machinery. The automobile facilitated, but did not dictate, a mass exodus from America’s cities into sprawling suburbs and exurbs. Because most Americans came to rely on cars to shop, get to work, and socialize, their national culture grew up around the technology. The resulting need for massive quantities of oil reshaped American foreign policy and transformed the fortunes of nations from Norway to Nigeria, usually for the worse.³

    Doing justice to that story would challenge even the most ambitious historian, but the automotive historian has a crucial advantage over the historian of computing: over the century from 1920 to 2020, the typical car had a roughly stable physical form: a large self-propelled metal box able to move between two and eight people over asphalt at a maximum speed that has roughly doubled, from forty miles an hour to a (legally mandated) seventy or eighty. Cars are still built on assembly lines by large, capital-intensive companies. Ford, General Motors, and Chrysler were the big three US automakers of the 1920s and retain that status today. Cars are still distributed by franchised dealers. A basic but functional car costs a skilled worker a few months of salary.

    The story of computing offers us no comparable continuities. Few, if any, other technologies have changed their scale, dominant applications, and users so often and so fundamentally. The computer started out as esoteric and specialized as the cyclotron and has finished up only slightly less ubiquitous than clothing or food. In the 1940s, computers were used by a few hundred people worldwide to carry out complex numerical calculations. They were built as one-of-a-kind pieces of custom lab equipment, each costing the equivalent of several million present-day dollars.

    Computer scientists have adopted a term from Alan Turing, the universal machine, to describe the remarkable flexibility of programmable computers. To prove a mathematical point he described a class of imaginary machines (now called Turing machines) that processed symbols on an unbounded tape according to rules held in a table. By encoding the rules themselves on the tape, Turing’s universal machine was able to compute any number computable by a more specialized machine of the same ilk. Computer scientists came to find this useful as a model of the ability of all programmable computers to carry out arbitrary sequences of operations, and hence (if unlimited time and storage were available) to mimic each other by using code to replicate missing hardware.
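    To make the tape-and-rule-table idea concrete, the short Python sketch below (ours, purely illustrative, not from the book) simulates such a machine. The function name run_turing_machine, the blank symbol "_", and the example rule table, which simply appends one mark to a unary number, are all assumptions chosen for clarity.

# Minimal Turing machine simulator: a tape of symbols plus a table of rules,
# each rule keyed by (state, symbol) and giving (symbol to write, head move, next state).
def run_turing_machine(rules, tape, state="start", halt_state="halt", max_steps=10_000):
    head = 0
    for _ in range(max_steps):
        if state == halt_state:
            break
        symbol = tape.get(head, "_")             # unwritten cells read as blank
        write, move, state = rules[(state, symbol)]
        tape[head] = write                        # write the new symbol
        head += 1 if move == "R" else -1          # move the head one cell
    return tape

# Illustrative rule table: scan right over a unary number and append one '1'.
rules = {
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("1", "R", "halt"),
}
tape = {0: "1", 1: "1", 2: "1"}                   # the unary number 3
result = run_turing_machine(rules, tape)
print("".join(result[i] for i in sorted(result)))  # prints 1111, i.e. unary 4

    A universal machine, in Turing’s sense, is simply one whose rule table interprets a description of some other machine’s rules stored on the tape itself.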

    In practice, however, the first modern computers faced severe practical limits on their capabilities. As those restraints were gradually lifted, the scope of what could feasibly or economically be computerized grew dramatically as the computer evolved toward what economists call a general purpose technology with highly varied applications. Today about half the world’s inhabitants use hand-held computers daily to facilitate almost every imaginable human task. They carry out their work millions of times faster than those early models, fit easily in a pocket, and are cheap enough to be thrown away when a cracked piece of glass needs repair.

    Computers will never do everything, be used by everyone, or replace every other technology, but they are more nearly universal than any other technology. In that broader sense the computer began as a highly specialized technology and has moved toward universality and ubiquity. We think of this as a progression toward practical universality, in contrast to the theoretical universality often claimed for computers as embodiments of Turing machines.

    To the extent that it has become a universal machine, the computer might also be called a universal solvent, achieving something of that old dream of alchemy by making an astounding variety of other technologies vanish into itself. Maps, filing cabinets, video tape players, typewriters, paper memos, and slide rules are rarely used now, as their functions have been replaced by software running on personal computers, smartphones, and networks. We conceptualize this convergence of tasks on a single platform as a dissolving of those technologies and, in many cases, their business models by a device that comes ever closer to the status of universal technological solvent.

    In many cases the computer has dissolved the insides of other technologies while leaving their outward forms intact. Although computer technology is universal, most actual computers are configured and deployed to carry out extremely specialized tasks. They hide inside cars and consumer appliances to replace the guts of many of the technologies of everyday life, such as telephones, photocopiers, televisions, pianos, and even light bulbs. These computers outnumber humans many times over and cost as little as three cents apiece in bulk. They still have processors and memory and run software, but only computer scientists habitually think of them as computers.

    This shape shifting makes the construction of a satisfactory overall history of computing exceptionally difficult. How to tell a story when the scale of the stage and the cast of characters changes so fundamentally? The easiest way to write a book like this would be to devote one or two chapters to each decade. But we want to tell a story with a plot, not just arrange a succession of facts and anecdotes in roughly chronological order. Our answer was to focus on constructing each individual chapter to tell the story of a transformation, in which particular communities of users and producers remade the computer into something new. Each chapter tells a coherent story with a stable cast of characters, even though the companies, applications, and communities relevant to that chapter may not appear in others.

    For example, the first transformations began in the 1950s, as computers were remade for use as scientific supercomputers capable of feats of number crunching, data processing devices able to automate the work of hundreds of clerks, and real-time control systems used to coordinate air defense. We tell those three parallel stories in three parallel chapters, each reaching into the 1970s. Chapters continue to overlap in time, although as you move through the book you will draw gradually closer to the present. In later chapters the computer becomes a communications medium, a graphical tool, a personal plaything, and so on. The full list is longer, as you can read in the table of contents. We hope that you find the new structure clear and coherent. After sketching out this version, we began by rearranging passages of the existing text within it, filling in the gaps with new material. If you are familiar with Ceruzzi’s original book you will be able to find compressed versions of almost all the topics it covered somewhere in our new text.

    You will find many examples here of how computerization changed specific parts of the world, but not every part of the world has been changed in the same way. Misa’s question, How did the computer change the world, which was posed to an entire field, admits no single answer. We have instead tried to give a reasonably comprehensive answer to a more tractable question: How did the world change the computer? Read together, the smaller stories told in each chapter add up to a larger one. The protagonist of this story is the computer itself. To talk about the computer might sound a little ridiculous, in a world where some computers are thrown away inside hotel key cards and others cost millions of dollars. Yet at the core of each machine is a package of programming techniques and architectural features reflecting a shared descent. Architectural advances pioneered by Cray supercomputers now help your phone to play Netflix video more effectively. The original A History of Modern Computing engaged more deeply than any other overview history of computing with the evolution of computer architecture. Preserving and deepening that focus on the origins and diffusion of new architectural features contributes to knitting the new book together.

    Technologies are shaped by societies or, more specifically, by institutions such as governments and corporations, by inventors responding to incentives, and by users who apply and reshape technologies in ways unimagined by their original creators. Another distinctive feature of the original A History of Modern Computing was its interest in the stories of computer users, with deeply researched case studies of NASA, the Internal Revenue Service, and other influential organizational users of computers. We have retained these and have also woven shorter examinations of the experiences of computer users into each chapter. This complements our focus on architecture, because new architectural features and software technologies were originally created to serve the specific needs of specific users. This structure builds on the insights of Michael S. Mahoney in his classic paper Histories of Computing(s). Mahoney argued that the histories and continuing experience of the various communities show that they wanted and expected different things from the computer. They encountered different problems and levels of difficulty in fitting their practice to it. As a result, they created different computers or (if we may make the singular plural) computings.⁶ Whenever the computer became a new thing it did not stop being everything it had been before. Computers are still used by nuclear weapons labs and banks. These stories intertwined, as new capabilities move from one domain to another.

    Our story starts in the 1940s with programmable electronic computers and not, like more traditional overview histories, with mechanical calculators or Charles Babbage. To tell the story of a new technology one would ideally begin by documenting the practices it was applied to and the earlier technologies used and then explore its origin, its spread, and the new practices and institutions that coevolved with it. Decades ago, when the scope of computing was smaller, it made sense to see electronic computing as a continuation of the tradition of scientific computation. The first major history of computing, The Computer from Pascal to von Neumann by computing pioneer Herman Goldstine, concluded in the 1940s with the invention of the modern computer. In A History of Computing Technology, published in 1985, Michael Williams started with the invention of numbers and reached electronic computers about two thirds of the way through. By the 1990s the importance of computer applications to business administration was being documented by historians, so it was natural for Martin Campbell-Kelly and William Aspray, when writing Computer: A History of the Information Machine, to replace discussion of slide rules and astrolabes with mechanical office machines, filing cabinets, and administrative processes.

    Giving up that coverage of earlier technologies carries a cost. To understand what changed in the world because of the adoption of a technology, we need to know the before as well as the after. Yet the computer’s influence will confront future historians of every kind, whether they are writing about presidential politics or pop music. The breadth of technologies displaced by the computer and practices remade around it makes it seem arbitrary to begin with chapters that tell the stories of index cards but not of televisions; of slide rules but not of pinball machines; or of typewriters but not of the postal system. But to include those stories, each of our chapters would need to become a long book of its own, written by different experts.

    A History of Modern Computing was the most widely cited scholarly overview history of computing. For many of the people who picked up a copy in a library, or were assigned it for a class, it gave a first introduction to the topic. We hope our new book is a starting point, and not an end point, for your engagement with this rich history. To help guide you, we have systematically added citations to, and quotations from, some of the many outstanding works of scholarly history on different aspects of the history of computing. Our challenge here is to condense stories big enough to fill entire books into a page or a paragraph. Most histories focus on a specific aspect, occasionally something as broad as the software industry, but more often a single company or computer platform. There are more histories of Google, Microsoft, or Apple than there are of the computer itself. We do not include further reading lists for each chapter, but when we mention a book in the text you can safely assume that it is an outstanding and highly relevant source of further reading. These books have inspired and informed us, and we would like to share that gift with you.

    Broad as this book is, we must warn you that it is a history of computing technology and practice, not of computer science. Computer science is an academic discipline. It began to come together intellectually in the late 1950s and was institutionalized during the 1960s and 1970s via university departments, corporate research labs, funding agencies, conferences, and journals. When specific work done by computer scientists has a major influence on practice, we discuss its contribution, but we do not try to squeeze into this book the stories of research areas, influential departments, intellectual schools, or the development of subdisciplines such as architecture, theory, graphics, databases, networking, and artificial intelligence. Historians of science have paid remarkably little attention to computer science (and, alas, computer scientists to history), so unfortunately there are no major histories of computer science or of any of its subdisciplines for us to point you toward.

    Another question we won’t be answering is “What was the first computer?” Arguments about firsts once drove lengthy lawsuits and patent proceedings. They continue to dominate much general discussion of early electronic computing, particularly in Internet forums. Any answer depends on one’s definition of “computer.” In the 1940s the question would not even have made sense, because “computer” usually meant a person employed to carry out complex calculations. The new machines being built at the time were called “automatic computers” or “computing machines.” Even those weren’t the first calculating machines, which is why we call this a history of modern computing.

    But we do have to start this book somewhere. We start it in 1945 with the first operation of a machine called ENIAC at the University of Pennsylvania. A truce reached in the 1980s, as professionally trained historians began to engage with this topic, established strings of adjectives to qualify the firstness of the various novel machines constructed during the 1940s. ENIAC is usually called something like “the first electronic, general purpose, programmable computer.”

    Those qualifying adjectives separate it from two earlier groups of machines. “Electronic” distinguishes it from electromechanical computers whose logic units worked thousands of times more slowly. Often called relay calculators, these computers carried out computations one instruction at a time under the control of paper tapes. They were player pianos that produced numbers rather than music. Among the best known were the Harvard Mark I, produced by International Business Machines to meet the specification of Harvard’s Howard Aiken, and the Z3 designed by German computing pioneer Konrad Zuse. “General purpose” and “programmable” separated ENIAC from special purpose electronic machines whose sequence of operations was built into hardware and so could not be reprogrammed to carry out fundamentally different tasks. The ABC, or Atanasoff-Berry Computer, built at Iowa State, used a fixed program to solve systems of linear equations.¹⁰ The British wartime Colossus machines applied logical tests to inputs from encrypted messages and electronically simulated code wheels. Their basic sequence of operations was likewise fixed.¹¹

    The ENIAC project introduced the vocabulary of “programs” and “programming” and the automation of higher-level control functions such as branching and looping. It was publicized around the world, stimulating interest in electronic computation. Its two main designers founded the first electronic computer company. And even before ENIAC was finished, design work on a planned successor, EDVAC, had defined the key architectural features of the modern computer.

    NOTES

    1. Brian McCullough, How the Internet Happened: From Netscape to the iPhone (New York: Liveright, 2018).

    2. Thomas J. Misa, Understanding ‘How Computing Has Changed the World’, IEEE Annals of the History of Computing 29, no. 4 (October–December 2007): 52–63.

    3. The historical literature on the interaction of users with the Model T is much richer and deeper than that for the personal computer. For example, Kathleen Franz, Tinkering: Consumers Reinvent the Early Automobile (Philadelphia: University of Pennsylvania Press, 2005), and Ronald Kline and Trevor Pinch, Users as Agents of Technological Change: The Social Construction of the Automobile in the Rural United States, Technology and Culture 37, no. 4 (October 1996): 763–795. It plays a central role in studies of industrial production, such as David Hounshell, From the American System to Mass Production, 1800–1932: The Development of Manufacturing Technology in the United States (Baltimore: Johns Hopkins University Press, 1984). There have been many broad studies of the role of the automobile in American life, such as James J. Flink, The Automobile Age (Cambridge, MA: MIT Press, 1988), and Clay McShane, Down the Asphalt Path (New York: Columbia University Press, 1994).

    4. Liesbeth De Mol, Turing Machines, Stanford Encyclopedia of Philosophy, September 24, 2018, https://plato.stanford.edu/entries/turing-machine/.

    5. Part way through the production of this book, one of us outlined the new structure and its motivation in more detail in Thomas Haigh, Finding a Story for the History of Computing (Siegen, Germany: Media of Cooperation Working Paper Series, Siegen University, 2018).

    6. Michael S. Mahoney, Histories of Computing, ed. Thomas Haigh (Cambridge, MA: Harvard University Press, 2011), 64.

    7. Herman H. Goldstine, The Computer from Pascal to von Neumann (Princeton, NJ: Princeton University Press, 1972); Michael R. Williams, A History of Computing Technology (Englewood Cliffs, NJ: Prentice Hall, 1985); and Martin Campbell-Kelly and William Aspray, Computer: A History of the Information Machine (New York: Basic Books, 1996).

    8. Thomas Haigh, The Tears of Donald Knuth, Communications of the ACM 58, no. 1 (January 2015): 40–44. Although it is not a conventional history, Matti Tedre, The Science of Computing: Shaping a Discipline (New York: CRC Press, 2015), gives a good sense of the development of computer science.

    9. The role of these adjectives inserted between first and computer is discussed in Michael R. Williams, A Preview of Things to Come: Some Remarks on the First Generation of Computers, in The First Computers: History and Architectures, ed. Raúl Rojas and Ulf Hashagen (Cambridge, MA: MIT Press, 2000), 1–16.

    10. Alice R. Burks and Arthur W. Burks, The First Electronic Computer: The Atanasoff Story (Ann Arbor, MI: University of Michigan Press, 1989).

    11. Thomas Haigh and Mark Priestley, Colossus and Programmability, IEEE Annals of the History of Computing.

    1 INVENTING THE COMPUTER

    On February 15, 1946, subscribers to the New York Times might have been startled to discover a front-page story titled “Electronic Computer Flashes Answers, May Speed Engineering.” It opened with news that one of the war’s top secrets, an amazing machine, was heralded … as a tool with which to begin to rebuild scientific affairs on new foundations.¹ That evening, the new machine, ENIAC (electronic numerical integrator and computer), shown in figure 1.1, was ceremonially switched on and dedicated by Major General Gladeon M. Barnes. He represented the US Army’s Ordnance Department, the owner of the new machine and sponsor of the project.

    Figure 1.1

    ENIAC as installed at the University of Pennsylvania, in a US Army photograph used by the New York Times in its 1946 report. This image defined public ideas of what an electronic computer looked like. The machine was configured by setting switches and wiring connections between its many panels, which collectively established a room within a room in which its operators and associated punched card machinery worked. Corporal Irwin Goldstein, an Army maintenance technician, is in the foreground setting data on a portable function table later used to hold encoded program instructions. Technician Homer Spence and two operators, Frances Bilas and Betty Jean Jennings (later Jean Bartik), work in the background.

    The ceremony was taking place at the University of Pennsylvania, where ENIAC had been designed and built starting in 1943. It would not be reassembled at the Ballistic Research Laboratory in nearby Maryland until 1947, where it enjoyed a
