Electron and The Bits
John Guttag and Paul Penfield, Jr., "One Hundred Years of Transformation," The Centennial Celebration of the MIT Department of Electrical Engineering and Computer Science, Cambridge, MA; May 23, 2003.
invented the commutator, an essential part of every DC motor. (William Sturgeon in England had independently made the same invention a few months earlier.) Henry held Davenport's motor in disdain. His scientific arguments would not be considered convincing today, but his practical concern was that the motor could not compete with steam engines because the batteries then available were so cumbersome and expensive. He was right--this motor was ahead of its time. Although Davenport got a U.S. patent (no. 132) in 1837, the first issued for any electrical machine, he could not interest anyone in using motors. He went to his grave in 1851 a defeated man.

Electricity needed what would be known today as a "killer app"--an application so compelling that an underlying technology would be acquired just to run it. Henry concluded, correctly, that the killer app for electricity was the telegraph. As early as 1816 an electric telegraph line had been built in Europe, but it took advances in technology--the hand key and Morse code--by the American painter and inventor Samuel F. B. Morse, a native of Charlestown, Massachusetts, to make it practical. Henry gave Morse strong encouragement and sound scientific advice, so much so that they tangled later about who had really invented what. Morse sent the famous message, "WHAT HATH GOD WROUGHT," from Washington, D.C., to Baltimore, Maryland, in 1844. The telegraph spread like wildfire. Electricity caught the fancy of the public. The mood was not unlike the Internet euphoria 150 years later. Finally there was a bona fide electrical industry.

Universities, including MIT, took note. MIT admitted its first students in 1865 and provided instruction in physics right from the start. Two MIT physicists recognized the importance of electricity. Edward C. Pickering, a member of the National Academy of Sciences, was extraordinarily energetic and eclectic, judging from the range of his projects. At MIT from 1867 to 1877, he beefed up laboratory instruction and, perhaps most importantly, persuaded one of his students, Charles R. Cross, to join the physics faculty and continue the electrical work. It was Cross who in 1874 invited Alexander Graham Bell, then teaching at Boston University, to use MIT's acoustics and electrical laboratory. Bell did so, since the facilities were superior to what he had at BU, and demonstrated a working telephone in 1876. Three years later Thomas Edison invented the electric light bulb, and in 1882 he commissioned the first electric power plant. Had Davenport still been alive, he would have seen that his 1833 invention of the DC motor made this power plant possible: the generator was simply a motor running backwards.

By 1882 five major electrical devices and systems were of growing national importance: telegraph, telephone, rotating machines, illumination, and the power grid. These were the products of scientists, industrialists, and inventors. It was now time to define electrical engineering, and the way to do that was to design an educational program. Charles Cross understood. He was the right person, in the right place, at the right time. He started, at MIT, the nation's first electrical engineering degree program.
it was redesignated Course VI. Between 1882 and 1902 the electrical engineering program was run by Cross, out of the physics department. Then, as now, students could sniff out fields with a bright future. By 1892, 27% of all MIT undergraduates were in electrical engineering. Early graduates included Charles A. Stone '88 and Edwin S. Webster '88, who founded Stone and Webster, the firm that built MIT's new Cambridge campus in 1916; another was Alfred P. Sloan '95, who became president of General Motors and a major MIT benefactor.

Electrical engineering programs were also springing up at other universities. Of those that grew out of physics or mathematics departments, some were very scientific or theoretical. Others, designed to train people for the rapidly growing electrical industry, taught contemporary techniques but not the underlying science. MIT, whose programs had always maintained a balance between practice and theory, avoided both extremes. Cross himself fit the pattern, working in a science department but with a definite industrial and engineering bent.

In 1900 Cross began to press for a new Department of Electrical Engineering, but when it was established in 1902 he did not join. He assisted with the teaching, but stayed in the physics department, where he served as department head for another 15 years. To lead the new department, MIT looked outside and recruited Louis Duncan. This selection did not work out too well; Duncan's real interests were with industry, and he left before long. But the next department head from outside, Dugald C. Jackson, was a spectacular success.
development in the school of life." His was a no-nonsense approach intended to equip the students for 40 years, the full duration of an engineering career.

Two major educational milestones mark Jackson's years as department head. First, the VI-A cooperative program (now called the Internship Program) was launched in 1917. Jackson wanted students to get work experiences with educational value. VI-A was not a "summer job" program or a "work-study" program designed to pay for tuition, but a bona fide educational program. His philosophy has distinguished VI-A from other programs around the nation ever since. Over the decades, about 18% of the department's undergraduates have participated in the VI-A program.

Second, in the early 1930s, Jackson started a major curriculum revision, motivated in part by recent developments in radio and electronics. He ordered the writing of a new series of textbooks, which would come to be known as the "Blue Books" because of the color of their covers. The series was not actually finished for over a decade (World War II intervened), and two planned volumes were abandoned because of advances in technology. Jackson decreed that the books were team efforts, and individual authors were not to be identified.

Jackson's vision can be interpreted today as having been based on three assumptions. First, the underlying science base as developed by scientists was in a form engineers could use. Second, science would not change much during a graduate's 40-year career. Third, society itself would not change much during that same time span. His vision guided the department until, in the late 1940s, it became apparent that two of these assumptions were no longer valid. It was the other great department head, Gordon Brown, who would recognize the problem and provide the remedy.
In Brown's view, the science should be taught in the first years, followed by contemporary technology based on the science. Specialization and theses would come in the senior year. The best students would be encouraged to enter an expanded doctoral program, which would produce engineers able to extend engineering science. He served on a department curriculum committee and rallied support for these views.

When he became department head in 1952, he immediately instituted a curriculum review to identify the underlying sciences in all areas and relate them to engineering techniques. Six undergraduate textbooks, called the "Green Books" after the color of their covers, were written during the late 1950s. Brown kept colleagues at other universities informed, giving them free access to MIT's most recent thoughts. Doctoral graduates from this era took teaching positions here and elsewhere and spread the word. This policy of freely sharing curricular material continues today with the MIT OpenCourseWare program announced in 2002. In 1959 Brown became MIT's Dean of Engineering and started to promote similar ideas for other engineering disciplines. In short, Brown's update to Jackson's vision was recognizing that science changes rapidly, and that if engineers participate in the process, the changes will be in a usable form.

The first major test of this engineering-science approach was provided by semiconductor circuits. The transistor was invented in 1947; circuit applications began in the 1950s; the integrated circuit came along in 1960. Universities had to include transistors and integrated circuits in their undergraduate programs. But how? Should devices be taught in terms of terminal characteristics or the internal physics? What if fields of science not previously thought relevant were needed? Could nonlinear circuits be covered? How much solid-state physics would be required?

MIT led the way in answering these questions. In the fall of 1960 Richard B. Adler and Campbell L. Searle organized the Semiconductor Electronics Education Committee (SEEC). By 1966, 31 people from nine universities and six companies had produced seven coordinated textbooks and related curricular material, aimed at third-year and fourth-year electrical engineering students. The books featured more solid-state physics than had ever before been used in teaching electronics. In the books, semiconductor-device models were derived from the solid-state physics, and they in turn were used in transistor circuits.

SEEC was a triumph of engineering science, with a substantial, lasting impact. The basic ideas influenced many textbooks written in subsequent years. The approaches are still used in EE education throughout the world, even though the SEEC books themselves can no longer claim contemporary relevance, because they were never updated to cover integrated circuits, MOS devices, or much on digital circuits. Engineering science, as a paradigm for engineering education, survived its first major test. The next test, posed by the rising importance of computer science, would prove more challenging.
V. Computer Science
The first electronic digital computers were made during World War II. Their importance was recognized in academic circles in the 1950s, and their use became common during the 1960s. Two research groups devoted to computer science were established at MIT--the Artificial Intelligence Group in 1959 and Project MAC in 1963.
But to teach computer science at the undergraduate level according to the engineering-science paradigm, an appropriate science base needed to be identified. Here a difficulty was encountered. Unlike the case with electrical engineering topics, there were no natural laws or previously developed science to guide practical techniques in programming, architecture, or artificial intelligence. Mathematical theories of lambda calculus, Turing machines, algorithmic complexity, and Boolean algebra were not enough.

The approach taken was to develop the courses anyway, and not worry about the science base. Some of the early courses were highly theoretical. Some were very practical, tightly coupled to contemporary hardware or mainstream computer languages. These courses were popular, and before long there were enough of them, with enough intellectual coherence, to comprise a degree program in computer science and engineering. In 1969 this program was announced, and the first such degrees were awarded in 1975.

But what about the engineering-science paradigm? Was it still relevant? It had given electrical engineering two principal benefits: first, a foundation usable by graduates for their entire careers (40 years), and second, the opportunity for the engineering community to maintain its own intellectual underpinnings. In retrospect it can be seen that those who developed the computer science curriculum obtained these same benefits by other means. They identified fundamental, generic concepts that were commonly encountered in contemporary technologies, and taught them, adding examples from current practice. This body of generic knowledge was meant to last 40 years, and was in a form that met the needs of computer scientists. Some day this body of knowledge may be recognized as a science in its own right, in which case the engineering-science paradigm will have survived in the long run.

The rise of computer science raised fundamental questions about its relation to electrical engineering. Were the two basically inseparable, or were they different? How would they evolve? If the two fields were expected to diverge, then computer science should have its own separate department. If not, then establishing a separate department would be a costly mistake. Either way, the wrong departmental structure would seriously jeopardize MIT's position of technical leadership. The discussion of this issue in the early 1970s was surely the most important debate the department has ever had. The future of electrical engineering would be very different without a strong connection to computer science, and vice versa.

There were several arguments in favor of a split. The department was already big--in some minds too big. Because it was technically broad, no single department head could provide leadership in all technical areas (in fact, starting in 1971 there had been associate department heads, one from electrical engineering and one from computer science). The research and teaching styles of the two fields were different, because computer science was new, rapidly evolving, and more empirical. Some computer scientists felt their field should be free to develop in its own way, unencumbered by established engineering traditions (those making this argument tended to view computer science as a branch of mathematics). Finally, a split would be easy because the computer science people were already housed in a separate building.

But there were powerful arguments in favor of staying as one department. If computer science were not tied to an engineering discipline, it could not benefit from proven approaches to engineering (those making this argument viewed computer science as a type of engineering). Some argued against the extra cost of separate administrations. Others feared that without the excitement of computer science, electrical engineering would stagnate, becoming less interesting to both faculty and students.

Perhaps the strongest argument in favor of a single department was based on the belief that computer hardware and software would eventually be indistinguishable from electrical systems. Of course computers were made of electronic components, but that was not the point. The point was that other electrical systems would, before long, include components that either resembled computers or were computers. EE graduates would not be able to design such systems if they did not understand computer science. Electrical engineering and computer science would not diverge, but would remain close together, and in effect act like a single discipline. This argument was repeatedly validated in later years, by advances in digital circuits and digital signal processing in the 1970s, VLSI in the 1980s, networking in the 1990s, and now embedded computing.

In 1974 the decision was to remain as one department. Soon after, in an informal poll conducted by Joel Moses, the department faculty voted to change the department's name to Electrical Engineering and Computer Science (EECS), in recognition of the permanence and importance of computer science.

With this decision made, the next task was to bring into harmony the two separate curricula, one in electrical engineering and one in computer science. A committee was formed to examine whether the two degree programs should have a common set of beginning courses, and concluded that they should. All EECS graduates needed to know about computer programming, electric and electronic circuits, signal processing, and computer architecture. These four courses already existed, and were adapted for this new role. They still serve as the common core of all departmental undergraduate programs.

The retention of computer science in the department's degree programs, however desirable, led to new problems. There was simply too much material in the curricula. And many students wanted to learn both EE and CS in more depth than either program permitted. These issues precipitated another curriculum revision during the 1990s.
opportunity. Although bachelor's degrees would still be available, the M.Eng. degree would be considered the department's flagship program, open only to department undergraduates. The S.M. program would be retained for students from outside. The result, then, was this model for the three degree levels:

S.B. degree:
- Classroom-oriented, structured program
- Appropriate for entry-level engineering positions
- Appropriate for graduate work elsewhere
- Appropriate for other career goals

M.Eng. degree:
- Classroom-oriented, structured program with thesis
- Appropriate for a career in engineering

Doctoral degree:
- Research-oriented program
- Appropriate for academic and research careers

Curricula based on this model were approved in 1993, and the first M.Eng. degree was awarded in 1994. The M.Eng. program is consistent with Dugald Jackson's 1903 vision, with added research-oriented activities. It is consistent with the engineering-science model of Gordon Brown, with greater technical breadth than he envisioned. It is consistent with the students' demonstrated demand for education beyond the bachelor's degree. And it satisfies the desire of many students for more breadth than either the EE or CS program alone provides. The new degree has been popular with students; typically over half of the EECS undergraduates continue for the fifth year.

This five-year model has not yet been widely adopted elsewhere. Other MIT engineering departments define their M.Eng. degrees as professional one-year programs open to graduates of other universities. Some let the better undergraduates pursue a five-year combined program, but only on an individually arranged basis. Only a few other universities have been able to put similar programs in place. In 1998 the M.Eng. program was evaluated; the principal disadvantage identified was that first-year graduate courses had to be modified to accommodate the large number of M.Eng. students, who were on average less scholarly than the doctoral students.

The curriculum committee also reorganized the two bachelor's degree programs, "VI-1" for EE and "VI-3" for CS, and gave them the same structure, making it easy for students to design programs that combined EE and CS in novel ways. A new undergraduate program was added with greater breadth across EE and CS, at the expense of some specialization. This "VI-2" program proved very popular. Apparently students want to keep their options open and prepare for a world in which the boundary between things electrical and things computational is at best fuzzy, and perhaps even nonexistent.

The MIT EECS M.Eng. and VI-2 programs are successful, but they are certainly not the final word in the evolution of the department's programs. What will motivate the next curricular changes? Perhaps more biological material will be needed. Perhaps cognitive science will merge with artificial intelligence. Perhaps quantum mechanics will become critical. Perhaps students will need a better appreciation of the social, economic, and political context of engineering. Perhaps our programs can be made more accessible and attractive, particularly to women and underrepresented minorities.
Better Ideas
Some advances in technology have the property that they are unarguably superior to earlier technology, and as a result they completely replace it. A new memory chip may be smaller, consume less power, operate faster, and be more reliable than a chip from the previous generation. The older technology is rendered obsolete. Here are a few examples of such technology trends:

Smaller. The trend in devices, from large to small to mini to micro to nano to quantum, will continue as far as the fundamental laws of physics will allow.

Faster. The demand to speed things up shows no sign of slowing down.

Stronger. The public deserves systems that are increasingly robust and secure.

Cheaper. Modern digital systems have the unusual property that every year they cost less, thereby defying inflation.

It is not easy to teach technology knowing that it will be replaced soon. How can we be sure that our graduates are prepared for the long run? Our third challenge is to continue to focus on the fundamentals that will remain valid and relevant during a graduate's 40-year career.
Competing Ideas
Some trends in technology and its applications do not make older approaches obsolete but merely make them less dominant. Here are a few examples from the past century:

Energy to Information. Our technologies are used for processing energy and information. Lately the emphasis has been on information, but that may or may not continue.

Analog to Digital. The noise immunity and universality of digital approaches let them compete in areas once characterized by analog techniques only. However, there will always be some requirements that can only be satisfied by analog techniques.

Inorganic to Organic. The incorporation of living systems, or components inspired by the study of living systems, is just starting, so it is too early to judge how their special properties may effectively compete with today's technologies.

These trends may or may not be reversible. The older ideas are not obsolete, or at least not yet. Our graduates should be able to evaluate competing ideas in particular circumstances. The fourth challenge to the department is to teach competing approaches and application areas without letting the new ideas crowd out older ideas that are still of substantial importance.
More Responsibility
Gordon Brown's educational vision is 50 years old, and Dugald Jackson's twice that. These visions have been expressed here in terms of what engineers should be able to do--apply known techniques, develop new techniques from known science, and develop new engineering science. It falls on us, as heirs to these visions, to see if they are still sufficient, or if more should be expected of engineers today. My conclusion is that at least some of our graduates should be prepared to undertake a higher level of social responsibility.

A bit of history beyond our department's own will help explain this fifth challenge. In 1893 the University of Wisconsin was small, with only 61 professors. One of them was Dugald Jackson, who had just established Wisconsin's department of electrical engineering. Another was the historian Frederick Jackson Turner, who that year revolutionized the study of American history. In a talk delivered at the Columbian Exposition in Chicago, he said that the existence of America's western frontier was "the fundamental, dominating fact" that shaped the character of the American people and the nature of its institutions. This "Turner thesis" soon became the most important paradigm in the study of American history. (Jackson also attended that Exposition, and while there he and others founded what is today the American Society for Engineering Education.)

Jackson and Turner had much in common. They were about the same age. Each had worked in Chicago before coming to Wisconsin, had clear vision, could express himself well, and would in time become a leader in his own field. At one point the two men served together on a faculty committee to "consider the condition of athletics in the University"--evidently football rowdiness had led Turner to fear that human values were "put in wrong perspective and the fundamental purpose of the University lost sight of." Three years after Jackson came to MIT, Turner moved to Harvard.

Turner, the historian, understood in 1893 that the western frontier was rapidly vanishing, though its influence would remain. But presumably he did not know what the next dominating influence on America's development would be. It turned out to be a different "frontier," one that would be familiar to his colleague Dugald Jackson.

Fifty years after Turner introduced his thesis, the Second World War was under way. Vannevar Bush, who had left the MIT electrical engineering department, was serving in Washington, D.C. In 1945 he wrote a seminal proposal for a system of federal support of scientific and engineering research, and called it "Science, the Endless Frontier." Bush had a right to use this title because his own field, electrical engineering, was on that frontier. A young, vibrant, immature discipline, it exploited scientific advances rapidly. The intellectual excitement of electrical engineering was a direct consequence of its proximity to the scientific frontier.

Besides being exciting, electrical engineering, and later computer science, have been essential to America's development. Their impact has been enormous. Consider the list of the ten "Greatest Engineering Achievements of the 20th Century," as judged by the National Academy of Engineering (NAE) in 2000. Half are based on EECS-related technologies--electrification, electronics, radio and television, computers, and telephone. (The other five--automobile, airplane, water supply, agricultural mechanization, and refrigeration--are more closely connected to other engineering disciplines.) Although I am not a historian, it seems to me that the exploitation of this scientific frontier, especially by electrical engineers and computer scientists, has shaped America as much in the 20th century as the western frontier did in earlier times. The successor to the Turner thesis, then, may be a similar thesis but one involving a different kind of frontier: the frontier of science.

Bush called the scientific frontier "endless." But is it, really? And will electrical engineering and computer science keep their privileged position on this frontier? It does seem so. Many engineering achievements involving EECS technologies, including the Internet, the laser, the World Wide Web, solar cells, embedded computation, signal processing, artificial intelligence, control systems, and MEMS, were not on the NAE top ten, but seem poised to shape the 21st century. Or consider Moore's law, the famous observation by Gordon Moore in 1965 that the number of devices on an integrated circuit doubles every year or two. This trend has continued to this day, and there is no end in sight short of the limitations imposed by quantum mechanics (and even those may represent opportunities rather than obstacles). Whenever people try to predict when Moore's law will expire, they forget about the inventiveness of modern engineers and their ability to get around all but the most fundamental limits of nature.

So the next century will, in my opinion, bring more and more exciting scientific advances to be exploited by our fields of engineering, and these technologies will exert a continuing influence on America and the rest of the world. In other words, in the 21st century, as in the 20th, we will continue to live and work on an important frontier.
Life on the frontier is exciting. Research thrives where there is ambiguity, where much is unknown; overturning a major principle or law is considered a success, an accomplishment worthy of distinction. The disruptive, somewhat chaotic, character of frontier life is one we engineers relish.
But most institutions in a civilized society need stability and predictability. Consider what happened to America's western frontier. Civilization arrived and brought with it law and order. For better or for worse, the frontier became a more predictable and less exciting place. Is it our turn now? Our scientific and engineering frontier is of critical importance to America. Must our frontier become "civilized"? History suggests that it must. In fact, it is already happening.

We are already confronting, and will continue to confront in the years ahead, tensions between the ambiguity inherent in the scientific frontier and the predictability required by society. Every day newspapers report examples of legal, political, and economic institutions grappling with new technologies that they only vaguely understand, and often perceive as a threat. Think of the frictions between technological standardization and product differentiation. Or between intellectual property and information freedom. Think about why regulated monopolies resist new technology. Or why e-mail spam is such a problem.

The issue is not whether "law and order" will be established, but how. Will the crude tools available to America's legal, political, and economic systems be used to impose stability in a way that reduces the excitement that nourishes technological development? Will scientific studies of some types be restricted or even forbidden? Will long-established institutions resist the opportunities for improvement afforded by engineering advances? Or can society be persuaded to accept new technologies? Can the engineering community lead the movement for responsive and responsible change?

Dugald Jackson said in 1903 that engineers, besides knowing science, "must know men and the affairs of men...must be acquainted with business methods and the affairs of the business world." In 1911 he expanded on this point, saying that "it is the duty of engineers to do their share in moulding their various economic creatures [companies and even sectors] so that the creatures may reach the greatest practicable usefulness to society." But both Jackson and Brown stopped short of saying that engineers should help the nation's institutions change to accommodate new technology.

Today, the need is different. Both science and society are changing rapidly, partly because of advances in technology. Because the institutions of modern society need to adapt to modern technology, they need help from those who fully comprehend that technology. In other words, society will be best served if we engineers take an active role.

The fifth challenge to this department, then, is to educate students so that at least some of them are prepared to help the world understand and embrace rapid changes in technology, and utilize them wisely. In my judgment, this is our most important challenge of all. If we meet it, society will be better off, and we will have earned the right to continue to work on the scientific frontier with all the excitement that we so cherish.
Acknowledgements

This essay was written for a book celebrating the centennial of the MIT Department of Electrical Engineering and Computer Science. The author takes pleasure in acknowledging the helpful advice from the committee set up to produce this book: Ellen Williams, staff and chair; Fernando J. Corbató, Robert M. Fano, Paul E. Gray, John V. Guttag, J. Francis Reintjes, and the late Hermann A. Haus, members. Some information about Dugald Jackson's time at the University of Wisconsin was provided with the help of Bahaa Saleh, Christopher L. DeMarco, and Donald W. Novotny. Several present and former MIT colleagues contributed to the author's understanding of the events and trends covered here. Many helpful suggestions about the writing were made by Barbara B. Penfield.

A spoken version of this essay was presented at the centennial symposium, May 23, 2003. The visual images for that presentation were gathered and organized by Ellen Williams and Abigail Mieko Vargus. Finally, the readability of the essay is in large part due to Ellen Williams, who served as a technical editor and insisted on clarity and consistency. The author is grateful for her efforts, without which many of the points would not be nearly as well thought out.