
CHAPTER 10
ENGINEERS AND TECHNOLOGICAL PROGRESS

Ethical responsibility now involves more than leading a decent, honest, truthful
life, as important as such lives certainly remain. . . . Our moral obligations
must now include a willingness to engage others in the difficult work of defining
the crucial choices that confront technological society and how to confront
them intelligently.
Langdon Winner
William A. Wulf, president of the National Academy of Engineering, calls for greater
attention to broader social issues in the study of engineering ethics.1 In addition to
studying the micro issues concerning decisions made by individuals and corporations, we
must also consider macro issues about technology, society, and groups within society,
including engineering professional societies and the engineering profession in its
entirety. In support of his view, Wulf cites philosopher John Ladd, who much earlier
argued that an overly narrow focus on codes of ethics neglects broader issues about
"technology, its development and expansion, and the distribution of the costs (e.g.,
disposition of toxic wastes) as well as the benefits of technology. What is the significance
of professionalism from the moral point of view for democracy, social equality, liberty
and justice?"2
In tune with the suggestions of Wulf and Ladd, we have linked micro and
macro issues throughout this book, especially in developing the model of
engineering as social experimentation and in discussing environmental and
global issues. This chapter links our themes more explicitly to broader
studies of technology in the interdisciplinary field called STS—an
acronym for Science, Technology, and Society and for Science and
Technology Studies—and also in the branch of philosophy called Philosophy
of Technology.3 It also underscores the importance of leadership by engineers
in addressing broader issues about technological progress and about other
areas of engineering.
10.1 CAUTIOUS OPTIMISM
We begin by discussing general attitudes toward technology. We then shift to more focused, although still general, points in thinking about technological progress. These issues concern the prospects for human freedom and wisdom. Does technology control
society? Is technology value-neutral or value-laden? Given the uncertainty surrounding
technological development, are there grounds for hope in looking to the future?

10.1.1 Optimism, Pessimism, Realism


Both values and facts are involved in assessing when technological change constitutes
technological progress. Progress means advancement toward valuable goals, hopefully
using permissible means. Typically, debates between optimists and pessimists turn on
more than disagreements about the facts and estimates of risks. They involve differing
judgments about moral values, especially values of social justice, human fulfillment,
and respect for the environment.
Often we think of technological progress narrowly, as enabling the improved
performance of specific tasks. If the task is creating warmth in dwellings, we see a straight-
line progression from wood fires in caves, to wood-burning fireplaces, to coal-burning
furnaces, to gas furnaces. A fully honest reckoning, however, will take into account the
sum total of benefits and losses provided by a technology, including impacts on the
environment and social structures. Given the enormous complexity and variety of
technology, can we make overall assessments of technology in its entirety?
In fact, most of us do develop global attitudes about the major aspects of our lives,
for example, love, money, health, nature—and technology. Scholars typically group
these attitudes into three categories: optimism, pessimism, and a third category sometimes
called realism (being realistic about power) or contextualism (paying close attention to
variations within specific contexts) that emphasizes the moral ambiguities of technology.
Thus, Ian Barbour establishes such a threefold distinction: technology as liberator,
technology as threat, and technology as a morally ambiguous instrument of power.4 As
authors, we are "cautious optimists" whose views straddle Barbour's first and third
categories—optimism combined with realism. We briefly outline each of his three
categories, citing representative thinkers.
General optimism about technology as liberator emerged with and helped
fuel the emergence of modern science and industry. Early spokespersons for
this emergence were understandably enthusiastic and even utopian in their
vision of the steady development of techno-science: science as an unlimited
wellspring for new technology, and technology in turn as advancing science.
Around 1600, Francis Bacon proclaimed that "knowledge is power," and
around 1800 Auguste Comte envisioned a technocracy (as it is now called)
in which technologists govern society for the good of all. Following the
technology-involved horrors in the twentieth century, especially world wars,
a more nuanced optimism emerged that celebrates technology while calling
for greater wisdom in its application.
Wulf voices strong optimism of this sort. In support of his view, he cites the list of
top 20 engineering achievements of the twentieth century identified by the National
Academy of Engineering.5 That list, cited in chapter 1, bears repeating: electrification,
automobiles, airplanes, water supply and distribution, electronics, radio and television,
agricultural mechanization, computers, telephones, air-conditioning and refrigeration,
highways, spacecraft, the Internet, imaging technologies in medicine and elsewhere,
household appliances, health technologies, petrochemical technologies, laser and fiber
optics, nuclear technologies, and high-performance materials. Wulf cautions that
technologies have become so complex, and interactive in their complexity, that some negative
impacts are literally unforeseeable. Nevertheless, it is precisely this new awareness of
complexity that provides hope that humanity—and engineering—will act responsibly.
Similar optimism is expressed by Emmanuel G. Mesthene, former director of the
Harvard Program on Technology and Society. Mesthene acknowledges that in solving
some problems, technology generates new challenges. For example, automobiles solved
transportation problems and provided greatly expanded mobility, but they also led to smog,
congested freeways, and the death of tens of thousands of people each year in the
United States alone. Nevertheless, Mesthene sees strong grounds for overall optimism
about technology's power to overcome poverty and create wealth, fight starvation and
disease, raise the quality of life by steadily increasing opportunities, and above all to open
up new possibilities. Hope is especially justified because humanity has become self-
conscious about the unintended side effects of technology, and hence the need for
wisdom in dealing with them.
Technology, in short, has come of age, not merely as technical capability, but as
a social phenomenon. We have the power to create new possibilities, and the
will to do so. By creating new possibilities, we give ourselves more choices. With
more choices, we have more opportunities. With more opportunities, we can have
more freedom, and with more freedom we can be more human.6
As one more spokesperson for technological optimism, we cite Alvin M.
Weinberg, a pioneer in the development of atomic energy and the person who
coined the expression "quick technological fix," or "quick fix." Traditional
approaches to social problems centered on "social engineering," that is,
inducing change in the motivation and habits of individuals, as well as the
use of powerful social institutions with the authority to control humanity.
Technological progress offers a more effective and less coercive remedy to
intractable social problems. A social engineering approach to deaths on the
highways might be to educate, to preach, to pass laws, and then to punish
violators. A quick fix would be to design safer highways and cars. Again, a
social engineer would try to solve water shortages in western states by trying
to persuade individuals to conserve water and fining those who do not. In
contrast, the engineer would develop ways to generate more water at a cheaper
price, perhaps by designing nuclear desalination plants. Shunning utopian
visions, Weinberg nevertheless celebrates the use of technology: "Technology
will never replace social engineering. But technology has provided and
will continue to provide to the social engineer broader options, to make
intractable social problems less intractable; perhaps, most of all, technology will
buy time—that precious commodity that converts violent social revolution
into acceptable social evolution."7
In contrast, technological pessimists see a predominance of bad over good in major
technological trends. Pessimists emphasize that technologies can disrupt communities,
cause massive layoffs, alienate workers who are reduced to menial tasks, and create a
sense of lost control as large organizations come to dominate our lives. Although few
engineers embrace such pessimism, it is an attitude they must understand and confront in
others.
Much pessimism about technology flows from how it threatens cherished values.
Sometimes the values are moral, religious, and aesthetic values that are pushed aside
amid the distractions of technology-driven consumerism. Ralph Waldo Emerson, for
example, complained that technology tends to narrow the human personality: "Look up
the inventors. Each has his own knack; his genius is in veins and spots. But the great,
equal, symmetrical brain, fed from a great heart, you shall not find."8 Such a view seems
implausible today. Not only do many engineers achieve breadth of understanding, but
narrowness about technology is equally commonplace among moralists, religious
thinkers, literary people, and even some scientists.9
Much pessimism is based on uncovering general patterns of technological thinking
and dominant technological trends that subvert traditional values such as freedom and
community. Ominous visions of technology emerged in the post–World War II era. Lewis
Mumford, for example, began his career optimistic about technological progress, but he
came to depict a world of impersonal, technology-driven bureaucracies to which the
individual had to conform. Technology moves in the direction of concentrating power
in ways that erode democratic freedoms:
the dominant minority will create a uniform, all-enveloping, super-planetary
structure, designed for automatic operation. Instead of functioning actively as an
autonomous personality, man will become a passive, purposeless, machine-
conditioned animal whose proper functions, as technicians now interpret man's
role, will either be fed into the machine or strictly limited and controlled for the
benefit of de-personalized, collective organizations.10
The French thinker Jacques Ellul went even further in characterizing technology as
"autonomous," literally beyond the control of human beings. Conceiving of technology
as "technique"—that is, the modes of thinking and types of organizational structures
driving the development of machines—Ellul wrote that "technique has become the new
and specific milieu in which man is required to exist. . . It is artificial, autonomous,
self-determining, and independent of all human intervention."

Albert Borgmann, an influential philosopher whose response to contemporary


technology veers in a pessimistic direction, uses the example of the traditional fireplace
to illustrate how easily we overlook the changes brought about by technologies. The hearth
was part of what centered and unified a family: "It was a focus, a hearth, a place that
gathered the work and leisure of a family and gave the house a center. Its coldness marked
the morning, and the spreading of its warmth the beginning of the day."12 Valuable
technology, in Borgmann's view, promotes and sustains the values of family and
community, and it also engages individuals' skills and caring—as in cutting firewood or
starting the morning fire. Much contemporary technology does not do that. Instead, it
consists of "devices": artifacts that serve a specific purpose, but whose inner workings
we have no grasp of and which we view as mass-produced, disposable items. In addition,
much of it erodes valuable relationships and activities. Television, for example, seems to
liberate us by opening us to a wider world, but it also reduces occasions for family
activities and absorbs time for reading. Even the computer has drawbacks as a focal
activity because it engages us more in a virtual than a genuine reality. Citing Tracy
Kidder's The Soul of a New Machine, Borgmann acknowledges that concentrated
engineering work can function as a focal activity, but he regrets that it is so time-consuming
as to threaten relationships with family and the larger world.13
Examples like these lead some critics to reject Borgmann's value perspective as
nostalgic and regressive. More important, he has been criticized for insufficiently
appreciating the enormous diversity of acceptable uses of technology, including television
and computers, and also for failing to fully appreciate how democratic procedures will
inevitably result in some technologies that are not ideal from the perspective of any
given person.14 These criticisms of Borgmann pinpoint a tension that arises in reflecting
on technology at the macro level of this chapter. For those living in a democratic and
capitalistic society, there will always be some debate about exactly what the values
defining democracy and capitalism are, at least when they are specified concretely. For
example, is allowing large campaign contributions a legitimate expression of democratic
freedom of speech, or in fact a subversion by big money of equal access to political
representatives? Nevertheless, there is now wide agreement about many of the core
values defining democratic rights and economic opportunities. Critiques of technology
must take these core values into account.
What should we make of this clash between optimism and pessimism about technology
in general? Insofar as the debate is empirical, we can evaluate specific claims for their
truth and falsity. Thus, general pessimism about information technologies and the Internet
is being proven unfounded. Certainly there are moral challenges, as discussed in chapter 9.
Yet the overall trend has clearly been one of human liberation—not only in productivity
and creative extensions of human powers, but in the liberalizing and democratic
tendencies of these technologies. In contrast, it is still too early to determine the overall
impact of biotechnologies (including genetic engineering), but at some time in the future
things will become clearer.15
Yet, more than facts are at stake in the attitudes we adopt toward technology (and
toward life). Many people waver between technological optimism and
pessimism according to the technologies of greatest concern or interest to them at a
given time and whether what we value is threatened or promoted by them. Such general
attitudes play practical roles in influencing our responses in choosing careers (most
engineers are optimistic), leaving careers (some engineers become disillusioned),
voting as citizens, and risking our money as investors. We believe,
however, that it is better to explore the more specific issues that underlie the disputes
between optimists and pessimists, guided by Barbour's third attitude toward
technology—with an overall optimistic accent.
That third attitude is that technology is "an ambiguous instrument of power whose
consequences depend on its social context."16 Power can be used for good or evil, and
for greater or lesser good. All technology involves trade-offs, and the trade-offs can be
made wisely or selfishly. Again, technologies can be used for good or bad purposes: a
knife can cut bread or kill an innocent person. Often values become embedded in
technological products and approaches in ways that create unfair power imbalances. And
major technologies carry a momentum that cannot be fully controlled by individuals,
although they remain under the control of larger groups. These interwoven themes,
discussed in what follows, can be affirmed while retaining a strong sense of optimism
and hope that is so essential in engineering. The optimism and hope, however, are
selectively targeted toward those specific technologies that are reasonably foreseen to
produce genuine benefits to humanity, and even then they are tempered with an
awareness of the risks involved. This is the import of our first theme of cautious optimism
about moral agency and decency, about engineering professionalism as having a moral
core, and about responsible technology as integral to human progress.

10.1.2 Technology: Value-Neutral or Value-Laden?


One issue underlying the clash of optimism and pessimism about technology concerns
how values are related to technology: Is technology value-neutral or value-laden? Many
people think of technology as value-neutral, and hence their optimism or pessimism is
actually about humanity's capacity for wisdom in guiding technology. Thus, optimists
about human capacities for wisdom envision the steady advance of technology as
generating new instruments that can be used to solve problems and make steady
progress. Pessimists emphasize that in advancing human powers, technology tends to
multiply the scale of stupidity in making choices—witness the dangers of nuclear war, of
environmental destruction, of a crassly materialistic society preoccupied with pleasure.
However, most scholars believe that things are more complex, and as authors we
agree. Technology, properly understood, is not altogether value-neutral. It already
embeds values, and optimism or pessimism is better focused on what those values are.
Clearly, this debate turns on how we define technology in the first place. According
to the value-neutral view, technology consists of artifacts or devices—machines, tools,
structures—perhaps together with knowledge about how to make and maintain devices.
As such, it is neither good nor bad, but merely a means that can be used for good or bad
purposes. A screwdriver can be put to many uses, including building homes or killing
persons, but by itself it has no intrinsic value or even tendency toward desirable or
undesirable ends. This view of technology is often dubbed instrumentalism:
technology consists of devices and knowledge that are mere instruments, with no single
connection to any particular values or ends.17
In opposition to the instrumentalist view, those who view technology
as value-laden insist that it consists in more than artifacts and knowledge.
It also consists of the organizations and general approaches that make
technological development possible, and organizations and approaches are
guided by values. Hence, in the context in which they are developed and
used, artifacts and knowledge embody the dominant values of those who
make and use them. Thus, an artificial heart emerges from the value to
extend and improve the quality of human lives, and we could not understand
what it is as a technological object without grasping those values.
Mary Tiles and Hans Oberdiek state the point clearly:
values become embodied in technologies. Just as artists naturally
express their artistic values in their art, so do the makers of
technologies. If, for instance, price is more important than safety in the
minds of manufacturers, their products will undoubtedly embody that
trade-off.18
Tiles and Oberdiek quickly add that the values embedded in a device or
process are fluid rather than fixed.19 As an example, they note that
lighthouses were designed to protect ships against dangerous shoals; one
could not understand what lighthouses are, as a technology, without
grasping their function and that value. Today, however, when electronic
devices have replaced the original function of lighthouses, their primary value
is historic and aesthetic, as picturesque reminders of another era. These
symbolic values are important, as land developers, nonprofit preservationist
organizations, and large segments of the general public will attest.
Devices and technological knowledge might seem value-neutral
because we abstract them from the social contexts where they are designed
and function. Such abstraction misses the essential connection between
devices, knowledge, and the value-laden purposes of organizations that
create technology and groups that use it. As Barbour summarizes, "historical
analysis suggests that most technologies are already molded by particular
interests and institutional goals. Technologies are social constructions, and
they are seldom neutral because particular purposes are already built into
their design," typically by corporations and intended consumers.2°
The narrow instrumentalist definition of technology as value-neutral arti-
facts misleads by leaving out essential aspects of technological change.
Worse, it provides a ready-made basis for engineers and other participants in
technological development to deny responsibility.21 "I am merely making
things; responsibility for them lies entirely with the user," thinks the engineer
seeking to abandon moral responsibility. As we have also emphasized,
however, engineers share responsibility with many others, which brings us
to the next point.

10.1.3 Shared Responsibility and Control of Technology


Technological determinism is the thesis "that technology somehow causes all
other aspects of society and culture, and hence that changes in technology
dictate changes in society."22 In a strong version, the thesis of technological
determinism dilutes human choice: We are victims of technology rather than in
control of it. Technological determinism undermines shared responsibility for
technological products. For responsibility presupposes freedom, and engineers
are not responsible for changes that are entirely beyond human control.
Is technological determinism true? It has some intuitive appeal, for each of
us has at times felt pushed or pulled by technology. On the one hand, can we
genuinely choose not to use a telephone, ride in cars, or rely on a computer? To
be sure, we are usually happy to have such technologies available, and hence their
attraction (pull) strikes us as expanding rather than limiting freedom.
Nevertheless, the impact of such technology in shaping our lives is pronounced
and pervasive. On the other hand, at every turn our lives are shaped by large,
technology-driven organizations and structures over which we have no control:
traffic lights, the telephone company, the Internal Revenue Service, and
increasingly sophisticated terrorists. When we become victims of identity theft,
impersonal health maintenance organizations, or layoffs because of shifts in the
global economy, we experience how limited freedom is in an increasingly
complex technological society. And as we witness large-scale human events,
such as wars, genocide, and mass starvation, the presence or absence of
technology seems to be the primary causal factor.
The thesis of technological determinism, however, is not directly about us
as individuals. None of us controls every aspect of our lives. An appreciation of
our vulnerability, as individuals, to economic and political forces is part of
humility and intelligence. Technological determinism is the view that the primary
structures of human society are determined by technology, rather than human
beings (as a group) controlling technology. A few optimists hold this view,
confident that technology overall is beneficial for humanity. But technological
determinism is most vigorously supported by pessimists like Ellul, who believe
that "technology is autonomous"—driving us in ways that tend to subvert
human freedom and values.
How does Ellul defend such a sweeping thesis? He identifies each of the
main groups and features of moral choice that are usually assumed to govern
technology and then tries to show how each source of control is illusory.
Managers of corporations are driven to develop and apply technologies as
dictated by profit rather than moral values. Scientists and even engineers tend
to be naive about unintended side effects of technologies; consider, for
example, Einstein's initial support for nuclear weapons followed by his
opposition to them. Politicians are either crassly self-interested or
ideologically directed. Consumers are uninformed and duped by advertising,
and citizens are easily manipulated. As science progresses, new technological
possibilities become irresistible—if something can be done, it will be done.
And the entire process is driven by increasing concentrations of wealth, large
corporations, and government support.

Despite its grandiose scale, the thesis of technological determinism


remains a claim about the facts concerning causal relationships. As such, it
is open to empirical study. Obviously, clarifying what is meant by
technology is an important first step. As we noted, Ellul understands
technology broadly, to include the primary ways of thinking (for example,
instrumentalism) involved in technology, and also the organizational
structures and procedures used within technological organizations. Because
these ways of thinking and procedures involve human choices and
commitments, it can be argued that Ellul is not playing fair. Essentially, he
builds into "technology" the kinds of choices and procedures he dislikes and
is pessimistic about, and then he complains that they are dominating the
moral, religious, and community values he cherishes.
In fact, a very large body of careful interdisciplinary study has refuted
any strong version of technological determinism.23 Human choices matter!
Technology is not a juggernaut with a will of its own that renders all of
humanity its victims. To be sure, once major technological trends become
entrenched, they tend to carry a momentum of their own. But those trends
emerge as a combination of human freedom exercised within constraints
from past technologies and other factors. There is always a two-way
interaction between human choice and technological momentum.
A telling example is the automobile. In the United States, according to
early 1990s estimates, there was one automobile for every 1.7 U.S. citizens; one
in seven jobs was in car-related industries; one-fifth of retail dollars
centered on cars; 10 percent of arable land went to the car infrastructure;
and in Los Angeles two-thirds of the land space was used for cars.24 The
automobile's rise to dominance seems inevitable, once the basic
technology of internal combustion engines merged with Henry Ford's
assembly line production to make available a financially accessible product.
So do the effects of its dominance, which include the depletion of world oil
supplies, pollution, and tens of thousands of deaths each year. In fact,
despite this seeming inevitability, the emergence of the automobile is clearly
the cumulative product of decisions by corporations, consumers, and gov-
ernment. If technology dramatically influences us, we also shape the directions
of technology.
In STS studies, this two-way interaction often goes under the heading
of social constructionism. Social constructivists highlight the importance of
human perceptions and interpretations, emphasizing how different groups can
see a technological change in very different ways. As Wiebe E. Bijker
illustrates,
a nuclear reactor may exemplify to a group of union leaders an almost
perfectly safe working environment with very little chance of on-the-job
accidents compared to urban building sites or harbors. To a group of
international relations analysts, the reactor may, however, represent a
threat through enhancing the possibilities for nuclear proliferation,
while for the neighboring village the chances for radioactive emissions
and the (indirect) employment effects may strive for prominence.25
Critics of social constructionism, however, see it as neglecting the full pos-
sibility of moral reasoning about the values that ought to govern the
assessment of technological change. When done carefully, such assessments
focus on particular technologies and leave room for the kind of shared
responsibility for implementing values that we have emphasized in
this book.
Langdon Winner holds such a view. He is most famous for his book
Autonomous Technology, which developed some of Ellul's themes of
technological determinism. Nevertheless, he has always emphasized the
potential for reasoned human choices about technology. Especially in The
Whale and the Reactor, he makes it clear that our central problem is
"technological somnambulism"—the tendency to sleepwalk through technological
change, rather than "to examine, discuss, or judge pending innovations with
broad, keen awareness of what those changes mean."26
As one of many examples, Winner discusses the development of the
mechanical tomato harvester, which plucks and sorts tomatoes with a single pass
through fields. The cost of harvesting tomatoes was reduced significantly,
although tougher (and less flavorful) varieties of tomatoes had to be developed
to withstand the machinery. Yet, tens of thousands of jobs were permanently lost,
and thousands of small growers were forced out of business by the high costs of
the machines they could not afford. Funding for developing the new technology
came from California taxpayers, thereby supporting the financial interests of large
agribusiness at the expense of less powerful constituencies. Winner's point is that
democratic values require public understanding and debate of such changes, and
too often that does not occur.
Winner calls upon engineers to develop greater "political savvy" about
power relationships within their corporations and within the economic
system in which they work. Equally important, engineers need to develop
"political imagination"—an understanding of how their work affects public
life.
As part of mastering the fundamentals in their fields, engineers and other
technical professionals ought to be encouraged to ask: Can we imagine
technologies that enhance democratic participation and social equality?
Can we innovate in ways that help enlarge human freedom rather than
curtail it? How can planning for technological change include a concern for
the public good as distinct from narrowly defined economic interests?27
Winner develops these suggestions with discussions of the interplay of
engineering, politics, and free enterprise that we find illuminating. Indeed, his
insights resonate with our theme of shared responsibility among engineers,
managers, and the public for technological ventures in pursuing social
experiments.

10.1.4 Uncertainty, Ambiguity, and Social Experimentation


Uncertainty about general trends in technological change, as well as about specific
technologies, lies at the heart of debates about technological optimism. Although the pace
of scientific and engineering advancement is breathtaking, there is a lag in moral, social,
and political understanding. The contemporary world leaves ample room for
disagreements about which risks and benefits, in what degrees of each, surround new
technologies and the cumulative effects of older ones. The model of engineering as
social experimentation highlights this dimension of engineering, whatever one's specific
beliefs about the relevant facts concerning technological change.
The social experimentation model underscores how more is at stake than
straightforward disagreements about the hard facts. Perceptions of risk and benefit turn
in part on how facts are presented to individuals. Statistics are easily manipulated in the
direction one favors. Environmental impact statements can be phrased in language that
foregrounds, backgrounds, or selectively omits detailed information. Distinct from purely
factual disagreements, there are different responses to risks that center on values. Some
differences pertain to individual psychology: some people are more risk-averse than
others, either in general or with regard to specific activities such as flying on an airplane.
Other differences pertain to the core values endorsed by individuals. Individuals'
environmental ethics, for example, will be reflected in their responses to clear-cut logging
and strip mining. And safety is acceptable risk—acceptable in light of one's settled value
principles and knowing the pertinent facts. In all these ways, values need to be applied
contextually and with nuance, rather than globally with regard to all technology.

DISCUSSION QUESTIONS
1. Do you agree or disagree with the following passage from Alvin M. Weinberg? In
defending your view, discuss how values shape what counts as a quick fix, as distinct
from an unsuccessful "fix."
Edward Teller [inventor of the H-bomb] may have supplied the nearest thing to a
Quick Technological Fix to the problem of war. The hydrogen bomb greatly
increases the provocation that would precipitate large-scale war—and not because
men's motivations have been changed, not because men have become more tolerant
and understanding, but rather because the appeal to the primitive instinct of self-
preservation has been intensified far beyond anything we could have imagined be-
fore the H-bomb was invented.28
2. The distinguished British philosopher Bernard Williams (1929-2003) once wrote that
"Nuclear weapons are neither moral nor immoral—they are just piles of chemicals,
metals and junk."29 Identify and assess the instrumentalist concept of technology that
Williams seems to be using. Can we comprehend weapons of mass destruction without
grasping the aims with which they are developed and their intended functions? As an
additional example, discuss the notion of technology reflected in a definition of
handguns as "structured metal and bullets," and the claim that "guns don't kill people;
people do."
3. Each of the following claims, concisely stated by Merritt Roe Smith and Leo Marx,
has been explored in science and technology studies (STS). With regard to each
claim, (a) clarify what is being claimed, (b) identify the element of truth (if any) in
the claim, and (c) identify relevant truths neglected in the claim.
The automobile created suburbia. The atomic bomb divested Congress of
its power to declare war. The mechanical cotton-picker set off the migration
of southern black farm workers to northern cities. The robots put the riveters
out of work. The Pill produced a sexual revolution.30
4. Tiles and Oberdiek point out that the values embedded in technology are fluid rather
than fixed. Think of, and discuss, two examples illustrating this theme, in addition to
their example of lighthouses. For example, research and discuss the controversial drug
RU-486, which was developed as an abortion agent and later was found to have
promise in treating various diseases.
5. Robert Moses was given unprecedented power to shape the landscape of New York
City and surrounding areas.31 In exercising that power, he used several ways to block
minorities and low-income people, who depended on public transportation, from
having access to the state parks he developed, including blocking proposals to extend
railway access to them. His most ingenious way, however, was to order that key over-
passes and bridges be built a few feet lower than normal in order to block buses from
using convenient access roads. This is not an isolated case of how a distorted concep-
tion of social justice has in the past distorted engineering ethics, even though most dis-
tortions are less conscious and deliberate. Reflecting on neighborhoods you are famil-
iar with, can you think of an example of where the interests of a dominant economic or
racial group shaped an engineering project? Are such distortions less likely to occur
today, and if so, why—and owing to which shared values?
6. One of the most complex and also most studied urban transformations is Boston's
recently completed Central Artery/Tunnel Project. Research that project and discuss
how the five themes discussed in this chapter apply to it—both the themes of this book
and their analogs in STS and Philosophy of Technology studies. As references, good
starting points include: Clive L. Dym and Patrick Little, Engineering Design: A
Project-Based Introduction (New York: John Wiley & Sons, 2000), pp. 233-63; and
Thomas P. Hughes, Rescuing Prometheus: Four Monumental Projects that Changed
the Modern World (New York: Vintage, 1998), pp. 197-254.
7. The term Luddite is used to denote reactionary opposition to technological develop-
ment. In fact, the term derives from Ned Ludlum, a stocking maker who destroyed his
stocking frames in response to a reprimand from his employer. The term came to refer
to violent outbursts among textile workers in early nineteenth century England against
their employers. In most instances the rebellion was not against all technology, but only
against specific innovations that were perceived as threatening jobs and in some cases
reducing the quality of products. The rebellion was primarily due to extremely low
wages and poor working conditions at a time of general economic depression. Hence,
the Luddite movement illustrates how pessimism about technological trends is typically
rooted in wider economic and political conditions.32 Research the issue and write a paper
linking the Luddite movement and a related contemporary topic concerning technology of
your choosing.

10.2 MORAL LEADERSHIP


As managers, business entrepreneurs, corporate consultants, academics, and
government officials, engineers provide many forms of leadership in
developing and implementing technology—where "technology" is understood
broadly to include artifacts, knowledge, organizations, and approaches. In
this concluding section we focus on engineers as moral leaders within their
professions and communities who contribute to technological progress. We
will sample a few current activities that illustrate leadership within the
profession, and we will take note of ongoing challenges that will require
continuing moral leadership.
