Models, Simulations, and Their Objects
SERGIO SISMONDO
of populations, but also of estimates of the fecundity of its organisms, the carrying
capacity of the environment, and so on — it could easily be expanded to give a
more complex representation of any of these factors, and to add others — to create
an expected trajectory of population size. Models, then, take theories as inputs,
and in doing so connect theories to data; they generally make more precise
predictions than do theories, or make predictions where the theories can make
none.
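The shape of such a model can be sketched in a few lines of code. What follows is a minimal logistic-growth caricature, not the fisheries model discussed above; the function and every parameter value (fecundity, carrying_capacity, harvest_rate) are hypothetical illustrations of how theoretical ingredients combine into an expected trajectory.

# A minimal sketch of the kind of population model described, assuming
# logistic growth with a fixed harvest fraction; all values are hypothetical.
def population_trajectory(n0=1000.0, fecundity=0.4, carrying_capacity=10000.0,
                          harvest_rate=0.1, years=50):
    """Project an expected population trajectory, one step per year."""
    trajectory = [n0]
    n = n0
    for _ in range(years):
        growth = fecundity * n * (1.0 - n / carrying_capacity)  # logistic growth term
        n = max(n + growth - harvest_rate * n, 0.0)             # harvest a fixed fraction
        trajectory.append(n)
    return trajectory

if __name__ == "__main__":
    for year, n in enumerate(population_trajectory()):
        if year % 10 == 0:
            print(f"year {year:3d}: population = {n:,.0f}")

Expanding the model to represent any factor more finely is a matter of replacing a constant, such as harvest_rate, with a more complex expression.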
The material uses of the word "model" suggest that even mathematical models
should be analogues of physical systems, and as analogues they are tools for
understanding, describing, and exploring those systems. They should behave in
the same way as the things they represent behave. Models are therefore different
from theories not only in being applied, but in being analogues. Theories are now
typically thought not to represent particular physical systems or even classes of
physical systems, but underlying structures or necessities. According to the "se-
mantic" conception, the current favorite philosophical account of theories, theories
are defined by or specify families of models; theories provide structural ingredients
of models. A distinction between models and theories is not always made in
practice, however; and even when a distinction is drawn, some things are called
models by people who want to call attention to their inadequacies, and theories by
people who want to call attention to the insight that they provide.
Mary Hesse chooses as one origin point for debates about scientific models the
comments and arguments of Pierre Duhem in his 1914 book La Théorie Physique.
There Duhem ties models to national characters:
This whole theory of electrostatics constitutes a group of abstract ideas and
general propositions, formulated in the clear and precise language of geome-
try and algebra, and connected with one another by the rules of strict logic.
This whole fully satisfies the reason of a French physicist and his taste for
clarity, simplicity and order. ...
Here is a book [by Oliver Lodge] intended to expound the modern
theories of electricity and to expound a new theory. In it are nothing but
strings which move around pulleys, which roll around drums, which go
through pearl beads ... toothed wheels which are geared to one another and
engage hooks. We thought we were entering the tranquil and neatly ordered
abode of reason, but we find ourselves in a factory. (Quoted in Hesse 1966, 2)
While many English scientists would not have seen the factory metaphor as an
insult — William Ashworth (1996) shows the impact of descriptions of the brain as
a factory for thinking on Charles Babbage's ideas for his Difference Engine —
Duhem pushes what he sees as an insult further, making the well-known contrast
between the "strong and narrow" minds of Continental physicists, and the "broad
and weak" ones of the English. Duhem is keen not only to support French over
English science, but to support a picture of physical theory, soon to become
roughly the Logical Positivists' picture, as the creation of abstract and elegant
structures that predict and save empirical data. Hesse, drawing on the response to
Duhem by the English physicist N. R. Campbell, sees models as heuristically
essential to the development and extension of theories, and also essential to the
explanatory power of theories. Being able to derive statements of data from
theories is not to explain that data, though being able to derive statements of data
from theories via intuitive models is. Hence successful theories need models that
concretize, that provide metaphors in which to think. Hesse's 1960s work on
models and metaphors can now be seen as one of the arguments for the semantic
conception of theories, as opposed to the "received" view, no longer so received.
The theoretical models that Hesse describes are first and foremost analogies,
ways of bringing a theory to bear on the world by seeing one domain in terms of
another. The wave theory of light creates an analogy between light and waves in
media like water or air. But scientific models are much more various than that,
many of them depending upon no analogy between domains, unless one of those
domains is the mathematical formalism of the model itself, a possibility that
strains Hesse's picture beyond recognition. For example, the fisheries model
above doesn't posit any analogy (though there are undoubtedly many metaphors
hidden in the background), but itself stands in for different factors affecting
populations. The model does not essentially make use of some other analogy, but
is itself something of an analogue. By being an analogue it is expected to have
outputs that can be compared with data.
Computer simulations are similarly analogues, virtual copies of systems. Because
of the problems that computers are seen to solve efficiently, simulations are
usually more obvious analogues than are models. In most social simulations, for
example, the researcher defines "agents" with fixed repertoires of behaviors.
Those agents are then set to interact, responding at each moment to the actions of
others. The goal is to see what, if any, social patterns emerge, or what assumptions
are needed to create particular social patterns. Social simulations are analogues,
then, because components of the program are analogous to individuals, and those
components interact in a timeframe that is analogous to a sequence of moments in
real time. Simple one-to-one correspondences between virtual objects and real
ones (however idealized and simplified they might be) can be easily drawn. For
most non-computerized mathematical models such correspondences are more
difficult to draw, because preferred mathematical styles occlude individuals in
favor of aggregates or larger-scale relations. The components of an equation can
only rarely be neatly paired with obvious objects in the world, instead following a
logic defined by relations among objects.
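A minimal sketch can make such one-to-one correspondences concrete. The following toy simulation, with hypothetical names and parameters throughout, gives each "agent" a fixed two-action repertoire and lets it respond at each step to its neighbors' previous actions; clusters of conformity emerge from random starting actions.

# A sketch of the kind of social simulation described: agents with a fixed
# repertoire (actions "A" and "B") on a ring, each responding at every step
# to its two neighbors. All names and parameters are hypothetical.
import random

def run_simulation(n_agents=40, steps=20, seed=1):
    random.seed(seed)
    actions = [random.choice("AB") for _ in range(n_agents)]  # one entry per individual
    for _ in range(steps):                                    # one iteration per moment
        nxt = []
        for i in range(n_agents):
            left, right = actions[i - 1], actions[(i + 1) % n_agents]
            # fixed repertoire: copy your neighbors when they agree,
            # otherwise keep your own last action
            nxt.append(left if left == right else actions[i])
        actions = nxt
    return "".join(actions)

print(run_simulation())  # blocks of conformity emerge from random initial actions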
Because of their applicability we might want to say that a good model or simulation
is more true than the theories to which it is related. It makes predictions that are more
precise or more correct than associated theories, if those theories make any
predictions at all; for this reason Nancy Cartwright sees good models in physics as
correcting theories and laws. Cartwright's work turns on recognizing the impor-
tance of ceteris paribus clauses. Most explanations involving fundamental physical
laws use implicit ceteris paribus clauses, either to account for the fact that the
fundamental laws assume an idealized world, or to account for complex interac-
tions among causes. To take an example from her How the Laws of Physics Lie
(1983), even so simple a law as Snell's law for refraction needs to be corrected.
Snell's Law: At the interface between dielectric media, there is (also) a
refracted ray in the second medium, lying in the plane of incidence, making
an angle \(\theta_t\) with the normal, and obeying Snell's law:

\[ \sin\theta / \sin\theta_t = n_2 / n_1 \]

where \(v_1\) and \(v_2\) are the velocities of propagation in the two media, and
\(n_1 = c/v_1\), \(n_2 = c/v_2\) are the indices of refraction. (Miles V. Klein, in
Cartwright 1983, 46)
But, as Cartwright points out, this law is for media which are isotropic, having the
same optical properties in all directions. Since most media are optically anisotropic,
Snell's law is generally false, and needs to be corrected if it is to be applied in
particular situations.
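A short sketch shows what applying the law as stated involves, under exactly the isotropic idealization Cartwright flags; for an anisotropic medium this calculation would simply be wrong, which is her point. The indices and angles in the usage lines are illustrative.

# Snell's law as quoted, assuming isotropic media; sin(theta_t) = (n1/n2) sin(theta).
import math

def refraction_angle(theta_incident_deg, n1, n2):
    """Return the refracted angle in degrees, or None for total internal reflection."""
    s = n1 * math.sin(math.radians(theta_incident_deg)) / n2
    if abs(s) > 1.0:
        return None  # no refracted ray exists
    return math.degrees(math.asin(s))

print(refraction_angle(30.0, 1.0, 1.33))  # air into water: about 22.1 degrees
print(refraction_angle(80.0, 1.52, 1.0))  # glass into air: None (total internal reflection)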
While fundamental laws and theories lie about real situations, models that use
them or apply them can often get much closer to the truth. To see an easy case of
that, we might turn to a model of climate change. The basic theory of the
"greenhouse effect" is simple and firmly established. At equilibrium, the energy
absorbed by the earth's surface from the sun would balance the energy radiated
from the earth into space. However, the earth's surface radiates energy at longer
wavelengths than it absorbs, which means that accumulations in the atmosphere
of certain gases, like carbon dioxide and methane, will change the equilibrium
temperature by absorbing more energy at longer wavelengths than at shorter ones.
General Circulation Models, which are computer simulations, address questions
about the effects of greenhouse gases by assessing the interaction of many factors,
including: water vapor feedback with temperature, the effects of clouds at different
heights, the feedback with temperature of ice and snow's reflection of light, the
changing properties of soil and vegetation with temperature, and possibly the
effects of ocean circulation (IPCC 1995). Without taking into account these
interacting factors, and more, the basic theory says almost nothing useful, or even
true, about the real material effects of greenhouse gases.
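The contrast can be made vivid with a sketch of the basic theory alone: a zero-dimensional energy balance in which absorbed solar energy equals emitted longwave energy at equilibrium. It includes none of the feedbacks listed above; the constants are standard, but greenhouse_factor is a crude hypothetical stand-in for longwave absorption, not anything a General Circulation Model computes.

# Zero-dimensional energy balance: solve sigma * T^4 * (1 - g) = S * (1 - albedo) / 4.
SOLAR_CONSTANT = 1361.0   # W/m^2 at the top of the atmosphere
ALBEDO = 0.30             # fraction of sunlight reflected
SIGMA = 5.670e-8          # Stefan-Boltzmann constant, W/m^2/K^4

def equilibrium_temperature(greenhouse_factor=0.0):
    """Equilibrium surface temperature in kelvins for a given longwave absorption g."""
    absorbed = SOLAR_CONSTANT * (1.0 - ALBEDO) / 4.0  # averaged over the sphere
    return (absorbed / (SIGMA * (1.0 - greenhouse_factor))) ** 0.25

print(equilibrium_temperature(0.0))   # about 255 K: no greenhouse absorption at all
print(equilibrium_temperature(0.40))  # about 289 K: a crude greenhouse correction

Everything the text lists as the work of General Circulation Models — clouds, ice-albedo feedback, soil, oceans — is absent here, which is why such a basic model says almost nothing useful about real effects.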
Scientific theories are typically about idealized or abstract realms, which are
uncovered in the process of developing and establishing the theories. Because
theoretical reasoning is highly mobile, theories can achieve universality of recogni-
tion and respect that more particularistic scientific knowledge is less likely to
attain. In addition, theoretical reasoning often does important work in explaining
particulars; explanation usually is the placing of particulars in a persuasive
theoretical context, often one which indicates key causes. Therefore the realms that
theories describe can be taken as fundamental and real, despite their lack of
immediacy.
Some vaguely parallel points might be made about experiments. The category
of "experiment" has been the subject of sustained inquiry in Science and Technol-
ogy Studies for the past twenty years, and in that time it has been shown to be
considerably more interesting, both epistemologically and socially, than it appears
at face value (e.g. Hacking 1983; Latour 1983; Knorr Cetina 1981; Shapin 1984;
Rheinberger 1997). With the exception of natural experiments, in which scientists
achieve material control by drawing close but often problematic analogies between
different naturally occurring situations, experiments are about induced relations
between objects that are themselves often pre-constructed and purified. As such,
experiments' epistemic value depends upon the careful social discrimination be-
tween the natural and the artificial or artifactual. Science's line of discrimination
between natural and artificial became roughly fixed toward the end of the seven-
teenth century, and its exact location is renegotiated in different sciences on an
ongoing basis.
For solid theoretical and experimental claims, then, the object of representation
is given by nature, but it has become transparent with the acceptance of theoretical
and experimental styles of reasoning. In both cases the object domain is only made
manifest by human agency: theoretical claims are about idealized, transcendent,
or at least submerged structures, the details of which become clear with theoretical
work; experimental claims are about material structures which are almost always
not found in nature, but are rather constructed in the laboratory. And theory and
experimentation are each given more epistemological value because of
correspondences that are made between individual theories and experiments:
strategies of correspondence are well-developed.
Some of those strategies involve models and simulations, which form bridges
between theory and data. Gaston Bachelard argued that science alternates between
rationalism and realism. In its rationalist mode it creates complex mathematical
theories. In its realist mode it produces objects to match those theories, experimen-
tally. This picture is a good starting point for at least some sciences, but should be
supplemented by some understanding of the gap between theories and even
experimentally produced reality. Models help fill that gap, and thereby legitimize
the continuation of work on theory.
Mathematical models and computer simulations apply more concretely than
the theories they instantiate, but because they don't have the epistemic traditions
behind them that theorizing and experimentation have, they don't have transparent
object domains. Models and simulations are typically considered neither true nor
false, but rather more or less useful. Whereas the agency embedded in theoretical
and experimental knowledge per se has become largely hidden, models and
simulations are complex enough that they cannot be seen as natural: it is easy to
see the assumptions made in modeling. This is despite the fact that, following
Cartwright's down-to-earth intuitions, they are often more true to the material
world, and thus more natural, than at least the theories to which they are related.
Those theories, after all, make more assumptions about the material world, not
fewer. But the agency behind models and simulations is too visible to allow them to
easily represent transparent domains. They don't have a "home" domain, a meta-
physical category to which they unproblematically apply. Unlike heavily idealized
theories, they don't apply neatly to rationally-discovered Platonic worlds. Unlike
local facts, they don't apply neatly to natural or experimental phenomena. One
result of this is that when we see people arguing over a model, they are likely also
arguing over styles of inquiry and explanation. Sismondo (1999) argues that a
debate over an ecological model was in part an argument over what sort of
knowledge ecology would allow: whether it would allow highly abstract theoretical
knowledge or only knowledge closely tied to the material world. Breslau (1997)
argues that a dispute in the U.S. over the evaluation of federal training programs is
a dispute over the merits of different approaches to socio-economic research.
In the end, the above sort of talk of the domains of theory, experiment, and
models takes us back to an old problem. How should we understand the status of
object domains that appear to depend upon human agency, but have the solidity
that we attribute to independently real structures? We can explain the solidity in
terms of discipline, and assume that all such object domains are constructed. We
can explain the discipline in terms of solidity, and assume that human agency
wraps itself around the real. Or we can try to find terms in which to combine or
evade constructivist and realist discourses. Although this is not a question directly
addressed by any of the studies of this volume, taken as a whole they suggest that
some version of the last option is the only option: models and simulations are
misfits that don't sit comfortably in established categories.
Michael Redhead distinguishes models that relate to theories by impoverishing
or enriching them. Even in their relation only to theory, simulations and
complex models quite often violate Redhead's dichotomy by belonging to both
sides. Simulations may be enriched theories, adding considerable detail in the
effort to portray their objects, and simultaneously impoverished, substituting
approximations for intractable equations, in the effort to make them more appli-
cable. Both movements away from theory, though, are driven by the goal of
applicability: theoretical structures which are computationally unmanageable, or
unmanageably abstract, are inapplicable. Simulations, like models, stand between
theories and material objects, pointing in both directions. For example, a simula-
tion of AIDS transmission in intravenous drug users in a single country might
have parameters that represent quite detailed knowledge of drug use, the popula-
tion structure of drug users, responsiveness to treatment, and so on. At the same
time it may incorporate simplifying assumptions, relative to other models, about
the closure of its population, about the incubation pattern of AIDS, and so on
(see, e.g., Pasqualucci et al. 1998).
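A compartmental sketch of that kind of simulation, assuming a single closed population and none of the incubation detail, might look as follows; the structure and all parameter values are hypothetical simplifications, not those of Pasqualucci et al. (1998).

# A susceptible-infected sketch of HIV transmission in a closed population of
# injecting drug users; every parameter value here is a hypothetical illustration.
def project_infections(population=10000, initially_infected=50,
                       contact_rate=8.0,        # risky injections per person per year
                       transmission_prob=0.01,  # infection chance per risky injection
                       years=10, steps_per_year=12):
    dt = 1.0 / steps_per_year
    s = float(population - initially_infected)
    i = float(initially_infected)
    for _ in range(years * steps_per_year):
        # new infections: susceptibles sharing equipment with infected partners
        new = contact_rate * transmission_prob * s * (i / population) * dt
        s, i = s - new, i + new
    return i

print(f"infected after 10 years: {project_infections():.0f}")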
In addition, putting models and simulations merely in the context of theories
misses their often complex positioning in the world. Sometimes a theory will be
put to work in applied science, but if so it is combined with so many other types of
knowledge that it becomes only one component of a model or simulation. Take,
for example, Ragnar Frisch's 1933 "Rocking Horse Model" of the business cycle,
discussed in an elegant paper by Marcel Boumans (1999). The model consists of
three equations, relating such quantities as consumption, production, and the
money supply. As Boumans explains, there is a motivating metaphor behind the
model; Frisch imagines the economy as a "rocking horse hit by a club." The model
then brings physical knowledge to bear on the problem, through an equation
normally used to describe a pendulum damped by friction. Yet Frisch was not
merely enriching (or impoverishing) physical theory, because there are too many
other components of the model: basic economic relations, a theory about delays in
the production of goods and the deployment of capital, guesses of values of key
parameters, and so on. As Boumans argues, economic model-building is the
"integration of ingredients" so that the model meets "criteria of quality." Theories
play a role, but so do metaphors, mathematical techniques, views on policy, and
empirical data; for example, Frisch chose values for parameters to make sure that
his model would cycle at the same rate as real economic cycles. In addition, there
may be multiple theories playing roles, from different disciplines. Therefore we
shouldn't see models and simulations only in their relation to theories, bringing
theories into closer contact with data; models and simulations do many things at
once.
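The motivating metaphor itself can be sketched: a damped oscillator (the rocking horse) hit by random shocks (the club), showing how damped propagation plus irregular impulses yields sustained, irregular cycles. This is a one-equation caricature, not Frisch's three-equation model, and every parameter value is hypothetical.

# A damped oscillator driven by random shocks: x'' = -2*damping*x' - freq^2*x + shock.
import random

def rocking_horse(damping=0.1, frequency=1.0, shock_sd=0.5,
                  dt=0.1, steps=600, seed=2):
    random.seed(seed)
    x, v = 0.0, 0.0  # deviation of output from trend, and its rate of change
    path = []
    for _ in range(steps):
        shock = random.gauss(0.0, shock_sd)                   # the "club"
        a = -2.0 * damping * v - frequency ** 2 * x + shock   # the "rocking horse"
        v += a * dt
        x += v * dt
        path.append(x)
    return path

path = rocking_horse()
print(round(min(path), 2), round(max(path), 2))  # persistent, irregular cycling around zero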
Seeing models and simulations just in a space between theories and data, the
typical way of seeing them, misses their articulation with other goals, resources,
and constraints. There are resources provided and constraints imposed by the
media in which they operate, because they are created with the mathematical and
computing tools available. They are created to fit into particular cultures of research.
Work on models and simulations is like theoretical work in that the ostensible
object of representation is absent; modelers and simulators trade in symbols. They
produce representations, perhaps check those representations against data, and so
on. But modeling and (especially) simulation is like experimental work in that the
behavior of the model or simulation is the subject of investigation. From her many
interviews with people engaged with simulations, Deborah Dowling shows that
working with simulations is seen to have aspects of experimental work, despite its
being largely in the realm of representation. Researchers make small changes — to
parameters, initial conditions, the grain of calculation, etc. — and learn what
results. The flavor of such activities better matches the flavor of experimentation
than that of theorizing. Thus Dowling describes the ways in which simulations are
pulled back and forth between the categories of theory and experiment, depending
upon context.
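A trivial sketch conveys that experimental flavor: hold a simulation fixed and vary only the grain of calculation, then see what results. The model integrated here, simple exponential decay, is a hypothetical stand-in for whatever is actually being simulated.

# Varying the grain of calculation for y' = -rate * y, whose exact solution
# decays smoothly toward zero; only the step size changes between runs.
def simulate(step_size, total_time=10.0, y0=1.0, rate=2.5):
    """Euler integration of y' = -rate * y at a chosen step size."""
    y = y0
    for _ in range(int(total_time / step_size)):
        y += step_size * (-rate * y)  # one Euler step at the chosen grain
    return y

for h in (1.0, 0.5, 0.1, 0.01):
    print(f"step {h:5.2f}: y(10) = {simulate(h):+.3e}")
# at the coarsest grain the "decay" explodes into a growing oscillation;
# the experiment consists in watching how the output responds to such changes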
Modeling and simulation are also like experimentation in the pattern of give and
take involved in their creation. Daniel Breslau and Yuval Yonay frame their contribution to
this volume within a critique of the singular focus on metaphor in studies of
economics. Illustrating a microinteractionist approach to the sociology of eco-
nomics, they draw on studies by David Gooding, Ludwik Fleck, and Andrew
Pickering, to show that economic modelers have to perform, in Pickering's term, a
"dance of agencies" in much the same way as do experimenters. The materials that
modelers work with — particular established formulations and modeling tools —
are recalcitrant; they do not behave as the modelers would like, producing
too many answers, the wrong sort of answer, or intractable equations. In response,
the modelers have to find assumptions and tools that allow them to create objects
with the right disciplinary forms, objects capable of indicating unique solutions to
problems, sophisticated enough to be seen as advances in the field, and uncontrived
enough to produce epistemic solidity. This last condition is particularly important:
to avoid too-easy criticisms of assumptions, those assumptions have to conform to
disciplinary standards. As a result, the model is granted agency by the modelers.
Epistemic solidity for a model or simulation is tricky. The criteria that might be
applied depend upon the uses to which the object might be put, its complexity, the
available data, and the state of the field. As Eric Winsberg shows in his paper in
this volume, simulations and their components are evaluated on a variety of
fronts, revolving around fidelity to either theory or material: assumptions are
evaluated as close enough to the truth, or unimportant enough not to mislead;
approximations are judged as not introducing too much error; the computing
tools are judged for their transparency; graphics systems and techniques are
expected to show salient properties and relationships. All of these judgments and
more are difficult matters, and many do not have straightforward answers. In
some cases these judgments create an interesting type of uncertainty. In the global
warming debate, for example, a key type of number has become the estimate of the
sensitivity of temperatures to a doubling of carbon dioxide in the atmosphere.
Sensitivity is typically reported as a range, as in 1.5° to 4.5°, or as an estimate with
upper and lower bounds, as in 3° ± 1.5°. The range does not measure statistical
uncertainty, though, because the key General Circulation Models are made to be
deterministic. Rather, the range measures the confidence of the climatologists in
their models and their assumptions (Sluijs et al. 1998).
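A sketch can show why such a range is not statistical: each run below is fully deterministic, and the spread comes only from differing model assumptions (here, a single feedback parameter). The numbers are contrived to echo the canonical range, not derived from any real General Circulation Model.

# Deterministic sensitivity estimates: a fixed no-feedback warming amplified
# by each model's assumed feedback; no randomness enters anywhere.
def sensitivity(feedback):
    """Warming for doubled CO2 under an assumed net feedback strength."""
    base_warming = 1.2  # degrees C for 2xCO2 with no feedbacks, a common estimate
    return base_warming / (1.0 - feedback)

assumed_feedbacks = [0.20, 0.45, 0.60, 0.733]  # one hypothetical model each
estimates = [sensitivity(f) for f in assumed_feedbacks]
print(f"range: {min(estimates):.1f} to {max(estimates):.1f} degrees C")

The printed range reflects disagreement among the assumed feedbacks, that is, the modelers' confidence in their assumptions, not sampling error.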
In their focus on examining the warrant for fundamental theories, philosophers
of science have almost completely neglected the processes involved in applying
such theories. When philosophers do address application, it is generally assumed
that application is little more than deriving data from equations, with the help of
the right parameters and assumptions. But the papers of this volume show that
modeling and simulation, typical modes of application, are anything but straight-
forward derivation. Applied theory isn't simply theory applied, because it instan-
tiates theoretical frameworks using a logic that stands outside of those frameworks.
Thus Winsberg calls for an "epistemology of simulation," which would study the
grounds for believing the results of complex models and simulations.
Because they are supposed to be analogues, models and simulations are themselves
studied in the way that natural systems might be. Knowledge about them is
supposed to run parallel to knowledge about the things that they represent, which
allows modeling to be like experimentation, in both Dowling's and Breslau and
Yonay's senses. Researchers can learn about the behavior of models and simula-
tions, or make them behave naturally, and still be doing properly scientific research.
But as objects they are also open to use in more instrumental contexts, providing
inputs for other research. That is, if treated as black boxes the data they produce
can sometimes simply feed into other research.
Martina Merz's paper here is an ethnographic study of event generators, com-
puter programs in high energy physics that simulate the effect of beams of particles
colliding with other particles. They are important for a number of reasons: they
play a role in data analysis, being used as a template for comparison with real data;
they are used to test simulations of detectors. But they are not just tools, because
creating them and using them is doing physics, too. Merz argues, using terminology
adopted from Hans-Jörg Rheinberger and Karin Knorr Cetina, that event genera-
tors manage to be simultaneously epistemic objects and technological things. As
epistemic objects what is valued is their open-endedness, their ability to behave
in unforeseen ways.
Protestants. Each of these pillars came with its own politics, down to the level of
economic decisions. At a time when fractures were severe, economists had to find
an authoritative approach. A failed approach attempted to synthesize the perspec-
tives of the pillars. The successful approach turned on a confluence between the
neutrality of modeling and the particular style of Dutch tolerance, a careful and
studied tolerance that recognizes the need to grant factions and positions their
autonomy, even while not granting them respect. The successful model could
become objective by being neutral, thereby becoming a tool for and an arbiter
between all of the pillars; the successful modelers could create a monopoly over
some central portions of economic planning.
Department of Philosophy
Queen's University
Kingston, Canada, K7L 3N6