STRANGE EXPECTATIONS*

IAN HACKING
Stanford University

A new problem about mathematical expectation: there exists a state of
affairs S and options H and T such that in every element of one partition
of S, the expectation of H exceeds that of T, while in every element of
a different partition of S, the expectation of T exceeds that of H. This
problem may be connected with questions about inference in the short and
long run, and with questions about confidence intervals and fiducial
probability.

Here is a puzzle about expectation with an application to the
distinction between repeated use of a gambling policy, and its use
when only a single bet is to be made.1 A gambling set-up offers
two strategies, H and L. There are two ways to partition possible
states of affairs. In one partition we obtain a set of possibilities in
each of which H is superior to L. But in the other partition of the
same possibilities we obtain a set in each of which L is better than
H!
We can even create a gambler's paradise. A betting house offers
a game to people who arrive at its doors, a come-on in which each
bettor plays just once. Each customer is given some information and
may decide to play or to pass. The house forms a reasonable model
of its customers' behaviour, and finds that it has a positive expectation
of profit, namely a little less than $10.00 per client, averaged over
those who play and those who do not. The gamblers who pass neither
win nor lose anything, but each of those who plays believes that
he has a positive expectation. So the house thinks it is going to

*Received December 1979; revised February 1980.


1 This puzzle was first presented as part of my contribution to the symposium on
randomness at the Biennial Meeting of the Philosophy of Science Association, October
1972. I did not publish it in the Proceedings because I hoped the difficulty would
have a clean solution, but I have not yet been told of one that I find convincing.
The puzzle was suggested by a different case with bounded utilities and improper
priors: D. V. Lindley (1971). Lindley's motivation was exactly the opposite of my
considerations about the distinctions between unique case and long run strategies.
It was intended to bring out another absurdity in Fisher's fiducial argument, already
known on occasion to have infinitely bad operating characteristics. I contend that
bad operating characteristics need not mean that a policy is bad in a unique case.
On a related point about expectation, see Peter Enis (1973).

Philosophy of Science, 47 (1980), pp. 562-567.
Copyright © 1980 by the Philosophy of Science Association.

make money, but every player thinks he is getting a bargain, playing
in a better than fair game. I believe that both the house and the
gamblers are correct. The reasonable strategy for the gambler playing
only once is the opposite of the reasonable strategy for the house,
running a large number of similar games. The long run and the unique
case meet to form a paradoxical paradise.
The puzzle has affinities with the Petersburg paradox. In Daniel
Bernoulli's game a fair coin is tossed and there is a payoff when
it first falls heads. If it falls heads on the first toss you win $2;
if you have to wait until the second toss, you win $4. In general
you get $2^k if the first head occurs on the kth toss. The mathematical
expectation is unbounded (does not exist). What is the fair price
for entering this game? Some argue that the game is a bargain at
any finite price, yet few of us would pay even $25 to enter such
a game. Despite the long history of this problem, I know of no
comprehensive and thoroughly satisfying discussion of it.
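For the record, the divergence is immediate: the first head falls on toss k with probability (1/2)^k and then pays $2^k, so

    E = \sum_{k=1}^{\infty} \left(\frac{1}{2}\right)^{k} 2^{k} = \sum_{k=1}^{\infty} 1,

which has no finite value.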
My gambling house works as follows. A customer writes in advance,
giving notice of his intention to come. He is assigned a number θ,
chosen by a random device. On his arrival a different number x
is assigned to him, also by chance. Knowing neither number, he
is to wager whether x is high or low, that is, greater than or less
than θ.
The first number θ is fixed as zero, positive, or negative, each with
probability 1/3, say by rolling a fair die. If θ is not zero its absolute
value is the number of tosses required, Petersburg-style, to get a
head with a fair coin. Hence the distribution of θ is:

    θ:      . . .   -3      -2      -1      0       1       2       3    . . .
    P(θ):   . . .   1/24    1/12    1/6     1/3     1/6     1/12    1/24  . . .

that is, P(θ = 0) = 1/3 and P(θ = ±k) = (1/3)(1/2)^k for k = 1, 2, . . . .
As for x, it is with equal probability θ + 1 and θ - 1. The payoffs are as follows:

If you call high, i.e. wager that x > θ, and you are right, you
win 3^x dollars, but if you are wrong you lose 5(3^x). If you call
low and are right, you win 5(3^x) dollars, but if you are wrong,
you lose 3^x.
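The set-up can be rendered in a few lines of code. The sketch below (Python; the function names and the 'high'/'low' labels are mine, not the paper's) simply samples θ, sets x = θ ± 1, and applies the payoff rule just stated:

    import random

    def sample_theta():
        # theta is zero, positive, or negative, each with probability 1/3;
        # a nonzero |theta| is the number of fair-coin tosses needed for a head.
        sign = random.choice((-1, 0, 1))
        if sign == 0:
            return 0
        k = 1
        while random.random() < 0.5:      # tail: toss again
            k += 1
        return sign * k

    def payoff(call, x, theta):
        # The player's gain in dollars. 'high' wagers x > theta, 'low' wagers x < theta;
        # a correct 'high' pays 3^x and a wrong one costs 5(3^x); 'low' is the reverse.
        right = (x > theta) if call == 'high' else (x < theta)
        if call == 'high':
            return 3**x if right else -5 * 3**x
        return 5 * 3**x if right else -3**x

    theta = sample_theta()
    x = theta + random.choice((-1, 1))    # x is theta + 1 or theta - 1, equally likely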
Possible outcomes may be partitioned either in terms of values of
x or in terms of values of θ. Now for each θ, the expectation of
calling high is 2(3^(θ-1)) dollars, while the expectation of calling low
is minus that amount. In short, for every value of θ the expectation
of strategy H (call 'high') dominates strategy L (call 'low').
Yet for every value of x, L dominates H! For x = 0, the expectation
of calling low is $2. If x > 0, the expectation is 3^x/5 dollars, while
if x < 0, the expectation is (19/5)(3^x) dollars. The expectations for
strategy H are minus these amounts.
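These conditional expectations can be checked by exact enumeration. The following sketch (Python; the helper names are mine) uses the prior on θ and the payoff rule given above, and confirms the figures just quoted:

    from fractions import Fraction as F

    def prior(theta):
        # P(theta = 0) = 1/3 and P(theta = ±k) = (1/3)(1/2)^k for k >= 1.
        return F(1, 3) * F(1, 2)**abs(theta)

    def gain(call, x, theta):
        # Player's gain: a correct 'high' wins 3^x and a wrong one loses 5(3^x); 'low' is the reverse.
        right = (x > theta) if call == 'high' else (x < theta)
        win, lose = (F(3)**x, 5 * F(3)**x) if call == 'high' else (5 * F(3)**x, F(3)**x)
        return win if right else -lose

    def e_given_theta(call, theta):
        # Given theta, x is theta + 1 or theta - 1, each with probability 1/2.
        return F(1, 2) * (gain(call, theta + 1, theta) + gain(call, theta - 1, theta))

    def e_given_x(call, x):
        # Given x, theta is x - 1 or x + 1, weighted by the prior.
        w = {t: prior(t) for t in (x - 1, x + 1)}
        z = sum(w.values())
        return sum((p / z) * gain(call, x, t) for t, p in w.items())

    for theta in range(-4, 5):
        assert e_given_theta('high', theta) == 2 * F(3)**(theta - 1)   # 2(3^(theta-1))
    assert e_given_x('low', 0) == 2                                    # $2 at x = 0
    for x in range(1, 9):
        assert e_given_x('low', x) == F(3)**x / 5                      # 3^x / 5 for x > 0
    for x in range(-6, 0):
        assert e_given_x('low', x) == F(19, 5) * F(3)**x               # (19/5)(3^x) for x < 0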
If you come to my gambling house, should you call high or low?
The prudent person will not gamble at all, because with a large x
one risks too much. For example if your value of x turned out to
be 7, you could win $10,935, but you would also stand to lose $2,187.
Because of a certain loss of nerve among its clientele the house
makes a further offer. The terms of play are the same: you write
in advance, to have your private value of θ fixed by chance. Then
you arrive and are assigned an x. But now you are actually told
your value of x. The house assumes that for low enough x people
will not play because the stakes and profits are too trifling. It also
assumes that as x increases fewer and fewer people will risk playing.
Its clientele is prosperous, and everyone will risk $3,000, but after
that more and more customers decide to pass. The approximate model
constructed by the house is as follows.
First, no customer will play if his x is less than 1. But every customer
will play with positive x until losing his nerve. Each customer loses
his nerve for large enough x.
Secondly, every customer will play if his x is positive and less
than or equal to 7. But about one quarter of those customers whose
x is 8 will drop out, and this decision, to pass, increases geometrically.
I say "about one quarter." In order to achieve a little arithmetical
chicanery, the geometrical multiplier is set at √(5/9) ≈ .745. Thus
in numbers, only 5/9 of those whose x = 9 continue to play and
these people risk a loss of $19,683, although they may gain $98,415.
Each computes his expectation at $3,936.60. Only 25/81 = 31% of
those whose x is 11 choose to play, and so on.
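As a check on the arithmetic, a short computation (Python; play_fraction is my name for the house's assumed proportion of players at a given x) reproduces these numbers:

    from fractions import Fraction as F
    import math

    R = math.sqrt(5 / 9)                  # the geometrical multiplier, about .745

    def play_fraction(x):
        # House's model: nobody plays below x = 1, everybody plays up to x = 7,
        # and there is a further factor of R for every step of x beyond 7.
        if x < 1:
            return 0.0
        return 1.0 if x <= 7 else R**(x - 7)

    print(play_fraction(9))               # 5/9 = 0.555...
    print(play_fraction(11))              # 25/81 = 0.308..., about 31%
    print(3**9, 5 * 3**9)                 # a $19,683 stake against a possible $98,415
    print(float(F(3)**9 / 5))             # that player's expectation: 3936.6 dollars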
Finally the house assumes that each customer is an independent
rational agent. The clients are not in cahoots with each other, and
each acts so as to maximize his own mathematical expectation relative
to the data available to him.
The house then reasons as follows. Each customer will call low,
or else drop out. Among those for whom θ is 8 or greater, the sums
expected to be lost to those who stay in and correctly call 'low' (for
θ = 8, those whose x is 7) exactly equal the sums expected to be gained
from those who stay in and incorrectly call 'low' (for θ = 8, those whose
x is 9). (That is the point of the arbitrary geometrical multiplier
√(5/9).) Nobody whose θ is less than 0 plays. The house uses the
probabilities of getting θ = 0, 1, 2, . . ., 7, and the expected profits
and losses for each, to infer that its mathematical expectation is, as
it works out, $9.46 per customer. This is the average profit per customer,
including those who choose not to play.
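The $9.46 can be recovered by brute force. The sketch below (Python; the reading of the house's model is the one just described, and the names are mine) enumerates the values of θ and, for each, the two equally likely values of x:

    import math
    from fractions import Fraction as F

    R = math.sqrt(5 / 9)                      # drop-out multiplier per step of x beyond 7

    def prior(theta):
        # P(theta = 0) = 1/3 and P(theta = ±k) = (1/3)(1/2)^k.
        return F(1, 3) * F(1, 2)**abs(theta)

    def play_fraction(x):
        if x < 1:
            return 0.0
        return 1.0 if x <= 7 else R**(x - 7)

    def house_take(x, theta):
        # Every player calls 'low': the house gains 3^x when theta = x - 1
        # and pays out 5 * 3^x when theta = x + 1.
        return 3**x if theta == x - 1 else -5 * 3**x

    total = 0.0
    for theta in range(0, 40):                # theta < 0 never produces a player
        for x in (theta - 1, theta + 1):      # each has conditional probability 1/2
            total += float(prior(theta)) * 0.5 * play_fraction(x) * house_take(x, theta)

    print(round(total, 2))
    # About 9.47; the text's $9.46 presumably uses the multiplier rounded to .745
    # in the theta = 7 term. For theta >= 8 the expected gains and losses cancel.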

Here, then, is the gamblers' paradise. The house is persuaded that
it will make a profit, yet at the same time, every player believes
he has a finite positive expectation. Moreover customers do not have
to play for trifling stakes, and, as risks escalate, an increasing proportion
of customers drop out.
The issues are somewhat subtle. First, the house is not correct
in saying that its mathematical expectation is $9.46. Letting U be
the random variable denoting house profits and losses, we do obtain
a finite expectation E(U/θ) for every θ. But we cannot reason that
just because

    Σ_θ E(U/θ) P(θ) = $9.46,

E(U) = $9.46; we could infer that only after proving that E(U)
exists.
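Indeed, on the model as described it does not exist: for θ ≥ 8 the expected sums gained and lost by the house both grow geometrically, each term being a constant multiple of (√5/2)^θ with √5/2 > 1, so neither the positive nor the negative part of U has a finite expectation. A sketch of the check (Python; gain_part is my name for the partial sums of the expected gains):

    import math

    R = math.sqrt(5 / 9)

    def gain_part(N):
        # Expected gross gain to the house from players with 8 <= theta <= N:
        # these have x = theta + 1, call 'low', and lose 3^(theta + 1) to the house.
        total = 0.0
        for theta in range(8, N + 1):
            p_theta = (1 / 3) * 0.5**theta
            total += p_theta * 0.5 * R**(theta - 6) * 3**(theta + 1)
        return total

    for N in (20, 40, 80):
        print(N, round(gain_part(N)))
    # The partial sums grow without bound; the expected losses (to players with
    # x = theta - 1, who call 'low' correctly) match them term by term.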
Some people, to whom I have propounded this puzzle, are inclined
to say that the mistaken party is not the house but my body of
gamblers, each of whom believes he has positive expectations. The
reasoning is that if we look at the fortunes of the gamblers as a
whole, we shall see that they lose money by their strategy of calling
'low'. This can mean two different things: (a) the mathematical
expectation of a set of gamblers who call 'high' (or equivalently,
of the house, when gamblers call 'low') is positive. Or it can mean
something like (b) a large number of gamblers, calling 'high', will
in fact make an average profit on the stated terms, or, equivalently,
if gamblers call 'low' the house will on average make a profit. Now
(a) is just a mistake, the fallacy of summing expectations stated in
the previous paragraph. But I too am among those who believe (b),
although I am hard pressed to say exactly what I believe.
How can I believe (b), that averaging the result of a lot of gamblers
calling 'low' will give a net loss per gambler, and still think that
each gambler on his own should call 'low'?
My answer is that this example seems to bring out a quite different
point, namely the difference between gambling in a unique case, and
gambling in repeated trials. Let us describe yet another offer that
the house might contemplate. It will allow for syndicates to play
against it. Like an individual, a syndicate writes off to obtain its
θ. Then members of the syndicate arrive. The first member is allowed
to see his value of x, to decide whether to commit the syndicate;
he then commits the syndicate or drops out, on behalf of all his
colleagues. He calls high or low, and so does each member of the
syndicate in turn; here θ is fixed and unknown, but x is fixed for
each member as he arrives, and he calls low or high; of course some
will be correct, and some erroneous. In this situation the correct
strategy is for each member of the syndicate to call 'high'. Indeed
no house would ever make such an offer, because a rational syndicate
will certainly make a profit.
Note that the case of the syndicate is one with fixed θ and variable
x; it is to be distinguished from the case of cahoots, in which the
customers secretly conspire against the gambling house to average
their profits when both θ and x are variable.
My opinion, which will be controversial, is that in the unique case,
first described in this paper, the rational gambler will call low unless,
in the light of x, he decides not to take the risk. But in the case
of the syndicate each member of a rational group will on the same
'data' call high.
One might dismiss the gambler's paradise on the ground that it
plays fast and loose with unbounded utilities. But the single player
knowing his x, and the syndicate whose first bettor knows his x,
are in a position to reason using utilities that are entirely bounded.
The numerical contrast between the two cases is striking. Let the
observed x be 7. Then the conditional probabilities, that θ is 6 or
8, are 4/5 and 1/5 respectively. The single player calling low has
an expectation of $437.40. But the syndicate leader reasons that if
θ is 6 and every member of his team calls high, their individual
expectations are $486, while if θ is 8, their expectations are $4374
(with a weighted average of $1263.60). This is not a paradox but
a fact about the probabilities and utilities of the two possible values
of θ, 6 and 8, and the three values of x that the syndicate might
experience, consistent with an observed first x of 7.
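For what it is worth, these four figures can be checked in a few lines (Python; the variable names are mine, and the posterior over θ is computed from the prior given earlier):

    from fractions import Fraction as F

    # Posterior over theta once the first member has seen x = 7.
    prior = {6: F(1, 3) * F(1, 2)**6, 8: F(1, 3) * F(1, 2)**8}
    z = sum(prior.values())
    post = {t: p / z for t, p in prior.items()}               # {6: 4/5, 8: 1/5}

    # A single player with x = 7 who calls 'low'.
    e_low = post[8] * 5 * 3**7 - post[6] * 3**7
    print(float(e_low))                                       # 437.4 dollars

    # Syndicate members all call 'high'; each member's own x is theta + 1 or theta - 1.
    def e_high(theta):
        return F(1, 2) * (3**(theta + 1) - 5 * 3**(theta - 1))

    print(e_high(6), e_high(8))                               # 486 and 4374 dollars
    print(float(post[6] * e_high(6) + post[8] * e_high(8)))   # 1263.6, the weighted average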
Students of statistical inference will notice a certain parallel with
some familiar issues. For the syndicate and for the individual bettor
alike, θ is like a fixed unknown parameter, and x is like a measurement
made by an eccentric instrument. One requires a strategy for inferring
θ in the light of x. It is commonly supposed that one wants a strategy
with the best 'long run operating characteristics.' If we had a bunch
of idiots savants, each drawing inferences about the unknown θ in the
light of his own measurement x, but never comparing his x with
measurements made by his fellow idiots, then we would be in the situation
of the syndicate, and would require the rule with the best long run operating
characteristics. On my view, the correct strategy is for each investigator
to call 'high'. But, I contend, a single investigator, confronted by
just the piece of information about x, should guess that x is low,
even though this is not the policy which would maximize average
mathematical utility for a lot of similar investigators.

REFERENCES
Enis, P. (1973), "On the Relation E(X) = E(E(X/Y))," Biometrika 60: 432-433.
Lindley, D. V. (1971), Bayesian Statistics, A Review. Philadelphia: SIAM.
