6.825 Exercise Solutions, Decision Theory


Decision Theory I

Dr. No has a patient who is very sick. Without further treatment, this patient will die in about 3 months. The only treatment alternative is a risky operation. The patient is expected to live about 1 year if he survives the operation; however, the probability that the patient will not survive the operation is 0.3.

1. Draw a decision tree for this simple decision problem. Show all the probabilities and outcome values.

Solution:
[Decision tree: operation → {live (0.7): U(12); die (0.3): U(0)}; no operation → U(3).]

2. Let U(x) denote the patient's utility function, where x is the number of months to live. Assuming that U(12) = 1.0 and U(0) = 0, how low can the patient's utility for living 3 months be and still have the operation be preferred? For the rest of the problem, assume that U(3) = 0.8.

Solution: The expected utility of the operation is 0.7 · U(12) + 0.3 · U(0) = 0.7, so the operation is preferred as long as U(3) < 0.7.
[Evaluated tree: operation → 0.7 · 1 + 0.3 · 0 = 0.7; no operation → U(3); value of the decision = max(U(3), 0.7).]

3. Dr. No finds out that there is a less risky test procedure that will provide uncertain information that predicts whether or not the patient will survive the operation. When this test is positive, the probability that the patient will survive the operation is increased. The test has the following characteristics:

True-positive rate: The probability that the results of this test will be positive if the patient will survive the operation is 0.90.
False-positive rate: The probability that the results of this test will be positive if the patient will not survive the operation is 0.10.

What is the patient's probability of surviving the operation if the test is positive?

Solution:

Pr(survive | pos) = Pr(pos | survive) Pr(survive) / Pr(pos)
= Pr(pos | survive) Pr(survive) / [Pr(pos | survive) Pr(survive) + Pr(pos | ¬survive) Pr(¬survive)]
= (0.9 · 0.7) / (0.9 · 0.7 + 0.1 · 0.3) = 0.63 / 0.66 = 0.9545.
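The Bayes computation can be checked numerically; a minimal sketch (the helper name and parameters are ours, not the course's):

```python
# Pr(survive | positive test) from the prior and the test's characteristics.
def posterior_survival(prior, tpr, fpr):
    p_pos = tpr * prior + fpr * (1 - prior)  # Pr(pos), by total probability
    return tpr * prior / p_pos

print(round(posterior_survival(prior=0.7, tpr=0.9, fpr=0.1), 4))  # 0.9545
```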

4. Assuming the patient has the test done, at no cost, and the result is positive, should Dr. No perform the operation?

Solution: Yes. EU(op) = Pr(survive | pos) · U(12) = 0.9545 > 0.8 = U(3).

5. It turns out that the test may have some fatal complications, i.e., the patient may die during the test. Draw a decision tree showing all the options and consequences of Dr. No's problem.

Solution: For this problem, we need to calculate Pr(survive | neg) as above: Pr(survive | neg) = (0.1 · 0.7) / (0.1 · 0.7 + 0.9 · 0.3) = 0.07 / 0.34 = 0.2059. Then we have the following decision tree:
[Decision tree: take test → {die during test: U(0); survive test → pos (0.66): operate (live 0.95 → U(12), die 0.05 → U(0)) or no op (U(3)); neg (0.34): operate (live 0.21 → U(12), die 0.79 → U(0)) or no op (U(3))}; don't take test → EU(no test).]

6. Suppose that the probability of death during the test is 0.005 for the patient. Should Dr. No advise the patient to have the test prior to deciding on the operation? Solution: Yes, the test should be taken. The evaluated decision tree is as follows:
[Evaluated tree: pos (0.66) → operate, value 0.95; neg (0.34) → don't operate, value 0.8; surviving the test (0.995) is worth 0.90, so take test → 0.896 vs. don't take test → 0.8.]
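The rollback can be reproduced by backward induction; a small sketch assuming the utilities U(12) = 1, U(3) = 0.8, U(0) = 0 from the problem (variable names are ours, and the result matches the tree's 0.896 up to rounding of the posteriors):

```python
# Backward induction on Dr. No's tree, using the numbers from the solutions above.
U12, U3, U0 = 1.0, 0.8, 0.0
p_surv_pos, p_surv_neg = 0.9545, 0.2059  # Pr(survive op | test result)
p_pos = 0.66                             # Pr(test positive)
p_survive_test = 0.995                   # the test itself is fatal w.p. 0.005

# At each decision node, take the better action (operate vs. don't operate).
eu_pos = max(p_surv_pos * U12 + (1 - p_surv_pos) * U0, U3)  # operate wins
eu_neg = max(p_surv_neg * U12 + (1 - p_surv_neg) * U0, U3)  # no-op wins
eu_take_test = p_survive_test * (p_pos * eu_pos + (1 - p_pos) * eu_neg)
print(round(eu_take_test, 3))  # ~0.897 > 0.8 = EU(no test), so take the test
```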

Decision Theory II
You go to the racetrack. You can:

- Decline to place any bets at all.
- Bet on Belle. It costs $1 to place a bet; you will be paid $2 if she wins (for a net profit of $1).
- Bet on Jeb. It costs $1 to place a bet; you will be paid $11 if he wins (for a net profit of $10).

You believe that Belle has probability 0.7 of winning and that Jeb has probability 0.1 of winning.

1. Your goal is to maximize the expected value of your actions. What, if any, bet should you place, and what is your expected value? Draw the decision tree that supports your conclusion. Assume that you are risk-neutral.

Solution: You should bet on Belle. The expected utility is U($0.40).
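The expected values of the racetrack bets check out directly (a sketch, treating dollars as utility under risk-neutrality):

```python
# Risk-neutral expected values for the three options.
ev_none  = 0.0
ev_belle = 0.7 * 1 + 0.3 * (-1)    # win $1 net w.p. 0.7, else lose the $1 stake
ev_jeb   = 0.1 * 10 + 0.9 * (-1)   # win $10 net w.p. 0.1
print(round(ev_belle, 2), round(ev_jeb, 2))  # Belle: 0.4, Jeb: 0.1
```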

2. Someone comes and offers you gambler's anti-insurance. If you agree to it, they pay you $2 up front; you agree to pay them 50% of any winnings (that is, $0.50 if Belle wins, and $5 if Jeb wins). How would it affect the expected value of each of your courses of action? What would be the best action to take now, again assuming risk-neutrality? Draw the new decision tree.

Solution: You should take the anti-insurance and bet on Belle. The expected utility is U($2.05) (see figure 1).
[Evaluated tree: with anti-insurance → Belle: 0.7 · $2.50 + 0.3 · $1 = $2.05, Jeb: 0.1 · $7 + 0.9 · $1 = $1.60; without it → Belle: 0.7 · $1 + 0.3 · (-$1) = $0.40, Jeb: 0.1 · $10 + 0.9 · (-$1) = $0.10.]

Figure 1: Decision tree for the betting problem
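A quick sketch of the anti-insurance arithmetic (the function is ours; "winnings" means net profit, consistent with the $0.50 and $5 figures in the problem):

```python
# Expected value of a bet with the anti-insurance deal taken.
def ev_with_anti_insurance(p_win, net_profit):
    win  = 2.0 + 0.5 * net_profit   # keep the $2 plus half the net profit
    lose = 2.0 - 1.0                # $2 up front minus the $1 stake
    return p_win * win + (1 - p_win) * lose

print(round(ev_with_anti_insurance(0.7, 1.0), 2))   # Belle: 2.05
print(round(ev_with_anti_insurance(0.1, 10.0), 2))  # Jeb: 1.6
```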

Decision Theory III

You're an Olympic skier. In practice today, you fell down and hurt your ankle. Based on the results of an x-ray, the doctor thinks that it's broken with probability 0.2. So, the question is: should you ski in the race tomorrow?

If you ski, you think you'll win with probability 0.1. If your leg is broken and you ski on it, then you'll damage it further. So, your utilities are as follows: if you win the race and your leg isn't broken, +100; if you win and your leg is broken, +50; if you lose and your leg isn't broken, 0; if you lose and your leg is broken, -50. If you don't ski, then if your leg is broken your utility is -10, and if it isn't, it's 0.

1. Draw the decision tree for this problem.
2. Evaluate the tree, indicating the best action choice and its expected utility.

Solution: EU(ski) = 0 and EU(not ski) = -2, so we ski (see figure 2).
[Evaluated tree: ski → win (0.1): 0.2 · 50 + 0.8 · 100 = 90; lose (0.9): 0.2 · (-50) + 0.8 · 0 = -10; EU(ski) = 0. Don't ski → broken (0.2): -10; fine (0.8): 0; EU(don't ski) = -2.]
Figure 2: Decision tree for the skiers choice.
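Rolling back the skier's tree numerically, with the utilities from the problem statement (a sketch; variable names are ours):

```python
# Backward induction: average over the leg's state at each chance node.
p_broken, p_win = 0.2, 0.1
u_ski_win  = p_broken * 50  + (1 - p_broken) * 100   # 90
u_ski_lose = p_broken * -50 + (1 - p_broken) * 0     # -10
eu_ski  = p_win * u_ski_win + (1 - p_win) * u_ski_lose
eu_dont = p_broken * -10 + (1 - p_broken) * 0
print(round(eu_ski, 6), round(eu_dont, 6))  # 0.0 -2.0: skiing is preferred
```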

You might be able to gather some more information about the state of your leg by having more tests. You might be able to gather more information about whether you'll win the race by talking to your coach or the TV sports commentators.

3. Compute the expected value of perfect information about the state of your leg.

Solution: Given perfect information about the leg, we have the tree in figure 3, so the expected value of the information is E(U_info) - E(U_noinfo) = 6 - 0 = 6.

4. Compute the expected value of perfect information about whether you'll win the race.

Solution: Given perfect information about winning, we have the tree in figure 4, so the expected value of the information is E(U_info) - E(U_noinfo) = 7.2 - 0 = 7.2.
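Both value-of-information computations follow the same recipe: learn the answer first, pick the best action in each case, and compare with the no-information expected utility (0, from skiing). A sketch:

```python
# EVPI for the leg's state and for the race outcome.
p_broken, p_win = 0.2, 0.1
eu_no_info = 0.0  # best action without information is to ski

# Perfect information about the leg: choose the better action in each state.
best_if_broken = max(p_win * 50 + (1 - p_win) * -50, -10)  # ski=-40 vs don't=-10
best_if_fine   = max(p_win * 100 + (1 - p_win) * 0, 0)     # ski=10 vs don't=0
evpi_leg = p_broken * best_if_broken + (1 - p_broken) * best_if_fine - eu_no_info

# Perfect information about winning (the leg is still uncertain either way).
best_if_win  = max(p_broken * 50 + (1 - p_broken) * 100, -2)  # ski=90 vs don't=-2
best_if_lose = max(p_broken * -50 + (1 - p_broken) * 0, -2)   # ski=-10 vs don't=-2
evpi_win = p_win * best_if_win + (1 - p_win) * best_if_lose - eu_no_info

print(round(evpi_leg, 6), round(evpi_win, 6))  # 6.0 7.2
```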

In the original statement of the problem, the probability that your leg is broken and the probability that you'll win the race are independent. That's a pretty unreasonable assumption.

6. Is it possible to use a decision tree in the case that the probability that you'll win the race depends on whether your leg is broken?

[Evaluated tree: broken (0.2) → ski: -40, don't ski: -10; fine (0.8) → ski: 10, don't ski: 0; EU(info) = 0.2 · (-10) + 0.8 · 10 = 6.]
Figure 3: Decision tree given perfect information about the leg.


[Evaluated tree: win (0.1) → ski: 0.2 · 50 + 0.8 · 100 = 90, don't ski: -2; lose (0.9) → ski: -10, don't ski: -2; EU(info) = 0.1 · 90 + 0.9 · (-2) = 7.2.]

Figure 4: Decision tree given perfect information about winning.

Solution: Yes. Just put the win branch after the broken branch and use the conditional probabilities for the given state of the leg.
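The restructured tree can be sketched directly; the conditional win probabilities below are hypothetical, chosen only so that the marginal Pr(win) is still 0.1:

```python
# Win branch nested after the broken branch, with (made-up) conditionals.
p_broken = 0.2
p_win_given = {"broken": 0.02, "fine": 0.12}  # marginal: 0.2*0.02 + 0.8*0.12 = 0.1
u = {("win", "broken"): 50, ("win", "fine"): 100,
     ("lose", "broken"): -50, ("lose", "fine"): 0}

eu_ski = sum(
    p_leg * (p_win_given[leg] * u[("win", leg)]
             + (1 - p_win_given[leg]) * u[("lose", leg)])
    for leg, p_leg in [("broken", p_broken), ("fine", 1 - p_broken)])
eu_dont = p_broken * -10 + (1 - p_broken) * 0
print(eu_ski, eu_dont)
```

Incidentally, with these particular utilities winning is worth exactly 100 more than losing in both leg states, so EU(ski) depends only on the marginal Pr(win) and comes out to 0 again; with other utilities the dependence would change the answer.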
