For the Final Exam, “Perfect” gives the percentage of students who received full credit, “Partial” gives the percentage who received partial credit, and “Zero” gives the percentage who received zero credit.
(Due to rounding, etc., values below may be only approximate estimates.)
CS-171, Intro to A.I. — Final Exam — Winter Quarter, 2016
YOUR NAME:
YOUR ID: ID TO RIGHT: ROW: SEAT:
The exam will begin on the next page. Please do not turn the page until told to do so.
When you are told to begin the exam, please first check that you have all 14 pages, numbered 1-14 in the bottom-right corner of each page (including the scratch paper at the end). We wish to avoid copy problems, and will supply a new exam if your copy has any.
The exam is closed-notes, closed-book. No calculators, cell phones, electronics.
Please turn off all cell phones now. No electronics are allowed at any point during the exam.
Please clear your desk entirely, except for pen, pencil, eraser, a blank piece of paper (for
scratch pad use), and an optional water bottle. Please write your name and ID# on the blank
piece of paper and turn it in with your exam.
This page summarizes the points for each question, so you can plan your time.
1. (10 pts total, -1 for each error, but not negative) Mini-Max, Alpha-Beta Pruning.
2. (10 pts total, -1 pt each wrong answer, but not negative) Search Properties.
3. (4 pts total, 1 pt each) TASK ENVIRONMENT.
4. (15 pts total) Bayesian Networks.
5. (15 pts total) Decision Tree Learning.
6. (15 pts total; full credit for a correct proof, else 2 pts each useful resolution up to 10 pts)
Easter Bunny Resolution Theorem Proving in Propositional Logic.
7. (15 points total, 3 pts each) Constraint Satisfaction Problems.
8. (16 pts total, 2 pts each) English and FOL correspondence.
The Exam is printed on both sides to save trees! Work both sides of each page!
1. (10 pts total, -1 for each error, but not negative) Mini-Max, Alpha-Beta Pruning. In the game
tree below it is Max's turn to move. At each leaf node is the estimated score of that resulting position
as returned by the heuristic static evaluator.
(1) Perform Mini-Max search and label each branch node with its value.
(2) Put X in the box beneath each leaf node that would be pruned by alpha-beta pruning.
The first one has been done for you, as an example.
(3) What is Max’s best move (A, B, or C)? C
See Section 5.3.
[Game-tree figure. Root (Max): 4. (Min) children: A = 3, B = 2, C = 4. (Max) grandchildren: 3 4 6 | 2 5 9 | 4 4 4. Leaves: 3 1 4 2 6 2 4 1 2 | 2 2 5 3 4 1 5 9 8 | 4 4 2 3 4 2 1 1 2, with an X marked in the box beneath each of the 19 leaves pruned by alpha-beta (the first X was given as an example).]
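As a study aid (added here, not part of the original exam), the following is a minimal Python sketch of minimax with alpha-beta pruning. The tree encoding, the name alphabeta, and the sample values are all illustrative, not the exam's tree.

```python
import math

def alphabeta(node, alpha=-math.inf, beta=math.inf, maximizing=True):
    """Minimax with alpha-beta pruning. A leaf is a number; an
    internal node is a list of children. Returns the node's value."""
    if isinstance(node, (int, float)):          # leaf: static evaluation
        return node
    if maximizing:
        value = -math.inf
        for child in node:
            value = max(value, alphabeta(child, alpha, beta, False))
            alpha = max(alpha, value)
            if alpha >= beta:                   # remaining children pruned
                break
        return value
    else:
        value = math.inf
        for child in node:
            value = min(value, alphabeta(child, alpha, beta, True))
            beta = min(beta, value)
            if beta <= alpha:                   # remaining children pruned
                break
        return value

# Tiny made-up example: Max to move, each sublist is a Min node.
# While exploring [2, 9], alpha = 3 and the first leaf gives 2,
# so the 9 is pruned without being examined.
print(alphabeta([[3, 5], [2, 9], [4, 6]]))      # -> 4
```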
2. (10 pts total, -1 pt each wrong answer, but not negative) Search Properties.
Fill in the values of the four evaluation criteria for each search strategy shown. Assume a tree search
where b is the finite branching factor; d is the depth to the shallowest goal node; m is the maximum
depth of the search tree; C* is the cost of the optimal solution; step costs are identical and equal to
some positive ε; and in Bidirectional search both directions use breadth-first search.
Note that these conditions satisfy all of the footnotes of Fig. 3.21 in your book.
See Figure 3.21.
Criterion            Complete?  Time complexity           Space complexity          Optimal?
Breadth-First        Yes        O(b^d)                    O(b^d)                    Yes
Uniform-Cost         Yes        O(b^(1+floor(C*/ε)))      O(b^(1+floor(C*/ε)))      Yes
                                [O(b^(d+1)) also OK]      [O(b^(d+1)) also OK]
Depth-First          No         O(b^m)                    O(bm)                     No
Iterative Deepening  Yes        O(b^d)                    O(bd)                     Yes
Bidirectional        Yes        O(b^(d/2))                O(b^(d/2))                Yes
(if applicable)
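A small sketch (added, not exam material) of iterative-deepening search on an explicit graph, illustrating why the table credits it with only O(bd) space while matching BFS's O(b^d) time: memory holds just the current path. The graph and names below are made up.

```python
def depth_limited(graph, node, goal, limit, path):
    """Recursive DFS that stops at the depth limit; memory holds only
    the current path, so space stays linear in the depth."""
    if node == goal:
        return path
    if limit == 0:
        return None
    for child in graph.get(node, []):
        result = depth_limited(graph, child, goal, limit - 1, path + [child])
        if result is not None:
            return result
    return None

def iterative_deepening(graph, start, goal, max_depth=20):
    """Run depth-limited search with limits 0, 1, 2, ...: complete for
    finite b, and optimal when all step costs are identical."""
    for limit in range(max_depth + 1):
        result = depth_limited(graph, start, goal, limit, [start])
        if result is not None:
            return result
    return None

# Hypothetical tree with the shallowest goal 'G' at depth 2.
graph = {'S': ['A', 'B'], 'A': ['C', 'D'], 'B': ['G', 'E']}
print(iterative_deepening(graph, 'S', 'G'))   # -> ['S', 'B', 'G']
```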
3. (4 pts total, 1 pt each) TASK ENVIRONMENT. Your book defines a task environment as a set of
four things, with the acronym PEAS. Fill in the blanks with the names of the PEAS components.
Performance (measure) Environment Actuators Sensors
**** TURN PAGE OVER AND CONTINUE ON THE OTHER SIDE ****
4. (15 pts total) Bayesian Networks.
4.a. (2 pts total, -1 for each error, but not negative) Circle the letters that correspond to all valid
Bayesian Networks in the following figure. (If there is not any valid Bayesian Network, circle None.)
Bayesian Networks: None (a) (b) (c) (d) (e) (f)
[Figure: six candidate directed graphs (a)-(f), each over the four variables A, B, C, D with different arrow configurations; a candidate is a valid Bayesian network iff its directed edges form no cycle.]
4.b. (3 pts, -1 for each error, but not negative) Draw the Bayesian Network that corresponds to the
following probability distribution. (It is a Hidden Markov Model, popular in speech recognition, etc.
By convention, Si = state_i, Oi = observation_i.)
P(S1, S2, S3, O1, O2, O3)
= P(S1) P(S2|S1) P(S3|S2) P(O1|S1) P(O2|S2) P(O3|S3)
[Answer figure: the HMM chain S1 → S2 → S3, with emission arcs S1 → O1, S2 → O2, S3 → O3.]
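A short sketch (added, not exam content) showing how the factored form gives every entry of the full joint as a product of local terms, and checking that the factorization really is a probability distribution. All numeric CPT values are invented.

```python
from itertools import product

# Invented CPTs for a 2-state {0,1}, 2-symbol {'a','b'} HMM (numbers made up).
P_S1    = {0: 0.6, 1: 0.4}                                     # P(S1)
P_trans = {0: {0: 0.7, 1: 0.3}, 1: {0: 0.2, 1: 0.8}}           # P(S_{t+1}|S_t)
P_emit  = {0: {'a': 0.9, 'b': 0.1}, 1: {'a': 0.2, 'b': 0.8}}   # P(O_t|S_t)

def joint(s1, s2, s3, o1, o2, o3):
    """The factored joint: P(S1)P(S2|S1)P(S3|S2)P(O1|S1)P(O2|S2)P(O3|S3)."""
    return (P_S1[s1] * P_trans[s1][s2] * P_trans[s2][s3]
            * P_emit[s1][o1] * P_emit[s2][o2] * P_emit[s3][o3])

# Sanity check: summing over all 2^3 state and 2^3 observation
# assignments gives 1.0, as a joint distribution must.
total = sum(joint(s1, s2, s3, o1, o2, o3)
            for s1, s2, s3 in product((0, 1), repeat=3)
            for o1, o2, o3 in product('ab', repeat=3))
print(round(total, 10))   # -> 1.0
```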
**** PROBLEM CONTINUES ON THE NEXT PAGE ****
4.c. (10 pts total, 2 pts each) Compute the symbolic posterior probability of P(S2 | O1, O2, O3).
Use the probability distribution given in problem 4.b:
P(S1, S2, S3, O1, O2, O3) = P(S1) P(S2|S1) P(S3|S2) P(O1|S1) P(O2|S2) P(O3|S3)
All the necessary steps are given below, except for six empty places, circled and labeled (a) to (f).
Fill in the six empty places (a) to (f) below. The first one is done for you as an example.
(Aside: The procedure below follows the forward-backward algorithm.)
P(S2 | O1, O2, O3)
  = P(S2, O1, O2, O3) / P(O1, O2, O3)                         [By Definition of Conditional Probability]
  = P(S2, O1, O2) P(O3 | S2, O1, O2) / P(O1, O2, O3)          [Apply Product Rule]
  = P(S2, O1, O2) P(O3 | (a)) / P(O1, O2, O3)                 [Apply Conditional Independence]
(example) (a) = S2

P(S2, O1, O2)
  = Σ_(b) P(S1, S2, O1, O2)                                   [Apply Summation Rule]
  = Σ_(b) P(S1, O1) P(S2, O2 | S1, O1)                        [Apply Product Rule]
  = Σ_(b) P(S1, O1) P(S2 | S1, O1) P(O2 | S1, O1, S2)         [Apply Product Rule]
  = Σ_(b) P(S1, O1) P(S2 | (c)) P(O2 | (d))                   [Apply Conditional Independence]
(2 pts) (b) = S1 [s1 ∈ S1 is also OK]
(2 pts) (c) = S1 [s1 is OK if you wrote s1 ∈ S1 in (b) above]
(2 pts) (d) = S2

P(O3 | S2)
  = Σ_(e) P(S3, O3 | (a))                                     [Apply Summation Rule]
  = Σ_(e) P(S3 | (a)) P(O3 | (a), S3)                         [Apply Product Rule]
  = Σ_(e) P(S3 | (a)) P(O3 | (f))                             [Apply Conditional Independence]
(2 pts) (e) = S3 [s3 ∈ S3 is also OK]
(2 pts) (f) = S3 [s3 is OK if you wrote s3 ∈ S3 in (e) above]
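A brute-force numeric check (added; it restates the invented CPTs from the earlier sketch) that P(S2 | O1, O2, O3) is obtained by summing the joint over S1 and S3, the (b) and (e) sums above, and then normalizing by P(O1, O2, O3), which is exactly what the forward and backward factors compute more efficiently.

```python
from itertools import product

# Invented CPTs (illustrative only): two states {0,1}, two symbols {'a','b'}.
P_S1    = {0: 0.6, 1: 0.4}
P_trans = {0: {0: 0.7, 1: 0.3}, 1: {0: 0.2, 1: 0.8}}           # P(S_{t+1}|S_t)
P_emit  = {0: {'a': 0.9, 'b': 0.1}, 1: {'a': 0.2, 'b': 0.8}}   # P(O_t|S_t)

def joint(s1, s2, s3, o1, o2, o3):
    return (P_S1[s1] * P_trans[s1][s2] * P_trans[s2][s3]
            * P_emit[s1][o1] * P_emit[s2][o2] * P_emit[s3][o3])

def posterior_S2(o1, o2, o3):
    """P(S2 | o1, o2, o3): sum out S1 and S3 (the (b) and (e) sums),
    then normalize by P(o1, o2, o3)."""
    unnorm = {s2: sum(joint(s1, s2, s3, o1, o2, o3)
                      for s1, s3 in product((0, 1), repeat=2))
              for s2 in (0, 1)}
    z = sum(unnorm.values())                 # = P(o1, o2, o3)
    return {s2: p / z for s2, p in unnorm.items()}

print(posterior_S2('a', 'b', 'a'))           # the two values sum to 1.0
```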
**** TURN PAGE OVER AND CONTINUE ON THE OTHER SIDE ****
5. (15 pts total) Decision Tree Learning. (Adapted from Prof. Ziv Bar-Joseph, Carnegie Mellon
University, Course 10-701 Machine Learning materials.) NASA wants to discriminate Martians (M)
from Humans (H) based on these features (attributes): Green ∈ {𝑁, 𝑌}, Legs ∈ {2, 3}, Height ∈ {𝑆, 𝑇},
Smelly ∈ {𝑁, 𝑌}. Your available training data is as follows (N = No, Y = Yes, S = Small, T = Tall):
Example   Height   Green   Legs   Smelly   Target: Species
   1        S        Y      3       Y          M
   2        T        Y      3       N          M
   3        S        Y      3       N          M
   4        T        Y      3       N          M
   5        T        N      2       Y          M
   6        T        Y      2       Y          H
   7        S        N      2       N          H
   8        T        N      3       N          H
   9        S        N      3       N          H
  10        T        N      3       N          H
Please note: A human might be green or have three legs for many possible reasons, e.g., if they were an actor playing a Martian as a role in a film or play. Anyway, it’s a made-up problem for the test.
5.a. (6 pts total, 3 pts each) For each possible choice of root feature (attribute) below, show the
resulting species distribution. Give your answer as M* over H*. The first one is done for you, as an example.
(example) Height:  S: MM / HH      T: MMM / HHH
          Green:   Y: MMMM / H     N: M / HHHH
          Legs:    2: M / HH       3: MMMM / HHH
          Smelly:  Y: MM / H       N: MMM / HHHH
5.b. (4 pts) Which root feature (attribute) would information gain select as the “best” root feature (i.e.,
the highest information gain)? (No calculator is needed; the numbers have been chosen to be obvious.)
Circle one of: Height Green Legs Smelly (Answer: Green, as the information-gain sketch below confirms.)
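A sketch (added, not exam material) that computes the information gain of each attribute on the training table above; Green yields the purest split (4M/1H versus 1M/4H) and so the largest gain.

```python
from math import log2
from collections import Counter

# The training table from problem 5: (Height, Green, Legs, Smelly, Species).
data = [('S','Y',3,'Y','M'), ('T','Y',3,'N','M'), ('S','Y',3,'N','M'),
        ('T','Y',3,'N','M'), ('T','N',2,'Y','M'), ('T','Y',2,'Y','H'),
        ('S','N',2,'N','H'), ('T','N',3,'N','H'), ('S','N',3,'N','H'),
        ('T','N',3,'N','H')]
attrs = {'Height': 0, 'Green': 1, 'Legs': 2, 'Smelly': 3}

def entropy(rows):
    counts = Counter(r[-1] for r in rows)
    total = len(rows)
    return -sum(c/total * log2(c/total) for c in counts.values())

def info_gain(rows, idx):
    """Entropy of the parent minus the size-weighted entropy of the split."""
    remainder = 0.0
    for value in set(r[idx] for r in rows):
        subset = [r for r in rows if r[idx] == value]
        remainder += len(subset)/len(rows) * entropy(subset)
    return entropy(rows) - remainder

for name, idx in attrs.items():
    print(f"{name}: {info_gain(data, idx):.3f}")
# Height: 0.000, Legs and Smelly: ~0.035, Green: ~0.278 -> Green is the root.
```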
**** PROBLEM CONTINUES ON THE NEXT PAGE ****
5.c. (5 pts) Draw the decision tree that results from your choice of root feature (attribute) in 5.b above.
(No calculator is needed; the numbers have been chosen to be obvious.) Your tree will be considered correct if
it is correct for the root feature you chose in 5.b above, even if that answer was wrong.
Green
  Y → Legs:   2 → H       3 → M (MMMM)
  N → Smelly: Y → M       N → H (HHHH)
**** TURN PAGE OVER AND CONTINUE ON THE OTHER SIDE ****
If you chose Height as the root attribute in 5.b above, then your correct tree is:
Height
  S → Green:  Y → M (MM)    N → H (HH)
  T → Green:  Y → Legs/Smelly:  2/Y → H    3/N → M (MM)
              N → Legs/Smelly:  2/Y → M    3/N → H (HH)
If you chose Legs as the root attribute in 5.b above, then your correct tree is:
Legs
  2 → Green:  Y → H    N → Smelly/Height:  Y/T → M    N/S → H
  3 → Green:  Y → M (MMMM)    N → H (HHH)
If you chose Smelly as the root attribute in 5.b above, then your correct tree is:
Smelly
  Y → Green:  Y → Legs/Height:  3/S → M    2/T → H
              N → M (M)
  N → Green:  Y → M (MMM)    N → H (HHHH)
6. (15 pts total; full credit for a correct proof, else 2 pts each useful resolution up to 10 pts) Easter Bunny
Resolution Theorem Proving in Propositional Logic. (Adapted from http://brainden.com/logic-puzzles.htm.)
Four bunny rabbits played together in the bushes. Two were brown bunny rabbits and two were white
bunny rabbits. You translate this fact into Propositional Logic (in prefix form) as:
/* Bi means bunny i is brown. */
(or (and B1 B2 (¬ B3) (¬ B4)) (and B1 (¬ B2) B3 (¬ B4)) See Section 7.5.2.
(and B1 (¬ B2) (¬ B3) B4) (and (¬ B1) B2 B3 (¬ B4))
(and (¬ B1) B2 (¬ B3) B4) (and (¬ B1) (¬ B2) B3 B4))
However, none of them could see each other very clearly, and their views were obscured by branches. Bunny 1 reported, “One of bunnies 2 & 3 is brown and one is white, but I can’t tell which.”
Bunny 2 reported, “One of bunnies 3 & 4 is brown and one is white, but I can’t tell which.”
Bunny 3 reported, “Bunny 4 is brown.”
You translate these facts into Propositional Logic (in prefix form) as:
(or (and B2 (¬ B3)) (and (¬ B2) B3)) (or (and B3 (¬ B4)) (and (¬ B3) B4)) B4
Bunny 1 asks, “Is it true that I am a white bunny rabbit?”
You translate this query into Propositional Logic as “(¬ B1)” and form the negated goal as “B1.” Your
knowledge base (KB) in CNF plus negated goal (in clausal form) is:
(B1 B2 B3) ( (¬ B1) (¬ B2) (¬ B3) )
(B1 B2 B4) ( (¬ B1) (¬ B2) (¬ B4) )
(B1 B3 B4) ( (¬ B1) (¬ B3) (¬ B4) )
(B2 B3 B4) ( (¬ B2) (¬ B3) (¬ B4) )
(B2 B3) ( (¬ B2) (¬ B3) )
(B3 B4) ( (¬ B3) (¬ B4) )
B4 B1
Write a resolution proof that Bunny 1 is a white bunny rabbit.
For each step of the proof, fill in the first two blanks with CNF sentences from KB that will resolve to produce
the CNF result that you write in the third (resolvent) blank. The resolvent is the result of resolving the first two
sentences. Add the resolvent to KB, and repeat. Use as many steps as necessary, ending with the empty clause.
The shortest proof I know of is only four lines long. (A Bonus Point for a shorter proof.)
Resolve B4 with ( (¬ B3) (¬ B4)) to produce: (¬ B3)
Resolve (¬ B3) with (B2 B3) to produce: B2
Resolve B4 with ( (¬ B1) (¬ B2) (¬ B4) ) to produce: ( (¬ B1) (¬ B2) )
Resolve B2 with ( (¬ B1) (¬ B2) ) to produce: (¬ B1)
Resolve (¬ B1) with B1 to produce: ()
You received full credit if your proof was correct, regardless of length. Two examples of 4-line proofs are:

1. Resolve (B2 B3) and (¬B3 ¬B4) to give (B2 ¬B4)
2. Resolve (¬B1 ¬B2 ¬B4) and (B2 ¬B4) to give (¬B1 ¬B4)
3. Resolve (¬B1 ¬B4) and (B1) to give (¬B4)
4. Resolve (¬B4) and (B4) to give ( )

1. Resolve (B1) and (¬B1 ¬B2 ¬B4) to give (¬B2 ¬B4)
2. Resolve (¬B2 ¬B4) and (B2 B3) to give (B3 ¬B4)
3. Resolve (B3 ¬B4) and (¬B3 ¬B4) to give (¬B4)
4. Resolve (¬B4) and (B4) to give ( )

**** TURN PAGE OVER AND CONTINUE ON THE OTHER SIDE ****
See Chapter 6.
7. (15 points total, 3 pts each) Constraint Satisfaction Problems.
[Figure: Southwest USA map and its constraint graph over CA, NV, AZ, UT, CO, NM; edges join adjacent regions: CA–NV, CA–AZ, NV–AZ, NV–UT, AZ–UT, AZ–NM, UT–CO, CO–NM.]
You are a map-coloring robot assigned to color this Southwest USA map. Adjacent regions must be colored a
different color (R=Red, B=Blue, G=Green). The constraint graph is shown.
7.a. (3 pts total, -1 each wrong answer, but not negative) FORWARD CHECKING. Cross out all
values that would be eliminated by Forward Checking, after variable AZ has just been assigned value
R as shown:
CA: G B (R crossed out)   NV: G B (R crossed out)   AZ: R   UT: G B (R crossed out)   CO: R G B   NM: G B (R crossed out)
(R is eliminated from CA, NV, UT, and NM, the neighbors of AZ; CO is unaffected.)
7.b. (3 pts total, -1 each wrong answer, but not negative) ARC CONSISTENCY.
CA and AZ have been assigned values, but no constraint propagation has been done. Cross out all
values that would be eliminated by Arc Consistency (AC-3 in your book).
CA: B (assigned)   NV: G (R and B crossed out)   AZ: R (assigned)   UT: B (R and G crossed out)   CO: R G (B crossed out)   NM: G B (R crossed out)
7.c. (3 pts total, -1 each wrong answer, but not negative) MINIMUM-REMAINING-VALUES
HEURISTIC. Consider the assignment below. NV is assigned and constraint propagation has been
done. List all unassigned variables that might be selected by the Minimum-Remaining-Values (MRV)
Heuristic: CA, AZ, UT.
CA: R B   NV: G (assigned)   AZ: R B   UT: R B   CO: R G B   NM: R G B
7.d. (3 pts total, -1 each wrong answer, but not negative) DEGREE HEURISTIC. Consider the
assignment below. (It is the same assignment as in problem 7.c above.) NV is assigned and
constraint propagation has been done. List all unassigned variables that might be selected by the
Degree Heuristic: AZ.
CA: R B   NV: G (assigned)   AZ: R B   UT: R B   CO: R G B   NM: R G B
7.e. (3 pts total) MIN-CONFLICTS HEURISTIC. Consider the complete but inconsistent assignment
below. AZ has just been selected to be assigned a new value during local search for a complete and
consistent assignment. What new value would be chosen below for AZ by the Min-Conflicts
Heuristic? R.
CA: B   NV: G   AZ: ?   UT: G   CO: G   NM: B
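A compact sketch (added; not from the exam) of forward checking and AC-3 on this map's constraint graph, reproducing the eliminations in 7.a and 7.b. The function names and data layout are mine.

```python
from collections import deque

# Adjacency implied by the exam's constraint graph.
neighbors = {'CA': {'NV','AZ'}, 'NV': {'CA','AZ','UT'},
             'AZ': {'CA','NV','UT','NM'}, 'UT': {'NV','AZ','CO'},
             'CO': {'UT','NM'}, 'NM': {'AZ','CO'}}

def forward_check(domains, var):
    """After var is assigned, delete its value from each neighbor."""
    (value,) = domains[var]
    for n in neighbors[var]:
        domains[n].discard(value)

def ac3(domains):
    """AC-3: revise arcs until every value has a support in each neighbor."""
    queue = deque((x, y) for x in neighbors for y in neighbors[x])
    while queue:
        x, y = queue.popleft()
        removed = {vx for vx in domains[x]
                   if not any(vx != vy for vy in domains[y])}
        if removed:
            domains[x] -= removed
            queue.extend((z, x) for z in neighbors[x] - {y})

# 7.a: AZ = R, then forward checking.
d = {v: {'R','G','B'} for v in neighbors}; d['AZ'] = {'R'}
forward_check(d, 'AZ')
print(d)   # R gone from CA, NV, UT, NM; CO untouched

# 7.b: CA = B and AZ = R, then AC-3.
d = {v: {'R','G','B'} for v in neighbors}; d['CA'] = {'B'}; d['AZ'] = {'R'}
ac3(d)
print(d)   # NV -> {G}, UT -> {B}, CO -> {R,G}, NM -> {G,B}
```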
8. (16 pts total, 2 pts each) English and FOL correspondence. For each English sentence below, write the
letter corresponding to its best or closest FOPC (FOL) sentence (wff, or well-formed formula).
8.a (2 pts) D “Every person plays some game.”
A. ∀x ∀y Person(x) ∧ Game(y) ∧ Plays(x, y)  [Everything(x,y) is a person(x) and is a game(y) and x plays y.]
B. ∀x ∃y Person(x) ∧ Game(y) ∧ Plays(x, y)  [Everything(x) is a person(x) and there is some game(y) and x plays y.]
C. ∀x ∀y Person(x) ⇒ ( Game(y) ∧ Plays(x, y) )  [If something(x) is a person(x) then everything(y) is a game(y) and x plays y.]
D. ∀x ∃y Person(x) ⇒ ( Game(y) ∧ Plays(x, y) )  [Correct.]
8.b (2 pts) B “All games are fun.”
A. ∀x Game(x) ∧ Fun(x)  [Everything(x) is a game(x) and is fun(x).]
B. ∀x Game(x) ⇒ Fun(x)  [Correct.]
C. ∃x Game(x) ∧ Fun(x)  [There is something(x) that is a game(x) and is fun(x).]
D. ∃x Game(x) ⇒ Fun(x)  [Vacuously true if there is anything(x) that is not a game(x).]
8.c (2 pts) C “For every game, there is a person that plays that game.”
A. ∀x ∃y Game(x) ∧ Person(y) ∧ Plays(y, x)  [Everything(x) is a game and there is some person(y) and y plays x.]
B. ∀x ∃y [ Game(x) ∧ Person(y) ] ⇒ Plays(y, x)  [Vacuously true if there is anything(y) that is not a person(y).]
C. ∀x ∃y Game(x) ⇒ [ Person(y) ∧ Plays(y, x) ]  [Correct.]
D. ∀x ∀y Game(x) ∧ Person(y) ∧ Plays(y, x)  [Everything(x,y) is a game(x) and is a person(y) and y plays x.]
8.d (2 pts) A “Every person plays every game.”
A. ∀x ∀y [ Person(x) ∧ Game(y) ] ⇒ Plays(x, y)  [Correct.]
B. ∀x ∀y Person(x) ⇒ [ Game(y) ∧ Plays(x, y) ]  [If anything(x) is a person(x) then everything(y) is a game(y) and x plays y.]
C. ∀x ∀y Person(x) ∧ Game(y) ∧ Plays(x, y)  [Everything(x,y) is a person(x) and is a game(y) and x plays y.]
D. ∀x ∃y [ Person(x) ∧ Game(y) ] ⇒ Plays(x, y)  [Vacuously true if anything(y) is not a game.]
8.e (2 pts) B “There is some person in Irvine who is smart.”
A. ∀x Person(x) ∧ In(x, Irvine) ∧ Smart(x)  [Everything(x) is a person and is in Irvine(x) and is smart(x).]
B. ∃x Person(x) ∧ In(x, Irvine) ∧ Smart(x)  [Correct.]
C. ∀x [ Person(x) ∧ In(x, Irvine) ] ⇒ Smart(x)  [If something(x) is a person(x) and is in Irvine(x) then that thing is smart(x).]
D. ∃x Person(x) ⇒ [ In(x, Irvine) ∧ Smart(x) ]  [Vacuously true if anything(x) is not a person(x).]
8.f (2 pts) C “Every person in Irvine is smart.”
A. ∀x Person(x) ∧ In(x, Irvine) ∧ Smart(x)  [Everything(x) is a person(x) and is in Irvine(x) and is smart(x).]
B. ∃x Person(x) ∧ In(x, Irvine) ∧ Smart(x)  [There is something(x) that is a person(x) and is in Irvine(x) and is smart(x).]
C. ∀x [ Person(x) ∧ In(x, Irvine) ] ⇒ Smart(x)  [Correct.]
D. ∃x Person(x) ⇒ [ In(x, Irvine) ] ∧ Smart(x)  [Vacuously true if anything(x) is not a person(x).]
8.g (2 pts) D “Some person plays every game.”
A. ∃x ∀y [ Person(x) ∧ Game(y) ] ⇒ Plays(x, y)  [Vacuously true if there is anything(x) that is not a person(x).]
B. ∃x ∀y Person(x) ∧ Game(y) ∧ Plays(x, y)  [There is some person(x) and everything(y) is a game(y) and x plays y.]
C. ∀x ∀y Person(x) ∧ Game(y) ∧ Plays(x, y)  [Everything(x,y) is a person(x) and is a game(y) and x plays y.]
D. ∃x ∀y Person(x) ∧ [ Game(y) ⇒ Plays(x, y) ]  [Correct.]
8.h (2 pts) A “Some person plays some game.”
A. ∃x ∃y Person(x) ∧ Game(y) ∧ Plays(x, y)  [Correct.]
B. ∃x ∃y [ Person(x) ∧ Game(y) ] ⇒ Plays(x, y)  [Vacuously true if there is anything(x,y) that is not a person(x) or is not a game(y).]
C. ∃x ∃y Person(x) ⇒ [ Game(y) ∧ Plays(x, y) ]  [Vacuously true if there is anything(x) that is not a person(x).]
D. ∀x ∀y Person(x) ∧ Game(y) ∧ Plays(x, y)  [Everything(x,y) is a person and is a game(y) and x plays y.]
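A closing sketch (added, not exam content) that evaluates the ∧-versus-⇒ readings of 8.a over a tiny made-up finite domain, showing concretely why “Every person plays some game” needs ⇒ under ∀ and ∧ under ∃. The domain and relation names are invented.

```python
# Tiny made-up model: two people, one game, one rock.
domain  = {'alice', 'bob', 'chess', 'rock'}
Person  = {'alice', 'bob'}
Game    = {'chess'}
Plays   = {('alice', 'chess'), ('bob', 'chess')}

# Option B of 8.a: ∀x ∃y Person(x) ∧ Game(y) ∧ Plays(x, y)
option_B = all(any(x in Person and y in Game and (x, y) in Plays
                   for y in domain)
               for x in domain)

# Option D of 8.a: ∀x ∃y Person(x) ⇒ (Game(y) ∧ Plays(x, y))
option_D = all(any((x not in Person) or (y in Game and (x, y) in Plays)
                   for y in domain)
               for x in domain)

print(option_B)  # False: fails when x is the rock, since ∧ asserts Person(x)
print(option_D)  # True: the implication is vacuous for non-persons
```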
**** THIS IS THE END OF THE FINAL EXAM. HAPPY SPRING BREAK!! ****