Chapter 11
Machine Learning
What is learning?
Machine Learning
Learning problem:
Task T
Performance measure P
Training experience E
Machine Learning
Chess game:
Task T: playing chess games
Performance measure P: percent of games won against opponents
Training experience E: playing practice games against itself
Machine Learning
Handwriting recognition:
Task T: recognizing and classifying handwritten words
Performance measure P: percent of words correctly classified
Training experience E: handwritten words with given classifications
Designing a Learning System
Chess game:
Task T: playing chess games
Performance measure P: percent of games won against opponents
Training experience E: playing practice games against itself
Target function: V: Board → ℝ
Designing a Learning System
Chess game:
Target function representation (a linear combination of board features x1..x6 with weights w0..w6 to be learned):
V̂(b) = w0 + w1x1 + w2x2 + w3x3 + w4x4 + w5x5 + w6x6
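As a quick illustration, the linear target function representation above can be evaluated as a weighted sum; the weights and board features below are hypothetical, a minimal sketch only:

```python
# Sketch: evaluate V_hat(b) = w0 + w1*x1 + ... + w6*x6, where the board b
# has already been mapped to its feature values x1..x6.
def v_hat(weights, features):
    """weights = [w0, w1, ..., w6]; features = [x1, ..., x6]."""
    return weights[0] + sum(w * x for w, x in zip(weights[1:], features))

# Hypothetical weights and board features:
print(v_hat([0.0, 1.0, -1.0, 3.0, -3.0, 0.5, -0.5], [8, 8, 2, 2, 1, 0]))
```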
Designing a Learning System
Learning is an (endless) generalization or induction process.
Designing a Learning System
[Diagram: the Experiment Generator proposes a new problem (an initial board); the Performance System solves it; the Generalizer produces the hypothesis V̂.]
Issues in Machine Learning
When and how can prior knowledge guide the learning process?
What is the best way to reduce the learning task to one or more function approximation problems?
Example
Experience:
Example  Sky    AirTemp  Humidity  Wind    Water  Forecast  EnjoySport
1        Sunny  Warm     Normal    Strong  Warm   Same      Yes
2        Sunny  Warm     High      Strong  Warm   Same      Yes
3        Rainy  Cold     High      Strong  Warm   Change    No
4        Sunny  Warm     High      Strong  Cool   Change    Yes
Prediction:
5        Rainy  Cold     High      Strong  Warm   Change    ?
6        Sunny  Warm     Normal    Strong  Warm   Same      ?
7        Sunny  Warm     Low       Strong  Cool   Same      ?
Example
Learning problem:
Task T: classifying days on which my friend enjoys water sport
Performance measure P: percent of days correctly classified
Training experience E: days with given attributes and classifications
Concept Learning
Learning problem:
Target concept: a subset of the set of instances X
c: X → {0, 1}
Target function:
Sky × AirTemp × Humidity × Wind × Water × Forecast → {Yes, No}
Hypothesis:
Characteristics of all instances of the concept to be learned
Constraints on instance attributes
h: X → {0, 1}
Concept Learning
Satisfaction:
h(x) = 1 iff x satisfies all the constraints of h
h(x) = 0 otherwise
Consistency:
h(x) = c(x) for every instance x of the training examples
Correctness:
h(x) = c(x) for every instance x of X
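Under the usual attribute-constraint representation (a sketch; "?" matches any value), satisfaction and consistency can be checked directly:

```python
# Sketch: a hypothesis is a tuple of attribute constraints; "?" matches
# any value, and a concrete value must match exactly.
def satisfies(h, x):
    """h(x) = 1 iff x satisfies all the constraints of h."""
    return all(c == "?" or c == v for c, v in zip(h, x))

def consistent(h, examples):
    """h(x) = c(x) for every training example (x, c(x))."""
    return all(satisfies(h, x) == label for x, label in examples)

h = ("Sunny", "Warm", "?", "Strong", "?", "?")
print(satisfies(h, ("Sunny", "Warm", "High", "Strong", "Cool", "Change")))  # True
```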
Concept Learning
[Diagram: hypotheses in H ordered from specific to general, forming a lattice under the partial order "more general than":]
h1 = <Sunny, ?, ?, Strong, ?, ?>
h3 = <Sunny, ?, ?, ?, Cool, ?>
h2 = <Sunny, ?, ?, ?, ?, ?>
h2 is more general than both h1 and h3.
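The partial order in the lattice above is the "more general than or equal to" relation; with the same "?"-constraint representation it can be sketched as:

```python
# Sketch: g is more general than or equal to h iff every constraint of g
# is "?" or identical to the corresponding constraint of h.
def more_general_or_equal(g, h):
    return all(cg == "?" or cg == ch for cg, ch in zip(g, h))

h1 = ("Sunny", "?", "?", "Strong", "?", "?")
h2 = ("Sunny", "?", "?", "?", "?", "?")
print(more_general_or_equal(h2, h1))  # True: h2 is more general than h1
```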
FIND-S
Trace on the EnjoySport training examples (FIND-S generalizes only on positive examples):
h = <∅, ∅, ∅, ∅, ∅, ∅>
h = <Sunny, Warm, Normal, Strong, Warm, Same>
h = <Sunny, Warm, ?, Strong, Warm, Same>
h = <Sunny, Warm, ?, Strong, ?, ?>
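The trace above can be reproduced by a direct sketch of FIND-S, using the EnjoySport examples from this chapter:

```python
# Sketch of FIND-S: start from the most specific hypothesis and minimally
# generalize it on each positive example (negative examples are ignored).
def find_s(examples, n_attrs):
    h = [None] * n_attrs          # None stands for the empty constraint ∅
    for x, label in examples:
        if not label:             # FIND-S ignores negative examples
            continue
        for i, v in enumerate(x):
            if h[i] is None:
                h[i] = v          # first positive example: copy the value
            elif h[i] != v:
                h[i] = "?"        # conflicting values: generalize to "?"
    return h

data = [
    (("Sunny", "Warm", "Normal", "Strong", "Warm", "Same"), True),
    (("Sunny", "Warm", "High",   "Strong", "Warm", "Same"), True),
    (("Rainy", "Cold", "High",   "Strong", "Warm", "Change"), False),
    (("Sunny", "Warm", "High",   "Strong", "Cool", "Change"), True),
]
print(find_s(data, 6))  # ['Sunny', 'Warm', '?', 'Strong', '?', '?']
```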
FIND-S
The result is consistent with the positive training examples.
FIND-S
Is the result consistent with the negative training examples?
FIND-S
The result is consistent with the negative training examples if
the target concept is contained in H (and the training examples
are correct).
FIND-S
The result is consistent with the negative training examples if
the target concept is contained in H (and the training examples
are correct).
35
FIND-S
Questions:
Has the learner converged to the target concept? There can be several hypotheses consistent with the training examples (both positive and negative).
List-then-Eliminate Algorithm
Algorithm:
Initial version space = set containing every hypothesis in H
For each training example <x, c(x)>, remove from the version space any hypothesis h for which h(x) ≠ c(x)
Output the hypotheses in the version space
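The steps above can be sketched directly over a tiny, explicitly enumerated hypothesis space (the two-attribute space below is hypothetical; enumerating H is feasible only when H is small):

```python
# Sketch of List-then-Eliminate with the "?"-constraint hypothesis encoding.
from itertools import product

def satisfies(h, x):
    return all(c == "?" or c == v for c, v in zip(h, x))

def list_then_eliminate(H, examples):
    version_space = list(H)                           # every hypothesis in H
    for x, label in examples:
        version_space = [h for h in version_space
                         if satisfies(h, x) == label]  # drop h if h(x) != c(x)
    return version_space

# Tiny illustrative space over two attributes (hypothetical):
values = [["Sunny", "Rainy", "?"], ["Warm", "Cold", "?"]]
H = list(product(*values))
vs = list_then_eliminate(H, [(("Sunny", "Warm"), True), (("Rainy", "Cold"), False)])
print(vs)
```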
Compact Representation of Version Space
Version space = <G, S> = {h ∈ H | ∃g ∈ G, ∃s ∈ S: g ≥g h ≥g s}
(where ≥g denotes the "more general than or equal to" relation)
Candidate-Elimination Algorithm
Example  Sky    AirTemp  Humidity  Wind    Water  Forecast  EnjoySport
1        Sunny  Warm     Normal    Strong  Warm   Same      Yes
2        Sunny  Warm     High      Strong  Warm   Same      Yes
3        Rainy  Cold     High      Strong  Warm   Change    No
4        Sunny  Warm     High      Strong  Cool   Change    Yes

S0 = {<∅, ∅, ∅, ∅, ∅, ∅>}
G0 = {<?, ?, ?, ?, ?, ?>}
After example 1:
S1 = {<Sunny, Warm, Normal, Strong, Warm, Same>}
G1 = {<?, ?, ?, ?, ?, ?>}
Candidate-Elimination Algorithm
The version space will converge toward the correct target concept if:
H contains the correct target concept
There are no errors in the training examples
Candidate-Elimination Algorithm
A partially learned concept can be used to classify new instances by the majority rule.
Version space after example 4:
S4 = {<Sunny, Warm, ?, Strong, ?, ?>}
<Sunny, ?, ?, Strong, ?, ?>   <Sunny, Warm, ?, ?, ?, ?>   <?, Warm, ?, Strong, ?, ?>
G4 = {<Sunny, ?, ?, ?, ?, ?>, <?, Warm, ?, ?, ?, ?>}
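A simplified sketch of the Candidate-Elimination algorithm for conjunctive hypotheses reproduces the S4 and G4 boundary sets above. It is a sketch, not the full algorithm: specializations of G are drawn only from the S boundary's attribute values, which suffices for this example:

```python
# Simplified Candidate-Elimination sketch ("?" = any value, None = ∅).
# Positive examples generalize S; negative examples specialize members of G.
def satisfies(h, x):
    return all(c == "?" or c == v for c, v in zip(h, x))

def generalize(s, x):
    """Minimal generalization of s so that it covers positive example x."""
    return tuple(v if c is None else (c if c == v else "?")
                 for c, v in zip(s, x))

def specialize(g, s, x):
    """Minimal specializations of g (guided by S) that exclude negative x."""
    out = []
    for i, c in enumerate(g):
        if c == "?" and s[i] is not None and s[i] != "?" and s[i] != x[i]:
            out.append(g[:i] + (s[i],) + g[i + 1:])
    return out

def candidate_elimination(examples, n):
    S = (None,) * n
    G = [("?",) * n]
    for x, label in examples:
        if label:                              # positive example:
            S = generalize(S, x)               # generalize S, and
            G = [g for g in G if satisfies(g, x)]   # prune G members missing x
        else:                                  # negative example:
            G = ([h for g in G if satisfies(g, x) for h in specialize(g, S, x)]
                 + [g for g in G if not satisfies(g, x)])
    return S, G

data = [
    (("Sunny", "Warm", "Normal", "Strong", "Warm", "Same"), True),
    (("Sunny", "Warm", "High",   "Strong", "Warm", "Same"), True),
    (("Rainy", "Cold", "High",   "Strong", "Warm", "Change"), False),
    (("Sunny", "Warm", "High",   "Strong", "Cool", "Change"), True),
]
S, G = candidate_elimination(data, 6)
print(S)  # ('Sunny', 'Warm', '?', 'Strong', '?', '?')
print(G)
```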
Inductive Bias
With an unbiased hypothesis space (all subsets of X), positive examples x1, x2, x3 and negative examples x4, x5 give the boundary hypotheses:
h(x) ⇔ (x = x1) ∨ (x = x2) ∨ (x = x3)
h(x) ⇔ (x ≠ x4) ∧ (x ≠ x5)
Inductive Bias
The version space then contains hypotheses such as x1 ∨ x2 ∨ x3, x1 ∨ x2 ∨ x3 ∨ x6, and ¬x4 ∧ ¬x5.
Any new instance x is classified positive by half of the version space and negative by the other half: x is not classifiable.
Inductive Bias
Example  Day  Actor  Price  EasyTicket
2        …    Bad    High   No
3        …    Good   High   ?
4        …    Bad    Low    ?
Homework