MACHINE LEARNING
Learning?
Definition
Reinforcement learning
Reinforcement learning works through a feedback-based process in which an AI agent (a software component) automatically explores its surroundings by trial and error: it takes actions, learns from the resulting experience, and improves its performance.
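As a minimal sketch of this feedback loop (not from the slides), the tabular Q-learning update below lets an agent improve from trial-and-error feedback; the grid-world environment, rewards, and parameter values are illustrative assumptions.

import random

# Illustrative 1-D grid world: states 0..4, start at state 0, reward only at state 4.
N_STATES, GOAL = 5, 4
ACTIONS = [-1, +1]                      # move left or move right

def step(state, action):
    nxt = min(max(state + action, 0), N_STATES - 1)
    reward = 1.0 if nxt == GOAL else 0.0
    return nxt, reward, nxt == GOAL

Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
alpha, gamma, epsilon = 0.5, 0.9, 0.2   # learning rate, discount, exploration rate

for episode in range(200):
    s, done = 0, False
    while not done:
        # trial and error: usually exploit the current estimate, sometimes explore
        a = random.choice(ACTIONS) if random.random() < epsilon \
            else max(ACTIONS, key=lambda act: Q[(s, act)])
        nxt, r, done = step(s, a)
        # feedback from the environment improves the agent's value estimates
        best_next = max(Q[(nxt, act)] for act in ACTIONS)
        Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])
        s = nxt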
• Machine Learning?
  Improve over task T,
  with respect to performance measure P,
  based on experience E.
• What are T, P, E?
• How do we formulate a machine learning problem?
Task T:
Playing checkers
Performance measure P:
Percent of games won against opponents
Training experience E:
Playing practice games against itself
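Purely as an illustration (the record type and field names below are assumptions, not from the slides), the T/P/E formulation can be written down so that any learning problem is specified the same way; a problem is well posed once all three parts are pinned down.

from dataclasses import dataclass

@dataclass
class LearningProblem:
    task: str                  # T: what the program should get better at
    performance_measure: str   # P: how improvement is scored
    experience: str            # E: what the program learns from

# The checkers example stated above:
checkers = LearningProblem(
    task="Playing checkers",
    performance_measure="Percent of games won against opponents",
    experience="Playing practice games against itself",
)
print(checkers)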
Some disciplines
Choices in designing the learning system
1. Image Recognition:
Image recognition is used to identify objects, persons, places, digital images, etc. A popular use case is image recognition combined with face detection. Ex: Face Recognition System
2. Speech Recognition:
Speech recognition is the process of converting voice instructions into text; it is also known as "speech to text" or "computer speech recognition". Ex: Google Speech Recognition, Ex: SPHINX system
3. Traffic Prediction:
Machine learning predicts traffic conditions, such as whether traffic is clear, slow-moving, or heavily congested, in two ways.
Ex: Using Google Maps for traffic predictions
4. Product Recommendations:
Machine learning is widely used by e-commerce companies such as Amazon and Flipkart to recommend products to the user.
5. Self-Driving Cars:
One of the most exciting applications of machine learning is self-driving cars; machine learning plays a significant role in them. Ex: learning to drive an autonomous vehicle (the ALVINN system)
6. Email Spam and Malware Filtering:
Whenever we receive a new email, it is automatically filtered as important, normal, or spam.
7. Personal Assistants:
Personal assistants help us find information using voice instructions. They can help us in various ways just by voice instruction, such as playing music or calling someone. Ex: Siri, Alexa
Concept Learning
• The goal of this search is to find the hypothesis that best fits the
training examples.
Example:
The instances X and hypotheses H in the EnjoySport learning task.
• Given that the attribute Sky has three possible values, and that AirTemp, Humidity, Wind, Water, and Forecast each have two possible values, the instance space X contains exactly 3 · 2 · 2 · 2 · 2 · 2 = 96 distinct instances.
• A similar calculation shows that there are 5 · 4 · 4 · 4 · 4 · 4 = 5120 syntactically distinct hypotheses within H.
• Every hypothesis containing one or more "ø" symbols represents the empty set of instances; that is, it classifies every instance as negative.
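These counts can be checked directly; the sketch below simply reproduces the arithmetic stated above.

# Instance space X: Sky has 3 values, the other five attributes have 2 each.
instances = 3 * 2 * 2 * 2 * 2 * 2
print(instances)             # 96

# Syntactically distinct hypotheses: each attribute may also be '?' or 'ø',
# giving 5 choices for Sky and 4 for each of the remaining five attributes.
syntactic_hypotheses = 5 * 4 * 4 * 4 * 4 * 4
print(syntactic_hypotheses)  # 5120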
General-to-Specific Ordering of Hypotheses
• Many algorithms for concept learning organize the search through the
hypothesis space by relying on a very useful structure that exists for any
concept learning problem: a general-to-specific ordering of hypotheses.
• To illustrate the general-to-specific ordering, consider the two
hypotheses
• h1 = (Sunny, ?, ?, Strong, ?, ?)
• h2 = (Sunny, ?, ?, ?, ?, ?)
• Now consider the sets of instances that are classified positive by h1 and by h2.
• Because h2 imposes fewer constraints on the instance, it classifies more instances as positive. In fact, any instance classified positive by h1 will also be classified positive by h2. Therefore, we say that h2 is more general than h1.
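A minimal sketch of this ordering in code (hypotheses are tuples with "?" as the wildcard; the helper name is an illustrative assumption, and the special "ø" constraint is ignored for brevity):

def more_general_or_equal(hg, hs):
    # hg is more general than or equal to hs if every constraint in hg
    # is either '?' or identical to the corresponding constraint in hs.
    return all(g == '?' or g == s for g, s in zip(hg, hs))

h1 = ('Sunny', '?', '?', 'Strong', '?', '?')
h2 = ('Sunny', '?', '?', '?', '?', '?')

print(more_general_or_equal(h2, h1))  # True:  h2 is more general than h1
print(more_general_or_equal(h1, h2))  # False: h1 is strictly more specific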
• Consider the three hypotheses h1, h2, and h3 from our EnjoySport example, shown in the figure.
FIND-S Algorithm
• FIND-S Algorithm
1. Initialize h to the most specific hypothesis
   h = ⟨𝚽, 𝚽, 𝚽, …, 𝚽⟩
• Upon observing the first training example from the table, which happens to be a positive example, it becomes clear that our hypothesis is too specific.
• So each 𝚽 constraint is replaced by the next more general constraint that fits the example, namely the attribute values of this training example:
h ← ⟨Sunny, Warm, Normal, Strong, Warm, Same⟩
This h is still very specific.
• The second training example (also positive in this case) forces the algorithm to further generalize h, this time substituting a "?" in place of any attribute value in h that is not satisfied by the new example.
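The full loop, as a hedged Python sketch (the function name is illustrative; the four training rows are the standard EnjoySport table that this trace follows):

def find_s(examples):
    # FIND-S: start from the most specific hypothesis and minimally
    # generalize it on every positive example; negatives are ignored.
    n = len(examples[0][0])
    h = ['ø'] * n                          # most specific hypothesis
    for x, label in examples:
        if label != 'Yes':
            continue
        for i, value in enumerate(x):
            if h[i] == 'ø':
                h[i] = value               # first positive: copy its values
            elif h[i] != value:
                h[i] = '?'                 # disagreement: generalize to '?'
    return h

# Attribute order: Sky, AirTemp, Humidity, Wind, Water, Forecast.
data = [
    (('Sunny', 'Warm', 'Normal', 'Strong', 'Warm', 'Same'),   'Yes'),
    (('Sunny', 'Warm', 'High',   'Strong', 'Warm', 'Same'),   'Yes'),
    (('Rainy', 'Cold', 'High',   'Strong', 'Warm', 'Change'), 'No'),
    (('Sunny', 'Warm', 'High',   'Strong', 'Cool', 'Change'), 'Yes'),
]
print(find_s(data))   # ['Sunny', 'Warm', '?', 'Strong', '?', '?']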
Exercise
VERSION SPACES & CANDIDATE-ELIMINATION ALGORITHM
Version Space:
S = {⟨𝚽, 𝚽, 𝚽, …, 𝚽⟩}    G = {⟨?, ?, ?, …, ?⟩}
Algorithm
1. Initialize G and S as the most general and the most specific hypotheses.
2. For each example e:
   if e is positive:
       make the specific hypothesis more general
   else:
       make the general hypotheses more specific
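A simplified, hedged sketch of this loop for conjunctive hypotheses (one hypothesis in the S boundary, specializations guided by S, and duplicate/maximality checks omitted; the function names are illustrative). On the EnjoySport data it reproduces the S and G boundaries traced below.

def covers(h, x):
    # Hypothesis h classifies instance x as positive.
    return all(hv == '?' or hv == xv for hv, xv in zip(h, x))

def candidate_elimination(examples):
    n = len(examples[0][0])
    S = ['ø'] * n             # most specific boundary (kept as a single hypothesis)
    G = [['?'] * n]           # most general boundary
    for x, label in examples:
        if label == 'Yes':
            # positive example: minimally generalize S ...
            for i, v in enumerate(x):
                if S[i] == 'ø':
                    S[i] = v
                elif S[i] != v:
                    S[i] = '?'
            # ... and drop G members that no longer cover the example
            G = [g for g in G if covers(g, x)]
        else:
            # negative example: minimally specialize the members of G that
            # wrongly cover it, using values taken from S
            new_G = []
            for g in G:
                if not covers(g, x):
                    new_G.append(g)
                    continue
                for i in range(n):
                    if g[i] == '?' and S[i] != '?' and S[i] != x[i]:
                        spec = list(g)
                        spec[i] = S[i]
                        new_G.append(spec)
            G = new_G
    return S, G

# On the four EnjoySport examples this yields
# S = ['Sunny', 'Warm', '?', 'Strong', '?', '?'] and
# G = [['Sunny', '?', '?', '?', '?', '?'], ['?', 'Warm', '?', '?', '?', '?']].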
DATA SET:
CONCEPT: Days on which a person enjoys water sport
Step-4:
S3 = ⟨Sunny, Warm, ?, Strong, Warm, Same⟩
Candidate minimal specializations of G:
{⟨Sunny, ?, ?, ?, ?, ?⟩, ⟨?, Warm, ?, ?, ?, ?⟩, ⟨?, ?, Normal, ?, ?, ?⟩, ⟨?, ?, ?, ?, Cool, ?⟩, ⟨?, ?, ?, ?, ?, Same⟩}
Keeping only the members that are more general than S3:
G3 = {⟨Sunny, ?, ?, ?, ?, ?⟩, ⟨?, Warm, ?, ?, ?, ?⟩, ⟨?, ?, ?, ?, ?, Same⟩}
Step-5:
S4 = ⟨Sunny, Warm, ?, Strong, ?, ?⟩
G4 = {⟨Sunny, ?, ?, ?, ?, ?⟩, ⟨?, Warm, ?, ?, ?, ?⟩}
The resulting version space also contains the hypotheses lying between S4 and G4:
⟨Sunny, ?, ?, Strong, ?, ?⟩   ⟨Sunny, Warm, ?, ?, ?, ?⟩   ⟨?, Warm, ?, Strong, ?, ?⟩
G4 = {⟨Sunny, ?, ?, ?, ?, ?⟩, ⟨?, Warm, ?, ?, ?, ?⟩}
Definition:
ID3 Algorithm:
Data Set:
Alternative Measures for Selecting Attributes
• One way to avoid this difficulty (the natural bias of information gain toward attributes with many values) is to select decision attributes based on some measure other than information gain.
• One alternative measure that has been used successfully is the gain ratio.
• The gain ratio penalizes attributes through a term called split information, which is sensitive to how broadly and uniformly the attribute splits the data.
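As a hedged sketch of these quantities (the function names are illustrative; the formulas follow the standard definitions of entropy, information gain, split information, and gain ratio):

import math
from collections import Counter

def entropy(labels):
    # Entropy of a list of class labels, in bits.
    total = len(labels)
    return -sum((c / total) * math.log2(c / total)
                for c in Counter(labels).values())

def gain_ratio(rows, labels, attr):
    # rows: list of attribute tuples; labels: matching class labels;
    # attr: index of the attribute being evaluated.
    total = len(rows)
    branches = {}
    for row, label in zip(rows, labels):
        branches.setdefault(row[attr], []).append(label)
    # information gain: reduction in entropy after splitting on the attribute
    gain = entropy(labels) - sum(len(b) / total * entropy(b)
                                 for b in branches.values())
    # split information: entropy of the partition itself; it is large when
    # the attribute splits the data broadly and uniformly
    split_info = -sum((len(b) / total) * math.log2(len(b) / total)
                      for b in branches.values())
    return gain / split_info if split_info > 0 else 0.0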
Handling Training Examples
Handling Attributes with Differing Costs
• In some learning tasks the instance attributes may
have associated costs.
• ID3 can be modified to take into account attribute
costs by introducing a cost term into the attribute
selection measure.
• We might divide the gain by the cost of the attribute, so that lower-cost attributes would be preferred.
• While such cost-sensitive measures do not guarantee finding an optimal cost-sensitive decision tree, they do bias the search in favor of low-cost attributes.
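A hedged sketch of the simplest adjustment mentioned above (gain divided by cost); the attribute names, gains, and costs below are made-up illustrations, and other published variants (e.g. Gain²/Cost) exist.

def cost_sensitive_score(gain, cost):
    # Bias attribute selection toward cheap attributes by dividing
    # the information gain by the attribute's measurement cost.
    return gain / cost

# Illustrative (made-up) candidate attributes: (name, information gain, cost)
candidates = [("Temperature", 0.25, 1.0),
              ("BloodTest",   0.40, 10.0),
              ("Biopsy",      0.45, 100.0)]

# Choose the attribute with the best cost-adjusted score, not the raw gain.
best = max(candidates, key=lambda a: cost_sensitive_score(a[1], a[2]))
print(best[0])   # Temperature wins despite having the lowest raw gain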