ML 4
Tuesday (10am-11am)
Wednesday (10am-11am & 3pm-4pm)
Friday (9am-10am, 11am-12pm, 2pm-3pm)
Dr. Srinivasa L. Chakravarthy
&
Smt. Jyotsna Rani Thota
Department of CSE
GITAM Institute of Technology (GIT)
Visakhapatnam – 530045
Email: [email protected] & [email protected]
2 Nov 2020
EID 403: Machine Learning
Course objectives
Chapter 8
Case-Based Reasoning
Introduction
Instance-based learning methods differ from learning methods that construct a general, explicit description of the target function as training examples are provided. Instance-based methods simply store the training examples; when a query instance is encountered, a set of similar, related instances is retrieved from memory and used to classify the new query instance.
● Instance-based methods such as nearest neighbor construct a different local approximation to the target function for each distinct query instance. This is an advantage when the target function is very complex overall, but can still be described by a collection of less complex local approximations.
● Instance-based methods can also use more complex, symbolic representations for instances, as in case-based reasoning.
Introduction (cont.)
Instance-based approaches also have disadvantages:
1. The cost of classifying new instances can be high, because nearly all computation takes place at classification time rather than when the training examples are first encountered.
2. They typically consider all attributes of the instances when attempting to retrieve similar training examples from memory; if the target concept depends on only a few of the many available attributes, truly similar instances may nevertheless appear far apart.
K-Nearest Neighbor Learning
An instance x is described by the feature vector <a_1(x), a_2(x), ..., a_n(x)>, where a_r(x) denotes the value of the rth attribute of instance x. The distance between two instances x_i and x_j is then defined as

d(x_i, x_j) = sqrt( sum_{r=1}^{n} ( a_r(x_i) - a_r(x_j) )^2 )
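As a quick illustration, this distance can be computed directly. A minimal sketch in Python; the function name euclidean_distance and the toy instances are illustrative, not from the slides:

```python
import math

def euclidean_distance(xi, xj):
    # d(xi, xj) = sqrt(sum over attributes r of (a_r(xi) - a_r(xj))^2)
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(xi, xj)))

# Example: two instances described by three attributes each.
print(euclidean_distance((1.0, 2.0, 3.0), (4.0, 6.0, 3.0)))  # 5.0
```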
K-Nearest Neighbor Learning (cont.)
Training algorithm: For each training example <x, f(x)>, add the example to the list training_examples.
Classification algorithm: Given a query instance x_q to be classified:
1. Let x_1, ..., x_k denote the k instances from training_examples that are nearest to x_q.
2. Return f^(x_q) = argmax_{v in V} sum_{i=1}^{k} delta(v, f(x_i)), where V is the finite set of target values and delta(a, b) = 1 if a = b and 0 otherwise. In other words, assign the most common value of f among the k nearest training examples.
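A minimal runnable sketch of the algorithm above for a discrete-valued target function; the helper names (knn_classify, euclidean_distance) and the toy data are assumptions for illustration:

```python
import math
from collections import Counter

def euclidean_distance(xi, xj):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(xi, xj)))

def knn_classify(training_examples, xq, k):
    """Majority vote of the target values of the k stored examples nearest to xq."""
    # "Training" is just storing the (x, f(x)) pairs; all work happens here at query time.
    neighbors = sorted(training_examples,
                       key=lambda ex: euclidean_distance(ex[0], xq))[:k]
    return Counter(f_x for _, f_x in neighbors).most_common(1)[0][0]

# Toy usage: two classes in a 2-D instance space.
data = [((1.0, 1.0), "+"), ((1.2, 0.8), "+"), ((5.0, 5.0), "-"), ((5.2, 4.9), "-")]
print(knn_classify(data, (1.1, 1.0), k=3))  # -> "+"
```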
K-Nearest Neighbor Learning (cont.)
Classification algorithm (cont.): For approximating a continuous-valued target function, the algorithm instead returns the mean value of the k nearest training examples: f^(x_q) = (1/k) sum_{i=1}^{k} f(x_i).
Voronoi Diagram (cont.)
The decision surface induced by a 1-nearest neighbor classifier is a Voronoi diagram of the training set: each training example is surrounded by the convex polyhedron of query points for which it is the single nearest example.
Note: A refinement of k-nearest neighbor is to weight the contribution of each of the k neighbors according to its distance to the query, giving closer neighbors greater weight:

f^(x_q) = argmax_{v in V} sum_{i=1}^{k} w_i * delta(v, f(x_i)),

where w_i = 1 / d(x_q, x_i)^2. If x_q exactly matches a training instance x_i, so that the denominator is zero, we assign f^(x_q) = f(x_i).
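A sketch of this distance-weighted variant, under the same illustrative assumptions as the previous snippet, with w_i = 1/d^2 and the exact-match rule handled explicitly:

```python
import math
from collections import defaultdict

def euclidean_distance(xi, xj):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(xi, xj)))

def distance_weighted_knn(training_examples, xq, k):
    """Classify xq by votes weighted with w_i = 1 / d(xq, x_i)**2."""
    neighbors = sorted(training_examples,
                       key=lambda ex: euclidean_distance(ex[0], xq))[:k]
    votes = defaultdict(float)
    for x, f_x in neighbors:
        d = euclidean_distance(x, xq)
        if d == 0.0:
            return f_x  # exact match: assign f^(xq) = f(x_i) directly
        votes[f_x] += 1.0 / d ** 2
    return max(votes, key=votes.get)

data = [((1.0, 1.0), "+"), ((4.0, 4.0), "-"), ((4.2, 3.9), "-")]
print(distance_weighted_knn(data, (1.5, 1.0), k=3))  # "+": the close neighbor dominates
```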
K-Nearest Neighbor Learning (cont.)
A note on terminology: Regression means approximating a real-valued target function. The residual is the error f^(x) - f(x) in approximating the target function. A kernel function is the function of distance that is used to determine the weight of each training example, i.e., the function K such that

Wi = K(d(xi, xq)).
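For example, a Gaussian of the distance is one common choice of kernel function K; the width parameter sigma below is an assumed hyperparameter:

```python
import math

def gaussian_kernel(d, sigma=1.0):
    # K(d): a weight that decays smoothly as the distance d grows.
    return math.exp(-d ** 2 / (2 * sigma ** 2))

# Wi = K(d(xi, xq)): a closer training example receives a larger weight.
print(gaussian_kernel(0.5))  # ~0.88
print(gaussian_kernel(2.0))  # ~0.14
```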
Locally Weighted Regression
K-nearest neighbor approximates the target function f(x) at the single query point x = x_q. Locally weighted regression generalizes this: it constructs an explicit approximation to f over a local region surrounding x_q.
● Local, because the function is approximated based only on data near the query point; weighted, because the contribution of each training example is weighted by its distance from the query point; regression, because it approximates a real-valued target function.
In practice, the local approximation is usually given a simple functional form (constant, linear, or quadratic), for two reasons:
1. The cost of fitting a more complex function for each query instance is high.
2. These simple approximations model the target function quite well over a sufficiently small subregion of the instance space.
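A minimal sketch of locally weighted linear regression: for each query it fits a weighted least-squares line using a Gaussian kernel over distance to the query. The formulation below (kernel choice, normal-equations solve, and all names) is one standard variant assumed for illustration, not necessarily the exact one from the lecture:

```python
import numpy as np

def lwr_predict(X, y, xq, sigma=1.0):
    """Predict f(xq) by fitting a linear model weighted toward points near xq."""
    Xb = np.hstack([np.ones((X.shape[0], 1)), X])   # design matrix with intercept
    xqb = np.concatenate([[1.0], xq])
    d = np.linalg.norm(X - xq, axis=1)              # distance of each example to the query
    w = np.exp(-d ** 2 / (2 * sigma ** 2))          # Gaussian kernel weights
    W = np.diag(w)
    # Weighted least squares: beta = (Xb^T W Xb)^(-1) Xb^T W y
    beta = np.linalg.solve(Xb.T @ W @ Xb, Xb.T @ W @ y)
    return xqb @ beta

# Toy usage: noisy samples of y = x^2, predicted locally around x = 1.5.
rng = np.random.default_rng(0)
X = rng.uniform(0, 3, size=(50, 1))
y = X[:, 0] ** 2 + rng.normal(0, 0.1, size=50)
print(lwr_predict(X, y, np.array([1.5]), sigma=0.3))  # close to 1.5^2 = 2.25
```

Note that the whole fit is redone for every query point, which is exactly the classification-time cost discussed earlier for lazy methods.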
Radial Basis Functions
One approach to function approximation that is closely related to distance-weighted regression, and also to artificial neural networks, is learning with radial basis functions. Unlike the previous methods it is "eager" rather than "lazy": the approximation to the target function is constructed globally, before any query instance is observed, typically as a weighted combination of kernel functions centered at selected points in instance space.
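A rough sketch of this eager RBF approach, assuming Gaussian kernels with a fixed width and centers chosen as a random subset of the training points (one simple strategy; k-means centers are another common choice). All names and data here are illustrative:

```python
import numpy as np

def rbf_features(X, centers, sigma):
    # One Gaussian feature per center: exp(-||x - c_u||^2 / (2 sigma^2)), plus a bias term.
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    return np.hstack([np.ones((X.shape[0], 1)), np.exp(-d ** 2 / (2 * sigma ** 2))])

def rbf_fit(X, y, centers, sigma=1.0):
    # Eager step: solve for the output weights w once, by global least squares.
    Phi = rbf_features(X, centers, sigma)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return w

def rbf_predict(w, centers, xq, sigma=1.0):
    return rbf_features(xq[None, :], centers, sigma)[0] @ w

# Toy usage: fit noisy samples of sin(x) with 8 Gaussian units.
rng = np.random.default_rng(1)
X = rng.uniform(0, 3, size=(60, 1))
y = np.sin(X[:, 0]) + rng.normal(0, 0.05, size=60)
centers = X[rng.choice(60, size=8, replace=False)]  # assumed: centers = random training points
w = rbf_fit(X, y, centers, sigma=0.5)
print(rbf_predict(w, centers, np.array([1.0]), sigma=0.5))  # near sin(1) ~ 0.84
```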
Instance-based methods such as k-nearest neighbor and locally weighted regression share three key properties:
1. They are lazy learning methods, in that they defer the decision of how to generalize beyond the training data until a new query instance is observed.
2. They classify new query instances by analyzing similar instances while ignoring instances that are very different from the query.
3. They represent instances as real-valued points in an n-dimensional Euclidean space.
Case-based reasoning is based on the first two of these principles, but not the third: instances are represented by richer symbolic descriptions, and correspondingly richer methods are needed to retrieve similar cases.