ML3 Unit 3

Case-Based Reasoning classifiers use a database of problem solutions to solve new problems by searching for similar past cases and combining their solutions, with applications in customer service, engineering, law, and medicine. Locally Weighted Regression is a non-parametric algorithm that performs linear regression using only nearby training data for each prediction, while Radial Basis Function networks apply kernel functions to transform data into higher dimensions to allow for powerful nonlinear classification and regression.


Case-Based Reasoning (CBR) Classifier

Nearest-neighbour classifiers store training tuples as points in Euclidean space. Case-Based Reasoning (CBR) classifiers, in contrast, use a database of problem solutions to solve new problems: they store the tuples, or cases, used for problem-solving as complex symbolic descriptions.

How does CBR work?


1. When a new case arises to classify, a case-based reasoner (CBR) will first check if an identical
training case exists.

2. If one is found, then the accompanying solution to that case is returned.

3. If no identical case is found, then the CBR will search for training cases having components
that are similar to those of the new case.

4. Conceptually, these training cases may be considered as neighbours of the new case.

5. If cases are represented as graphs, this involves searching for subgraphs that are similar to
subgraphs within the new case.

6. The CBR tries to combine the solutions of the neighbouring training cases to propose a solution
for the new case.

7. If incompatibilities arise with the individual solutions, then backtracking to search for other
solutions may be necessary.

8. The CBR may employ background knowledge and problem-solving strategies to propose a
feasible solution.
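The steps above can be sketched as follows. The case base, the attribute-overlap similarity metric, and the majority-vote combination rule are illustrative assumptions, not features of any particular CBR system:

```python
def similarity(case_a, case_b):
    """Fraction of shared attribute-value pairs (a simple similarity metric)."""
    keys = set(case_a) | set(case_b)
    return sum(case_a.get(k) == case_b.get(k) for k in keys) / len(keys)

def cbr_classify(new_case, case_base, k=3):
    # Steps 1-2: return the stored solution if an identical case exists.
    for case, solution in case_base:
        if case == new_case:
            return solution
    # Steps 3-4: otherwise rank training cases by similarity (the "neighbours").
    ranked = sorted(case_base, key=lambda cs: similarity(new_case, cs[0]),
                    reverse=True)
    # Step 6: combine the neighbours' solutions, here by majority vote.
    neighbour_solutions = [solution for _, solution in ranked[:k]]
    return max(set(neighbour_solutions), key=neighbour_solutions.count)

# A toy help-desk case base (cf. the applications below).
case_base = [
    ({"symptom": "no_power", "model": "A"}, "replace_battery"),
    ({"symptom": "no_power", "model": "B"}, "replace_battery"),
    ({"symptom": "overheat", "model": "A"}, "clean_fan"),
]
print(cbr_classify({"symptom": "no_power", "model": "C"}, case_base))
# → "replace_battery" (combined from the two most similar cases)
```

A real system would also implement steps 7-8 (backtracking and background knowledge), which this sketch omits.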

Applications of CBR include:


1. Problem resolution for customer service help desks, where cases describe product-
related diagnostic problems.

2. It is also applied to areas such as engineering and law, where cases are either technical
designs or legal rulings, respectively.

3. Medical education, where patient case histories and treatments are used to help
diagnose and treat new patients.

Challenges with CBR


1. Finding a good similarity metric (e.g., for matching subgraphs) and suitable methods
for combining solutions.

2. Selecting salient features for indexing training cases and the development of efficient
indexing techniques.
Locally Weighted Regression (LWR)

a. Linear regression is a supervised learning algorithm used for computing linear
relationships between input (X) and output (Y).

b. Locally weighted linear regression is a supervised learning algorithm.


c. It a non-parametric algorithm.

d. There is no training phase; all the work is done during the testing
phase, i.e., while making predictions.

e. Model-based methods, such as neural networks and mixtures of Gaussians,
use the data to build a parameterized model. After training, the model is used
for predictions and the data are generally discarded.

f. In contrast, "memory-based" methods are non-parametric approaches that
explicitly retain the training data and use it each time a prediction needs to be
made.

g. Locally weighted regression (LWR) is a memory-based method that performs a
regression around a point of interest using only training data that are "local" to
that point.

h. One recent study demonstrated that LWR was suitable for real-time control by
constructing an LWR-based system that learned a difficult juggling task.

FIG 1: In locally weighted regression, points are weighted by proximity to the
current query point x using a kernel; a regression is then computed using the
weighted points.
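The weighting-then-regression procedure of FIG 1 can be sketched in NumPy. A Gaussian kernel with bandwidth tau and a line fitted by weighted least squares are assumed choices here, not mandated by the text:

```python
import numpy as np

def lwr_predict(x_query, X, y, tau=0.3):
    """Fit a weighted least-squares line around x_query and evaluate it there."""
    # Kernel weights: training points near x_query count more (FIG 1).
    w = np.exp(-((X - x_query) ** 2) / (2 * tau ** 2))
    A = np.stack([np.ones_like(X), X], axis=1)   # design matrix [1, x]
    W = np.diag(w)
    # Solve the weighted normal equations (A^T W A) theta = A^T W y.
    theta = np.linalg.solve(A.T @ W @ A, A.T @ W @ y)
    return theta[0] + theta[1] * x_query

# "Memory-based": the full training set is retained and used per prediction.
X = np.linspace(0, 2 * np.pi, 50)
y = np.sin(X)
print(lwr_predict(np.pi / 2, X, y))   # close to sin(pi/2) = 1.0
```

Note that every prediction solves a fresh regression, which is why there is no separate training phase.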
Radial basis function networks
Radial Basis Kernel is a kernel function that is used in machine learning to find a non-linear
classifier or regression line.

What is Kernel Function?

A kernel function transforms n-dimensional input into m-dimensional input, where m is
much higher than n, and then computes the dot product in the higher-dimensional space efficiently.

The main idea of using a kernel is that a linear classifier or regression surface in the
higher-dimensional space corresponds to a non-linear classifier or regression curve in the
original, lower-dimensional space.

Mathematical Definition of Radial Basis Kernel:

K(x, x') = exp(−||x − x'||² / (2σ²))

where x, x' are points in any fixed-dimensional space and σ is a free bandwidth parameter.
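A direct implementation, assuming the standard Gaussian form exp(−||x − x'||² / (2σ²)) with σ as the bandwidth:

```python
import numpy as np

def rbf_kernel(x, x_prime, sigma=1.0):
    """Gaussian RBF kernel between two vectors."""
    return np.exp(-np.sum((x - x_prime) ** 2) / (2 * sigma ** 2))

a = np.array([1.0, 2.0])
b = np.array([3.0, 4.0])
print(rbf_kernel(a, a))   # identical points → 1.0
print(rbf_kernel(a, b))   # squared distance 8 → exp(-4) ≈ 0.0183
```

The kernel value is always in (0, 1], shrinking smoothly as the two points move apart.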


If we apply an algorithm such as the perceptron or linear regression with this
kernel, we are effectively applying the algorithm to the new, infinite-dimensional data points
the kernel implicitly creates.
Hence it yields a hyperplane in infinite dimensions, which corresponds to a very strong non-
linear classifier or regression curve when mapped back to our original dimensions.
The radial basis kernel is thus a very powerful kernel, capable of fitting curves to very complex
datasets.

Why is the Radial Basis Kernel so powerful?


The main motive of using a kernel is to do calculations in a d-dimensional space with d > 1,
so that we can obtain a quadratic, cubic, or higher-degree polynomial equation for our
classification/regression line. Since the radial basis kernel uses an exponent, and the
expansion of e^x is a polynomial series of infinite degree, using this kernel makes our
regression/classification line correspondingly expressive.
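A quick numerical check of this argument: truncating the Taylor series of e^z at enough terms reproduces the RBF kernel value, illustrating that the kernel behaves like a polynomial feature map of unboundedly high degree. One-dimensional inputs and σ = 1 are assumed here for simplicity:

```python
import math

def rbf(x, x2, sigma=1.0):
    """Gaussian RBF kernel for scalar inputs."""
    return math.exp(-((x - x2) ** 2) / (2 * sigma ** 2))

def rbf_taylor(x, x2, sigma=1.0, terms=20):
    """The same kernel via the Taylor series e^z = sum z^n / n!."""
    z = -((x - x2) ** 2) / (2 * sigma ** 2)
    return sum(z ** n / math.factorial(n) for n in range(terms))

# The 20-term polynomial truncation already matches to machine precision.
print(abs(rbf(0.3, 1.1) - rbf_taylor(0.3, 1.1)) < 1e-12)   # True
```

Each extra Taylor term corresponds to a higher-degree polynomial feature, which is the sense in which the kernel's feature space is "infinite-dimensional".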
