K-Nearest Neighbors Algorithm
KNN Brief

• The K-nearest neighbors (KNN) algorithm is a type of supervised machine learning algorithm.
• KNN is extremely easy to implement in its most basic form, and yet it can perform quite complex classification tasks.
• It is a lazy learning algorithm, since it doesn't have a specialized training phase. Rather, it uses all of the training data when classifying a new data point or instance.
• KNN is a non-parametric learning algorithm, which means that it doesn't assume anything about the distribution of the underlying data.
KNN Theory

• The intuition behind the KNN algorithm is one of the simplest of all the supervised machine learning algorithms.
• It simply calculates the distance from a new data point to all the training data points. The distance can be of any type, e.g. Euclidean or Manhattan.
• It then selects the K nearest data points, where K can be any integer. Finally, it assigns the new data point to the class to which the majority of those K points belong (see the sketch after this list).
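
A minimal from-scratch sketch of this procedure in Python (not part of the original slides) is given below; it assumes numeric features, Euclidean distance, and the small illustrative arrays X_train and y_train defined in the snippet.

import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x_new, k=3):
    # Classify x_new by majority vote among its k nearest training points.
    # Step 1: distance from the new point to every training point
    # (Euclidean here, but any distance function could be substituted).
    distances = np.linalg.norm(X_train - x_new, axis=1)
    # Step 2: indices of the k closest training points.
    nearest = np.argsort(distances)[:k]
    # Step 3: majority class among those k neighbours.
    votes = Counter(y_train[i] for i in nearest)
    return votes.most_common(1)[0][0]

# Tiny illustrative dataset: two well-separated classes in 2-D.
X_train = np.array([[1.0, 1.0], [1.2, 0.8], [0.9, 1.1],
                    [5.0, 5.0], [5.2, 4.8], [4.9, 5.1]])
y_train = np.array(["A", "A", "A", "B", "B", "B"])

print(knn_predict(X_train, y_train, np.array([1.1, 0.9])))  # -> A
print(knn_predict(X_train, y_train, np.array([5.1, 5.0])))  # -> B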
Pros of KNN

• It is extremely easy to implement.
• As said earlier, it is a lazy learning algorithm and therefore requires no training prior to making real-time predictions. This makes KNN much faster at the training stage than algorithms such as SVM or linear regression, which must be fitted first.
• Since the algorithm requires no training before making predictions, new data can be added seamlessly.
• Only two parameters are required to implement KNN, i.e. the value of K and the distance function (see the scikit-learn sketch after this list).
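
As a hedged illustration (not from the original slides) of how these two choices look in practice, scikit-learn's KNeighborsClassifier exposes them directly as the n_neighbors and metric constructor arguments; the Iris dataset is used here purely as an example.

from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# "Fitting" a KNN model only stores the training data; the distance
# computations happen at prediction time.
model = KNeighborsClassifier(n_neighbors=5, metric="euclidean")
model.fit(X_train, y_train)

print(model.score(X_test, y_test))  # mean accuracy on the held-out split

Because there is no real training step, adding new data simply means refitting on the enlarged dataset.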
Cons of KNN

• The KNN algorithm doesn't work well with high-dimensional data, because with a large number of dimensions it becomes difficult for the algorithm to compute meaningful distances between points.
• The KNN algorithm has a high prediction cost for large datasets, because the distance from the new point to every existing point must be calculated at prediction time.
• Finally, the KNN algorithm doesn't work well with categorical features, since it is difficult to define a distance between categorical values; such features are usually encoded numerically first (see the sketch after this list).
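
One common workaround for the last point, shown as a rough sketch below (the toy DataFrame and column names are hypothetical and not from the slides), is to one-hot encode categorical columns so that distances are computed over numeric 0/1 indicators.

import pandas as pd
from sklearn.neighbors import KNeighborsClassifier

# Hypothetical toy data with one categorical column and one numeric column.
df = pd.DataFrame({
    "colour": ["red", "blue", "red", "green"],
    "size":   [1.0, 2.0, 1.5, 3.0],
})
labels = ["A", "B", "A", "B"]

# One-hot encode the categorical column into 0/1 indicator columns.
X = pd.get_dummies(df, columns=["colour"])
model = KNeighborsClassifier(n_neighbors=1).fit(X, labels)
print(model.predict(X.iloc[[0]]))  # -> ['A']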
