
KNN

Algorithm: k-Nearest Neighbors (kNN)

1. Set the Number of Neighbors (k): Decide on the number of nearest neighbors to
consider; for example, k=3 means you will consider the three closest neighbors.

2. Calculate Distances: Measure the distance from the test data point to every
labeled point in your dataset.

3. Rank the Points: Arrange all the points from the dataset in order of their distance
from the test point, starting with the nearest.

4. Select Top k Points: From this ordered list, pick the top k points that are closest to
the test data point.
5. Prediction by kNN:

1. Classification: Look at the labels of these k nearest points. The label that
appears most frequently among the k points is assigned to the test data
point.
2. Regression: Calculate the average of the output feature values (y_i) of these
k nearest points. This average is used as the prediction for the test data
point. A minimal implementation of steps 1-5 is sketched below.
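
Below is a minimal sketch of steps 1-5 in plain Python. The toy data, the function name knn_predict, and the choice of Euclidean distance are illustrative assumptions rather than part of the original slides.

```python
from collections import Counter
from math import dist  # Euclidean distance, Python 3.8+

def knn_predict(train_X, train_y, query, k=3, task="classification"):
    """Predict a label (classification) or value (regression) for `query`
    from its k nearest training points under Euclidean distance."""
    # Step 2: distance from the query to every labeled point
    distances = [(dist(x, query), y) for x, y in zip(train_X, train_y)]
    # Step 3: rank all points by distance, nearest first
    distances.sort(key=lambda pair: pair[0])
    # Step 4: keep the labels/values of the k closest points
    top_k = [y for _, y in distances[:k]]
    # Step 5: majority vote for classification, mean for regression
    if task == "classification":
        return Counter(top_k).most_common(1)[0][0]
    return sum(top_k) / k

# Toy usage (data is illustrative)
X = [(1.0, 2.0), (2.0, 1.5), (6.0, 7.0), (7.0, 8.0)]
print(knn_predict(X, ["A", "A", "B", "B"], query=(1.5, 1.8), k=3))   # "A"
print(knn_predict(X, [1.0, 1.2, 6.5, 7.1], query=(6.5, 7.5), k=3,
                  task="regression"))                                # mean of 6.5, 7.1, 1.2
```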
Example
Computational Cost
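
Assuming the usual brute-force analysis: kNN has essentially no training step, but each prediction measures the distance from the query to all n training points, which costs O(n*d) for d features, plus O(n log n) for a full sort or about O(n log k) if only the k smallest distances are kept. The sketch below is illustrative (the function name k_nearest and the data layout are assumptions):

```python
import heapq
from math import dist  # Euclidean distance, Python 3.8+

def k_nearest(train_X, query, k):
    """Indices of the k training points closest to `query`.

    Brute force: O(n*d) distance computations per query; heapq.nsmallest
    keeps the selection at O(n log k) instead of an O(n log n) full sort.
    """
    return heapq.nsmallest(k, range(len(train_X)),
                           key=lambda i: dist(train_X[i], query))
```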
Decision Boundary
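
As a rough illustration of the standard picture, assuming nothing beyond the heading: kNN never builds an explicit boundary; the boundary is implied by the training points, tends to be jagged for small k, and becomes smoother as k grows. The toy data and the text-grid visualization below are assumptions used only to make that visible:

```python
from collections import Counter
from math import dist

# Hypothetical two-class training set, chosen only to make the boundary visible
X = [(1, 1), (2, 2), (1.5, 1.0), (6, 6), (7, 7), (6.5, 7.5)]
y = ["A", "A", "A", "B", "B", "B"]

def predict(query, k):
    """Classify `query` by majority vote among its k nearest training points."""
    nearest = sorted(zip(X, y), key=lambda point: dist(point[0], query))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

# Evaluate the classifier on a coarse grid: the place where the printed
# character flips from A to B traces the decision boundary for that k.
for k in (1, 5):
    print(f"k = {k}")
    for row in range(8, -1, -1):
        print("".join(predict((col, row), k) for col in range(9)))
```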
Example
Normalization
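
Because kNN compares raw distances, a feature on a large numeric scale (such as income in dollars) can swamp one on a small scale (such as age in years), so features are usually rescaled before distances are computed. The min-max sketch below is illustrative (the data and function names are assumptions); z-score standardization works the same way, and in both cases the statistics must come from the training data and then be applied to the query:

```python
def minmax_fit(train_X):
    """Per-feature minimum and range, computed from the training data only."""
    cols = list(zip(*train_X))
    mins = [min(c) for c in cols]
    ranges = [(max(c) - min(c)) or 1.0 for c in cols]  # guard constant features
    return mins, ranges

def minmax_transform(point, mins, ranges):
    """Scale one point to [0, 1] per feature using the training statistics."""
    return tuple((v - m) / r for v, m, r in zip(point, mins, ranges))

# Example: income (dollars) dwarfs age (years) until both are rescaled
train = [(25, 40_000), (30, 60_000), (45, 52_000)]
mins, ranges = minmax_fit(train)
scaled_train = [minmax_transform(p, mins, ranges) for p in train]
scaled_query = minmax_transform((28, 55_000), mins, ranges)
print(scaled_query)  # both features now lie on a comparable [0, 1] scale
```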
