
Solved Example K Nearest Neighbors Algorithm Weighted KNN To Classify New Instance by Mahesh Huddar

The video discusses how to classify a new data point using the k-nearest neighbors algorithm on a sample dataset. It shows calculating the Euclidean distance from the new point to each existing point, ranking the points by distance, and assigning the new point to the majority class of its k nearest neighbors. It then improves on this by using the inverse squared distance as a weight, so that closer neighbors have more influence, in a weighted k-nearest neighbors approach.

0:00
In this video, I will discuss how to apply the k-nearest neighbor algorithm to classify a new instance based on a given data set. This is solved example number four; the links to the other examples are given in the description below.
0:16
This is the data set given to us. It has four instances: i1, i2, i3, and i4. x1 and x2 are the two features, and output is the target variable, which takes one of two values, 0 or 1. We need to classify the new instance t1 = (3, 7) into one of these classes, taking the number of neighbors k as 3.
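For reference, here is the data set as it can be reconstructed from the narration (the on-screen table itself is not part of the transcript; i2's class is never read out, so it is inferred here, and it does not affect the result because i2 is never among the three nearest neighbors):

Instance    x1    x2    Output
i1           7     7     0
i2           7     4     0 (inferred)
i3           3     4     1
i4           1     4     1

New instance: t1 = (3, 7), with k = 3.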
0:46
To do this, we first calculate the distance from the new instance to each of the existing data points, using the Euclidean distance. The calculation looks like this. For the first instance, the distance from the new instance is the square root of (7 - 3)² + (7 - 7)², which is 4. Similarly, for the second instance it is the square root of (7 - 3)² + (4 - 7)², which is 5. For the third instance it is the square root of (3 - 3)² + (4 - 7)², which is 3. For the last one it is the square root of (1 - 3)² + (4 - 7)², which is about 3.6.
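As a rough illustration, here is a minimal Python sketch of this distance step, using the data set reconstructed above (the coordinates and classes are taken from the narration; i2's class is an assumption):

```python
import math

# Training data reconstructed from the narration.
# i2's class is never stated in the video, so 0 is an assumption; it does
# not affect the result, since i2 is never among the three nearest neighbors.
data = {
    "i1": ((7, 7), 0),
    "i2": ((7, 4), 0),  # class assumed
    "i3": ((3, 4), 1),
    "i4": ((1, 4), 1),
}
t1 = (3, 7)  # the new instance to classify

def euclidean(p, q):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

distances = {name: euclidean(point, t1) for name, (point, _) in data.items()}
print(distances)  # {'i1': 4.0, 'i2': 5.0, 'i3': 3.0, 'i4': 3.605...}
```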
1:51
Once you have calculated the distances to the new instance, you assign ranks to the training instances based on distance. The minimum distance here is 3, so the third instance, i3, gets rank 1: it is the nearest neighbor of the new instance. Similarly, 3.6 is the second smallest value, so i4 gets rank 2 and is the second nearest neighbor. Likewise, i1 is assigned rank 3 and i2 rank 4.

2:31
Once the ranks are assigned, we check the k value given to us. Here k is 3, so we identify the three nearest neighbors: the first is i3, the second is i4, and the third is i1, according to the ranks we assigned. Now we check the output variable for these neighbors: it is 0 for i1 and 1 for both i3 and i4, so the majority is 1. Hence the new instance is classified as 1. This is the simple procedure we follow to classify a new instance into one of the classes.
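Continuing the sketch above, the ranking and majority vote can be written as follows (again only an illustration of the procedure the video describes):

```python
from collections import Counter

k = 3
# Rank instances by distance (ascending) and keep the k nearest.
ranked = sorted(distances, key=distances.get)   # ['i3', 'i4', 'i1', 'i2']
neighbors = ranked[:k]                          # ['i3', 'i4', 'i1']

# Majority vote over the neighbors' classes.
votes = Counter(data[name][1] for name in neighbors)
prediction = votes.most_common(1)[0][0]
print(prediction)  # 1
```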
3:21
But notice what has happened here: we calculated the distances and assigned ranks to the instances, but we never used the actual values of the distances. What should really happen is that an instance at a very small distance is given the maximum weightage. For that reason, we calculate the inverse of the squared distance and use it as the weight for each instance. This is called the distance-weighted k-nearest neighbor algorithm.
3:58
Up to the distance calculation, the procedure is the same. First we square the distances: 4², 5², 3², and 3.6², which gives 16, 25, 9, and 12.96. Then we take the inverse of each squared distance: 1/16 ≈ 0.06, 1/25 = 0.04, 1/9 ≈ 0.11, and 1/12.96 ≈ 0.08.
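In the same sketch, the inverse squared distances can be computed like this (exact values, where the video rounds to two decimals):

```python
# Weight each training instance by the inverse of its squared distance,
# so that closer instances carry more influence.
weights = {name: 1 / d ** 2 for name, d in distances.items()}
print(weights)  # {'i1': 0.0625, 'i2': 0.04, 'i3': 0.1111..., 'i4': 0.0769...}
# The video rounds these to 0.06, 0.04, 0.11, and 0.08.
```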
4:28
Notice that the minimum distance is 3, and its inverse squared distance is the largest: the smaller the distance, the larger the weight assigned to that instance. Similarly, 3.6 is the second smallest distance, so that instance gets the second largest weight, or vote. Once we have this vote for each instance, we assign ranks based on the weights: the instance with the maximum vote is the nearest neighbor. In this case 0.11 is the maximum value, so that instance is assigned rank 1; the second highest is 0.08, which is assigned rank 2; the third is 0.06, assigned rank 3; and 0.04 is the minimum, assigned rank 4.

5:24
Again we take the majority vote based on these ranks. Since k is 3 again, we look at the first, second, and third nearest neighbors, and class 1 appears the most among them, so we again classify the new example as 1.
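Here is a sketch of the weighted step as the video performs it: rank by weight in descending order, keep the top k, and take a plain majority vote. Note that a common alternative formulation of weighted KNN sums the weights per class instead of counting votes; with these numbers, both give class 1:

```python
# Video's procedure: rank by weight (descending), keep the k highest,
# then take a plain majority vote among their classes.
ranked_w = sorted(weights, key=weights.get, reverse=True)  # ['i3', 'i4', 'i1', 'i2']
neighbors_w = ranked_w[:k]
votes_w = Counter(data[name][1] for name in neighbors_w)
print(votes_w.most_common(1)[0][0])  # 1

# Common alternative: sum the weights per class and pick the heaviest class.
class_weight = {}
for name in neighbors_w:
    cls = data[name][1]
    class_weight[cls] = class_weight.get(cls, 0.0) + weights[name]
print(max(class_weight, key=class_weight.get))  # 1  (about 0.188 vs 0.0625)
```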
5:46
I hope you understood how to apply the k-nearest neighbor algorithm, as well as the weighted k-nearest neighbor algorithm, to classify a new instance into one of the classes. If you liked the video, do like and share it with your friends. Press the subscribe button for more videos and the bell icon for regular updates. Thank you for watching.
