
Bansilal RamnathAgarwal Charitable Trust’s

VISHWAKARMA INSTITUTE OF TECHNOLOGY – PUNE


Department of SY Common

MD2201: Data Science


Name of the student: Nikhil Kumar Roll No. 39

Div: C Batch: 2

Date of performance: 19-03-2025

Experiment No.5

Title: Nearest Neighbor Classifiers.

Aim: To apply nearest neighbor classification algorithms (NN, KNN, R-NN, and MKNN) to classify a test point.

Software used: Programming language R.

Code Statement:
Consider the 18-point dataset referred to in the theory class. Consider the test sample P(3,2). Apply the following
algorithms to find the class of this test point.
i. NN
ii. KNN with K=5 and K=7
iii. R-NN (radius-based algorithm) with radius 1.45 units
iv. MKNN with K=5

Code:
# Read the 18-point dataset (columns: x, y, class)
data = read.csv("knn1_csv.csv")

# Euclidean distance from each point to the test sample P(3, 2)
eucld = sqrt((3 - data$x)^2 + (2 - data$y)^2)

f1 = cbind(data, eucld)

# Sort all points by distance to P
sortedData = f1[order(f1$eucld), ]

# i. NN: class of the single nearest point
cat("\nClass of P with NN is:", sortedData$class[1])

# ii. KNN with K = 5: majority vote among the 5 nearest points
df5 = sortedData[1:5, ]

l1 = sum(df5$class == 1)
l2 = sum(df5$class == 2)
l3 = sum(df5$class == 3)

if (l1 > l2 & l1 > l3) {
  cat("\nClass of P with K = 5 is 1")
}
if (l2 > l1 & l2 > l3) {
  cat("\nClass of P with K = 5 is 2")
}
if (l3 > l1 & l3 > l2) {
  cat("\nClass of P with K = 5 is 3")
}

# KNN with K = 7: majority vote among the 7 nearest points
df = sortedData[1:7, ]

l1 = sum(df$class == 1)
l2 = sum(df$class == 2)
l3 = sum(df$class == 3)

if (l1 > l2 & l1 > l3) {
  cat("\nClass of P with K = 7 is 1")
}
if (l2 > l1 & l2 > l3) {
  cat("\nClass of P with K = 7 is 2")
}
if (l3 > l1 & l3 > l2) {
  cat("\nClass of P with K = 7 is 3")
}

# iii. R-NN: consider only the points within radius 1.45 units of P
df1 = sortedData[sortedData$eucld < 1.45, ]

l1 = sum(df1$class == 1)
l2 = sum(df1$class == 2)
l3 = sum(df1$class == 3)

if (l1 > l2 & l1 > l3) {
  cat("\nClass of P with RNN is 1")
}
if (l2 > l1 & l2 > l3) {
  cat("\nClass of P with RNN is 2")
}
if (l3 > l1 & l3 > l2) {
  cat("\nClass of P with RNN is 3")
}

# iv. MKNN with K = 5: distance-weighted vote among the 5 nearest points
d5 = max(df5$eucld)  # distance of the farthest of the 5 neighbors
d1 = min(df5$eucld)  # distance of the nearest neighbor

# Linearly decreasing weight: 1 for the nearest neighbor, 0 for the farthest
w = (d5 - df5$eucld) / (d5 - d1)

dfw = cbind(df5, w)

w1 = sum(dfw$w[dfw$class == 1])
w2 = sum(dfw$w[dfw$class == 2])
w3 = sum(dfw$w[dfw$class == 3])

if (w1 > w2 & w1 > w3) {
  cat("\nClass of P with MKNN is 1")
}
if (w2 > w1 & w2 > w3) {
  cat("\nClass of P with MKNN is 2")
}
if (w3 > w1 & w3 > w2) {
  cat("\nClass of P with MKNN is 3")
}
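Since the experiment's CSV file is not reproduced here, the K = 5 majority vote can be cross-checked with R's built-in `class::knn` function. The six training points below are hypothetical, not the experiment's 18-point dataset:

```r
library(class)  # standard package shipped with R; provides knn()

# Hypothetical 6-point training set (NOT the experiment's data)
train <- data.frame(x = c(1, 2, 2, 4, 5, 5),
                    y = c(1, 1, 2, 4, 5, 4))
cl    <- factor(c(1, 1, 1, 2, 2, 2))   # class labels of the training points

# Classify P(3, 2) by majority vote among its 5 nearest neighbours
pred <- knn(train, data.frame(x = 3, y = 2), cl, k = 5)
cat("Class of P with K = 5 is", as.character(pred), "\n")
```

Here the five nearest neighbours of P(3,2) are three points of class 1 and two of class 2, so `knn` returns class 1, matching the manual majority-vote logic used in the experiment's code.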

Results:

Conclusion:

In this experiment we applied several nearest neighbor based classification techniques to assign a
class to the test point P(3,2) using an 18-point dataset. The standard nearest neighbor (NN) method
classified P by the class of its single closest point. Extending this idea, the KNN algorithm with K = 5 and
K = 7 determined the class by a majority vote among the five or seven closest points, respectively,
and both agreed with the NN result. The radius-based NN (R-NN) approach considered only the
points within 1.45 units of P, and the majority class among these neighbors supported the same
classification. Finally, the modified KNN (MKNN) method, which weights each neighbor by its
closeness to P (the nearest neighbor receiving the largest weight and the farthest a weight of zero),
also reinforced the same class decision.
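The MKNN weighting step can be isolated in a few lines. The distances and labels below are hypothetical (not the experiment's data); they are chosen to show that the weighted vote can differ from a plain majority vote:

```r
# Linearly decreasing MKNN weights: w_i = (d_K - d_i) / (d_K - d_1),
# giving the nearest neighbour weight 1 and the farthest weight 0.
d   <- c(0.5, 1.0, 1.2, 1.4, 2.0)        # sorted distances of K = 5 neighbours (hypothetical)
cls <- c(2, 1, 2, 1, 1)                   # their class labels (hypothetical)

w     <- (max(d) - d) / (max(d) - min(d)) # weights in [0, 1]
votes <- tapply(w, cls, sum)              # weighted vote total per class
names(which.max(votes))                   # predicted class: "2"
```

Note that a plain K = 5 majority vote here would choose class 1 (three labels of five), but the two class-2 points are much closer to the test point and win the weighted vote, which is precisely the refinement MKNN adds over KNN.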
