K-Nearest Neighbour - Jupyter Notebook
Uploaded by vs1438kumar

11/6/24, 7:12 PM k-Nearest Neighbour - Jupyter Notebook

In [9]: import numpy as np
import pandas as pd
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split
from sklearn import metrics

In [10]: names = ['sepal-length', 'sepal-width', 'petal-length', 'petal-width', 'Class']

In [11]: # Read the dataset into a pandas DataFrame
dataset = pd.read_csv("iris_data.csv", names=names)
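If `iris_data.csv` is not available locally, the same Iris data ships with scikit-learn. A sketch of that alternative; the column renaming is only there to match the `names` list used above and is not part of the original notebook:

```python
from sklearn.datasets import load_iris

# Load the bundled Iris data as a DataFrame (features + 'target' column)
iris = load_iris(as_frame=True)
dataset = iris.frame.rename(columns={
    'sepal length (cm)': 'sepal-length',
    'sepal width (cm)': 'sepal-width',
    'petal length (cm)': 'petal-length',
    'petal width (cm)': 'petal-width',
    'target': 'Class',
})
print(dataset.shape)  # (150, 5)
```

Note that `load_iris` encodes the class as integers 0-2 rather than the `Iris-setosa` style strings in the CSV; the classifier handles either.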


In [12]: X = dataset.iloc[:, :-1]   # all columns except the last (the features)
y = dataset.iloc[:, -1]             # the last column (the class label)
print(X.head())

   sepal-length  sepal-width  petal-length  petal-width
0           5.1          3.5           1.4          0.2
1           4.9          3.0           1.4          0.2
2           4.7          3.2           1.3          0.2
3           4.6          3.1           1.5          0.2
4           5.0          3.6           1.4          0.2

In [13]: Xtrain, Xtest, ytrain, ytest = train_test_split(X, y, test_size=0.10)
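Because `train_test_split` shuffles randomly, the 15-row test set (10% of 150) differs on every run, and so can the scores below. A sketch of a reproducible, class-balanced split, using scikit-learn's bundled copy of the data rather than the notebook's CSV:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
# random_state fixes the shuffle so the split is reproducible;
# stratify=y keeps the three classes in the same proportions
# in both the training and the test set.
Xtrain, Xtest, ytrain, ytest = train_test_split(
    X, y, test_size=0.10, random_state=42, stratify=y)
print(len(Xtrain), len(Xtest))  # 135 15
```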

In [14]: classifier = KNeighborsClassifier(n_neighbors=5).fit(Xtrain, ytrain)
ypred = classifier.predict(Xtest)
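Under the hood, `predict` classifies each test point by a majority vote among its 5 nearest training points (Euclidean distance by default). A minimal from-scratch sketch of that idea; the helper `knn_predict` and the toy data are hypothetical, not part of scikit-learn or the notebook:

```python
import numpy as np
from collections import Counter

def knn_predict(Xtrain, ytrain, x, k=5):
    # Euclidean distance from x to every training point
    dists = np.linalg.norm(Xtrain - x, axis=1)
    # Indices of the k closest training points
    nearest = np.argsort(dists)[:k]
    # Majority vote among their labels
    return Counter(ytrain[nearest]).most_common(1)[0][0]

# Two well-separated toy clusters
Xtr = np.array([[0, 0], [0, 1], [1, 0], [5, 5], [5, 6], [6, 5]])
ytr = np.array(['a', 'a', 'a', 'b', 'b', 'b'])
print(knn_predict(Xtr, ytr, np.array([0.2, 0.2]), k=3))  # 'a'
```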

localhost:8888/notebooks/Downloads/AIML PROGRAMS/AIML PROGRAMS/k-Nearest Neighbour.ipynb 1/3



In [15]: i = 0
print("\n-------------------------------------------------------------------------")
print('%-25s %-25s %-25s' % ('Original Label', 'Predicted Label', 'Correct/Wrong'))
print("-------------------------------------------------------------------------")
for label in ytest:
    print('%-25s %-25s' % (label, ypred[i]), end="")
    if label == ypred[i]:
        print(' %-25s' % 'Correct')
    else:
        print(' %-25s' % 'Wrong')
    i = i + 1
print("-------------------------------------------------------------------------")
print("\nConfusion Matrix:\n", metrics.confusion_matrix(ytest, ypred))
print("-------------------------------------------------------------------------")
print("\nClassification Report:\n", metrics.classification_report(ytest, ypred))
print("-------------------------------------------------------------------------")
print('Accuracy of the classifier is %0.2f' % metrics.accuracy_score(ytest, ypred))
print("-------------------------------------------------------------------------")

-------------------------------------------------------------------------
Original Label Predicted Label Correct/Wrong
-------------------------------------------------------------------------
Iris-versicolor Iris-versicolor Correct
Iris-versicolor Iris-versicolor Correct
Iris-versicolor Iris-versicolor Correct
Iris-versicolor Iris-versicolor Correct
Iris-setosa Iris-setosa Correct
Iris-virginica Iris-virginica Correct
Iris-setosa Iris-setosa Correct
Iris-setosa Iris-setosa Correct
Iris-setosa Iris-setosa Correct
Iris-versicolor Iris-versicolor Correct
Iris-setosa Iris-setosa Correct
Iris-virginica Iris-virginica Correct
Iris-setosa Iris-setosa Correct
Iris-virginica Iris-virginica Correct
Iris-setosa Iris-setosa Correct
-------------------------------------------------------------------------

Confusion Matrix:
 [[7 0 0]
  [0 5 0]
  [0 0 3]]
-------------------------------------------------------------------------

Classification Report:
                  precision    recall  f1-score   support

    Iris-setosa       1.00      1.00      1.00         7
Iris-versicolor       1.00      1.00      1.00         5
 Iris-virginica       1.00      1.00      1.00         3

       accuracy                           1.00        15
      macro avg       1.00      1.00      1.00        15
   weighted avg       1.00      1.00      1.00        15

-------------------------------------------------------------------------
Accuracy of the classifier is 1.00
-------------------------------------------------------------------------
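A perfect 1.00 here rests on a single 15-row test split, so it can be optimistic and varies with the split. A common, more robust way to evaluate and to choose `n_neighbors` is k-fold cross-validation; a sketch using `cross_val_score` on scikit-learn's bundled Iris data:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)
# Mean 5-fold cross-validated accuracy for a few odd values of k
# (odd k is a common heuristic to reduce voting ties).
scores = {k: cross_val_score(KNeighborsClassifier(n_neighbors=k),
                             X, y, cv=5).mean()
          for k in [1, 3, 5, 7, 9]}
for k, score in scores.items():
    print(k, round(score, 3))
```

Each value is an average over five held-out folds, so it is far less sensitive to one lucky split than the single test-set accuracy above.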

In [ ]:

