
Ex No:

K-NN ALGORITHM

Aim:
To write a Python program to implement the K-Nearest Neighbour algorithm for the given data set.

Algorithm:
Step 1: Start the process
Step 2: Load the necessary libraries such as pandas, numpy, and sklearn.
Step 3: Load the abalone dataset into Google Colab.
Step 4: Fetch the necessary features from the dataset.
Step 5: Split the dataset into a training set and a test set.
Step 6: Apply the K-NN algorithm to the abalone dataset (a short conceptual sketch of the neighbour-voting idea follows these steps).
Step 7: Calculate the accuracy for different numbers of neighbors.
Step 8: Display the result.
Step 9: Stop the process.
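
Note: the steps above only list the procedure; the underlying idea is that K-NN assigns a query point the majority label among its k nearest training points. A minimal conceptual sketch with made-up 2-D points (illustrative only, not the abalone data used below) is:

# Conceptual sketch of K-NN with made-up 2-D points (illustrative only)
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Four hand-made training points belonging to two classes, 'A' and 'B'
points = np.array([[1.0, 1.0], [1.5, 1.2], [5.0, 5.0], [5.5, 4.8]])
labels = np.array(['A', 'A', 'B', 'B'])

knn = KNeighborsClassifier(n_neighbors=3)
knn.fit(points, labels)

# The query point (1.2, 1.1) lies next to the two 'A' points, so the
# majority vote among its 3 nearest neighbours is 'A'
print(knn.predict([[1.2, 1.1]]))     # -> ['A']
print(knn.kneighbors([[1.2, 1.1]]))  # distances and indices of the 3 nearest points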

Coding:
#Import the necessary libraries
import numpy as np
import pandas as pd
import csv
import seaborn as sns
import matplotlib.pyplot as plt
from sklearn.metrics import confusion_matrix
from sklearn.datasets import make_blobs
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

#Load the dataset into Colab from Google Drive
df = pd.read_csv("/content/drive/MyDrive/ml_lab/abalone.csv")
df

# Generate a synthetic blob dataset (these X and y are replaced by the abalone features below)
X, y = make_blobs(n_samples=500, n_features=2, centers=4, cluster_std=1.5, random_state=4)

# Separate the features and the target column
X = df.drop(['Sex'], axis=1)
X

y = df['Sex']
y

#Split the dataset
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Create a classifier for each value of k and fit it on the training data
knn1 = KNeighborsClassifier(n_neighbors=1)
knn3 = KNeighborsClassifier(n_neighbors=3)
knn4 = KNeighborsClassifier(n_neighbors=4)
knn5 = KNeighborsClassifier(n_neighbors=5)
knn5.fit(X_train, y_train)
knn3.fit(X_train, y_train)
knn1.fit(X_train, y_train)
knn4.fit(X_train, y_train)

# Predict the test labels with each classifier
y_pred_5 = knn5.predict(X_test)
y_pred_3 = knn3.predict(X_test)
y_pred_1 = knn1.predict(X_test)
y_pred_4 = knn4.predict(X_test)

from sklearn.metrics import accuracy_score

test_accuracy_5 = accuracy_score(y_test, y_pred_5) * 100
print("Accuracy with k=5", accuracy_score(y_test, y_pred_5) * 100)
print("Accuracy with k=3", accuracy_score(y_test, y_pred_3) * 100)
print("Accuracy with k=1", accuracy_score(y_test, y_pred_1) * 100)
print("Accuracy with k=4", accuracy_score(y_test, y_pred_4) * 100)
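
The four separate classifiers above could also be replaced by a single loop over candidate values of k, which matches Step 7 of the algorithm. A possible sketch, assuming the X_train, X_test, y_train, y_test created by the split above:

# Possible alternative: sweep several values of k in one loop (sketch, assumes the split above)
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

for k in range(1, 11):
    model = KNeighborsClassifier(n_neighbors=k)
    model.fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test)) * 100
    print(f"Accuracy with k={k}: {acc:.2f}%")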

from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import confusion_matrix

# Initialize and train the KNN classifier
knn = KNeighborsClassifier(n_neighbors=3)
knn.fit(X_train, y_train)

# Make predictions on the test data
y_pred = knn.predict(X_test)

# Calculate the confusion matrix
cm = confusion_matrix(y_test, y_pred)

print("Confusion Matrix:")
print(cm)

#Plot the confusion matrix as a heatmap
cm = confusion_matrix(y_test, y_pred)
plt.figure(figsize=(8, 6))
sns.heatmap(cm, annot=True, fmt="d", cmap="Blues")
plt.xlabel("Predicted")
plt.ylabel("Actual")
plt.title("Confusion Matrix")
plt.show()
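
Because K-NN classifies by distance, features with larger numeric ranges can dominate the neighbour search. The code above uses the raw abalone features; if feature scaling is wanted, one possible sketch using a scikit-learn pipeline (a suggestion, not part of the code above) is:

# Optional sketch: standardise the features before K-NN (suggestion, not part of the code above)
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

scaled_knn = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=3))
scaled_knn.fit(X_train, y_train)
print("Accuracy with scaling, k=3:", accuracy_score(y_test, scaled_knn.predict(X_test)) * 100)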
Result:
Thus the Python code to implement the K-Nearest Neighbour algorithm for the given dataset has been executed successfully.
