Python Decision tree

The document contains a Python script that utilizes the Iris dataset to train a decision tree classifier. It demonstrates how to load the dataset, remove specific samples, train the classifier, and make predictions on the removed samples. Additionally, it visualizes the decision tree structure generated by the classifier.

Uploaded by

revanthd2416


5/9/25, 12:15 AM Untitled0.ipynb - Colab

import pandas as pd
import numpy as np
from sklearn.datasets import load_iris
from sklearn import tree
iris=load_iris()
print(iris.feature_names)
print(iris.target_names)

['sepal length (cm)', 'sepal width (cm)', 'petal length (cm)', 'petal width (cm)']
['setosa' 'versicolor' 'virginica']

# Splitting the dataset: hold out one sample of each class

removed = [0, 50, 100]  # indices of one setosa, one versicolor, one virginica
new_target = np.delete(iris.target, removed)
new_data = np.delete(iris.data, removed, axis=0)

# Train classifier
clf = tree.DecisionTreeClassifier()  # define the decision tree classifier
clf = clf.fit(new_data, new_target)  # fit on the remaining 147 samples
prediction = clf.predict(iris.data[removed])  # predict the held-out samples
print("Original Labels", iris.target[removed])
print("Labels Predicted", prediction)

Original Labels [0 1 2]
Labels Predicted [0 1 2]
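Deleting three rows by index is a manual hold-out split. The same idea is more commonly written with scikit-learn's `train_test_split`; the sketch below is illustrative (the 20% test size and `random_state` values are arbitrary choices, not part of the original script):

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

iris = load_iris()
# Hold out 20% of the rows, stratified so each class is represented
X_train, X_test, y_train, y_test = train_test_split(
    iris.data, iris.target, test_size=0.2, stratify=iris.target, random_state=0)

clf = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)
print("Hold-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

A larger, stratified test set gives a more reliable accuracy estimate than predicting only three hand-picked rows.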

tree.plot_tree(clf)

[Decision tree plot: the root splits on x[2] (petal length) <= 2.45, isolating all 49 setosa samples; the right branch splits on x[3] (petal width) <= 1.75 to separate versicolor from virginica, with a few deeper splits on petal length, petal width, and sepal width resolving the remaining mixed leaves.]
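The same learned structure can also be inspected as plain text with `tree.export_text`, which avoids rendering a plot. A self-contained sketch (retraining the classifier exactly as in the script above):

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text

iris = load_iris()
removed = [0, 50, 100]
new_data = np.delete(iris.data, removed, axis=0)
new_target = np.delete(iris.target, removed)

clf = DecisionTreeClassifier(random_state=0).fit(new_data, new_target)
# Print the learned splits with real feature names instead of x[i]
print(export_text(clf, feature_names=iris.feature_names))
```

This prints an indented rule list (e.g. `petal length (cm) <= 2.45`), which is often easier to read than the raw `Text(...)` objects returned by `plot_tree`.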

https://colab.research.google.com/drive/1Un49-MBH_KoP2aX5MTlpUAleJFQsIgIB#printMode=true 1/2
