Lab Assignment 2: Iris Data Preprocessing
file_path="/content/drive/MyDrive/Colab Notebooks/FITT
Odisha/iris_dataset (Day-5).csv"
import pandas as pd

# Load the dataset once, then preview the first 20 and last 30 rows
df = pd.read_csv(file_path)
df.head(20)

df.tail(30)
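The structural summary below was presumably produced by a cell along these lines (a minimal sketch; df.info() prints its report directly and returns None, which is why a trailing "None" appears in the output):
print("Dataset Info:")
print(df.info())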
Dataset Info:
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 150 entries, 0 to 149
Data columns (total 5 columns):
# Column Non-Null Count Dtype
--- ------ -------------- -----
0 sepal_length 150 non-null float64
1 sepal_width 150 non-null float64
2 petal_length 150 non-null float64
3 petal_width 150 non-null float64
4 variety 150 non-null object
dtypes: float64(4), object(1)
memory usage: 6.0+ KB
None
# Inspect the single record at index 10
print(df.loc[10])
sepal_length 5.4
sepal_width 3.7
petal_length 1.5
petal_width 0.2
variety Setosa
Name: 10, dtype: object
# Flag rows that are exact duplicates of an earlier row
dk = df.duplicated()
print(dk.to_string())
0 False
1 False
2 False
3 False
4 False
5 False
6 False
7 False
8 False
9 False
10 False
11 False
12 False
13 False
14 False
15 False
16 False
17 False
18 False
19 False
20 False
21 False
22 False
23 False
24 False
25 False
26 False
27 False
28 False
29 False
30 False
31 False
32 False
33 False
34 False
35 False
36 False
37 False
38 False
39 False
40 False
41 False
42 False
43 False
44 False
45 False
46 False
47 False
48 False
49 False
50 False
51 False
52 False
53 False
54 False
55 False
56 False
57 False
58 False
59 False
60 False
61 False
62 False
63 False
64 False
65 False
66 False
67 False
68 False
69 False
70 False
71 False
72 False
73 False
74 False
75 False
76 False
77 False
78 False
79 False
80 False
81 False
82 False
83 False
84 False
85 False
86 False
87 False
88 False
89 False
90 False
91 False
92 False
93 False
94 False
95 False
96 False
97 False
98 False
99 False
100 False
101 False
102 False
103 False
104 False
105 False
106 False
107 False
108 False
109 False
110 False
111 False
112 False
113 False
114 False
115 False
116 False
117 False
118 False
119 False
120 False
121 False
122 False
123 False
124 False
125 False
126 False
127 False
128 False
129 False
130 False
131 False
132 False
133 False
134 False
135 False
136 False
137 False
138 False
139 False
140 False
141 False
142 True
143 False
144 False
145 False
146 False
147 False
148 False
149 False
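Only row 142 is flagged as a duplicate; its values (5.8, 2.7, 5.1, 1.9, Virginica) repeat those of row 101. The statistical summary and missing-value counts below were presumably produced by a cell along these lines (a minimal sketch):
print("Statistical Summary:")
print(df.describe())
print("Missing Values:")
print(df.isnull().sum())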
Statistical Summary:
sepal_length sepal_width petal_length petal_width
count 150.000000 150.000000 150.000000 150.000000
mean 5.843333 3.057333 3.758000 1.199333
std 0.828066 0.435866 1.765298 0.762238
min 4.300000 2.000000 1.000000 0.100000
25% 5.100000 2.800000 1.600000 0.300000
50% 5.800000 3.000000 4.350000 1.300000
75% 6.400000 3.300000 5.100000 1.800000
max 7.900000 4.400000 6.900000 2.500000
Missing Values:
sepal_length 0
sepal_width 0
petal_length 0
petal_width 0
variety 0
dtype: int64
# Drop rows with missing values (none are present, so all 150 rows are kept)
new_df = df.dropna()
print(new_df.to_string())
sepal_length sepal_width petal_length petal_width variety
0 5.1 3.5 1.4 0.2 Setosa
1 4.9 3.0 1.4 0.2 Setosa
2 4.7 3.2 1.3 0.2 Setosa
3 4.6 3.1 1.5 0.2 Setosa
4 5.0 3.6 1.4 0.2 Setosa
5 5.4 3.9 1.7 0.4 Setosa
6 4.6 3.4 1.4 0.3 Setosa
7 5.0 3.4 1.5 0.2 Setosa
8 4.4 2.9 1.4 0.2 Setosa
9 4.9 3.1 1.5 0.1 Setosa
10 5.4 3.7 1.5 0.2 Setosa
11 4.8 3.4 1.6 0.2 Setosa
12 4.8 3.0 1.4 0.1 Setosa
13 4.3 3.0 1.1 0.1 Setosa
14 5.8 4.0 1.2 0.2 Setosa
15 5.7 4.4 1.5 0.4 Setosa
16 5.4 3.9 1.3 0.4 Setosa
17 5.1 3.5 1.4 0.3 Setosa
18 5.7 3.8 1.7 0.3 Setosa
19 5.1 3.8 1.5 0.3 Setosa
20 5.4 3.4 1.7 0.2 Setosa
21 5.1 3.7 1.5 0.4 Setosa
22 4.6 3.6 1.0 0.2 Setosa
23 5.1 3.3 1.7 0.5 Setosa
24 4.8 3.4 1.9 0.2 Setosa
25 5.0 3.0 1.6 0.2 Setosa
26 5.0 3.4 1.6 0.4 Setosa
27 5.2 3.5 1.5 0.2 Setosa
28 5.2 3.4 1.4 0.2 Setosa
29 4.7 3.2 1.6 0.2 Setosa
30 4.8 3.1 1.6 0.2 Setosa
31 5.4 3.4 1.5 0.4 Setosa
32 5.2 4.1 1.5 0.1 Setosa
33 5.5 4.2 1.4 0.2 Setosa
34 4.9 3.1 1.5 0.2 Setosa
35 5.0 3.2 1.2 0.2 Setosa
36 5.5 3.5 1.3 0.2 Setosa
37 4.9 3.6 1.4 0.1 Setosa
38 4.4 3.0 1.3 0.2 Setosa
39 5.1 3.4 1.5 0.2 Setosa
40 5.0 3.5 1.3 0.3 Setosa
41 4.5 2.3 1.3 0.3 Setosa
42 4.4 3.2 1.3 0.2 Setosa
43 5.0 3.5 1.6 0.6 Setosa
44 5.1 3.8 1.9 0.4 Setosa
45 4.8 3.0 1.4 0.3 Setosa
46 5.1 3.8 1.6 0.2 Setosa
47 4.6 3.2 1.4 0.2 Setosa
48 5.3 3.7 1.5 0.2 Setosa
49 5.0 3.3 1.4 0.2 Setosa
50 7.0 3.2 4.7 1.4 Versicolor
51 6.4 3.2 4.5 1.5 Versicolor
52 6.9 3.1 4.9 1.5 Versicolor
53 5.5 2.3 4.0 1.3 Versicolor
54 6.5 2.8 4.6 1.5 Versicolor
55 5.7 2.8 4.5 1.3 Versicolor
56 6.3 3.3 4.7 1.6 Versicolor
57 4.9 2.4 3.3 1.0 Versicolor
58 6.6 2.9 4.6 1.3 Versicolor
59 5.2 2.7 3.9 1.4 Versicolor
60 5.0 2.0 3.5 1.0 Versicolor
61 5.9 3.0 4.2 1.5 Versicolor
62 6.0 2.2 4.0 1.0 Versicolor
63 6.1 2.9 4.7 1.4 Versicolor
64 5.6 2.9 3.6 1.3 Versicolor
65 6.7 3.1 4.4 1.4 Versicolor
66 5.6 3.0 4.5 1.5 Versicolor
67 5.8 2.7 4.1 1.0 Versicolor
68 6.2 2.2 4.5 1.5 Versicolor
69 5.6 2.5 3.9 1.1 Versicolor
70 5.9 3.2 4.8 1.8 Versicolor
71 6.1 2.8 4.0 1.3 Versicolor
72 6.3 2.5 4.9 1.5 Versicolor
73 6.1 2.8 4.7 1.2 Versicolor
74 6.4 2.9 4.3 1.3 Versicolor
75 6.6 3.0 4.4 1.4 Versicolor
76 6.8 2.8 4.8 1.4 Versicolor
77 6.7 3.0 5.0 1.7 Versicolor
78 6.0 2.9 4.5 1.5 Versicolor
79 5.7 2.6 3.5 1.0 Versicolor
80 5.5 2.4 3.8 1.1 Versicolor
81 5.5 2.4 3.7 1.0 Versicolor
82 5.8 2.7 3.9 1.2 Versicolor
83 6.0 2.7 5.1 1.6 Versicolor
84 5.4 3.0 4.5 1.5 Versicolor
85 6.0 3.4 4.5 1.6 Versicolor
86 6.7 3.1 4.7 1.5 Versicolor
87 6.3 2.3 4.4 1.3 Versicolor
88 5.6 3.0 4.1 1.3 Versicolor
89 5.5 2.5 4.0 1.3 Versicolor
90 5.5 2.6 4.4 1.2 Versicolor
91 6.1 3.0 4.6 1.4 Versicolor
92 5.8 2.6 4.0 1.2 Versicolor
93 5.0 2.3 3.3 1.0 Versicolor
94 5.6 2.7 4.2 1.3 Versicolor
95 5.7 3.0 4.2 1.2 Versicolor
96 5.7 2.9 4.2 1.3 Versicolor
97 6.2 2.9 4.3 1.3 Versicolor
98 5.1 2.5 3.0 1.1 Versicolor
99 5.7 2.8 4.1 1.3 Versicolor
100 6.3 3.3 6.0 2.5 Virginica
101 5.8 2.7 5.1 1.9 Virginica
102 7.1 3.0 5.9 2.1 Virginica
103 6.3 2.9 5.6 1.8 Virginica
104 6.5 3.0 5.8 2.2 Virginica
105 7.6 3.0 6.6 2.1 Virginica
106 4.9 2.5 4.5 1.7 Virginica
107 7.3 2.9 6.3 1.8 Virginica
108 6.7 2.5 5.8 1.8 Virginica
109 7.2 3.6 6.1 2.5 Virginica
110 6.5 3.2 5.1 2.0 Virginica
111 6.4 2.7 5.3 1.9 Virginica
112 6.8 3.0 5.5 2.1 Virginica
113 5.7 2.5 5.0 2.0 Virginica
114 5.8 2.8 5.1 2.4 Virginica
115 6.4 3.2 5.3 2.3 Virginica
116 6.5 3.0 5.5 1.8 Virginica
117 7.7 3.8 6.7 2.2 Virginica
118 7.7 2.6 6.9 2.3 Virginica
119 6.0 2.2 5.0 1.5 Virginica
120 6.9 3.2 5.7 2.3 Virginica
121 5.6 2.8 4.9 2.0 Virginica
122 7.7 2.8 6.7 2.0 Virginica
123 6.3 2.7 4.9 1.8 Virginica
124 6.7 3.3 5.7 2.1 Virginica
125 7.2 3.2 6.0 1.8 Virginica
126 6.2 2.8 4.8 1.8 Virginica
127 6.1 3.0 4.9 1.8 Virginica
128 6.4 2.8 5.6 2.1 Virginica
129 7.2 3.0 5.8 1.6 Virginica
130 7.4 2.8 6.1 1.9 Virginica
131 7.9 3.8 6.4 2.0 Virginica
132 6.4 2.8 5.6 2.2 Virginica
133 6.3 2.8 5.1 1.5 Virginica
134 6.1 2.6 5.6 1.4 Virginica
135 7.7 3.0 6.1 2.3 Virginica
136 6.3 3.4 5.6 2.4 Virginica
137 6.4 3.1 5.5 1.8 Virginica
138 6.0 3.0 4.8 1.8 Virginica
139 6.9 3.1 5.4 2.1 Virginica
140 6.7 3.1 5.6 2.4 Virginica
141 6.9 3.1 5.1 2.3 Virginica
142 5.8 2.7 5.1 1.9 Virginica
143 6.8 3.2 5.9 2.3 Virginica
144 6.7 3.3 5.7 2.5 Virginica
145 6.7 3.0 5.2 2.3 Virginica
146 6.3 2.5 5.0 1.9 Virginica
147 6.5 3.0 5.2 2.0 Virginica
148 6.2 3.4 5.4 2.3 Virginica
149 5.9 3.0 5.1 1.8 Virginica
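Finally, a Random Forest classifier is trained and evaluated on a held-out test set. The cell below is a sketch that assumes the four measurement columns as the feature matrix X and the variety column as the target y, matching the outputs that follow.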
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report, accuracy_score, confusion_matrix
# Separate the features (four measurements) from the target label (variety)
X = df.drop(columns="variety")
y = df["variety"]
# Split the data into training (80%) and testing (20%) sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
# Train a Random Forest classifier and predict on the test set
model = RandomForestClassifier(random_state=42)
model.fit(X_train, y_train)
y_pred = model.predict(X_test)
print("Accuracy:", accuracy_score(y_test, y_pred))
print("Classification Report:")
print(classification_report(y_test, y_pred))
print("Confusion Matrix:")
print(confusion_matrix(y_test, y_pred))
Accuracy: 1.0
Classification Report:
              precision    recall  f1-score   support

      Setosa       1.00      1.00      1.00        10
  Versicolor       1.00      1.00      1.00         9
   Virginica       1.00      1.00      1.00        11

    accuracy                           1.00        30
   macro avg       1.00      1.00      1.00        30
weighted avg       1.00      1.00      1.00        30
Confusion Matrix:
[[10 0 0]
[ 0 9 0]
[ 0 0 11]]
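The diagonal confusion matrix confirms that all 30 test samples (10 Setosa, 9 Versicolor, 11 Virginica) are classified correctly, consistent with the reported accuracy of 1.0.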