Stress Level Detection

Import the libraries needed for data cleaning, pre-processing (scaling), visualization (for correlation and training curves), and model selection/building.

In [1]: import pandas as pd
import numpy as np
from sklearn.preprocessing import StandardScaler
import tensorflow as tf
import tensorflow.keras as k
from sklearn.model_selection import train_test_split
from matplotlib import pyplot as plt
import time

WARNING:tensorflow:From C:\Users\SIMRAN\anaconda3\lib\site-packages\keras\src\losses.py:2976: The name tf.losses.sparse_softmax_cross_entropy is deprecated. Please use tf.compat.v1.losses.sparse_softmax_cross_entropy instead.

Reading the training dataset


In [2]: df = pd.read_csv('train.csv')

In [3]: df.head()

Out[3]:
      MEAN_RR   MEDIAN_RR        SDRR      RMSSD       SDSD  SDRR_RMSSD         HR
0  885.157845  853.763730  140.972741  15.554505  15.553371    9.063146  69.499952
1  939.425371  948.357865   81.317742  12.964439  12.964195    6.272369  64.363150
2  898.186047  907.006860   84.497236  16.305279  16.305274    5.182201  67.450066
3  881.757864  893.460030   90.370537  15.720468  15.720068    5.748591  68.809562
4  809.625331  811.184865   62.766242  19.213819  19.213657    3.266724  74.565728

5 rows × 36 columns (output truncated in the export; only the first 7 of the 36 columns are shown)

condition 0: No stress; condition 1: Stress; condition 2: High stress
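A quick sanity check (an addition, not part of the original run) is to look at how the three conditions are distributed in the training data before selecting features:

# Count samples per stress condition (0, 1, 2) in the training data
print(df['condition'].value_counts().sort_index())
# A heavily imbalanced split would suggest class weights or stratified splitting.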

Selecting label and features from dataset

In [4]: label = ['condition']


features = ['MEAN_RR','RMSSD','pNN25','pNN50','LF','HF','LF_HF']

Scaling the features ensures that they are all on the same scale, which is important for stable training and good model performance.


In [5]: scaler = StandardScaler()
scaler.fit(df[features])
df[features] = scaler.transform(df[features])
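To confirm the scaling behaved as intended, one could verify that each standardized feature now has roughly zero mean and unit standard deviation (a small check added here, not in the original notebook):

# Each standardized feature should show mean ~0 and std ~1
print(df[features].describe().loc[['mean', 'std']].round(3))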

In [6]: X = df[features].values
y = df[label].values

In [7]: y

Out[7]: array([[0],
[1],
[1],
...,
[0],
[0],
[0]], dtype=int64)

Splitting the dataset into training and test sets

In [8]: X_train,X_test,y_train,y_test = train_test_split(X, y, test_size=0.3, random_state=42)  # the random_state value was cut off in the export; 42 is a placeholder

In [9]: print(X_train.shape,y_train.shape,X_test.shape,y_test.shape)

(258502, 7) (258502, 1) (110787, 7) (110787, 1)

In [10]: y_train = k.utils.to_categorical(y_train)
y_test = k.utils.to_categorical(y_test)
# to_categorical one-hot encodes the integer labels for multi-class classification

In [11]: y_train

Out[11]: array([[0., 1., 0.],
       [0., 0., 1.],
       [1., 0., 0.],
       ...,
       [1., 0., 0.],
       [1., 0., 0.],
       [0., 1., 0.]], dtype=float32)
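As an aside (not part of the original notebook), np.argmax inverts this one-hot encoding; that is how the class index is recovered from the model's softmax output later on:

import numpy as np
from tensorflow.keras.utils import to_categorical

one_hot = to_categorical([0, 1, 2])      # 3 classes -> 3 one-hot columns
print(one_hot)                           # [[1,0,0],[0,1,0],[0,0,1]] as float32
print(np.argmax(one_hot, axis=1))        # recovers the integer labels [0 1 2]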

Creating Model
In [12]: from tensorflow.keras import Sequential
from tensorflow.keras.layers import Dense


In [13]: model = Sequential()
model.add(Dense(50, activation='relu', kernel_initializer='he_normal', input_shape=(7,)))  # input argument truncated in the export; 7 matches the number of selected features
model.add(Dense(20, activation='relu', kernel_initializer='he_normal'))
model.add(Dense(10, activation='relu', kernel_initializer='he_normal'))
model.add(Dense(3, activation='softmax'))

WARNING:tensorflow:From C:\Users\SIMRAN\anaconda3\lib\site-packages\keras\src\backend.py:873: The name tf.get_default_graph is deprecated. Please use tf.compat.v1.get_default_graph instead.

In [14]: model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])

WARNING:tensorflow:From C:\Users\SIMRAN\anaconda3\lib\site-packages\keras\src\optimizers\__init__.py:309: The name tf.train.Optimizer is deprecated. Please use tf.compat.v1.train.Optimizer instead.


In [15]: history = model.fit(X_train, y_train, validation_data=(X_test, y_test), epochs=50, batch_size=1024)  # epochs=50 matches the log below; batch_size=1024 is inferred from the 253 steps per epoch and may differ from the original


Epoch 1/50
WARNING:tensorflow:From C:\Users\SIMRAN\anaconda3\lib\site-packages\keras\src\utils\tf_utils.py:492: The name tf.ragged.RaggedTensorValue is deprecated. Please use tf.compat.v1.ragged.RaggedTensorValue instead.

WARNING:tensorflow:From C:\Users\SIMRAN\anaconda3\lib\site-packages\keras\src\engine\base_layer_utils.py:384: The name tf.executing_eagerly_outside_functions is deprecated. Please use tf.compat.v1.executing_eagerly_outside_functions instead.

253/253 [==============================] - 3s 6ms/step - loss: 0.8276 - accuracy: 0.6327 - val_loss: 0.7018 - val_accuracy: 0.7042
Epoch 2/50
253/253 [==============================] - 1s 4ms/step - loss: 0.6325 - accuracy: 0.7329 - val_loss: 0.5767 - val_accuracy: 0.7545
Epoch 3/50
253/253 [==============================] - 1s 5ms/step - loss: 0.5340 - accuracy: 0.7800 - val_loss: 0.5041 - val_accuracy: 0.7933
Epoch 4/50
253/253 [==============================] - 1s 5ms/step - loss: 0.4708 - accuracy: 0.8100 - val_loss: 0.4510 - val_accuracy: 0.8178
Epoch 5/50
253/253 [==============================] - 1s 5ms/step - loss: 0.4265 - accuracy: 0.8321 - val_loss: 0.4126 - val_accuracy: 0.8373
Epoch 6/50
253/253 [==============================] - 1s 5ms/step - loss: 0.3898 - accuracy: 0.8479 - val_loss: 0.3779 - val_accuracy: 0.8519
Epoch 7/50
253/253 [==============================] - 1s 5ms/step - loss: 0.3592 - accuracy: 0.8623 - val_loss: 0.3491 - val_accuracy: 0.8677
Epoch 8/50
253/253 [==============================] - 1s 5ms/step - loss: 0.3353 - accuracy: 0.8733 - val_loss: 0.3279 - val_accuracy: 0.8777
Epoch 9/50
253/253 [==============================] - 1s 5ms/step - loss: 0.3150 - accuracy: 0.8817 - val_loss: 0.3109 - val_accuracy: 0.8817
Epoch 10/50
253/253 [==============================] - 1s 5ms/step - loss: 0.2968 - accuracy: 0.8885 - val_loss: 0.2907 - val_accuracy: 0.8896
Epoch 11/50
253/253 [==============================] - 2s 6ms/step - loss: 0.2812 - accuracy: 0.8953 - val_loss: 0.2762 - val_accuracy: 0.8953
Epoch 12/50
253/253 [==============================] - 1s 6ms/step - loss: 0.2674 - accuracy: 0.9001 - val_loss: 0.2655 - val_accuracy: 0.9013
Epoch 13/50
253/253 [==============================] - 1s 5ms/step - loss: 0.2565 - accuracy: 0.9045 - val_loss: 0.2526 - val_accuracy: 0.9068
Epoch 14/50
253/253 [==============================] - 1s 5ms/step - loss: 0.2454 - accuracy: 0.9087 - val_loss: 0.2478 - val_accuracy: 0.9074
Epoch 15/50
253/253 [==============================] - 1s 5ms/step - loss: 0.2354 - accuracy: 0.9119 - val_loss: 0.2334 - val_accuracy: 0.9107
Epoch 16/50
253/253 [==============================] - 1s 5ms/step - loss: 0.2266 - accuracy: 0.9154 - val_loss: 0.2245 - val_accuracy: 0.9156
Epoch 17/50
253/253 [==============================] - 1s 5ms/step - loss: 0.2193 - accuracy: 0.9179 - val_loss: 0.2177 - val_accuracy: 0.9180
Epoch 18/50
253/253 [==============================] - 1s 5ms/step - loss: 0.2118 - accuracy: 0.9209 - val_loss: 0.2109 - val_accuracy: 0.9201
Epoch 19/50
253/253 [==============================] - 1s 5ms/step - loss: 0.2059 - accuracy: 0.9228 - val_loss: 0.2079 - val_accuracy: 0.9238
Epoch 20/50
253/253 [==============================] - 1s 6ms/step - loss: 0.2007 - accuracy: 0.9252 - val_loss: 0.2007 - val_accuracy: 0.9261
Epoch 21/50
253/253 [==============================] - 1s 5ms/step - loss: 0.1954 - accuracy: 0.9274 - val_loss: 0.1959 - val_accuracy: 0.9266
Epoch 22/50
253/253 [==============================] - 1s 5ms/step - loss: 0.1902 - accuracy: 0.9302 - val_loss: 0.1952 - val_accuracy: 0.9259
Epoch 23/50
253/253 [==============================] - 1s 6ms/step - loss: 0.1850 - accuracy: 0.9322 - val_loss: 0.1853 - val_accuracy: 0.9322
Epoch 24/50
253/253 [==============================] - 1s 5ms/step - loss: 0.1814 - accuracy: 0.9334 - val_loss: 0.1876 - val_accuracy: 0.9285
Epoch 25/50
253/253 [==============================] - 1s 5ms/step - loss: 0.1770 - accuracy: 0.9357 - val_loss: 0.1763 - val_accuracy: 0.9363
Epoch 26/50
253/253 [==============================] - 1s 5ms/step - loss: 0.1727 - accuracy: 0.9372 - val_loss: 0.1758 - val_accuracy: 0.9364
Epoch 27/50
253/253 [==============================] - 1s 5ms/step - loss: 0.1696 - accuracy: 0.9384 - val_loss: 0.1698 - val_accuracy: 0.9383
Epoch 28/50
253/253 [==============================] - 1s 5ms/step - loss: 0.1650 - accuracy: 0.9402 - val_loss: 0.1670 - val_accuracy: 0.9401
Epoch 29/50
253/253 [==============================] - 1s 6ms/step - loss: 0.1627 - accuracy: 0.9413 - val_loss: 0.1634 - val_accuracy: 0.9396
Epoch 30/50
253/253 [==============================] - 1s 5ms/step - loss: 0.1591 - accuracy: 0.9422 - val_loss: 0.1606 - val_accuracy: 0.9416
Epoch 31/50
253/253 [==============================] - 1s 5ms/step - loss: 0.1560 - accuracy: 0.9432 - val_loss: 0.1628 - val_accuracy: 0.9405
Epoch 32/50
253/253 [==============================] - 1s 5ms/step - loss: 0.1534 - accuracy: 0.9442 - val_loss: 0.1536 - val_accuracy: 0.9457
Epoch 33/50
253/253 [==============================] - 1s 6ms/step - loss: 0.1503 - accuracy: 0.9454 - val_loss: 0.1546 - val_accuracy: 0.9439
Epoch 34/50
253/253 [==============================] - 1s 6ms/step - loss: 0.1487 - accuracy: 0.9460 - val_loss: 0.1499 - val_accuracy: 0.9455
Epoch 35/50
253/253 [==============================] - 1s 5ms/step - loss: 0.1451 - accuracy: 0.9472 - val_loss: 0.1481 - val_accuracy: 0.9458
Epoch 36/50
253/253 [==============================] - 1s 5ms/step - loss: 0.1430 - accuracy: 0.9480 - val_loss: 0.1449 - val_accuracy: 0.9479
Epoch 37/50
253/253 [==============================] - 1s 5ms/step - loss: 0.1404 - accuracy: 0.9497 - val_loss: 0.1489 - val_accuracy: 0.9449
Epoch 38/50
253/253 [==============================] - 1s 5ms/step - loss: 0.1387 - accuracy: 0.9496 - val_loss: 0.1441 - val_accuracy: 0.9470
Epoch 39/50
253/253 [==============================] - 1s 5ms/step - loss: 0.1366 - accuracy: 0.9502 - val_loss: 0.1367 - val_accuracy: 0.9509
Epoch 40/50
253/253 [==============================] - 1s 5ms/step - loss: 0.1340 - accuracy: 0.9517 - val_loss: 0.1377 - val_accuracy: 0.9496
Epoch 41/50
253/253 [==============================] - 2s 6ms/step - loss: 0.1333 - accuracy: 0.9513 - val_loss: 0.1350 - val_accuracy: 0.9498
Epoch 42/50
253/253 [==============================] - 1s 5ms/step - loss: 0.1302 - accuracy: 0.9529 - val_loss: 0.1290 - val_accuracy: 0.9534
Epoch 43/50
253/253 [==============================] - 1s 5ms/step - loss: 0.1286 - accuracy: 0.9530 - val_loss: 0.1334 - val_accuracy: 0.9517
Epoch 44/50
253/253 [==============================] - 1s 5ms/step - loss: 0.1276 - accuracy: 0.9537 - val_loss: 0.1286 - val_accuracy: 0.9536
Epoch 45/50
253/253 [==============================] - 1s 6ms/step - loss: 0.1250 - accuracy: 0.9550 - val_loss: 0.1301 - val_accuracy: 0.9529
Epoch 46/50
253/253 [==============================] - 1s 6ms/step - loss: 0.1237 - accuracy: 0.9556 - val_loss: 0.1234 - val_accuracy: 0.9549
Epoch 47/50
253/253 [==============================] - 1s 5ms/step - loss: 0.1216 - accuracy: 0.9557 - val_loss: 0.1250 - val_accuracy: 0.9545
Epoch 48/50
253/253 [==============================] - 1s 5ms/step - loss: 0.1202 - accuracy: 0.9566 - val_loss: 0.1262 - val_accuracy: 0.9542
Epoch 49/50
253/253 [==============================] - 1s 5ms/step - loss: 0.1185 - accuracy: 0.9574 - val_loss: 0.1246 - val_accuracy: 0.9541
Epoch 50/50
253/253 [==============================] - 1s 5ms/step - loss: 0.1176 - accuracy: 0.9577 - val_loss: 0.1287 - val_accuracy: 0.9523

We reach a training accuracy of 95.77% and a validation accuracy of 95.23%.
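These numbers can also be read programmatically from the History object returned by fit (a small addition, assuming the history variable from the cell above):

# Final-epoch metrics from the Keras History object
print(f"train accuracy: {history.history['accuracy'][-1]:.4f}")
print(f"val accuracy:   {history.history['val_accuracy'][-1]:.4f}")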


In [16]: pd.DataFrame(history.history).plot(figsize=(10,7))

Out[16]: <Axes: >

[Figure: training/validation loss and accuracy curves over the 50 epochs]

We can see above that both the training and validation loss decrease steadily, with no signs of overfitting.
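For a closer look, the loss and accuracy curves could be drawn on separate axes (a sketch added here, not part of the original run, using the pandas and matplotlib imports from the first cell):

# Plot loss and accuracy curves side by side for easier comparison
hist = pd.DataFrame(history.history)
fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(12, 4))
hist[['loss', 'val_loss']].plot(ax=ax1, title='Loss per epoch')
hist[['accuracy', 'val_accuracy']].plot(ax=ax2, title='Accuracy per epoch')
plt.show()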

Testing on the test set
In [17]: df_test = pd.read_csv('test.csv')

In [18]: df_test[features] = scaler.transform(df_test[features])
df_test[features].head()

Out[18]:
    MEAN_RR     RMSSD     pNN25     pNN50        LF        HF     LF_HF
0 -1.001159 -0.634891 -0.598837 -0.874583 -0.575814  0.602911 -0.295775
1 -0.024971  1.048686  1.361573 -0.672601  1.080403 -0.280746 -0.157544
2  0.897836  1.544671  1.743894  0.943254  1.965161 -0.511482  0.037412
3 -0.175046 -0.777935 -0.623241 -0.335965 -0.767443 -0.477196 -0.241658
4 -0.721825 -0.393071 -0.476820 -0.201310 -0.735541 -0.089133 -0.280126

In [19]: X = df_test[features].values
y = df_test[label].values


In [20]: y = k.utils.to_categorical(y)

In [21]: loss,acc = model.evaluate(X,y,verbose=1)

1283/1283 [==============================] - 2s 2ms/step - loss: 0.1251 - accuracy: 0.9534

We can see that the accuracy on the test set is 95.34%.
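Beyond a single accuracy number, a confusion matrix and per-class report would show how the three conditions are confused with each other; a minimal sketch (an addition to the notebook), assuming the X and y test arrays defined above:

from sklearn.metrics import classification_report, confusion_matrix

y_pred_prob = model.predict(X)                # softmax probabilities, shape (n, 3)
y_pred_cls = np.argmax(y_pred_prob, axis=1)   # predicted class index per row
y_true_cls = np.argmax(y, axis=1)             # undo the one-hot encoding
print(confusion_matrix(y_true_cls, y_pred_cls))
print(classification_report(y_true_cls, y_pred_cls, digits=4))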

Final Pipeline for prediction


In [22]: data = pd.read_csv('test.csv')

In [24]: t = scaler.transform(data[features].iloc[5201].values.reshape(1,-1))
print(t)

[[-1.53713909  0.87542554  0.56439365  1.21256363  0.66568458  2.7555339  -0.29899026]]

C:\Users\SIMRAN\anaconda3\lib\site-packages\sklearn\base.py:420: UserWarning: X does not have valid feature names, but StandardScaler was fitted with feature names
  warnings.warn(

In [25]: y_pred = model.predict(t)

1/1 [==============================] - 0s 100ms/step

In [26]: print(np.argmax(y_pred[0]))
print(data[label].iloc[5201])

1
condition 1
Name: 5201, dtype: int64

The predicted condition is 1, i.e. Stress.

Let's check other samples.

In [27]: t = scaler.transform(data[features].iloc[5545].values.reshape(1,-1))
print(t)

[[-0.7699341   0.25391942  0.11699709  0.60661783  0.54497884  0.50787441 -0.26537627]]

C:\Users\SIMRAN\anaconda3\lib\site-packages\sklearn\base.py:420: UserWarning: X does not have valid feature names, but StandardScaler was fitted with feature names
  warnings.warn(


In [28]: y_pred = model.predict(t)

1/1 [==============================] - 0s 22ms/step

In [29]: print(np.argmax(y_pred[0]))
print(data[label].iloc[5545])

0
condition 0
Name: 5545, dtype: int64

The predicted condition is 0, i.e. No Stress.

In [30]: t = scaler.transform(data[features].iloc[41032].values.reshape(1,-1))
print(t)
y_pred = model.predict(t)
print(np.argmax(y_pred[0]))
print(data[label].iloc[41032])

[[ 0.17166695 -0.77559868 -0.74525811 -0.87458304 -0.45972564 -0.64973772 -0.1273582 ]]
1/1 [==============================] - 0s 24ms/step
2
condition 2
Name: 41032, dtype: int64

C:\Users\SIMRAN\anaconda3\lib\site-packages\sklearn\base.py:420: UserWarning: X does not have valid feature names, but StandardScaler was fitted with feature names
  warnings.warn(

The predicted condition is 2, i.e. High Stress.
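The scale-then-predict steps above could be wrapped into a small helper so that a single raw feature row maps directly to a label; a sketch (not in the original notebook), assuming the scaler, model, features, and data objects defined earlier, with label names following this notebook's reading of the conditions:

# Map class indices to the condition names used in this notebook
CONDITION_NAMES = {0: 'No Stress', 1: 'Stress', 2: 'High Stress'}

def predict_condition(row):
    """Scale one raw feature row and return the predicted condition name."""
    scaled = scaler.transform(row[features].values.reshape(1, -1))
    probs = model.predict(scaled, verbose=0)
    return CONDITION_NAMES[int(np.argmax(probs[0]))]

print(predict_condition(data.iloc[5201]))   # expected: Stress, matching the cell above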
