
NNHOUSEPRICEDATA.ipynb - Colab

import pandas as pd
df = pd.read_csv('housepricedata.csv')
df.head()

   LotArea  OverallQual  OverallCond  TotalBsmtSF  FullBath  HalfBath  BedroomAbvGr  TotRmsAbvGrd  Fireplaces  GarageAre
0     8450            7            5          856         2         1             3             8           0         54
1     9600            6            8         1262         2         0             3             6           1         46
2    11250            7            5          920         2         1             3             6           1         60
3     9550            7            5          756         1         0             3             7           1         64
4    14260            8            5         1145         2         1             4             9           1         83
(rightmost column is cut off at the page margin)
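
Before slicing the frame by column position in the next cell, a quick structural check can confirm the column count; a minimal sketch (not in the original notebook), assuming df is loaded as above:

# Confirm the number of columns before positional slicing into features and target.
print(df.shape)              # (rows, columns); the slicing below assumes 11 columns
print(df.columns.tolist())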

dataset = df.values

# The first 10 columns are the input features; the 11th column is the target label.
X = dataset[:,0:10]
Y = dataset[:,10]

from sklearn import preprocessing

# Rescale every feature to the [0, 1] range.
min_max_scaler = preprocessing.MinMaxScaler()
X_scale = min_max_scaler.fit_transform(X)

X_scale

array([[0.0334198 , 0.66666667, 0.5       , ..., 0.5       , 0.        ,
        0.3864598 ],
       [0.03879502, 0.55555556, 0.875     , ..., 0.33333333, 0.33333333,
        0.32440056],
       [0.04650728, 0.66666667, 0.5       , ..., 0.33333333, 0.33333333,
        0.42877292],
       ...,
       [0.03618687, 0.66666667, 1.        , ..., 0.58333333, 0.66666667,
        0.17771509],
       [0.03934189, 0.44444444, 0.625     , ..., 0.25      , 0.        ,
        0.16925247],
       [0.04037019, 0.44444444, 0.625     , ..., 0.33333333, 0.        ,
        0.19464034]])
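
To see what MinMaxScaler did, the first scaled LotArea value can be recomputed by hand; a minimal check (not in the original notebook), assuming df and X_scale are still in scope:

# MinMaxScaler maps each feature as (x - x_min) / (x_max - x_min) over its column.
lot = df['LotArea']
manual = (lot.iloc[0] - lot.min()) / (lot.max() - lot.min())
print(manual, X_scale[0, 0])   # the two values should match (~0.0334)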

from sklearn.model_selection import train_test_split

# Hold out 30% of the data, then split that half-and-half into validation and
# test sets, giving roughly a 70/15/15 train/validation/test split.
X_train, X_val_and_test, Y_train, Y_val_and_test = train_test_split(X_scale, Y, test_size=0.3)
X_val, X_test, Y_val, Y_test = train_test_split(X_val_and_test, Y_val_and_test, test_size=0.5)
print(X_train.shape, X_val.shape, X_test.shape, Y_train.shape, Y_val.shape, Y_test.shape)


(1022, 10) (219, 10) (219, 10) (1022,) (219,) (219,)
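
The split above is unseeded, so the exact rows chosen differ on every run; a variant (not in the original notebook) with a fixed, arbitrary random_state makes it reproducible:

# Same 70/15/15 split, but seeded so results are repeatable across runs.
X_train, X_val_and_test, Y_train, Y_val_and_test = train_test_split(
    X_scale, Y, test_size=0.3, random_state=42)
X_val, X_test, Y_val, Y_test = train_test_split(
    X_val_and_test, Y_val_and_test, test_size=0.5, random_state=42)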

from keras.models import Sequential   # if this import fails, use tensorflow.keras.models instead
from keras.layers import Dense        # likewise tensorflow.keras.layers

# Two hidden layers of 32 ReLU units and a single sigmoid output for binary classification.
model = Sequential([
    Dense(32, activation='relu', input_shape=(10,)),
    Dense(32, activation='relu'),
    Dense(1, activation='sigmoid'),
])

model.compile(optimizer='sgd',
              loss='binary_crossentropy',
              metrics=['accuracy'])
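
Before fitting, model.summary() can confirm the layer shapes; a small check (not in the original notebook), where each Dense layer contributes inputs*units + units parameters:

# 10*32 + 32 = 352, 32*32 + 32 = 1056, 32*1 + 1 = 33  ->  1441 trainable parameters.
model.summary()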

hist = model.fit(X_train, Y_train,
                 batch_size=32, epochs=100,
                 validation_data=(X_val, Y_val))

model.evaluate(X_test, Y_test)[1]

Epoch 1/100
32/32 [==============================] - 1s 10ms/step - loss: 0.6924 - accuracy: 0.5157 - val_loss: 0.6898 - val_accuracy: 0.5845
Epoch 2/100
32/32 [==============================] - 0s 3ms/step - loss: 0.6836 - accuracy: 0.6614 - val_loss: 0.6856 - val_accuracy: 0.5982
Epoch 3/100
32/32 [==============================] - 0s 3ms/step - loss: 0.6765 - accuracy: 0.6595 - val_loss: 0.6818 - val_accuracy: 0.5890
Epoch 4/100
32/32 [==============================] - 0s 3ms/step - loss: 0.6701 - accuracy: 0.6027 - val_loss: 0.6776 - val_accuracy: 0.6119
Epoch 5/100
32/32 [==============================] - 0s 3ms/step - loss: 0.6643 - accuracy: 0.6742 - val_loss: 0.6742 - val_accuracy: 0.5982
Epoch 6/100
32/32 [==============================] - 0s 5ms/step - loss: 0.6588 - accuracy: 0.6517 - val_loss: 0.6704 - val_accuracy: 0.6164
Epoch 7/100
32/32 [==============================] - 0s 7ms/step - loss: 0.6534 - accuracy: 0.6751 - val_loss: 0.6662 - val_accuracy: 0.6164
Epoch 8/100
32/32 [==============================] - 0s 5ms/step - loss: 0.6477 - accuracy: 0.6986 - val_loss: 0.6620 - val_accuracy: 0.6347
Epoch 9/100
32/32 [==============================] - 0s 4ms/step - loss: 0.6420 - accuracy: 0.7133 - val_loss: 0.6571 - val_accuracy: 0.6393
Epoch 10/100
32/32 [==============================] - 0s 4ms/step - loss: 0.6360 - accuracy: 0.7299 - val_loss: 0.6519 - val_accuracy: 0.6484
Epoch 11/100
32/32 [==============================] - 0s 4ms/step - loss: 0.6297 - accuracy: 0.7436 - val_loss: 0.6466 - val_accuracy: 0.6667
Epoch 12/100
32/32 [==============================] - 0s 6ms/step - loss: 0.6232 - accuracy: 0.7456 - val_loss: 0.6405 - val_accuracy: 0.6667
Epoch 13/100
32/32 [==============================] - 0s 5ms/step - loss: 0.6166 - accuracy: 0.7524 - val_loss: 0.6345 - val_accuracy: 0.6804
Epoch 14/100
32/32 [==============================] - 0s 4ms/step - loss: 0.6099 - accuracy: 0.7769 - val_loss: 0.6286 - val_accuracy: 0.6941
Epoch 15/100
32/32 [==============================] - 0s 4ms/step - loss: 0.6028 - accuracy: 0.7818 - val_loss: 0.6222 - val_accuracy: 0.6986
Epoch 16/100
32/32 [==============================] - 0s 5ms/step - loss: 0.5955 - accuracy: 0.8033 - val_loss: 0.6160 - val_accuracy: 0.7032
Epoch 17/100
32/32 [==============================] - 0s 5ms/step - loss: 0.5881 - accuracy: 0.8023 - val_loss: 0.6092 - val_accuracy: 0.7169
Epoch 18/100
32/32 [==============================] - 0s 4ms/step - loss: 0.5802 - accuracy: 0.8151 - val_loss: 0.6021 - val_accuracy: 0.7397
Epoch 19/100
32/32 [==============================] - 0s 6ms/step - loss: 0.5722 - accuracy: 0.8180 - val_loss: 0.5943 - val_accuracy: 0.7671

Epoch 20/100
32/32 [==============================] - 0s 6ms/step - loss: 0.5637 - accuracy: 0.8239 - val_loss: 0.5863 - val_accuracy: 0.7717
Epoch 21/100
32/32 [==============================] - 0s 4ms/step - loss: 0.5549 - accuracy: 0.8405 - val_loss: 0.5789 - val_accuracy: 0.7900
Epoch 22/100
32/32 [==============================] - 0s 4ms/step - loss: 0.5461 - accuracy: 0.8386 - val_loss: 0.5703 - val_accuracy: 0.7991
Epoch 23/100
32/32 [==============================] - 0s 5ms/step - loss: 0.5371 - accuracy: 0.8454 - val_loss: 0.5622 - val_accuracy: 0.8037
Epoch 24/100
32/32 [==============================] - 0s 6ms/step - loss: 0.5276 - accuracy: 0.8474 - val_loss: 0.5538 - val_accuracy: 0.8037
Epoch 25/100
32/32 [==============================] - 0s 5ms/step - loss: 0.5184 - accuracy: 0.8503 - val_loss: 0.5453 - val_accuracy: 0.8174
Epoch 26/100
32/32 [==============================] - 0s 3ms/step - loss: 0.5087 - accuracy: 0.8493 - val_loss: 0.5357 - val_accuracy: 0.8356
Epoch 27/100
32/32 [==============================] - 0s 3ms/step - loss: 0.4993 - accuracy: 0.8562 - val_loss: 0.5284 - val_accuracy: 0.8311
Epoch 28/100
32/32 [==============================] - 0s 3ms/step - loss: 0.4896 - accuracy: 0.8581 - val_loss: 0.5211 - val_accuracy: 0.8265
Epoch 29/100
(remaining training log truncated)
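
A common follow-up, not part of the original cells: plotting the accuracy curves that model.fit recorded in hist.history (the key names match the log above) to see how training and validation accuracy evolved over the 100 epochs.

import matplotlib.pyplot as plt

# Plot training vs. validation accuracy per epoch from the recorded history.
plt.plot(hist.history['accuracy'], label='train accuracy')
plt.plot(hist.history['val_accuracy'], label='val accuracy')
plt.xlabel('epoch')
plt.legend()
plt.show()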
