Linear Regression Using TensorFlow

This document describes using TensorFlow to perform linear regression on randomly generated data. Fifty data points are generated with random noise added, and the data is plotted. Placeholders and variables are defined for X, Y, the weight, and the bias, and a training loop runs 1500 epochs of gradient descent to minimize the mean squared error cost. The fitted line is then plotted against the original data, the R2 value is computed, predictions are made for sample X values, and exercises are given to further explore the model.


Lab5_LinearRegression

August 14, 2019

In [1]: import numpy as np
import tensorflow as tf
import matplotlib.pyplot as plt
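
Note: this notebook uses the TensorFlow 1.x API (tf.placeholder, tf.Session, tf.train.GradientDescentOptimizer), which is no longer in the default namespace in TensorFlow 2.x. If only TF 2.x is installed, one way to run the cells below unchanged (a sketch, assuming TF >= 2.0) is to import the v1 compatibility layer instead:

# Assumption: TensorFlow 2.x is installed; this restores graph mode,
# placeholders, and Sessions so the TF1-style cells below run as written.
import tensorflow.compat.v1 as tf
tf.disable_v2_behavior()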

In [2]: # Generating random linear data
# There will be 50 data points ranging from 0 to 50
x = np.linspace(0, 50, 50)
y = np.linspace(0, 50, 50)

# Adding noise to the random linear data
x += np.random.uniform(-4, 4, 50)
y += np.random.uniform(-4, 4, 50)

n = len(x) # Number of data points
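
Both the data and the initial parameters below are drawn at random, so every run prints slightly different numbers. To make a run reproducible (an optional sketch, not part of the original lab; the seed value is arbitrary), fix NumPy's seed before generating the data:

np.random.seed(101)  # makes x, y and the initial W, b reproducible across runs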

In [3]: # Plot of Training Data
plt.scatter(x, y, color='blue')
plt.xlabel('x')
plt.ylabel('y')
plt.title("Training Data")
plt.show()

In [4]: X = tf.placeholder(tf.float32)
Y = tf.placeholder(tf.float32)

In [5]: W = tf.Variable(np.random.randn(), name="W")
b = tf.Variable(np.random.randn(), name="b")

In [6]: learning_rate = 0.01
training_epochs = 1500

In [8]: # Hypothesis
y_pred = tf.add(tf.multiply(X, W), b)

# Mean Squared Error Cost Function
cost = tf.div(tf.reduce_sum(tf.square(y_pred - Y)), (2 * n))

# Gradient Descent Optimizer
optimizer = tf.train.GradientDescentOptimizer(learning_rate).minimize(cost)

# Global Variables Initializer
init = tf.global_variables_initializer()
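
The cost above is the halved mean squared error J(W, b) = (1 / (2n)) * sum_i (W*x_i + b - y_i)^2, and gradient descent on J converges to the ordinary least-squares line. As a sanity check on the values the training loop should approach (a sketch using NumPy's closed-form fit, not part of the lab):

# Closed-form least-squares fit: polyfit returns [slope, intercept] for degree 1
W_ls, b_ls = np.polyfit(x, y, 1)
print("closed-form W =", W_ls, "b =", b_ls)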

In [9]: # Starting the TensorFlow Session
with tf.Session() as sess:
    merged = tf.summary.merge_all()
    writer = tf.summary.FileWriter("logs", sess.graph)

    # Initializing the Variables
    sess.run(init)

    # Iterating through all the epochs
    for epoch in range(training_epochs):

        # Feeding each data point into the optimizer using the Feed Dictionary
        for (_x, _y) in zip(x, y):
            sess.run(optimizer, feed_dict={X: _x, Y: _y})

        # Displaying the result after every 50 epochs
        if (epoch + 1) % 50 == 0:
            # Calculating the cost at the current epoch
            c = sess.run(cost, feed_dict={X: x, Y: y})
            print("Epoch", (epoch + 1), ": cost =", c, "W =", sess.run(W), "b =", sess.run(b))

    # Storing necessary values to be used outside the Session
    training_cost = sess.run(cost, feed_dict={X: x, Y: y})
    weight = sess.run(W)
    bias = sess.run(b)

Epoch 50 : cost = 7.154084 W = 0.98666245 b = 2.1450686
Epoch 100 : cost = 6.886928 W = 0.9919713 b = 1.8934373
Epoch 150 : cost = 6.6649146 W = 0.99672794 b = 1.6679817
Epoch 200 : cost = 6.479929 W = 1.0009897 b = 1.4659839
Epoch 250 : cost = 6.325364 W = 1.0048081 b = 1.2849983
Epoch 300 : cost = 6.195856 W = 1.0082291 b = 1.1228449
Epoch 350 : cost = 6.087027 W = 1.0112944 b = 0.97755986
Epoch 400 : cost = 5.9953055 W = 1.0140407 b = 0.84739035
Epoch 450 : cost = 5.9177637 W = 1.0165012 b = 0.73076344
Epoch 500 : cost = 5.852022 W = 1.0187058 b = 0.6262703
Epoch 550 : cost = 5.796106 W = 1.020681 b = 0.53264767
Epoch 600 : cost = 5.748413 W = 1.0224508 b = 0.44876537
Epoch 650 : cost = 5.7076144 W = 1.0240365 b = 0.37361065
Epoch 700 : cost = 5.6726007 W = 1.0254571 b = 0.30627447
Epoch 750 : cost = 5.6424737 W = 1.02673 b = 0.24594435
Epoch 800 : cost = 5.6164756 W = 1.0278703 b = 0.19189104
Epoch 850 : cost = 5.593986 W = 1.028892 b = 0.14346147
Epoch 900 : cost = 5.57448 W = 1.0298076 b = 0.10007
Epoch 950 : cost = 5.557519 W = 1.0306278 b = 0.061192915
Epoch 1000 : cost = 5.542733 W = 1.0313627 b = 0.026360514
Epoch 1050 : cost = 5.529823 W = 1.0320212 b = -0.0048481696
Epoch 1100 : cost = 5.5185184 W = 1.032611 b = -0.032809887
Epoch 1150 : cost = 5.508606 W = 1.0331396 b = -0.057862528
Epoch 1200 : cost = 5.4998994 W = 1.0336132 b = -0.08030887
Epoch 1250 : cost = 5.4922333 W = 1.0340375 b = -0.100419946
Epoch 1300 : cost = 5.4854755 W = 1.0344176 b = -0.118438624
Epoch 1350 : cost = 5.4795103 W = 1.0347582 b = -0.13458282
Epoch 1400 : cost = 5.4742365 W = 1.0350634 b = -0.14904751
Epoch 1450 : cost = 5.4695716 W = 1.0353369 b = -0.16200735
Epoch 1500 : cost = 5.4654326 W = 1.0355817 b = -0.17361827
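
Note that the inner loop above feeds one (x, y) pair per optimizer step, so each epoch performs n stochastic-style updates. Since X and Y are placeholders with no fixed shape, a batch variant (a sketch, not the lab's method) would feed the full arrays once per epoch inside the same Session, replacing the two nested loops:

for epoch in range(training_epochs):
    # one gradient step per epoch over the whole dataset
    sess.run(optimizer, feed_dict={X: x, Y: y})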

In [10]: # Calculating the predictions
predictions = weight * x + bias
print("Training cost =", training_cost, "Weight =", weight, "bias =", bias, '\n')

Training cost = 5.4654326 Weight = 1.0355817 bias = -0.17361827

In [11]: # Plotting the Results
plt.plot(x, y, 'o', color='blue', label='Original data')
plt.plot(x, predictions, color='red', label='Fitted line')
plt.title('Linear Regression Result')
plt.legend()
plt.show()

In [12]: from sklearn.metrics import r2_score
R2 = r2_score(y, predictions, multioutput='variance_weighted')
print("R2 value:", R2)

R2 value: 0.9529184904819711
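
For a single 1-D target like this one, r2_score reduces to R2 = 1 - SS_res / SS_tot (the multioutput argument has no effect), which can be verified directly with NumPy (a sketch):

ss_res = np.sum((y - predictions) ** 2)   # residual sum of squares
ss_tot = np.sum((y - np.mean(y)) ** 2)    # total sum of squares
print("R2 (manual):", 1 - ss_res / ss_tot)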

Exercises:

1. Predict the Y value for X = [3.987, 19.235, 23.098, 36.5, 22.765]. (A starter sketch follows this list.)

2. Change the stopping criterion from a fixed number of epochs to the change in the cost function: stop training when delta(J) <= 0.000001. (See the sketch after this list.)

3. Train your model with different alpha values (0.05, 0.1, 0.2 and 0.5) and describe the behaviour you observe.

4. Perform the Linear Regression on some other dataset.
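
A starter sketch for Exercise 1, reusing the weight and bias stored after training (the X values are those given in the exercise):

X_new = np.array([3.987, 19.235, 23.098, 36.5, 22.765])
Y_new = weight * X_new + bias  # same hypothesis y = W*x + b with the learned parameters
print("Predicted Y:", Y_new)

And one possible shape for Exercise 2, stopping when the cost changes by at most 0.000001 between epochs (a sketch of the modified training loop; prev_cost is a name introduced here):

prev_cost = float("inf")
for epoch in range(training_epochs):
    for (_x, _y) in zip(x, y):
        sess.run(optimizer, feed_dict={X: _x, Y: _y})
    c = sess.run(cost, feed_dict={X: x, Y: y})
    if abs(prev_cost - c) <= 0.000001:  # delta(J) threshold from the exercise
        print("Converged at epoch", epoch + 1)
        break
    prev_cost = c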
