
EAST WEST UNIVERSITY

COMPUTER SCIENCE AND ENGINEERING

Fall 2020

CSE475

Assignment 1

Submitted to:

Md. Mahmudur Rahman

Department of Computer Science and Engineering.

East West University.

Submitted by:

Shishir Zaman
ID: 2017-2-60-141

Date of submission: 28.10.2020


Source code for finding the regression parameters of a linear regression model using the iterative method (batch gradient descent):
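
For reference, each pass of the loop in the code below applies the batch gradient-descent update for the squared-error loss. Writing the prediction as ŷ_i = w0 + w1·x_i1 + w2·x_i2 + w3·x_i3 + w4·x_i4, every weight is moved against its accumulated error gradient:

    w_j ← w_j − α · Σ_{i=1..n} (ŷ_i − y_i) · x_ij,    taking x_i0 = 1 for the bias term

Matching the code, the raw sum of errors is used (no 1/n averaging), with learning rate α = 0.001.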

import pandas as pd
import numpy as np

# Creating dataset D2
x1 = [3, 2, 1, 3]
x2 = [2, 5, 3, 5]
x3 = [1, 3, 5, 7]
x4 = [4, 7, 7, 7]
y = [3, 1, 7, 5]

data = pd.DataFrame({
    "x1": x1,
    "x2": x2,
    "x3": x3,
    "x4": x4,
    "y": y
})

# Step 1: assume initial values for w0, w1, w2, w3, w4.
w0 = 0.5
w1 = 0.5
w2 = 0.5
w3 = 0.5
w4 = 0.5
alpha = 0.001    # learning rate
iteration = 25   # number of gradient-descent passes

# Iterative weight updates (batch gradient descent)
k = 0
predict_w0 = []
predict_w1 = []
predict_w2 = []
predict_w3 = []
predict_w4 = []
while k < iteration:
    # Accumulate the error gradient over every row of the dataset
    pw0 = 0
    pw1 = 0
    pw2 = 0
    pw3 = 0
    pw4 = 0
    for i in range(len(data)):
        p_y = w0 + w1*data['x1'][i] + w2*data['x2'][i] + w3*data['x3'][i] + w4*data['x4'][i]
        pw0 += (p_y - data['y'][i])
        pw1 += (p_y - data['y'][i]) * data['x1'][i]
        pw2 += (p_y - data['y'][i]) * data['x2'][i]
        pw3 += (p_y - data['y'][i]) * data['x3'][i]
        pw4 += (p_y - data['y'][i]) * data['x4'][i]

    # Update each weight once per full pass over the data
    w0 = w0 - alpha*pw0
    w1 = w1 - alpha*pw1
    w2 = w2 - alpha*pw2
    w3 = w3 - alpha*pw3
    w4 = w4 - alpha*pw4

    # Record the weights after this iteration
    predict_w0.append(w0)
    predict_w1.append(w1)
    predict_w2.append(w2)
    predict_w3.append(w3)
    predict_w4.append(w4)

    k += 1

# Collect the recorded weights into a pandas DataFrame
weights = pd.DataFrame({
    'W0': predict_w0,
    'W1': predict_w1,
    'W2': predict_w2,
    'W3': predict_w3,
    'W4': predict_w4
})
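
The tables in the result section below are cumulative prints of this weights DataFrame (presumably produced by calls like weights.head(5), weights.head(10), and so on; the print statements themselves are not shown). Since rows are 0-indexed, the weights after 5, 10, 15, 20 and 25 iterations sit at rows 4, 9, 14, 19 and 24. A minimal way to pull out just those snapshots:

# Rows 4, 9, 14, 19 and 24 hold the weights after 5, 10, 15, 20 and 25 iterations.
print(weights.iloc[[4, 9, 14, 19, 24]])
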
Result:

Five iterations results:

          W0        W1        W2        W3        W4
 0  0.481500  0.455500  0.418000  0.420500  0.378000
 1  0.469027  0.424349  0.360361  0.368323  0.295535
 2  0.460644  0.402265  0.319259  0.334670  0.239899
 3  0.455040  0.386344  0.289386  0.313569  0.202472
 4  0.451323  0.374615  0.267140  0.300968  0.177402

Ten iterations results:

          W0        W1        W2        W3        W4
 0  0.481500  0.455500  0.418000  0.420500  0.378000
 1  0.469027  0.424349  0.360361  0.368323  0.295535
 2  0.460644  0.402265  0.319259  0.334670  0.239899
 3  0.455040  0.386344  0.289386  0.313569  0.202472
 4  0.451323  0.374615  0.267140  0.300968  0.177402
 5  0.448889  0.365741  0.250076  0.294119  0.160721
 6  0.447327  0.358813  0.236535  0.291156  0.149735
 7  0.446358  0.353216  0.225393  0.290814  0.142616
 8  0.445794  0.348532  0.215885  0.292235  0.138124
 9  0.445507  0.344476  0.207493  0.294836  0.135418

Fifteen iterations results:

          W0        W1        W2        W3        W4
 0  0.481500  0.455500  0.418000  0.420500  0.378000
 1  0.469027  0.424349  0.360361  0.368323  0.295535
 2  0.460644  0.402265  0.319259  0.334670  0.239899
 3  0.455040  0.386344  0.289386  0.313569  0.202472
 4  0.451323  0.374615  0.267140  0.300968  0.177402
 5  0.448889  0.365741  0.250076  0.294119  0.160721
 6  0.447327  0.358813  0.236535  0.291156  0.149735
 7  0.446358  0.353216  0.225393  0.290814  0.142616
 8  0.445794  0.348532  0.215885  0.292235  0.138124
 9  0.445507  0.344476  0.207493  0.294836  0.135418
10  0.445410  0.340857  0.199866  0.298223  0.133927
11  0.445443  0.337544  0.192764  0.302127  0.133262
12  0.445566  0.334447  0.186026  0.306366  0.133161
13  0.445752  0.331506  0.179543  0.310818  0.133446
14  0.445983  0.328680  0.173238  0.315400  0.133996


Twenty iterations results:

          W0        W1        W2        W3        W4
 0  0.481500  0.455500  0.418000  0.420500  0.378000
 1  0.469027  0.424349  0.360361  0.368323  0.295535
 2  0.460644  0.402265  0.319259  0.334670  0.239899
 3  0.455040  0.386344  0.289386  0.313569  0.202472
 4  0.451323  0.374615  0.267140  0.300968  0.177402
 5  0.448889  0.365741  0.250076  0.294119  0.160721
 6  0.447327  0.358813  0.236535  0.291156  0.149735
 7  0.446358  0.353216  0.225393  0.290814  0.142616
 8  0.445794  0.348532  0.215885  0.292235  0.138124
 9  0.445507  0.344476  0.207493  0.294836  0.135418
10  0.445410  0.340857  0.199866  0.298223  0.133927
11  0.445443  0.337544  0.192764  0.302127  0.133262
12  0.445566  0.334447  0.186026  0.306366  0.133161
13  0.445752  0.331506  0.179543  0.310818  0.133446
14  0.445983  0.328680  0.173238  0.315400  0.133996
15  0.446246  0.325942  0.167062  0.320054  0.134727
16  0.446533  0.323272  0.160980  0.324743  0.135584
17  0.446837  0.320657  0.154968  0.329442  0.136530
18  0.447155  0.318088  0.149011  0.334131  0.137538
19  0.447484  0.315560  0.143098  0.338801  0.138591

Twenty-five iterations results:

          W0        W1        W2        W3        W4
 0  0.481500  0.455500  0.418000  0.420500  0.378000
 1  0.469027  0.424349  0.360361  0.368323  0.295535
 2  0.460644  0.402265  0.319259  0.334670  0.239899
 3  0.455040  0.386344  0.289386  0.313569  0.202472
 4  0.451323  0.374615  0.267140  0.300968  0.177402
 5  0.448889  0.365741  0.250076  0.294119  0.160721
 6  0.447327  0.358813  0.236535  0.291156  0.149735
 7  0.446358  0.353216  0.225393  0.290814  0.142616
 8  0.445794  0.348532  0.215885  0.292235  0.138124
 9  0.445507  0.344476  0.207493  0.294836  0.135418
10  0.445410  0.340857  0.199866  0.298223  0.133927
11  0.445443  0.337544  0.192764  0.302127  0.133262
12  0.445566  0.334447  0.186026  0.306366  0.133161
13  0.445752  0.331506  0.179543  0.310818  0.133446
14  0.445983  0.328680  0.173238  0.315400  0.133996
15  0.446246  0.325942  0.167062  0.320054  0.134727
16  0.446533  0.323272  0.160980  0.324743  0.135584
17  0.446837  0.320657  0.154968  0.329442  0.136530
18  0.447155  0.318088  0.149011  0.334131  0.137538
19  0.447484  0.315560  0.143098  0.338801  0.138591
20  0.447822  0.313067  0.137221  0.343444  0.139678
21  0.448168  0.310608  0.131376  0.348053  0.140789
22  0.448520  0.308179  0.125558  0.352627  0.141919
23  0.448879  0.305780  0.119766  0.357162  0.143065
24  0.449244  0.303408  0.113998  0.361657  0.144224
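
As a sanity check (an addition, not part of the submitted code), the iterative weights can be compared against a direct least-squares solve of the same system. Note that with 4 samples and 5 parameters the problem is underdetermined, so np.linalg.lstsq returns the minimum-norm least-squares solution; gradient descent started from all weights at 0.5 need not converge to exactly that point, but both should drive the training error toward zero.

import numpy as np

# Design matrix with a leading column of ones for the intercept w0;
# each row is one sample [1, x1, x2, x3, x4] of dataset D2.
X = np.array([[1, 3, 2, 1, 4],
              [1, 2, 5, 3, 7],
              [1, 1, 3, 5, 7],
              [1, 3, 5, 7, 7]], dtype=float)
y = np.array([3, 1, 7, 5], dtype=float)

# Minimum-norm least-squares solution; rcond=None handles the rank-deficient case.
w, residuals, rank, sv = np.linalg.lstsq(X, y, rcond=None)
print(w)  # [w0, w1, w2, w3, w4]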
