
ML LAB REPORT

LAB NO 3
Name: Muhammad Huzaifa

Reg No.: 21-CP-46

Section: Omega

Department: Computer Engineering

Submitted to: Sir Shehryar Khan

Due Date: 27 Jan 2024
Task no 1:
Download the Lab 03 Pre_tasks folder from Teams, upload the notebook to Google Colab along with the required CSV file, then run and analyze the code and write your understanding.

Output:
Explanation:

In this task we fetch and plot data from different files and learn how to load data from different file formats. At the end there is a problem statement in which we have to fetch the data and train a model to predict the value for the year 2020 using the dataset provided in the manual.
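
A minimal sketch of that final problem statement is shown below. The file name data.csv and the column names year and value are assumptions for illustration, since the exact dataset from the manual is not reproduced in this report; they would need to be replaced with the uploaded CSV and its actual columns.

import pandas as pd
import matplotlib.pyplot as plt
from sklearn.linear_model import LinearRegression

df = pd.read_csv("data.csv")          # assumed file name; replace with the uploaded CSV
plt.scatter(df["year"], df["value"])  # assumed column names
plt.xlabel("year")
plt.ylabel("value")
plt.show()

# Fit a simple linear model on the yearly data and predict the value for 2020
model = LinearRegression()
model.fit(df[["year"]], df["value"])
prediction = model.predict(pd.DataFrame({"year": [2020]}))
print("Predicted value for 2020:", prediction[0])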

Task no 2:

Download the 3_gradient_descent zip file from Teams and extract it. Open the files and the Exercise folder, run the code, and write your understanding.

Explanation:
The function now plots the regression line at each iteration with plt.plot(x, y_predicted, color='green',
alpha=0.1) to show the gradual convergence.

After the loop, the final regression line is plotted in blue.

The legend is added for clarity.

The final values of m and b are printed.
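
Since only the explanation is reproduced here, the sketch below is a rough reconstruction of the modified exercise code under assumed data and hyperparameters (not the exact values from the zip file). It shows the behaviour described above: faint green intermediate lines, a blue final line, a legend, and the final m and b printed.

import numpy as np
import matplotlib.pyplot as plt

x = np.array([1, 2, 3, 4, 5], dtype=float)   # assumed example data
y = np.array([5, 7, 9, 11, 13], dtype=float)

m, b = 0.0, 0.0
learning_rate = 0.01   # assumed hyperparameters
n = len(x)

for i in range(1000):
    y_predicted = m * x + b
    # Plot each intermediate line faintly so the gradual convergence is visible
    plt.plot(x, y_predicted, color='green', alpha=0.1)
    dm = -(2 / n) * np.sum(x * (y - y_predicted))  # gradient of MSE w.r.t. m
    db = -(2 / n) * np.sum(y - y_predicted)        # gradient of MSE w.r.t. b
    m -= learning_rate * dm
    b -= learning_rate * db

plt.scatter(x, y, color='red', label='data')
plt.plot(x, m * x + b, color='blue', label='final regression line')
plt.legend()
plt.show()
print("Final m:", m, "Final b:", b)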

Task no 3:
T3: Modify the Example 1 code to handle multivariate linear regression. Update the synthetic dataset
to include multiple features.
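
In the code below, the coefficient vector theta (an intercept plus one weight per feature) is updated with the standard batch gradient descent rule for the mean squared error cost, which in matrix form is

$$J(\theta) = \frac{1}{2m}\sum_{i=1}^{m}\bigl(x^{(i)}\theta - y^{(i)}\bigr)^{2}, \qquad \theta \leftarrow \theta - \frac{\alpha}{m}\, X^{\top}(X\theta - y)$$

where X is the feature matrix with a leading column of ones, y the target vector, m the number of samples, and alpha the learning rate.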

Code:

import numpy as np

def gradient_descent(X, y, learning_rate, num_iterations):
    # Initialize coefficients
    theta = np.zeros(X.shape[1])
    m = len(y)

    for iteration in range(num_iterations):
        # Calculate predictions
        predictions = np.dot(X, theta)

        # Calculate errors
        errors = predictions - y

        # Update coefficients
        gradient = np.dot(X.T, errors) / m
        theta -= learning_rate * gradient

        # Calculate and print the cost
        cost = np.sum(errors ** 2) / (2 * m)
        print(f"Iteration {iteration + 1}/{num_iterations}, Cost: {cost}")

    return theta

# Usage example for multivariate linear regression
np.random.seed(42)  # for reproducibility

# Generate a synthetic dataset with two features
X1 = np.random.rand(100, 1)  # Example feature 1
X2 = np.random.rand(100, 1)  # Example feature 2
X = np.c_[X1, X2]  # Combine features into a matrix

# Create a linear relationship with noise
y = 2 * X1 + 3 * X2 + 1 + 0.1 * np.random.randn(100, 1)

# Add a column of ones to X for the intercept term
X_b = np.c_[np.ones((100, 1)), X]

# Flatten y to a 1-D array
y = y.T[0]

# Apply gradient descent
learning_rate = 0.01
num_iterations = 100
theta = gradient_descent(X_b, y, learning_rate, num_iterations)
print("Final Coefficients:", theta)

# Calculate and print the final cost using the trained model
final_predictions = np.dot(X_b, theta)
final_cost = np.sum((final_predictions - y) ** 2) / (2 * len(y))
print("Final Cost:", final_cost)

Output:
Task no 4:

T4: Modify the Example 1 code to implement stochastic gradient descent. Update the algorithm to update the coefficients based on a single randomly chosen instance at each iteration. (Explore at your end.)
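
Instead of averaging the gradient over all m samples, stochastic gradient descent updates theta from a single randomly chosen instance (x^{(i)}, y^{(i)}) at a time:

$$\theta \leftarrow \theta - \alpha\, x^{(i)\top}\bigl(x^{(i)}\theta - y^{(i)}\bigr)$$

This per-instance rule is what the inner loop of the code below applies after shuffling the data on each pass.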

Code:

import numpy as np

def stochastic_gradient_descent(X, y, learning_rate, num_iterations):
    # Initialize coefficients
    theta = np.zeros(X.shape[1])
    m = len(y)

    for iteration in range(num_iterations):
        # Randomly shuffle the data
        shuffled_indices = np.random.permutation(m)
        X_shuffled = X[shuffled_indices]
        y_shuffled = y[shuffled_indices]

        for i in range(m):
            # Select a single instance
            xi = X_shuffled[i:i+1]
            yi = y_shuffled[i:i+1]

            # Calculate prediction and error for the selected instance
            prediction = np.dot(xi, theta)
            error = prediction - yi

            # Update coefficients based on the selected instance
            gradient = np.dot(xi.T, error)
            theta -= learning_rate * gradient

        # Calculate and print the cost using the current model
        predictions = np.dot(X, theta)
        errors = predictions - y
        cost = np.sum(errors ** 2) / (2 * m)
        print(f"Iteration {iteration + 1}/{num_iterations}, Cost: {cost}")

    return theta

# Usage example for stochastic gradient descent
np.random.seed(42)  # for reproducibility

# Generate a synthetic dataset with two features
X1 = np.random.rand(100, 1)  # Example feature 1
X2 = np.random.rand(100, 1)  # Example feature 2
X = np.c_[X1, X2]  # Combine features into a matrix

# Create a linear relationship with noise
y = 2 * X1 + 3 * X2 + 1 + 0.1 * np.random.randn(100, 1)

# Add a column of ones to X for the intercept term
X_b = np.c_[np.ones((100, 1)), X]

# Flatten y to a 1-D array
y = y.T[0]

# Apply stochastic gradient descent
learning_rate = 0.01
num_iterations = 100
theta_sgd = stochastic_gradient_descent(X_b, y, learning_rate, num_iterations)
print("Final Coefficients (SGD):", theta_sgd)

# Calculate and print the final cost using the trained model
final_predictions_sgd = np.dot(X_b, theta_sgd)
final_cost_sgd = np.sum((final_predictions_sgd - y) ** 2) / (2 * len(y))
print("Final Cost (SGD):", final_cost_sgd)
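
Because each update uses only one instance, the cost printed per iteration fluctuates more than in Task 3. A common refinement, not required by the task, is to decay the learning rate across passes so the noisy updates settle down near the minimum; a minimal sketch with assumed constants:

def learning_rate_schedule(iteration, initial_lr=0.05, decay=0.01):
    # Inverse-decay schedule: the rate shrinks as the iteration count grows.
    # initial_lr and decay are assumed values, not taken from the lab manual.
    return initial_lr / (1 + decay * iteration)

# Example: inside the outer loop of stochastic_gradient_descent, one could use
# learning_rate_schedule(iteration) in place of the fixed learning_rate.
print([round(learning_rate_schedule(i), 4) for i in (0, 10, 50, 99)])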

Output:
Task no 5:
Run all the codes in the lab manual.

Output:
EXAMPLE 1:

Iteration 1/100, Cost: 2.142795252297297

Iteration 2/100, Cost: 2.090238952060629

Iteration 3/100, Cost: 2.0389998966125247

Iteration 4/100, Cost: 1.9890450391890846

Iteration 5/100, Cost: 1.9403421621358274

Iteration 6/100, Cost: 1.892859856106142

Iteration 7/100, Cost: 1.846567499781623

Iteration 8/100, Cost: 1.801435240101203

Iteration 9/100, Cost: 1.7574339729863255

Iteration 10/100, Cost: 1.7145353245496946

Iteration 11/100, Cost: 1.6727116327754852

Iteration 12/100, Cost: 1.6319359296591762

Iteration 13/100, Cost: 1.5921819237954802

Iteration 14/100, Cost: 1.5534239834031238

Iteration 15/100, Cost: 1.5156371197755178

Iteration 16/100, Cost: 1.4787969711466389

Iteration 17/100, Cost: 1.4428797869616983

Iteration 18/100, Cost: 1.4078624125424426

Iteration 19/100, Cost: 1.3737222741371888

Iteration 20/100, Cost: 1.3404373643459424

Iteration 21/100, Cost: 1.3079862279111782

Iteration 22/100, Cost: 1.276347947865125

Iteration 23/100, Cost: 1.245502132024594

Iteration 24/100, Cost: 1.2154288998246456

Iteration 25/100, Cost: 1.1861088694825814

Iteration 26/100, Cost: 1.1575231454839825

Iteration 27/100, Cost: 1.129653306382709

Iteration 28/100, Cost: 1.102481392906988

Iteration 29/100, Cost: 1.0759898963639065

Iteration 30/100, Cost: 1.0501617473348246

Iteration 31/100, Cost: 1.0249803046544101

Iteration 32/100, Cost: 1.0004293446661807

Iteration 33/100, Cost: 0.9764930507476095

Iteration 34/100, Cost: 0.9531560030980387

Iteration 35/100, Cost: 0.9304031687828062

Iteration 36/100, Cost: 0.9082198920271513

Iteration 37/100, Cost: 0.8865918847536394

Iteration 38/100, Cost: 0.8655052173569949

Iteration 39/100, Cost: 0.8449463097103805

Iteration 40/100, Cost: 0.8249019223973241

Iteration 41/100, Cost: 0.8053591481636275

Iteration 42/100, Cost: 0.7863054035837365

Iteration 43/100, Cost: 0.767728420936196

Iteration 44/100, Cost: 0.7496162402829403

Iteration 45/100, Cost: 0.7319572017473077

Iteration 46/100, Cost: 0.7147399379857917

Iteration 47/100, Cost: 0.6979533668486692

Iteration 48/100, Cost: 0.6815866842247642

Iteration 49/100, Cost: 0.6656293570657345

Iteration 50/100, Cost: 0.6500711165853652

Iteration 51/100, Cost: 0.6349019516294924

Iteration 52/100, Cost: 0.6201121022122641

Iteration 53/100, Cost: 0.6056920532145734

Iteration 54/100, Cost: 0.5916325282405884

Iteration 55/100, Cost: 0.5779244836284204

Iteration 56/100, Cost: 0.5645591026110539

Iteration 57/100, Cost: 0.5515277896237765

Iteration 58/100, Cost: 0.5388221647544293

Iteration 59/100, Cost: 0.526434058332894

Iteration 60/100, Cost: 0.514355505656325

Iteration 61/100, Cost: 0.5025787418467219

Iteration 62/100, Cost: 0.49109619683751743

Iteration 63/100, Cost: 0.47990049048595035

Iteration 64/100, Cost: 0.46898442780806215

Iteration 65/100, Cost: 0.45834099433324355

Iteration 66/100, Cost: 0.447963351575331

Iteration 67/100, Cost: 0.4378448326173293

Iteration 68/100, Cost: 0.4279789378069087

Iteration 69/100, Cost: 0.41835933055989843

Iteration 70/100, Cost: 0.40897983326906556

Iteration 71/100, Cost: 0.3998344233155401

Iteration 72/100, Cost: 0.39091722918030847

Iteration 73/100, Cost: 0.3822225266532655

Iteration 74/100, Cost: 0.37374473513737966

Iteration 75/100, Cost: 0.36547841404558035

Iteration 76/100, Cost: 0.3574182592880469

Iteration 77/100, Cost: 0.34955909984762484

Iteration 78/100, Cost: 0.34189589444116486

Iteration 79/100, Cost: 0.3344237282646229

Iteration 80/100, Cost: 0.32713780981982477

Iteration 81/100, Cost: 0.32003346782084174

Iteration 82/100, Cost: 0.31310614817798454

Iteration 83/100, Cost: 0.3063514110574651

Iteration 84/100, Cost: 0.2997649280148275

Iteration 85/100, Cost: 0.2933424792003002

Iteration 86/100, Cost: 0.2870799506342623

Iteration 87/100, Cost: 0.28097333155106535

Iteration 88/100, Cost: 0.275018711809497

Iteration 89/100, Cost: 0.269212279368213

Iteration 90/100, Cost: 0.2635503178245087

Iteration 91/100, Cost: 0.25802920401483953

Iteration 92/100, Cost: 0.25264540567554333

Iteration 93/100, Cost: 0.2473954791622514

Iteration 94/100, Cost: 0.2422760672265186

Iteration 95/100, Cost: 0.23728389684823534

Iteration 96/100, Cost: 0.23241577712242173

Iteration 97/100, Cost: 0.2276685971990407

Iteration 98/100, Cost: 0.22303932427449857

Iteration 99/100, Cost: 0.21852500163353755

Iteration 100/100, Cost: 0.21412274674025508

Final Coefficients: [1.12381709 0.67641994]

Final Cost: 0.20982974937701704
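
The cost falls from about 2.14 to 0.21 over 100 iterations and is still decreasing, so the model has not fully converged at this learning rate. A small sketch for visualizing the convergence curve, assuming the per-iteration costs are collected into a list while training:

import matplotlib.pyplot as plt

# `costs` is assumed to hold the per-iteration cost values printed above.
costs = [2.1428, 2.0902, 2.0390]  # first few values shown; collect all 100 in practice
plt.plot(range(1, len(costs) + 1), costs)
plt.xlabel("Iteration")
plt.ylabel("Cost")
plt.title("Example 1: gradient descent convergence")
plt.show()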

…………………………………THE END……………………………………………
