Assignment 1 Utkarsh - Jupyter Notebook

This document is a Jupyter Notebook containing code and explanations for an ML assignment. It includes: 1) Taylor series approximations of order 1, 2, and 3 for the function 1+sin(x) around the point 3π/4, with plots showing that the approximations converge to the true function as the order increases. 2) Code to plot the loss surface contour for the function exp(-1/3*x^3 + x - y^2) and to calculate and plot the gradient at sample points. 3) Equations for the gradient of the loss function J = exp(-1/3*w1^3 + w1 - w2^2), and the finding that setting the gradient to zero yields w1 = ±1 and w2 = 0.

Uploaded by

Deepankar Singh

20/10/2020 Assignment_1 - Jupyter Notebook

ML Assignment 1 Question 1

For Order 1

$$F(x) = 1 + \sin\left(\tfrac{3\pi}{4}\right) + \cos\left(\tfrac{3\pi}{4}\right)\left(x - \tfrac{3\pi}{4}\right)$$

For Order 2

$$F(x) = 1 + \sin\left(\tfrac{3\pi}{4}\right) + \cos\left(\tfrac{3\pi}{4}\right)\left(x - \tfrac{3\pi}{4}\right) - \frac{\sin\left(\tfrac{3\pi}{4}\right)}{2}\left(x - \tfrac{3\pi}{4}\right)^2$$
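These truncations are instances of the general Taylor expansion about a point $a$; with $f(x) = 1 + \sin x$ and $a = \tfrac{3\pi}{4}$, the derivatives cycle through $\cos$, $-\sin$, $-\cos$, $\sin$:

```latex
f(x) \approx \sum_{n=0}^{N} \frac{f^{(n)}(a)}{n!}\,(x-a)^n,
\qquad f(x) = 1 + \sin x,\quad a = \tfrac{3\pi}{4}
```

so the additional order-3 term is $-\frac{\cos(3\pi/4)}{6}\left(x - \tfrac{3\pi}{4}\right)^3$.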

In [107]:

# Question 1 a and 1 b
import matplotlib.pyplot as plt
import numpy as np

a = 3 * np.pi / 4                              # expansion point
data = np.arange(a - np.pi, a + np.pi, 0.1)
taylor_1 = 1 + np.sin(a) + np.cos(a) * (data - a)          # order-1 approximation
taylor_2 = taylor_1 - np.sin(a) / 2 * (data - a) ** 2      # order-2 approximation
taylor_3 = taylor_2 - np.cos(a) / 6 * (data - a) ** 3      # order-3 approximation
fig, ax = plt.subplots(figsize=(20, 10))
ax.plot(data, 1 + np.sin(data))
ax.plot(data, taylor_1)
ax.plot(data, taylor_2)
ax.plot(data, taylor_3)
plt.ylabel('F(x)', fontsize=20)
plt.xlabel('x', fontsize=20)
plt.title('Taylor approximation of 1+sin(x) about 3*pi/4', fontsize=20)
plt.legend(['1+sin(x)', 'taylor O(1)', 'taylor O(2)', 'taylor O(3)'], fontsize=20)
plt.grid(True, which='major', color='#666666', linestyle='-')
plt.minorticks_on()
plt.grid(True, which='minor', color='#999999', linestyle='-', alpha=0.2)

Question 1 (c). As we include more terms in the Taylor series, the approximation stays close to the function even at points far from the expansion point. Hypothetically, if we included all the infinitely many terms of the Taylor series, the series would equal the function value at every point (for functions, such as sin x, whose Taylor series converges everywhere). In the plot above we can see that as we increase the order of the Taylor series, the approximation gets closer to the function.

localhost:8889/notebooks/Plaksha/Coding/Ravi Kothari Sir/ML1/Assignment_1.ipynb 1/4
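A quick numerical sketch of this claim (the ±0.5 window around the expansion point is an arbitrary choice): the maximum approximation error over the window shrinks as the order grows.

```python
import numpy as np

a = 3 * np.pi / 4                          # expansion point
x = np.linspace(a - 0.5, a + 0.5, 101)     # small window around a
f = 1 + np.sin(x)                          # true function

# Taylor approximations of 1 + sin(x) about a, orders 1 to 3
t1 = 1 + np.sin(a) + np.cos(a) * (x - a)
t2 = t1 - np.sin(a) / 2 * (x - a) ** 2
t3 = t2 - np.cos(a) / 6 * (x - a) ** 3

# maximum absolute error over the window, per order
errs = [float(np.max(np.abs(t - f))) for t in (t1, t2, t3)]
print(errs)  # each higher order reduces the worst-case error
```

Note that this monotone improvement holds near the expansion point; far from it, a finite truncation can temporarily get worse before higher orders win out.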

ML1 Assignment 1 Question 2


Question 2 a & c




In [106]:

import numpy as np
import matplotlib.pyplot as plt

w1 = np.linspace(-2, 2, 100)  # trial data
w2 = np.linspace(-2, 2, 100)  # trial data
X, Y = np.meshgrid(w1, w2)
Z = np.exp(-1/3 * X**3 + X - Y**2)  # calculating the loss
fig, ax = plt.subplots(figsize=(20, 20))
ax.set_title('Loss Surface Contour Plot')
# Part 2: gradient at sample points
x_cord = np.array([0, 1.2, 0.5])
y_cord = np.array([0, 0.9, 0.5])
dJ_dw1 = (-x_cord**2 + 1) * np.exp(-1/3 * x_cord**3 + x_cord - y_cord**2)  # partial derivative of J w.r.t. w1
dJ_dw2 = (-2 * y_cord) * np.exp(-1/3 * x_cord**3 + x_cord - y_cord**2)     # partial derivative of J w.r.t. w2
cp = ax.contour(X, Y, Z, 50)  # plotting the contour
fig.colorbar(cp)
ax.quiver(x_cord, y_cord, dJ_dw1, dJ_dw2)  # plotting the gradient at the sample points
plt.xlabel('W1')
plt.ylabel('W2')
plt.show()


Question 2 b

$$J = \exp\left(-\tfrac{1}{3} w_1^3 + w_1 - w_2^2\right)$$

$$\nabla J = \begin{bmatrix} \dfrac{\partial J}{\partial w_1} \\[6pt] \dfrac{\partial J}{\partial w_2} \end{bmatrix}$$

$$\nabla J = \begin{bmatrix} \left(-w_1^2 + 1\right)\exp\left(-\tfrac{1}{3} w_1^3 + w_1 - w_2^2\right) \\[6pt] \left(-2 w_2\right)\exp\left(-\tfrac{1}{3} w_1^3 + w_1 - w_2^2\right) \end{bmatrix}$$
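As a sanity check on this derivation (a sketch; the sample point (0.5, 0.5) and the step size h are arbitrary choices), the analytic gradient can be compared against central finite differences:

```python
import numpy as np

def J(w1, w2):
    # loss function from the derivation above
    return np.exp(-1/3 * w1**3 + w1 - w2**2)

def grad_J(w1, w2):
    # analytic gradient derived above
    e = J(w1, w2)
    return np.array([(-w1**2 + 1) * e, -2 * w2 * e])

# central finite differences at an arbitrary sample point
w1, w2, h = 0.5, 0.5, 1e-6
numeric = np.array([
    (J(w1 + h, w2) - J(w1 - h, w2)) / (2 * h),
    (J(w1, w2 + h) - J(w1, w2 - h)) / (2 * h),
])
print(grad_J(w1, w2), numeric)  # the two should agree closely
```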
Question 2 d

On equating the above $\nabla J$ to $0$ we get $w_1 = \pm 1$ and $w_2 = 0$. Hence,

$$\begin{bmatrix} w_1 \\ w_2 \end{bmatrix} = \begin{bmatrix} -1 \\ 0 \end{bmatrix}$$

is the weight vector that minimises the loss function along the $w_1$ axis (a second-derivative check is needed to fully classify the two critical points).
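A minimal numerical sketch of this result: the gradient from part (b) vanishes at both candidate points $(\pm 1, 0)$.

```python
import numpy as np

def grad_J(w1, w2):
    # analytic gradient of J = exp(-1/3*w1**3 + w1 - w2**2)
    e = np.exp(-1/3 * w1**3 + w1 - w2**2)
    return np.array([(-w1**2 + 1) * e, -2 * w2 * e])

for w1 in (1.0, -1.0):
    print(w1, grad_J(w1, 0.0))  # gradient is (0, 0) at both critical points
```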

