model: f_{w,b}(x) = wx + b

w, b: parameters

prediction for the i-th training example:
ŷ^(i) = f_{w,b}(x^(i)) = w x^(i) + b

find w, b such that ŷ^(i) is close to y^(i)
for all (x^(i), y^(i)) in the training data (x, y)
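The model above can be sketched in NumPy (a minimal sketch; the helper name `f_wb` and the sample values are assumptions, not from the notes):

```python
import numpy as np

def f_wb(x, w, b):
    """Linear model f_{w,b}(x) = w*x + b, for a scalar or an array of inputs."""
    return w * x + b

# with w = 2 and b = 1, inputs 1, 2, 3 map to 3, 5, 7
x_train = np.array([1.0, 2.0, 3.0])
print(f_wb(x_train, w=2.0, b=1.0))  # → [3. 5. 7.]
```

Because `w * x + b` broadcasts, the same function handles a single x^(i) or the whole training set at once.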
cost function: squared error cost function

J(w, b) = (1 / 2m) Σ_{i=1}^{m} (f_{w,b}(x^(i)) - y^(i))^2

m = number of training examples

in math: minimize J(w, b) over the parameters w, b
Week 1- Train the model with gradient descent
Week 2- multiple linear regression
Week 2- practical
Week 3- classification with logistic regression
Week 3- cost function for logistic regression
Week 3- Gradient Descent Implementation
Week 3- The problem of overfitting
Course 2-
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3))
Week 2- Multiclass Classification
Week 3- Advice for applying machine learning