Basics of Neural Network Programming: Binary Classification

The document covers the basics of neural network programming, focusing on binary classification and logistic regression. It explains key concepts such as cost functions, gradient descent, derivatives, and vectorization in the context of implementing logistic regression. Additionally, it emphasizes the importance of avoiding explicit for-loops and introduces broadcasting in Python for efficient computation.

Basics of Neural Network Programming: Binary Classification (deeplearning.ai)
Binary Classification

1 (cat) vs 0 (non-cat)

[Figure: the input image is stored as three matrices of pixel intensities, one per Red, Green, and Blue channel. For a 64 x 64 image, these values are unrolled into a single feature vector $x$ of dimension $n_x = 64 \times 64 \times 3 = 12288$.]
Andrew Ng
Notation

A single training example is a pair $(x, y)$, with $x \in \mathbb{R}^{n_x}$ and $y \in \{0, 1\}$. Given $m$ training examples $(x^{(1)}, y^{(1)}), \ldots, (x^{(m)}, y^{(m)})$, the inputs are stacked as columns of a matrix $X \in \mathbb{R}^{n_x \times m}$, and the labels as $Y = [y^{(1)} \cdots y^{(m)}] \in \mathbb{R}^{1 \times m}$.

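As a concrete illustration of this notation, here is a small NumPy sketch; the sizes n_x = 12 and m = 4 are made up for the example, not from the slides:

```python
import numpy as np

n_x, m = 12, 4                        # illustrative sizes
rng = np.random.default_rng(0)

X = rng.random((n_x, m))              # one training example x^(i) per column
Y = rng.integers(0, 2, size=(1, m))   # binary labels y^(i) in a 1 x m row

assert X.shape == (n_x, m)
assert Y.shape == (1, m)
```

Stacking examples as columns (rather than rows) is what lets the later vectorized formulas be written without loops.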
Basics of Neural Network Programming: Logistic Regression
Logistic Regression

Given $x \in \mathbb{R}^{n_x}$, want $\hat{y} = P(y = 1 \mid x)$. Parameters: $w \in \mathbb{R}^{n_x}$, $b \in \mathbb{R}$. Output: $\hat{y} = \sigma(w^T x + b)$, where the sigmoid $\sigma$ squashes $w^T x + b$ into $(0, 1)$.

Basics of Neural Network Programming: Logistic Regression Cost Function
Logistic Regression cost function
$\hat{y} = \sigma(w^T x + b)$, where $\sigma(z) = \dfrac{1}{1 + e^{-z}}$

Given $\{(x^{(1)}, y^{(1)}), \ldots, (x^{(m)}, y^{(m)})\}$, want $\hat{y}^{(i)} \approx y^{(i)}$.

Loss (error) function: $\mathcal{L}(\hat{y}, y) = -\big(y \log \hat{y} + (1 - y) \log(1 - \hat{y})\big)$
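A minimal NumPy sketch of these two formulas; the helper names `sigmoid` and `loss` and the parameter values are mine, chosen for illustration:

```python
import numpy as np

def sigmoid(z):
    # sigma(z) = 1 / (1 + e^{-z})
    return 1.0 / (1.0 + np.exp(-z))

def loss(y_hat, y):
    # L(y_hat, y) = -(y log y_hat + (1 - y) log(1 - y_hat))
    return -(y * np.log(y_hat) + (1 - y) * np.log(1 - y_hat))

w = np.array([[0.5], [-0.5]])     # made-up parameters
b = 0.0
x = np.array([[1.0], [2.0]])      # one input example

y_hat = sigmoid(w.T @ x + b)      # prediction, a number in (0, 1)
```

When y = 1 the loss reduces to -log(y_hat), so it is small when y_hat is near 1; when y = 0 it is -log(1 - y_hat), small when y_hat is near 0.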

Basics of Neural Network Programming: Gradient Descent
Gradient Descent
Recap: $\hat{y} = \sigma(w^T x + b)$, where $\sigma(z) = \dfrac{1}{1 + e^{-z}}$

$J(w, b) = \dfrac{1}{m} \sum_{i=1}^{m} \mathcal{L}(\hat{y}^{(i)}, y^{(i)}) = -\dfrac{1}{m} \sum_{i=1}^{m} \big[\, y^{(i)} \log \hat{y}^{(i)} + (1 - y^{(i)}) \log(1 - \hat{y}^{(i)}) \,\big]$

Want to find $w, b$ that minimize $J(w, b)$.

[Figure: $J(w, b)$ as a convex, bowl-shaped surface over the $(w, b)$ plane; gradient descent starts from an initial point and steps downhill to the global minimum.]
Gradient Descent

Repeat: $w := w - \alpha \dfrac{\partial J(w, b)}{\partial w}$, $\quad b := b - \alpha \dfrac{\partial J(w, b)}{\partial b}$, where $\alpha$ is the learning rate.

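Gradient descent repeatedly applies w := w - alpha * dJ/dw. A sketch on a made-up one-dimensional convex objective, J(w) = (w - 3)^2, whose derivative is 2(w - 3):

```python
def gradient_descent(lr=0.1, steps=200):
    w = 0.0
    for _ in range(steps):
        dw = 2.0 * (w - 3.0)  # dJ/dw for J(w) = (w - 3)^2
        w -= lr * dw          # w := w - alpha * dJ/dw
    return w

w_min = gradient_descent()    # converges toward the minimizer w = 3
```

Each step shrinks the distance to the minimum by a constant factor (1 - 2*lr here), which is why a couple hundred steps suffice on this toy problem.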
Basics of Neural Network Programming: Derivatives
Intuition about derivatives
$f(a) = 3a$

[Figure: the straight line $f(a) = 3a$; its slope (derivative) is 3 everywhere — nudging $a$ by a tiny amount changes $f(a)$ by 3 times that amount.]
Basics of Neural Network Programming: More Derivative Examples
Intuition about derivatives
$f(a) = a^2$

[Figure: the parabola $f(a) = a^2$; its slope at a point $a$ is $2a$, so unlike the straight line, the derivative changes as you move along the curve.]
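Both derivative intuitions can be checked numerically by nudging a by a small epsilon; this finite-difference sketch is mine, not from the slides:

```python
def slope(f, a, eps=1e-6):
    # approximate f'(a) as (f(a + eps) - f(a)) / eps
    return (f(a + eps) - f(a)) / eps

s_linear = slope(lambda a: 3.0 * a, 2.0)   # f(a) = 3a: slope is 3 everywhere
s_square = slope(lambda a: a * a, 2.0)     # f(a) = a^2: slope at a = 2 is 2a = 4
```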
More derivative examples

Basics of Neural Network Programming: Computation Graph
Computation Graph

Example: $J(a, b, c) = 3(a + bc)$, computed in three steps: $u = bc$, then $v = a + u$, then $J = 3v$. With $a = 5$, $b = 3$, $c = 2$: $u = 6$, $v = 11$, $J = 33$.

Basics of Neural Network Programming: Derivatives with a Computation Graph
Computing derivatives
Forward pass with $a = 5$, $b = 3$, $c = 2$: $\quad u = bc = 6, \quad v = a + u = 11, \quad J = 3v = 33$.

Backward pass (chain rule, right to left): $\dfrac{dJ}{dv} = 3$, $\;\dfrac{dJ}{da} = \dfrac{dJ}{dv}\dfrac{dv}{da} = 3$, $\;\dfrac{dJ}{du} = 3$, $\;\dfrac{dJ}{db} = \dfrac{dJ}{du}\dfrac{du}{db} = 3c = 6$, $\;\dfrac{dJ}{dc} = 3b = 9$.
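A sketch of this graph in code, with the forward pass and the chain-rule backward pass written out step by step:

```python
def forward_backward(a, b, c):
    # forward pass: J = 3(a + bc)
    u = b * c
    v = a + u
    J = 3 * v
    # backward pass (chain rule, right to left)
    dv = 3            # dJ/dv
    da = dv           # dJ/da = dJ/dv * dv/da = 3 * 1
    du = dv           # dJ/du = dJ/dv * dv/du = 3 * 1
    db = du * c       # dJ/db = dJ/du * du/db = 3c
    dc = du * b       # dJ/dc = dJ/du * du/dc = 3b
    return J, da, db, dc

J, da, db, dc = forward_backward(5, 3, 2)  # J = 33, da = 3, db = 6, dc = 9
```

Notice the backward pass reuses values (b, c) cached during the forward pass — the same pattern backpropagation uses in a full network.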

Basics of Neural Network Programming: Logistic Regression Gradient Descent
Logistic regression recap

$z = w^T x + b$
$\hat{y} = a = \sigma(z)$
$\mathcal{L}(a, y) = -\big(y \log(a) + (1 - y) \log(1 - a)\big)$

Logistic regression derivatives
[Computation graph: $x_1, w_1, x_2, w_2, b \;\rightarrow\; z = w_1 x_1 + w_2 x_2 + b \;\rightarrow\; a = \sigma(z) \;\rightarrow\; \mathcal{L}(a, y)$]

Going backward through the graph: $da = \dfrac{d\mathcal{L}}{da} = -\dfrac{y}{a} + \dfrac{1 - y}{1 - a}$, $\; dz = \dfrac{d\mathcal{L}}{dz} = a - y$, and then $dw_1 = x_1 \, dz$, $\; dw_2 = x_2 \, dz$, $\; db = dz$.
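These one-example derivatives can be sketched directly; the function name is mine, not from the slides:

```python
import math

def grads_one_example(w1, w2, b, x1, x2, y):
    z = w1 * x1 + w2 * x2 + b         # forward: z = w1*x1 + w2*x2 + b
    a = 1.0 / (1.0 + math.exp(-z))    # a = sigma(z)
    dz = a - y                        # backward: dL/dz = a - y
    return x1 * dz, x2 * dz, dz       # dw1, dw2, db

dw1, dw2, db = grads_one_example(0.0, 0.0, 0.0, x1=1.0, x2=2.0, y=1.0)
```

With all-zero parameters, a = 0.5, so dz = a - y = -0.5 and each weight gradient is the corresponding input times -0.5.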
Basics of Neural Network Programming: Gradient Descent on m Examples
Logistic regression on m examples

$J(w, b) = \dfrac{1}{m} \sum_{i=1}^{m} \mathcal{L}(a^{(i)}, y^{(i)})$, where $a^{(i)} = \hat{y}^{(i)} = \sigma(w^T x^{(i)} + b)$. Each overall derivative is the average of the per-example derivatives: $\dfrac{\partial J}{\partial w_1} = \dfrac{1}{m} \sum_{i=1}^{m} \dfrac{\partial}{\partial w_1} \mathcal{L}(a^{(i)}, y^{(i)})$, and similarly for $w_2$ and $b$.

Basics of Neural Network Programming: Vectorization
What is vectorization?
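One way to see what vectorization means: compute z = w^T x + b once with an explicit loop and once with np.dot. The array sizes here are made up for the sketch:

```python
import numpy as np

n = 1000
rng = np.random.default_rng(1)
w = rng.random(n)
x = rng.random(n)
b = 0.5

# Non-vectorized: explicit for-loop over the n features.
z_loop = b
for i in range(n):
    z_loop += w[i] * x[i]

# Vectorized: a single call, typically far faster for large n.
z_vec = np.dot(w, x) + b
```

Both compute the same number; the vectorized version hands the whole dot product to optimized native code instead of looping in Python.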

Basics of Neural Network Programming: More Vectorization Examples
Neural network programming guideline
Whenever possible, avoid explicit for-loops.
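Applying the guideline to a matrix-vector product u = Av: the double loop below can be replaced by one np.dot call. The sizes are made up for the sketch:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.random((50, 40))
v = rng.random(40)

# Explicit double for-loop.
u_loop = np.zeros(50)
for i in range(50):
    for j in range(40):
        u_loop[i] += A[i, j] * v[j]

# Vectorized equivalent: one call replaces both loops.
u_vec = np.dot(A, v)
```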

Vectors and matrix-valued functions

Say you need to apply the exponential operation to every element of a matrix/vector.

$v = \begin{bmatrix} v_1 \\ \vdots \\ v_n \end{bmatrix} \quad\rightarrow\quad u = \begin{bmatrix} e^{v_1} \\ \vdots \\ e^{v_n} \end{bmatrix}$

import math
import numpy as np

u = np.zeros((n, 1))        # non-vectorized: loop over every element
for i in range(n):
    u[i] = math.exp(v[i])
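The vectorized alternative replaces the whole loop with one elementwise call; NumPy has similar elementwise functions such as np.log, np.abs, and np.maximum:

```python
import numpy as np

v = np.array([[0.0], [1.0], [2.0]])  # example column vector
u = np.exp(v)                        # u[i] = e^{v[i]}, all elements at once
```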

Logistic regression derivatives
J = 0; dw1 = 0; dw2 = 0; db = 0
for i = 1 to m:
    z^(i) = wᵀx^(i) + b
    a^(i) = σ(z^(i))
    J += −[ y^(i) log a^(i) + (1 − y^(i)) log(1 − a^(i)) ]
    dz^(i) = a^(i) − y^(i)
    dw1 += x1^(i) dz^(i)
    dw2 += x2^(i) dz^(i)
    db += dz^(i)
J = J/m; dw1 = dw1/m; dw2 = dw2/m; db = db/m
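A runnable NumPy sketch of this loop, assuming two features (n_x = 2) so the per-feature accumulators dw1 and dw2 match the slide; the function name and test inputs are mine:

```python
import numpy as np

def logistic_grads_loop(w, b, X, Y):
    # X has shape (2, m), one example per column; Y has shape (1, m).
    m = X.shape[1]
    J = dw1 = dw2 = db = 0.0
    for i in range(m):
        z = w[0] * X[0, i] + w[1] * X[1, i] + b
        a = 1.0 / (1.0 + np.exp(-z))
        J += -(Y[0, i] * np.log(a) + (1 - Y[0, i]) * np.log(1 - a))
        dz = a - Y[0, i]
        dw1 += X[0, i] * dz
        dw2 += X[1, i] * dz
        db += dz
    return J / m, dw1 / m, dw2 / m, db / m

J, dw1, dw2, db = logistic_grads_loop(np.zeros(2), 0.0,
                                      np.array([[1.0, 2.0], [3.0, 4.0]]),
                                      np.array([[1.0, 0.0]]))
```

Note the two explicit loops this implies (over examples, and implicitly over features via dw1/dw2) — exactly what the vectorized version below the next section eliminates.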

Basics of Neural Network Programming: Vectorizing Logistic Regression
Vectorizing Logistic Regression
$z^{(1)} = w^T x^{(1)} + b \qquad z^{(2)} = w^T x^{(2)} + b \qquad z^{(3)} = w^T x^{(3)} + b$
$a^{(1)} = \sigma(z^{(1)}) \qquad\quad a^{(2)} = \sigma(z^{(2)}) \qquad\quad a^{(3)} = \sigma(z^{(3)})$

Stacking the $m$ examples as the columns of $X$ gives $Z = [z^{(1)} \cdots z^{(m)}] = w^T X + b$ and $A = [a^{(1)} \cdots a^{(m)}] = \sigma(Z)$, with no explicit loop.
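A NumPy sketch of this vectorized forward pass; the shapes n_x = 4, m = 5 are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
n_x, m = 4, 5
w = rng.random((n_x, 1))
b = 0.3
X = rng.random((n_x, m))        # one example per column

Z = np.dot(w.T, X) + b          # shape (1, m); the scalar b is broadcast
A = 1.0 / (1.0 + np.exp(-Z))    # sigma applied elementwise: all m activations
```

Adding the scalar b to the (1, m) row vector already uses Python broadcasting, covered in a later section.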

Basics of Neural Network Programming: Vectorizing Logistic Regression's Gradient Computation
Vectorizing Logistic Regression

$dZ = A - Y \qquad dw = \dfrac{1}{m} X \, dZ^T \qquad db = \dfrac{1}{m} \sum_{i=1}^{m} dz^{(i)}$

Implementing Logistic Regression

J = 0; dw1 = 0; dw2 = 0; db = 0
for i = 1 to m:
    z^(i) = wᵀx^(i) + b
    a^(i) = σ(z^(i))
    J += −[ y^(i) log a^(i) + (1 − y^(i)) log(1 − a^(i)) ]
    dz^(i) = a^(i) − y^(i)
    dw1 += x1^(i) dz^(i)
    dw2 += x2^(i) dz^(i)
    db += dz^(i)
J = J/m; dw1 = dw1/m; dw2 = dw2/m
db = db/m
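The whole loop above collapses into a few vectorized lines. A sketch of one gradient-descent iteration with X of shape (n_x, m) and Y of shape (1, m); the function name and test inputs are mine:

```python
import numpy as np

def one_iteration(w, b, X, Y, lr=0.1):
    m = X.shape[1]
    Z = np.dot(w.T, X) + b           # forward pass, shape (1, m)
    A = 1.0 / (1.0 + np.exp(-Z))     # sigmoid, elementwise
    dZ = A - Y                       # shape (1, m)
    dw = np.dot(X, dZ.T) / m         # shape (n_x, 1)
    db = np.sum(dZ) / m              # scalar
    return w - lr * dw, b - lr * db  # gradient-descent update

w_new, b_new = one_iteration(np.zeros((2, 1)), 0.0,
                             np.array([[1.0, 2.0], [3.0, 4.0]]),
                             np.array([[1.0, 0.0]]))
```

The outer loop over gradient-descent iterations still has to be a for-loop; vectorization removes the loops over examples and features within each iteration.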
Basics of Neural Network Programming: Broadcasting in Python
Broadcasting example
Calories from Carbs, Proteins, Fats in 100g of different foods:
            Apples     Beef     Eggs   Potatoes
Carb          56.0      0.0      4.4      68.0
Protein        1.2    104.0     52.0       8.0
Fat            1.8    135.0     99.0       0.9

cal = A.sum(axis=0)                        # sum down the columns: total calories per food
percentage = 100 * A / cal.reshape(1, 4)   # (3,4) divided by (1,4) via broadcasting
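A runnable version of these two lines, with A holding the 3 x 4 calorie table above:

```python
import numpy as np

A = np.array([[56.0,   0.0,  4.4, 68.0],
              [ 1.2, 104.0, 52.0,  8.0],
              [ 1.8, 135.0, 99.0,  0.9]])

cal = A.sum(axis=0)                        # shape (4,): total calories per food
percentage = 100 * A / cal.reshape(1, 4)   # (3,4) / (1,4): row broadcast down
```

The reshape(1, 4) is technically redundant (a shape-(4,) array broadcasts against (3, 4) the same way), but it makes the intended row-vector shape explicit.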
Broadcasting example

$\begin{bmatrix} 1 \\ 2 \\ 3 \\ 4 \end{bmatrix} + 100 = \begin{bmatrix} 101 \\ 102 \\ 103 \\ 104 \end{bmatrix}$

$\begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \end{bmatrix} + \begin{bmatrix} 100 & 200 & 300 \end{bmatrix} = \begin{bmatrix} 101 & 202 & 303 \\ 104 & 205 & 306 \end{bmatrix}$

$\begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \end{bmatrix} + \begin{bmatrix} 100 \\ 200 \end{bmatrix} = \begin{bmatrix} 101 & 102 & 103 \\ 204 & 205 & 206 \end{bmatrix}$

General Principle: when an $(m, n)$ matrix is combined elementwise (+, −, *, /) with a $(1, n)$ row vector or an $(m, 1)$ column vector, the smaller operand is (conceptually) copied $m$ or $n$ times to shape $(m, n)$, and the operation is then applied element by element.
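The three examples and the general principle can be checked directly in NumPy:

```python
import numpy as np

M = np.array([[1, 2, 3],
              [4, 5, 6]])

col4 = np.array([[1], [2], [3], [4]])
row = np.array([[100, 200, 300]])   # shape (1, 3)
col = np.array([[100], [200]])      # shape (2, 1)

ex1 = col4 + 100   # scalar stretched to every element
ex2 = M + row      # (1, 3) row copied down to (2, 3)
ex3 = M + col      # (2, 1) column copied across to (2, 3)
```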
Basics of Neural Network Programming: Explanation of Logistic Regression Cost Function (Optional)
Logistic regression cost function

Logistic regression cost function
If $y = 1$: $\quad p(y \mid x) = \hat{y}$
If $y = 0$: $\quad p(y \mid x) = 1 - \hat{y}$
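The two cases combine into one formula, and taking the log recovers the loss; a sketch of the optional derivation:

```latex
p(y \mid x) = \hat{y}^{\,y} \, (1 - \hat{y})^{\,1 - y}
\quad\Longrightarrow\quad
\log p(y \mid x) = y \log \hat{y} + (1 - y) \log(1 - \hat{y}) = -\mathcal{L}(\hat{y}, y)
```

So maximizing the log-probability of the correct label is exactly minimizing the loss $\mathcal{L}(\hat{y}, y)$.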

Cost on m examples

Assuming the training examples are drawn i.i.d., $p(\text{labels in training set}) = \prod_{i=1}^{m} p(y^{(i)} \mid x^{(i)})$, so $\log p(\text{labels}) = \sum_{i=1}^{m} \log p(y^{(i)} \mid x^{(i)}) = -\sum_{i=1}^{m} \mathcal{L}(\hat{y}^{(i)}, y^{(i)})$. Maximum likelihood estimation therefore corresponds to minimizing $J(w, b) = \frac{1}{m} \sum_{i=1}^{m} \mathcal{L}(\hat{y}^{(i)}, y^{(i)})$ (the $\frac{1}{m}$ rescaling does not change the minimizer).