Deep Learning Handwritten Notes
* ML works with datasets that can be analysed by classical mathematical methods, after preprocessing such as scaling, normalization, and standardization.
* Feature learning: in ML we have to carry out feature engineering manually, but in DL the Neural Network learns the features on its own.
* Principal Component Analysis (PCA): a dimensionality-reduction technique.
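A minimal numpy sketch of PCA via SVD (the `pca` helper and the random data are illustrative, not from the notes): center the data, take the top-k principal axes, and project onto them.

```python
import numpy as np

# Minimal PCA sketch: project data onto the top-k principal components.
def pca(X, k):
    Xc = X - X.mean(axis=0)              # center each feature
    # Right singular vectors of the centered data are the principal axes.
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T                 # dimension-reduced data

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))            # 100 samples, 5 features
Z = pca(X, 2)
print(Z.shape)                           # (100, 2)
```

The first component always carries at least as much variance as the second, which is the point of the technique.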
* Feature Extraction: creating new features from the existing ones. The extracted features (e.g. via PCA, LDA) are then fed to the ML algorithm.
* Feature Selection: choosing the most relevant features/dimensions, e.g. Recursive Feature Elimination (RFE) or Lasso/Ridge regularisation.
* AI ⊃ ML ⊃ DL.
* Popular DL frameworks: TensorFlow and PyTorch.
* Network structure:
’ Input layer: the number of nodes corresponds to the number of features in the dataset.
’ Output layer: in a NN the choice of output nodes depends on the problem (e.g. one node for binary classification).
’ Hidden layers: can take any number of nodes; more than one hidden layer makes the network "deep".
’ The initial weight and bias values are set randomly.
’ y = wx + b: this equation is used to create a line for 2-D data.
* A line is linear, but real-world data is non-linear in nature. So, to make the NN model non-linear, we add activation functions, which introduce non-linearity. The sigmoid squashes its input into (0, 1); applying it after the linear step helps to produce a non-linear model.
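A small sketch of that idea: the linear step y = wx + b followed by the sigmoid non-linearity (the w, b, and input values are made up for illustration).

```python
import numpy as np

def sigmoid(x):
    # Squashes any real input into (0, 1), bending straight lines into curves.
    return 1.0 / (1.0 + np.exp(-x))

x = np.array([-2.0, 0.0, 2.0])
y = sigmoid(2.0 * x + 1.0)   # linear step y = wx + b, then the non-linearity
print(y.round(3))            # [0.047 0.731 0.993]
```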
’ Backward propagation: once the set of predicted values is produced, we do BP and tweak the weight and bias values using the chain rule. The learning rate decides how big each weight update is; we tune it so the network trains the way we want.
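One such update for a single weight can be sketched like this (the weight, gradient, and learning-rate values are assumptions for illustration):

```python
# One backprop-style update for a single weight: step against the gradient,
# scaled by the learning rate.
w, lr = 0.8, 0.1
grad = 2.4            # dLoss/dw (assumed value for illustration)
w = w - lr * grad     # new weight = 0.8 - 0.1 * 2.4 ≈ 0.56
print(w)
```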
* TensorFlow: a library for high-performance numerical computation.
* Keras: a high-level API that runs on top of backends such as TensorFlow.
* Tensors: multi-dimensional arrays where the data is stored.
* Epoch: how many times the entire dataset passes through the network; one complete pass over the whole dataset is one epoch.
* Batch: the dataset is split into batches; the time to cover one batch is one iteration, so iterations per epoch = dataset size / batch size.
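The bookkeeping above is just integer arithmetic; a quick sketch with assumed sizes:

```python
# Epoch / batch / iteration bookkeeping (sizes assumed for illustration).
dataset_size = 60000      # e.g. an MNIST-sized training set
batch_size = 100
iterations_per_epoch = dataset_size // batch_size
epochs = 10
total_iterations = iterations_per_epoch * epochs
print(iterations_per_epoch, total_iterations)   # 600 6000
```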
* Confusion matrix: used to examine whether and how samples are misclassified, e.g. in a spam-classification app.
* Precision vs Recall: which one matters more depends on the problem statement (whether false positives or false negatives are costlier).
* ROC curve: plots the True Positive rate against the False Positive rate; the summary metric is the area under the curve (AUC).
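These metrics come straight from the four confusion-matrix counts; a sketch with made-up counts:

```python
# Precision, recall, and false-positive rate from raw confusion-matrix counts
# (the TP/FP/FN/TN values are made up for illustration).
TP, FP, FN, TN = 80, 10, 20, 90
precision = TP / (TP + FP)     # of predicted positives, how many are right
recall = TP / (TP + FN)        # of actual positives, how many were found
fpr = FP / (FP + TN)           # x-axis of the ROC curve
print(precision, recall, fpr)  # precision ≈ 0.889, recall = 0.8, fpr = 0.1
```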
* CNNs exploit the spatial structure of images. Pixel values range 0 - 255, and an RGB image has 3 channels; CNNs are used for image and video datasets.
* When an image is flattened and fed to a plain NN, the spatial structure is lost. The core principle of CNNs is the Convolution Operation: a small matrix called a kernel (filter) slides over the image so the network can pick out patterns.
* The kernel size, stride, and padding decide the output grid: e.g. a (3x3) CNN kernel produces a smaller output map. Stacked convolutional layers do Feature Extraction, producing feature maps across the image.
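A minimal numpy sketch of the convolution operation (the toy 6x6 "image" and the edge kernel are made up): slide the kernel, multiply elementwise, and sum.

```python
import numpy as np

def convolve2d(image, kernel):
    # "Valid" convolution: slide the kernel over the image, multiply-and-sum.
    n, m = image.shape[0], kernel.shape[0]
    out = np.zeros((n - m + 1, n - m + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i+m, j:j+m] * kernel)
    return out

img = np.arange(36, dtype=float).reshape(6, 6)   # toy 6x6 "image"
edge = np.array([[1., 0., -1.]] * 3)             # 3x3 vertical-edge kernel
fmap = convolve2d(img, edge)
print(fmap.shape)                                # (4, 4): 6 - 3 + 1 = 4
```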
* The convolution operation is applied many times: over the training epochs the entire dataset passes through the model repeatedly, so each sample is represented multiple times and model performance improves over time. Shuffling makes sure all the samples contribute.
’ Pooling: applied to the feature representation produced by convolution. Its goal is to reduce the spatial dimension (width x height) of the feature map while retaining the important information, which also makes the features more robust.
’ Max pooling: take the maximum value inside each pooling window (e.g. a 2x2 window with stride 2).
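A sketch of 2x2 max pooling in numpy (the feature-map values are made up for illustration):

```python
import numpy as np

def max_pool(fmap, size=2):
    # 2x2 max pooling with stride 2: keep the largest value in each window.
    n = fmap.shape[0] // size
    out = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            out[i, j] = fmap[i*size:(i+1)*size, j*size:(j+1)*size].max()
    return out

fmap = np.array([[2., 4., 1., 0.],
                 [1., 3., 2., 2.],
                 [0., 1., 5., 6.],
                 [1., 2., 7., 8.]])
print(max_pool(fmap))   # [[4. 2.] [2. 8.]]
```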
* CNNs work alongside regular Neural Networks on high-dimensional data: the convolution and pooling stages (one operation nested inside another, like a method inside another method) feed their output into dense layers, with the strides controlling how the windows move.
* Libraries: NumPy, pandas, TensorFlow (Keras). Dataset: MNIST.
’ Split train/test: divide into 4 arrays: (x_train, y_train), (x_test, y_test) = mnist.load_data()
’ Flatten the data: convert the multi-dimensional input images into 1-D vectors.
’ Normalize the data: scale the pixel values so that the data points lie in 0 - 1; this improves the results.
’ Choose the number of hidden layers, the number of neurons per layer, and the output-layer operation.
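The flatten and normalize steps can be sketched in numpy alone; here random uint8 data stands in for MNIST so the sketch does not need TensorFlow installed:

```python
import numpy as np

# Flatten + normalize, with random uint8 data standing in for MNIST images.
rng = np.random.default_rng(0)
x_train = rng.integers(0, 256, size=(100, 28, 28), dtype=np.uint8)

x_flat = x_train.reshape(len(x_train), -1)   # (100, 28, 28) -> (100, 784)
x_norm = x_flat.astype("float32") / 255.0    # pixel values scaled into 0-1
print(x_norm.shape, x_norm.min(), x_norm.max())
```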
* Weights: determine how much importance each input has on the output. As the network is trained, the weights help it learn from data to make more accurate predictions.
* Activation Function: the AF introduces non-linearity into the network, which means the net can represent complex patterns in the data.
’ Sigmoid: outputs a value between 0 and 1; used for binary prediction (probability calculation).
’ Tanh: output is centered around 0.
* This entire process, from input to prediction, is called forward propagation through the neural network. The error is measured by a function called the Loss function (Actual - Predicted); training tries to bring it close to 0.
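A common concrete choice of loss is mean squared error over (Actual - Predicted); a sketch with made-up predictions:

```python
import numpy as np

# Loss as a function of (actual - predicted): mean squared error sketch.
actual = np.array([1.0, 0.0, 1.0, 1.0])
predicted = np.array([0.9, 0.2, 0.8, 1.0])
mse = np.mean((actual - predicted) ** 2)
print(mse)   # (0.01 + 0.04 + 0.04 + 0) / 4 = 0.0225
```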
* Gradient Descent: this is a fundamental optimization algorithm used to minimize the loss of a Neural Network. The derivative of the Loss function w.r.t. each parameter is found with the chain rule of derivatives: dL/dw for the weights and, similarly, dL/db for the biases; each parameter is then stepped against its gradient.
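On a toy one-parameter loss L(w) = (w - 3)^2 the whole procedure fits in a few lines (the loss and learning rate are assumptions for illustration):

```python
# Gradient descent on L(w) = (w - 3)^2, whose derivative is dL/dw = 2(w - 3).
w, lr = 0.0, 0.1
for _ in range(100):
    grad = 2 * (w - 3)   # the chain rule gives this derivative directly here
    w -= lr * grad       # step against the gradient
print(round(w, 4))       # converges to the minimum at w = 3
```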
* Vanishing Gradient Problem: a common problem in DNNs. It arises when the derivatives of the activation function become small (the sigmoid's derivative lies between 0 and 0.25), so the backpropagated product shrinks layer by layer; this causes slow or stalled training (VGP).
* ReLU activation & Batch Normalization help solve the VGP.
’ ELU (Exponential Linear Unit): like ReLU for positive inputs, otherwise an exponential curve; this function provides smooth gradients near zero.
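The 0.25 bound is easy to verify, and it shows why deep sigmoid stacks starve the early layers of gradient:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def d_sigmoid(x):
    s = sigmoid(x)
    return s * (1 - s)           # peaks at 0.25 when x = 0

print(d_sigmoid(0.0))            # 0.25, the maximum possible value
# Backprop multiplies one such factor per layer, so through 10 sigmoid
# layers the gradient is at best 0.25**10 -- already below 1e-5.
print(0.25 ** 10)
```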
* Optimizers: refer to the previous page on Gradient Descent for more detail.
* Epoch: one forward pass + one backward pass of gradient descent over the entire training dataset. With full-batch training, every data point goes through FP and BP in each iteration of a training epoch.
* Mini-Batch SGD: to reduce the cost per update, we use mini-batch SGD. E.g. with 1000 data points and batch size 1000, one FP + BP iteration covers the whole dataset; smaller batches give more iterations per epoch.
* The aim of optimization is to reach the Global Minima of the loss surface.
* Momentum / exponentially weighted average: used to smooth the updates. Instead of considering only the current point, it averages the recent data points while still down-weighting old ones, which removes noise from the derivative of the parameters.
* Momentum keeps the learning rate fixed and makes use of the weighted history to accelerate convergence. Formula: v = beta * v + (1 - beta) * dW, then W = W - lr * v.
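The formula above as a loop (the beta, learning rate, and "noisy" gradient values are assumptions for illustration):

```python
# Exponentially weighted average of gradients (momentum), assumed beta = 0.9.
beta, lr = 0.9, 0.1
w, v = 0.0, 0.0
grads = [1.0, 0.9, 1.1, 1.0]        # noisy gradients, made up for illustration
for g in grads:
    v = beta * v + (1 - beta) * g   # smooth out the recent gradients
    w = w - lr * v                  # step using the smoothed gradient
print(round(v, 4))                  # 0.3448
```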
* ANN Implementation:
’ Libraries: pandas, matplotlib, scikit-learn; separate the independent features from the target.
’ The next step is a train/test split (train_test_split).
’ Imports: from tensorflow.keras.layers import Dense, Activation, Dropout; from tensorflow.keras.models import Sequential. ReLU is the activation function for the hidden layers of our ANN.
’ Dense: a densely connected layer in which every neuron receives input from all neurons of the previous layer.
’ Dropout: randomly drops a fraction of neurons during training, which improves generalisation.
’ Initialize the ANN: model = Sequential([...])
’ Compile: model.compile(optimizer='adam', loss=..., metrics=['accuracy'])
’ Fit: history = model.fit(X_train, y_train, validation_split=..., epochs=1000)
’ Early stopping: once a monitored metric stops improving for a certain number of epochs, the model's training stops early.
* CNN sizes:
’ RGB: 3 channels, pixel values 0 - 255.
’ Convolution with an m x m kernel over an n x n input (stride 1, no padding) gives an (n - m + 1) output side: e.g. (6 - 3) + 1 = 4, so a 6x6 input and a 3x3 kernel give a 4x4 output. Each output value is the multiply-and-sum of the kernel with the patch it sits on.
* Max pooling extracts the strongest responses: a window slides over the feature map by the stride; variants are Max, Min, and Average pooling. E.g. the 2x2 window [[2, 4], [1, 3]] max-pools to 4.
* Implementation:
’ model = Sequential()
’ Choose the number of neurons, timesteps, and hidden layers.
’ Note: the same word (e.g. "Dhoni") can appear in many sentences ("Contains Dhoni ..."), so the text must be vectorised first.
’ Encode: convert each word into a number (e.g. one-hot encoding) before feeding it to the network.
* Vanishing Gradient problem (and its twin, the Exploding Gradient problem, EGP) in RNNs: at every timestep the input passes through the same weights W and the same AF. During BP the chain rule multiplies the same factors again and again across timesteps t, t-1, t-2, t-3, ...
’ If each factor is below 1 (e.g. sigmoid derivatives), the gradient of the Loss Function shrinks toward 0 and the early parameters barely update: the Vanishing Gradient Problem.
’ If each factor is above 1, the gradient contains a product that blows up and the weights can explode: the Exploding Gradient Problem; gradient clipping is used to prevent this.
* Main idea behind LSTM: let the network remember over long sequences.
’ Forget gate: decides how much of the long-term memory should be kept.
’ Input gate: decides what should be added to the long-term memory.
’ Output gate: determines how much we should output.
’ The LSTM starts from the initial cell state (Long-Term Memory) and the initial hidden state [Hidden State] (Short-Term Memory). The sigmoid/tanh cycle used for LSTM updates both memories at each timestep; the updated short-term memory is the output for that step.
* LSTM worked example (redrawn from the figure): starting from a Long-Term Memory of 2, the inputs drive the forget gate's sigmoid to f(5.95) ≈ 0.997, so this first stage of the LSTM unit reduces the Long-Term Memory by a little bit (2 x 0.997 ≈ 1.99). The input stage then computes a potential memory to remember and adds it in, and we get a new Long-Term Memory, 2.96. Finally, the new Long-Term Memory is used as input to the Tanh activation function and scaled by the output gate, giving a New Short-Term Memory of 0.98, which is also the Output from this entire LSTM unit.
* Transformers (e.g. GPT): Transformer models are built around a "self-attention mechanism" that lets them weigh how different parts of the input relate to each other, unlike a simple Neural Network, so that the model can use the context in the dataset. The attention weights are optimized during training, resulting in a network that can relate similar words.
* This makes the Embeddings capture semantic similarity: words used in similar contexts (e.g. "King" and "Queen") end up with similar vectors.
* Word2Vec: a technique that uses a neural network to learn word vectors based on their context. 2 main methods:
’ CBOW: given the context words, predict the target word (e.g. "Dhoni").
’ Skip-gram: given a word, predict its context words.
* RNNs read the words one at a time; Transformer encoders process all the words in parallel.
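A sketch of how skip-gram training pairs are generated (the sentence and window size are made up; each (center word, context word) pair becomes one training example):

```python
# Skip-gram training-pair generation sketch.
sentence = "dhoni finishes the match in style".split()
window = 2
pairs = []
for i, center in enumerate(sentence):
    for j in range(max(0, i - window), min(len(sentence), i + window + 1)):
        if j != i:
            pairs.append((center, sentence[j]))

print(pairs[:3])   # first few (center, context) pairs
```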