PCE — DL (Unit 3): Convolutional Neural Networks (CNN)

Introduction to CNN:
- A convolutional neural network (CNN) is a deep learning model made up of layers that transform an input. It is used to do image & data analysis tasks, including scene classification, object detection, segmentation & image processing.
- In order to understand how CNNs work, there are three key concepts: local receptive fields, shared weights & biases, and activation functions.

[Fig.: A simple feedforward network — inputs -> hidden layer -> outputs.]

Architecture of CNN:

[Fig.: conv1 -> pooling -> conv2 -> pooling -> conv3 -> pooling -> fully connected hidden layers -> classification output.]

- A CNN architecture consists of two fundamental components:
  1) Feature extraction: a procedure that uses a convolution tool to separate & identify the distinct characteristics of a picture for study. There are several pairs of convolutional & pooling layers in the feature-extraction network.
  2) A fully connected layer: makes use of the convolutional process's output & determines the class of the picture using the features that were previously extracted.
- Accordingly, a CNN has three layers:
  1) Convolutional layer
  2) Pooling layer
  3) Fully connected layer
  This is the basic structure of a convolutional network.

Padding:
- Padding is used to alter the dimensions of the input/output volumes, by increasing or decreasing them, so that a convolutional operation can be performed without necessarily shrinking the width & height of the volume.
- Padding & stride influence how the convolution operation is performed.
- Due to padding, information on the borders of an image is also preserved, similarly to that at the center of the image.

Padding terminologies:
1) Valid padding:  output size = input size - kernel size + 1
2) Same padding:   output size = input size
3) Full padding:   output size = input size + kernel size - 1

Stride:
- Stride controls how the filter convolves over the input. If stride is set to 1, the filter moves across 1 pixel at a time; if stride is set to 2, the filter moves 2 pixels at a time.
- The larger the value of stride, the smaller the resulting output will be, & vice versa.
- Stride is the step size for sliding the convolutional kernel. Like padding, it is used to alter the dimensions of the input/output vectors.

[Fig.: A 2-D convolution with stride = 1 sliding a kernel over the input matrix.]

ReLU layer:
- ReLU stands for rectified linear unit. ReLU is computed after convolution.
- It is the most commonly deployed activation function; it allows the neural network to account for non-linear relationships.
- This function has two major advantages over sigmoidal functions such as σ(x) or tanh(x):
  1) ReLU can be calculated extremely quickly, because all it requires is thresholding the input at the value 0.
  2) Depending on whether or not its input is negative, it has a derivative of either 0 or 1:

    ReLU(x)  = 0 for x < 0,  x for x >= 0
    dReLU/dx = 0 for x < 0,  1 for x >= 0

[Fig.: ReLU vs sigmoid activation curves.]
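To make the output-size formulas, the effect of stride, & the ReLU thresholding concrete, here is a minimal NumPy sketch (not from the notes; the function name & toy sizes are illustrative) of a single-channel, valid-padding convolution:

    # Valid-padding 2-D convolution; output size = (input - kernel) // stride + 1,
    # which reduces to the notes' formula (input - kernel + 1) when stride = 1.
    import numpy as np

    def conv2d_valid(image, kernel, stride=1):
        """Slide `kernel` over `image` with no padding ("valid" mode)."""
        h, w = image.shape
        k, _ = kernel.shape
        out_h = (h - k) // stride + 1
        out_w = (w - k) // stride + 1
        out = np.zeros((out_h, out_w))
        for i in range(out_h):
            for j in range(out_w):
                patch = image[i*stride:i*stride+k, j*stride:j*stride+k]
                out[i, j] = np.sum(patch * kernel)
        return out

    image = np.arange(36, dtype=float).reshape(6, 6)   # toy 6x6 input
    kernel = np.ones((3, 3)) / 9.0                     # 3x3 averaging filter

    print(conv2d_valid(image, kernel, stride=1).shape) # (4, 4): 6 - 3 + 1
    print(conv2d_valid(image, kernel, stride=2).shape) # (2, 2): larger stride, smaller output

    feat = conv2d_valid(image, kernel)
    relu = np.maximum(feat, 0)   # ReLU is just a threshold at 0; shape is unchanged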
Pooling layer:
- The pooling layer operates on each feature map independently.
- It reduces the resolution of the feature map by reducing its height & width, while retaining the features of the map required for classification. This is called down-sampling.
- Max pooling: the maximum value from the area of the picture that the kernel has covered is returned by max pooling. It is the most common pooling approach.
- Average pooling: the average of all the values from the area of the picture covered by the kernel is what is returned by the average-pooling approach.

[Fig.: The same input reduced by 2x2 max pooling vs 2x2 average pooling.]

- Pooling can help control overfitting; without it, the feature maps would keep the same resolution as the input.

Fully connected layers:
- A fully connected layer looks like an ordinary neural network connecting all neurons, & forms the last few layers in the network.
- The output from the flatten layer is fed to this fully connected layer.
- All the inputs to this layer are connected to every activation unit of the next layer.
- Since so many of the parameters are occupied in the fully connected layer, it can cause overfitting & require more time for training.
- Dropout is one of the techniques that reduces this overfitting & improves training speed. The dropout rate is usually 0.5.

[Fig.: input volume -> convolution -> max pool -> flatten -> fully connected layer.]

- The fully connected layer generates the final classification.
- These different layers in a CNN can be mixed & repeated any number of times; this is the interleaving of layers in a CNN.
- Interleaving is an important feature which makes research & experimentation in CNNs that much more interesting.

CNN architectures:
- There are several CNN designs that may be used; a few of these architectures are listed below:
  1) LeNet
  2) AlexNet
  3) VGGNet
  4) ResNet
  5) ZFNet
  6) GoogLeNet

[Fig.: AlexNet — a CNN architecture with eight layers.]
[Fig.: VGGNet architecture.]

Local Response Normalization:
- Local Response Normalization (LRN) is one of the most important normalization techniques. ReLU was employed as the activation function instead of the then-common tanh & sigmoid, which led to the initial introduction of LRN in the AlexNet architecture.
- There are two types:
1) Inter-channel LRN: normalization is carried out across neighbouring channels. With a(x,y) & b(x,y) the pixel values at location (x, y) before & after normalization,

    b(x,y)^i = a(x,y)^i / ( k + α · Σ_{j = max(0, i - n/2)}^{min(N-1, i + n/2)} (a(x,y)^j)^2 )^β

  where N is the total number of channels, & k, α, β, n are hyperparameters; k is the normalization constant used to avoid any zero division (singularities).
2) Intra-channel LRN: the neighbourhood is defined around the pixel under consideration within the same channel, & normalization is carried out over that 2-D neighbourhood.

Building & training a convolutional network:
Generally, in order to build & train a CNN, the following steps are followed:
1) Input layer: this step reshapes the data. The side of the figure is equal to the square root of the pixel count; for example, if a picture has 9,216 pixels, the figure is 96 x 96. We also specify whether the picture contains colour or not.
2) Convolution layer: here we create the convolutional layers. We apply various filters to learn the important features of the network. We define the size of the kernel & the number of filters.
3) Pooling layer: in this step we add a pooling layer. This layer reduces the size of the input. It does so by taking the maximum value of each sub-matrix.
4) Additional convolution & pooling layers: in this step we can add as many convolution & pooling layers as we want.
5) Dense (fully connected) layer: in this step we can use a different activation function & add the dropout effect.
6) Output layer: this final layer's output is the prediction.
7) Train the model: once the architecture has been finalized, the CNN can be trained.
8) Test & evaluate the model: once the model has been trained, the next step is to test the model & evaluate its performance.
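These build steps can be mirrored almost one-to-one in code. Below is a minimal sketch, assuming PyTorch & the 96 x 96 single-channel input from step 1; the channel counts & layer sizes are illustrative choices, not taken from the notes:

    import torch
    import torch.nn as nn

    model = nn.Sequential(
        nn.Conv2d(1, 16, kernel_size=3, padding=1),   # step 2: convolution layer
        nn.ReLU(),
        nn.MaxPool2d(2),                              # step 3: pooling halves H & W
        nn.Conv2d(16, 32, kernel_size=3, padding=1),  # step 4: additional conv + pool
        nn.ReLU(),
        nn.MaxPool2d(2),
        nn.Flatten(),                                 # flatten feeds the dense layer
        nn.Linear(32 * 24 * 24, 128),                 # step 5: dense (fully connected)
        nn.ReLU(),
        nn.Dropout(p=0.5),                            # dropout rate 0.5, as in the notes
        nn.Linear(128, 10),                           # step 6: output layer (10 classes)
    )

    x = torch.randn(8, 1, 96, 96)  # step 1: input reshaped to 96x96, batch of 8
    print(model(x).shape)          # torch.Size([8, 10])

Steps 7 & 8 would then pair this model with a loss function & an optimizer in the usual way.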
Recurrent & Recursive Nets

Unfolding computational graphs:
- The structure of a collection of computations, such as the mapping of inputs & parameters to outputs & loss, may be formalised as a computational graph.
- We can unfold a recursive or recurrent network.

Example of unfolding a recurrent equation:
- The classical form of a dynamical system is

    s(t) = f(s(t-1); θ)

  where s(t) is called the state of the system.
- For t = 3 time steps we get

    s(3) = f(s(2); θ) = f(f(s(1); θ); θ)

[Fig.: Unfolding a dynamical system.]

Dynamical system driven by an external signal:
- As another example, consider a dynamical system driven by an external signal x(t):

    s(t) = f(s(t-1), x(t); θ)

- The state now contains information about the whole past input sequence.

Unfolding a recurrent neural network:
- To indicate that the state is hidden, we rewrite using the variable h for the state:

    h(t) = f(h(t-1), x(t); θ)

[Fig.: The recurrence drawn as a cycle, & the same network unfolded over time steps t-1, t, t+1.]

Advantages of unfolding:
1) The learned model always has the same input size.
2) It is possible to use the same function f with the same parameters at every step.
3) Learning a single shared model.

Recurrent Neural Networks:
- A Recurrent Neural Network (RNN) is a type of neural network in which the results of one step are fed into the next step's computations.
- Traditional neural networks have inputs & outputs that are independent of one another.

[Fig.: General RNN — input -> hidden layer with a feedback loop -> output.]

- RNNs have a 'memory' that stores all data related to the calculation; the network executes the same action on all of the inputs.
- Consider a deeper network that has 3 hidden layers, one input & one output: the weights & biases for hidden layer 1 are (W1, b1), (W2, b2) for the second hidden layer & (W3, b3) for the third hidden layer.

[Fig.: Recurrent state in an RNN — input layer, hidden layers with (W1, b1), (W2, b2), (W3, b3), output layer.]

Need for RNNs:
There were a few problems with feedforward neural networks which led to the development of RNNs; a feedforward network:
1) can't deal with consecutive (sequential) data,
2) merely takes into account the current input,
3) is unable to remember earlier inputs.
The RNN offers a remedy for these problems: an RNN can handle sequential data, accepting both the current input being fed in & previously received inputs.

Advantages of RNNs:
1) RNNs have the ability to retain information from previous inputs through their hidden states.
2) RNNs can perform online processing, handling inputs one step at a time.
3) RNNs can be trained on large datasets.

Disadvantages of RNNs:
1) Computationally intensive.
2) RNNs have a limited memory capacity.
3) RNNs are highly sensitive to the choice of initial weights.
4) Difficulty in capturing global (long-range) information.
5) Lack of interpretability.
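To make the unfolded recurrence h(t) = f(h(t-1), x(t); θ) & the "single shared model" advantage concrete, here is a minimal NumPy sketch; the tanh choice of f & the toy dimensions are illustrative assumptions, not from the notes:

    # The SAME weights W_h, W_x, b are reused at every time step: this is the
    # shared model that unfolding makes explicit.
    import numpy as np

    rng = np.random.default_rng(0)
    input_size, hidden_size, T = 4, 8, 5

    W_x = rng.normal(size=(hidden_size, input_size)) * 0.1   # input -> hidden
    W_h = rng.normal(size=(hidden_size, hidden_size)) * 0.1  # hidden -> hidden
    b = np.zeros(hidden_size)

    xs = rng.normal(size=(T, input_size))  # input sequence x(1) ... x(T)
    h = np.zeros(hidden_size)              # initial state h(0)

    for t in range(T):                     # unfold the recurrence over time
        h = np.tanh(W_h @ h + W_x @ xs[t] + b)

    print(h.shape)  # (8,): the final state summarises the whole past input sequence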
Bidirectional RNNs:
- Recurrent neural networks that are bidirectional are basically two separate RNNs combined.
- For one network the input sequence is fed in regular time order; for the other network it is fed in reverse time order.
- At each time step the outputs of the two networks are typically concatenated; however, there are other choices, such as summation.
- This gives the network information about the whole sequence at each time step, which is useful in applications such as handwriting recognition.
- The concept seems easy enough, but when it comes to actually implementing a neural net which utilises the bidirectional structure, confusion arises.

[Fig.: A confusing formulation — forward & backward stacks of RNN cells over the input sequence, with outputs concatenated at each step.]

Encoder-Decoder sequence-to-sequence architecture:
- An RNN (when input & output are of the same length) is very useful & can be used in three cases:
  1) To map an input sequence to a fixed-size vector.
  2) To map a fixed-size vector to a sequence.
  3) To map an input sequence to an output sequence of the same length.
- When the input & output are not of the same length, the encoder-decoder model is used: here we consider how an RNN can be trained to map an input sequence to an output sequence which is not necessarily of the same length.
- The encoder-decoder model is composed of three major building blocks:
  1) Encoder
  2) Hidden vector (encoder vector)
  3) Decoder
- The encoder produces a fixed-dimensional vector from the input sequence (the hidden vector); the hidden vector is then used to generate the output sequence.

[Fig.: Encoder-Decoder model — encoder RNN cells -> encoder vector -> decoder RNN cells.]

1) Encoder: the encoder may be created by stacking many RNN cells successively.
2) Encoder vector: this is the model's final hidden state, created by the encoder.
3) Decoder: the final hidden vector obtained at the conclusion of the encoder model serves as the decoder's input.
- Applications: question answering, speech recognition, time-series applications.

Deep Recurrent Networks:
- Most recurrent neural networks' computation (parameters & the related transformations) may be broken down into three blocks:
  1) from the input to the hidden state,
  2) from the previous hidden state to the next hidden state,
  3) from the hidden state to the output.
- Depth can be introduced into RNNs in 3 ways:
1) Recurrent state broken down into groups (a hierarchy): it is possible that the lower levels of the hierarchy convert the raw input into a representation that is more suitable for the higher levels of the hidden state.
2) Deeper computation in the hidden-to-hidden path: adding the extra depth makes the path of a variable from time step t to a variable in time step t+1 become longer.

[Fig.: Deep RNN with extra layers in the hidden-to-hidden transition.]

3) Introducing skip connections: adding skip connections in the hidden-to-hidden path mitigates this path-lengthening effect.

[Fig.: Deep RNN with skip connections.]

Recursive Neural Networks:
- Recursive neural networks are useful for structured prediction; 'recursive' refers to the neural network being applied to its own output.
- Recursive neural networks are capable of handling hierarchical data: the computational graph generalises from a chain to a tree.
- A variable-length sequence x(1), x(2), ..., x(t) can be mapped to a fixed-size representation with a fixed set of parameters.

[Fig.: A recursive neural network combining inputs pairwise up a tree.]

- There is a need for recursive neural networks in tasks such as text generation & machine translation.
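To illustrate how the computational graph generalises from a chain to a tree, here is a minimal NumPy sketch of the recursive composition step; the tree shape, dimensions & tanh nonlinearity are illustrative assumptions, not from the notes:

    # One shared weight matrix W combines two child vectors into a parent vector,
    # so the same parameters are applied recursively at every node of the tree.
    import numpy as np

    rng = np.random.default_rng(0)
    d = 6                                   # representation size
    W = rng.normal(size=(d, 2 * d)) * 0.1   # shared composition weights
    b = np.zeros(d)

    def compose(left, right):
        """Merge two child representations into one parent representation."""
        return np.tanh(W @ np.concatenate([left, right]) + b)

    # Leaves x(1)..x(4); the tree ((x1 x2) (x3 x4)) maps the variable-length
    # sequence to a single fixed-size vector with a fixed set of parameters.
    x1, x2, x3, x4 = (rng.normal(size=d) for _ in range(4))
    root = compose(compose(x1, x2), compose(x3, x4))
    print(root.shape)  # (6,)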
