SC Unit 1 — Soft Computing
Q1. What is meant by artificial neuron? Explain the working of the feed-forward method.

An artificial neuron, often referred to as a node or unit, is a fundamental building block of artificial neural networks. Inspired by the biological neurons in the human brain, the artificial neuron is a simplified computational model: it receives one or more inputs, processes them, and then produces an output.

The basic structure of an artificial neuron consists of the following components:
1. Inputs (x1, x2, x3, ..., xn): numerical values representing the input signals to the neuron.
2. Weights (w1, w2, w3, ..., wn): each input is associated with a weight, which reflects the strength of the connection between the input and the neuron.
3. Summation function (Σ): the inputs are multiplied by their respective weights, and the products are summed up mathematically: Sum = Σ (xi × wi).
4. Activation function (f): the sum obtained from the previous step is then passed through an activation function. Common activation functions include sigmoid, hyperbolic tangent (tanh), and ReLU.
5. Output (y): the result of the activation function is the output of the neuron, which may be used as an input for other neurons or as the final output of the network.

Working of the feed-forward method:
- In a feed-forward neural network, neurons are organized into layers: an input layer, one or more hidden layers, and an output layer.
- Information flows through the network in one direction, from the input layer to the output layer, without any cycles or loops.
- The process of information propagation through the network can be given as:
  1. Input layer: the input layer receives the initial input data.
  2. Hidden layers: each neuron in the hidden layers receives inputs from the neurons in the previous layer, processes them using a weighted sum and an activation function, and then passes the result to the neurons in the next layer.
  3. Output layer: the final layer produces the output of the network based on the information processed through the hidden layers.

[Figure: input layer with inputs 1 and 2 on the left, a hidden layer in the middle, and an output layer producing the output on the right]
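As a concrete illustration of the answer above, here is a minimal NumPy sketch (not part of the original notes; the sigmoid activation, layer sizes, and random weights are illustrative assumptions) of a single neuron and of one-directional feed-forward propagation:

```python
import numpy as np

def sigmoid(z):
    # Squashes any real value into the range (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

def neuron(x, w, b):
    # One artificial neuron: weighted sum of the inputs plus a bias,
    # passed through an activation function.
    return sigmoid(np.dot(w, x) + b)

def feed_forward(x, layers):
    # Information flows in one direction only: the output of each
    # layer becomes the input of the next (no cycles or loops).
    for W, b in layers:
        x = sigmoid(W @ x + b)
    return x

# A single neuron with two inputs.
print(neuron(np.array([1.0, 2.0]), np.array([0.4, -0.6]), 0.1))

# Example network: 3 inputs -> 4 hidden neurons -> 1 output. The weights
# here are random placeholders; a trained network would have learned them.
rng = np.random.default_rng(0)
layers = [(rng.normal(size=(4, 3)), np.zeros(4)),
          (rng.normal(size=(1, 4)), np.zeros(1))]
print(feed_forward(np.array([0.5, -1.2, 3.0]), layers))
```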
Q2. List and elaborate the differences between a biological neuron and an artificial neuron.

| Feature | ANN | Biological NN |
| Definition | A mathematical model inspired by the biological neuron. | Composed of several processing units (neurons) in the human brain that are linked together via synapses. |
| Processing | Processing is sequential and centralized. | Processing is parallel and distributed. |
| Size | Small in size. | Large in size. |
| Control mechanism | A control unit keeps track of all computation-related operations. | There is no central control; processing is managed collectively by the neurons. |
| Rate/speed | Processes information at fast speed. | Processes information at slow speed. |
| Complexity | Cannot perform complex pattern recognition. | The large quantity and complexity of the connections allow the brain to perform complicated tasks. |
| Feedback | Does not provide any feedback. | Provides feedback. |
| Structure | Input, hidden layer, weight, output. | Dendrites, cell body, synapse, axon. |
| Fault tolerance | There is no fault tolerance. | It has fault tolerance. |
| Reliability | It is very vulnerable. | It is robust. |
| Learning | Needs very accurate, structured and formatted data. | Tolerant to ambiguity. |
| Response time | Measured in nanoseconds. | Measured in milliseconds. |

Q3. Explain the components of ANNs.

Artificial neural networks consist of several interconnected components that work together to process information and generate predictions. The main components of an ANN are:

1. Neurons (nodes or units):
- Neurons are the basic processing units in an ANN. Each neuron receives one or more inputs, performs a weighted sum of these inputs, applies an activation function, and then produces an output.
- Neurons process information and pass it along to other neurons in the network.

2. Connections (edges or weights):
- Connections exist between neurons and carry information from one neuron to another.
- Each connection has an associated weight that modulates the strength of the connection.

3. Input layer:
- The input layer is responsible for receiving the initial input data.
- It serves as the entry point for external data into the neural network.

4. Hidden layers:
- Hidden layers are layers of neurons between the input and output layers.
- Information flowing through these layers is transformed and processed, allowing the network to learn complex patterns and representations.

5. Output layer:
- The output layer produces the final output of the network.
- It provides the prediction of the network for the given input.

6. Loss function:
- The loss function measures the difference between the predicted output and the actual output.
- It quantifies the error, or deviation, of the network's predictions.
- During training, the goal is to minimize the loss function, guiding the network to make more accurate predictions.

7. Optimizer:
- The optimizer is an algorithm that adjusts the weights and biases of the neural network during training to minimize the loss function.
- The optimizer guides the learning process by updating model parameters to improve predictive performance.

[Figure: network architecture with input signals flowing forward and error signals flowing backward]
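To make the roles of the loss function and the optimizer concrete, here is a minimal training-loop sketch (illustrative assumptions, not from the notes: a single linear neuron, mean squared error as the loss, and plain gradient descent as the optimizer):

```python
import numpy as np

# Toy data: learn y = 2*x with one weight and one bias.
X = np.array([0.0, 1.0, 2.0, 3.0])
Y = 2.0 * X

w, b = 0.0, 0.0          # model parameters
lr = 0.05                # learning rate used by the optimizer

for epoch in range(200):
    pred = w * X + b                     # forward pass
    loss = np.mean((pred - Y) ** 2)      # loss function: mean squared error
    # Optimizer step: plain gradient descent on the MSE gradients,
    # nudging the parameters toward lower loss.
    grad_w = np.mean(2 * (pred - Y) * X)
    grad_b = np.mean(2 * (pred - Y))
    w -= lr * grad_w
    b -= lr * grad_b

print(f"w={w:.3f}, b={b:.3f}, final loss={loss:.6f}")  # w -> ~2, b -> ~0
```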
Q4. Explain the working of a biological neuron and its components.

A biological neuron is the basic building block of the human nervous system and serves as the primary functional unit for information processing.

Components of a biological neuron:
1. Cell body (soma): the cell body contains the nucleus. It integrates incoming signals from the dendrites and decides whether to generate an output signal.
2. Dendrites: dendrites are branching extensions that receive signals from other neurons and transmit them towards the cell body.
3. Axon: the axon is a long, slender projection that carries the neuron's output signal, called an action potential, away from the cell body towards other neurons.
4. Synapse: the synapse is a junction between the axon terminal of one neuron and the dendrites of another. The signals, or the information, released into the synapse transmit signals from one neuron to another.

[Figure: biological neuron showing dendrites, cell body, axon, and synapse]

Working of a biological neuron:
Step 1. Input signals: dendrites in a biological neuron receive signals from other neurons in the form of neurotransmitters.
Step 2. Integration: the cell body (soma) integrates the signals received from the dendrites. If the combined input is sufficient to reach a certain threshold, the neuron generates an action potential.
Step 3. Activation: the generation of an action potential involves an electrochemical process that travels along the axon.
Step 4. Output signal: the action potential travels down the axon, where neurotransmitters are released into the synapse.
Step 5. Synaptic transmission: neurotransmitters released at the axon terminals bind to receptors on the dendrites of the next neuron, transmitting the signal.

Q5. Explain the framework of a neural network with a neat diagram.

- The term "Artificial Neural Network" is derived from biological neural networks that develop the structure of a human brain. Similar to the human brain, which has neurons interconnected to one another, artificial neural networks also have neurons that are interconnected to one another in various layers of the network. These neurons are known as nodes.

| Biological Neural Network | Artificial Neural Network |
| Dendrites | Inputs |
| Cell nucleus | Nodes |
| Synapse | Weights |
| Axon | Output |

- The artificial neural network is designed by programming computers to behave simply like interconnected brain cells.

The architecture of an artificial neural network:
- Artificial Neural Network primarily consists of three layers:
  - Input Layer: as the name suggests, it accepts inputs in several different formats provided by the programmer.
  - Hidden Layer: the hidden layer is present in between the input and output layers. It performs all the calculations to find hidden features and patterns.
  - Output Layer: the input goes through a series of transformations using the hidden layer, which finally results in output that is conveyed using this layer.
- The artificial neural network takes the inputs, computes the weighted sum of the inputs, and includes a bias. This computation is represented in the form of a transfer function. The determined weighted total is passed as an input to an activation function to produce the output.

Q6. Derive the first-order derivatives of the sigmoid and tanh functions.

- Activation functions are used to provide the output of a node; they determine the output of a neural network, such as yes or no.
- They map the resulting values into the range 0 to 1 or -1 to 1, depending upon the function used.
- Activation functions are broadly divided into 2 types:
  1. Linear activation function
  2. Non-linear activation function
- Sigmoid and tanh come under non-linear activation functions.

(a) Sigmoid or logistic activation function:
- The sigmoid curve looks like an S-shape.
- The main reason we use the sigmoid function is that its output exists between 0 and 1. It is therefore especially used for models where we have to predict a probability as the output, since probability exists only in the range of 0 to 1.

First-order derivative of the sigmoid function:
The sigmoid function, denoted by σ(x), is defined as:

  σ(x) = 1 / (1 + e^{-x})

Now let us find the first-order derivative with respect to x:

  σ'(x) = d/dx (1 + e^{-x})^{-1}
        = -(1 + e^{-x})^{-2} · (-e^{-x})
        = e^{-x} / (1 + e^{-x})^2
        = [1 / (1 + e^{-x})] · [e^{-x} / (1 + e^{-x})]
        = σ(x) · (1 - σ(x)),

since e^{-x} / (1 + e^{-x}) = 1 - 1/(1 + e^{-x}) = 1 - σ(x).

(b) Tanh (hyperbolic tangent) activation function:
- tanh is also sigmoidal (S-shaped); the range of the tanh function is -1 to 1.

First-order derivative of the tanh function:
The hyperbolic tangent function, denoted by tanh(x), is defined as:

  tanh(x) = (e^x - e^{-x}) / (e^x + e^{-x})

Now let us find the first-order derivative with respect to x, using the quotient rule:

  d/dx tanh(x) = [(e^x + e^{-x})^2 - (e^x - e^{-x})^2] / (e^x + e^{-x})^2
               = 1 - tanh^2(x)
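As a quick sanity check of the two derivations (a sketch, not from the notes; the test point x = 0.7 is arbitrary), the analytic formulas should agree with a central finite-difference estimate:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

x = 0.7
h = 1e-6

# Analytic derivatives derived above vs. central finite differences.
sig_analytic = sigmoid(x) * (1 - sigmoid(x))
sig_numeric = (sigmoid(x + h) - sigmoid(x - h)) / (2 * h)

tanh_analytic = 1 - np.tanh(x) ** 2
tanh_numeric = (np.tanh(x + h) - np.tanh(x - h)) / (2 * h)

print(sig_analytic, sig_numeric)    # both ~0.2217
print(tanh_analytic, tanh_numeric)  # both ~0.6347
```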
Q7. Define activation function, and discuss non-linear activation functions in NNs and their significance in the processing of information.

(c) ReLU (Rectified Linear Unit) activation function:
- The ReLU function is the most widely used activation function.
- It is defined as: f(x) = max(0, x)
- The main advantage of using the ReLU function over other activation functions is that it does not activate all the neurons at the same time.
- In the ReLU function, if the input is negative it is converted to zero and the neuron does not get activated.

(d) Leaky ReLU:
- The Leaky ReLU function is nothing but an improved version of the ReLU function. Instead of defining the ReLU function as 0 for x ≤ 0, we define it as a small linear component of x.
- It can be defined as:
  f(x) = ax, for x < 0
  f(x) = x, otherwise

Significance of activation functions in the processing of information:
1. Activation functions introduce non-linearity to the neural network.
2. Activation functions enable the network to learn and represent complex transformations and mappings, i.e., the modelling of non-linear relationships.
3. Activation functions help mitigate the vanishing/exploding gradient problem during backpropagation.
4. Activation functions contribute to the normalization of outputs.

Q8. Explain the auto-associative and hetero-associative concepts of memory in neural networks.

The two different types of associative memories (also called content-addressable memory, CAM) that are designed for different purposes are:
1. Auto-associative memory
2. Hetero-associative memory

1. Auto-associative memory:
- This is a single-layer network in which the input training vectors and the output target vectors are the same.

Architecture:
[Figure: single-layer network with n inputs and n outputs]

Working:
- During the training phase, the network is presented with a set of input patterns. The network learns to associate each input pattern with itself.
- During recall, if a presented pattern is not complete, the network should recall or complete the pattern.

Application:
- Auto-associative memories are often used for pattern completion or pattern reconstruction (a small recall sketch is given at the end of these notes).

2. Hetero-associative memory:
- This is also a single-layer neural network, but in this network the input training vector and the output target vector are not the same.

Architecture:
[Figure: single-layer network with n inputs and m outputs]

Working:
- During the training phase, the network is presented with pairs of input patterns and target output patterns. The network learns to associate each input pattern with the corresponding output pattern.
- During recall/testing, if a trained input pattern is presented, the network should recall the associated output pattern.

Application:
- Hetero-associative memories are often used in applications where the goal is to map one set of patterns to another, e.g., tasks such as recognition, classification, mapping, etc.

Q9. Explain the RNN architecture and list out the differences between a single-layer feed-forward NN and a multilayer feed-forward NN.

- RNN (Recurrent Neural Network) is a type of NN where the output from the previous step is fed as input to the current step.
- In a traditional NN, all the inputs and outputs are independent of each other, but to predict the next word in a sentence, the previous words are required.
- The main and most important feature of RNN is the hidden state, which remembers some information about a sequence.
- RNN has a memory that remembers all the information about what has been calculated.
- RNN works on the principle of saving the output of a particular layer and feeding this back to the input in order to predict the output of the layer.
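A minimal sketch of the recurrence just described (illustrative; the tanh cell and the toy dimensions are assumptions rather than anything fixed by the notes). The same hidden state h is carried from one step to the next, which is the "memory" referred to above:

```python
import numpy as np

def rnn_step(x_t, h_prev, Wx, Wh, b):
    # The new hidden state mixes the current input with the previous
    # hidden state -- this recurrence is what gives the RNN memory.
    return np.tanh(Wx @ x_t + Wh @ h_prev + b)

# Toy sizes: 3-dimensional inputs, 5-dimensional hidden state.
rng = np.random.default_rng(0)
Wx = rng.normal(size=(5, 3))
Wh = rng.normal(size=(5, 5))
b = np.zeros(5)

h = np.zeros(5)                       # initial hidden state
sequence = [rng.normal(size=3) for _ in range(4)]
for x_t in sequence:
    h = rnn_step(x_t, h, Wx, Wh, b)   # output of one step feeds the next
print(h)
```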

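Referenced from the auto-associative memory section above: a minimal recall sketch (illustrative; a Hopfield-style outer-product learning rule is assumed here, since the notes do not fix one). Two orthogonal bipolar patterns are stored, and the network completes a corrupted cue:

```python
import numpy as np

# Store bipolar (+1/-1) patterns with the Hebbian outer-product rule.
patterns = np.array([[ 1,  1,  1,  1, -1, -1, -1, -1],
                     [ 1, -1,  1, -1,  1, -1,  1, -1]])

n = patterns.shape[1]
W = np.zeros((n, n))
for p in patterns:
    W += np.outer(p, p)    # training: each pattern is associated with itself
np.fill_diagonal(W, 0)     # no self-connections

# Recall from an incomplete cue: pattern 0 with its first bit flipped.
x = np.array([-1, 1, 1, 1, -1, -1, -1, -1])
for _ in range(3):
    x = np.where(W @ x >= 0, 1, -1)   # iterate until the state settles
print(x)   # -> [ 1  1  1  1 -1 -1 -1 -1], the stored pattern, completed
```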