Machine Learning Quantum

Uploaded by wouldbepm
PUBLISHED BY : Apram Singh, Quantum Publications (A Unit of Quantum Page Pvt. Ltd.), Industrial Area, Sahibabad, Ghaziabad-201 010
Phone : 0120-4160879
Delhi Office : East Rohtas Nagar, Shahdara, Delhi-110032
Website : www.quantumpage.co.in

All Rights Reserved. No part of this publication may be reproduced or transmitted, in any form or by any means, without permission. Information contained in this work is obtained by the publisher from sources believed to be reliable. Every effort has been made to ensure accuracy; however, neither the publisher nor the authors guarantee the accuracy or completeness of any information published herein, and neither the publisher nor the authors shall be responsible for any errors, omissions, or damages arising out of use of this information.

Machine Learning (CS/IT/OE-II : Sem-8)
1st Edition : 2019-20
Price : Rs. 80/- only

CONTENTS

UNIT-1 : INTRODUCTION
Well-defined learning problems, Designing a learning system, Issues in machine learning; THE CONCEPT LEARNING TASK - General-to-specific ordering of hypotheses.

UNIT-2 : DECISION TREE LEARNING
Decision tree learning algorithm, Inductive bias, Issues in decision tree learning; ARTIFICIAL NEURAL NETWORKS - Perceptrons, Gradient descent and the Delta rule, Adaline, Multilayer networks, Derivation of backpropagation rule, Backpropagation algorithm, Convergence.

UNIT-3 : EVALUATING HYPOTHESES
Evaluating hypotheses : Estimating hypotheses accuracy, Basics of sampling theory, Comparing learning algorithms; Bayesian learning : Bayes theorem, Concept learning, Bayes optimal classifier, Naive Bayes classifier, Bayesian belief networks, EM algorithm.

UNIT-4 : COMPUTATIONAL LEARNING THEORY
Computational learning theory : Sample complexity for finite hypothesis spaces, Sample complexity for infinite hypothesis spaces, The mistake bound model of learning; INSTANCE-BASED LEARNING - k-Nearest Neighbour learning, Locally weighted regression, Radial basis function networks, Case-based learning.
UNIT-5 : GENETIC ALGORITHMS
Genetic algorithms : An illustrative example, Hypothesis space search, Genetic programming, Models of evolution and learning; Learning first-order rules - Sequential covering algorithms, General-to-specific beam search, FOIL; REINFORCEMENT LEARNING - The learning task, Q-learning.

SHORT QUESTIONS

Que 1.1. Explain briefly the term machine learning.

Answer
1. Machine learning is an application of Artificial Intelligence (AI) that provides systems the ability to automatically learn and improve from experience without being explicitly programmed.
2. Machine learning focuses on the development of computer programs that can access data and use it to learn for themselves.
3. The primary aim is to allow the computers to learn automatically, without human intervention or assistance, and adjust actions accordingly.
4. Machine learning enables analysis of massive quantities of data.
5. It generally delivers faster and more accurate results in order to identify profitable opportunities or dangerous risks.
6. Combining machine learning with AI and cognitive technologies can make it even more effective in processing large volumes of information.

Que 1.2. Describe different types of machine learning algorithms.

Answer
Different types of machine learning algorithms are :
1. Supervised machine learning algorithms :
   a. Supervised learning is defined when the model is getting trained on a labelled dataset.
   b. A labelled dataset has both input and output parameters.
   c. In this type of learning, both training and validation datasets are labelled.
2. Unsupervised machine learning algorithms :
   a. Unsupervised machine learning is used when the information is neither classified nor labelled.
   b. Unsupervised learning studies how systems can infer a function to describe a hidden structure from unlabelled data.
   c. The system does not figure out the right output, but it explores the data and can draw inferences from datasets to describe hidden structures.
3. Semi-supervised machine learning algorithms :
   a. Semi-supervised machine learning algorithms fall in between supervised and unsupervised learning, since they use both labelled and unlabelled data for training.
   b. The systems that use this method are able to improve learning accuracy considerably.
   c. Semi-supervised learning is chosen when the acquired labelled data requires skilled and relevant resources in order to train / learn from it.
4. Reinforcement machine learning algorithms :
   a. Reinforcement learning is a learning method that interacts with its environment by producing actions and discovering errors or rewards.
   b. Trial-and-error search and delayed reward are the most relevant characteristics of reinforcement learning.
   c. This method allows machines and software agents to automatically determine the ideal behaviour within a specific context in order to maximise performance.
   d. Simple reward feedback is required for the agent to learn which action is best.
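The contrast between supervised learning (labelled input-output pairs) and unsupervised learning (structure inferred from unlabelled data) can be illustrated with a small self-contained Python sketch. All data, thresholds, and function names here are illustrative, not taken from the text:

```python
# Illustrative contrast between supervised and unsupervised learning
# on toy 1-D data (hypothetical values).

def nearest_neighbour_predict(train, x):
    """Supervised: labelled (input, output) pairs drive the prediction."""
    nearest = min(train, key=lambda pair: abs(pair[0] - x))
    return nearest[1]

def two_means_cluster(points, iterations=10):
    """Unsupervised: no labels; two groups are inferred from the data."""
    c1, c2 = min(points), max(points)          # crude initial centres
    for _ in range(iterations):
        g1 = [p for p in points if abs(p - c1) <= abs(p - c2)]
        g2 = [p for p in points if abs(p - c1) > abs(p - c2)]
        c1 = sum(g1) / len(g1)                 # recompute centres
        c2 = sum(g2) / len(g2)
    return sorted(g1), sorted(g2)

labelled = [(1.0, "cold"), (2.0, "cold"), (30.0, "hot"), (35.0, "hot")]
print(nearest_neighbour_predict(labelled, 28.0))   # → "hot"

unlabelled = [1.0, 2.0, 1.5, 30.0, 35.0, 32.0]
print(two_means_cluster(unlabelled))               # two groups emerge
```

The supervised sketch can only answer because every training point carries a label; the clustering sketch receives no labels at all, matching the distinction drawn above.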
Que 1.3. What are the advantages and disadvantages of different types of machine learning algorithms?

Answer
Advantages of supervised machine learning algorithms :
1. Classes represent the features on the ground.
2. Training data is reusable unless the features change.
Disadvantages of supervised machine learning algorithms :
1. Classes may not match spectral classes.
2. There is varying consistency in the classes.
3. Cost and time are involved in selecting the training data.
Advantages of unsupervised machine learning algorithms :
1. The opportunity for human error is minimised.
2. It produces unique spectral classes.
3. It is relatively easy and fast to carry out.
Disadvantages of unsupervised machine learning algorithms :
1. The spectral classes do not necessarily represent the features on the ground.
2. It does not consider spatial relationships in the data.
3. It can take time to interpret the spectral classes.
Advantages of semi-supervised machine learning algorithms :
1. It is easy to understand.
2. It reduces the amount of annotated data used.
3. It is stable and fast in convergence.
4. It is simple.
5. It has high efficiency.
Disadvantages of semi-supervised machine learning algorithms :
1. It is not applicable to network-level data.
Advantages of reinforcement learning algorithms :
1. Reinforcement learning is used to solve complex problems that cannot be solved by conventional techniques.
2. This technique is preferred to achieve long-term results, which are very difficult to achieve.
3. This learning model is very similar to the learning of human beings. Hence, it is close to achieving perfection.
Disadvantages of reinforcement learning algorithms :
1. Too much reinforcement learning can lead to an overload of states, which can diminish the results.
2. Reinforcement learning is not preferable for solving simple problems.

Que 1.4. Describe briefly the applications of machine learning.

Answer
Applications of machine learning are :
1. Image recognition :
   a. It involves the task of identifying and detecting an object or a feature in a digital image.
   b. This is used in many applications like systems for factory automation, toll-booth monitoring and security surveillance.
2. Speech recognition :
   a. Speech Recognition (SR) is the translation of spoken words into text.
   b. It is also known as Automatic Speech Recognition (ASR), computer speech recognition, or Speech To Text (STT).
   c. In speech recognition, a software application recognises spoken words.
3. Medical diagnosis :
   a. ML provides methods, techniques, and tools that can help in solving diagnostic and prognostic problems in a variety of medical domains.
   b. It is being used for the analysis of the importance of clinical parameters and their combinations for prognosis.
4. Statistical arbitrage :
   a. In finance, statistical arbitrage refers to automated trading strategies that are typically short-term and involve a large number of securities.
   b. In such strategies, the user tries to implement a trading algorithm for a set of securities on the basis of quantities such as historical correlations and general economic variables.
5. Learning associations : Learning association is the process for discovering relations between variables in large databases.
6. Extraction :
   a. Information Extraction (IE) is another application of machine learning.
   b. It is the process of extracting structured information from unstructured data.

Que 1.5. What are the advantages and disadvantages of machine learning?

Answer
Advantages of machine learning are :
1. Easily identifies trends and patterns :
   a. Machine learning can review large volumes of data and discover specific trends and patterns that would not be apparent to humans.
   b. For an e-commerce website like Flipkart, it serves to understand the browsing behaviours and purchase histories of its users to help cater to the right products, deals, and reminders relevant to them.
   c. It uses the results to reveal relevant advertisements to them.
2. No human intervention needed (automation) : Machine learning does not require physical force, i.e., no human intervention is needed.
3. Continuous improvement :
   a. As ML algorithms gain experience, they keep improving in accuracy and efficiency.
   b. As the amount of data keeps growing, the algorithms learn to make accurate predictions faster.
4. Handling multidimensional and multi-variety data : Machine learning algorithms are good at handling data that are multidimensional and multi-variety.
Disadvantages of machine learning are :
1. Massive data requirement : Machine learning requires massive data sets to train on, and these should be inclusive/unbiased and of good quality.
2. Time and resources :
   a. ML needs enough time to let the algorithms learn and develop enough to fulfil their purpose with a considerable amount of accuracy and relevancy.
   b. It also needs massive resources to function.
3. Interpretation of results : To accurately interpret results generated by the algorithms, we must carefully choose the algorithms for our purpose.
4. High error-susceptibility :
   a. Machine learning is autonomous but highly susceptible to errors.
   b. It takes time to recognise the source of the issue, and even longer to correct it.

Que 1.6. Write a short note on well-defined learning problems with example.

Answer
Well-defined learning problem : A computer program is said to learn from experience E with respect to some class of tasks T and performance measure P, if its performance at tasks in T, as measured by P, improves with experience E.
Three features in learning problems :
1. The class of tasks (T)
2. The measure of performance to be improved (P)
3. The source of experience (E)
For example :
1. A checkers learning problem :
   a. Task (T) : Playing checkers.
   b. Performance measure (P) : Percent of games won against opponents.
   c. Training experience (E) : Playing practice games against itself.
2. A handwriting recognition learning problem :
   a. Task (T) : Recognising and classifying handwritten words within images.
   b. Performance measure (P) : Percent of words correctly classified.
   c. Training experience (E) : A database of handwritten words with given classifications.
3. A robot driving learning problem :
   a. Task (T) : Driving on public four-lane highways using vision sensors.
   b. Performance measure (P) : Average distance travelled before an error (as judged by human overseer).
   c. Training experience (E) : A sequence of images and steering commands recorded while observing a human driver.

Que 1.7. Describe well-defined learning problems' roles in machine learning.

Answer
Well-defined learning problems play the following roles in machine learning :
1. Learning to recognise spoken words :
   a. Successful speech recognition systems employ machine learning in some form.
   b. For example, the SPHINX system learns speaker-specific strategies for recognising the primitive sounds (phonemes) and words from the observed speech signal.
   c. Neural network learning methods and methods for learning hidden Markov models are effective for automatically customising to individual speakers, vocabularies, microphone characteristics, background noise, etc.
   d. Similar techniques have potential applications in many signal-interpretation problems.
2. Learning to drive an autonomous vehicle :
   a. Machine learning methods have been used to train computer-controlled vehicles to steer correctly when driving on a variety of road types.
   b. For example, the ALVINN system has used its learned strategies to drive unassisted at 70 miles per hour for 90 miles on public highways among other cars.
   c. Similar techniques have possible applications in many sensor-based control problems.
3. Learning to classify new astronomical structures :
   a. Machine learning methods have been applied to a variety of large databases to learn general regularities implicit in the data.
   b. For example, decision tree learning algorithms have been used by NASA to learn how to classify celestial objects from the second Palomar Observatory Sky Survey.
   c. This system is used to automatically classify all objects in the Sky Survey, which consists of three terabytes of image data.
4. Learning to play world-class backgammon :
   a. The most successful computer programs for playing games such as backgammon are based on machine learning algorithms.
   b. For example, the world's top computer program for backgammon, TD-GAMMON, learned its strategy by playing over one million practice games against itself.
   c. It now plays at a level competitive with the human world champion.
   d. Similar techniques have applications in many practical problems where large search spaces must be examined efficiently.

Que 1.8. What is learning? Explain the important components of learning.

Answer
1. Learning refers to the change in a subject's behaviour to a given situation brought about by repeated experiences in that situation, provided that the behaviour changes cannot be explained on the basis of native response tendencies, maturation, or temporary states of the subject.
2. A learning agent can be thought of as containing a performance element that decides what actions to take, and a learning element that modifies the performance element so that it makes better decisions.
3. The design of a learning element is affected by three major issues :
   a. Components of the performance element.
   b. Feedback of components.
   c. Representation of components.
The important components of learning are acquisition of new knowledge and problem solving.

Que 1.9. What are the steps used to design a learning system?

Answer
Steps used to design a learning system are :
1. Specify the learning task.
2. Choose a suitable set of training data to serve as the training experience.
3. Divide the training data into groups or classes and label accordingly.
4. Determine the type of knowledge representation to be learned from the training experience.
5. Choose a learner classifier that can generate general hypotheses from the training data.
6. Apply the learner classifier to the test data.
7. Compare the performance of the system with that of an expert human.
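The design steps above can be sketched end-to-end in a few lines of Python. The data, the single-threshold representation, and the function names are hypothetical choices made only for illustration:

```python
# A minimal end-to-end sketch of the design steps above, on hypothetical
# toy data: learn one threshold that separates two labelled classes.

# Step 2: training experience -- labelled examples (feature, class)
data = [(0.5, 0), (1.0, 0), (1.5, 0), (4.0, 1), (4.5, 1), (5.0, 1)]

# Step 3: divide the data into training and test groups
train, test = data[:4], data[4:]

# Step 4: knowledge representation -- "class 1 iff feature > t"
def accuracy(t, examples):
    return sum((x > t) == bool(y) for x, y in examples) / len(examples)

# Step 5: the learner picks the threshold maximising training accuracy
candidates = [x for x, _ in train]
t_best = max(candidates, key=lambda t: accuracy(t, train))

# Steps 6-7: apply the learned hypothesis to unseen test data
print(t_best, accuracy(t_best, test))
```

Running the sketch selects the threshold 1.5 and classifies both held-out test examples correctly, mirroring the specify / train / evaluate cycle described above.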
The two components of learning are :
1. Acquisition of new knowledge :
   a. One component of learning is the acquisition of new knowledge.
   b. Simple data acquisition is easy for computers, even though it is difficult for people.
2. Problem solving : The other component of learning is the problem solving that is required both to integrate into the system new knowledge that is presented to it and to deduce new information when required facts are not presented.

Que 1.10. Differentiate between artificial intelligence and machine learning.

Answer

S.No. | Artificial Intelligence (AI)                                         | Machine Learning (ML)
1.    | AI is human intelligence demonstrated by machines to perform simple to complex tasks. | ML provides machines the ability to learn and understand without being explicitly programmed.
2.    | The idea behind AI is to program machines to carry out tasks in smarter ways.          | The idea behind ML is to teach computers to think and understand on their own.
3.    | It is based on the characteristics of human intelligence.                              | It is based on the system of self-learning from experience.

Que 1.11. How is data split in machine learning?

Answer
Data is split in three ways in machine learning :
1. Training data :
   a. The part of data we use to train our model.
   b. This is the data which our model actually sees (both input and output) and learns from.
2. Validation data :
   a. The part of data which is used to do a frequent evaluation of the model, fit on the training dataset, along with improving the involved hyperparameters (initially set parameters, before the model begins learning).
   b. This data plays its part when the model is actually training.
3. Testing data :
   a. Once our model is completely trained, testing data provides an unbiased evaluation.
   b. When we feed in the inputs of testing data, our model will predict some values without seeing the actual output.
   c. After prediction, we evaluate our model by comparing its predictions with the actual output present in the testing data.
   d. This is how we evaluate and see how much our model has learned from the experiences fed in as training data.

Que 1.12. Define the basic terms used in machine learning.

Answer
1. Features : A set of variables that carry discriminating and characterising information about the objects under consideration.
2. Feature vector : A collection of d features, ordered in a meaningful way into a d-dimensional column vector, that represents the signature of the object to be identified :
      x = [x1, x2, ..., xd]^T
3. Feature space : Feature space is the d-dimensional space in which the feature vectors lie.
4. Class : The category to which a given object belongs, denoted by ω.
5. Decision boundary : A boundary in the d-dimensional feature space that separates patterns of different classes from each other.
6. Classifier : An algorithm which adjusts its parameters to find the correct decision boundaries, through a learning algorithm using a training dataset, such that a cost function is minimised.
7. Error : Incorrect labelling of the data by the classifier.
8. Training performance : The ability/performance of the classifier in correctly identifying the classes of the training data, which it has already seen.
9. Generalization (test performance) : Generalization is the ability/performance of the classifier in identifying the classes of previously unseen data.
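The terms defined above (feature vector, class, decision boundary, classifier, error) can be made concrete with a minimal sketch. It assumes a linear boundary w·x + b = 0 on toy 2-D feature vectors; the weights and sample points are illustrative only:

```python
# Hypothetical illustration: each object is a 2-D feature vector
# x = [x1, x2]; the decision boundary is the line w.x + b = 0, and the
# classifier assigns a class by the sign of w.x + b.

def classify(w, b, x):
    score = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if score > 0 else 0        # class label in {0, 1}

w, b = [1.0, 1.0], -5.0                 # boundary: x1 + x2 = 5

# (feature vector, true class) pairs lying on either side of the boundary
samples = [([1.0, 2.0], 0), ([4.0, 4.0], 1), ([0.5, 1.0], 0)]

# "error" in the sense defined above: incorrect labelling by the classifier
errors = sum(classify(w, b, x) != y for x, y in samples)
print(errors)                           # 0: every vector falls on the correct side
```

Here the training performance would be measured by counting such errors on seen data, and generalization by repeating the count on unseen feature vectors.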
Que 1.13. Explain the components of a machine learning system.

Answer
Components of a machine learning system are :
1. Sensing :
   a. It uses a transducer such as a camera or a microphone for input.
   b. A PR (Pattern Recognition) system depends on the bandwidth, resolution, sensitivity, distortion, etc., of the transducer.
2. Segmentation : Patterns should be well separated and should not overlap.
3. Feature extraction :
   a. It is used for distinguishing features.
   b. This process extracts features that are invariant with respect to translation, rotation and scale.
4. Classification :
   a. It uses the feature vector provided by a feature extractor to assign the object to a category.
   b. It is not always possible to determine the values of all the features.
5. Post-processing : A post-processor uses the output of the classifier to decide on the recommended action.

Que 1.14. What are the classes of problems in machine learning?

Answer
Common classes of problems in machine learning are :
1. Classification :
   a. In classification, data is labelled, i.e., it is assigned a class, for example, spam/non-spam or fraud/non-fraud.
   b. The decision being modelled is to assign labels to new unlabelled pieces of data.
   c. This can be thought of as a discrimination problem, modelling the differences or similarities between groups.
2. Regression :
   a. In regression, data is labelled with a real value rather than a label.
   b. The decision being modelled is what value to predict for new unpredicted data.
3. Clustering :
   a. In clustering, data is not labelled, but can be divided into groups based on similarity and other measures of natural structure in the data.
   b. For example, organising pictures by faces without names, where the human user has to assign names to the groups, like iPhoto on the Mac.
4. Rule extraction :
   a. In rule extraction, data is used as the basis for the extraction of propositional rules.
   b. These rules discover statistically supportable relationships between attributes in the data.

Que 1.15. Briefly explain the issues related with machine learning.

Answer
Issues related with machine learning are :
1. Data quality : It is essential to have good quality data to produce quality ML algorithms and models.
2. Transparency : It is difficult to make definitive statements on how well a model is going to generalize in new environments.
3. Misapplication :
   a. The most common issue with ML is people using it where it does not apply.
   b. Whenever there is some new innovation in ML, we see overzealous engineers trying to use it where it is not really necessary.
   c. This used to happen a lot with deep learning and neural networks.
4. Reproducibility : Traceability and reproduction of results are two main issues.

The Concept Learning Task : General-to-Specific Ordering of Hypotheses, Find-S, The List-Then-Eliminate Algorithm, The Candidate Elimination Algorithm, Inductive Bias.

Que 1.16. Write a short note on concept learning task.

Answer
1. The task of learning, in contrast to learning from observation, can be described by being given a set of tuples (Ai, Ci), where the Ai ∈ A represent the observable part of the data (often denoted as a vector of attributes in the common formalism) and the Ci represent a valuation of this data.
2. If a functional relationship between the Ai and Ci values is to be discovered, this task is either called regression (in the statistics domain) or supervised learning (in the machine learning domain).
3. The more special case where the Ci values are restricted to some finite set C is called classification or concept learning in computational learning theory.
4. The classical approach to concept learning is concerned with learning concept descriptions for predefined classes Ci of entities from E.
5. A concept is regarded as a function mapping attribute values Ai of discrete attributes to a Boolean value indicating concept membership.
6. In this case, the set of entities E is defined by the outer product over the range of the considered attributes in A.
7. Concepts are described as hypotheses, i.e., conjunctions of restrictions on allowed attribute values (like allowing just one specific value, a set of values, or any value for an attribute). The task of classical concept learning consists of finding a hypothesis for each class Ci that matches the training data.
8. This task can be performed as a directed search in hypothesis space by exploiting a pre-existing ordering relation, called the general-to-specific ordering of hypotheses.
9. A hypothesis thereby is more general than another if its set of allowed instances is a superset of the set of instances belonging to the other hypothesis.

Que 1.17. Define the terms concept and concept learning. How can we represent a concept?

Answer
Concept : A concept is a boolean-valued function defined over a set of objects or events.
Concept learning : Concept learning is defined as inferring a boolean-valued function from training examples of input and output of the function.
Concept learning can be represented using :
1. Instance x : Instance x is a collection of attributes (Sky, AirTemp, Humidity, etc.).
2. Target function c : EnjoySport : X → {0, 1}.
3. Hypothesis h : Hypothesis h is a conjunction of constraints on the attributes. A constraint can be :
   a. a specific value (for example, Water = Warm),
   b. don't care (for example, Water = ?),
   c. no value allowed (for example, Water = ∅).
4. Training example d : An instance paired with the target function c :
   c(x) = 0 : negative example
   c(x) = 1 : positive example

Que 1.18. Explain the working of Find-S algorithm with flow chart.

Answer
Working of Find-S algorithm :
1. The process starts with initialising h with the most specific hypothesis; generally, it is the first positive example in the data set.
2. We check for each positive example. If the example is negative, we move on to the next example; but if it is a positive example, we consider it for the next step.
3. We check if each attribute in the example is equal to the corresponding value in the hypothesis.
4. If the value matches, then no changes are made.
5. If the value does not match, the value is changed to '?' (don't care).

Que 1.19. Write Find-S algorithm, with its advantages and disadvantages.

Answer
Find-S algorithm :
1. Initialise h to the most specific hypothesis in H.
2. For each positive training instance x : for each attribute constraint ai in h, IF the constraint ai is satisfied by x, THEN do nothing, ELSE replace ai in h by the next more general constraint that is satisfied by x.
3. Output hypothesis h.
Advantages of Find-S algorithm :
1. If the correct target concept is contained in H, and the training examples are correct, the Find-S algorithm guarantees that the output is the most specific hypothesis in H that is consistent with the positive training examples.
Disadvantages of Find-S algorithm :
1. There is no way to determine if the hypothesis is consistent throughout the data.
2. Inconsistent training sets can mislead the Find-S algorithm, since it ignores the negative examples.
3. Find-S algorithm does not provide a backtracking technique to determine the best possible change that could be done to improve the resulting hypothesis.
4. The convergence of the learning process is poor, and convergence to the correct objective function cannot be guaranteed.

Que 1.20. Define consistent hypothesis and version space. Write the List-Then-Eliminate algorithm.

Answer
1. A hypothesis h is consistent with a set of training examples D of target concept c if and only if h(x) = c(x) for each training example ⟨x, c(x)⟩ in D :
      Consistent(h, D) ≡ (∀⟨x, c(x)⟩ ∈ D) h(x) = c(x)
2. The version space, VS_H,D, with respect to hypothesis space H and training examples D, is the subset of hypotheses from H consistent with all training examples :
      VS_H,D ≡ {h ∈ H | Consistent(h, D)}
List-Then-Eliminate algorithm :
1. The List-Then-Eliminate algorithm initialises the version space to contain all hypotheses in H, and then eliminates any hypothesis found inconsistent with any training example.
2. The version space of candidate hypotheses thus shrinks as more examples are observed, until ideally just one hypothesis remains that is consistent with all the observed examples; presumably, this is the desired target concept.
3. If insufficient data is available to narrow the version space to a single hypothesis, then the algorithm can output the entire set of hypotheses consistent with the observed data.
4. The List-Then-Eliminate algorithm can be applied whenever the hypothesis space H is finite.
5. It has many advantages, including the fact that it is guaranteed to output all hypotheses consistent with the training data, but it requires exhaustively enumerating all hypotheses in H, an unrealistic requirement for all but the most trivial hypothesis spaces.
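Find-S, Consistent(h, D), and List-Then-Eliminate as described above can be sketched in Python. The attribute values and the explicitly enumerated hypothesis space below are toy EnjoySport-style choices for illustration, with '?' meaning "any value":

```python
# A sketch of Find-S, Consistent(h, D), and List-Then-Eliminate on toy
# two-attribute data. '?' means "any value"; the data is illustrative.

def matches(h, x):
    """True if hypothesis h covers instance x."""
    return all(hi == "?" or hi == xi for hi, xi in zip(h, x))

def consistent(h, examples):
    """Consistent(h, D): h(x) = c(x) for every <x, c(x)> in D."""
    return all(matches(h, x) == bool(c) for x, c in examples)

def find_s(examples):
    """Generalise the most specific hypothesis over positives only."""
    positives = [x for x, c in examples if c == 1]
    h = list(positives[0])
    for x in positives[1:]:
        for i in range(len(h)):
            if h[i] != x[i]:
                h[i] = "?"          # next more general constraint
    return tuple(h)

def list_then_eliminate(H, examples):
    """Keep every h in the finite space H consistent with all examples."""
    return [h for h in H if consistent(h, examples)]

D = [(("Sunny", "Warm"), 1), (("Rainy", "Cold"), 0), (("Sunny", "Cold"), 1)]
print(find_s(D))                    # ('Sunny', '?')

H = [(s, w) for s in ("Sunny", "Rainy", "?") for w in ("Warm", "Cold", "?")]
print(list_then_eliminate(H, D))    # the version space for D
```

On this data the version space collapses to a single hypothesis, which coincides with the Find-S output, illustrating point 2 of the List-Then-Eliminate discussion above.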
Que 1.21. Explain candidate elimination algorithm.

Answer
1. The Candidate-Elimination algorithm computes the version space containing all hypotheses from H that are consistent with an observed sequence of training examples.
2. It begins by initialising the version space to the set of all hypotheses in H; that is, by initialising the G boundary set to contain the most general hypothesis G0 = {⟨?, ?, ?, ?, ?, ?⟩} and initialising the S boundary set to contain the most specific hypothesis S0 = {⟨∅, ∅, ∅, ∅, ∅, ∅⟩}.
3. These two boundary sets delimit the entire hypothesis space, because every other hypothesis in H is both more general than S0 and more specific than G0.
4. As each training example is considered, the S and G boundary sets are generalised and specialised, respectively, to eliminate from the version space any hypothesis found inconsistent with the new training example.
5. After all examples have been processed, the computed version space contains all the hypotheses consistent with these examples, and only these hypotheses.
Algorithm :
Initialise G to the set of maximally general hypotheses in H.
Initialise S to the set of maximally specific hypotheses in H.
For each training example d, do :
   If d is a positive example :
      Remove from G any hypothesis that does not include d.
      For each hypothesis s in S that does not include d :
         Remove s from S.
         Add to S all minimal generalizations h of s such that h includes d and
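The full Candidate-Elimination update, including the symmetric negative-example branch, can be sketched as follows. This follows the standard textbook formulation for conjunctive hypotheses; the two-attribute domains and the data are toy choices for illustration:

```python
# Hedged sketch of the Candidate-Elimination algorithm for conjunctive
# hypotheses over small, explicit attribute domains (toy data).

DOMAINS = [("Sunny", "Rainy"), ("Warm", "Cold")]

def matches(h, x):
    return all(hi == "?" or hi == xi for hi, xi in zip(h, x))

def more_general_or_equal(h1, h2):
    """True if h1 is at least as general as h2."""
    return all(a == "?" or a == b for a, b in zip(h1, h2))

def candidate_elimination(examples):
    S = [tuple("0" for _ in DOMAINS)]      # most specific boundary
    G = [tuple("?" for _ in DOMAINS)]      # most general boundary
    for x, y in examples:
        if y == 1:                         # positive example
            G = [g for g in G if matches(g, x)]
            new_S = []
            for s in S:
                if matches(s, x):
                    new_S.append(s)
                else:                      # minimal generalisation of s
                    h = tuple(xi if si == "0" else (si if si == xi else "?")
                              for si, xi in zip(s, x))
                    if any(more_general_or_equal(g, h) for g in G):
                        new_S.append(h)
            S = new_S
        else:                              # negative example
            S = [s for s in S if not matches(s, x)]
            new_G = []
            for g in G:
                if not matches(g, x):
                    new_G.append(g)
                    continue
                for i, dom in enumerate(DOMAINS):   # minimal specialisations
                    if g[i] != "?":
                        continue
                    for v in dom:
                        if v == x[i]:
                            continue
                        h = g[:i] + (v,) + g[i + 1:]
                        if any(more_general_or_equal(h, s) for s in S):
                            new_G.append(h)
            G = new_G
    return S, G

D = [(("Sunny", "Warm"), 1), (("Rainy", "Cold"), 0)]
print(candidate_elimination(D))
```

For this data the S boundary becomes {("Sunny", "Warm")} and the G boundary becomes {("Sunny", ?), (?, "Warm")}, so the two sets delimit the version space exactly as described in points 3-5 above.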