Machine Learning qna

The document discusses machine learning, defining it as a subset of artificial intelligence that enables systems to learn from data without explicit programming. It highlights key issues in machine learning, such as data quality, overfitting, model interpretability, and ethical concerns. Additionally, it differentiates between machine learning and deep learning, and outlines evaluation metrics for various machine learning tasks.

Uploaded by aaryaborkar67
FINOLEX ACADEMY OF MANAGEMENT AND TECHNOLOGY, RATNAGIRI
Assignment No. 1

1. Define machine learning. Describe the issues in machine learning.

Machine learning (ML) is a subset of artificial intelligence (AI) that enables a system to learn from data and improve with experience without being explicitly programmed. Key issues in machine learning include:
- Data quality: noisy, incomplete, or biased data leads to poor models.
- Overfitting: the model memorizes the training data instead of generalizing to new data.
- Model interpretability: complex models are difficult to explain.
- Ethical concerns: bias, privacy, and accountability in automated decisions.

2. Differentiate between machine learning and deep learning.
- Training time: an ML model takes less time to train due to its small size, whereas a deep learning model takes a large amount of time because of the very big number of data points.
- Applications: machine learning is used for a wide variety of applications such as regression, classification, and clustering; deep learning, on the other hand, is used for complex tasks such as image and speech recognition, natural language processing, and autonomous systems.

3. Describe the metrics used to evaluate a model.

For machine learning models, evaluation metrics vary depending on the type of problem being solved. Below are the key metrics for different ML tasks.

Classification metrics (used when predicting discrete labels, e.g. spam detection, sentiment analysis):
- Accuracy = (TP + TN) / (TP + TN + FP + FN) = correct predictions / total predictions. Best for balanced datasets but misleading for imbalanced ones.
- Precision = TP / (TP + FP). Measures how many predicted positives are actually correct; important in fraud detection.
- Recall (sensitivity) = TP / (TP + FN). Measures how many actual positives were correctly predicted; critical in medical diagnosis.
- F1-score = 2 · (Precision · Recall) / (Precision + Recall). The harmonic mean of precision and recall; useful when data is imbalanced.

Regression metrics (used when predicting continuous values):
- Mean Absolute Error: MAE = (1/n) Σ |yi − ŷi|, the average absolute difference between actual and predicted values.
- Mean Squared Error: MSE = (1/n) Σ (yi − ŷi)², which penalizes larger errors more than MAE.
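As a quick sketch, the metric formulas above can be computed directly in Python; the confusion-matrix counts and the small regression sample below are hypothetical, not taken from the assignment.

```python
# Hypothetical confusion-matrix counts (not from the assignment).
tp, tn, fp, fn = 40, 45, 5, 10

accuracy  = (tp + tn) / (tp + tn + fp + fn)         # correct / total predictions
precision = tp / (tp + fp)                          # predicted positives that are real
recall    = tp / (tp + fn)                          # actual positives that were found
f1 = 2 * precision * recall / (precision + recall)  # harmonic mean of the two

# Hypothetical regression sample for MAE and MSE.
y_true = [3.0, 5.0, 2.5]
y_pred = [2.5, 5.0, 4.0]
n = len(y_true)
mae = sum(abs(t - p) for t, p in zip(y_true, y_pred)) / n
mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n

print(accuracy, precision, recall, f1)  # 0.85, 0.888..., 0.8, 0.8421...
print(mae, mse)
```

Note how accuracy alone hides the gap between precision and recall that appears whenever FP and FN counts differ.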
- Root Mean Squared Error: RMSE = √MSE. More interpretable than MSE since it is in the same units as the target variable.
- R² score (coefficient of determination): measures how well the model explains the variance in the data; closer to 1 is better.

4. What do you mean by overfitting?

Overfitting occurs when a model is too complex and ends up learning the random noise in the training data rather than the underlying pattern. Such a model has high variance and low bias, meaning it performs well on the training data but poorly on unseen test data.

Example: consider a house-price prediction problem where we use features like square footage and number of bedrooms to predict prices. If we use a high-degree polynomial regression instead of a simple linear model, the model may fit the training data perfectly, capturing even small fluctuations (noise) in the data.

5. What is the L1 norm of a vector? Give an example.

The L1 norm, also known as the Manhattan norm, of a vector A is the sum of the absolute values of its components: for A = (A1, A2, …, An), ||A||1 = Σ |Ai|. For example, if A = (1, −2, 3), then ||A||1 = |1| + |−2| + |3| = 6.

6. What is diagonalization? Decide whether the given matrix A is diagonalizable or not; if yes, then perform the eigen decomposition.

Diagonalization is the process of converting a matrix A into a diagonal matrix D using a similarity transformation, A = P D P⁻¹, where the columns of P are the eigenvectors of A and the diagonal elements of D are the eigenvalues. This is useful because diagonal matrices are easier to work with. A is diagonalizable when, for every eigenvalue, the geometric multiplicity equals the algebraic multiplicity, i.e. A has a full set of linearly independent eigenvectors.

[The worked computation in the source is largely illegible: it finds the eigenvalues as the diagonal elements of D, solves (A − λI)x = 0 for each eigenvector, concludes that the matrix is diagonalizable, and writes A = P D P⁻¹.]
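Since the matrix in the worked example is not legible, here is a minimal NumPy sketch of eigen decomposition on a hypothetical symmetric 2×2 matrix (real symmetric matrices are always diagonalizable), together with the L1-norm computation from the previous question.

```python
import numpy as np

# Hypothetical symmetric matrix; the assignment's actual matrix is illegible.
A = np.array([[4.0, 1.0],
              [1.0, 4.0]])

eigvals, P = np.linalg.eig(A)  # columns of P are eigenvectors of A
D = np.diag(eigvals)           # eigenvalues on the diagonal of D

# Eigen decomposition: A = P D P^-1
A_rebuilt = P @ D @ np.linalg.inv(P)
print(np.allclose(A, A_rebuilt))  # True

# L1 (Manhattan) norm: sum of absolute components.
v = np.array([1.0, -2.0, 3.0])
print(np.linalg.norm(v, 1))  # 6.0
```

For this matrix the eigenvalues work out to 5 and 3, so D is diag(5, 3) up to ordering.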
7. How do you decide the hyperplane for linearly separable classes using SVM?

In Support Vector Machines (SVM), the goal is to find the optimal hyperplane that best separates two linearly separable classes in an N-dimensional space. The hyperplane is determined using the maximum-margin principle, which ensures the best generalization performance.

Steps to decide the hyperplane:
1. Equation of the hyperplane: a hyperplane in N-dimensional space is given by wᵀx + b = 0, where w is the weight vector, x is the input feature vector, and b is the bias term.
2. Support vectors: these are the data points closest to the hyperplane, and they define the margin.
3. Margin calculation: the margin is the perpendicular distance between the hyperplane and the nearest support vectors, given by 2 / ||w||. The objective of SVM is to maximize this margin while correctly classifying every point.
4. Optimization problem: to find the optimal hyperplane, solve the constrained optimization problem: minimize (1/2) ||w||² subject to yi (wᵀxi + b) ≥ 1 for all i, where yi ∈ {−1, +1} are the class labels.
5. Use of Lagrange multipliers: the above problem is solved using Lagrange multipliers and the KKT conditions, leading to the dual problem: maximize Σi αi − (1/2) Σi Σj αi αj yi yj xiᵀxj subject to Σi αi yi = 0 and αi ≥ 0.
6. Decision boundary: a new point x is classified as f(x) = sign(wᵀx + b).
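A minimal sketch of the SVM procedure above, using scikit-learn's SVC (an assumed dependency); the six training points are hypothetical linearly separable data, and a very large C approximates the hard-margin formulation.

```python
import numpy as np
from sklearn.svm import SVC  # assumed dependency (scikit-learn)

# Hypothetical linearly separable classes on either side of x1 + x2 = 4.
X = np.array([[1.0, 1.0], [0.0, 1.0], [1.0, 0.0],
              [3.0, 3.0], [4.0, 3.0], [3.0, 4.0]])
y = np.array([-1, -1, -1, 1, 1, 1])

# A very large C leaves almost no room for slack, approximating hard-margin SVM.
clf = SVC(kernel="linear", C=1e6).fit(X, y)

w = clf.coef_[0]                  # weight vector of the hyperplane w^T x + b = 0
b = clf.intercept_[0]             # bias term
margin = 2.0 / np.linalg.norm(w)  # maximum-margin width, 2 / ||w||

print(w, b, margin)
print(clf.support_vectors_)  # the points that define the margin
```

For these points the closest cross-class pair is (1, 1) and (3, 3), so the maximal margin equals their separation, 2√2 ≈ 2.83, with those two points as the support vectors.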
8. [A worked example of fitting a simple regression line to a small data table appears here in the source; the table and the fitted coefficients are not reliably legible.]

9. Prove that β = (XᵀX)⁻¹ XᵀY.

The general form of multivariate regression is Y = Xβ + ε, where Y is the target vector, X is the design matrix, β is the coefficient vector, and ε is the error. To estimate β, we minimize the sum of squared errors (SSE). In matrix form the cost becomes

J(β) = (Y − Xβ)ᵀ (Y − Xβ) = YᵀY − 2βᵀXᵀY + βᵀXᵀXβ.

To minimize J(β), take the derivative with respect to β and set it to zero:

∂J/∂β = −2XᵀY + 2XᵀXβ = 0
XᵀXβ = XᵀY
β = (XᵀX)⁻¹ XᵀY.

Hence proved.

10. For the following data, plot the data points on a 2D plane and compute the optimal separating hyperplane using hard-margin SVM. [The data table (columns X1, X2, Y) is illegible in the source.]
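The closed-form normal-equation result derived above can be checked numerically; the data below is hypothetical (roughly y = 2 + 3x with small noise), not the assignment's table.

```python
import numpy as np

# Hypothetical data, roughly y = 2 + 3x with small noise.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
Y = np.array([2.1, 4.9, 8.2, 10.9, 14.1])

# Design matrix with an intercept column, matching Y = X beta + eps.
X = np.column_stack([np.ones_like(x), x])

# Normal equation: beta = (X^T X)^-1 X^T Y
beta = np.linalg.inv(X.T @ X) @ (X.T @ Y)

# Cross-check against NumPy's least-squares solver.
beta_lstsq, *_ = np.linalg.lstsq(X, Y, rcond=None)
print(beta)                           # intercept and slope
print(np.allclose(beta, beta_lstsq))  # True
```

In practice np.linalg.lstsq (or np.linalg.solve on XᵀXβ = XᵀY) is preferred over explicitly inverting XᵀX, for numerical stability.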
