Questions and Solutions On Bayes Theorem
Uploaded by KHAN AZHAR ATHAR
5 Naive Bayes (8 pts)

Consider a Naive Bayes problem with three features, x1 ... x3. Imagine that we have seen a total of 12 training examples, 6 positive (with y = 1) and 6 negative (with y = 0). Here is a table with some of the counts:

            y = 0 | y = 1
   x1 = 1     6   |   6
   x2 = 1     0   |   0
   x3 = 1     2   |   4

1. Supply the following estimated probabilities. Use the Laplacian correction.

   Pr(x2 = 1 | y = 1) = (0 + 1)/(6 + 2) = 1/8
   Pr(x3 = 0 | y = 0) = (4 + 1)/(6 + 2) = 5/8

2. Which feature plays the largest role in deciding the class of a new instance? Why?

   x3, because it has the biggest difference in the likelihood of being true for the two different classes. The other two features carry no information about the class.

Problem 2. Bayes Rule and Bayes Classifiers (12 points)

Suppose you are given a set of data with three Boolean input variables a, b, and c, and a single Boolean output variable K. [Data table omitted in the scan.] For parts (a) and (b), assume we are using a naive Bayes classifier to predict the value of K from the values of the other variables.

(a) [1.5 pts] According to the naive Bayes classifier, what is P(K = 1 | a = 1 ∧ b = 1 ∧ c = 0)?

   Answer: 1/2

(b) [1.5 pts] According to the naive Bayes classifier, what is P(K = 0 | a = 1 ∧ b = 1)?

   Answer: 2/3

   P(K = 0 | a = 1 ∧ b = 1) = P(K = 0 ∧ a = 1 ∧ b = 1) / P(a = 1 ∧ b = 1)
   = P(K = 0) P(a = 1 | K = 0) P(b = 1 | K = 0) / [P(a = 1 ∧ b = 1 ∧ K = 0) + P(a = 1 ∧ b = 1 ∧ K = 1)]

Now, suppose we are using a joint Bayes classifier to predict the value of K from the values of the other variables.

(c) [1.5 pts] According to the joint Bayes classifier, what is P(K = 1 | a = 1 ∧ b = 1 ∧ c = 0)?

   Answer: 0

   Let num(X) be the number of records in our data matching X. Then
   P(K = 1 | a = 1 ∧ b = 1 ∧ c = 0) = num(K = 1 ∧ a = 1 ∧ b = 1 ∧ c = 0) / num(a = 1 ∧ b = 1 ∧ c = 0) = 0

(d) [1.5 pts] According to the joint Bayes classifier, what is P(K = 0 | a = 1 ∧ b = 1)?

   Answer: 1/2

   P(K = 0 | a = 1 ∧ b = 1) = num(K = 0 ∧ a = 1 ∧ b = 1) / num(a = 1 ∧ b = 1) = 1/2

In an unrelated example, imagine we have three variables X, Y, and Z.

(e) [2 pts] Imagine I tell you the following:

   P(Z|X) = 0.7    P(Z|Y) = 0.4

   Do you have enough information to compute P(Z | X ∧ Y)? If not, write "not enough info". If so, compute the value of P(Z | X ∧ Y) from the above information.

   Answer: Not enough info.
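As a sanity check on Problem 5's Laplacian-corrected estimates, the correction can be sketched in a few lines (the helper name `laplace` and the printed check are mine, not the exam's; the counts come from the table above):

```python
# Laplace-corrected estimate of Pr(feature = value | class): add 1 to the
# matching count and 2 (the number of possible feature values) to the
# class total.
def laplace(count, class_total, num_values=2):
    return (count + 1) / (class_total + num_values)

# x2 = 1 was seen 0 times among the 6 examples with y = 1:
p_x2_is_1_given_y1 = laplace(0, 6)   # (0 + 1) / (6 + 2) = 1/8
# x3 = 1 was seen 2 times among the 6 examples with y = 0,
# so x3 = 0 was seen 6 - 2 = 4 times:
p_x3_is_0_given_y0 = laplace(4, 6)   # (4 + 1) / (6 + 2) = 5/8

print(p_x2_is_1_given_y1, p_x3_is_0_given_y0)  # 0.125 0.625
```

The same helper reproduces the (3 + 1)/(6 + 2) = 0.5 estimates that appear in the later Naive Bayes problem.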
(f) [2 pts] Instead, imagine I tell you the values of P(Z|X), P(Z|Y), P(X), and P(Y). Do you now have enough information to compute P(Z | X ∧ Y)? If not, write "not enough info". If so, compute the value of P(Z | X ∧ Y) from the above information.

   Answer: Not enough info.

(g) [2 pts] Instead, imagine I tell you the following (falsifying my earlier statements):

   P(Z ∧ X) = 0.2    P(X) = 0.3    P(Y) = 1

   Do you now have enough information to compute P(Z | X ∧ Y)? If not, write "not enough info". If so, compute the value of P(Z | X ∧ Y) from the above information.

   Answer: 2/3. Since P(Y) = 1, P(Z | X ∧ Y) = P(Z | X) = P(Z ∧ X) / P(X) = 0.2/0.3 = 2/3.

6 Bayes Classifiers (10 points)

Suppose we are given a dataset where A, B, C are binary input random variables, and y is a binary output whose value we want to predict. [Data table omitted in the scan.]

(a) (5 points) How would a naive Bayes classifier predict y given the input A = 0, B = 0, C = …? Assume that in case of a tie the classifier always prefers to predict 0 for y.

   Answer: The predicted y maximizes P(A = 0 | y) P(B = 0 | y) P(C | y) P(y). Comparing
   P(A = 0 | y = 0) P(B = 0 | y = 0) P(C | y = 0) P(y = 0) against
   P(A = 0 | y = 1) P(B = 0 | y = 1) P(C | y = 1) P(y = 1),
   the product for y = 1 is larger. Hence, the predicted y is 1.

(b) (5 points) Suppose you know for a fact that A, B, C are independent random variables. In this case, is it possible for any other classifier (e.g., a decision tree or a neural net) to do better than a naive Bayes classifier? (The dataset is irrelevant for this question.)

   Answer: Yes. The independence of A, B, C does not imply that they are independent within each class (in other words, they are not necessarily independent when conditioned on y). Therefore a naive Bayes classifier may not be able to model the function well, while a decision tree might. For example, y = A XOR B is a case where A and B might be independent variables, but a naive Bayes classifier will not model the function well, since for a particular class (say, y = 0), A and B are dependent.
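The XOR point in answer 6(b) can be made concrete with a small sketch (the dataset and variable names are mine): for y = A XOR B, every per-feature conditional a naive Bayes model estimates comes out to 0.5, so the model has no basis to separate the classes.

```python
from itertools import product

# All four (A, B) combinations, labeled y = A XOR B. Marginally A and B are
# independent fair coins, but conditioned on y they are perfectly correlated.
data = [(a, b, a ^ b) for a, b in product([0, 1], repeat=2)]

conds = {}
for y in (0, 1):
    rows = [(a, b) for a, b, label in data if label == y]
    p_a1 = sum(a for a, _ in rows) / len(rows)   # estimate of P(A = 1 | y)
    p_b1 = sum(b for _, b in rows) / len(rows)   # estimate of P(B = 1 | y)
    conds[y] = (p_a1, p_b1)

# Every per-feature conditional is 0.5, so naive Bayes scores both classes
# identically for every input.
print(conds)  # {0: (0.5, 0.5), 1: (0.5, 0.5)}
```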
5 Bayes Rule (19 points)

(a) (4 points) I give you the following fact:

   P(A|B) = 2/3

   Do you have enough information to compute P(B|A)? If not, write "not enough info". If so, compute the value of P(B|A).

   Not enough info.

(b) (5 points) Instead, I give you the following facts:

   P(A|B) = 2/3
   P(A|~B) = 1/3

   Do you now have enough information to compute P(B|A)? If not, write "not enough info". If so, compute the value of P(B|A).

   Not enough info.

(c) (5 points) Instead, I give you the following facts:

   P(A|B) = 2/3
   P(A|~B) = 1/3
   P(B) = 1/3

   Do you now have enough information to compute P(B|A)? If not, write "not enough info". If so, compute the value of P(B|A).

   P(B|A) = P(A|B) P(B) / [P(A|B) P(B) + P(A|~B) P(~B)]
          = (2/3)(1/3) / [(2/3)(1/3) + (1/3)(2/3)]
          = 1/2

(d) (5 points) Instead, I give you the following facts:

   P(A|B) = 2/3
   P(A|~B) = 1/3
   P(B) = 1/3
   P(A) = 4/9

   Do you now have enough information to compute P(B|A)? If not, write "not enough info". If so, compute the value of P(B|A).

   P(B|A) = P(A|B) P(B) / P(A) = (2/3)(1/3) / (4/9) = 1/2

1 Conditional Independence, MLE/MAP, Probability (12 pts)

1. (4 pts) Show that Pr(X, Y | Z) = Pr(X|Z) Pr(Y|Z) if Pr(X | Y, Z) = Pr(X|Z).

   Pr(X, Y | Z) = Pr(X | Y, Z) Pr(Y | Z)    (chain rule)
                = Pr(X|Z) Pr(Y|Z)

   Common mistake: Pr(X | Y, Z) = Pr(X|Z)  =>  X independent of Y given Z  =>  Pr(X, Y | Z) = Pr(X|Z) Pr(Y|Z). The first implication does not hold if the equation does not hold for all possible values of the variables.

2. (4 pts) If a data point y follows the Poisson distribution with rate parameter θ, then the probability of a single observation y is

   p(y | θ) = θ^y e^(-θ) / y!,    for y = 0, 1, 2, ...

   You are given data points y1, ..., ym independently drawn from a Poisson distribution with parameter θ. Write down the log-likelihood of the data as a function of θ.

   Σ_{i=1}^{m} (y_i log θ - θ - log y_i!) = (Σ_{i=1}^{m} y_i) log θ - mθ - log(Π_{i=1}^{m} y_i!)
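A quick numeric check of parts (c) and (d) of the Bayes Rule problem above (variable names are mine):

```python
# Facts given in part (c):
p_a_given_b = 2 / 3
p_a_given_not_b = 1 / 3
p_b = 1 / 3

# Total probability gives P(A), then Bayes' rule gives P(B|A):
p_a = p_a_given_b * p_b + p_a_given_not_b * (1 - p_b)   # = 4/9, matching part (d)
p_b_given_a = p_a_given_b * p_b / p_a                   # = 1/2

print(p_a, p_b_given_a)
```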
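The Poisson log-likelihood just derived can also be checked numerically: setting its derivative to zero gives θ = (1/m) Σ y_i, the sample mean. A small sketch (the toy sample is mine, not from the exam):

```python
import math

# Log-likelihood of i.i.d. Poisson data:
# sum_i (y_i * log(theta) - theta - log(y_i!)), using lgamma(y + 1) = log(y!).
def poisson_log_likelihood(theta, ys):
    return sum(y * math.log(theta) - theta - math.lgamma(y + 1) for y in ys)

ys = [2, 0, 3, 1, 4]             # toy sample (not from the exam)
theta_hat = sum(ys) / len(ys)    # the maximizer is the sample mean, here 2.0

print(theta_hat)                 # 2.0
print(poisson_log_likelihood(theta_hat, ys))
```

Evaluating the log-likelihood at nearby values of θ confirms that the sample mean is the maximizer.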
3. (4 pts) Suppose that in answering a question in a multiple choice test, an examinee either knows the answer, with probability p, or he guesses, with probability 1 - p. Assume that the probability of answering a question correctly is 1 for an examinee who knows the answer and 1/m for the examinee who guesses, where m is the number of multiple choice alternatives. What is the probability that an examinee knew the answer to a question, given that he has correctly answered it?

   P(knew answer | correct) = P(knew answer, correct) / P(correct) = p / (p + (1 - p)/m)

3 Gaussian Bayes Classifiers (19 points)

(a) (2 points) Suppose you have the following training set with one real-valued input X and a categorical output Y that has two values:

   X | Y
   1 | A
   2 | A
   3 | B
   4 | B
   5 | B
   6 | B
   7 | B

   You must learn the Maximum Likelihood Gaussian Bayes Classifier from this data. Write your answers in this table:

   μ_A = 3/2    σ²_A = 1/4    P(Y = A) = 2/7
   μ_B = 5      σ²_B = 2      P(Y = B) = 5/7

I considered asking you to compute p(X = 2 | Y = A) using the parameters you had learned, but I decided that was too fiddly. So in the remainder of the question you can give your answers in terms of α and β, where

   α = p(X = 2 | Y = A)
   β = p(X = 2 | Y = B)

(b) (2 points) What is p(X = 2 ∧ Y = A) (answer in terms of α)?

   p(X = 2 | Y = A) P(Y = A) = (2/7)α

(c) (2 points) What is p(X = 2 ∧ Y = B) (answer in terms of β)?

   p(X = 2 | Y = B) P(Y = B) = (5/7)β

(d) (2 points) What is p(X = 2) (answer in terms of α and β)?

   (1/7)(2α + 5β)

(e) (2 points) What is P(Y = A | X = 2) (answer in terms of α and β)?

   P(Y = A | X = 2) = p(X = 2 ∧ Y = A) / p(X = 2) = (2/7)α / ((1/7)(2α + 5β)) = 2α / (2α + 5β)

(h) (2 points) Finally, consider the following figure. [Figure omitted in the scan.] If you trained a new Bayes Classifier on this data, what class would be predicted for the query location?

   All the classes have the same covariance, but the A class prior is much larger, so the A class is more likely at the query point; we predict A.

6 Naive Bayes (15 points)

Consider a Naive Bayes problem with three features, x1 ... x3.
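The maximum-likelihood parameters in part (a) can be reproduced directly from the seven training points; a sketch (variable names are mine; note the ML variance divides by n, not n - 1):

```python
import math

data = [(1, 'A'), (2, 'A'), (3, 'B'), (4, 'B'), (5, 'B'), (6, 'B'), (7, 'B')]

params = {}
for cls in ('A', 'B'):
    xs = [x for x, y in data if y == cls]
    mu = sum(xs) / len(xs)
    var = sum((x - mu) ** 2 for x in xs) / len(xs)   # ML estimate: divide by n
    prior = len(xs) / len(data)
    params[cls] = (mu, var, prior)

print(params)  # means/variances: A -> (1.5, 0.25), B -> (5.0, 2.0)

# Posterior P(Y = A | X = 2) = 2*alpha / (2*alpha + 5*beta), where alpha and
# beta are the class-conditional Gaussian densities evaluated at x = 2:
def gauss(x, mu, var):
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

alpha = gauss(2, *params['A'][:2])
beta = gauss(2, *params['B'][:2])
posterior_A = 2 * alpha / (2 * alpha + 5 * beta)
print(posterior_A)
```

Even though class B has the larger prior, x = 2 sits right next to the A-class mean, so the posterior favors A.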
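The multiple-choice posterior p / (p + (1 - p)/m) from question 3 of the MLE/MAP section above behaves as expected at a couple of sample values (the chosen p and m are mine):

```python
def p_knew_given_correct(p, m):
    # P(knew | correct) = P(correct | knew) P(knew) / P(correct)
    #                   = p / (p + (1 - p) / m)
    return p / (p + (1 - p) / m)

# With 4 alternatives and a 50/50 examinee, a correct answer is strong
# evidence of knowledge:
print(p_knew_given_correct(0.5, 4))    # 0.8
# More alternatives make a lucky guess rarer, pushing the posterior
# toward 1:
print(p_knew_given_correct(0.5, 100))
```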
Imagine that we have seen a total of 12 training examples, 6 positive (with y = 1) and 6 negative (with y = 0). [Table of the 12 individual data points omitted in the scan.] Here is a table with the summary counts:

            y = 0 | y = 1
   x1 = 1     3   |   3
   x2 = 1     3   |   3
   x3 = 1     3   |   3

1. What are the values of the parameters R_i(1, 0) and R_i(1, 1) for each of the features i (using the Laplace correction)?

   All the parameters are (3 + 1)/(6 + 2) = 0.5.

2. If you see the data point 1, 1, 1 and use the parameters you found above, what output would Naive Bayes predict? Explain how you got the result.

   The prediction is arbitrary, since S(0) = S(1) = 1/8.

3. Naive Bayes doesn't work very well on this data; explain why.

   The basic assumption of NB is that the features are independent, given the class. In this data set, features 1 and 3 are definitely not independent: the values of these features are opposite for class 0 and equal for class 1. All the information is in these correlations; each feature independently says nothing about the class, so NB is not really applicable. Note that a decision tree would not have any problem with this data set.
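The tie in question 2 follows directly from the parameters: with every Laplace-corrected conditional equal to 0.5 and equal class priors, both class scores for the point 1, 1, 1 come out to 1/8. A minimal sketch:

```python
# All six conditionals R_i(1, 0) and R_i(1, 1) equal (3 + 1) / (6 + 2) = 0.5.
r = (3 + 1) / (6 + 2)

# Naive Bayes score for each class on the data point x = (1, 1, 1); the
# equal class priors of 6/12 cancel, so they are omitted here.
score_0 = r ** 3   # P(x1=1|y=0) * P(x2=1|y=0) * P(x3=1|y=0)
score_1 = r ** 3   # P(x1=1|y=1) * P(x2=1|y=1) * P(x3=1|y=1)

print(score_0, score_1)  # 0.125 0.125 -- a tie, so the prediction is arbitrary
```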