
Probabilistic Models Based Problems (Autosaved)

The document discusses probabilistic models and their applications using Bayes' Theorem, including spam classification, medical testing, sentiment analysis, fraud detection, and self-driving car systems. It explains key concepts such as posterior, likelihood, and prior probabilities, and provides examples to calculate probabilities based on given data. The document emphasizes the importance of Bayes' Theorem in updating probabilities with new evidence.

Uploaded by

Abinaya Devi C

Probabilistic Models based Problems
Recap
 P(X|Y) is called the posterior, which is what we need to calculate. It is the updated probability of the hypothesis after considering the evidence.
 P(Y|X) is called the likelihood. It is the probability of the evidence when the hypothesis is true.
 P(X) is called the prior probability: the probability of the hypothesis before considering the evidence.
Bayes' Equation
P(X|Y) = P(Y|X) × P(X) / P(Y)
Probability Calculation Using Bayes' Theorem
• Spam Classification:
• A spam filter uses Bayes' Theorem to classify emails.
Suppose:
• P(Spam) = 0.3 (30% of emails are spam)
• P(Not Spam) = 0.7 (70% of emails are not spam)
• P(“Buy Now” | Spam) = 0.8 (80% of spam emails contain “Buy Now”)
• P(“Buy Now” | Not Spam) = 0.1 (10% of non-spam emails contain “Buy Now”)
Find:
What is the probability that an email is spam given that it contains the words "Buy Now"?
Hint: Use Bayes' Theorem, with X = "Buy Now" and Y = Spam:

P(Y|X) = P(X|Y) × P(Y) / P(X)

P(Spam | "Buy Now") = P("Buy Now" | Spam) × P(Spam) / P("Buy Now")

where:
P("Buy Now") = P("Buy Now" | Spam) × P(Spam) + P("Buy Now" | Not Spam) × P(Not Spam)
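The spam calculation above can be worked through with a short Python sketch (the variable names are mine; the probabilities are the ones given in the problem):

```python
# Given probabilities from the problem statement
p_spam = 0.3                 # P(Spam)
p_not_spam = 0.7             # P(Not Spam)
p_buy_given_spam = 0.8       # P("Buy Now" | Spam)
p_buy_given_not_spam = 0.1   # P("Buy Now" | Not Spam)

# Total probability of "Buy Now" appearing in any email (law of total probability)
p_buy = p_buy_given_spam * p_spam + p_buy_given_not_spam * p_not_spam

# Bayes' Theorem: P(Spam | "Buy Now")
p_spam_given_buy = p_buy_given_spam * p_spam / p_buy
print(round(p_spam_given_buy, 4))  # 0.7742
```

So seeing "Buy Now" raises the probability of spam from the 30% prior to roughly 77%.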
Medical Test for a Disease:
A diagnostic test for a rare disease has:
P(Disease) = 0.01 (1% of the population has the disease)
P(Test Positive | Disease) = 0.95 (95% sensitivity)
P(Test Positive | No Disease) = 0.05 (5% false positive rate)
Find: If a person tests positive, what is the probability that they actually
have the disease?
Hint: Use Bayes' Theorem and calculate:

P(Disease | Test Positive) = P(Test Positive | Disease) × P(Disease) / P(Test Positive)

where:
P(Test Positive) = P(Test Positive | Disease) × P(Disease) + P(Test Positive | No Disease) × P(No Disease)
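The medical-test case follows the same pattern; a minimal Python sketch using the stated numbers:

```python
# Given probabilities from the problem statement
p_disease = 0.01              # P(Disease), the prior
p_pos_given_disease = 0.95    # P(Test Positive | Disease), sensitivity
p_pos_given_healthy = 0.05    # P(Test Positive | No Disease), false positive rate

# Total probability of a positive test (law of total probability)
p_pos = p_pos_given_disease * p_disease + p_pos_given_healthy * (1 - p_disease)

# Bayes' Theorem: P(Disease | Test Positive)
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(round(p_disease_given_pos, 3))  # 0.161
```

Despite the 95% sensitivity, a positive result implies only about a 16% chance of disease, because the disease is rare and most positives come from the large healthy population.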
• Sentiment Analysis with Naïve Bayes:
A machine learning model classifies reviews as Positive or Negative
using Naïve Bayes. Given:
• P(Positive) = 0.6, P(Negative) = 0.4
• P("good" | Positive) = 0.7, P("good" | Negative) = 0.2
• P("not" | Positive) = 0.2, P("not" | Negative) = 0.6
• Find: Given a review contains the words "not good," what is the
probability that it is positive?
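For the sentiment problem, Naïve Bayes multiplies the per-word likelihoods (treating the words as conditionally independent given the class) and then normalizes. A short Python sketch with the given numbers:

```python
# Given probabilities from the problem statement
p_pos, p_neg = 0.6, 0.4
p_good = {"Positive": 0.7, "Negative": 0.2}  # P("good" | class)
p_not = {"Positive": 0.2, "Negative": 0.6}   # P("not" | class)

# Naive Bayes: joint likelihood is the product of per-word likelihoods
score_pos = p_pos * p_not["Positive"] * p_good["Positive"]  # 0.6 * 0.2 * 0.7
score_neg = p_neg * p_not["Negative"] * p_good["Negative"]  # 0.4 * 0.6 * 0.2

# Normalize the scores to get the posterior P(Positive | "not good")
p_pos_given_review = score_pos / (score_pos + score_neg)
print(round(p_pos_given_review, 4))  # 0.6364
```

Note that word-level Naïve Bayes ignores word order, so "not good" is scored the same as "good not"; this is a known limitation of the model, not a bug in the arithmetic.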
• Fraud Detection:
• A machine learning model predicts whether a transaction is
fraudulent. Given:
• P(Fraud) = 0.02 (2% of transactions are fraudulent)
• P(Alert | Fraud) = 0.9 (90% of fraudulent transactions trigger an alert)
• P(Alert | Not Fraud) = 0.1 (10% false positive rate)
• Find: If an alert is triggered, what is the probability that the
transaction is actually fraudulent?
• Hint: Use Bayes' Theorem, exactly as in the spam and medical-test examples.
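The fraud-alert calculation in Python (same structure as the earlier problems; variable names are mine):

```python
# Given probabilities from the problem statement
p_fraud = 0.02                # P(Fraud)
p_alert_given_fraud = 0.9     # P(Alert | Fraud)
p_alert_given_legit = 0.1     # P(Alert | Not Fraud)

# Total probability of an alert (law of total probability)
p_alert = p_alert_given_fraud * p_fraud + p_alert_given_legit * (1 - p_fraud)

# Bayes' Theorem: P(Fraud | Alert)
p_fraud_given_alert = p_alert_given_fraud * p_fraud / p_alert
print(round(p_fraud_given_alert, 4))  # 0.1552
```

As with the medical test, the low prior dominates: only about 15.5% of alerts correspond to actual fraud.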
• In a self-driving car system:
• P(Accident) = 0.01
• P(Rain | Accident) = 0.7, P(Rain | No Accident) = 0.3
• P(Brake Failure | Accident) = 0.6, P(Brake Failure | No Accident) = 0.1
• Find: If both rain and brake failure are observed, what is the
probability of an accident occurring?
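The self-driving problem gives the two likelihoods separately, so a sketch must assume rain and brake failure are conditionally independent given the accident status (the Naïve Bayes assumption; the problem does not state the joint likelihood explicitly):

```python
# Given probabilities from the problem statement
p_acc = 0.01                       # P(Accident)
p_rain = {"acc": 0.7, "no": 0.3}   # P(Rain | Accident), P(Rain | No Accident)
p_brake = {"acc": 0.6, "no": 0.1}  # P(Brake Failure | ...)

# Assumption: Rain and Brake Failure are conditionally independent
# given the accident status, so the joint likelihood factorizes.
score_acc = p_acc * p_rain["acc"] * p_brake["acc"]          # 0.01 * 0.7 * 0.6
score_no = (1 - p_acc) * p_rain["no"] * p_brake["no"]       # 0.99 * 0.3 * 0.1

# Normalize: P(Accident | Rain, Brake Failure)
p_acc_given_evidence = score_acc / (score_acc + score_no)
print(round(p_acc_given_evidence, 4))  # 0.1239
```

Even with both risk factors observed, the posterior is only about 12%, again because the prior probability of an accident is so small.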
