Statistics
Statistics is the branch of mathematics that deals with collecting, analyzing, interpreting,
presenting, and organizing data. It helps in understanding data patterns, making predictions, and
drawing conclusions.
1. Data Collection:
o Qualitative Data: Non-numeric data (e.g., colors, categories, names).
o Quantitative Data: Numeric data (e.g., age, weight, height).
2. Measures of Central Tendency:
o Mean (Average): The sum of all values divided by the number of values.
Mean = ΣX / n
o Median: The middle value when the data is ordered from least to greatest.
o Mode: The value that appears most frequently in the data.
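The three measures above can be computed directly with Python's standard library; the dataset here is a hypothetical list of ages, chosen only for illustration:

```python
import statistics

# hypothetical sample of ages
data = [23, 29, 20, 32, 23, 21, 33, 25]

mean = statistics.mean(data)      # sum of all values divided by the count
median = statistics.median(data)  # middle value of the sorted data
mode = statistics.mode(data)      # most frequent value

print(mean, median, mode)
```

Note that when the dataset has an even number of values, `statistics.median` averages the two middle values.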
3. Measures of Dispersion (Variation):
o Range: The difference between the maximum and minimum values in a dataset.
o Variance: The average of the squared differences from the mean.
o Standard Deviation: The square root of the variance; shows how much the data
varies from the mean.
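A short sketch of the dispersion measures, again on a hypothetical dataset. Since variance is defined here as the average of the squared differences from the mean, the population versions (`pvariance`, `pstdev`) are the matching functions:

```python
import statistics

data = [4, 8, 6, 5, 3, 7]  # hypothetical dataset

data_range = max(data) - min(data)     # range: maximum minus minimum
variance = statistics.pvariance(data)  # average squared deviation from the mean
std_dev = statistics.pstdev(data)      # square root of the variance

print(data_range, variance, std_dev)
```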
4. Probability Distribution:
o Describes how the values of a random variable are distributed. Common
distributions include the Normal Distribution, Binomial Distribution, and
Poisson Distribution.
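Two of the named distributions have simple closed-form probability mass functions that can be written out with stdlib math; the parameter names below (`n`, `k`, `p`, `lam`) follow the usual conventions:

```python
from math import comb, exp, factorial

def binomial_pmf(n, k, p):
    """P(X = k) for k successes in n independent trials, each with success probability p."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    """P(X = k) for a count with average rate lam."""
    return exp(-lam) * lam**k / factorial(k)

# e.g. the probability of exactly 3 heads in 10 fair coin flips
print(binomial_pmf(10, 3, 0.5))  # 0.1171875
```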
5. Graphs and Charts:
o Bar Graphs, Pie Charts, Histograms: Visual representations of data to see
patterns and trends easily.
o Box Plots: Used to show the distribution of data, identifying the median,
quartiles, and outliers.
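The quantities a box plot displays (quartiles, interquartile range, outliers) can be computed without any plotting library; the dataset and the 1.5 × IQR outlier rule below are the conventional illustration, not the only choice:

```python
import statistics

data = [7, 15, 36, 39, 40, 41, 42, 43, 47, 49]  # hypothetical dataset

q1, q2, q3 = statistics.quantiles(data, n=4)  # quartiles: cut the data into four parts
iqr = q3 - q1                                 # interquartile range: the "box" in a box plot

# common rule of thumb: points beyond 1.5 * IQR from the box are flagged as outliers
outliers = [x for x in data if x < q1 - 1.5 * iqr or x > q3 + 1.5 * iqr]
print(q1, q2, q3, outliers)
```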
Probability
Probability is the measure of the likelihood that a given event will occur. It ranges from 0
(impossible) to 1 (certain), and it's fundamental in understanding random phenomena.
1. Basic Probability:
o The probability of an event A is calculated as:
P(A) = (Number of favorable outcomes) / (Total number of outcomes)
o Example: The probability of rolling a 3 on a fair 6-sided die is 1/6.
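The basic formula maps directly onto exact fractions, which avoids floating-point rounding; `probability` is a helper name introduced here for illustration:

```python
from fractions import Fraction

def probability(favorable, total):
    """P(A) = favorable outcomes / total outcomes, as an exact fraction."""
    return Fraction(favorable, total)

# rolling a 3 on a fair 6-sided die: 1 favorable outcome out of 6
print(probability(1, 6))  # 1/6
```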
2. Types of Events:
o Independent Events: Two events that do not affect each other (e.g., flipping a
coin and rolling a die).
o Dependent Events: The outcome of one event affects the outcome of the other
(e.g., drawing cards from a deck without replacement).
o Mutually Exclusive Events: Two events that cannot occur at the same time (e.g.,
tossing a coin and getting both heads and tails at once).
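The difference between independent and dependent events shows up in how their joint probabilities multiply; a minimal sketch using the coin-and-die and card-drawing examples above:

```python
from fractions import Fraction

# Independent events: P(heads AND die shows 3) = P(heads) * P(3)
p_heads = Fraction(1, 2)
p_three = Fraction(1, 6)
p_both = p_heads * p_three  # 1/12

# Dependent events: drawing two aces without replacement from a 52-card deck;
# after the first ace, one ace and one card fewer remain
p_first_ace = Fraction(4, 52)
p_second_given_first = Fraction(3, 51)
p_two_aces = p_first_ace * p_second_given_first  # 1/221

print(p_both, p_two_aces)
```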
3. Complementary Events:
o The probability that an event does not occur is the complement of the probability that it does:
P(not A) = 1 − P(A)
4. Conditional Probability:
o The probability of an event occurring given that another event has already
occurred.
P(A | B) = P(A ∩ B) / P(B)
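A small worked example of the conditional probability formula; the die scenario (A = "roll a 2", B = "roll an even number") is a hypothetical illustration:

```python
from fractions import Fraction

def conditional(p_a_and_b, p_b):
    """P(A | B) = P(A and B) / P(B)."""
    return p_a_and_b / p_b

p_a_and_b = Fraction(1, 6)  # only the outcome 2 is both "a 2" and "even"
p_b = Fraction(3, 6)        # even outcomes: 2, 4, 6

print(conditional(p_a_and_b, p_b))  # 1/3
```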
5. The Law of Total Probability:
o A rule that calculates the probability of an event by considering all possible ways
the event could happen.
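The law of total probability sums P(A | Bᵢ) · P(Bᵢ) over a partition of the sample space. The two-factory scenario below (and its defect rates) is a hypothetical example, not from the source:

```python
from fractions import Fraction

def total_probability(conditionals, priors):
    """P(A) = sum of P(A | B_i) * P(B_i) over a partition B_1, ..., B_n."""
    return sum(p_a_given_b * p_b for p_a_given_b, p_b in zip(conditionals, priors))

# hypothetical: factory 1 makes 60% of units with a 2% defect rate,
# factory 2 makes 40% of units with a 5% defect rate
p_defect = total_probability(
    [Fraction(2, 100), Fraction(5, 100)],   # P(defect | factory i)
    [Fraction(60, 100), Fraction(40, 100)], # P(factory i)
)
print(p_defect)  # 0.02*0.6 + 0.05*0.4 = 0.032
```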
6. Bayes' Theorem:
o A way to update the probability of a hypothesis based on new evidence:
P(A | B) = P(B | A) · P(A) / P(B)
7. Expected Value (Mean of Random Variable):
o A measure of the center of a probability distribution, calculated by multiplying
each possible outcome by its probability and summing the results.
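The definition above, applied to one roll of a fair 6-sided die; `expected_value` is a helper name introduced for illustration:

```python
from fractions import Fraction

def expected_value(outcomes, probabilities):
    """E[X] = sum of (outcome * probability) over all outcomes."""
    return sum(x * p for x, p in zip(outcomes, probabilities))

faces = [1, 2, 3, 4, 5, 6]
probs = [Fraction(1, 6)] * 6  # each face is equally likely

print(expected_value(faces, probs))  # 7/2
```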
Applications
Statistics:
o Used in surveys, market research, and elections to analyze trends.
o Used in medicine for data analysis (e.g., clinical trials).
Probability:
o Used in games of chance (e.g., dice, cards).
o Important in fields like finance, insurance, and risk management (predicting stock
market movements, calculating insurance premiums).