Bias and Variance

The document discusses the concepts of bias and variance in machine learning, highlighting the trade-off between model complexity and expected error. It explains underfitting and overfitting, noting that simple models tend to have high bias and low variance, while complex models exhibit low bias and high variance. The bias-variance tradeoff is emphasized as a crucial consideration in model selection and performance optimization.

CS60050: Machine Learning

Autumn 2024

Sudeshna Sarkar

Bias and Variance

21 August 2024
Bias & Variance
Overfitting vs Underfitting

Underfitting (HIGH BIAS):
• Not able to capture the concept
• Features don’t capture the concept
• Model is not powerful enough

Overfitting (HIGH VARIANCE):
• Fitting the training data too well
[Figure: testing and training error vs. model complexity — training error keeps decreasing as complexity grows, while testing error eventually rises.]
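This curve can be reproduced empirically. A minimal sketch (not from the slides; it assumes a sine target with Gaussian noise, fit by NumPy least-squares polynomials of increasing degree):

```python
# Sketch: underfitting vs. overfitting with polynomials of increasing degree.
# Assumed toy setup: true function sin(2*pi*x), Gaussian observation noise.
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    return np.sin(2 * np.pi * x)

x_train = rng.uniform(0, 1, 30)
y_train = f(x_train) + rng.normal(0, 0.2, 30)
x_test = rng.uniform(0, 1, 200)
y_test = f(x_test) + rng.normal(0, 0.2, 200)

def mse(degree):
    """Fit a polynomial of the given degree; return (train MSE, test MSE)."""
    coeffs = np.polyfit(x_train, y_train, degree)
    train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return train_err, test_err

for d in (1, 3, 15):
    tr, te = mse(d)
    print(f"degree {d:2d}: train MSE {tr:.3f}, test MSE {te:.3f}")
```

Degree 1 underfits (both errors high), degree 3 fits the sine well, and degree 15 drives training error down by fitting noise.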
Function Approximation: The Big Picture

Bias of a learner (~ mean error)
• How likely is the learner to identify the target hypothesis?
• Bias is low when the model is expressive (low empirical error).
• Bias is high when the model is too simple: the larger the hypothesis space is, the easier it is to be close to the true hypothesis.

[Figure: bias decreases as model complexity increases.]

Variance of a learner

[Figure: variance increases as model complexity increases.]
Bias-Variance Tradeoff

(C) Dhruv Batra Slide Credit: Carlos Guestrin


Impact of bias and variance

[Figure: expected error vs. model complexity — bias falls and variance rises as complexity grows.]

Expected error ≈ bias + variance (why?)
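The approximation can be checked numerically. A sketch (assumed toy setup, not from the slides: estimating a constant μ by a sample mean under squared loss) showing that MSE ≈ bias² + variance + noise:

```python
# Sketch: Monte Carlo check of MSE = bias^2 + variance + irreducible noise,
# for the sample mean as an estimator of a constant target mu.
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, n, trials = 2.0, 1.0, 10, 200_000

# Each trial: n training samples -> estimator (sample mean),
# plus one fresh noisy test observation y.
est = rng.normal(mu, sigma, (trials, n)).mean(axis=1)
y = rng.normal(mu, sigma, trials)

mse = np.mean((y - est) ** 2)          # left-hand side
bias2 = (est.mean() - mu) ** 2         # squared bias of the estimator
var = est.var()                        # variance of the estimator
noise = sigma ** 2                     # irreducible noise

print(mse, bias2 + var + noise)        # the two should nearly match
```

Here the decomposition is exact in expectation; the two printed numbers differ only by Monte Carlo error.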


Bias-Variance Decomposition of Squared Error
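The equations on these slides did not survive extraction. The standard decomposition, assuming y = f(x) + ε with E[ε] = 0, Var(ε) = σ², and a model f̂_D trained on a random dataset D, is:

```latex
\mathbb{E}_{D,\varepsilon}\!\left[\big(y - \hat{f}_D(x)\big)^2\right]
= \underbrace{\big(\mathbb{E}_D[\hat{f}_D(x)] - f(x)\big)^2}_{\text{bias}^2}
+ \underbrace{\mathbb{E}_D\!\left[\big(\hat{f}_D(x) - \mathbb{E}_D[\hat{f}_D(x)]\big)^2\right]}_{\text{variance}}
+ \underbrace{\sigma^2}_{\text{irreducible noise}}
```

The cross terms vanish because ε is independent of D and because the deviation of f̂_D(x) from its mean E_D[f̂_D(x)] has expectation zero.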
[Figure: expected error vs. model complexity — bias falls and variance rises as complexity grows.]

Simple models: high bias and low variance.
Complex models: high variance and low bias.
Variance
• Error caused because the learned model reacts to small changes (noise) in the training data.
• High variance can cause an algorithm to model the random noise in the training data, rather than the intended outputs.
• Higher-variance models:
  • Decision tree with a large number of nodes
  • High-degree polynomials
  • Many features
Underfitting and Overfitting

[Figure: expected error vs. model complexity — the high-bias region on the left corresponds to underfitting, the high-variance region on the right to overfitting.]

Simple models: high bias and low variance.
Complex models: high variance and low bias.
This can be made more precise for some loss functions. We will later discuss a more general theory that trades off the expressivity of models against empirical error.
Bias and Variance Tradeoff
There is usually a bias-variance tradeoff caused by model complexity.
Complex models often have lower bias, but higher variance.
Simple models often have higher bias, but lower variance.

[Figure: testing and training error vs. model complexity — the high-bias (underfitting) regime on the left, the high-variance (overfitting) regime on the right.]
Trade-Offs

As m increases, E decreases.

[Figure: classification error vs. model complexity — true error and training error curves.]
