
Machine Learning
S. Sridhar and M. Vijayalakshmi

© Oxford University Press 2021. All rights reserved


Chapter 3
Basics of Learning Theory



Learning
Learning is a process by which one acquires knowledge and constructs new ideas or concepts based on experience.

The standard definition of learning, proposed by Tom Mitchell, is that a program is said to learn from experience E with respect to a class of tasks T and performance measure P, if its performance at tasks in T, as measured by P, improves with experience E.

There are two kinds of problems – well-posed and ill-posed. Computers can solve only well-posed problems, as these have well-defined specifications and have the following components inherent to them:

1. A class of learning tasks (T)
2. A measure of performance (P)
3. A source of experience (E)
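Mitchell's T/P/E framing can be made concrete with a toy sketch (all names and data below are illustrative, not from the book): the task T is classifying a number as "big" or "small", the performance measure P is accuracy on a fixed test set, and the experience E is a growing list of labelled examples.

```python
# Illustrative T/P/E sketch:
# T: classify a number x as "big" or "small"
# P: accuracy on a held-out test set
# E: a growing set of labelled examples

def learn_threshold(examples):
    """Fit the midpoint between the largest 'small' and smallest 'big' value."""
    big = [x for x, y in examples if y == "big"]
    small = [x for x, y in examples if y == "small"]
    if not big or not small:
        return 0.0
    return (max(small) + min(big)) / 2

def accuracy(threshold, test):          # the performance measure P
    return sum((x >= threshold) == (y == "big") for x, y in test) / len(test)

test = [(2, "small"), (5, "small"), (12, "big"), (20, "big")]
experience = [(1, "small"), (30, "big"), (8, "small"), (11, "big")]

for n in (2, 4):                        # with more experience E ...
    t = learn_threshold(experience[:n])
    print(n, t, accuracy(t, test))      # ... performance P improves
```

On this data, accuracy rises from 0.75 with two examples to 1.0 with four, which is exactly the "P improves with E" clause of the definition.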



Learning Environment



Learning Types



Introduction to Computational Learning Theory

These questions are the basis of a field called ‘Computational Learning Theory’, or COLT for short.



Design of a Learning System



Introduction to Concept Learning



Representation of a Hypothesis



Hypothesis Space
The hypothesis space is the set of all possible hypotheses that approximate the target function f.

The subset of the hypothesis space that is consistent with all observed training instances is called the version space.
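These two definitions can be computed directly for a toy hypothesis space (an assumed setup, not the book's example): take every threshold classifier h_t(x) = (x >= t) over a small grid as the hypothesis space, and keep only those consistent with every training example.

```python
# Brute-force version space: hypothesis space = threshold classifiers
# h_t(x) = (x >= t) over a small integer grid (illustrative toy setup).
train = [(3, False), (7, False), (12, True), (15, True)]

hypothesis_space = range(0, 21)        # candidate thresholds t = 0..20
version_space = [t for t in hypothesis_space
                 if all((x >= t) == y for x, y in train)]
print(version_space)                   # → [8, 9, 10, 11, 12]
```

Every threshold between the largest negative (7) and the smallest positive (12) example is consistent, so the version space is the whole band in between.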



Heuristic Space Search

Heuristic search is a search strategy that finds an optimized hypothesis/solution to a problem by iteratively improving the hypothesis/solution based on a given heuristic function or cost measure.
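A minimal hill-climbing sketch of this idea (an assumed toy problem, not from the book): the hypothesis is a single weight w for the model y ≈ w·x, and the cost measure is squared error; the search repeatedly moves to whichever neighbour lowers the cost, shrinking the step when no neighbour helps.

```python
# Hill climbing: iteratively improve the hypothesis (a weight w)
# using a cost measure (squared error on the training data).
def cost(w):
    data = [(1, 2), (2, 4), (3, 6)]        # toy data: y = 2x
    return sum((w * x - y) ** 2 for x, y in data)

w, step = 0.0, 0.5
while True:
    best = min([w - step, w + step], key=cost)   # try both neighbours
    if cost(best) < cost(w):
        w = best                                 # move downhill
    elif step > 1e-6:
        step /= 2                                # refine the search
    else:
        break                                    # no improvement: stop
print(round(w, 3))                               # → 2.0
```

The search converges to w = 2.0, the hypothesis with zero cost, without ever enumerating the whole hypothesis space.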



Generalization and Specialization
Searching the Hypothesis Space

There are two ways of learning a hypothesis consistent with all training instances from the large hypothesis space:

1. Specialization – general to specific learning
2. Generalization – specific to general learning

Generalization – specific to general learning: this methodology searches the hypothesis space for an approximate hypothesis by generalizing the most specific hypothesis.

Specialization – general to specific learning: this methodology searches the hypothesis space for an approximate hypothesis by specializing the most general hypothesis.
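The two directions can be sketched as single-step operators on conjunctive hypotheses, where each attribute constraint is a specific value, '?' (any value), or '0' (no value). The helper names and toy data are illustrative.

```python
# Single-step generalization and specialization operators (illustrative).
def generalize(h, example):
    """Minimally generalize h so that it covers the positive example."""
    out = []
    for hv, ev in zip(h, example):
        if hv == '0':            # no value allowed yet: adopt the example's
            out.append(ev)
        elif hv in (ev, '?'):    # already covers this value
            out.append(hv)
        else:                    # conflicting values: relax to 'any'
            out.append('?')
    return tuple(out)

def specialize(h, attribute, value):
    """Minimally specialize h by pinning one attribute to a value."""
    out = list(h)
    out[attribute] = value
    return tuple(out)

h = ('0', '0')                               # most specific hypothesis
h = generalize(h, ('sunny', 'warm'))         # → ('sunny', 'warm')
h = generalize(h, ('rainy', 'warm'))
print(h)                                     # → ('?', 'warm')
```

Generalization moves upward from the most specific hypothesis; specialization moves downward from the most general one, ('?', '?').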



Hypothesis Space Search by Find-S Algorithm
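The Find-S algorithm named above admits a compact sketch for conjunctive hypotheses ('0' = no value, '?' = any value); the toy data is illustrative, not the book's Example.

```python
# Find-S: start from the most specific hypothesis and generalize it
# just enough to cover each positive example; negatives are ignored.
def find_s(examples):
    """Return the maximally specific hypothesis covering all positives."""
    n = len(examples[0][0])
    h = ['0'] * n                       # most specific hypothesis
    for x, label in examples:
        if not label:                   # Find-S ignores negative examples
            continue
        for i, v in enumerate(x):
            if h[i] == '0':
                h[i] = v                # first positive: copy its values
            elif h[i] != v:
                h[i] = '?'              # mismatch: generalize to 'any'
    return tuple(h)

data = [(('sunny', 'warm', 'high'), True),
        (('sunny', 'warm', 'normal'), True),
        (('rainy', 'cold', 'high'), False)]
print(find_s(data))                     # → ('sunny', 'warm', '?')
```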



Limitations of Find-S Algorithm



Version Spaces



List-Then-Eliminate Algorithm
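The List-Then-Eliminate algorithm named above is direct to sketch: list every hypothesis in a (small, enumerable) hypothesis space, then eliminate any hypothesis that disagrees with a training example. The four-hypothesis space below is a deliberately tiny illustration.

```python
# List-Then-Eliminate over a tiny conjunctive hypothesis space.
hypotheses = [('?', '?'), ('sunny', '?'), ('?', 'warm'), ('sunny', 'warm')]

def consistent(h, x, label):
    """h is consistent with (x, label) if it covers x exactly when label is True."""
    return all(hv in ('?', xv) for hv, xv in zip(h, x)) == label

version_space = list(hypotheses)                 # start with everything
for x, label in [(('sunny', 'warm'), True), (('rainy', 'warm'), False)]:
    version_space = [h for h in version_space if consistent(h, x, label)]
print(version_space)          # → [('sunny', '?'), ('sunny', 'warm')]
```

The surviving list is exactly the version space; the method's obvious limitation is that real hypothesis spaces are usually far too large to enumerate.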



Candidate Elimination Algorithm
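The Candidate Elimination algorithm avoids enumeration by maintaining only the version space's boundaries: the specific boundary S (generalized by positives) and the general boundary G (specialized by negatives). The sketch below is a simplified version for conjunctive hypotheses; it assumes the first example is positive and does not handle an empty version space.

```python
# Simplified Candidate Elimination for conjunctive hypotheses ('?' = any).
def covers(h, x):
    """True if hypothesis h covers instance (or hypothesis) x."""
    return all(hv in ('?', xv) for hv, xv in zip(h, x))

def min_generalize(s, x):
    """Minimal generalization of s that covers positive example x."""
    return tuple(sv if sv == xv else '?' for sv, xv in zip(s, x))

def min_specializations(g, x, values):
    """Minimal specializations of g that exclude negative example x."""
    out = []
    for i, gv in enumerate(g):
        if gv == '?':
            out += [g[:i] + (v,) + g[i + 1:] for v in values[i] if v != x[i]]
    return out

def candidate_elimination(examples, values):
    """Assumes the first training example is positive."""
    S = [tuple(examples[0][0])]              # specific boundary
    G = [('?',) * len(values)]               # general boundary
    for x, label in examples:
        if label:                            # positive: prune G, grow S
            G = [g for g in G if covers(g, x)]
            S = [min_generalize(s, x) for s in S]
        else:                                # negative: prune S, refine G
            S = [s for s in S if not covers(s, x)]
            G = ([g2 for g in G if covers(g, x)
                  for g2 in min_specializations(g, x, values)
                  if all(covers(g2, s) for s in S)]
                 + [g for g in G if not covers(g, x)])
    return S, G

values = [('sunny', 'rainy'), ('warm', 'cold')]
data = [(('sunny', 'warm'), True),
        (('rainy', 'cold'), False),
        (('sunny', 'cold'), True)]
S, G = candidate_elimination(data, values)
print(S, G)   # → [('sunny', '?')] [('sunny', '?')]
```

On this toy data the two boundaries converge to the same hypothesis, meaning the version space has collapsed to a single consistent hypothesis.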





Example 3.4: Deriving the Version Space



Induction Biases



Bias and Variance



Bias vs. Variance



Bias vs. Variance Trade-off



Modelling in Machine Learning



Machine Learning Process



Re-sampling Methods



Cross-Validation



Holdout Method



K-fold Cross-Validation
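K-fold cross-validation can be sketched in plain Python (no libraries assumed; the function names are illustrative): partition the data into k folds, train on k−1 folds and score on the held-out fold, and average the k scores.

```python
# Plain-Python k-fold cross-validation sketch.
def k_fold_indices(n, k):
    """Partition indices 0..n-1 into k contiguous folds."""
    folds, start = [], 0
    for i in range(k):
        size = n // k + (1 if i < n % k else 0)   # spread any remainder
        folds.append(list(range(start, start + size)))
        start += size
    return folds

def cross_validate(data, k, train_fn, score_fn):
    """Each fold serves once as the test set; return the mean score."""
    scores = []
    for test_idx in k_fold_indices(len(data), k):
        held_out = set(test_idx)
        test = [data[i] for i in test_idx]
        train = [d for i, d in enumerate(data) if i not in held_out]
        model = train_fn(train)
        scores.append(score_fn(model, test))
    return sum(scores) / k

# Usage: a trivial "model" that predicts the training mean, scored by MSE.
data = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
mean = lambda xs: sum(xs) / len(xs)
score = lambda m, test: sum((x - m) ** 2 for x in test) / len(test)
print(cross_validate(data, 3, mean, score))       # → 6.25
```

Setting k = len(data) turns the same routine into leave-one-out cross-validation (LOOCV), discussed later in this chapter.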



Stratified K-fold Cross-Validation



Leave-One-Out Cross-Validation (LOOCV)



Model Performance





Visual Classifier Performance



Scoring Methods



Learning Frameworks
PAC Framework
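A central result of the PAC (probably approximately correct) framework for a finite hypothesis space H can be stated as a sample-complexity bound: a learner that outputs a hypothesis consistent with the training data is, with confidence at least 1 − δ, within error ε of the target, provided the number of training examples m satisfies:

```latex
% PAC sample-complexity bound for a consistent learner
% over a finite hypothesis space H:
m \ge \frac{1}{\epsilon}\left(\ln|H| + \ln\frac{1}{\delta}\right)
```

Larger hypothesis spaces, tighter error tolerances ε, or higher confidence requirements (smaller δ) all increase the number of examples needed.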



Learning Frameworks
Mistake Bound Model



Finite and Infinite Hypothesis



Estimating Hypothesis Accuracy



Hoeffding’s Inequality
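The key statement behind this slide: for a single fixed hypothesis h whose sample error is measured on m i.i.d. training examples S, Hoeffding's inequality bounds the probability that the true error exceeds the sample error by more than ε:

```latex
% Hoeffding bound relating sample error over m examples
% to the true error of a fixed hypothesis h:
\Pr\bigl[\operatorname{error}_{\mathcal{D}}(h) >
         \operatorname{error}_{S}(h) + \epsilon\bigr]
  \le e^{-2m\epsilon^{2}}
```

The bound decays exponentially in m, which is what makes sample-error estimates trustworthy once the training set is reasonably large.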



Vapnik–Chervonenkis Dimension
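The defining idea can be checked by brute force (the 1-D threshold class below is a standard textbook illustration, not necessarily the book's): a hypothesis class shatters a set of points if it can realize every possible labelling of them, and the VC dimension is the size of the largest shatterable set.

```python
# Brute-force shattering check for 1-D threshold classifiers
# h_t(x) = (x >= t): they shatter any single point but no pair,
# so the class has VC dimension 1.
from itertools import product

def shatters(points, thresholds):
    """Can some threshold realize every labelling of the points?"""
    for labels in product([False, True], repeat=len(points)):
        if not any(all((x >= t) == y for x, y in zip(points, labels))
                   for t in thresholds):
            return False                 # this labelling is unreachable
    return True

thresholds = [x + 0.5 for x in range(-1, 11)]   # candidate thresholds
print(shatters([5], thresholds))     # → True  (VC dimension >= 1)
print(shatters([3, 7], thresholds))  # → False (labelling (True, False) fails)
```

No threshold can label the left point positive and the right point negative, so no two-point set is shatterable and the VC dimension is exactly 1.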



Summary





