Computational Learning Theory

Uploaded by Chatla Chetan

COMPUTATIONAL LEARNING THEORY
Theoretical Questions of Interest
• Is it possible to identify classes of learning
problems that are inherently difficult or easy,
independent of the learning algorithm?
• Can one characterize the number of training
examples necessary or sufficient to assure
successful learning?
• How is the number of examples affected
 If the learner observes a random sample of training data?
 If the learner is allowed to pose queries to the trainer?
• Can one characterize the number of mistakes that a
learner will make before learning the target
function?
• Can one characterize the inherent computational
complexity of classes of learning problems?
• The answers to all these questions are not
yet known in general.
• Computational learning theory therefore
focuses on certain types of learning
problems.
• We focus on the problem of inductively
learning an unknown target function.
Inductive Learning of Target Function
• What we are given
 Hypothesis space
 Training examples
• What we want to know
 How many training examples are sufficient to
successfully learn the target function?
 How many mistakes will the learner make before
succeeding?
Computational Learning Theory
provides a theoretical analysis of learning:
• Is it possible to identify classes of learning
problems that are inherently difficult/easy?
• Can we characterize the computational
complexity of classes of learning problems?
 When can a learning algorithm be expected to succeed?
 When may learning be impossible?
• Can we characterize the number of training
samples necessary/sufficient for successful
learning?
• How many mistakes will the learner make
before learning the target function?
• Quantitative bounds can be set depending on
the following attributes:
 The accuracy to which the target must be approximated
 The probability that the learner will output a
successful hypothesis
 The size or complexity of the hypothesis space
considered by the learner
 The manner in which training examples are presented
to the learner
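These attributes come together in the standard bound for a consistent learner over a finite hypothesis space H: m ≥ (1/ε)(ln |H| + ln(1/δ)) examples suffice so that, with probability at least 1 − δ, any hypothesis consistent with the data has true error at most ε. A minimal sketch (the function name is our own):

```python
import math

def pac_sample_bound(epsilon: float, delta: float, hypothesis_space_size: int) -> int:
    """Sample-size bound for a consistent learner over a finite H:
    m >= (1/epsilon) * (ln|H| + ln(1/delta)) examples suffice so that,
    with probability at least 1 - delta, every hypothesis consistent
    with the sample has true error at most epsilon."""
    m = (1.0 / epsilon) * (math.log(hypothesis_space_size) + math.log(1.0 / delta))
    return math.ceil(m)

# Example: |H| = 2**10 hypotheses, 95% confidence (delta = 0.05), 10% accuracy
print(pac_sample_bound(0.1, 0.05, 2**10))  # → 100
```

Note how the bound grows only logarithmically in |H| and 1/δ, but linearly in 1/ε.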
Computational Learning Theory
Three general areas:
• Sample complexity. How many training
examples are needed to find a good hypothesis?
• Computational complexity. How much
computational power do we need to find a good
hypothesis?
• Mistake bound. How many mistakes will we
make before finding a good hypothesis?
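As a concrete illustration of the mistake-bound setting, here is a minimal sketch (not from the source) of the classic Halving algorithm, which makes at most log2 |H| mistakes when the target concept lies in the finite class H; the threshold class below is a hypothetical example.

```python
import math

def halving_algorithm(hypotheses, stream):
    """Predict by majority vote of the version space; each mistake
    eliminates at least half the remaining hypotheses, so at most
    log2(|H|) mistakes occur when the target is in H.
    `hypotheses` is a list of functions; `stream` yields (x, label)."""
    version_space = list(hypotheses)
    mistakes = 0
    for x, label in stream:
        votes = [h(x) for h in version_space]
        prediction = max(set(votes), key=votes.count)  # majority vote
        if prediction != label:
            mistakes += 1
        # keep only hypotheses consistent with the revealed label
        version_space = [h for h in version_space if h(x) == label]
    return mistakes

# Hypothetical class: thresholds on {0,...,7}, h_t(x) = 1 iff x >= t
H = [lambda x, t=t: int(x >= t) for t in range(8)]
target = H[5]
stream = [(x, target(x)) for x in [0, 7, 3, 5, 4, 6, 2, 1]]
assert halving_algorithm(H, stream) <= math.ceil(math.log2(len(H)))  # <= 3 mistakes
```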
Sample Complexity
How many training examples are sufficient to learn the target concept?
• Scenario 1: Active learning
 Learner proposes instances as queries to a teacher
 Query (learner): instance x
 Answer (teacher): c(x)
• Scenario 2: Passive learning from teacher-selected
examples
 Teacher (who knows c) provides training examples
 Sequence of examples (teacher): {<x, c(x)>}
 Teacher may or may not be helpful or optimal
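To see why queries help in Scenario 1, consider a hypothetical threshold concept on {0, ..., n−1}: an active learner can recover it exactly by binary search with about log2(n) queries, whereas a passive learner may need on the order of n random examples. A sketch, with names of our own choosing:

```python
def learn_threshold_by_queries(oracle, n):
    """Active learning sketch (Scenario 1): the learner poses instances x
    as queries and the teacher answers c(x). For a threshold concept
    c(x) = 1 iff x >= t on {0, ..., n-1}, binary search recovers t with
    about log2(n) queries."""
    lo, hi = 0, n  # invariant: t lies in [lo, hi]
    queries = 0
    while lo < hi:
        mid = (lo + hi) // 2
        queries += 1
        if oracle(mid):      # c(mid) = 1 means t <= mid
            hi = mid
        else:                # c(mid) = 0 means t > mid
            lo = mid + 1
    return lo, queries

t_true = 42
oracle = lambda x: x >= t_true
t_hat, q = learn_threshold_by_queries(oracle, 1024)
print(t_hat, q)  # → 42 10
```

Ten queries suffice for n = 1024, versus hundreds of random examples for the same guarantee.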
Sample Complexity
How many training examples are sufficient to learn the target concept?

• Scenario 3: Passive learning from teacher-annotated
examples
 Random process (e.g., nature) proposes instances
 Instance x is generated randomly; teacher provides
c(x)
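Scenario 3 is the setting behind the sample-complexity bounds above: instances arrive at random, the teacher supplies c(x), and the learner outputs any hypothesis consistent with the sample. A minimal sketch (all names hypothetical):

```python
import random

def consistent_learner(hypotheses, sample):
    """Return any hypothesis that agrees with every labeled example,
    or None if no hypothesis in the class is consistent."""
    for h in hypotheses:
        if all(h(x) == y for x, y in sample):
            return h
    return None

random.seed(0)
# Hypothetical finite class: thresholds on {0,...,99}, h_t(x) = 1 iff x >= t
H = [lambda x, t=t: int(x >= t) for t in range(100)]
target = H[30]
# Nature draws instances at random; the teacher provides the label c(x)
sample = [(x, target(x)) for x in (random.randrange(100) for _ in range(50))]
h = consistent_learner(H, sample)
assert h is not None and all(h(x) == y for x, y in sample)
```

With enough random examples (per the earlier bound), any such consistent hypothesis is probably approximately correct.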
