
Generalization

• One of the major advantages of neural networks is their ability to generalize. This means that a trained network can correctly classify data from the same class as its training data even if it has never seen those particular examples before.
• In other words, the generalization of a neural network is its ability to handle unseen data.

• In real-world applications, developers normally have only a small part of all possible patterns available for building a neural network. To achieve the best generalization, the dataset should be split into three parts (a minimal sketch of such a split follows this list):
(1) The training set is used to train the neural network. The error on this set is minimized during training.
(2) The validation set is used to measure the performance of the network on patterns that are not used during learning.
(3) The test set is used for a final check of the overall performance of the network.
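As an illustration only (not part of the original slides), here is a minimal sketch of such a three-way split using scikit-learn's train_test_split; the 60/20/20 proportions and the placeholder arrays X and y are assumptions.

```python
# Minimal sketch: split a dataset into training, validation, and test sets.
# X and y are placeholder arrays; the 60/20/20 proportions are arbitrary.
import numpy as np
from sklearn.model_selection import train_test_split

X = np.random.rand(1000, 10)        # placeholder feature matrix
y = np.random.randint(0, 2, 1000)   # placeholder binary labels

# First split off the test set (20% of the data).
X_trainval, X_test, y_trainval, y_test = train_test_split(
    X, y, test_size=0.20, random_state=42)

# Then split the remainder into training (60%) and validation (20%) sets.
X_train, X_val, y_train, y_val = train_test_split(
    X_trainval, y_trainval, test_size=0.25, random_state=42)  # 0.25 of 80% = 20%
```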
Generalization error
• generalization error (also known as the out-
of-sample error]) is a measure of how
accurately an algorithm is able to predict
outcome values for previously unseen data.
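As a brief illustration (not from the original slides), the generalization error can be written as an expected loss over unseen data and estimated by the average loss on a held-out test set; the symbols f, L, D, and N_test are assumed notation.

```latex
% Assumed notation: f is the trained network, L a loss function, D the data
% distribution, and (x_i, y_i) the N_test held-out test examples.
E_{\text{gen}} = \mathbb{E}_{(x,y)\sim D}\left[ L\big(f(x), y\big) \right]
\;\approx\;
\frac{1}{N_{\text{test}}} \sum_{i=1}^{N_{\text{test}}} L\big(f(x_i), y_i\big)
```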

Methods for improving generalization of Neural Networks
• The following methods can be used to improve generalization; regularization and early stopping are implemented in Neural Network Toolbox™ software:
(1) Regularization
(2) Early stopping
(3) Retraining neural networks

(1) Regularization
• One method for improving generalization is called regularization. It involves modifying the performance function, which is normally chosen to be the sum of squares of the network errors on the training set.
• The typical performance function used for training neural networks is the mean of the sum of squares of the network errors (the mean squared error, mse).
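For concreteness (not part of the original slides), this performance function can be written as follows, where t_i are the targets, a_i the network outputs, and N the number of training examples (assumed notation).

```latex
% Assumed notation: e_i = t_i - a_i are the network errors on the training set.
\text{mse} = \frac{1}{N} \sum_{i=1}^{N} e_i^2
           = \frac{1}{N} \sum_{i=1}^{N} \left( t_i - a_i \right)^2
```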

• It is possible to improve generalization by modifying the performance function, adding a term that consists of the mean of the sum of squares of the network weights and biases, as written out below.
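One common form of this modified performance function, sketched here under assumed notation (γ is a user-chosen performance ratio between 0 and 1, and w_j are the n network weights and biases), combines the mean squared error with the mean squared weights.

```latex
% Assumed notation: msw is the mean of the squared weights and biases;
% gamma is a user-chosen performance ratio between 0 and 1.
\text{msw}    = \frac{1}{n} \sum_{j=1}^{n} w_j^2
\qquad
\text{msereg} = \gamma \, \text{mse} + (1 - \gamma) \, \text{msw}
```

Penalizing large weights in this way encourages a smoother network response and therefore better generalization.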

(2) Early stopping
• The default method for improving generalization is called early stopping.
• In this technique the available data is divided into three subsets; the error on the validation set is monitored during training, and training is stopped when the validation error reaches its minimum and begins to rise (a minimal sketch follows this list).
(1) Training set
(2) Validation set
(3) Test set
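As an illustration only, here is a minimal early-stopping loop under stated assumptions: train_one_epoch and validation_loss are hypothetical helper functions standing in for whatever training framework is used, and the patience of 5 epochs and limit of 200 epochs are arbitrary.

```python
# Minimal early-stopping sketch. train_one_epoch(net, train_data) and
# validation_loss(net, val_data) are hypothetical placeholders; patience
# and max_epochs are arbitrary choices.
import copy

def train_with_early_stopping(net, train_data, val_data,
                              max_epochs=200, patience=5):
    best_val = float("inf")
    best_net = copy.deepcopy(net)
    epochs_without_improvement = 0

    for epoch in range(max_epochs):
        train_one_epoch(net, train_data)           # minimize training error
        val_loss = validation_loss(net, val_data)  # monitor validation error

        if val_loss < best_val:
            best_val = val_loss
            best_net = copy.deepcopy(net)          # remember the best weights
            epochs_without_improvement = 0
        else:
            epochs_without_improvement += 1
            if epochs_without_improvement >= patience:
                break                              # validation error is rising

    return best_net, best_val
```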

(3) Retraining Neural Networks
• Typically, each backpropagation training session starts with different initial weights and biases, and a different division of the data into training, validation, and test sets. These different conditions can lead to very different solutions for the same problem.
• It is therefore a good idea to train several networks to ensure that a network with good generalization is found; a sketch of this follows.
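As a hedged sketch (not part of the original slides), the retraining idea can be written as a simple loop; build_network, split_data, and test_loss are hypothetical helpers, train_with_early_stopping is the earlier sketch, and n_runs = 10 is an arbitrary choice.

```python
# Hypothetical sketch of retraining: train several networks from different
# random initializations and data splits, then keep the one whose validation
# error is lowest. build_network, split_data, and test_loss are placeholders.
def retrain_and_select(data, n_runs=10):
    best_net, best_val = None, float("inf")
    for run in range(n_runs):
        train_data, val_data, test_data = split_data(data, seed=run)
        net = build_network(seed=run)   # fresh random initial weights/biases
        net, val_loss = train_with_early_stopping(net, train_data, val_data)
        print(f"run {run}: validation loss = {val_loss:.4f}, "
              f"test loss = {test_loss(net, test_data):.4f}")
        if val_loss < best_val:
            best_net, best_val = net, val_loss
    return best_net
```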