Cross Validation
There are several common methods used for cross-validation. These methods are given below:
1. Validation Set Approach
2. Leave-P-out cross-validation
3. Leave one out cross-validation
4. K-fold cross-validation
5. Stratified k-fold cross-validation
1. Validation Set Approach
• In the validation set approach, we divide the input dataset into a training set and a test (validation) set. Each subset is given 50% of the dataset.
• One of the big disadvantages is that the model is trained on only 50% of the data, so it may fail to capture important information in the dataset. It also tends to give an underfitted model. A minimal sketch of this approach is shown below.
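The following is a minimal sketch of the validation set approach, assuming scikit-learn is available; the iris dataset and logistic regression estimator are illustrative choices, not prescribed by the text.

```python
# Validation set approach: a single 50/50 split into training and validation data.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)

# Give 50% of the data to the training set and 50% to the validation set.
X_train, X_val, y_train, y_val = train_test_split(
    X, y, test_size=0.5, random_state=42
)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

# The model only ever sees half of the data, which is the main drawback.
print("Validation accuracy:", model.score(X_val, y_val))
```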
2. Leave-P-out cross-validation
• In this method, p data points are left out of the training data.
• If there are n total data points in the input dataset, then n-p data points are used as the training set and the p data points as the validation set.
• This process is repeated for every possible combination of p data points, and the average error is calculated to assess the effectiveness of the model.
• The disadvantage of this technique is that it can be computationally expensive for large p, since the number of combinations grows very quickly. A sketch of this method follows this list.
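The sketch below illustrates leave-p-out cross-validation with scikit-learn's LeavePOut splitter; the dataset, estimator, p = 2, and the small data subset are illustrative assumptions.

```python
# Leave-p-out cross-validation: every combination of p points serves as the
# validation set exactly once, and the remaining n-p points are used for training.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeavePOut, cross_val_score

X, y = load_iris(return_X_y=True)

# Keep n small: the number of splits is C(n, p), which grows combinatorially.
X_small, y_small = X[::10], y[::10]   # 15 points spanning all three classes

lpo = LeavePOut(p=2)
model = LogisticRegression(max_iter=1000)

# Train on n-p points and score on the p held-out points for every split,
# then average the scores to estimate model performance.
scores = cross_val_score(model, X_small, y_small, cv=lpo)
print("Number of splits:", lpo.get_n_splits(X_small))
print("Mean accuracy over all splits:", scores.mean())
```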
3. Leave one out cross-validation