Cross-validation is a technique in which we train our model on a subset of the data and then evaluate it on the complementary subset of the dataset.

Cross-validation involves the following steps:

1. Reserve some portion of the sample dataset.

2. Train your model using the rest of the data set.

3. Use the reserved portion of the dataset to test your model.
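The three steps above can be sketched in plain Python. The trivial "model" here (predicting the training mean) and the 70/30 split are assumptions for illustration only:

```python
# Toy dataset: ten numeric targets.
data = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0, 10.0]

# Step 1: reserve the last 30% of the sample as a test set.
split = int(len(data) * 0.7)
train, test = data[:split], data[split:]

# Step 2: "train" a trivial model on the rest: predict the training mean.
prediction = sum(train) / len(train)

# Step 3: evaluate on the reserved portion (mean squared error).
mse = sum((y - prediction) ** 2 for y in test) / len(test)
print(round(mse, 2))  # → 25.67
```

Any real learner would replace the mean predictor, but the reserve/train/test structure stays the same.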

There are several cross-validation methods; they are as follows:

**1. Validation:**

In this method we perform a train-test split using 50% of the dataset for each part. The 50% reserved for testing may contain important information that the model never sees during training; if so, the model will be highly biased and will perform poorly.
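A minimal 50-50 validation split might look like this (shuffling before splitting, assumed here, reduces the risk of an unrepresentative half):

```python
import random

random.seed(0)  # fixed seed so the split is reproducible
data = list(range(100))  # toy dataset of 100 samples
random.shuffle(data)

# 50-50 split: first half for training, second half for testing.
half = len(data) // 2
train, test = data[:half], data[half:]
print(len(train), len(test))  # → 50 50
```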

**2. Leave One Out Cross Validation (LOOCV):**

In this method we train on the whole dataset except for a single data point, test on that point, and repeat the process for every data point. Its advantage is that every data point is used for both training and testing, so all the features are covered by the model and the estimate is less biased. Its disadvantage is that it requires one training run per data point, so it takes a long time to execute on large datasets.
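LOOCV can be sketched in plain Python; the trivial mean-predicting "model" is again an assumption for illustration. Note the loop runs once per data point:

```python
data = [2.0, 4.0, 6.0, 8.0]  # toy targets

errors = []
# Leave one point out per iteration; train on the remaining points.
for i in range(len(data)):
    held_out = data[i]
    rest = data[:i] + data[i + 1:]
    prediction = sum(rest) / len(rest)  # trivial mean "model"
    errors.append((held_out - prediction) ** 2)

# One squared error per data point; average them for the LOOCV estimate.
loocv_error = sum(errors) / len(errors)
print(round(loocv_error, 2))  # → 8.89
```

With n data points this performs n training runs, which is exactly why LOOCV becomes expensive as the dataset grows.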

**3. K-fold Cross Validation :**

This method addresses the problems we faced in the two methods above. We split the dataset into k subsets, known as folds. In each iteration we train on k-1 of the folds and hold out the remaining fold for evaluation of the trained model.

We record the error on each of the k held-out folds; the average of these k errors is called the cross-validation error and serves as the performance metric for the model.
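The k-fold procedure can be sketched as follows (k = 4, the 12-point toy dataset, and the trivial mean "model" are assumptions for illustration):

```python
data = list(range(1, 13))  # 12 toy targets
k = 4
fold_size = len(data) // k

fold_errors = []
for fold in range(k):
    # Hold out one fold for testing; train on the other k-1 folds.
    start, end = fold * fold_size, (fold + 1) * fold_size
    test = data[start:end]
    train = data[:start] + data[end:]
    prediction = sum(train) / len(train)  # trivial mean "model"
    fold_errors.append(sum((y - prediction) ** 2 for y in test) / len(test))

# Average the k per-fold errors: the cross-validation error.
cv_error = sum(fold_errors) / k
print(round(cv_error, 2))  # → 20.67
```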

*(Figure: K-fold cross-validation visual)*

Note: It is suggested that the value of k should be 10, as a lower value of k takes us towards the simple validation process and a larger value takes us towards the LOOCV method.

**4. Stratified K-fold Cross Validation:**

Stratification is the process of rearranging the data so as to ensure that each fold is a good representative of the whole, i.e. each fold preserves roughly the same class proportions as the full dataset.
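A sketch of stratified fold assignment: indices are grouped by class and dealt round-robin across folds, so every fold keeps the dataset's 2:1 class ratio (the toy labels are an assumption for illustration):

```python
from collections import defaultdict

# Toy imbalanced labels: 8 samples of class 0, 4 of class 1.
labels = [0] * 8 + [1] * 4
k = 4

# Group sample indices by class label.
by_class = defaultdict(list)
for i, y in enumerate(labels):
    by_class[y].append(i)

# Deal each class's indices round-robin across the k folds.
folds = [[] for _ in range(k)]
for indices in by_class.values():
    for j, i in enumerate(indices):
        folds[j % k].append(i)

for fold in folds:
    counts = [sum(labels[i] == c for i in fold) for c in (0, 1)]
    print(counts)  # every fold prints [2, 1]: the same 2:1 ratio as the whole
```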
