Encyclopedia of Machine Learning and Data Mining

2017 Edition
| Editors: Claude Sammut, Geoffrey I. Webb

Leave-One-Out Cross-Validation

Reference work entry
DOI: https://doi.org/10.1007/978-1-4899-7687-1_469


Leave-one-out cross-validation is a special case of cross-validation where the number of folds equals the number of instances in the data set. Thus, the learning algorithm is applied once for each instance, using all other instances as a training set and using the selected instance as a single-item test set. This process is closely related to the statistical method of jackknife estimation (Efron 1982).
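The procedure described above can be sketched in a few lines of Python. This is a minimal illustration, not a reference implementation: the 1-nearest-neighbour classifier and the toy data set are hypothetical choices made only to keep the example self-contained.

```python
# Leave-one-out cross-validation: for each instance, train on all
# other instances and test on the held-out one (hypothetical example
# using a simple 1-nearest-neighbour classifier on labelled points).

def one_nn_predict(train, x):
    """Predict the label of x from its single nearest training neighbour."""
    nearest = min(train, key=lambda pair: abs(pair[0] - x))
    return nearest[1]

def leave_one_out_accuracy(data):
    """Run one train/test round per instance and return mean accuracy."""
    correct = 0
    for i, (x, y) in enumerate(data):
        train = data[:i] + data[i + 1:]  # all instances except the i-th
        correct += (one_nn_predict(train, x) == y)
    return correct / len(data)

# Toy data set of (feature, label) pairs, chosen for illustration only.
data = [(1.0, "a"), (1.2, "a"), (3.0, "b"), (3.3, "b"), (2.0, "a")]
print(leave_one_out_accuracy(data))  # → 1.0 on this toy data
```

Note that the classifier is retrained once per instance, so with n instances the learning algorithm runs n times; this is what makes leave-one-out cross-validation expensive for large data sets.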


Recommended Reading

  1. Efron B (1982) The jackknife, the bootstrap and other resampling plans. CBMS-NSF regional conference series in applied mathematics. Society for Industrial and Applied Mathematics (SIAM), Philadelphia

Copyright information

© Springer Science+Business Media New York 2017