Definition
Leave-one-out cross-validation (LOOCV) is a special case of cross-validation in which the number of folds equals the number of instances in the data set. Thus, the learning algorithm is applied once for each instance, using all other instances as the training set and the selected instance as a single-item test set. This process is closely related to the statistical method of jackknife estimation (Efron 1982).
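The procedure above can be sketched in a few lines of Python. This is a minimal illustration, not an implementation from the entry: the 1-nearest-neighbour classifier, the helper names, and the toy data set are all assumptions chosen to keep the example self-contained.

```python
def nearest_neighbour_predict(train, x):
    """Predict the label of x as the label of its closest training instance
    (1-nearest neighbour under squared Euclidean distance)."""
    closest = min(train,
                  key=lambda pair: sum((a - b) ** 2 for a, b in zip(pair[0], x)))
    return closest[1]

def leave_one_out_accuracy(data):
    """Leave-one-out cross-validation: for each of the n instances, train on
    the other n - 1 instances and test on the held-out one (n folds in total)."""
    correct = 0
    for i, (x, y) in enumerate(data):
        train = data[:i] + data[i + 1:]          # all other instances
        if nearest_neighbour_predict(train, x) == y:
            correct += 1
    return correct / len(data)

# Toy data set: (features, label) pairs.
data = [((0.0, 0.0), "a"), ((0.1, 0.2), "a"),
        ((1.0, 1.0), "b"), ((0.9, 1.1), "b")]
print(leave_one_out_accuracy(data))
```

Note that the learner is retrained once per instance, which is what makes LOOCV expensive for large data sets or costly training procedures.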
Recommended Reading
Efron B (1982) The jackknife, the bootstrap and other resampling plans. CBMS-NSF regional conference series in applied mathematics. Society for Industrial and Applied Mathematics (SIAM), Philadelphia
Copyright information
© 2017 Springer Science+Business Media New York
Cite this entry
(2017). Leave-One-Out Cross-Validation. In: Sammut, C., Webb, G.I. (eds) Encyclopedia of Machine Learning and Data Mining. Springer, Boston, MA. https://doi.org/10.1007/978-1-4899-7687-1_469
Print ISBN: 978-1-4899-7685-7
Online ISBN: 978-1-4899-7687-1