Leave-One-Out Cross-Validation

  • Reference work entry
Encyclopedia of Machine Learning

Definition

Leave-one-out cross-validation is a special case of cross-validation in which the number of folds equals the number of instances in the data set. The learning algorithm is therefore applied once for each instance, using all other instances as the training set and the selected instance as a single-item test set. This procedure is closely related to the statistical method of jackknife estimation (Efron, 1982).
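The procedure above can be sketched in a few lines. This is a minimal illustration, not part of the original entry: the 1-nearest-neighbour classifier (`predict_1nn`) and the toy one-dimensional data set are assumptions chosen to keep the example self-contained.

```python
# Minimal sketch of leave-one-out cross-validation (pure Python).
# The 1-NN learner and toy data below are illustrative assumptions.

def predict_1nn(train, x):
    """Return the label of the training instance nearest to x (1-NN)."""
    nearest = min(train, key=lambda pair: abs(pair[0] - x))
    return nearest[1]

def loocv_accuracy(data):
    """Leave-one-out CV: the learner runs once per instance, so the
    number of folds equals the number of instances."""
    correct = 0
    for i, (x, y) in enumerate(data):
        train = data[:i] + data[i + 1:]   # all other instances
        if predict_1nn(train, x) == y:    # single-item test set
            correct += 1
    return correct / len(data)

# Toy data set of (feature, label) pairs.
data = [(0.0, "a"), (0.1, "a"), (0.2, "a"),
        (1.0, "b"), (1.1, "b"), (1.2, "b")]
print(loocv_accuracy(data))  # the learner is applied len(data) times
```

Because every instance is held out exactly once, the resulting accuracy estimate uses almost all of the data for training in each run, at the cost of running the learner n times.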

Cross References

Algorithm Evaluation

Recommended Reading

  • Efron, B. (1982). The jackknife, the bootstrap and other resampling plans. CBMS-NSF Regional Conference Series in Applied Mathematics. Philadelphia, PA: Society for Industrial and Applied Mathematics (SIAM).

Copyright information

© 2011 Springer Science+Business Media, LLC

Cite this entry

(2011). Leave-One-Out Cross-Validation. In: Sammut, C., Webb, G.I. (eds) Encyclopedia of Machine Learning. Springer, Boston, MA. https://doi.org/10.1007/978-0-387-30164-8_469
