Encyclopedia of Machine Learning

2010 Edition
Editors: Claude Sammut, Geoffrey I. Webb

Accuracy

DOI: https://doi.org/10.1007/978-0-387-30164-8_3

Definition

Accuracy refers to a measure of the degree to which the predictions of a model match the reality being modeled. The term accuracy is often applied in the context of classification models. In this context, accuracy = P(λ(X) = Y), where X, Y is a joint distribution and the classification model λ is a function X → Y. Sometimes this quantity is expressed as a percentage rather than a value between 0.0 and 1.0.
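Equivalently, the definition can be restated in expectation form (a LaTeX rendering of the formula above; the subscript simply makes the joint distribution over X and Y explicit, and adds nothing beyond the entry's definition):

    \mathrm{accuracy}(\lambda) \;=\; \Pr_{(X,Y)}\bigl[\lambda(X) = Y\bigr] \;=\; \mathbb{E}_{(X,Y)}\bigl[\mathbf{1}\{\lambda(X) = Y\}\bigr]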

The accuracy of a model is often assessed or estimated by applying it to test data for which the labels (Y values) are known. The accuracy of a classifier on test data may be calculated as the number of correctly classified objects divided by the total number of objects. Alternatively, a smoothing function may be applied, such as a Laplace estimate or an m-estimate (see the sketch below).
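A minimal sketch of these estimates in Python (the function names, and the m-estimate defaults m = 2.0 and prior = 0.5, are illustrative assumptions, not prescribed by the entry):

    def test_set_accuracy(predictions, labels):
        # Raw accuracy: correctly classified objects / total objects.
        correct = sum(p == y for p, y in zip(predictions, labels))
        return correct / len(labels)

    def laplace_accuracy(predictions, labels):
        # Laplace estimate: (correct + 1) / (total + 2), treating each
        # prediction as a binary correct/incorrect outcome.
        correct = sum(p == y for p, y in zip(predictions, labels))
        return (correct + 1) / (len(labels) + 2)

    def m_estimate_accuracy(predictions, labels, m=2.0, prior=0.5):
        # m-estimate: (correct + m * prior) / (total + m), where prior is a
        # prior expectation of the accuracy and m weights that prior.
        # Both default values here are hypothetical, chosen for illustration.
        correct = sum(p == y for p, y in zip(predictions, labels))
        return (correct + m * prior) / (len(labels) + m)

    # Example: 6 of 8 test objects correctly classified.
    preds = [1, 0, 1, 1, 0, 1, 1, 0]
    truth = [1, 0, 0, 1, 0, 1, 1, 1]
    print(test_set_accuracy(preds, truth))    # 0.75
    print(laplace_accuracy(preds, truth))     # 0.7
    print(m_estimate_accuracy(preds, truth))  # 0.7

Both smoothed variants pull the raw proportion toward a prior value, which matters most when the test set is small and the raw proportion is therefore a high-variance estimate.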

Accuracy is directly related to error rate, such that accuracy = 1.0 − error rate (or when expressed as a percentage, accuracy = 100 − error rate).

