Definition
Accuracy is a measure of the degree to which the predictions of a model match the reality being modeled. The term is most often applied in the context of classification models. In this context, accuracy = P(λ(X) = Y), where (X, Y) is a pair drawn from the joint distribution over inputs and labels and the classification model λ is a function λ: X → Y. Sometimes this quantity is expressed as a percentage rather than as a value between 0.0 and 1.0.
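In display form, and adding the equivalent formulation as the expectation of a 0–1 indicator (a standard restatement, not taken from this entry):

\[
\mathrm{accuracy}(\lambda) \;=\; P\bigl(\lambda(X) = Y\bigr) \;=\; \mathbb{E}\bigl[\mathbf{1}\{\lambda(X) = Y\}\bigr], \qquad (X, Y) \sim \text{the joint input–label distribution.}
\]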
The accuracy of a model is often assessed or estimated by applying it to test data for which the labels (Y values) are known. The accuracy of a classifier on test data may be calculated as the number of correctly classified objects divided by the total number of objects. Alternatively, a smoothing function may be applied, such as a Laplace estimate or an m-estimate.
Accuracy is directly related to error rate: accuracy = 1.0 − error rate (or, when both are expressed as percentages, accuracy = 100 − error rate).
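As a concrete illustration of these estimates, the Python sketch below computes the raw test-set accuracy, a Laplace-corrected estimate, an m-estimate, and the corresponding error rate. The counts, the parameter values m = 10 and p = 0.5, and the particular smoothing forms used, (correct + 1)/(total + 2) for the Laplace estimate and (correct + m·p)/(total + m) for the m-estimate, are illustrative assumptions, not prescriptions from this entry.

    def accuracy(correct: int, total: int) -> float:
        # Raw accuracy: correctly classified objects / total objects.
        return correct / total

    def laplace_accuracy(correct: int, total: int) -> float:
        # Laplace-corrected estimate for a two-outcome proportion
        # (assumed form: (correct + 1) / (total + 2)).
        return (correct + 1) / (total + 2)

    def m_estimate_accuracy(correct: int, total: int, m: float, p: float) -> float:
        # m-estimate: blends the observed proportion with a prior
        # expectation p, weighted as if backed by m extra objects
        # (assumed form: (correct + m * p) / (total + m)).
        return (correct + m * p) / (total + m)

    correct, total = 90, 100  # hypothetical test-set counts
    acc = accuracy(correct, total)
    print(f"accuracy:         {acc:.3f}")                                       # 0.900
    print(f"Laplace estimate: {laplace_accuracy(correct, total):.3f}")          # 0.892
    print(f"m-estimate:       {m_estimate_accuracy(correct, total, 10, 0.5):.3f}")  # 0.864
    print(f"error rate:       {1.0 - acc:.3f}")                                 # 0.100

The smoothed estimates pull the raw proportion toward a prior value, which avoids extreme estimates of 0.0 or 1.0 when the test set is small.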