# Accuracy

**DOI:** https://doi.org/10.1007/978-1-4899-7687-1_3

## Definition

Accuracy is a measure of the degree to which the predictions of a model match the reality being modeled. The term *accuracy* is most often applied in the context of classification models. In this context, *accuracy* = P(*λ*(*X*) = *Y*), where (*X*, *Y*) follows a joint distribution and the classification model *λ* is a function *X* → *Y*. Sometimes this quantity is expressed as a percentage rather than a value between 0.0 and 1.0.

The accuracy of a model is often assessed or estimated by applying it to test data for which the labels (*Y* values) are known. The accuracy of a classifier on test data may be calculated as *number of correctly classified objects/total number of objects*. Alternatively, a smoothing function may be applied, such as a Laplace estimate or an *m*-estimate.
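The estimates above can be sketched in a few lines of Python. The plain estimate is the fraction of correct classifications; the smoothed variants add pseudo-counts so that small test sets do not yield extreme values. The exact smoothing formulas vary in the literature; the forms below (Laplace: add one success and one count per class; *m*-estimate: add *m* virtual trials at a prior accuracy *p*) are one common convention, used here for illustration.

```python
def accuracy(y_true, y_pred):
    """Plain estimate: number of correctly classified objects / total number of objects."""
    correct = sum(t == p for t, p in zip(y_true, y_pred))
    return correct / len(y_true)

def laplace_accuracy(y_true, y_pred, n_classes=2):
    """Laplace-smoothed estimate: (correct + 1) / (total + n_classes).
    One common convention; others add different pseudo-counts."""
    correct = sum(t == p for t, p in zip(y_true, y_pred))
    return (correct + 1) / (len(y_true) + n_classes)

def m_estimate_accuracy(y_true, y_pred, m=2.0, prior=0.5):
    """m-estimate: (correct + m * prior) / (total + m),
    i.e. m virtual trials at an assumed prior accuracy."""
    correct = sum(t == p for t, p in zip(y_true, y_pred))
    return (correct + m * prior) / (len(y_true) + m)

y_true = [1, 0, 1, 1, 0]
y_pred = [1, 0, 0, 1, 0]
print(accuracy(y_true, y_pred))            # 0.8
print(laplace_accuracy(y_true, y_pred))    # (4 + 1) / (5 + 2)
print(m_estimate_accuracy(y_true, y_pred)) # (4 + 1.0) / (5 + 2.0)
```

Note how both smoothed estimates pull the raw value 0.8 toward the prior, which matters most when the test set is small.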

Accuracy is directly related to error rate, such that *accuracy* = 1.0 − *error rate* (or, when expressed as a percentage, *accuracy* = 100 − *error rate*).