Mean Absolute Error

Reference work entry, Encyclopedia of Machine Learning

Synonyms

Absolute error loss; Mean absolute deviation; Mean error

Definition

Mean Absolute Error is a model evaluation metric used with regression models. The mean absolute error of a model with respect to a test set is the mean of the absolute values of the individual prediction errors over all instances in the test set. Each prediction error is the difference between the true value and the predicted value for the instance.

$$mae = \frac{\sum_{i=1}^{n} \left| y_i - \lambda(x_i) \right|}{n}$$

where $y_i$ is the true target value for test instance $x_i$, $\lambda(x_i)$ is the predicted target value for test instance $x_i$, and $n$ is the number of test instances.
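As an illustration (not part of the original entry), the following is a minimal Python sketch of the formula above; the function name `mean_absolute_error` and the sample values are assumptions chosen for the example.

```python
def mean_absolute_error(y_true, y_pred):
    """Mean of the absolute prediction errors over all test instances."""
    if len(y_true) != len(y_pred):
        raise ValueError("y_true and y_pred must have the same length")
    n = len(y_true)
    # Sum of |y_i - lambda(x_i)| over all instances, divided by n
    return sum(abs(y - p) for y, p in zip(y_true, y_pred)) / n

# Hypothetical test set: true target values y_i and predictions lambda(x_i)
y_true = [3.0, -0.5, 2.0, 7.0]
y_pred = [2.5,  0.0, 2.0, 8.0]
print(mean_absolute_error(y_true, y_pred))  # 0.5
```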

Cross References

Mean Squared Error


Copyright information

© 2011 Springer Science+Business Media, LLC

About this entry

Cite this entry

(2011). Mean Absolute Error. In: Sammut, C., Webb, G.I. (eds) Encyclopedia of Machine Learning. Springer, Boston, MA. https://doi.org/10.1007/978-0-387-30164-8_525
