
Unstable Learner

  • Reference work entry
Encyclopedia of Machine Learning

An unstable learner produces large differences in generalization patterns when small changes are made to its initial conditions. The most common such condition is the training data itself: for an unstable learner, sampling a slightly different training set produces a large difference in testing behavior. Some models are unstable in additional ways; neural networks, for example, are unstable with respect to their initial weights. In general this is an undesirable property: high sensitivity to training conditions is also known as high variance, which inflates overall mean squared error. The flexibility that comes from being sensitive to the data can thus be a blessing or a curse. Unstable learners can, however, be used to advantage in ensemble learning methods, where the large variance is “averaged out” across multiple learners.
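The variance-reduction effect of ensembles can be illustrated with a small simulation. The sketch below (illustrative only; the stump learner, data generator, and all names are this example's, not the entry's) fits a regression stump, a one-split decision tree and hence an unstable learner, to freshly sampled training sets and measures how much its prediction at a fixed test point varies. It then does the same for a bagged ensemble of stumps fit on bootstrap resamples, whose averaged prediction should vary considerably less.

```python
import random
import statistics

random.seed(0)

def make_data(n=30):
    """Noisy step function: y jumps from 0 to 1 at x = 0.5."""
    xs = [random.random() for _ in range(n)]
    return [(x, (1.0 if x > 0.5 else 0.0) + random.gauss(0, 0.3)) for x in xs]

def fit_stump(data):
    """Regression stump: pick the split threshold minimizing squared error."""
    xs = sorted(x for x, _ in data)
    best = None
    for i in range(1, len(xs)):
        t = (xs[i - 1] + xs[i]) / 2
        left = [y for x, y in data if x <= t]
        right = [y for x, y in data if x > t]
        if not left or not right:
            continue
        ml, mr = statistics.fmean(left), statistics.fmean(right)
        err = sum((y - (ml if x <= t else mr)) ** 2 for x, y in data)
        if best is None or err < best[0]:
            best = (err, t, ml, mr)
    _, t, ml, mr = best
    return lambda x: ml if x <= t else mr

x0 = 0.55  # test point near the step, where the learned split jitters most

# Variance of a single stump's prediction across independent training sets.
single = [fit_stump(make_data())(x0) for _ in range(200)]
var_single = statistics.pvariance(single)

# Variance of a bagged ensemble: average 25 stumps, each fit on a
# bootstrap resample of the training set, then vary the training set.
bagged = []
for _ in range(50):
    data = make_data()
    boots = [[random.choice(data) for _ in data] for _ in range(25)]
    bagged.append(statistics.fmean(fit_stump(b)(x0) for b in boots))
var_bagged = statistics.pvariance(bagged)

print(f"single-stump variance: {var_single:.3f}")
print(f"bagged-stump variance: {var_bagged:.3f}")
```

Because the single stump's prediction near the step flips between roughly 0 and 1 depending on where the split lands, its variance across training sets is large; averaging many bootstrap stumps smooths the flip and shrinks the variance, which is exactly why unstable learners suit bagging-style ensembles.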

Examples of unstable learners include neural networks (assuming gradient-descent learning) and decision trees. Examples of stable learners are support vector...



Copyright information

© 2011 Springer Science+Business Media, LLC

Cite this entry

(2011). Unstable Learner. In: Sammut, C., Webb, G.I. (eds) Encyclopedia of Machine Learning. Springer, Boston, MA. https://doi.org/10.1007/978-0-387-30164-8_866
