Gradient Descent

  • David Paper
Chapter

Abstract

Gradient descent (GD) is an algorithm for minimizing (or, run in reverse, maximizing) functions. To apply it, start from an initial set of values for the function's parameters and iteratively move toward a set of values that minimizes the function. Each iteration uses calculus to take a step in the direction of the function's negative gradient. GD is important because optimization is a large part of machine learning, and GD itself is easy to implement, general-purpose, and fast.
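
The procedure the abstract describes can be made concrete in a few lines of Python. The following is a minimal sketch, not the chapter's own implementation: the target function f(x) = x**2, the starting point, the learning rate, and the iteration count are all illustrative assumptions.

def f(x):
    # Illustrative function to minimize (assumed, not from the chapter).
    return x ** 2

def grad_f(x):
    # Derivative of f; GD only needs the gradient, not f itself.
    return 2 * x

def gradient_descent(x0, learning_rate=0.1, n_iter=50):
    # Start at an initial parameter value and repeatedly step
    # in the negative direction of the gradient.
    x = x0
    for _ in range(n_iter):
        x = x - learning_rate * grad_f(x)
    return x

minimum = gradient_descent(x0=5.0)
print(f"approximate minimizer: {minimum:.6f}")  # converges toward 0

For this quadratic, each iteration multiplies x by (1 - 2 * learning_rate), so the estimate shrinks geometrically toward the true minimizer at x = 0.
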

Copyright information

© David Paper 2018

Authors and Affiliations

  • David Paper
    • 1
  1. Apt 3, Logan, USA