Question #687   Submitted by Answiki on 10/17/2021 at 03:23:27 PM UTC

What is the gradient descent update equation?

Answer   Submitted by Answiki on 10/17/2021 at 03:29:00 PM UTC

In the gradient descent algorithm, the update equation is given by:

x_{n+1} = x_n - γ ∇f(x_n)

Where:

  • x_{n+1} is the next point in ℝ^d
  • x_n is the current point in ℝ^d
  • γ is the step size multiplier (also called the learning rate)
  • ∇f(x_n) is the gradient of the function to minimize, f, evaluated at x_n

γ is a parameter to tune. It sets the trade-off between speed of convergence and stability: high values of γ speed up the algorithm, but can also make the convergence process unstable.
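As a minimal sketch, the update rule above can be implemented as follows. The objective f(x) = x², its gradient 2x, the starting point x0 = 5.0, and the step size γ = 0.1 are illustrative choices, not values from the answer:

```python
# Minimal gradient descent sketch of the update rule x_{n+1} = x_n - gamma * grad f(x_n).
# The objective, step size, and iteration count below are illustrative assumptions.

def gradient_descent(grad, x0, gamma=0.1, n_steps=100):
    """Iterate the update rule starting from x0 and return the final point."""
    x = x0
    for _ in range(n_steps):
        x = x - gamma * grad(x)  # move against the gradient
    return x

# f(x) = x**2 has gradient f'(x) = 2*x and its minimum at x = 0.
minimum = gradient_descent(lambda x: 2 * x, x0=5.0)
print(minimum)  # converges close to 0
```

For this quadratic, each step multiplies x by (1 - 2γ), which illustrates the stability trade-off mentioned above: with γ = 0.1 the iterates shrink toward 0, while γ ≥ 1 makes |1 - 2γ| ≥ 1 and the iterates oscillate or diverge.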
