Question #687

# What is the gradient descent update equation?


In the gradient descent algorithm, the update equation is given by:

$$x_{n+1} = x_n - \gamma \, \nabla F(x_n)$$

Where:

• $x_{n+1}$ is the next point in $\mathbb{R}^d$
• $x_n$ is the current point in $\mathbb{R}^d$
• $\gamma$ is the step size multiplier
• $\nabla F$ is the gradient of the function to minimize, $F$

$\gamma$ is a parameter to tune. It sets the trade-off between speed of convergence and stability: high values of $\gamma$ will speed up the algorithm, but can also make the convergence process unstable.
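As a minimal sketch of the update rule above (function and parameter names here are illustrative, not from the original question), one iteration simply moves the current point against the gradient, scaled by $\gamma$:

```python
import numpy as np

def gradient_descent(grad_f, x0, gamma=0.1, steps=100):
    """Iterate x_{n+1} = x_n - gamma * grad_f(x_n) for a fixed number of steps."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x - gamma * grad_f(x)  # step against the gradient, scaled by gamma
    return x

# Example: minimize F(x) = x^2, whose gradient is 2x; the minimum is at x = 0.
x_min = gradient_descent(lambda x: 2 * x, x0=[5.0], gamma=0.1, steps=100)
```

With $\gamma = 0.1$ each step multiplies the iterate by $0.8$, so the example converges quickly; a much larger $\gamma$ (e.g. $\gamma > 1$ here) would make the iterates diverge, illustrating the stability trade-off.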

