
The exact worst-case convergence rate of the gradient method with fixed step lengths for L-smooth functions

Abbaszadehpeivasti, Hadi
de Klerk, Etienne
Zamani, Moslem
Abstract
In this paper, we study the convergence rate of the gradient (or steepest descent) method with fixed step lengths for finding a stationary point of an L-smooth function. We establish a new convergence rate, and show that the bound may be exact in some cases, in particular when all step lengths lie in the interval (0,1/L]. In addition, we derive an optimal step length with respect to the new bound.
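The method studied in the abstract can be illustrated with a minimal sketch: gradient descent with a fixed step length applied to an L-smooth function, tracking the smallest gradient norm seen as the stationarity measure. The quadratic test function, the step choice 1/L, and the iteration count here are illustrative assumptions, not the paper's worked example or its exact worst-case bound.

```python
import numpy as np

def gradient_method(grad, x0, step, n_iters):
    """Gradient descent with a fixed step length.

    Returns the final iterate and the smallest gradient norm observed,
    which is the usual measure of approximate stationarity."""
    x = np.asarray(x0, dtype=float)
    best_grad_norm = np.inf
    for _ in range(n_iters):
        g = grad(x)
        best_grad_norm = min(best_grad_norm, np.linalg.norm(g))
        x = x - step * g
    return x, best_grad_norm

# Illustrative example: f(x) = 0.5 * x^T A x is L-smooth with
# L equal to the largest eigenvalue of A (here L = 4).
A = np.diag([1.0, 4.0])
grad_f = lambda x: A @ x
L = 4.0

# Step length 1/L lies in the interval (0, 1/L] mentioned in the abstract.
x_final, best = gradient_method(grad_f, x0=[1.0, 1.0], step=1.0 / L, n_iters=50)
```

For this convex quadratic the iterates converge to the unique stationary point at the origin; the paper's contribution is the exact worst-case rate over the whole class of (possibly nonconvex) L-smooth functions, which such a single run only illustrates.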
Date
2022-07
Keywords
L-smooth optimization, gradient method, performance estimation problem, semidefinite programming
Citation
Abbaszadehpeivasti, H, de Klerk, E & Zamani, M 2022, 'The exact worst-case convergence rate of the gradient method with fixed step lengths for L-smooth functions', Optimization Letters, vol. 16, no. 6, pp. 1649–1661. https://doi.org/10.1007/s11590-021-01821-1