Computational Mathematics Seminar Series
On the Asymptotic Convergence and Acceleration of Gradient Methods
Yakui Huang, Hebei University of Technology, China
Associate Professor
Digital Media Center 1034
November 19, 2019 - 03:30 pm
Abstract:

We consider the asymptotic behavior of a family of gradient methods, which includes the steepest descent and minimal gradient methods as special instances. It is proved that each method in the family asymptotically zigzags between two directions. Asymptotic convergence results for the objective value, the gradient norm, and the stepsize are presented as well. To accelerate the family of gradient methods, we further exploit spectral properties of stepsizes to break the zigzagging pattern. In particular, a new stepsize is derived by imposing finite termination on minimizing a two-dimensional strictly convex quadratic function. It is shown that, for a general quadratic function, the proposed stepsize asymptotically converges to the reciprocal of the largest eigenvalue of the Hessian. Furthermore, based on this spectral property, we propose a periodic gradient method by incorporating the Barzilai-Borwein method. Numerical comparisons with some recent successful gradient methods show that our new method is very promising.
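To make the classical ingredients of the abstract concrete, the following is a minimal Python sketch (not the speaker's proposed method) of a gradient method applied to a strictly convex quadratic, using the exact-line-search (steepest descent) stepsize and the Barzilai-Borwein (BB1) stepsize. The test matrix, problem size, and tolerances are arbitrary illustrative choices.

```python
import numpy as np

# Illustrative sketch: gradient methods on f(x) = 0.5 x^T A x - b^T x.
# The stepsize rules below are the standard steepest descent and BB1 formulas;
# the acceleration scheme discussed in the talk is not reproduced here.

def gradient_method(A, b, x0, stepsize, tol=1e-8, max_iter=5000):
    """Iterate x_{k+1} = x_k - alpha_k * g_k with a user-supplied stepsize rule."""
    x = x0.copy()
    g = A @ x - b                      # gradient of the quadratic
    x_prev, g_prev = None, None
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = stepsize(A, g, x, x_prev, g_prev)
        x_prev, g_prev = x, g
        x = x - alpha * g
        g = A @ x - b
    return x, k

def sd_stepsize(A, g, x, x_prev, g_prev):
    """Exact line search (steepest descent): alpha = g^T g / g^T A g."""
    return (g @ g) / (g @ (A @ g))

def bb1_stepsize(A, g, x, x_prev, g_prev):
    """Barzilai-Borwein (BB1): alpha = s^T s / s^T y, s = x_k - x_{k-1}, y = g_k - g_{k-1}."""
    if x_prev is None:                 # no history yet: fall back to exact line search
        return sd_stepsize(A, g, x, None, None)
    s, y = x - x_prev, g - g_prev
    return (s @ s) / (s @ y)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    Q, _ = np.linalg.qr(rng.standard_normal((50, 50)))
    A = Q @ np.diag(np.linspace(1.0, 100.0, 50)) @ Q.T   # condition number 100
    b = rng.standard_normal(50)
    x0 = np.zeros(50)
    _, k_sd = gradient_method(A, b, x0, sd_stepsize)
    _, k_bb = gradient_method(A, b, x0, bb1_stepsize)
    print(f"steepest descent iterations: {k_sd}, BB iterations: {k_bb}")
```

On ill-conditioned quadratics, the steepest descent run exhibits the zigzagging behavior analyzed in the talk, while the BB stepsize typically needs far fewer iterations.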

 

Speaker's Bio:

Yakui Huang received his Ph.D. in applied mathematics from Xidian University, Xi'an, China, in 2015. After two years of postdoctoral training at the Academy of Mathematics and Systems Science, Chinese Academy of Sciences, he joined Hebei University of Technology (HEBUT), Tianjin, China, in 2017, where he is currently an associate professor with the Institute of Mathematics. His research interests include nonlinear optimization theory, numerical algorithms, and applications.

 

Refreshments will be served at 3:00 pm.