Computational Mathematics Seminar Series
A Revisit of Gradient Descent Method for Nonlinear Optimization
Hongchao Zhang, Louisiana State University
Associate Professor
Digital Media Center 1034
April 23, 2019 - 03:30 pm
Abstract:
In this talk, we will discuss some recent advances in gradient methods for nonlinear optimization, including steepest descent methods, Barzilai-Borwein type methods, optimal gradient methods, quasi-Newton methods, and conjugate gradient methods. Our focus will be the convergence properties of these methods as well as their practical performance.
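To give a flavor of one of the methods named in the abstract, here is a minimal illustrative sketch (our own, not the speaker's algorithm) of gradient descent with a Barzilai-Borwein (BB1) step size applied to a convex quadratic; all problem data below are arbitrary assumptions:

# A minimal sketch (not the speaker's algorithm): gradient descent with a
# Barzilai-Borwein (BB1) step size on a convex quadratic f(x) = 0.5*x'Ax - b'x.
# The matrix A, vector b, and tolerances are illustrative assumptions.
import numpy as np

def bb_gradient_descent(A, b, x0, max_iter=1000, tol=1e-8):
    x = x0.astype(float)
    g = A @ x - b                       # gradient of the quadratic
    alpha = 1.0 / np.linalg.norm(A, 2)  # safe initial step (1/L for this quadratic)
    for _ in range(max_iter):
        x_new = x - alpha * g
        g_new = A @ x_new - b
        if np.linalg.norm(g_new) < tol:
            return x_new
        s, y = x_new - x, g_new - g     # iterate and gradient differences
        alpha = (s @ s) / (s @ y)       # BB1 step: scalar secant (quasi-Newton) condition
        x, g = x_new, g_new
    return x

# Example: minimize the quadratic (i.e., solve Ax = b) for a random SPD matrix.
rng = np.random.default_rng(0)
M = rng.standard_normal((50, 50))
A = M @ M.T + 50 * np.eye(50)           # symmetric positive definite
b = rng.standard_normal(50)
x_star = bb_gradient_descent(A, b, np.zeros(50))
print(np.linalg.norm(A @ x_star - b))   # residual should be small

The BB1 step reuses only gradient information from the last two iterates, which is why such methods are often viewed as a bridge between steepest descent and quasi-Newton methods.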
Speaker's Bio:
Hongchao Zhang received his PhD in applied mathematics from the University of Florida in 2006. He then held postdoctoral positions at the Institute for Mathematics and Its Applications (IMA) and the IBM T.J. Watson Research Center. He joined LSU as an assistant professor in 2008 and is now an associate professor in the Department of Mathematics and the Center for Computation & Technology (CCT) at LSU. His research interests are nonlinear optimization theory, algorithms, and applications.
Refreshments will be served at 3:00 pm.