Computational Mathematics Seminar Series
Inexact Proximal Stochastic Gradient Method for Convex Composite Optimization
Xiao Wang, Chinese Academy of Sciences
Assistant Professor
Digital Media Center 1034
November 08, 2016 - 03:30 pm
Abstract:
We study an inexact proximal stochastic gradient (IPSG) method for convex composite optimization, in which the objective function is the sum of an average of a large number of smooth convex functions and a convex, but possibly nonsmooth, function. A variance reduction technique is incorporated in the method to reduce the variance of the stochastic gradients. The main feature of the IPSG algorithm is that it allows the proximal subproblems to be solved inexactly while still retaining global convergence with desirable complexity bounds. Different accuracy criteria are proposed for solving the subproblem, under which global convergence and component gradient complexity bounds are derived both when the objective function is strongly convex and when it is generally convex. Preliminary numerical experiments show the overall efficiency of the IPSG algorithm.
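In the notation commonly used for this problem class (the notation is assumed here, not taken from the announcement), the objective has the composite form

\[
\min_{x} \; F(x) \;=\; \frac{1}{n}\sum_{i=1}^{n} f_i(x) \;+\; h(x),
\]

where each f_i is smooth and convex and h is convex but possibly nonsmooth. A typical proximal stochastic gradient iteration then solves, for a step size \eta > 0 and a stochastic gradient estimate v_k,

\[
x_{k+1} \;\approx\; \arg\min_{x} \; \langle v_k, x\rangle \;+\; \frac{1}{2\eta}\,\|x - x_k\|^2 \;+\; h(x),
\]

a sketch of the subproblem under standard assumptions rather than the exact accuracy criteria of the talk; the IPSG method described in the abstract allows this subproblem to be solved only approximately.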
Speaker's Bio:
Dr. Xiao Wang received the B.S. degree in computational mathematics from Shandong University, China, and the Ph.D. degree from the Academy of Mathematics and Systems Science, Chinese Academy of Sciences, Beijing, China. She is currently an Assistant Professor with the University of Chinese Academy of Sciences, Beijing, China. Her current research interests include nonlinear optimization, stochastic programming, and large-scale optimization. Her research is mainly supported by the National Natural Science Foundation of China.
This lecture will be preceded by a reception at 03:00 pm.