Computational Mathematics Seminar Series
Boosting Convergence Rates for Fixed-Point and Root-Finding Algorithms
Quoc Tran-Dinh, The University of North Carolina at Chapel Hill
Associate Professor, Department of Statistics and Operations Research
Digital Media Center 1034
April 16, 2024 - 03:30 pm
Abstract:

Approximating a fixed point of a nonexpansive operator or a root of a nonlinear equation is a fundamental problem in computational mathematics, with applications across many fields. Most classical methods for fixed-point and root-finding problems, such as the fixed-point (or gradient) iteration, Halpern's iteration, and extragradient methods, achieve a convergence rate of at most O(1/√k) on the norm of the residual, where k is the iteration counter. This convergence rate is often obtained with appropriately chosen constant stepsizes.
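As a point of reference (our notation, not the speaker's), the classical averaged fixed-point iteration for a nonexpansive operator T with a fixed point reads

    x_{k+1} = (1 - \alpha)\, x_k + \alpha\, T(x_k), \qquad \alpha \in (0, 1),

and its residual is known to satisfy \|x_k - T(x_k)\| = O(1/\sqrt{k}). The gradient reading takes T = I - \eta \nabla f for a smooth convex f and a suitably small stepsize \eta, so that fixed points of T coincide with minimizers of f.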

In this talk, we present some recent developments that boost the theoretical convergence rates of many root-finding algorithms up to O(1/k). We first discuss a connection between the Halpern fixed-point iteration in fixed-point theory and Nesterov's accelerated schemes in convex optimization for solving monotone equations involving a co-coercive operator (or, equivalently, fixed-point problems of a nonexpansive operator). We also study such a connection for other recent schemes, including the extra-anchored gradient method, to obtain new algorithms. We show how a faster convergence rate established for one scheme can be transferred to another and vice versa. Next, we discuss various variants of the proposed methods, including randomized block-coordinate algorithms for root-finding problems, which differ from existing randomized coordinate methods in optimization. Finally, we consider applications of these randomized coordinate schemes to monotone inclusions and finite-sum monotone inclusions; algorithms for the latter problem apply to many problems arising in federated learning.
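To make the anchoring idea concrete (again in our notation; the speaker's precise schemes may differ), Halpern's iteration fixes an anchor point x_0 and updates

    x_{k+1} = \beta_k\, x_0 + (1 - \beta_k)\, T(x_k), \qquad \text{e.g. } \beta_k = \tfrac{1}{k+2},

for a nonexpansive operator T. With this choice of \beta_k, the residual is known to satisfy \|x_k - T(x_k)\| = O(1/k), the accelerated rate discussed in the talk. Writing G := I - T, which is co-coercive when T is nonexpansive, the same update can be read as an accelerated, Nesterov-type scheme for solving the monotone equation G(x) = 0; this is the connection between Halpern's iteration and Nesterov's acceleration that the abstract refers to.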

Speaker's Bio:

Quoc Tran-Dinh is currently an associate professor in the Department of Statistics and Operations Research at The University of North Carolina at Chapel Hill. He obtained his bachelor's degree from Vietnam National University, Hanoi, and his Ph.D. from the Department of Electrical Engineering and the Optimization in Engineering Center at KU Leuven, Belgium. His research focuses on numerical methods for, and applications of, continuous optimization and related problems, including convex and nonconvex optimization, stochastic optimization, and minimax optimization. He currently serves as an associate editor of the Computational Optimization and Applications (COAP) and Mathematical Programming Computation (MPC) journals.