AI4OPT Seminar Series
Thursday, February 22, 2024, Noon – 1:00 pm
Location: Atrium in Coda on the 9th floor
Also live streamed at: https://gatech.zoom.us/j/99381428980
Speaker: Prof. Robert M. Freund
Title: Level-Set Geometry and the Performance of Restarted-PDHG for Conic LP
Abstract: We discuss our recent research aimed at solving truly huge-scale convex optimization problems, at the scale where matrix-factorization-free methods are attractive or necessary. The restarted primal-dual hybrid gradient method (rPDHG) -- with heuristic enhancements and GPU implementation -- has been very successful in solving huge-scale linear programming (LP) problems; however, its application to more general convex conic optimization problems is not so well studied. We analyze the theoretical and practical performance of rPDHG for general (convex) conic optimization, with LP as a special case. We show a relationship between the geometry of the primal-dual sublevel sets W_ε and the convergence rate of rPDHG. Specifically, we prove a bound on the convergence rate of rPDHG that improves when there is a primal-dual sublevel set W_ε for which (i) W_ε is close (in Hausdorff distance) to the optimal solution set, and (ii) the ratio of its diameter to its "conic radius" is small. In the special case of LP, the performance of rPDHG is bounded only by this ratio applied to the sublevel set corresponding to the second-best extreme point. In practice this ratio can take on extreme values, in which case rPDHG performs poorly both in theory and in practice. To address this issue, we show how central-path-based linear transformations -- including conic rescaling -- can markedly enhance the convergence rate of rPDHG. We also present extensive computational results demonstrating that such rescalings accelerate convergence to high-accuracy solutions and lead to more efficient methods for huge-scale linear and conic optimization problems. This is joint work with Zikai Xiong.
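For readers unfamiliar with rPDHG, the following is a minimal NumPy sketch of the basic scheme for a standard-form LP (min c^T x subject to Ax = b, x >= 0), intended only to convey the structure of the method. It is not the speaker's implementation: it uses fixed-frequency restarts from the average of the inner iterates, whereas the method discussed in the talk (and solvers such as PDLP) uses adaptive restarts and further heuristic enhancements; the function name rpdhg_lp and all parameters are illustrative.

import numpy as np

def rpdhg_lp(A, b, c, n_restarts=20, inner_iters=200):
    """Sketch of restarted PDHG for the LP  min c^T x  s.t.  Ax = b, x >= 0.

    Uses fixed-frequency restarts from the inner-iterate average; the
    actual method uses adaptive restarts based on a normalized duality gap.
    """
    m, n = A.shape
    # equal primal/dual step sizes chosen so that tau*sigma*||A||^2 = 0.81 < 1
    eta = 0.9 / np.linalg.norm(A, 2)
    x, y = np.zeros(n), np.zeros(m)
    for _ in range(n_restarts):
        x_sum, y_sum = np.zeros(n), np.zeros(m)
        for _ in range(inner_iters):
            # primal step: gradient step on c - A^T y, projected onto x >= 0
            x_new = np.maximum(0.0, x - eta * (c - A.T @ y))
            # dual step: ascent on b - Ax at the extrapolated point 2*x_new - x
            y = y + eta * (b - A @ (2.0 * x_new - x))
            x = x_new
            x_sum += x
            y_sum += y
        # restart: reset to the running average of the inner iterates
        x, y = x_sum / inner_iters, y_sum / inner_iters
    return x, y

# tiny illustrative instance:  min x1 + 2*x2  s.t.  x1 + x2 = 1,  x >= 0
A = np.array([[1.0, 1.0]])
b = np.array([1.0])
c = np.array([1.0, 2.0])
x, y = rpdhg_lp(A, b, c)
print(x)  # approaches the optimal vertex (1, 0)

On this toy instance the iterates approach the optimal vertex (1, 0). The restarts from iterate averages are what make the method's convergence rate depend on the level-set geometry described in the abstract.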
Biography: Robert Freund is the Theresa Seley Professor in Management Science at the Sloan School of Management at MIT, where he teaches data science courses for MBAs as well as optimization methods and theory courses for doctoral students. His research interests are in convex optimization, computational complexity and related computational science, convex geometry, large-scale nonlinear optimization, and data science and machine learning. He received his B.A. in Mathematics from Princeton University and his M.S. and Ph.D. in Operations Research from Stanford University. He received the Longuet-Higgins Prize for contributions to computer vision in 2007, as well as several teaching awards over the last thirty years. He is an INFORMS Fellow, a former Co-Editor and currently an Associate Editor of Mathematical Programming A, and has served in numerous leadership roles in operations research and optimization, including as Co-Director of the MIT Operations Research Center and of the MIT Program in Computation for Design and Optimization, and as Chair of the INFORMS Optimization Section.
Note: Boxed lunches will be served at the seminar; please stop by 15 minutes before the seminar to pick one up.
To continue receiving all AI4OPT seminar announcements, please sign up for the mailing list at: https://lists.isye.gatech.edu/mailman/listinfo/ai4opt-seminars
Videos of past seminars are available on the AI4OPT webpage at: https://www.ai4opt.org/seminars/past-seminars