Machine Learning and Optimization Methods 1 - AIRS in the AIR
Title: Deterministically Constrained Stochastic Optimization
I will present recent work by my research group on the design, analysis, and implementation of algorithms for solving continuous nonlinear optimization problems that involve a stochastic objective function and deterministic constraints. The talk will focus on our sequential quadratic optimization (commonly known as SQP) methods for cases in which the constraints are defined by nonlinear systems of equations and inequalities. Our methods are applicable to various types of problems, such as training machine learning (e.g., deep learning) models with constraints. Our work focuses on the "fully stochastic" regime, in which only stochastic gradient estimates are employed; for this regime we have derived convergence-in-expectation results and worst-case iteration complexity bounds that are on par with those of stochastic gradient methods in the unconstrained setting. I will also discuss various extensions that my group is exploring.
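To give a flavor of the problem class, here is a minimal sketch of a stochastic SQP iteration on a toy equality-constrained problem. This is an illustration only, not the speaker's actual algorithm: the objective, the constraint, the identity Hessian in the quadratic subproblem, and the simple diminishing step size (in place of the adaptive step-size rules in the actual methods) are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem (hypothetical): minimize f(x) = E[0.5*||x - xi||^2],
# xi ~ N(mu, sigma^2 I), subject to c(x) = x1^2 + x2^2 - 2 = 0.
# The minimizer lies on the circle at x* = (sqrt(2), 0).
mu = np.array([2.0, 0.0])
sigma = 0.5

def stoch_grad(x):
    """Stochastic gradient estimate of the objective (one sample of xi)."""
    xi = mu + sigma * rng.standard_normal(2)
    return x - xi

def constraint(x):
    """Deterministic equality constraint, c(x) = 0."""
    return np.array([x[0] ** 2 + x[1] ** 2 - 2.0])

def jacobian(x):
    """Jacobian of the constraint."""
    return np.array([[2.0 * x[0], 2.0 * x[1]]])

n, m = 2, 1
x = np.array([1.0, 1.0])          # feasible starting point
for k in range(20000):
    g, c, J = stoch_grad(x), constraint(x), jacobian(x)
    # SQP subproblem with H = I: min g'd + 0.5*||d||^2  s.t.  c + J d = 0,
    # solved via its KKT system  [H J'; J 0][d; y] = [-g; -c].
    K = np.block([[np.eye(n), J.T], [J, np.zeros((m, m))]])
    d = np.linalg.solve(K, np.concatenate([-g, -c]))[:n]
    alpha = 1.0 / (k + 10)        # diminishing step, as in stochastic gradient methods
    x = x + alpha * d
```

After the loop, the iterate is near the constrained minimizer `(sqrt(2), 0)` despite only stochastic gradient estimates being used; each step simultaneously reduces the objective along the constraint's tangent space and restores (linearized) feasibility.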
Frank E. Curtis is a Professor in the Department of Industrial and Systems Engineering at Lehigh University. Prior to joining Lehigh, he received his bachelor's degree from the College of William and Mary, received his master's and doctoral degrees from the Department of Industrial Engineering and Management Science at Northwestern University, and worked as a Postdoctoral Researcher in the Courant Institute of Mathematical Sciences at New York University. His research focuses on the design, analysis, and implementation of numerical methods for solving large-scale nonlinear optimization problems. He received an Early Career Award from the Advanced Scientific Computing Research program of the U.S. Department of Energy, and has had other funded projects with the U.S. National Science Foundation, Office of Naval Research, and Advanced Research Projects Agency - Energy. He received, along with Leon Bottou (Facebook AI Research) and Jorge Nocedal (Northwestern), the 2021 SIAM/MOS Lagrange Prize in Continuous Optimization. He was awarded, with James V. Burke (U. of Washington), Adrian Lewis (Cornell), and Michael Overton (NYU), the 2018 INFORMS Computing Society Prize. He and team members Daniel Molzahn (Georgia Tech), Andreas Waechter (Northwestern), Ermin Wei (Northwestern), and Elizabeth Wong (UC San Diego) were awarded second place in the ARPA-E Grid Optimization Competition in 2020. He currently serves as an Associate Editor for Mathematical Programming, SIAM Journal on Optimization, Mathematics of Operations Research, IMA Journal of Numerical Analysis, and Mathematical Programming Computation. He previously served as the Vice Chair for Nonlinear Programming for the INFORMS Optimization Society and is currently very active in professional societies and groups related to mathematical optimization, including INFORMS, the Mathematical Optimization Society, and the SIAM Activity Group on Optimization.
Title: Constraint Dissolving Approaches for a Class of Riemannian Optimization Problems
We propose constraint dissolving approaches for optimization problems over a class of Riemannian manifolds. In these approaches, solving a Riemannian optimization problem is transformed into the unconstrained minimization of a constraint dissolving function, named CDF. Unlike existing exact penalty functions, the exact gradient and Hessian of CDF are easy to compute. We study the theoretical properties of CDF and prove that the original problem and CDF have the same first-order and second-order stationary points, local minimizers, and Łojasiewicz exponents in a neighborhood of the feasible region. Remarkably, the convergence properties of our proposed constraint dissolving approaches can be directly inherited from the rich existing results in unconstrained optimization. The proposed approaches thus build shortcuts from unconstrained optimization to Riemannian optimization. Several illustrative examples further demonstrate the potential of the proposed approaches.
Dr. Xin Liu is a "Feng Kang" Distinguished Professor at the Academy of Mathematics and Systems Science, Chinese Academy of Sciences. He received his bachelor's degree from the School of Mathematical Sciences, Peking University, in 2004, and his PhD from the University of Chinese Academy of Sciences in 2009. His research interests include optimization problems over the Stiefel manifold, linear and nonlinear eigenvalue problems, nonlinear least squares, and distributed optimization. Dr. Liu was granted the Excellent Young Scientists Fund from NSFC in 2016, the Science and Technology Award for Youth from ORSC in 2016, the Fifth CSIAM Young Scholar Prize in 2020, and the National Science Fund of China for Distinguished Young Scholars in 2021. He serves as an associate editor of Mathematical Programming Computation, Asia-Pacific Journal of Operational Research, Operations Research Transactions, Journal of Computational Mathematics, and Journal of Industrial and Management Optimization.