Machine Learning and Data Science PhD Student Seminar Series (Session 35): Explicit Superlinear Convergence Rates of Quasi-Newton Methods

Abstract: Quasi-Newton methods are regarded as among the most efficient numerical schemes for smooth unconstrained optimization because of their attractive superlinear convergence. However, the rates established in earlier work were only asymptotic, until Rodomanov and Nesterov (2021a, b, c) gave explicit superlinear rates for greedy and standard quasi-Newton methods. In this talk, we will introduce two of our follow-up works building on this line of research. The first improves the theory of greedy quasi-Newton methods by establishing condition-number-free superlinear convergence rates. The second gives explicit superlinear rates for a modified version of the standard SR1 method.
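As background for the talk, the following is a minimal sketch of the classical SR1 quasi-Newton update (not the modified scheme discussed in the talk) applied to a strongly convex quadratic, where the Hessian approximation is expected to converge to the true Hessian; the function name, step rule, and safeguard threshold here are illustrative choices, not taken from the papers.

```python
import numpy as np

def sr1_minimize(A, b, x0, iters=50, eps=1e-8):
    """Minimize f(x) = 0.5 x^T A x - b^T x with SR1 updates (illustrative)."""
    n = len(x0)
    x = x0.astype(float)
    B = np.eye(n)                      # initial Hessian approximation
    for _ in range(iters):
        g = A @ x - b                  # gradient of the quadratic
        s = -np.linalg.solve(B, g)     # quasi-Newton step: B s = -g
        x_new = x + s
        y = A @ x_new - b - g          # gradient difference (equals A s here)
        r = y - B @ s                  # SR1 residual vector
        denom = r @ s
        # Standard SR1 safeguard: skip the rank-one update when the
        # denominator is too small to be numerically reliable.
        if abs(denom) > eps * np.linalg.norm(r) * np.linalg.norm(s):
            B = B + np.outer(r, r) / denom
        x = x_new
    return x, B

rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M @ M.T + 5 * np.eye(5)            # symmetric positive definite Hessian
b = rng.standard_normal(5)
x, B = sr1_minimize(A, b, np.zeros(5))
print(np.linalg.norm(A @ x - b))       # gradient norm at the final iterate
```

On a quadratic, SR1 has the hereditary property that, when the updates are not skipped and the steps are linearly independent, the approximation B matches the true Hessian after at most n updates, which drives the fast local convergence the talk quantifies.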