Poster
OPTAMI: Global Superlinear Convergence of High-order Methods
Dmitry Kamzolov · Artem Agafonov · Dmitry Pasechnyuk · Alexander Gasnikov · Martin Takáč
Hall 3 + Hall 2B #563
Wed 23 Apr 7 p.m. — 9:30 p.m. PDT
Abstract:
Second-order methods for convex optimization outperform first-order methods in terms of theoretical iteration convergence, achieving rates up to O(k^{-5}) for highly smooth functions. However, their practical performance and applications are limited due to their multi-level structure and implementation complexity. In this paper, we present new results on high-order optimization methods, supported by their practical performance. First, we show that basic high-order methods, such as the Cubic Regularized Newton Method, exhibit global superlinear convergence for μ-strongly star-convex functions, a class that includes μ-strongly convex functions and some non-convex functions. Theoretical convergence results are both inspired and supported by the practical performance of these methods. Second, we propose a practical version of the Nesterov Accelerated Tensor method, called NATA, which significantly outperforms the classical variant and other high-order acceleration techniques in practice. The convergence of NATA is also supported by theoretical results. Finally, we introduce an open-source computational library for high-order methods, called OPTAMI. This library includes various methods, acceleration techniques, and subproblem solvers, all implemented as PyTorch optimizers, thereby facilitating the practical application of high-order methods to a wide range of optimization problems. We hope this library will simplify research and practical comparison of methods beyond first order.
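To make the kind of method discussed above concrete, the sketch below implements one step of a cubic-regularized Newton update for a small smooth convex problem in plain PyTorch. This is a minimal illustrative sketch only, not OPTAMI's actual API: the function name `cubic_newton_step`, the Lipschitz estimate `L_est`, and the toy logistic-regression objective are assumptions made for the example.

```python
import torch

def cubic_newton_step(x, closure, L_est):
    """One cubic-regularized Newton step (illustrative sketch, not OPTAMI's API).

    Approximately minimizes the cubic model
        f(x) + <g, h> + 0.5 * h^T H h + (L_est / 6) * ||h||^3
    over the step h via a fixed-point iteration on the regularized linear system,
    which is valid for convex objectives (H positive semidefinite).
    """
    x = x.detach().requires_grad_(True)
    f = closure(x)
    g = torch.autograd.grad(f, x)[0]
    H = torch.autograd.functional.hessian(closure, x.detach())
    I = torch.eye(x.numel(), dtype=x.dtype)
    h = torch.zeros_like(x)
    for _ in range(50):
        # Solve (H + (L_est/2) * ||h|| * I) h_new = -g, then update h.
        reg = 0.5 * L_est * h.norm()
        h_new = torch.linalg.solve(H + reg * I, -g.detach())
        if (h_new - h).norm() < 1e-10:
            h = h_new
            break
        h = h_new
    return x.detach() + h

# Example usage on a tiny logistic regression (data shapes assumed for illustration):
torch.manual_seed(0)
A = torch.randn(20, 5)
b = torch.randint(0, 2, (20,)).float()

def loss(w):
    return torch.nn.functional.binary_cross_entropy_with_logits(A @ w, b)

w = torch.zeros(5)
for _ in range(10):
    w = cubic_newton_step(w, loss, L_est=1.0)
```

In the library described by the abstract, such steps are exposed through the standard PyTorch optimizer interface, so swapping a first-order optimizer for a high-order one requires only changing the optimizer object; the snippet above is merely a standalone illustration of the underlying update.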