Abstract:
We propose accelerated methods for solving optimization
problems with relatively smooth and relatively Lipschitz
continuous objective functions, given an inexact oracle. We consider the problem
of minimizing a convex, differentiable function that is relatively smooth
with respect to a reference convex function. The first proposed method is based
on a similar triangles method with inexact oracle, which exploits a special
triangular scaling property of the underlying Bregman divergence. The other
proposed methods are non-adaptive and adaptive (tuning to the relative
smoothness parameter) accelerated Bregman proximal gradient methods with
inexact oracle. These methods are universal in the sense that they
apply not only to relatively smooth but also to relatively Lipschitz
continuous optimization problems. We also introduce an adaptive
intermediate Bregman method, which interpolates between slower but more
robust non-accelerated algorithms and faster but less robust accelerated
algorithms. We conclude the paper with the results of numerical experiments
demonstrating the advantages of the proposed algorithms for the Poisson
inverse problem.
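The Poisson inverse problem is a standard testbed for relative smoothness: its log-likelihood objective is smooth relative to the Burg entropy $h(x) = -\sum_i \log x_i$ with constant $L = \|b\|_1$ (a classical fact, not specific to this paper). As a minimal illustrative sketch (a plain, non-accelerated Bregman proximal gradient iteration on synthetic data, not the accelerated or adaptive methods proposed here), the closed-form Bregman step under Burg entropy might look like:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 40, 20
A = rng.uniform(0.1, 1.0, size=(m, n))   # strictly positive sensing matrix
x_true = rng.uniform(0.5, 1.5, size=n)
b = A @ x_true                           # noiseless Poisson-model observations

def f(x):
    # Poisson log-likelihood objective: sum((Ax)_i - b_i * log((Ax)_i))
    Ax = A @ x
    return np.sum(Ax - b * np.log(Ax))

def grad_f(x):
    return A.T @ (1.0 - b / (A @ x))

# f is L-smooth relative to the Burg entropy h(x) = -sum(log x_i)
# with relative smoothness constant L = ||b||_1.
L = np.sum(b)

x = np.ones(n)
f0 = f(x)
for _ in range(500):
    # Bregman (mirror) step: grad h(x_new) = grad h(x) - (1/L) grad f(x).
    # Since grad h(x) = -1/x, this gives the closed form
    # 1/x_new = 1/x + grad_f(x)/L, which stays in the positive orthant.
    x = 1.0 / (1.0 / x + grad_f(x) / L)

print("initial objective:", f0, "final objective:", f(x))
```

Relative smoothness guarantees monotone decrease of the objective along these iterates even though the gradient of `f` is not Lipschitz continuous near the boundary of the positive orthant.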
Bibliography: 32 titles.
This work was supported by a grant from the Ministry
of Economic Development of the Russian Federation, in accordance with
the subsidy agreement (agreement identifier 000000C313925P4G0002) and
agreement no. 139-15-2025-011, dated June 20, 2025, with the Ivannikov Institute for System Programming
of the Russian Academy of Sciences.
Received: 05.09.2025
Published: 02.12.2025
Publication type: Article
UDC: 517.538
Publication language: English
How to cite:
O. S. Savchuk, M. S. Alkousa, A. I. Shushko, A. A. Vyguzov, F. S. Stonyakin, D. A. Pasechnyuk, A. V. Gasnikov, “Accelerated Bregman gradient methods for relatively smooth and relatively Lipschitz continuous minimization problems”, УМН, 80:6(486) (2025), 137–172