Solving convex min-min problems with smoothness and strong convexity in one group of variables and low dimension in the other
E. L. Gladin (a,b), M. Alkousa (a,b), A. V. Gasnikov (a,b)
(a) Moscow Institute of Physics and Technology, Dolgoprudnyi, Moscow oblast, 141701 Russia
(b) Kharkevich Institute for Information Transmission Problems, Russian Academy of Sciences, Moscow, 127051 Russia
Abstract:
The article deals with approaches to solving convex problems of the min-min type that are smooth and strongly convex in only one of the two groups of variables. It is shown that the proposed approaches, based on Vaidya's method, the fast gradient method, and the accelerated gradient method with variance reduction, have linear convergence. Vaidya's cutting-plane method is used to solve the outer (low-dimensional) problem, and the fast gradient method is used to solve the inner (smooth and strongly convex) one. Because of its importance for machine learning applications, the case where the objective function is a sum of a large number of functions is considered separately; there, the accelerated gradient method with variance reduction replaces the fast gradient method. Numerical experiments illustrate the advantages of the proposed procedures on a logistic regression problem in which a prior distribution for one of the two groups of variables is available.
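The scheme described above can be sketched on a toy instance. Everything below is illustrative and not from the paper: the objective f(x, y) = (x - y)^2 + 0.5(y - 3)^2 is made up, the outer variable y is taken one-dimensional so that a cutting-plane method degenerates to bisection, and plain gradient descent stands in for the fast gradient method on the inner problem. The key structural idea, solving the outer problem over g(y) = min_x f(x, y) using the gradient g'(y) = ∂f/∂y evaluated at the inner minimizer (Danskin's theorem), is the same.

```python
def inner_argmin(y, x0=0.0, steps=200, lr=0.25):
    """Approximately solve the inner problem min_x f(x, y)
    for the toy objective f(x, y) = (x - y)^2 + 0.5*(y - 3)^2
    by gradient descent (a stand-in for the fast gradient method)."""
    x = x0
    for _ in range(steps):
        x -= lr * 2.0 * (x - y)  # d f / d x = 2 (x - y)
    return x

def outer_grad(y):
    """Danskin's theorem: g'(y) = (d f / d y) evaluated at x*(y)."""
    x_star = inner_argmin(y)
    return -2.0 * (x_star - y) + (y - 3.0)

def bisection(lo, hi, iters=60):
    """One-dimensional analogue of a cutting-plane (Vaidya-type) method:
    each sign evaluation of g'(y) halves the localizer [lo, hi],
    giving linear convergence in the outer iteration count."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if outer_grad(mid) > 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

y_star = bisection(-10.0, 10.0)
print(round(y_star, 6))  # exact outer minimizer here is y = 3
```

Note how inexactness in the inner solver only perturbs the gradient of g; this is why the overall scheme retains linear convergence when the inner problem is solved to sufficient accuracy, as the paper establishes for the actual methods.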
Keywords:
convex optimization, cutting plane method, Vaidya’s method, variance reduction, fast gradient method, logistic regression.
Citation:
E. L. Gladin, M. Alkousa, A. V. Gasnikov, “Solving convex min-min problems with smoothness and strong convexity in one group of variables and low dimension in the other”, Avtomat. i Telemekh., 2021, no. 10, 60–75; Autom. Remote Control, 82:10 (2021), 1679–1691
Linking options:
https://www.mathnet.ru/eng/at15800
https://www.mathnet.ru/eng/at/y2021/i10/p60