Computer Research and Modeling

Computer Research and Modeling, 2020, Volume 12, Issue 2, Pages 301–317
DOI: https://doi.org/10.20537/2076-7633-2020-12-2-301-317
(Mi crm786)
 

This article is cited in 7 scientific papers (total in 7 papers)

NUMERICAL METHODS AND THE BASIS FOR THEIR APPLICATION

Mirror descent for constrained optimization problems with large subgradient values of functional constraints

F. S. Stonyakinab, A. N. Stepanova, A. V. Gasnikovcdb, A. A. Titovb

a V. I. Vernadsky Crimean Federal University, 4 Academician Vernadsky av., Simferopol, 295007, Russia
b Moscow Institute of Physics and Technology, 9 Institutskiy per., Dolgoprudny, Moscow Region, 141701, Russia
c Institute for Information Transmission Problems, 19/1 Bolshoi Karetny per., Moscow, 127051, Russia
d Caucasus Mathematical Center, Adyghe State University, 208 Pervomayskaya st., Maykop, 385000, Russia
Abstract: The paper is devoted to the minimization of a non-smooth functional $f$ under a non-positive non-smooth Lipschitz-continuous functional constraint $g$; the problem statement is considered separately for quasi-convex and strongly (quasi-)convex functionals. We propose new step-size strategies and adaptive stopping rules for Mirror Descent on this class of problems, and we show that the methods are applicable to objective functionals of various levels of smoothness. By applying a special restart technique to the considered version of Mirror Descent, we obtain an optimal method for optimization problems with strongly convex objective functionals. Convergence rate estimates for the considered methods are obtained depending on the level of smoothness of the objective functional; these estimates show that the methods are optimal from the point of view of the theory of lower oracle bounds. In particular, the optimality of our approach for Hölder-continuous quasi-convex (sub)differentiable objective functionals is proved. In addition, the case of a quasi-convex objective functional and functional constraint is considered. The paper presents numerical experiments demonstrating the advantages of the considered methods.
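The scheme described in the abstract can be illustrated by a minimal sketch in the Euclidean setup (prox function $\frac{1}{2}\|x\|^2$, so the mirror step reduces to a subgradient step): when the constraint is nearly satisfied the method takes a "productive" step along a subgradient of $f$, otherwise a "non-productive" step along a subgradient of $g$, with a step-size that adapts to the current subgradient norm. The function names and the toy problem below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def adaptive_mirror_descent(subgrad_f, g, subgrad_g, x0, eps, n_iter):
    """Sketch of adaptive mirror descent for min f(x) s.t. g(x) <= 0.
    Productive step (subgradient of f) when g(x) <= eps, non-productive
    step (subgradient of g) otherwise.  The step-size eps/||d||^2 adapts
    to the subgradient norm, which is the point of the approach for
    constraints with large subgradient values."""
    x = np.asarray(x0, dtype=float)
    productive = []
    for _ in range(n_iter):
        if g(x) <= eps:                  # productive step
            d = subgrad_f(x)
            productive.append(x.copy())
        else:                            # non-productive step
            d = subgrad_g(x)
        h = eps / np.dot(d, d)           # adaptive step-size
        x = x - h * d
    # output: average of the productive points
    return np.mean(productive, axis=0)

# Toy problem (illustrative): f(x) = ||x - (2,2)||_2 with the constraint
# g(x) = x1 + x2 - 2 <= 0; the constrained minimum is (1,1), f* = sqrt(2).
a = np.array([2.0, 2.0])
f = lambda x: np.linalg.norm(x - a)
subgrad_f = lambda x: (x - a) / max(np.linalg.norm(x - a), 1e-12)
g = lambda x: x[0] + x[1] - 2.0
subgrad_g = lambda x: np.array([1.0, 1.0])

x_bar = adaptive_mirror_descent(subgrad_f, g, subgrad_g,
                                x0=np.zeros(2), eps=0.05, n_iter=2000)
```

On this example the averaged productive point lands near $(1,1)$, with both the objective gap and the constraint violation on the order of `eps`.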
Keywords: non-smooth constrained optimization, quasi-convex functional, adaptive mirror descent, level of smoothness, Hölder-continuous objective functional, optimal method.
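The restart technique mentioned in the abstract can also be sketched: for a $\mu$-strongly convex objective, one runs a mirror descent stage to a target accuracy $\varepsilon_t$, halves the squared distance estimate $R^2$ (which strong convexity justifies), and restarts from the averaged point. The toy objective, the constants, and the stage sizing below are illustrative assumptions, not the paper's algorithm verbatim.

```python
import math
import numpy as np

def md_stage(subgrad, x0, h, n):
    """One stage: n fixed-step subgradient (Euclidean mirror descent)
    steps, returning the average iterate."""
    x = np.asarray(x0, dtype=float)
    iterates = []
    for _ in range(n):
        iterates.append(x.copy())
        x = x - h * subgrad(x)
    return np.mean(iterates, axis=0)

def restarted_md(subgrad, x0, R0, M, mu, n_restarts):
    """Restart scheme for a mu-strongly convex objective with subgradient
    norms bounded by M and ||x0 - x*|| <= R0."""
    x, R2 = np.asarray(x0, dtype=float), R0 ** 2
    for _ in range(n_restarts):
        eps_t = mu * R2 / 4                        # stage target accuracy
        n_t = math.ceil(M * M * R2 / eps_t ** 2)   # so M*R/sqrt(n) <= eps_t
        h_t = math.sqrt(R2) / (M * math.sqrt(n_t))
        x = md_stage(subgrad, x, h_t, n_t)
        R2 /= 2   # strong convexity: ||x - x*||^2 <= 2*eps_t/mu = R2/2
    return x

# Toy 1-strongly convex non-smooth objective (illustrative):
# f(x) = 0.5*||x||^2 + |x1 - 1|, minimized at x* = (1, 0) with f* = 0.5.
f = lambda x: 0.5 * np.dot(x, x) + abs(x[0] - 1.0)
subgrad = lambda x: x + np.array([np.sign(x[0] - 1.0), 0.0])

x_out = restarted_md(subgrad, x0=np.zeros(2), R0=2.0, M=3.0, mu=1.0,
                     n_restarts=8)
```

Each restart halves $R^2$, so the accuracy improves geometrically in the number of restarts; this is the mechanism behind the optimal rate for strongly convex objectives.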
Funding agency Grant number
Russian Science Foundation 18-71-00048
Russian Foundation for Basic Research 18-31-00219
Ministry of Education and Science of the Russian Federation MK-15.2020.1
Ministry of Science and Higher Education of the Russian Federation 075-00337-20-03
The research of F. Stonyakin in Algorithms 2 and 4, Remarks 2 and 3, Theorems 2 and 5 was supported by the Russian Science Foundation (project 18-71-00048). The research of F. Stonyakin and A. Stepanov in numerical experiments (Examples 1 and 2) was supported by the Russian Science Foundation (project 18-71-00048). The research in Theorems 3 and 4, Algorithm 3 and in numerical experiments (Examples 3 and 4) was supported by the Russian Foundation for Basic Research (project 18-31-00219 mol-a). The research of F. Stonyakin in Remark 4 was supported by the grant of the President of the Russian Federation for young candidates of sciences (project MK-15.2020.1). The research of A. V. Gasnikov (Algorithm 3 and Lemma 3) was supported by the Ministry of Science and Higher Education of the Russian Federation (Goszadaniye in MIPT, project 075-00337-20-03).
Received: 04.11.2019
Revised: 08.12.2019
Accepted: 24.12.2019
Document Type: Article
UDC: 519.85
Language: English
Citation: F. S. Stonyakin, A. N. Stepanov, A. V. Gasnikov, A. A. Titov, “Mirror descent for constrained optimization problems with large subgradient values of functional constraints”, Computer Research and Modeling, 12:2 (2020), 301–317
Citation in format AMSBIB
\Bibitem{StoSteGas20}
\by F.~S.~Stonyakin, A.~N.~Stepanov, A.~V.~Gasnikov, A.~A.~Titov
\paper Mirror descent for constrained optimization problems with large subgradient values of functional constraints
\jour Computer Research and Modeling
\yr 2020
\vol 12
\issue 2
\pages 301--317
\mathnet{http://mi.mathnet.ru/crm786}
\crossref{https://doi.org/10.20537/2076-7633-2020-12-2-301-317}
Linking options:
  • https://www.mathnet.ru/eng/crm786
  • https://www.mathnet.ru/eng/crm/v12/i2/p301