Trudy Inst. Mat. i Mekh. UrO RAN, 2017, Volume 23, Number 3, Pages 33–42 (Mi timm1435)  

Optimization methods for the sensitivity function with constraints

A. S. Antipin

Federal Research Center "Computer Science and Control" of Russian Academy of Sciences

Abstract: We consider a parametric family of convex programming problems. The parameter is the vector of the right-hand sides in the functional constraints of the problem. Each vector value of the parameter taken from the nonnegative orthant corresponds to a regular (Slater's condition) convex programming problem and the minimum value of its objective function. This value depends on the constraint parameter and generates the sensitivity function. Along with this function, a convex set is given geometrically or functionally. The problem of minimizing the implicit sensitivity function on this set is posed. It can be interpreted as a convex programming problem in which, instead of a given vector of the right-hand sides of the functional constraints, only a set to which this vector belongs is specified. As a result, we obtain a two-level problem. In contrast to the classical two-level hierarchical problems with implicitly given constraints, in our case it is the objective function that is given implicitly. There is no hierarchy in this problem. As a rule, sensitivity functions are discussed in the literature in the more general context of optimal value functions. The author is not aware of optimization statements of these problems as independent studies, much less of solution methods for them. A new saddle-point approach to the solution of problems with sensitivity functions is proposed. The monotone convergence of the method is proved with respect to the variables of the space in which the problem is considered.

Keywords: sensitivity function, parametric optimization, parametric Lagrangian, saddle point, extraproximal methods, convergence.
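The two-level structure described in the abstract can be illustrated with a toy numerical sketch (this is only an illustration of the problem statement, not the author's saddle-point/extraproximal method; the objective f, constraint g, and set B below are hypothetical choices). The inner convex program min f(x) s.t. g(x) ≤ b defines the sensitivity function φ(b); the outer problem minimizes φ over a given convex set B.

```python
import numpy as np
from scipy.optimize import minimize

def f(x):
    # Convex inner objective (hypothetical example): f(x) = (x - 2)^2
    return (x[0] - 2.0) ** 2

def phi(b):
    """Sensitivity function: optimal value of the inner problem
    min f(x) subject to g(x) = x <= b."""
    cons = {"type": "ineq", "fun": lambda x: b - x[0]}  # b - g(x) >= 0
    res = minimize(f, x0=[0.0], constraints=cons)
    return res.fun

# Outer problem: minimize the implicit function phi over B = [0, 1].
# A plain grid search is used here purely for clarity.
B = np.linspace(0.0, 1.0, 101)
vals = [phi(b) for b in B]
b_star = B[int(np.argmin(vals))]
print(b_star, min(vals))  # phi(b) = (b - 2)^2 on [0, 1], so b* = 1, phi(b*) = 1
```

Here φ is convex and decreasing on B, so the minimizer sits at the boundary point b = 1; in the paper the outer minimization is handled without ever forming φ explicitly.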

Funding: Russian Science Foundation, grant no. 17-11-01353


Full text: PDF file (191 kB)
References: PDF file   HTML file

English version:
Proceedings of the Steklov Institute of Mathematics (Supplementary issues), 2018, 303, suppl. 1, 36–44

UDC: 517.988.68
MSC: 90C25, 90C31, 90C46, 90C90, 49K40
Received: 04.06.2016

Citation: A. S. Antipin, “Optimization methods for the sensitivity function with constraints”, Trudy Inst. Mat. i Mekh. UrO RAN, 23, no. 3, 2017, 33–42; Proc. Steklov Inst. Math. (Suppl.), 303, suppl. 1 (2018), 36–44

Citation in format AMSBIB
\by A.~S.~Antipin
\paper Optimization methods for the sensitivity function with constraints
\serial Trudy Inst. Mat. i Mekh. UrO RAN
\yr 2017
\vol 23
\issue 3
\pages 33--42
\jour Proc. Steklov Inst. Math. (Suppl.)
\yr 2018
\vol 303
\issue suppl. 1
\pages 36--44
