

 Trudy Inst. Mat. i Mekh. UrO RAN, 2017, Volume 23, Number 3, Pages 33–42 (Mi timm1435)

Optimization methods for the sensitivity function with constraints

A. S. Antipin

Federal Research Center "Computer Science and Control" of Russian Academy of Sciences

Abstract: We consider a parametric family of convex programming problems. The parameter is the vector of right-hand sides in the functional constraints of the problem. Each value of the parameter taken from the nonnegative orthant corresponds to a regular (in the sense of Slater's condition) convex programming problem and to the minimum value of its objective function. This value depends on the constraint parameter and generates the sensitivity function. Along with this function, a convex set is given geometrically or functionally. The problem of minimizing the implicitly given sensitivity function over this set is posed. It can be interpreted as a convex programming problem in which, instead of a given vector of right-hand sides of the functional constraints, only a set to which this vector belongs is specified. As a result, we obtain a two-level problem. In contrast to classical two-level hierarchical problems, where the constraints are given implicitly, in our case it is the objective function that is given implicitly; moreover, there is no hierarchy in this problem. As a rule, sensitivity functions are discussed in the literature in a more general context as optimal value functions. The author is not aware of optimization statements of such problems as independent studies or, all the more, of solution methods for them. A new saddle-point approach to solving problems with sensitivity functions is proposed. Monotone convergence of the method is proved with respect to the variables of the space in which the problem is considered.
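To fix the notion used in the abstract, the sensitivity function can be illustrated on a toy one-dimensional convex program. This is a hedged sketch of the general idea only, not of the paper's saddle-point method: the objective $(x-2)^2$, the constraint $x \le b$, and the outer set $[0,1]$ are all invented here for illustration, chosen so that the inner minimum has a closed form.

```python
# Sensitivity function phi(b) of an illustrative convex program
# (not taken from the paper):
#   inner problem:  minimize f(x) = (x - 2)^2  subject to  x <= b,  b >= 0.
# The unconstrained minimizer is x* = 2, so
#   phi(b) = 0           if b >= 2  (constraint inactive),
#   phi(b) = (b - 2)^2   if 0 <= b < 2  (constraint active at x = b).

def phi(b: float) -> float:
    """Minimum value of (x - 2)**2 over {x <= b}: the sensitivity function."""
    if b < 0:
        raise ValueError("parameter must lie in the nonnegative orthant")
    x_star = min(b, 2.0)  # projection of the free minimizer x* = 2 onto {x <= b}
    return (x_star - 2.0) ** 2

# Outer problem of the abstract: minimize phi over a given convex set,
# here B = [0, 1], scanned on a grid.  phi is nonincreasing on [0, 2],
# so the outer minimum is attained at the right endpoint b = 1.
best_val, best_b = min((phi(k / 100), k / 100) for k in range(101))
```

Here the two-level structure of the abstract is visible: `phi` is defined only implicitly through an inner minimization, and the outer problem minimizes it over a set of admissible right-hand sides.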

Keywords: sensitivity function, parametric optimization, parametric Lagrangian, saddle point, extraproximal methods, convergence.

Funding: Russian Science Foundation, grant no. 17-11-01353.

DOI: https://doi.org/10.21538/0134-4889-2017-23-3-33-42

Full text: PDF file (191 kB)
References: PDF file   HTML file

English version:
Proceedings of the Steklov Institute of Mathematics (Supplementary issues), 2018, 303, suppl. 1, 36–44


UDC: 517.988.68
MSC: 90C25, 90C31, 90C46, 90C90, 49K40

Citation: A. S. Antipin, “Optimization methods for the sensitivity function with constraints”, Trudy Inst. Mat. i Mekh. UrO RAN, 23, no. 3, 2017, 33–42; Proc. Steklov Inst. Math. (Suppl.), 303, suppl. 1 (2018), 36–44

Citation in format AMSBIB
\Bibitem{Ant17}
\by A.~S.~Antipin
\paper Optimization methods for the sensitivity function with constraints
\serial Trudy Inst. Mat. i Mekh. UrO RAN
\yr 2017
\vol 23
\issue 3
\pages 33--42
\mathnet{http://mi.mathnet.ru/timm1435}
\crossref{https://doi.org/10.21538/0134-4889-2017-23-3-33-42}
\elib{http://elibrary.ru/item.asp?id=29937997}
\transl
\jour Proc. Steklov Inst. Math. (Suppl.)
\yr 2018
\vol 303
\issue , suppl. 1
\pages 36--44
\crossref{https://doi.org/10.1134/S0081543818090043}
\isi{http://gateway.isiknowledge.com/gateway/Gateway.cgi?GWVersion=2&SrcApp=PARTNER_APP&SrcAuth=LinksAMR&DestLinkType=FullRecord&DestApp=ALL_WOS&KeyUT=000453521100003}