Stochastic intermediate gradient method for convex optimization problems



Abstract

New first-order methods are introduced for solving convex optimization problems from a fairly broad class. For composite optimization problems with an inexact stochastic oracle, a stochastic intermediate gradient method is proposed that allows using an arbitrary norm in the space of variables and a prox-function. The mean rate of convergence of this method and the probability of large deviations from this rate are estimated. For problems with a strongly convex objective function, a modification of this method is proposed and its rate of convergence is estimated. The resulting estimates coincide, up to a multiplicative constant, with lower complexity bounds for the class of composite optimization problems with an inexact stochastic oracle and for all usually considered subclasses of this class.
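For orientation, the following sketch records the standard composite setting and the optimal-rate form that the abstract alludes to; the symbols Q, f, h, L, σ, R, and N are not defined on this page and are introduced here only under the usual assumptions (feasible set, smooth and composite parts, Lipschitz constant of the gradient in the chosen norm, stochastic noise level, distance to the optimum, number of oracle calls).

\[
\min_{x \in Q} \; F(x) := f(x) + h(x),
\]

where f is convex with an L-Lipschitz gradient, h is a simple convex composite term, and only a stochastic (possibly inexact) gradient with variance bounded by \(\sigma^2\) is available. For this class the known lower complexity bound after N oracle calls has the form

\[
\mathbb{E}\,F(x_N) - F^{*} \;=\; \Omega\!\left( \frac{L R^{2}}{N^{2}} + \frac{\sigma R}{\sqrt{N}} \right),
\]

and, as stated in the abstract, the proposed method matches this bound up to a multiplicative constant.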

About the authors

A. Gasnikov

Institute for Information Transmission Problems

Corresponding author
Email: gasnikov@yandex.ru
Russia, Bol’shoi Karetnyi per. 19/1, Moscow, 127994

P. Dvurechensky

Institute for Information Transmission Problems

Email: gasnikov@yandex.ru
Russia, Bol’shoi Karetnyi per. 19/1, Moscow, 127994


Copyright © Pleiades Publishing, Ltd., 2016