Gradient-free proximal methods with inexact oracle for convex stochastic nonsmooth optimization problems on the simplex



Abstract

In this paper, we propose a modification of the mirror descent method for nonsmooth stochastic convex optimization problems on the unit simplex. The problems considered differ from the classical setting in that only realizations of the function values are available, rather than (sub)gradients. Our purpose is to derive the convergence rate of the proposed method and to determine the level of noise that does not significantly affect this rate.
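As a rough illustration of this setting, the sketch below shows a generic entropic mirror descent iteration on the unit simplex that uses only noisy function-value realizations through a two-point finite-difference estimator. The step-size rule, smoothing parameter, noise model, and averaging scheme are illustrative assumptions for the sketch, not the exact algorithm or constants analyzed in the paper.

```python
# Minimal sketch: entropic mirror descent on the unit simplex driven by a
# two-point gradient-free estimate built from noisy function values.
# All tuning choices below (step size ~ 1/sqrt(t), smoothing mu, uniform
# bounded noise) are assumptions for illustration only.
import numpy as np

def noisy_value(f, x, noise_level, rng):
    """Return a noisy realization of f(x): the true value plus bounded noise."""
    return f(x) + noise_level * rng.uniform(-1.0, 1.0)

def mirror_descent_simplex(f, n, iters=1000, step=0.1, mu=1e-2,
                           noise_level=1e-3, seed=0):
    """Entropic mirror descent using only noisy zeroth-order information."""
    rng = np.random.default_rng(seed)
    x = np.full(n, 1.0 / n)          # start at the center of the simplex
    x_avg = np.zeros(n)
    for t in range(1, iters + 1):
        # Random direction on the unit sphere for the two-point estimator.
        e = rng.standard_normal(n)
        e /= np.linalg.norm(e)
        # Finite-difference estimate of the directional derivative from two
        # noisy function-value realizations (queries may leave the simplex
        # slightly; f is assumed defined in a neighborhood of it).
        fd = (noisy_value(f, x + mu * e, noise_level, rng)
              - noisy_value(f, x - mu * e, noise_level, rng)) / (2.0 * mu)
        g = n * fd * e               # stochastic gradient surrogate
        # Entropic (multiplicative) mirror step; the prox step for the
        # entropy distance reduces to exponentiation and renormalization.
        x = x * np.exp(-step / np.sqrt(t) * g)
        x /= x.sum()
        x_avg += (x - x_avg) / t     # running average of the iterates
    return x_avg

if __name__ == "__main__":
    # Example: a nonsmooth convex objective over the 5-dimensional simplex.
    f = lambda x: np.max(np.abs(x - 0.2))
    x_hat = mirror_descent_simplex(f, n=5)
    print(x_hat, f(x_hat))
```

The entropic prox function is the standard choice on the simplex because the resulting mirror step stays inside the simplex by construction and its constants depend only logarithmically on the dimension.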

About the authors

A. Gasnikov

Moscow Institute of Physics and Technology (State University); Institute for Information Transmission Problems (Kharkevich Institute)

Corresponding author
Email: gasnikov@yandex.ru
Russia, Moscow; Moscow

A. Lagunovskaya

Moscow Institute of Physics and Technology (State University); Keldysh Institute of Applied Mathematics

Email: gasnikov@yandex.ru
Russia, Moscow; Moscow

I. Usmanova

Moscow Institute of Physics and Technology (State University); Institute for Information Transmission Problems (Kharkevich Institute)

Email: gasnikov@yandex.ru
Russia, Moscow; Moscow

F. Fedorenko

Moscow Institute of Physics and Technology (State University)

Email: gasnikov@yandex.ru
Russia, Moscow

Supplementary files

1. JATS XML

Copyright © Pleiades Publishing, Ltd., 2016