Gradient-free proximal methods with inexact oracle for convex stochastic nonsmooth optimization problems on the simplex



Abstract

In this paper we propose a modification of the mirror descent method for non-smooth stochastic convex optimization problems on the unit simplex. The problems considered differ from the classical setting in that only noisy realizations of the function values are available, rather than (sub)gradients. Our purpose is to derive the convergence rate of the proposed method and to determine the level of oracle noise that does not significantly affect this rate.
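The setting described above can be illustrated with a minimal sketch of gradient-free entropic mirror descent on the unit simplex: the subgradient is replaced by a two-point finite-difference estimate built from noisy function values. This is not the authors' exact scheme; the test objective, step sizes, smoothing parameter, and noise level below are illustrative assumptions, and for simplicity the sketch queries the oracle at points that may lie slightly outside the simplex.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_f(x, c, sigma=1e-3):
    # Stochastic zeroth-order oracle: a noisy realization of f(x) = <c, x>.
    # (Illustrative linear objective, not from the paper.)
    return c @ x + sigma * rng.normal()

def gf_mirror_descent(c, n, iters=2000, step=0.5, tau=1e-2):
    """Gradient-free (zeroth-order) entropic mirror descent on the simplex.

    Hyper-parameters are illustrative, not the constants from the paper.
    """
    x = np.full(n, 1.0 / n)          # start at the barycenter of the simplex
    x_avg = np.zeros(n)
    for k in range(1, iters + 1):
        e = rng.normal(size=n)
        e /= np.linalg.norm(e)       # random direction on the unit sphere
        # Two-point stochastic gradient estimate from function values only;
        # x +/- tau*e may leave the simplex, which this sketch ignores.
        g = (n / (2 * tau)) * (noisy_f(x + tau * e, c) - noisy_f(x - tau * e, c)) * e
        eta = step / np.sqrt(k)      # standard O(1/sqrt(k)) step size
        # Entropic (multiplicative-weights) mirror step keeps x on the simplex;
        # subtracting g.max() is only for numerical stability.
        x = x * np.exp(-eta * (g - g.max()))
        x /= x.sum()
        x_avg += (x - x_avg) / k     # running average of the iterates
    return x_avg

c = np.array([0.9, 0.5, 0.1, 0.7])   # minimum of <c, x> over the simplex: vertex argmin(c)
x_hat = gf_mirror_descent(c, n=4)
```

For a linear objective the averaged iterate concentrates on the vertex with the smallest coefficient; the key point of the sketch is that only (noisy) function values enter the update, never gradients.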

About the authors

A. Gasnikov

Moscow Institute of Physics and Technology (State University); Institute for Information Transmission Problems (Kharkevich Institute)

Author for correspondence.
Email: gasnikov@yandex.ru
Russian Federation, Moscow; Moscow

A. Lagunovskaya

Moscow Institute of Physics and Technology (State University); Keldysh Institute of Applied Mathematics

Email: gasnikov@yandex.ru
Russian Federation, Moscow; Moscow

I. Usmanova

Moscow Institute of Physics and Technology (State University); Institute for Information Transmission Problems (Kharkevich Institute)

Email: gasnikov@yandex.ru
Russian Federation, Moscow; Moscow

F. Fedorenko

Moscow Institute of Physics and Technology (State University)

Email: gasnikov@yandex.ru
Russian Federation, Moscow


版权所有 © Pleiades Publishing, Ltd., 2016