Accelerated Primal-Dual Gradient Descent with Linesearch for Convex, Nonconvex, and Nonsmooth Optimization Problems



Abstract

A new version of accelerated gradient descent is proposed. The method requires no a priori information about the objective function, uses a linesearch procedure to accelerate convergence in practice, converges at rates matching the well-known lower bounds for both convex and nonconvex objective functions, and has primal-dual properties. A universal version of this method is also described.
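The paper's exact scheme is not reproduced in this abstract. As a rough illustration of the general idea, accelerating gradient descent while estimating the step size by backtracking linesearch, here is a minimal FISTA-style sketch for a smooth convex objective (all names and parameters are illustrative, not from the paper):

```python
import numpy as np

def agd_linesearch(f, grad, x0, iters=100, L=1.0):
    """Accelerated gradient descent with backtracking linesearch.

    Illustrative sketch only: a standard FISTA-type scheme, not the
    authors' method, which additionally has primal-dual properties
    and covers nonconvex and nonsmooth objectives.
    """
    x = np.asarray(x0, dtype=float)
    y = x.copy()   # extrapolated (momentum) point
    t = 1.0        # momentum parameter
    for _ in range(iters):
        g = grad(y)
        # Backtracking: increase the Lipschitz estimate L until the
        # standard quadratic upper-bound condition holds at the trial step.
        while True:
            x_new = y - g / L
            d = x_new - y
            if f(x_new) <= f(y) + g @ d + 0.5 * L * (d @ d):
                break
            L *= 2.0
        # Nesterov-style momentum update
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x
```

The linesearch replaces a priori knowledge of the Lipschitz constant with a local estimate that only grows when the trial step violates the descent condition, which is the practical benefit the abstract refers to.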

About the authors

S. Guminov

Moscow Institute of Physics and Technology

Author for correspondence.
Email: sergey.guminov@phystech.edu
Russian Federation, Dolgoprudnyi, Moscow oblast, 141700

Yu. Nesterov

Center for Operations Research and Econometrics (CORE), Catholic University of Louvain; National Research University Higher School of Economics

Email: sergey.guminov@phystech.edu
Belgium, Louvain-la-Neuve; Moscow, 101000

P. Dvurechensky

Kharkevich Institute for Information Transmission Problems, Russian Academy of Sciences

Email: sergey.guminov@phystech.edu
Russian Federation, Moscow, 127051

A. Gasnikov

Moscow Institute of Physics and Technology; Kharkevich Institute for Information Transmission Problems, Russian Academy of Sciences

Email: sergey.guminov@phystech.edu
Russian Federation, Dolgoprudnyi, Moscow oblast, 141700; Moscow, 127051


版权所有 © Pleiades Publishing, Ltd., 2019