Adaptive Methods for Variational Inequalities with Relatively Smooth and Relatively Strongly Monotone Operators

Abstract

The article is devoted to adaptive methods for variational inequalities with relatively smooth and relatively strongly monotone operators. Based on the recently proposed proximal version of the extragradient method for this class of problems, we study in detail a variant of the method with adaptively selected parameter values and prove an estimate of its convergence rate. The result is extended to the class of variational inequalities with relatively strongly monotone δ-generalized smooth operators. Numerical experiments on the ridge regression problem and on the variational inequality associated with box-simplex games demonstrate the effectiveness of the proposed adaptive parameter selection during the run of the algorithm.
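For orientation, the key notions can be sketched as follows; this is a brief summary in the form commonly used in the related literature on relative Lipschitzness, not a verbatim excerpt from the article. The feasible set Q, the prox function d, the Bregman divergence V, and the per-iteration parameter L_k below are notation introduced here for illustration; the exact constants and the adaptive rule for L_k are those of the article itself. Let $V(y, x) = d(y) - d(x) - \langle \nabla d(x), y - x \rangle$ be the Bregman divergence generated by $d$. An operator $g$ is relatively smooth (relatively Lipschitz) with constant $L$ and relatively strongly monotone with constant $\mu$ if, for all $x, y, z \in Q$,

  $\langle g(y) - g(x),\, y - z \rangle \le L \bigl( V(y, x) + V(z, y) \bigr)$,
  $\langle g(y) - g(x),\, y - x \rangle \ge \mu \bigl( V(y, x) + V(x, y) \bigr)$.

One iteration of the proximal (Bregman) extragradient scheme with parameter $L_k$, which the article's method selects adaptively, then takes the form

  $y_{k+1} = \arg\min_{y \in Q} \bigl\{ \langle g(x_k), y \rangle + L_k V(y, x_k) \bigr\}$,
  $x_{k+1} = \arg\min_{x \in Q} \bigl\{ \langle g(y_{k+1}), x \rangle + L_k V(x, x_k) \bigr\}$.

The precise adaptive selection rule for $L_k$ and the resulting convergence-rate estimate are proved in the article.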

About the authors

S. Ablaev

Moscow Institute of Physics and Technology; Vernadsky Crimean Federal University

Author for correspondence.
Email: seydamet.ablaev@yandex.ru
Russia, 141701, Moscow region, Dolgoprudny, Institutskiy per., 9; Russia, 295007, Simferopol, Academician Vernadsky Avenue, 4

F. Stonyakin

Moscow Institute of Physics and Technology; Vernadsky Crimean Federal University

Author for correspondence.
Email: fedyor@mail.ru
Russia, 141701, Moscow region, Dolgoprudny, Institutskiy per., 9; Russia, 295007, Simferopol, Academician Vernadsky Avenue, 4

M. Alkousa

Moscow Institute of Physics and Technology; National Research University “Higher School of Economics”

Author for correspondence.
Email: mohammad.alkousa@phystech.edu
Russia, 141701, Moscow region, Dolgoprudny, Institutskiy per., 9; Russia, 101000, Moscow, Myasnitskaya st., 20

D. Pasechnyuk

Moscow Institute of Physics and Technology; Trusted Artificial Intelligence Research Center of ISP RAS

Author for correspondence.
Email: dmivilensky1@gmail.com
Russia, 141701, Moscow region, Dolgoprudny, Institutskiy per., 9; Russia, 109004, Moscow, Alexander Solzhenitsyn st., 25

Copyright © С.С. Аблаев, Ф.С. Стонякин, М.С. Алкуса, Д.А. Пасечнюк, 2023
