Current Issue
No 6 (2023)
DATA ANALYSIS
Adaptive Methods for Variational Inequalities with Relatively Smooth and Relatively Strongly Monotone Operators
Abstract
The article is devoted to some adaptive methods for variational inequalities with relatively smooth and relatively strongly monotone operators. Based on the recently proposed proximal version of the extragradient method for this class of problems, we study in detail the method with adaptively selected parameter values. An estimate for the rate of convergence of this method is proved. The result is generalized to a class of variational inequalities with relatively strongly monotone δ-generalized smooth variational inequality operators. For the ridge regression problem and the variational inequality associated with box-simplex games, numerical experiments were performed, demonstrating the effectiveness of the proposed adaptive selection of parameters during the run of the algorithm.
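The adaptive idea described in the abstract can be illustrated by the following sketch. It is a simplified Euclidean extragradient iteration with backtracking on the smoothness parameter, not the paper's Bregman/proximal scheme; the toy operator, the doubling/halving rule, and all constants are illustrative assumptions.

```python
import numpy as np

def adaptive_extragradient(F, z0, L0=1.0, iters=200):
    """Extragradient steps with adaptive selection of the parameter L.

    A hedged sketch: L is doubled until a local smoothness test holds,
    then relaxed after each accepted step.
    """
    z, L = z0.copy(), L0
    for _ in range(iters):
        while True:
            step = 1.0 / L
            w = z - step * F(z)          # extrapolation step
            z_new = z - step * F(w)      # main step
            # accept L if the local Lipschitz-type test holds, else retry
            if np.linalg.norm(F(w) - F(z)) <= L * np.linalg.norm(w - z) + 1e-12:
                break
            L *= 2.0
        z = z_new
        L /= 2.0                          # optimistically relax L
    return z

# toy strongly monotone operator: F(z) = M z, M = identity + skew part,
# whose unique solution is z* = 0 (an illustrative example, not from the paper)
M = np.array([[1.0, 1.0], [-1.0, 1.0]])
F = lambda z: M @ z
z_star = adaptive_extragradient(F, np.array([1.0, -2.0]))
```

Backtracking makes the step size track the operator's local Lipschitz constant instead of a conservative global bound, which is the practical benefit the abstract reports.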
Adaptive Variant of the Frank-Wolfe Algorithm for Convex Optimization Problems
Abstract
In this paper, a variant of the Frank–Wolfe method for convex optimization problems with adaptive selection of the step parameter, corresponding to information about the smoothness of the objective function (the Lipschitz constant of its gradient), is investigated. Theoretical estimates of the quality of the approximate solution produced by the method with adaptively selected parameters L_k are obtained. On the class of problems with a convex objective function over a convex feasible set, the guaranteed convergence rate of the proposed method is sublinear. A special subclass of such problems is also considered (objective functions satisfying the gradient dominance condition), and an estimate of the convergence rate with adaptively selected parameters L_k is obtained. An important feature of the result is the analysis of the situation in which, after an iteration completes, it is possible to guarantee at least a twofold reduction of the residual in the function value. At the same time, the use of adaptively selected parameters in the theoretical estimates makes it possible to apply the method to both smooth and non-smooth problems, provided the stopping criterion of the iteration is satisfied. For smooth problems, it can be proved that the theoretical estimates of the method are optimal up to a constant factor. Computational experiments were performed, including a comparison with two other algorithms, demonstrating the efficiency of the algorithm on a number of both smooth and non-smooth problems.
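A Frank–Wolfe step with adaptive selection of L_k can be sketched as follows. This is a generic backtracking variant, not the paper's exact rule; the quadratic objective, the unit simplex as feasible set, and the tolerance are illustrative assumptions.

```python
import numpy as np

def fw_adaptive(f, grad, x0, L0=1.0, iters=1000):
    """Frank-Wolfe with backtracking on the smoothness parameter L_k.

    The trial step is accepted once the quadratic upper model with the
    current L_k is valid at the trial point; otherwise L_k is doubled.
    """
    x, L = x0.copy(), L0
    for _ in range(iters):
        g = grad(x)
        s = np.zeros_like(x)
        s[np.argmin(g)] = 1.0            # LMO over the unit simplex
        d = s - x
        gap = -g @ d                      # Frank-Wolfe gap
        if gap <= 1e-12:
            break
        while True:
            gamma = min(1.0, gap / (L * (d @ d)))
            if f(x + gamma * d) <= f(x) - gamma * gap + 0.5 * L * gamma**2 * (d @ d) + 1e-12:
                break                     # quadratic model with L_k is valid
            L *= 2.0
        x = x + gamma * d
        L /= 2.0                          # try a smaller L_k next iteration
    return x

# illustrative problem: project b onto the simplex, optimum value 0
b = np.array([0.6, 0.3, 0.1])
f = lambda x: 0.5 * np.sum((x - b) ** 2)
grad = lambda x: x - b
x = fw_adaptive(f, grad, np.ones(3) / 3)
```

Because acceptance is tested on function values only, the same loop can be run on non-smooth objectives as long as the acceptance test keeps passing, which mirrors the abstract's remark about non-smooth problems.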
DECENTRALIZED CONDITIONAL GRADIENT METHOD ON TIME-VARYING GRAPHS
Abstract
In this paper, we consider a generalization of the decentralized Frank–Wolfe algorithm to time-varying networks, study the convergence properties of the algorithm, and carry out the corresponding numerical experiments. The changing network is modeled as a deterministic or stochastic sequence of graphs.
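The abstract's setting can be sketched as follows: nodes mix their iterates with a doubly stochastic matrix that changes each round, then take a local conditional-gradient step. This simplified version uses plain local gradients (the paper's method may rely on gradient tracking) and a deterministic alternating pair of matchings as the graph sequence; the local quadratic objectives and all constants are illustrative assumptions.

```python
import numpy as np

def metropolis(edges, n):
    """Metropolis weights: a doubly stochastic mixing matrix for one graph."""
    W, deg = np.zeros((n, n)), np.zeros(n)
    for i, j in edges:
        deg[i] += 1; deg[j] += 1
    for i, j in edges:
        w = 1.0 / (1 + max(deg[i], deg[j]))
        W[i, j] = W[j, i] = w
    for i in range(n):
        W[i, i] = 1.0 - W[i].sum()
    return W

n, dim = 4, 3
# deterministic time-varying sequence: two alternating matchings whose union is connected
graphs = [metropolis([(0, 1), (2, 3)], n),
          metropolis([(1, 2), (3, 0)], n)]
rng = np.random.default_rng(0)
B = rng.dirichlet(np.ones(dim), size=n)   # node i holds f_i(x) = 0.5||x - b_i||^2
X = np.ones((n, dim)) / dim               # every node starts at the simplex centre

for t in range(300):
    W = graphs[t % 2]                     # the network changes every round
    Y = W @ X                             # gossip averaging with the current graph
    G = Y - B                             # local gradients
    S = np.zeros_like(Y)
    S[np.arange(n), np.argmin(G, axis=1)] = 1.0   # local LMO on the simplex
    gamma = 2.0 / (t + 2)
    X = (1 - gamma) * Y + gamma * S       # local conditional-gradient step

x_bar = X.mean(axis=0)
```

The convex combination keeps every node's iterate feasible without any projection, which is the usual motivation for conditional-gradient methods in the decentralized setting.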
On Accelerated Coordinate Descent Methods for Searching Equilibria in a Two-Stage Transportation Equilibrium Traffic Flow Distribution Model
Abstract
The search for equilibrium in a two-stage traffic flow model reduces to the solution of a special nonsmooth convex optimization problem with two groups of different variables. For the numerical solution of this problem, the paper proposes to use the accelerated block-coordinate Nesterov–Stich method with a special choice of block probabilities at each iteration. Theoretical estimates of the complexity of this approach can markedly improve the estimates for previously used approaches; however, in the general case they do not guarantee faster convergence. Numerical experiments with the proposed algorithms are carried out in the paper.
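The ingredient specific to this approach, nonuniform sampling of the two variable blocks, can be sketched as follows. This is a simplified, non-accelerated stand-in for the Nesterov–Stich scheme: blocks are sampled with probabilities proportional to the square roots of their smoothness constants (one common choice in such methods) and updated with step 1/L_i. The two-block quadratic objective is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(1)

# two blocks with very different smoothness: f(u, v) = 0.5*Lu*||u||^2 + 0.5*Lv*||v||^2
Lu, Lv = 100.0, 1.0
blocks = [(slice(0, 5), Lu), (slice(5, 10), Lv)]
p = np.sqrt([Lu, Lv])
p = p / p.sum()                       # block probabilities ~ sqrt(L_i)

def grad(x):
    g = np.empty_like(x)
    g[:5] = Lu * x[:5]
    g[5:] = Lv * x[5:]
    return g

x = rng.standard_normal(10)
for _ in range(2000):
    i = rng.choice(2, p=p)            # sample a block nonuniformly
    sl, Li = blocks[i]
    x[sl] -= grad(x)[sl] / Li         # block gradient step with step-size 1/L_i

f_final = 0.5 * Lu * x[:5] @ x[:5] + 0.5 * Lv * x[5:] @ x[5:]
```

Sampling the smoother block less often while giving each block its own step size 1/L_i is what lets the complexity depend on a weighted mean of the block constants rather than on their maximum.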
ROBUST ALGEBRAIC CONNECTIVITY
Abstract
The second smallest eigenvalue of the graph Laplacian is known as the algebraic connectivity of the graph. This value shows how well the graph is connected. However, this metric does not take into account possible changes in the graph: the deletion of even one node or edge can make the graph disconnected. This work is devoted to the development of a metric that describes the robustness of the graph to such changes. All proposed metrics are based on algebraic connectivity. In addition, we provide generalizations of some well-known optimization methods for our robust modifications of algebraic connectivity. The work also contains numerical experiments demonstrating the efficiency of the proposed approaches.
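The fragility the abstract points out is easy to see numerically: the sketch below computes the algebraic connectivity of a path graph and shows that deleting a single edge drops it to zero. The path graph is an illustrative example, not one from the paper.

```python
import numpy as np

def laplacian(n, edges):
    """Combinatorial Laplacian L = D - A of an undirected graph."""
    L = np.zeros((n, n))
    for i, j in edges:
        L[i, i] += 1; L[j, j] += 1
        L[i, j] -= 1; L[j, i] -= 1
    return L

def algebraic_connectivity(n, edges):
    # eigvalsh returns eigenvalues in ascending order; take the second smallest
    return np.linalg.eigvalsh(laplacian(n, edges))[1]

path = [(0, 1), (1, 2), (2, 3)]
a_full = algebraic_connectivity(4, path)             # positive: graph is connected
a_cut = algebraic_connectivity(4, [(0, 1), (2, 3)])  # edge (1,2) deleted: disconnected
```

A positive value certifies connectivity, while a zero second eigenvalue after one edge deletion shows why a robust modification of the metric is needed.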
GRADIENT-FREE ALGORITHMS FOR SOLVING STOCHASTIC SADDLE-POINT OPTIMIZATION PROBLEMS WITH THE POLYAK–ŁOJASIEWICZ CONDITION
Abstract
This paper focuses on solving a subclass of stochastic nonconvex-concave black-box optimization problems with a saddle point satisfying the Polyak–Łojasiewicz condition. To solve such problems, we provide the first, to our knowledge, gradient-free algorithm; the approach is based on applying a gradient approximation (kernel approximation) to the stochastic gradient descent algorithm with a biased oracle. We present theoretical estimates that guarantee a global linear rate of convergence to the desired accuracy. The theoretical results are verified on a model example, with a comparison against an algorithm using Gaussian approximation.
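The zeroth-order idea behind such methods can be sketched with the simplest estimator: gradient descent driven by a two-point finite-difference approximation along random unit directions. The paper's algorithm uses a kernel-based approximation in a saddle-point setting; the plain estimator and the quadratic target below (which satisfies the Polyak–Łojasiewicz condition) are illustrative stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 5
f = lambda x: x @ x                  # PL function with minimum value 0 at the origin

x = rng.standard_normal(d)
x /= np.linalg.norm(x)               # start on the unit sphere
h, eta = 1e-4, 0.05                  # finite-difference radius and step size

for _ in range(500):
    e = rng.standard_normal(d)
    e /= np.linalg.norm(e)           # random direction on the unit sphere
    # two-point gradient estimator: only function values are queried
    g = d * (f(x + h * e) - f(x - h * e)) / (2 * h) * e
    x = x - eta * g

f_final = f(x)
```

Only two function evaluations per iteration are needed, which is exactly the black-box access model of the abstract; under a PL condition such schemes admit linear-rate guarantees up to the accuracy floor set by the approximation.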