Abstract : Manifold optimization has been developing remarkably, with a rich theory and a wide variety of applications. This minisymposium aims to present the state of the art in the field. The talks cover optimization algorithms on manifolds, e.g., Riemannian adaptive and interior point methods, ADMM, local stochastic algorithms, a CG method for multiobjective optimization, and an augmented Lagrangian method based on sequential optimality conditions. Furthermore, efficient optimization on specific manifolds, such as the Grassmann and Stiefel manifolds and the manifold of fixed-rank positive-semidefinite matrices, and applications to practical problems, e.g., problems under differential privacy, low-rank matrix optimization, and the design of tight minimum-sidelobe windows, are addressed.
[04372] Sequential optimality conditions for nonlinear optimization on Riemannian manifolds and a globally convergent augmented Lagrangian method
Format : Talk at Waseda University
Author(s) :
Yuya Yamakawa (Kyoto University)
Hiroyuki Sato (Kyoto University)
Abstract : Recently, the approximate Karush-Kuhn-Tucker (AKKT) conditions, also called sequential optimality conditions, have been proposed for nonlinear optimization in Euclidean spaces, and several methods that find points satisfying these conditions have been developed. They are genuine necessary optimality conditions in the sense that every local optimum satisfies them without any constraint qualification (CQ). In this work, we extend the AKKT conditions to nonlinear optimization on Riemannian manifolds and propose an augmented Lagrangian (AL) method that globally converges to points satisfying them. In addition, we prove that the AKKT and KKT conditions are equivalent under a certain CQ. Finally, we examine the effectiveness of the proposed AL method via several numerical experiments.
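As a rough illustration of the setting (the notation below is generic and not taken verbatim from the talk), consider a constrained problem on a Riemannian manifold $\mathcal{M}$,
\[
\min_{x \in \mathcal{M}} f(x) \quad \text{s.t.} \quad h_i(x) = 0,\ i = 1,\dots,m, \qquad g_j(x) \le 0,\ j = 1,\dots,p.
\]
In the Euclidean case, a point $x^*$ satisfies the AKKT conditions if there exist sequences $x^k \to x^*$ and multipliers $\lambda^k \in \mathbb{R}^m$, $\mu^k \in \mathbb{R}^p_{\ge 0}$ with
\[
\Bigl\| \operatorname{grad} f(x^k) + \sum_{i=1}^m \lambda_i^k \operatorname{grad} h_i(x^k) + \sum_{j=1}^p \mu_j^k \operatorname{grad} g_j(x^k) \Bigr\|_{x^k} \to 0, \qquad \min\{-g_j(x^k),\, \mu_j^k\} \to 0,
\]
written here with Riemannian gradients to suggest the manifold extension; the talk's precise definition may differ in details.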
[04682] Nonlinear conjugate gradient method for vector optimization on Riemannian manifolds
Format : Talk at Waseda University
Author(s) :
Kangming Chen (Kyoto University)
Hiroyuki Sato (Kyoto University)
Ellen Hidemi Fukuda (Kyoto University)
Abstract : In this research, we propose a nonlinear conjugate gradient algorithm for vector optimization on Riemannian manifolds. We extend the Wolfe conditions and the Zoutendijk condition to this setting. Convergence of the proposed method is proved for different choices of the parameter $\beta$, including Riemannian extensions of the Fletcher-Reeves, conjugate descent, and Dai-Yuan parameters. Numerical experiments are conducted to validate the proposed method.
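For orientation only, here is a minimal sketch of the scalar-objective skeleton that such a method builds on: Riemannian CG on the unit sphere with a Fletcher-Reeves $\beta$, using projection as both retraction and vector transport. The matrix $A$, step size, and iteration count are illustrative assumptions; the talk treats the vector-valued case with Wolfe line searches.

```python
# Minimal sketch (assumed setup, not the talk's algorithm): Riemannian CG
# for a scalar objective f(x) = x^T A x on the unit sphere, with a
# Fletcher-Reeves beta and projection as retraction and vector transport.
import numpy as np

def proj(x, v):                      # orthogonal projection onto T_x(sphere)
    return v - np.dot(x, v) * x

def retract(x, v):                   # metric-projection retraction
    y = x + v
    return y / np.linalg.norm(y)

A = np.diag([1.0, 2.0, 5.0])
f = lambda x: x @ A @ x
grad = lambda x: proj(x, 2.0 * A @ x)   # Riemannian gradient

x = np.ones(3) / np.sqrt(3.0)
g = grad(x)
d = -g
for _ in range(50):
    x_new = retract(x, 1e-2 * d)     # fixed step; a Wolfe search in practice
    g_new = grad(x_new)
    beta = (g_new @ g_new) / (g @ g)    # Fletcher-Reeves parameter
    d = -g_new + beta * proj(x_new, d)  # transport d by projection
    x, g = x_new, g_new
print(f(x))  # decreases toward the smallest eigenvalue of A
```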
[04852] Gauss-Southwell type descent methods for low-rank matrix optimization
Format : Talk at Waseda University
Author(s) :
André Uschmajew (University of Augsburg)
Guillaume Olikier (Université Catholique de Louvain)
Bart Vandereycken (University of Geneva)
Abstract : We consider gradient-related methods for low-rank matrix optimization with a smooth, strongly convex cost function. The methods operate on individual factors of a matrix factorization and share aspects of both alternating and Riemannian optimization. We compare two choices of search direction based on Gauss-Southwell type selection rules: one uses the gradient of a factorized nonconvex formulation, the other the Riemannian gradient. Both methods admit convergence guarantees on the gradient analogous to those in the unconstrained case.
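A hedged sketch of the flavor of a Gauss-Southwell factor selection (the quadratic cost, rank, and step size are placeholders; the talk's methods and their Riemannian variant differ in substance):

```python
# Toy Gauss-Southwell-type rule for min_{L,R} f(L R^T): update only the
# factor whose partial gradient currently has the larger norm.
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((20, 15))    # target matrix for the toy quadratic

def grad_f(X):                       # gradient of f(X) = ||X - M||_F^2 / 2
    return X - M

L = rng.standard_normal((20, 3))
R = rng.standard_normal((15, 3))
for _ in range(200):
    G = grad_f(L @ R.T)
    GL, GR = G @ R, G.T @ L          # partial gradients w.r.t. L and R
    if np.linalg.norm(GL) >= np.linalg.norm(GR):   # Gauss-Southwell selection
        L -= 1e-2 * GL
    else:
        R -= 1e-2 * GR
print(0.5 * np.linalg.norm(L @ R.T - M) ** 2)      # residual of the rank-3 fit
```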
[03269] Min-max optimization on manifolds
Format : Online Talk on Zoom
Author(s) :
Bamdev Mishra (Microsoft)
Abstract : In this talk, we discuss some recent algorithms on min-max optimization problems over Riemannian manifolds.
[03743] Design of Tight Minimum-Sidelobe Windows via Optimization on Oblique Manifolds
Format : Online Talk on Zoom
Author(s) :
Daichi Kitahara (Osaka University)
Kohei Yatabe (Tokyo University of Agriculture and Technology)
Abstract : The short-time Fourier transform (STFT), or the discrete Gabor transform (DGT), has been widely utilized in signal analysis and processing. For noise-robust signal processing in the STFT/DGT domain, the windows used in the STFT/DGT should be tight. In this talk, we propose to design tight windows that minimize the sidelobe energy. The design problem is formulated as the maximization of Rayleigh quotients on oblique manifolds. We apply the Riemannian Newton method, which yields the optimal tight windows within several iterations.
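Schematically (the symbols $A$ and $B$ are generic placeholders, not the talk's exact formulation), the tightness constraint normalizes groups of window coefficients, placing them on an oblique manifold, and the energy-ratio objective is a Rayleigh quotient:
\[
\mathcal{OB}(n,m) = \bigl\{ X \in \mathbb{R}^{n\times m} : \operatorname{diag}(X^{\mathsf T} X) = I_m \bigr\}, \qquad \max_{X \in \mathcal{OB}(n,m)} \frac{\operatorname{tr}(X^{\mathsf T} A X)}{\operatorname{tr}(X^{\mathsf T} B X)},
\]
to which a Riemannian Newton method can be applied.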
[04008] The Bures-Wasserstein geometry of the manifold of fixed-rank positive-semidefinite matrices
Format : Online Talk on Zoom
Author(s) :
Estelle Massart (UCLouvain)
Pierre-Antoine Absil (UCLouvain)
Abstract : We explore the well-known identification of the manifold of rank p positive-semidefinite matrices of size n with the quotient of the set of full-rank n-by-p matrices by the orthogonal group in dimension p. The induced metric corresponds to the Wasserstein metric between centered degenerate Gaussian distributions, and is a generalization of the Bures–Wasserstein metric on the manifold of positive-definite matrices.
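Concretely, writing $\mathcal{S}_+(p,n)$ for the rank-$p$ positive-semidefinite matrices of size $n$ and $\mathbb{R}_*^{n\times p}$ for the full-rank $n\times p$ matrices, the identification is
\[
\mathcal{S}_+(p,n) \simeq \mathbb{R}_*^{n\times p} / \mathrm{O}(p), \qquad A = Y Y^{\mathsf T}, \quad Y \sim YQ \ \ (Q \in \mathrm{O}(p)),
\]
and on positive-definite matrices the induced distance reduces to the Bures–Wasserstein distance
\[
d_{BW}(A, B)^2 = \operatorname{tr}(A) + \operatorname{tr}(B) - 2 \operatorname{tr}\bigl( (A^{1/2} B A^{1/2})^{1/2} \bigr).
\]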
[02861] Cayley parametrization strategy for optimization over the Stiefel manifold
Format : Talk at Waseda University
Author(s) :
Keita Kume (Tokyo Institute of Technology)
Isao Yamada (Tokyo Institute of Technology)
Abstract : We introduce the basic idea behind the adaptive localized Cayley parametrization strategy for optimization over the Stiefel manifold. The proposed adaptive strategy mitigates the slow convergence caused by singular points of the naive Cayley parametrization. Unlike so-called retraction-based strategies, the proposed strategy can directly utilize many powerful Euclidean optimization algorithms. For a certain class of optimization algorithms combined with the proposed parametrization strategy, we also briefly explain an idea for their unified convergence analysis.
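As background, a minimal sketch (not the adaptive localized scheme itself): the Cayley transform turns a skew-symmetric matrix, i.e., a Euclidean parameter, into an orthogonal matrix, whose leading columns give a point on the Stiefel manifold $\mathrm{St}(n,p)$.

```python
# Background sketch: the Cayley transform maps a skew-symmetric V to an
# orthogonal matrix, so a Euclidean vector v parametrizes (part of) St(n, p).
import numpy as np

def cayley(V):
    I = np.eye(V.shape[0])
    return np.linalg.solve(I - 0.5 * V, I + 0.5 * V)  # (I - V/2)^{-1}(I + V/2)

def to_stiefel(v, n, p):
    V = np.zeros((n, n))
    V[np.triu_indices(n, k=1)] = v   # pack v into the strict upper triangle
    V = V - V.T                      # make V skew-symmetric
    return cayley(V)[:, :p]          # first p columns: a point on St(n, p)

n, p = 5, 2
v = np.random.default_rng(1).standard_normal(n * (n - 1) // 2)
U = to_stiefel(v, n, p)
print(np.allclose(U.T @ U, np.eye(p)))   # True: orthonormal columns
```

Roughly speaking, the naive parametrization degrades near points that require a large $V$; the adaptive localization discussed in the talk re-centers the parametrization to stay away from such singular points.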
[02851] Accelerated gradient methods on the Grassmann and Stiefel manifolds
Format : Talk at Waseda University
Author(s) :
Xiaojing Zhu (Shanghai University of Electric Power)
Abstract : In this talk, we extend a nonconvex version of Nesterov's accelerated gradient (AG) method to optimization over the Grassmann and Stiefel manifolds. We propose an exponential-based AG algorithm for the Grassmann manifold and a retraction-based AG algorithm, which exploits the Cayley transform, for both the Grassmann and Stiefel manifolds. Global convergence rates of our algorithms are analyzed under reasonable assumptions. We also discuss details of computing the geometric objects that serve as ingredients of our algorithms.
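In schematic form (generic notation, assumed for illustration), a retraction-based Riemannian AG step interleaves a gradient step with a momentum extrapolation:
\[
x_{k+1} = R_{y_k}\bigl(-\alpha_k \operatorname{grad} f(y_k)\bigr), \qquad
y_{k+1} = R_{x_{k+1}}\bigl(\beta_k\, \mathcal{T}_k\bigl(R_{x_k}^{-1}(x_{k+1})\bigr)\bigr),
\]
where $R$ is a retraction (the exponential map on the Grassmann manifold, or a Cayley-transform retraction on the Stiefel manifold) and $\mathcal{T}_k$ is a vector transport; the talk's algorithms specify these ingredients and the parameter schedules.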
Abstract : We consider a class of Riemannian optimization problems in which the objective is the sum of a smooth function and a nonsmooth function considered in the ambient space. This class of problems has important applications in machine learning and statistics, such as K-means clustering, sparse spectral clustering, and orthogonal dictionary learning. We propose a Riemannian alternating direction method of multipliers (ADMM) to solve such problems. Our algorithm involves easily solvable subproblems at each iteration. The iteration complexity of the proposed algorithm for obtaining an $\epsilon$-stationary point is analyzed under mild assumptions. Numerical experiments demonstrate the advantages of the proposed method.
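One common splitting behind such a scheme (assumed here for illustration; the talk's formulation may differ) introduces a copy $y$ of the variable in the ambient space:
\[
\min_{x \in \mathcal{M},\, y} \ f(x) + h(y) \quad \text{s.t.} \quad x = y,
\qquad
L_\rho(x, y, z) = f(x) + h(y) + \langle z,\, x - y \rangle + \tfrac{\rho}{2} \| x - y \|^2,
\]
and alternates a Riemannian minimization (or gradient step) in $x$, a proximal step in $y$, and a dual ascent $z \leftarrow z + \rho(x - y)$.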
[04455] Local stochastic algorithms for Riemannian optimization
Format : Online Talk on Zoom
Author(s) :
Sam Davanloo Tajbakhsh (The Ohio State University)
Abstract : We consider optimizing a function, available through its first-order stochastic oracle, over a Riemannian manifold in a distributed setting coordinated by a central server. We develop local stochastic approximation methods that perform multiple local stochastic updates in parallel on different clients and merge them at certain intervals. Theoretical convergence results in different optimization and communication settings will be presented.
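A toy sketch of this pattern on the unit sphere (the objective, noise model, merge rule, and schedule are all illustrative assumptions; projecting the Euclidean mean back onto the manifold is one simple stand-in for a Fréchet-mean merge):

```python
# Toy "local" Riemannian SGD: each client runs several local stochastic
# steps; the server merges by projecting the Euclidean mean onto the sphere.
import numpy as np

rng = np.random.default_rng(0)
A = np.diag([1.0, 3.0, 6.0])

def stoch_grad(x):                   # noisy Riemannian gradient of x^T A x
    g = 2.0 * A @ x + 0.1 * rng.standard_normal(3)
    return g - np.dot(x, g) * x

def retract(x, v):
    y = x + v
    return y / np.linalg.norm(y)

clients = [np.ones(3) / np.sqrt(3.0) for _ in range(4)]
for _ in range(20):                  # communication rounds
    for _ in range(5):               # local updates on each client
        clients = [retract(x, -1e-2 * stoch_grad(x)) for x in clients]
    merged = np.mean(clients, axis=0)
    merged /= np.linalg.norm(merged) # project the mean back to the sphere
    clients = [merged.copy() for _ in clients]
print(merged @ A @ merged)           # near the smallest eigenvalue of A
```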
[03649] Riemannian Interior Point Methods for Constrained Optimization on Manifolds
Format : Talk at Waseda University
Author(s) :
Zhijian Lai (University of Tsukuba)
Akiko Yoshise (University of Tsukuba)
Abstract : We extend the classical primal-dual interior point method from the Euclidean setting to the Riemannian one. Our method, named the Riemannian interior point method (RIPM), solves Riemannian constrained optimization problems. We establish its local superlinear and quadratic convergence under standard assumptions. Moreover, we show its global convergence when combined with a classical line search. Numerical experiments demonstrate the stability and efficiency of our method.
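Schematically (generic notation, not the talk's exact statement), for $\min_{x\in\mathcal{M}} f(x)$ subject to $h(x)=0$ and $g(x) \le 0$, a primal-dual interior point method applies a Newton-type iteration to the perturbed KKT system
\[
\operatorname{grad} f(x) + \sum_i \lambda_i \operatorname{grad} h_i(x) + \sum_j \mu_j \operatorname{grad} g_j(x) = 0, \qquad h(x) = 0, \qquad g(x) + s = 0, \qquad \mu_j s_j = \nu, \quad \mu, s > 0,
\]
and drives the barrier parameter $\nu \downarrow 0$.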
[03887] Riemannian Adaptive Optimization Algorithms and Their Applications
Format : Talk at Waseda University
Author(s) :
Hiroyuki Sakai (Meiji University)
Hideaki Iiduka (Meiji University)
Abstract : We will talk about Riemannian adaptive optimization algorithms for solving Riemannian stochastic optimization problems. In Euclidean space, adaptive optimization algorithms such as AdaGrad, Adam, Adadelta, and AMSGrad are widely used. However, these algorithms cannot be naturally extended to general Riemannian manifolds, owing to the absence of a canonical coordinate system. We introduce ways to generalize adaptive optimization algorithms to Riemannian manifolds and consider their applications.
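To illustrate the obstruction and one coordinate-free workaround (a minimal sketch, not the algorithms of the talk): Euclidean AdaGrad rescales each coordinate of the gradient, which has no intrinsic meaning on a general manifold, but accumulating squared gradient norms still yields a well-defined adaptive step size.

```python
# Sketch of an AdaGrad-norm-style step on the unit sphere: per-coordinate
# rescaling is not intrinsic on a manifold, so this variant adapts the step
# size using accumulated squared gradient *norms* instead.
import numpy as np

A = np.diag([1.0, 2.0, 4.0])

def rgrad(x):                        # Riemannian gradient of x^T A x
    g = 2.0 * A @ x
    return g - np.dot(x, g) * x

def retract(x, v):
    y = x + v
    return y / np.linalg.norm(y)

x = np.ones(3) / np.sqrt(3.0)
acc = 0.0
for _ in range(100):
    g = rgrad(x)
    acc += np.dot(g, g)              # accumulated squared gradient norm
    x = retract(x, -(0.5 / np.sqrt(acc + 1e-12)) * g)
print(x @ A @ x)                     # decreases toward the smallest eigenvalue
```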