# Registered Data

## [00498] Approximation and modeling with manifold-valued data

**Session Time & Room**:
**Type**: Proposal of Minisymposium
**Abstract**: Application problems that involve data on differentiable manifolds are at the interface of numerical analysis and differential geometry. Researchers approach such tasks for various reasons: some make general efforts to transfer established methods from the Euclidean setting to nonlinear manifolds. Others are motivated by a specific application that requires one to work with manifold data. This minisymposium aims at bringing together researchers working on approximation and modeling problems on Riemannian manifolds. A particular focus is on interpolation methods and applications in model reduction. We aspire to create synergies and bridge the gap between different communities in this fascinating research field.
**Organizer(s)**: Nir Sharon, Ralf Zimmermann
**Classification**: __65Dxx__, __41Axx__, __53B50__, __Riemannian Computing, Model Reduction__
**Minisymposium Program**:

- 00498 (1/3) : __4C__ @ __A615__ [Chair: Ralf Zimmermann]

**[05470] Implicit integration along the low-rank manifold for stiff and nonlinear equations**
**Format**: Online Talk on Zoom
**Author(s)**:
- **Aaron Charous** (MIT)
- Pierre F. J. Lermusiaux (MIT)

**Abstract**: We introduce a family of implicit integration methods for the dynamical low-rank approximation: the alternating-implicit dynamically orthogonal Runge-Kutta (ai-DORK) schemes. By alternating over the row and column space of the approximate solution, an efficient iterative low-rank linear solver is developed. To evaluate nonlinearities, we propose a local/piecewise polynomial approximation with adaptive clustering, and on-the-fly reclustering may be performed efficiently in the coefficient space. We demonstrate the proposed schemes on ill-conditioned, nonlinear, and realistic systems.
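The alternating row/column-space strategy behind the iterative low-rank linear solver can be illustrated with a generic alternating least-squares sketch. This is not the authors' ai-DORK implementation; it merely shows the idea, assuming a square invertible matrix `A`, a right-hand side `B`, and a fixed target rank:

```python
import numpy as np

def alternating_low_rank_solve(A, B, rank, iters=20, seed=0):
    """Approximately solve A X = B with X = U @ V.T of fixed rank by
    alternating least squares over the column factor U and row factor V.
    Illustrative sketch only, not the ai-DORK scheme itself."""
    n, m = B.shape
    rng = np.random.default_rng(seed)
    V, _ = np.linalg.qr(rng.standard_normal((m, rank)))
    for _ in range(iters):
        # Fix V (orthonormal): min_U ||A U V^T - B||_F  =>  A U = B V.
        U = np.linalg.solve(A, B @ V)
        U, _ = np.linalg.qr(U)            # re-orthonormalize the column factor
        # Fix U (orthonormal): min_V ||(A U) V^T - B||_F, a small least-squares problem.
        V = np.linalg.lstsq(A @ U, B, rcond=None)[0].T
    return U, V
```

Each half-step is a least-squares problem whose size is governed by the rank, which is what makes alternating over the row and column spaces cheap.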

**[04268] Stochastic modeling of model uncertainties through Riemannian reduced-order representations**
**Format**: Online Talk on Zoom
**Author(s)**:
- Hao Zhang (Duke University)
- **Johann Guilleminot** (Duke University)

**Abstract**: Molecular Dynamics (MD) simulations are widely used in computational materials science to explore the conformational space of atomistic systems, to analyze microscopic processes, and to evaluate macroscopic properties of interest. At the core of all MD simulations stands the selection of interatomic potentials, which can be calibrated by means of first-principles calculations or by solving inverse problems based on experimental observations. Such potentials are not uniquely defined in general, which raises the question of model uncertainties and their impact on MD-informed multiscale predictions. In this work, we propose a new probabilistic framework that enables the seamless integration of model-form uncertainties in atomistic computations. The approach relies on a stochastic reduced-order model involving a randomized Galerkin projection operator. An information-theoretic probabilistic model is specifically constructed on the tangent space to the manifold, taking advantage of Riemannian projection and retraction operators. We also explore statistical inference with a view toward inverse identification. We show, in particular, that the Fréchet mean can be constrained by solving a quadratic programming problem. Various applications are finally presented to demonstrate the relevance of the method, including toy examples in the Euclidean space and multiscale simulations on single-layer graphene sheets. The proposed method offers key advantages, including a simple and interpretable low-dimensional parameterization, the ability to constrain the mean on the underlying manifold, and ease of implementation and propagation through multiscale operators.

**[05113] On approximation and representation of manifold-valued functions**
**Format**: Talk at Waseda University
**Author(s)**:
- **Nir Sharon** (Tel Aviv University)

**Abstract**: Recent years have given rise to exciting developments in methods for approximating manifolds and manifold-valued objects. This talk will review recent work concerning manifold-valued approximation via refinement and quasi-interpolation operators and their close connection to multiscaling.

**[05093] Multivariate Hermite Interpolation on Riemannian Manifolds**
**Format**: Talk at Waseda University
**Author(s)**:
- **Ralf Zimmermann** (University of Southern Denmark)
- Ronny Bergmann (NTNU, Trondheim)

**Abstract**: We consider two methods for multivariate Hermite interpolation of manifold-valued functions. On the one hand, we approach the problem via computing weighted Riemannian barycenters. This approach is intrinsic in the sense that it does not depend on local coordinates. As an alternative, we consider straightforward Hermite interpolation in a tangent space. Here, the actual interpolation is conducted via classical vector space operations. Both approaches are illustrated by means of numerical examples.
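The tangent-space approach mentioned above is straightforward to sketch on the unit sphere. The following is a simplified illustration, not the authors' method: it treats the derivative data as tangent vectors at the base point `p0` and omits the correction by the differential of the logarithm map that a rigorous Hermite scheme applies:

```python
import numpy as np

def sph_exp(p, v):
    """Riemannian exponential map on the unit sphere."""
    nv = np.linalg.norm(v)
    if nv < 1e-14:
        return p.copy()
    return np.cos(nv) * p + np.sin(nv) * (v / nv)

def sph_log(p, q):
    """Riemannian logarithm map on the unit sphere (q != -p)."""
    w = q - np.dot(p, q) * p              # project q onto the tangent space at p
    nw = np.linalg.norm(w)
    if nw < 1e-14:
        return np.zeros_like(p)
    return np.arccos(np.clip(np.dot(p, q), -1.0, 1.0)) * (w / nw)

def hermite_tangent(p0, p1, v0, v1, t):
    """Cubic Hermite interpolation carried out in the tangent space at p0,
    then mapped back with the exponential (simplified sketch)."""
    y1 = sph_log(p0, p1)                  # lifted second data point; p0 lifts to 0
    h10 = t**3 - 2 * t**2 + t             # classical cubic Hermite basis
    h01 = -2 * t**3 + 3 * t**2
    h11 = t**3 - t**2
    y = h10 * v0 + h01 * y1 + h11 * v1
    return sph_exp(p0, y)
```

By construction the curve reproduces the endpoint values; genuine Hermite matching of the derivative data would require the omitted correction terms.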

- 00498 (2/3) : __4D__ @ __A615__ [Chair: Ralf Zimmermann]

**[01628] Approximations and learning in the Wasserstein space**
**Format**: Talk at Waseda University
**Author(s)**:
- **Caroline Moosmueller** (University of North Carolina at Chapel Hill)
- Alexander Cloninger (University of California, San Diego)
- Varun Khurana (University of California, San Diego)
- Harish Kannan (University of California, San Diego)
- Keaton Hamm (University of Texas at Arlington)

**Abstract**: Detecting differences and building classifiers between distributions, given only finite samples, are important tasks in a number of scientific fields. Optimal transport and the Wasserstein distance have evolved as the most natural concept in dealing with such tasks, but they also have some computational drawbacks. In this talk, we describe an approximation framework through local linearizations that significantly reduces both the computational effort and the required training data in supervised learning settings.

**[05075] On multiscale quasi-interpolation of scattered scalar- and manifold-valued functions**
**Format**: Talk at Waseda University
**Author(s)**:
- Nir Sharon (Tel Aviv University)
- **Holger Wendland** (University of Bayreuth)

**Abstract**: In this talk, we introduce and analyze a combination of kernel-based quasi-interpolation and multiscale approximations for both scalar- and manifold-valued functions. We are particularly interested in the improvements coming from such a combination over simply using quasi-interpolation processes alone. We provide ample numerical evidence that multiscale quasi-interpolation converges faster than plain quasi-interpolation. In addition, we provide examples showing that the multiscale quasi-interpolation approach offers a powerful tool for data analysis tasks.

**[04608] Structure-preserving Model Order Reduction on Manifolds**
**Format**: Talk at Waseda University
**Author(s)**:
- **Patrick Buchfink** (University of Stuttgart)
- Silke Glas (University of Twente)
- Bernard Haasdonk (University of Stuttgart)
- Benjamin Unger (University of Stuttgart)

**Abstract**: Approximation on manifolds has become a highly researched field in Model Order Reduction (MOR) for problems with slowly decaying Kolmogorov $n$-widths. However, many MOR techniques do not respect the structure of the underlying equations during the reduction. In this talk, we present a new differential geometric formulation of MOR on pseudo-Riemannian manifolds. It allows us to geometrically understand and unify existing structure-preserving MOR techniques for Hamiltonian and Lagrangian systems.

- 00498 (3/3) : __4E__ @ __A615__ [Chair: Nir Sharon]

**[03824] The de Casteljau algorithm on symmetric spaces**
**Format**: Online Talk on Zoom
**Author(s)**:
- **Fátima Silva Leite** (Institute of Systems and Robotics, University of Coimbra)
- Knut Hüper (Institute of Mathematics, Julius-Maximilians-Universität Würzburg)

**Abstract**: An important task for interpolation problems in many areas of science and technology is the computation of smooth curves connecting two data points in a Riemannian symmetric space. For instance, the de Casteljau algorithm on manifolds, a geometric procedure to generate smooth polynomial splines, is based on recursive geodesic interpolation. In statistics, too, the efficient computation of (geometric) means of data in a symmetric space, as well as of midpoints of smooth curves connecting two data points, is particularly important. While closed-form solutions for the so-called endpoint geodesic problem on general symmetric spaces are well known, they often still require explicit matrix exponentiation and/or SVD computations, which are in most cases rather expensive. We present much simpler closed-form expressions for the particular case of Grassmannians, involving only constant, linear, and quadratic functions of the data points together with scalar trigonometric functions. This represents an important step in the implementation of the de Casteljau algorithm. We also comment on the general idea, putting other important symmetric spaces, both compact and noncompact, into perspective.
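The recursive geodesic interpolation at the heart of the de Casteljau algorithm is compact in code. Below is a minimal sketch on the unit sphere, where the endpoint geodesic problem has the familiar spherical-linear-interpolation solution; the talk's contribution concerns cheaper closed-form expressions on Grassmannians, not this toy case:

```python
import numpy as np

def geodesic(p, q, t):
    """Point at parameter t on the unit-sphere geodesic from p to q."""
    ang = np.arccos(np.clip(np.dot(p, q), -1.0, 1.0))
    if ang < 1e-14:
        return p.copy()
    return (np.sin((1 - t) * ang) * p + np.sin(t * ang) * q) / np.sin(ang)

def de_casteljau(control, t):
    """De Casteljau on a manifold: repeatedly replace neighbouring control
    points by geodesic interpolants until a single point remains."""
    pts = [np.asarray(c, dtype=float) for c in control]
    while len(pts) > 1:
        pts = [geodesic(pts[i], pts[i + 1], t) for i in range(len(pts) - 1)]
    return pts[0]
```

Each level of the recursion solves one endpoint geodesic problem per pair of control points, which is exactly why cheap closed-form geodesics matter for the algorithm's cost.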

**[03709] The Difference of Convex Algorithm on Riemannian Manifolds**
**Format**: Online Talk on Zoom
**Author(s)**:
- **Ronny Bergmann** (NTNU, Trondheim)
- Orizon Pereira Ferreira (IME/UFG, Goiânia)
- Elisanderson Meneses Santos (Instituto Federal de Educação, Ciência e Tecnologia do Maranhão, Barra do Corda)
- João Carlos de Oliveira Souza (Department of Mathematics, Federal University of Piauí, Teresina)

**Abstract**: In this talk we propose a difference of convex algorithm (DCA) on Riemannian manifolds to solve optimisation problems involving a difference of two convex functions. We establish both its relation to the recently introduced Fenchel duality on manifolds and its well-posedness. On Hadamard manifolds, we prove that every cluster point of the sequence generated by the algorithm is a critical point of the problem. Finally, we illustrate that several optimisation problems can be written as differences of convex (DC) functions on manifolds, and that some Euclidean problems involving differences of non-convex functions become DC problems when rephrased on a manifold. Numerical examples illustrate that such a rephrasing is numerically beneficial even for problems that are already DC.
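For orientation, the classical Euclidean DCA that the talk generalizes linearizes the concave part at the current iterate and minimizes the convex surrogate: for f = g - h, the next iterate solves g'(x) = h'(x_k). A one-dimensional sketch on the double-well f(x) = x^4/4 - x^2/2 (an example chosen here for illustration, not taken from the talk):

```python
import numpy as np

def dca_1d(grad_g_inv, grad_h, x0, iters=100):
    """Difference-of-convex algorithm for f = g - h in one variable:
    each step linearizes h at x_k and minimizes the convex surrogate,
    i.e. solves g'(x) = h'(x_k)."""
    x = x0
    for _ in range(iters):
        x = grad_g_inv(grad_h(x))
    return x

# Double well f(x) = x**4/4 - x**2/2 with g(x) = x**4/4 and h(x) = x**2/2,
# so g'(x) = x**3 (inverse: cube root) and h'(x) = x.
x_star = dca_1d(grad_g_inv=np.cbrt, grad_h=lambda x: x, x0=8.0)
```

Starting from x0 = 8.0 the iterates contract to the critical point x = 1, one of the two minimizers of the double well; in the Riemannian setting, the linearization and the surrogate minimization are replaced by their manifold counterparts.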

**[02950] Symplectic model order reduction via Riemannian optimization**
**Format**: Talk at Waseda University
**Author(s)**:
- **Bin Gao** (Academy of Mathematics and Systems Science, Chinese Academy of Sciences)

**Abstract**: Numerous problems in optics, quantum physics, stability analysis, and control of dynamical systems can be cast as an optimization problem with a matrix variable subject to a symplecticity constraint. As this constraint forms the so-called symplectic Stiefel manifold, Riemannian optimization is preferred, because one can borrow ideas from unconstrained optimization methods once the necessary geometric tools are in place. The retraction is arguably the most important of these tools, as it decides how iterates are updated given a search direction. Two retractions have been constructed so far: one relies on the Cayley transform and the other is designed using quasi-geodesic curves. In this talk, we propose a new retraction based on an SR matrix decomposition. We prove that its domain contains the open unit ball, which is essential in proving the global convergence of the associated gradient-based optimization algorithm. Moreover, we consider the symplectic model order reduction of Hamiltonian systems with various examples. Extensive numerical comparisons reveal the strengths of the proposed optimization algorithm.

**[05465] Hirotugu Akaike's Analysis of Gradient Descent: 70 years later**
**Format**: Talk at Waseda University
**Author(s)**:
- **Pok Yin Thomas Yu** (Drexel University)

**Abstract**: It is very well known that when the exact line search gradient descent method is applied to a convex quadratic objective, the worst-case rate of convergence (ROC) among all seed vectors deteriorates as the condition number of the Hessian of the objective grows. Owing to an elegant analysis by H. Akaike in 1959, it is generally believed -- but not proved -- that in the ill-conditioned regime the ROC for almost all initial vectors, and hence also the average ROC, is close to the worst-case ROC. We complete Akaike's analysis using the theorem of center and stable manifolds. Our analysis also makes apparent the effect of an intermediate eigenvalue in the Hessian by establishing the following somewhat amusing result: in the absence of an intermediate eigenvalue, the average ROC gets arbitrarily fast -- not slow -- as the Hessian gets increasingly ill-conditioned. We discuss in passing some contemporary applications of exact line search GD to polynomial optimization problems arising from imaging and data sciences and, if time allows, formulate an open problem related to accelerated GD methods.
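The quadratic setting of the talk is easy to reproduce numerically: for f(x) = 0.5 x^T A x - b^T x with symmetric positive definite A, the exact line-search step along the negative gradient g has the closed form alpha = (g^T g)/(g^T A g). A short illustrative sketch:

```python
import numpy as np

def exact_ls_gd(A, b, x0, iters=200):
    """Gradient descent with exact line search on the convex quadratic
    f(x) = 0.5 x^T A x - b^T x, with A symmetric positive definite."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        g = A @ x - b                       # gradient of f at x
        if np.linalg.norm(g) < 1e-14:
            break
        alpha = (g @ g) / (g @ (A @ g))     # exact minimizer of f along -g
        x = x - alpha * g
    return x
```

With condition number kappa, the classical worst-case bound contracts the objective gap by ((kappa - 1)/(kappa + 1))^2 per step; Akaike's analysis concerns how close the rate for a typical seed vector comes to this worst case.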
