Abstract : The minisymposium aims to present a few recent developments in large-scale eigenvalue computations and optimization, as well as to investigate the intimate connection between them. Of particular interest are not only standard and generalized eigenvalue problems but also nonlinear eigenvalue problems, multiparameter eigenvalue problems, singular value decompositions, and their applications, such as those in data science and control theory. Orthogonal transformations and projections onto proper subspaces play vital roles in computing and optimizing eigenvalues numerically in the large-scale setting. The minisymposium focuses on the use of such tools in modern algorithms for large-scale eigenvalue computations, optimization, and applications.
[03074] Consistent Estimation Using SVD for a Linear Regression Model
Format : Talk at Waseda University
Author(s) :
Kensuke Aishima (Hosei University)
Abstract : In this talk, we consider parameter estimation of an errors-in-variables linear regression model. The standard approach to such parameter estimation is to formulate an optimization problem and solve it numerically using the singular value decomposition (SVD). Using the property that the SVD identifies the image and null spaces of a matrix, together with orthogonal projections onto these subspaces, we derive a consistent estimator for a linear regression model with errors in a subset of the variables.
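The talk's consistent estimator is not spelled out in the abstract; as background for the standard approach it builds on, the following is a minimal sketch of the classical SVD-based total least squares estimator for an errors-in-variables model, where the solution direction is read off from the null space of the augmented matrix $[A\ b]$:

```python
import numpy as np

def tls(A, b):
    """Classical total least squares via the SVD.

    Solves min ||[dA db]||_F subject to (A + dA) x = b + db by taking
    the right singular vector for the smallest singular value of the
    augmented matrix [A b], which spans its (approximate) null space.
    """
    Z = np.column_stack([A, b])
    _, _, Vt = np.linalg.svd(Z)
    v = Vt[-1]                   # null-space direction of [A b]
    return -v[:-1] / v[-1]       # scale so the last entry equals -1

# Synthetic data with errors in both A and b
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 2))
x_true = np.array([1.0, -2.0])
b = A @ x_true
A_noisy = A + 0.01 * rng.standard_normal(A.shape)
b_noisy = b + 0.01 * rng.standard_normal(b.shape)
x_tls = tls(A_noisy, b_noisy)
```

This is the generic textbook estimator, not the talk's refinement for errors confined to a subset of the variables.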
[04078] Fast optimization of eigenvalues for frequency-based damping of second-order systems
Format : Online Talk on Zoom
Author(s) :
Nevena Jakovčević Stor (University of Split)
Tim Mitchell (Queens College / CUNY)
Zoran Tomljanović (University of Osijek)
Matea Ugrica (Max Planck Institute for Dynamics of Complex Technical Systems)
Abstract : We consider optimizing eigenvalues of certain parametric second-order systems that model vibrating mechanical systems, where the goal is to achieve frequency-weighted damping by moving eigenvalues away from undesirable areas on the imaginary axis. We present two new complementary approaches for this task. First, we propose determining damper viscosities via solving new nonsmooth constrained optimization problems. Second, we also propose a fast new eigensolver for the structured quadratic eigenvalue problems that appear in such vibrating systems.
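The abstract's fast structured eigensolver is not described in detail; for contrast, a generic way to obtain all eigenvalues of a quadratic eigenvalue problem $(\lambda^2 M + \lambda C + K)x = 0$, as arises for such vibrating systems, is a first companion linearization (sketch only, assuming $M$ invertible):

```python
import numpy as np

def qep_eigenvalues(M, C, K):
    """All eigenvalues of the quadratic pencil (lam^2 M + lam C + K) x = 0
    via a first companion linearization; a generic dense approach, not
    the structure-exploiting solver from the talk."""
    n = M.shape[0]
    Minv = np.linalg.inv(M)
    A = np.block([[np.zeros((n, n)), np.eye(n)],
                  [-Minv @ K, -Minv @ C]])   # companion matrix
    return np.linalg.eigvals(A)

# Lightly damped 2-DOF system: eigenvalues lie in the open left half-plane
M = np.eye(2)
C = 0.1 * np.eye(2)
K = np.array([[2.0, -1.0], [-1.0, 2.0]])
lams = qep_eigenvalues(M, C, K)
```

For large systems this doubles the dimension and destroys structure, which is precisely what motivates structured solvers like the one in the talk.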
Abstract : In a rectangular multiparameter eigenvalue problem we have a $k$-variate, $k\ge 2$, polynomial pencil of rectangular matrices $W(\lambda)\in{\mathbb C}^{(n+k-1)\times n}$, and $\lambda_0\in{\mathbb C}^k$ is an eigenvalue if ${\rm rank}(W(\lambda_0))<n$, i.e., if $W(\lambda_0)x=0$ for some nonzero $x\in{\mathbb C}^n$.
[04501] Simultaneous diagonalization and new bounds on shared invariant subspaces
Format : Talk at Waseda University
Author(s) :
Brian Sutton (Randolph-Macon College)
Abstract : Commuting Hermitian matrices may be simultaneously diagonalized by a common eigenvector matrix. However, the numerical aspects are delicate, and existing Jacobi-like algorithms have a prohibitively large operation count on large matrices. We derive new error bounds on shared invariant subspaces and use them to develop a new simultaneous diagonalization algorithm with a running time that is a small multiple of a single eigenvalue-eigenvector computation.
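The talk's new algorithm and error bounds are not reproduced here; as a point of reference, a standard generic-case trick for jointly diagonalizing two commuting Hermitian matrices is to diagonalize a random linear combination of them (a sketch only, which succeeds when the combined eigenvalues are distinct):

```python
import numpy as np

def simultaneous_diagonalize(A, B, seed=None):
    """Common eigenvector matrix of two commuting Hermitian matrices,
    obtained by diagonalizing a random real linear combination.
    A generic-case baseline, not the algorithm from the talk."""
    rng = np.random.default_rng(seed)
    t = rng.standard_normal(2)
    _, Q = np.linalg.eigh(t[0] * A + t[1] * B)
    return Q  # columns are (generically) shared eigenvectors

# Build commuting symmetric matrices from a shared eigenvector basis
rng = np.random.default_rng(1)
Q0, _ = np.linalg.qr(rng.standard_normal((4, 4)))
A = Q0 @ np.diag([1.0, 2.0, 3.0, 4.0]) @ Q0.T
B = Q0 @ np.diag([-1.0, 0.5, 2.0, 7.0]) @ Q0.T
Q = simultaneous_diagonalize(A, B, seed=2)
```

This baseline can lose accuracy when combined eigenvalues nearly coincide, which is the delicate regime that motivates error bounds on shared invariant subspaces.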
[03188] Estimation of the dominant poles of a large-scale descriptor system
Format : Talk at Waseda University
Author(s) :
Emre Mengi (Koc University)
Abstract : The dominant poles of the transfer function of a descriptor system are those poles that can cause large frequency response. They can be used to form reduced-order approximations to the system. We describe a subspace framework to estimate the dominant poles of a large-scale descriptor system based on Petrov-Galerkin projections. The projection subspaces are expanded gradually by means of the dominant poles of the projected systems. We argue formally that the framework converges quadratically.
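The subspace framework itself targets the large-scale case; for a small dense system, the dominance measure it is built around can be sketched directly. The following ranks the poles of $H(s)=c^T(sE-A)^{-1}b$ by the common criterion $|R_i|/|\mathrm{Re}\,\lambda_i|$, where $R_i$ is the residue at pole $\lambda_i$ (an illustrative sketch, not the talk's Petrov-Galerkin method):

```python
import numpy as np
from scipy.linalg import eig

def dominant_poles(A, E, b, c):
    """Rank poles of H(s) = c^T (sE - A)^{-1} b by |R_i| / |Re(lambda_i)|,
    with R_i the residue at pole lambda_i. Dense baseline only; the
    talk's subspace framework for large-scale systems is not shown."""
    lam, VL, VR = eig(A, E, left=True, right=True)
    # Residue at pole i: (c^T x_i)(y_i^H b) / (y_i^H E x_i)
    res = (c @ VR) * (VL.conj().T @ b) \
        / np.einsum('ij,jk,ki->i', VL.conj().T, E, VR)
    dominance = np.abs(res) / np.abs(lam.real)
    order = np.argsort(-dominance)
    return lam[order], dominance[order]

# Three stable poles; the one closest to the imaginary axis dominates
A = np.diag([-0.1, -1.0, -5.0])
E = np.eye(3)
b = np.ones(3)
c = np.ones(3)
poles, dom = dominant_poles(A, E, b, c)
```

The ratio-of-residue-to-real-part measure is the standard notion of dominance for modal truncation; the dominant poles found this way yield the reduced-order approximations the abstract mentions.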
[03212] Subspace Methods for Nonlinear Eigenvalue Problems
Format : Talk at Waseda University
Author(s) :
Rifqi Aziz (Koc University)
Emre Mengi (Koc University)
Matthias Voigt (UniDistance Suisse)
Abstract : We will discuss numerical methods for nonlinear eigenvalue problems that are described by matrices of large dimension. We project the large matrices within an interpolatory framework in order to obtain a reduced nonlinear eigenvalue problem that can be solved more efficiently. Based on the eigenpair residuals, new interpolation points and corresponding projection matrices can be computed in order to obtain a few eigenvalues close to a desired target point.
[05296] Optimizing orthogonality in large-scale tensor networks
Format : Talk at Waseda University
Author(s) :
Roel Van Beeumen (Lawrence Berkeley National Laboratory)
Abstract : Orthogonality plays a key role in eigenvalue computations. In 1D tensor networks such as tensor trains, the orthogonality is maintained by using QR or truncated SVD factorizations. However, this technique does not extend to 2D tensor networks such as projected entangled pair states (PEPS). Moreover, orthogonality inside a PEPS keeps the computational complexity of eigenvalue evaluations bounded. We will discuss and compare several approximate orthogonalization techniques and strategies for orthogonalizing PEPS columns and rows.
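The 1D technique the abstract starts from, maintaining orthogonality in a tensor train via QR factorizations, can be sketched as a left-to-right sweep in which each core is QR-factored and the triangular factor is absorbed into the next core (the 2D PEPS case, where this fails, is not shown):

```python
import numpy as np

def tt_left_orthogonalize(cores):
    """Left-orthogonalize a tensor train by sweeping QR factorizations
    through its cores; the represented tensor is unchanged. This is the
    standard 1D technique, not the PEPS strategies from the talk."""
    cores = [c.copy() for c in cores]
    for i in range(len(cores) - 1):
        r1, n, r2 = cores[i].shape
        Q, R = np.linalg.qr(cores[i].reshape(r1 * n, r2))
        cores[i] = Q.reshape(r1, n, Q.shape[1])        # orthonormal core
        cores[i + 1] = np.einsum('ij,jkl->ikl', R, cores[i + 1])
    return cores

# Random 3-core tensor train with ranks (1, 2, 2, 1)
rng = np.random.default_rng(0)
cores = [rng.standard_normal(s) for s in [(1, 3, 2), (2, 3, 2), (2, 3, 1)]]
ortho = tt_left_orthogonalize(cores)
```

After the sweep, each swept core has orthonormal columns when reshaped, which is exactly the property that has no exact analogue for PEPS and must be replaced by approximate orthogonalization.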
[05462] Linearizability of eigenvector nonlinearities
Format : Online Talk on Zoom
Author(s) :
Elias Jarlebring (KTH Royal Institute of Technology)
Abstract : We present a method to linearize, without approximation, a specific class of eigenvalue problems with eigenvector nonlinearities (NEPv), where the nonlinearities are expressed by scalar functions that are defined by a quotient of linear functions of the eigenvector. The exact linearization relies on an equivalent multiparameter problem (MEP) that contains the exact solutions of the NEPv. Based on the linearization we propose numerical schemes that exploit the structure of the linearization.