Registered Data

[00949] Optimal and Efficient Algorithms for Inverse Problems

  • Session Time & Room :
    • 00949 (1/2) : 5C (Aug.25, 13:20-15:00) @E505
    • 00949 (2/2) : 5D (Aug.25, 15:30-17:10) @E505
  • Type : Proposal of Minisymposium
  • Abstract : This minisymposium aims to bring together researchers to share their recent progress and to inspire new ideas in the solution of inverse problems and their applications. Talks will address modeling as well as theoretical and computational aspects of numerical methods for solving inverse problems.
  • Organizer(s) : Malena Espanol, Rosemary Renaut
  • Classification : 65F22, 65R32
  • Minisymposium Program :
    • 00949 (1/2) : 5C @E505 [Chair: Malena Espanol]
      • [05576] Geometric Scattering on Measure Spaces
        • Format : Online Talk on Zoom
        • Author(s) :
          • Michael Perlmutter (Boise State University)
        • Abstract : Geometric Deep Learning is an emerging field of research that aims to extend the success of machine learning and, in particular, convolutional neural networks, to data with non-Euclidean geometric structure such as graphs and manifolds. Despite being in its relative infancy, this field has already found great success and is utilized by, e.g., Google Maps and Amazon’s recommender systems. In order to improve our understanding of the networks used in this new field, several works have proposed novel versions of the scattering transform, a wavelet-based model of neural networks for graphs, manifolds, and more general measure spaces. In a similar spirit to the original scattering transform, which was designed for Euclidean data such as images, these geometric scattering transforms provide a mathematically rigorous framework for understanding the stability and invariance of the networks used in geometric deep learning. They also have many interesting applications, such as the analysis of single-cell data.
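A minimal toy sketch of a first-order graph scattering transform may help make the abstract concrete. The wavelet construction below (differences of dyadic powers of a lazy random-walk matrix, with an absolute-value aggregation) is one common choice in the geometric scattering literature, not necessarily the exact construction from this talk:

```python
# Toy first-order graph scattering transform (pure Python, tiny graph).
# Wavelets: psi_j = P^(2^j) - P^(2^(j+1)), with P a lazy random walk.

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def matvec(A, x):
    return [sum(A[i][k] * x[k] for k in range(len(x))) for i in range(len(A))]

def lazy_walk(adj):
    """P = (I + D^{-1} A) / 2 for an undirected graph given by adjacency lists."""
    n = len(adj)
    return [[0.5 * ((1.0 if i == j else 0.0) + adj[i][j] / sum(adj[i]))
             for j in range(n)] for i in range(n)]

def scattering_features(adj, x, scales=3):
    """First-order features s_j = || psi_j x ||_1, permutation-invariant."""
    P = lazy_walk(adj)
    powers = [P]                       # P^(2^0), P^(2^1), P^(2^2), ...
    for _ in range(scales):
        powers.append(matmul(powers[-1], powers[-1]))
    feats = []
    for j in range(scales):
        lo, hi = matvec(powers[j], x), matvec(powers[j + 1], x)
        feats.append(sum(abs(a - b) for a, b in zip(lo, hi)))
    return feats

# Path graph on 3 nodes; a signal concentrated on one end node.
adj = [[0, 1, 0], [1, 0, 1], [0, 1, 0]]
print(scattering_features(adj, [1.0, 0.0, 0.0]))
```

By symmetry of the path graph, the same signal placed on the opposite end node yields identical features, illustrating the invariance these transforms are designed to provide.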
      • [02094] Variable Projection Methods for Solving Separable Nonlinear Inverse Problems
        • Format : Talk at Waseda University
        • Author(s) :
          • Malena Espanol (Arizona State University)
        • Abstract : Variable projection methods are among the classical and efficient methods to solve separable nonlinear least squares problems. In this talk, I will introduce the variable projection method and its use to solve large-scale blind deconvolution problems.
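The key idea of variable projection, eliminating the linear parameters in closed form so the outer optimization runs over the nonlinear parameters alone, can be sketched on a hypothetical one-exponential model y ≈ a·exp(-α·t) (the model and grid search here are illustrative choices, not the talk's blind-deconvolution setting):

```python
# Minimal variable-projection sketch for y ~ a * exp(-alpha * t):
# for each candidate alpha, the linear coefficient a has a closed form,
# so only the scalar alpha is optimized (here by brute-force grid search).
import math

def reduced_residual(alpha, t, y):
    """Project out the linear parameter; return residual norm and the best a."""
    phi = [math.exp(-alpha * ti) for ti in t]
    a = sum(p * yi for p, yi in zip(phi, y)) / sum(p * p for p in phi)
    r = math.sqrt(sum((yi - a * p) ** 2 for p, yi in zip(phi, y)))
    return r, a

def varpro_grid(t, y, alphas):
    """Outer minimization over the nonlinear parameter alpha."""
    return min((reduced_residual(al, t, y) + (al,) for al in alphas),
               key=lambda triple: triple[0])

# Synthetic, noise-free data generated with a = 2, alpha = 0.5.
t = [0.1 * i for i in range(20)]
y = [2.0 * math.exp(-0.5 * ti) for ti in t]
res, a_hat, alpha_hat = varpro_grid(t, y, [0.01 * k for k in range(1, 200)])
print(a_hat, alpha_hat, res)
```

In realistic separable problems the inner solve is a regularized linear least-squares problem and the outer step uses a Gauss-Newton-type iteration rather than a grid, but the projection structure is the same.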
      • [03302] Doubly Noisy Kaczmarz
        • Format : Talk at Waseda University
        • Author(s) :
          • Anna Ma (UC Irvine)
          • El Houcine Bergou (Mohamed VI Polytechnic University (UM6P))
          • Aritra Dutta (University of Southern Denmark)
          • Soumia Boucherouite (Mohamed VI Polytechnic University (UM6P))
          • Xin Li (University of Central Florida)
        • Abstract : Large-scale linear systems, Ax=b, frequently arise in inverse problems. Often, these systems are noisy due to operational errors or faulty data-collection processes. In the past decade, the randomized Kaczmarz algorithm (RK) was studied extensively as an efficient iterative solver for such systems. However, the convergence study of RK in the noisy regime is limited and considers measurement noise in the right-hand side vector, b. Unfortunately, in practice, that is not always the case; the coefficient matrix A can also be noisy. In this talk, we motivate and discuss the application of RK to doubly noisy linear systems, i.e., linear systems with noise in both the measurements and the measurement matrix. The presented work is a joint collaboration with El Houcine Bergou, Soumia Boucherouite, Aritra Dutta, and Xin Li.
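For readers unfamiliar with the baseline method, here is a minimal randomized Kaczmarz iteration on a small, noiseless, consistent system (the doubly noisy analysis in the talk concerns what happens when both A and b are perturbed; this sketch only shows the standard noiseless case):

```python
# Minimal randomized Kaczmarz (RK) sketch: rows are sampled with probability
# proportional to their squared norms, and x is projected onto each sampled
# row's hyperplane {x : a_i . x = b_i}.
import random

def randomized_kaczmarz(A, b, iters=2000, seed=0):
    rng = random.Random(seed)
    x = [0.0] * len(A[0])
    norms = [sum(v * v for v in row) for row in A]
    total = sum(norms)
    for _ in range(iters):
        # Sample row i with probability ||a_i||^2 / ||A||_F^2.
        u, i, acc = rng.random() * total, 0, norms[0]
        while acc < u:
            i += 1
            acc += norms[i]
        ai = A[i]
        r = (b[i] - sum(av * xv for av, xv in zip(ai, x))) / norms[i]
        x = [xv + r * av for xv, av in zip(x, ai)]
    return x

A = [[2.0, 1.0], [1.0, 3.0], [1.0, -1.0]]
x_true = [1.0, -2.0]
b = [sum(av * xv for av, xv in zip(row, x_true)) for row in A]
print(randomized_kaczmarz(A, b))  # converges to x_true on noiseless data
```

With noise in b (or in A), the same iteration no longer converges to x_true but instead hovers within a "convergence horizon" of it, which is the regime the talk studies.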
    • 00949 (2/2) : 5D @E505 [Chair: Malena Espanol]
      • [03055] Conditional sampling via block-triangular transport maps
        • Format : Talk at Waseda University
        • Author(s) :
          • Ricardo Baptista (California Institute of Technology)
          • Nikola Kovachki (NVIDIA)
          • Bamdad Hosseini (University of Washington)
          • Youssef Marzouk (MIT)
        • Abstract : We present an optimal transport framework for conditional sampling of probability measures. Conditional sampling is a fundamental task in Bayesian inverse problems and generative modeling. Optimal transport provides a flexible methodology to sample target distributions appearing in these problems by constructing a deterministic coupling that maps samples from a reference distribution (e.g., a standard Gaussian) to the desired target. To extend these tools for conditional sampling, we first develop the theoretical foundations of block triangular transport in a Banach space setting by drawing connections between monotone triangular maps and optimal transport. To learn these block triangular maps, we will then present a computational approach, called monotone generative adversarial networks (MGANs). Our algorithm uses only samples from the underlying joint probability measure and is hence likelihood-free, making it applicable to inverse problems where likelihood evaluations are inaccessible or computationally prohibitive. We will demonstrate the accuracy of MGAN for sampling the posterior distribution in Bayesian inverse problems involving ordinary and partial differential equations, and probabilistic image inpainting.
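The mechanism by which a triangular map enables conditional sampling can be seen on a toy 2-D Gaussian target, where the map is known in closed form (the talk's setting is Banach-space block triangular maps learned by MGANs; this is only the elementary finite-dimensional analogue):

```python
# Toy Knothe-Rosenblatt (triangular) map pushing N(0, I) to a correlated
# 2-D Gaussian. Freezing the first input and varying only the second samples
# the conditional distribution x2 | x1 -- the idea behind conditional
# sampling with (block-)triangular transport maps.
import math
import random

def triangular_map(z1, z2, rho=0.8):
    """Lower-triangular map: x1 depends on z1 only, x2 on (z1, z2)."""
    x1 = z1
    x2 = rho * z1 + math.sqrt(1.0 - rho * rho) * z2
    return x1, x2

def conditional_samples(x1_obs, n, rho=0.8, seed=0):
    """Fix z1 so the first output equals x1_obs; vary only z2."""
    rng = random.Random(seed)
    return [triangular_map(x1_obs, rng.gauss(0.0, 1.0), rho)[1]
            for _ in range(n)]

samples = conditional_samples(1.0, 50000)
mean = sum(samples) / len(samples)
print(mean)  # conditional mean of x2 | x1 = 1 is rho * x1 = 0.8
```

The same principle scales up: once a (block-)triangular map to the joint distribution is available, conditioning reduces to fixing one block of inputs, with no likelihood evaluations required.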
      • [02631] Efficient importance sampling for Bayesian inverse problems using tensor-trains
        • Format : Talk at Waseda University
        • Author(s) :
          • Tiangang Cui (Monash University)
          • Sergey Dolgov (University of Bath)
          • Robert Scheichl (Heidelberg University)
        • Abstract : We propose an efficient importance sampling method for rare events in high-dimensional problems, approximating the optimal importance distribution in a scalable way as the pushforward of a reference distribution under a composition of order-preserving transformations based on tensor-train decompositions. By designing a ratio estimator that computes the normalizing constant using a separate importance distribution, the method also applies to Bayesian inverse problems. Its efficiency and robustness are demonstrated numerically on high-dimensional problems constrained by differential equations.
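The basic importance-sampling mechanism for rare events, which the tensor-train machinery in this talk makes scalable, can be illustrated in one dimension with a hand-picked importance distribution (the shifted-Gaussian proposal below is an illustrative choice, not the talk's construction):

```python
# Minimal importance-sampling sketch for the rare event P(X > 4), X ~ N(0,1).
# Sampling from the shifted proposal N(4, 1) and reweighting by the density
# ratio gives a low-variance estimate; naive Monte Carlo would almost never
# see the event.
import math
import random

def is_tail_prob(threshold=4.0, n=100000, seed=0):
    rng = random.Random(seed)
    est = 0.0
    for _ in range(n):
        x = rng.gauss(threshold, 1.0)  # draw from the importance distribution
        # density ratio N(0,1)/N(t,1) evaluated at x: exp(-t*x + t^2/2)
        w = math.exp(-threshold * x + 0.5 * threshold * threshold)
        if x > threshold:
            est += w
    return est / n

true_p = 0.5 * math.erfc(4.0 / math.sqrt(2.0))  # exact Gaussian tail
print(is_tail_prob(), true_p)
```

In high dimensions the difficulty is constructing a good importance distribution at all; the talk's approach builds it as a tensor-train-based transport pushforward rather than a simple mean shift.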
      • [02317] On structured linear measurements for tensor data recovery
        • Format : Talk at Waseda University
        • Author(s) :
          • Elizaveta Rebrova (Princeton University)
        • Abstract : Data-oblivious measurements present an important branch of low-rank data compression and recovery techniques, frequently used in streaming settings and within iterative algorithms. Typically, linear data-oblivious measurements involve some version of a random sketch that preserves the geometric properties of the data. When data is tensorial, a special challenge is to create a sketch with a structure that reflects the tensor structure: this way, it can work similarly to a dense random sketch matrix but requires much less memory to store and can be applied more efficiently. I will talk about recently proposed streaming sketch-based approaches for computing low-rank Tucker approximations of large tensors, both ours and others' (including those of Tropp, Udell et al. and De Lathauwer et al.). I will discuss our new generalized theoretical guarantees for proving their accuracy on full-rank and noisy data with high probability, for a wide range of measurements.
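The memory and efficiency advantage of structured measurements can be seen already for a 2-way tensor (a matrix): a Kronecker-structured sketch applied mode-by-mode matches the corresponding dense sketch applied to the vectorized data, while only the small factors need to be stored (this equivalence is standard linear algebra, used here only to illustrate the structured-sketch idea, not any specific construction from the talk):

```python
# Kronecker-structured sketching sketch: applying S1 and S2 along the two
# modes of X equals applying the dense sketch kron(S1, S2) to vec(X)
# (row-major vec), but only S1 (2x3) and S2 (2x4) are ever stored.
import random

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(col) for col in zip(*A)]

def kron(A, B):
    return [[A[i][j] * B[p][q]
             for j in range(len(A[0])) for q in range(len(B[0]))]
            for i in range(len(A)) for p in range(len(B))]

def vec(A):  # row-major vectorization
    return [v for row in A for v in row]

rng = random.Random(0)
rand = lambda r, c: [[rng.gauss(0, 1) for _ in range(c)] for _ in range(r)]
X, S1, S2 = rand(3, 4), rand(2, 3), rand(2, 4)

structured = vec(matmul(S1, matmul(X, transpose(S2))))  # mode-wise sketch
dense = [sum(g * x for g, x in zip(row, vec(X))) for row in kron(S1, S2)]
print(max(abs(a - b) for a, b in zip(structured, dense)))  # ~ 0
```

For a d-way tensor the storage gap grows multiplicatively: d small factor matrices replace one dense sketch whose width is the product of all mode sizes.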