Abstract : Mean field games (MFG) study the behavior of individual players in large populations, where each player controls their own state while taking the collective behavior of the population into account in their decision-making. Specific MFG models can be formulated as generalized measure-transportation problems, exemplifying one of the tight connections to optimal transport (OT). This mini-symposium highlights the close relationship between MFG and OT, advancing research directions in modeling and numerical algorithms, and expanding the range of applications. Particular emphasis will be placed on new applications in data science, such as point-cloud analysis on networks, and in biology, such as single-cell data integration.
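To make the MFG-OT connection concrete, here is a standard sketch in generic notation (not tied to any particular talk below): the dynamic (Benamou-Brenier) formulation of optimal transport,
\[
W_2^2(\rho_0,\rho_1) \;=\; \min_{\rho,\,v}\; \int_0^1\!\!\int |v(x,t)|^2\,\rho(x,t)\,dx\,dt
\quad\text{s.t.}\quad \partial_t\rho + \nabla\!\cdot(\rho v) = 0,\;\; \rho(\cdot,0)=\rho_0,\;\; \rho(\cdot,1)=\rho_1,
\]
and a variational MFG, which keeps the same continuity-equation constraint but adds running and terminal interaction costs,
\[
\min_{\rho,\,v}\; \int_0^1\!\!\int L(x,v)\,\rho\,dx\,dt \;+\; \int_0^1 \mathcal{F}(\rho(\cdot,t))\,dt \;+\; \mathcal{G}(\rho(\cdot,1)),
\]
so that dynamic OT is recovered as the special case with quadratic running cost, no interaction, and a hard terminal constraint.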
[05205] Manifold Interpolating Optimal-Transport Flows for Trajectory Inference
Format : Online Talk on Zoom
Author(s) :
Smita Krishnaswamy (Yale University)
Guillaume Huguet (University of Montreal)
Alexander Tong (University of Montreal)
Oluwadamilola Fasina (Yale University)
Daniel Sumner Magruder (Yale University)
Manik Kuchroo (Yale University)
Guy Wolf (University of Montreal)
Abstract : We present a method called Manifold Interpolating Optimal-Transport Flow (MIOFlow) that learns stochastic, continuous population dynamics from static snapshot samples taken at sporadic timepoints. MIOFlow combines dynamic models, manifold learning, and optimal transport by training neural ordinary differential equations (Neural ODEs) to interpolate between static population snapshots, penalized by an optimal transport loss with a manifold ground distance. Further, we ensure that the flow follows the geometry by operating in the latent space of an autoencoder that we call a geodesic autoencoder (GAE). In the GAE, the latent-space distance between points is regularized to match a novel multiscale geodesic distance that we define on the data manifold. We show that, for interpolating between populations, this method outperforms normalizing flows, Schrödinger bridges, and other generative models designed to flow from noise to data. Theoretically, we link these trajectories with dynamic optimal transport. We evaluate our method on simulated data with bifurcations and merges, as well as on scRNA-seq data from embryoid body differentiation and acute myeloid leukemia treatment.
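As an illustration of one ingredient of such pipelines, the sketch below (plain NumPy, not the authors' MIOFlow code; function and variable names are illustrative assumptions) computes an entropic OT cost between a model-predicted point cloud and an observed snapshot, the kind of penalty used to match a learned flow to static population data.

    import numpy as np

    def sinkhorn_ot_cost(X_pred, X_obs, reg=0.05, n_iter=300):
        """Entropic OT cost between two point clouds with uniform weights."""
        n, m = len(X_pred), len(X_obs)
        a, b = np.full(n, 1.0 / n), np.full(m, 1.0 / m)
        # squared Euclidean ground cost; a manifold/geodesic cost could be substituted
        C = ((X_pred[:, None, :] - X_obs[None, :, :]) ** 2).sum(-1)
        K = np.exp(-C / (reg * C.max()))      # cost scaled to avoid underflow
        u = np.ones(n)
        for _ in range(n_iter):               # Sinkhorn fixed-point iterations
            v = b / (K.T @ u)
            u = a / (K @ v)
        P = u[:, None] * K * v[None, :]       # entropic transport plan
        return float((P * C).sum())

    # toy usage: two 2-D Gaussian snapshots
    rng = np.random.default_rng(0)
    X0 = rng.normal(0.0, 1.0, size=(100, 2))
    X1 = rng.normal(2.0, 1.0, size=(120, 2))
    print(sinkhorn_ot_cost(X0, X1))

In the method described in the abstract, the squared Euclidean cost would be replaced by the manifold ground distance computed in the GAE latent space, and the penalty backpropagated through the Neural ODE.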
[01462] A Distributed Algorithm for Wasserstein Proximal Operator Splitting
Format : Online Talk on Zoom
Author(s) :
Iman Nodozi (University of California Santa Cruz)
Abhishek Halder (University of California Santa Cruz)
Abstract : Many time-stepping algorithms are available to numerically realize Wasserstein proximal updates, which generalize the concept of gradient steps to the manifold of probability measures. This talk will present a distributed algorithm for performing Wasserstein proximal updates. The proposed algorithm generalizes finite-dimensional Euclidean consensus ADMM to the measure-valued Wasserstein setting and to its entropy-regularized version. We will explain how the proposed algorithm differs from the Euclidean case and will provide numerical case studies.
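For reference, the Wasserstein proximal (JKO) update that such time-stepping schemes realize, written here in standard notation rather than the talk's specific splitting:
\[
\rho_{k+1} \;=\; \operatorname{prox}^{W_2}_{\tau F}(\rho_k) \;:=\; \underset{\rho\in\mathcal{P}_2(\mathbb{R}^d)}{\arg\min}\;\frac{1}{2\tau}\,W_2^2(\rho,\rho_k) \;+\; F(\rho).
\]
Replacing probability measures by points recovers the Euclidean proximal step \(x_{k+1}=\arg\min_x \frac{1}{2\tau}\|x-x_k\|^2 + f(x)\); Euclidean consensus ADMM splits an objective \(f=\sum_i f_i\) across agents with a consensus constraint and alternates such proximal steps, which is the structure being lifted to the Wasserstein and entropy-regularized settings.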
[05216] Wasserstein gradient flows and Hamiltonian flows on the generative model
Format : Online Talk on Zoom
Author(s) :
Shu Liu (Department of Mathematics, UCLA)
Wuchen Li (University of South Carolina)
Hao Wu (Georgia Institute of Technology)
Xiaojing Ye (Georgia State University)
Haomin Zhou (Georgia Institute of Technology)
Abstract : In this talk, we introduce a series of sampling-friendly, optimization-free methods for computing high-dimensional gradient flows and Hamiltonian flows on the Wasserstein probability manifold by leveraging generative models from deep learning. Such methods project the corresponding probability flows onto parameter space, yielding finite-dimensional ordinary differential equations (ODEs) that can be solved directly with classical numerical methods. Furthermore, the computed generative models can efficiently generate samples from the probability flows via pushforward maps.
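Schematically (a hedged sketch in generic notation, not necessarily the talk's exact construction), projecting a Wasserstein gradient flow of an energy \(F\) onto the parameters \(\theta\) of a pushforward model \(\rho_\theta = (T_\theta)_\# p\) yields a finite-dimensional ODE
\[
\dot{\theta}(t) \;=\; -\,G(\theta)^{\dagger}\,\nabla_\theta F(\rho_\theta),
\]
where \(G(\theta)\) denotes the pullback of the Wasserstein metric to parameter space; Hamiltonian flows are handled analogously with the corresponding symplectic structure. Samples from \(\rho_{\theta(t)}\) are then drawn by pushing \(z\sim p\) through \(T_{\theta(t)}\).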
[05393] Linear Optimal Transport (LOT) Framework for Graph-Based Semi-Supervised Learning using Point Cloud Data
Format : Talk at Waseda University
Author(s) :
Mary Chriselda Antony Oliver (University of Cambridge)
Michael Roberts (University of Cambridge)
Matthew Thorpe (University of Manchester)
Abstract : In this study, we introduce a novel application of the linear optimal transport (LOT) framework, leveraging the geometrical structure of its linear embeddings. After dimensionality reduction, we incorporate these embeddings, in the form of projections (velocity fields), into graph-based semi-supervised algorithms. Additionally, we compute the geodesic (shortest path between two prominent nodes) for the feature vectors within the graph setting. Finally, we demonstrate the performance through numerical experiments conducted on benchmark 3-D point cloud data.
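A minimal sketch of the LOT embedding step (using the POT package, assumed to be installed; names and the toy data are illustrative): each point cloud is represented by the barycentric projection of its OT plan from a common reference, and Euclidean distances between these embeddings then stand in for Wasserstein distances in downstream graph-based semi-supervised learning.

    import numpy as np
    import ot  # Python Optimal Transport (POT), assumed available

    def lot_embedding(reference, clouds):
        """Embed each point cloud as a weighted velocity field over the reference."""
        n = len(reference)
        a = np.full(n, 1.0 / n)                        # uniform reference weights
        embeddings = []
        for X in clouds:
            b = np.full(len(X), 1.0 / len(X))
            C = ot.dist(reference, X)                  # squared Euclidean cost matrix
            P = ot.emd(a, b, C)                        # exact OT plan
            T = (P @ X) / a[:, None]                   # barycentric projection of the plan
            V = (T - reference) * np.sqrt(a)[:, None]  # velocity field, weighted by reference
            embeddings.append(V.ravel())
        return np.stack(embeddings)

    # toy usage: three shifted Gaussian clouds embedded against one reference
    rng = np.random.default_rng(0)
    ref = rng.normal(size=(50, 2))
    clouds = [rng.normal(loc=s, size=(60, 2)) for s in (0.0, 1.0, 2.0)]
    E = lot_embedding(ref, clouds)
    # the second distance should be roughly twice the first
    print(np.linalg.norm(E[0] - E[1]), np.linalg.norm(E[0] - E[2]))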
[05294] Towards a mathematical theory of development
Format : Talk at Waseda University
Author(s) :
Geoffrey Schiebinger (University of British Columbia)
Abstract : This talk introduces a mathematical theory of developmental biology, based on optimal transport. While, in principle, organisms are made of molecules whose motions are described by the Schrödinger equation, there are simply too many molecules for this to be useful. Optimal transport provides a set of equations that describe development at the level of cells. We propose that this optimal transport hypothesis is a fundamental mathematical principle of developmental biology.
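One hedged example of what such cell-level equations can look like (a generic entropic OT coupling between consecutive snapshots, not necessarily the exact formulation of the talk):
\[
\pi_{t,t+1} \;=\; \underset{\pi\in\Pi(\rho_t,\rho_{t+1})}{\arg\min}\;\int c(x,y)\,d\pi(x,y)\;+\;\varepsilon\,\mathrm{KL}\big(\pi\,\|\,\rho_t\otimes\rho_{t+1}\big),
\]
where \(\rho_t,\rho_{t+1}\) are the distributions of cell states at consecutive time points and the coupling \(\pi_{t,t+1}\) encodes putative ancestor-descendant relationships.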
[03712] Applications of Gromov-Wasserstein Distance to Graph and Hypergraph Analysis
Format : Talk at Waseda University
Author(s) :
Tom Needham (Florida State University)
Abstract : Gromov-Wasserstein distances are metrics, inspired by the usual Wasserstein distances of optimal transport, which are designed to handle comparisons between distributions that lie on different spaces. I will overview some recent applications of these metrics to the analysis of graph and hypergraph datasets.
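For reference, the discrete Gromov-Wasserstein objective compares intra-space cost matrices rather than a shared ground cost:
\[
\mathrm{GW}\big(C^{(1)},p;\,C^{(2)},q\big)\;=\;\min_{\pi\in\Pi(p,q)}\;\sum_{i,k}\sum_{j,l}\big|C^{(1)}_{ik}-C^{(2)}_{jl}\big|^{2}\,\pi_{ij}\,\pi_{kl},
\]
where \(C^{(1)},C^{(2)}\) are, for example, shortest-path or adjacency-derived distance matrices of two graphs and \(p,q\) are node weights; because only the internal geometries enter, the metric applies to graphs and hypergraphs supported on entirely different vertex sets.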
[03382] Single-cell data integration using optimal transport
Format : Talk at Waseda University
Author(s) :
Ritambhara Singh (Brown University)
Pinar Demetci (Brown University)
Rebecca Santorella (Brown University)
Bjorn Sandstede (Brown University)
William Stafford Noble (University of Washington)
Ievgen Redko (Jean Monnet University, Saint-Étienne)
Abstract : Integration of single-cell multi-omic measurements is crucial to understand the underlying biology. However, this is particularly challenging due to the lack of sample-wise or feature-wise correspondence information across single-cell datasets generated from different samples. In this talk, I will present our optimal transport-based integration methods that perform the alignment of different single-cell measurements with minimal supervision. We demonstrate their state-of-the-art performance on simulations and real-world datasets.
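A minimal illustration (plain NumPy, not the presenters' software; names and the toy data are assumptions) of the basic alignment primitive: compute an entropic OT coupling between cells from two measurements and use its barycentric projection to map one dataset into the other's space. Real single-cell integration methods additionally handle disjoint feature spaces, e.g. with Gromov-Wasserstein-style costs.

    import numpy as np

    def align_by_ot(X, Y, reg=0.01, n_iter=500):
        """Project cells X onto the space of cells Y via an entropic OT coupling."""
        n, m = len(X), len(Y)
        a, b = np.full(n, 1.0 / n), np.full(m, 1.0 / m)
        # squared Euclidean cost; assumes X and Y live in a comparable feature space
        C = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
        K = np.exp(-C / (reg * C.max()))              # cost scaled to avoid underflow
        u = np.ones(n)
        for _ in range(n_iter):                       # Sinkhorn iterations
            v = b / (K.T @ u)
            u = a / (K @ v)
        P = u[:, None] * K * v[None, :]               # soft cell-to-cell correspondence
        return (P @ Y) / P.sum(axis=1, keepdims=True)  # barycentric projection

    # toy usage: align a shifted copy of a dataset back onto the original
    rng = np.random.default_rng(1)
    Y = rng.normal(size=(200, 5))
    X = Y[:150] + 3.0                                 # "second measurement", shifted
    X_aligned = align_by_ot(X, Y)
    print(np.abs(X_aligned - Y[:150]).mean())         # well below the applied shift of 3.0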
[03962] A mathematical framework of transfer learning
Format : Online Talk on Zoom
Author(s) :
Haoyang Cao (Ecole Polytechnique)
Abstract : Transfer learning is an emerging and popular paradigm for utilizing existing knowledge from previous learning tasks to improve the performance of new ones. Despite its numerous empirical successes, theoretical analysis of transfer learning remains limited. In this talk we introduce, to the best of our knowledge for the first time, a mathematical framework for the general procedure of transfer learning. Our reformulation of transfer learning as an optimization problem allows its feasibility to be analyzed. Additionally, we propose a novel concept of transfer risk to evaluate the transferability of transfer learning. Finally, we will show how this framework can be applied to both a generic image classification problem and a portfolio optimization problem, demonstrating the potential and benefits of incorporating transfer risk into the evaluation of transfer learning performance.
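One plausible schematic of such an optimization reformulation (stated generically; the talk's precise definitions may differ):
\[
\min_{T_{\mathrm{in}}\in\mathcal{T}_{\mathrm{in}},\;T_{\mathrm{out}}\in\mathcal{T}_{\mathrm{out}}}\;
\mathbb{E}_{(x,y)\sim\mu_{\mathrm{target}}}\,
\ell\big(T_{\mathrm{out}}\circ f_{\mathrm{source}}\circ T_{\mathrm{in}}(x),\,y\big),
\]
where \(f_{\mathrm{source}}\) is the model learned on the source task and the admissible classes \(\mathcal{T}_{\mathrm{in}},\mathcal{T}_{\mathrm{out}}\) encode what may be adapted; feasibility then asks whether this problem is well posed, and, roughly speaking, a transfer risk quantifies how its optimal value compares with learning the target task without the source model.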