Registered Data

[00635] Mean field games and optimal transport with applications in data science and biology

  • Session Date & Time :
    • 00635 (1/2) : 1C (Aug.21, 13:20-15:00)
    • 00635 (2/2) : 1D (Aug.21, 15:30-17:10)
  • Type : Proposal of Minisymposium
  • Abstract : Mean field games (MFG) study the behavior of individual players in large populations, where each player controls their own state while taking the collective behavior of the population into account in their decision-making. Specific MFG models can be formulated as generalized measure transportation problems, exemplifying one of the tight connections to optimal transport (OT); a schematic form of this connection is sketched after the speaker list below. This mini-symposium highlights the close relationship between MFG and OT, advancing research directions in modeling and numerical algorithms, and expanding the fields of application. Particular emphasis will be placed on new applications in data science, such as point-cloud analysis on networks, and in biology, such as single-cell data integration.
  • Organizer(s) : Shiying Li, Wuchen Li, Siting Liu, Caroline Moosmueller
  • Classification : 49N80, 49N90, 68T09, 92-10
  • Speakers Info :
    • Jian-Guo Liu (Duke University)
    • Renyuan Xu (University of Southern California)
    • Abhishek Halder (University of California, Santa Cruz)
    • Shu Liu (UCLA)
    • Smita Krishnaswamy (Yale University)
    • Thomas Needham (Florida State University)
    • Ritambhara Singh (Brown University)
    • Geoffrey Schiebinger (University of British Columbia)
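For orientation, the MFG–OT connection referred to in the abstract above can be seen through the variational formulation of potential MFGs; the sketch below is standard background (with generic running cost F and terminal cost G), not a formulation specific to any of the talks.

```latex
% Potential MFG as a generalized measure transportation problem (schematic):
\min_{\rho,\,v}\;\int_0^1\!\!\int_{\mathbb{R}^d} \tfrac{1}{2}\,\rho(t,x)\,|v(t,x)|^2\,dx\,dt
  \;+\; \int_0^1 F\big(\rho(t,\cdot)\big)\,dt \;+\; G\big(\rho(1,\cdot)\big)
\quad\text{s.t.}\quad \partial_t\rho + \nabla\!\cdot(\rho v) = 0,\qquad \rho(0,\cdot)=\rho_0 .
% With F \equiv 0 and G constraining the terminal marginal to \rho(1,\cdot) = \rho_1,
% this recovers (up to the factor 1/2) the Benamou--Brenier dynamic formulation
% of the squared Wasserstein-2 distance W_2^2(\rho_0, \rho_1).
```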
  • Talks in Minisymposium :
    • [01462] A Distributed Algorithm for Wasserstein Proximal Operator Splitting
      • Author(s) :
        • Iman Nodozi (University of California Santa Cruz)
        • Abhishek Halder (University of California Santa Cruz)
      • Abstract : Many time-stepping algorithms are available to numerically realize Wasserstein proximal updates, which generalize the concept of a gradient step to the manifold of probability measures. This talk will present a distributed algorithm for performing these updates. The proposed algorithm generalizes the finite-dimensional Euclidean consensus ADMM to the measure-valued Wasserstein setting and to its entropy-regularized version. We will explain how the proposed algorithm differs from the Euclidean case and provide numerical case studies.
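As background for the terminology above (a standard form, not the authors' specific consensus-ADMM splitting), the Wasserstein proximal update is usually written as the JKO step:

```latex
% Wasserstein proximal (JKO) step for a functional F with step size h > 0:
\rho_{k+1} \;=\; \operatorname*{arg\,min}_{\rho \in \mathcal{P}_2(\mathbb{R}^n)}\;
  F(\rho) \;+\; \frac{1}{2h}\, W_2^2(\rho, \rho_k).
% The entropy-regularized version referred to in the abstract replaces W_2^2
% by its entropic counterpart W_{2,\varepsilon}^2, which makes each step
% amenable to Sinkhorn-type computations.
```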
    • [03382] Single-cell data integration using optimal transport
      • Author(s) :
        • Ritambhara Singh (Brown University)
        • Pinar Demetci (Brown University)
        • Rebecca Santorella (Brown University)
        • Bjorn Sandstede (Brown University)
        • William Stafford Noble (University of Washington)
        • Ievgen Redko (Jean Monnet University, Saint-Étienne)
        • Quang Huy Tran (Université Bretagne-Sud, CNRS, IRISA, Vannes)
      • Abstract : Integration of single-cell multi-omic measurements is crucial for understanding the underlying biology. However, it is particularly challenging due to the lack of sample-wise or feature-wise correspondence information across single-cell datasets generated from different samples. In this talk, I will present our optimal transport-based integration methods, which align different single-cell measurements with minimal supervision. We demonstrate their state-of-the-art performance on simulations and real-world datasets.
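As an illustration of the kind of correspondence-free alignment described above, here is a minimal sketch assuming the open-source POT library and synthetic data; it uses entropic Gromov–Wasserstein and does not reproduce the authors' specific methods.

```python
# Minimal sketch: align two single-cell-like datasets with no shared features
# via entropic Gromov-Wasserstein from the POT library (pip install pot).
# Synthetic data stands in for real multi-omic measurements.
import numpy as np
import ot

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 20))   # e.g. 100 cells x 20 gene-expression features
Y = rng.normal(size=(120, 15))   # e.g. 120 cells x 15 chromatin-accessibility features

# Intra-domain distance matrices (GW only compares geometry within each dataset).
C1 = ot.dist(X, X)
C2 = ot.dist(Y, Y)
C1 /= C1.max()
C2 /= C2.max()

p = ot.unif(X.shape[0])          # uniform weights over cells in each dataset
q = ot.unif(Y.shape[0])

# Entropic Gromov-Wasserstein coupling: soft correspondences between cells.
coupling = ot.gromov.entropic_gromov_wasserstein(
    C1, C2, p, q, loss_fun="square_loss", epsilon=1e-2
)

# Barycentric projection: express each cell of X in the feature space of Y.
X_aligned = (coupling @ Y) / coupling.sum(axis=1, keepdims=True)
print(X_aligned.shape)           # (100, 15)
```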
    • [03712] Applications of Gromov-Wasserstein Distance to Graph and Hypergraph Analysis
      • Author(s) :
        • Tom Needham (Florida State University)
      • Abstract : Gromov-Wasserstein distances are metrics, inspired by the usual Wasserstein distances of optimal transport, designed to handle comparisons between distributions that lie on different spaces. I will give an overview of some recent applications of these metrics to the analysis of graph and hypergraph datasets.
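For reference, the quadratic Gromov–Wasserstein distance between metric measure spaces can be written as follows (up to normalization conventions); for graphs and hypergraphs, the metrics are typically replaced by shortest-path or other structural dissimilarity matrices.

```latex
% Gromov-Wasserstein distance between (X, d_X, \mu) and (Y, d_Y, \nu):
GW\big((X,d_X,\mu),(Y,d_Y,\nu)\big)
 \;=\; \Big( \min_{\pi \in \Pi(\mu,\nu)} \int_{X\times Y}\!\int_{X\times Y}
   \big| d_X(x,x') - d_Y(y,y') \big|^2 \, d\pi(x,y)\, d\pi(x',y') \Big)^{1/2},
% where \Pi(\mu,\nu) denotes the set of couplings of \mu and \nu.
```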
    • [03962] Multi-agent reinforcement learning for collaborative games: a mean-field perspective
      • Author(s) :
        • Haotian Gu (UC Berkeley)
        • Xin Guo (UC Berkeley)
        • Renyuan Xu (University of Southern California)
        • Xiaoli Wei (Tsinghua-Berkeley Institute)
      • Abstract : Multi-agent reinforcement learning (MARL) has enjoyed substantial successes in many applications, including real-time resource allocation, order matching for ride-hailing, and autonomous driving. Despite this empirical success, general theories behind MARL algorithms are less developed due to the intractability of interactions, complex information structures, and the curse of dimensionality. Instead of directly analyzing the multi-agent system, mean-field theory provides a powerful approach to approximating such games under various notions of equilibria. Moreover, the analytically tractable framework of mean-field theory leads to efficient learning algorithms with theoretical guarantees. In this talk, we will demonstrate how mean-field theory can contribute to analyzing a class of simultaneous-learning-and-decision-making problems under cooperation, with unknown rewards and dynamics. We will also show that the learning procedure can be further decentralized and scaled up if a network structure is specified. Our result lays the first theoretical foundation for the so-called "centralized training and decentralized execution" scheme, which is widely used in empirical work on cooperative MARL.
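As background for the mean-field approximation mentioned above, a generic cooperative mean-field control (mean-field MDP) problem can be written as below; this is a schematic formulation, not the precise setting of the talk.

```latex
% Generic cooperative mean-field control objective with discount factor \gamma:
\max_{\pi}\;\; \mathbb{E}\Big[\, \sum_{t \ge 0} \gamma^{t}\, r\big(s_t, a_t, \mu_t\big) \Big],
\qquad a_t \sim \pi(\cdot \mid s_t, \mu_t),\quad
s_{t+1} \sim P(\cdot \mid s_t, a_t, \mu_t),\quad \mu_t = \mathrm{Law}(s_t),
% where \mu_t is the population (mean-field) state distribution: a single representative
% agent interacting with \mu_t stands in for the N-agent system, and the reward r and
% transition kernel P are unknown and must be learned.
```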
    • [05205] Manifold Interpolating Optimal-Transport Flows for Trajectory Inference
      • Author(s) :
        • Smita Krishnaswamy (Yale University)
        • Guillaume Huguet (University of Montreal)
        • Alexander Tong (University of Montreal)
        • Oluwadamilola Fasina (Yale University)
        • Daniel Sumner Magruder (Yale University)
        • Manik Kuchroo (Yale University)
        • Guy Wolf (University of Montreal)
      • Abstract : We present a method called Manifold Interpolating Optimal-Transport Flow (MIOFlow) that learns stochastic, continuous population dynamics from static snapshot samples taken at sporadic timepoints. MIOFlow combines dynamic models, manifold learning, and optimal transport by training neural ordinary differential equations (Neural ODEs) to interpolate between static population snapshots, penalized by optimal transport with a manifold ground distance. Further, we ensure that the flow follows the geometry by operating in the latent space of an autoencoder that we call a geodesic autoencoder (GAE). In the GAE, the latent-space distance between points is regularized to match a novel multiscale geodesic distance that we define on the data manifold. In terms of interpolating between populations, we show that this method is superior to normalizing flows, Schrödinger bridges, and other generative models designed to flow from noise to data. Theoretically, we link these trajectories with dynamic optimal transport. We evaluate our method on simulated data with bifurcations and merges, as well as scRNA-seq data from embryoid body differentiation and acute myeloid leukemia treatment.
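Schematically (a simplified view, not the exact loss of the paper), the Neural ODE is trained so that pushing one empirical snapshot forward matches the next snapshot under an OT cost whose ground distance comes from the geodesic autoencoder:

```latex
% Schematic MIOFlow-style objective over observed snapshot times t_1 < ... < t_K:
\min_{\theta}\; \sum_{i=1}^{K-1} W_{d_{\mathrm{GAE}}}\!\Big( \big(\Phi^{\theta}_{t_i \to t_{i+1}}\big)_{\#}\,\hat\rho_{t_i},\; \hat\rho_{t_{i+1}} \Big),
% where \Phi^{\theta} is the flow map of the Neural ODE \dot{x} = f_\theta(x, t),
% \hat\rho_{t_i} are the empirical snapshot distributions, and d_{GAE} is the
% geodesic (latent-space) ground distance learned by the geodesic autoencoder.
```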
    • [05216] Wasserstein gradient flows and Hamiltonian flows on the generative model
      • Author(s) :
        • Shu Liu (Math department, UCLA)
        • Wuchen Li (University of South Carolina)
        • Hao Wu (Georgia Institute of Technology)
        • Xiaojing Ye (Georgia State University)
        • Haomin Zhou (Georgia Institute of Technology)
      • Abstract : In this talk, we introduce a series of sampling-friendly, optimization-free methods for computing high-dimensional gradient flows and Hamiltonian flows on the Wasserstein probability manifold by leveraging generative models from deep learning. These methods project the corresponding probability flows onto the parameter space, yielding finite-dimensional ordinary differential equations (ODEs) that can be solved directly with classical numerical methods. Furthermore, the computed generative models can efficiently generate samples from the probability flows via pushforward maps.
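One common way to write the projection onto parameter space is sketched below for a pushforward parameterization \rho_\theta = (T_\theta)_\# p; this is a schematic form, and the construction in the talk may differ in its details.

```latex
% Parametric Wasserstein gradient flow of an energy \mathcal{F} under \rho_\theta = (T_\theta)_\# p:
G(\theta)\,\dot\theta \;=\; -\,\nabla_\theta\, \mathcal{F}\big(\rho_\theta\big),
\qquad
G(\theta) \;=\; \mathbb{E}_{z \sim p}\!\left[\, \partial_\theta T_\theta(z)^{\top}\, \partial_\theta T_\theta(z) \,\right],
% where G(\theta) is one computable choice of pullback Wasserstein metric (definitions
% vary across works). This is a finite-dimensional ODE in \theta that classical solvers
% can integrate; samples from \rho_{\theta(t)} are obtained by pushing z \sim p through T_{\theta(t)}.
```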
    • [05294] Towards a mathematical theory of development
      • Author(s) :
        • Geoffrey Schiebinger (University of British Columbia)
      • Abstract : This talk introduces a mathematical theory of developmental biology, based on optimal transport. While, in principle, organisms are made of molecules whose motions are described by the Schrödinger equation, there are simply too many molecules for this to be useful. Optimal transport provides a set of equations that describe development at the level of cells. We propose that this optimal transport hypothesis is a fundamental mathematical principle of developmental biology.
    • [05393] Linear Optimal Transport (LOT) Framework for Graph-Based Semi-Supervised Learning using Point Cloud Data
      • Author(s) :
        • Mary Chriselda Antony Oliver (University of Cambridge)
        • Michael Roberts (University of Cambridge)
        • Matthew Thorpe (University of Manchester)
      • Abstract : In this study, we introduce a novel application of the linear optimal transport (LOT) framework that leverages the geometric structure of its linear embeddings. After dimensionality reduction, we incorporate these embeddings, in the form of projections (velocity fields), into graph-based semi-supervised algorithms. Additionally, we compute the shortest path (geodesic) between two prominent nodes for the feature vector within the graph setting. Finally, we demonstrate the performance through numerical experiments conducted on benchmark 3-D point-cloud data.
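For reference, the LOT embedding underlying this framework linearizes the Wasserstein space around a fixed reference measure; the sketch below is standard LOT background rather than the authors' specific pipeline.

```latex
% Linear optimal transport (LOT) embedding with reference measure \sigma:
\mu \;\longmapsto\; u_\mu := T_\mu - \mathrm{id} \;\in\; L^2(\sigma),
\qquad T_\mu := \text{the optimal transport map from } \sigma \text{ to } \mu,
% so that \| u_\mu - u_\nu \|_{L^2(\sigma)} approximates W_2(\mu,\nu); these velocity-field
% embeddings are the features fed into the graph-based semi-supervised learning step.
```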