Abstract : The theory of dynamical systems helps us analyze models in various quantitative and qualitative ways, but when considering noisy data with tools from stochastic analysis, many applications still lack precise models for different kinds of processes. On the other hand, many innovative methods in data science are now opening up new research directions and broadening the range of fields where conventional dynamical systems can play a role. It is therefore important to consider interdisciplinary research between stochastic dynamics and machine learning: how can we analyze stochastic dynamical systems from observational data instead of studying models analytically? And how can we analyze machine learning algorithms using tools from the theory of stochastic dynamical systems? In this minisymposium, we seek a deeper understanding of the mathematical foundations of state-of-the-art ideas and techniques in data science, as well as their applications to understanding stochastic dynamics, through algorithm development, theoretical analysis, and/or computational implementation. Topics include, but are not limited to, Stochastic Analysis, Inverse Problems, Stochastic Optimal Control, Numerical Analysis, Optimization, Topological Data Analysis, Nonparametric Statistics, Uncertainty Quantification, Meta Learning, and Deep Reinforcement Learning.
[02123] Transition Phenomena in Non-Gaussian Stochastic Dynamical Systems
Format : Talk at Waseda University
Author(s) :
Jinqiao Duan (Illinois Institute of Technology and Great Bay University)
Abstract : Dynamical systems under non-Gaussian Lévy fluctuations manifest nonlocality at a certain “macroscopic” level. Transition phenomena are special events in the evolution from one metastable state to another. Examples of such events are phase transitions, pattern changes, gene transcription, climate change, abrupt shifts, extreme transitions, and other rare events. The most probable transition pathway is the maximally likely trajectory (in the sense of optimizing a probability or an action functional) between metastable states.
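A hypothetical minimal illustration of the action-functional viewpoint (using Gaussian rather than Lévy noise for simplicity, so this is not the speaker's nonlocal setting): a most probable transition path between the metastable states x = -1 and x = +1 of the double-well potential V(x) = (x^2 - 1)^2 / 4 can be found by minimizing the discretized Freidlin-Wentzell action S[x] = 0.5 * ∫ (x'(t) - b(x(t)))^2 dt with drift b(x) = -V'(x).

```python
import numpy as np

# Hypothetical sketch: minimize a discretized Freidlin-Wentzell action over
# paths joining the two wells of V(x) = (x^2 - 1)^2 / 4, with drift
# b(x) = -V'(x) = x - x^3, under additive Gaussian noise.

def drift(x):
    return x - x ** 3                        # b(x) = -V'(x)

def action(path, dt):
    v = np.diff(path) / dt                   # discrete path velocity
    return 0.5 * np.sum((v - drift(path[:-1])) ** 2) * dt

T, n = 10.0, 200
dt = T / n
path = np.linspace(-1.0, 1.0, n + 1)         # straight-line initial guess

eps, lr = 1e-6, 0.02
for _ in range(300):                         # crude gradient descent on the action
    base = action(path, dt)
    grad = np.zeros_like(path)               # endpoints stay pinned at -1, +1
    for i in range(1, n):
        p = path.copy()
        p[i] += eps
        grad[i] = (action(p, dt) - base) / eps
    path -= lr * grad

print(action(path, dt))   # lower than the action of the straight-line guess
```

In the small-noise limit such minimizers indicate where transitions concentrate; for Lévy noise the corresponding objective is nonlocal, which is the subject of the talk.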
[02120] Föllmer flows: contraction, sampling and generative learning
Format : Online Talk on Zoom
Author(s) :
Yuling Jiao (Wuhan University)
Abstract : We construct a unit-time flow on Euclidean space, termed the Föllmer flow, whose flow map at time 1 pushes forward a standard Gaussian measure onto a general target measure. We study the well-posedness of the Föllmer flow and establish the Lipschitz property of the flow map at time 1. We apply the Lipschitz mapping to several rich classes of probability measures to derive functional inequalities with dimension-free constants, and to sampling and generative learning.
[02122] Stochastic systems via rough path theory: theory and numerics
Format : Online Talk on Zoom
Author(s) :
Hoang Duc Luu (MPI MIS & IMH-VAST)
Abstract : This talk presents stochastic differential equations driven by Hölder noises, which can be solved in the pathwise sense using rough path theory. The asymptotic dynamics of the system can be studied within the framework of random dynamical systems, and results on the existence of random pullback attractors can be derived for dissipative systems. The numerical attractor of the discrete system is proved to converge to that of the continuous system as the time step tends to zero.
[02124] Data-driven method to learn polymer dynamics
Format : Talk at Waseda University
Author(s) :
Xiaoli Chen (National University of Singapore)
Abstract : We propose a machine learning approach where we construct reduced thermodynamic coordinates and interpret the dynamics of these coordinates directly from microscopic stochastic trajectory data. Our approach allows the creation of custom thermodynamics that elucidates macroscopic dynamical landscapes and facilitates subsequent analysis and control. We demonstrate our method on a long polymer chain in an externally applied field by showing that only three learnt thermodynamic coordinates are sufficient to build a dynamical landscape of unfolding.
[02126] Understanding the diffusion models by conditional expectations
Format : Online Talk on Zoom
Author(s) :
Yubin Lu (Illinois Institute of Technology)
Abstract : We provide several mathematical analyses of the diffusion model in machine learning. The drift term of the backward sampling process is represented as a conditional expectation involving the data distribution and the forward diffusion. The training process aims to find such a drift function by minimizing the mean-squared residual associated with the conditional expectation. We derive a new target function and associated loss, and illustrate the theoretical findings with several numerical examples.
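A tiny numerical illustration of the key fact behind this training objective (an assumed toy setup, not the speaker's code): the minimizer of a mean-squared residual over functions of x_t is the conditional expectation E[x_0 | x_t]. For Gaussian data and one forward noising step x_t = a*x_0 + s*eps, E[x_0 | x_t] = a/(a^2 + s^2) * x_t, so a least-squares fit over linear functions recovers exactly that slope.

```python
import numpy as np

# Toy check: least-squares regression of x0 on xt recovers the conditional
# expectation slope a / (a^2 + s^2) for Gaussian data.

rng = np.random.default_rng(0)
a, s, n = 0.8, 0.6, 200_000                 # chosen so a^2 + s^2 = 1
x0 = rng.standard_normal(n)                 # samples from the data distribution
xt = a * x0 + s * rng.standard_normal(n)    # forward (noising) step

# argmin_c E[(x0 - c * xt)^2] has the closed form c = E[x0 * xt] / E[xt^2]
slope = np.dot(xt, x0) / np.dot(xt, xt)
print(slope)   # close to a / (a**2 + s**2) = 0.8
```

In a real diffusion model the linear family is replaced by a neural network and the regression is run at every noise level, but the conditional-expectation characterization of the minimizer is the same.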
[02127] Early-warning indicator of transition time for noise-induced critical transition of Atlantic Meridional Overturning Circulation
Format : Online Talk on Zoom
Author(s) :
Yayun Zheng (Jiangsu University)
Abstract : We develop an effective and general early-warning indicator for critical transitions. The indicator, based on the most probable transition time, is derived via the Onsager-Machlup method using a critical-tube probability. The approach is applied to investigate the abrupt transition from the strong mode to the weak mode in a thermohaline circulation model. The indicator of the most probable transition time can provide important insights for predicting future abrupt climate transitions.
[02128] Solving the Non-local Fokker-Planck Equations by Physics-informed Neural Networks
Format : Talk at Waseda University
Author(s) :
Senbao Jiang (Illinois Institute of Technology)
Xiaofan Li (Illinois Institute of Technology)
Abstract : We present trapz-PiNNs, physics-informed neural networks incorporating a recently developed modified trapezoidal rule for accurately evaluating the fractional Laplacian, and use them to solve space-fractional Fokker-Planck equations in 2D and 3D. We demonstrate that trapz-PiNNs have high expressive power by predicting solutions with low $L^2$ relative error on a variety of numerical examples. The trapz-PiNN can solve PDEs with the fractional Laplacian for arbitrary $\alpha \in (0, 2)$ on rectangular domains, and it could be generalized to higher dimensions or other bounded domains.
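For intuition, here is a 1D quadrature sketch using a plain trapezoidal rule (not the modified rule of the talk, which treats the singularity more carefully): the fractional Laplacian can be written as the symmetrized singular integral C_alpha * ∫_0^∞ (2u(x) - u(x+r) - u(x-r)) / r^(1+alpha) dr, which we truncate to [h, R] so the singular endpoint r = 0 is simply skipped.

```python
import numpy as np
from math import gamma, pi, sqrt

# Plain trapezoidal evaluation of the 1D fractional Laplacian via the
# symmetrized integral form, truncated away from the singularity at r = 0.

def frac_laplacian_1d(u, x, alpha, h=1e-3, R=2000.0):
    # normalizing constant C(1, alpha)
    c = alpha * 2 ** (alpha - 1) * gamma((1 + alpha) / 2) \
        / (sqrt(pi) * gamma(1 - alpha / 2))
    r = np.arange(h, R, h)
    f = (2 * u(x) - u(x + r) - u(x - r)) / r ** (1 + alpha)
    # uniform-grid trapezoidal rule on [h, R]
    return c * h * (f.sum() - 0.5 * (f[0] + f[-1]))

# sanity check against the Fourier symbol: (-Delta)^{alpha/2} cos(kx) = |k|^alpha cos(kx)
val = frac_laplacian_1d(np.cos, 0.0, alpha=1.0)
print(val)   # approximately 1.0
```

The truncation error near r = 0 behaves like h^(2 - alpha), which degrades as alpha approaches 2; handling that uniformly over alpha in (0, 2) is precisely what the modified rule in the talk addresses.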
[02119] Neural stochastic differential equations for time series forecasting
Format : Talk at Waseda University
Author(s) :
Luxuan Yang (Huazhong University of Science and Technology)
Ting Gao (Huazhong University of Science and Technology)
Abstract : We propose a model called the Lévy-induced stochastic differential equation network, which explores compounded stochastic differential equations with alpha-stable Lévy motion to model complex time series data and solves the prediction problem through neural network approximation. We theoretically prove the convergence of the numerical solution and apply the algorithm to real financial time series data. We provide various evaluation metrics and find that accuracy improves through the use of non-Gaussian Lévy processes.
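An illustrative sketch of the underlying simulation machinery (an assumed toy setup, not the network of the talk): an Euler-Maruyama scheme for an SDE driven by symmetric alpha-stable Lévy motion, dX_t = -X_t dt + dL_t, with increments drawn by the Chambers-Mallows-Stuck sampler.

```python
import numpy as np

# Euler-Maruyama for dX_t = -X_t dt + dL_t with symmetric alpha-stable noise.
# Over a step of size dt, the stable increment scales as dt^(1/alpha).

def stable_increments(alpha, size, rng):
    """Standard symmetric alpha-stable samples (Chambers-Mallows-Stuck)."""
    u = rng.uniform(-np.pi / 2, np.pi / 2, size)   # uniform angle
    w = rng.exponential(1.0, size)                 # unit exponential
    return (np.sin(alpha * u) / np.cos(u) ** (1 / alpha)
            * (np.cos((1 - alpha) * u) / w) ** ((1 - alpha) / alpha))

def euler_maruyama(alpha, x0=1.0, T=1.0, n=1000, rng=None):
    if rng is None:
        rng = np.random.default_rng(0)
    dt = T / n
    dL = dt ** (1 / alpha) * stable_increments(alpha, n, rng)
    x = np.empty(n + 1)
    x[0] = x0
    for k in range(n):
        x[k + 1] = x[k] - x[k] * dt + dL[k]        # linear drift, Levy jumps
    return x

path = euler_maruyama(alpha=1.5)
```

For alpha = 2 the sampler reduces to a Gaussian (with variance 2), recovering Brownian forcing up to scale; for alpha < 2 the paths exhibit the heavy-tailed jumps that the talk exploits for financial time series.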
[03377] Modeling and learning methods applied to collective motion in biology
Format : Online Talk on Zoom
Author(s) :
James Greene (Clarkson University)
Ming Zhong (Illinois Institute of Technology)
Abstract : From groups of cells to groups of humans, collective motion is ubiquitous in biological systems. Inspired by phototaxis, we develop minimal mathematical models that exhibit the emergence of social structure in Cucker-Smale-type pairwise interaction models. Numerical and analytical results are provided, which show the emergence of linear spatial structures. We also present methods by which local interaction rules may be learned from trajectory data, and apply these techniques to cancer migration models.
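A minimal 1D Cucker-Smale sketch (illustrative only, not the speaker's model): N agents align their velocities through the pairwise rule dv_i/dt = (1/N) Σ_j ψ(|x_j - x_i|)(v_j - v_i) with communication weight ψ(r) = (1 + r^2)^(-β); for β ≤ 1/2, unconditional flocking (velocity consensus) is known to hold.

```python
import numpy as np

# Forward-Euler integration of a 1D Cucker-Smale alignment model.

def cucker_smale(x, v, beta=0.3, dt=0.01, steps=2000):
    for _ in range(steps):
        dx = x[None, :] - x[:, None]             # dx[i, j] = x_j - x_i
        psi = (1.0 + dx ** 2) ** (-beta)         # communication weights
        dv = (psi * (v[None, :] - v[:, None])).mean(axis=1)
        x = x + dt * v
        v = v + dt * dv
    return x, v

rng = np.random.default_rng(1)
x0, v0 = rng.normal(0.0, 1.0, 20), rng.normal(0.0, 1.0, 20)
x1, v1 = cucker_smale(x0, v0)
print(v0.std(), v1.std())   # the velocity spread contracts toward consensus
```

Learning the interaction rule from data, as in the talk, amounts to recovering the unknown kernel ψ from observed trajectories (x_i(t), v_i(t)) rather than prescribing it.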
[03399] Neural architectures for identifying stochastic differential equations
Format : Talk at Waseda University
Author(s) :
Ali Hasan (Duke University)
Joao Pereira (Instituto Nacional de Matemática Pura e Aplicada)
Haoming Yang (Duke University)
Sina Farsiu (Duke University)
Vahid Tarokh (Duke University)
Abstract : In this work, we will describe a variational framework to recover the parameters of a latent stochastic differential equation (SDE) from high dimensional observations. We prove that, in the limit of infinite data, the true parameters can be recovered up to an isometry and numerically illustrate the efficacy of the method. We finally discuss connections to McKean-Vlasov SDEs when using neural network parameterizations of SDEs and present numerical examples in machine learning applications.
[02129] Emergent Short-range Memory in Stochastic Gradient Noise and Its Implications on Generalization
Format : Online Talk on Zoom
Author(s) :
Jiangshe Zhang (Xi'an Jiaotong University)
Abstract : Investigating stochastic gradient descent (SGD) from the perspective of stochastic differential equations (SDEs) is quite popular in the deep learning community. In this talk, I will present an analytical result on modeling SGD with SDEs driven by fractional Brownian motion, which reveals the escaping efficiency when trapped in local minima. From the optimization point of view, I will further show how we can relate the smoothness of the optimization pathway to the generalization ability.
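A sketch of the driving noise in such SDE models (an assumed setup, not the speaker's experiments): fractional Brownian motion B_H can be simulated exactly on a time grid by Cholesky factorization of its covariance R(s, t) = 0.5 * (s^{2H} + t^{2H} - |t - s|^{2H}), where the Hurst index H tunes the memory of the noise.

```python
import numpy as np

# Exact simulation of fractional Brownian motion on a grid via the Cholesky
# factor of its covariance matrix; each sample row is one fBm path.

def fbm_paths(H, n_steps=50, T=1.0, n_paths=5000, rng=None):
    if rng is None:
        rng = np.random.default_rng(0)
    t = np.linspace(T / n_steps, T, n_steps)   # exclude t = 0, where B_H(0) = 0
    s, u = np.meshgrid(t, t)
    cov = 0.5 * (s ** (2 * H) + u ** (2 * H) - np.abs(s - u) ** (2 * H))
    L = np.linalg.cholesky(cov + 1e-12 * np.eye(n_steps))  # tiny jitter
    z = rng.standard_normal((n_paths, n_steps))
    return t, z @ L.T                          # rows are independent fBm paths

t, paths = fbm_paths(H=0.7)
print(paths[:, -1].var())   # near Var B_H(T) = T^{2H} = 1
```

H = 1/2 recovers standard Brownian motion; H near 1/2 over short lags corresponds to the short-range memory regime discussed in the talk.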