# Registered Data

## [00967] Stochastic Dynamical Systems and Applications in Data Science

**Session Date & Time**:
- 00967 (1/3) : 1D (Aug. 21, 15:30-17:10)
- 00967 (2/3) : 1E (Aug. 21, 17:40-19:20)
- 00967 (3/3) : 2C (Aug. 22, 13:20-15:00)

**Type**: Proposal of Minisymposium

**Abstract**: The theory of dynamical systems has helped us analyze models in various quantitative and qualitative ways, but when considering noisy data with tools from stochastic analysis, many applications still face challenges in obtaining precise models for different kinds of processes. On the other hand, many innovative methods in data science are now opening up new research directions and broadening the range of fields where conventional dynamical systems can play a role. It is therefore important to consider the interdisciplinary area between stochastic dynamics and machine learning: how can we analyze stochastic dynamical systems from observation data instead of studying models analytically? And how can we analyze machine learning algorithms using tools from the theory of stochastic dynamical systems? In this minisymposium, we seek a deeper understanding of the mathematical foundations of state-of-the-art ideas and techniques in data science, as well as their applications to understanding stochastic dynamics, through algorithm development, theoretical analysis, and/or computational implementation. Fields covered include, but are not limited to, Stochastic Analysis, Inverse Problems, Stochastic Optimal Control, Numerical Analysis, Optimization, Topological Data Analysis, Nonparametric Statistics, Uncertainty Quantification, Meta Learning, and Deep Reinforcement Learning.

**Organizer(s)**: Ting Gao, Xiaoli Chen, Jinqiao Duan

**Classification**: __60H10__, __62M20__, __93E10__, __65Z05__, __82C05__

**Speakers Info**:
- James Greene (Clarkson University)
- Ali Hasan (Duke University)
- Xiaoli Chen (National University of Singapore)
- Yayun Zheng (Jiangsu University)
- Jinqiao Duan (Illinois Institute of Technology)
- Yuling Jiao (Wuhan University)
- Yong Xu (Northwestern Polytechnical University)
- Ting Gao (Huazhong University of Science and Technology)
- Xiaofan Li (Illinois Institute of Technology)
- Duc Luu (IMH-VAST, Vietnam)
- Jiangshe Zhang (Xi’an Jiaotong University)
- Yubin Lu (Illinois Institute of Technology)

**Talks in Minisymposium**:

**[02119] Neural stochastic differential equations for time series forecasting**

**Author(s)**:
- Luxuan Yang (Huazhong University of Science and Technology)
- Ting Gao (Huazhong University of Science and Technology)

**Abstract**: We propose a model called the Lévy-induced stochastic differential equation network, which explores compound stochastic differential equations with α-stable Lévy motion to model complex time series data and solves the prediction problem through neural network approximation. We theoretically prove the convergence of the numerical solution and apply the algorithm to real financial time series data. We provide various evaluation metrics and find that accuracy increases through the use of non-Gaussian Lévy processes.
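
The network itself is not reproduced here; as a minimal numerical sketch (all names ours), the following simulates a one-dimensional SDE driven by symmetric α-stable Lévy motion with an Euler scheme, sampling the stable increments via the Chambers-Mallows-Stuck method:

```python
import numpy as np

def alpha_stable(alpha, size, rng):
    """Sample standard symmetric alpha-stable variates (beta = 0)
    via the Chambers-Mallows-Stuck method."""
    V = rng.uniform(-np.pi / 2, np.pi / 2, size)
    W = rng.exponential(1.0, size)
    return (np.sin(alpha * V) / np.cos(V) ** (1 / alpha)
            * (np.cos(V - alpha * V) / W) ** ((1 - alpha) / alpha))

def euler_maruyama_levy(drift, sigma, x0, T, n, alpha, seed=0):
    """Euler scheme for dX = drift(X) dt + sigma dL_t^alpha.
    Levy increments over a step dt scale like dt**(1/alpha)."""
    rng = np.random.default_rng(seed)
    dt = T / n
    dL = dt ** (1 / alpha) * alpha_stable(alpha, n, rng)
    x = np.empty(n + 1)
    x[0] = x0
    for k in range(n):
        x[k + 1] = x[k] + drift(x[k]) * dt + sigma * dL[k]
    return x

# Hypothetical example: a linear drift pulled toward zero.
path = euler_maruyama_levy(lambda x: -x, 0.1, 1.0, 1.0, 1000, alpha=1.5)
```

The `dt ** (1 / alpha)` scaling reflects the self-similarity of α-stable Lévy motion; smaller `alpha` produces heavier-tailed jumps in the simulated path.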

**[02120] Föllmer flows: contraction, sampling and generative learning**

**Author(s)**:
- Ting Gao (Huazhong University of Science and Technology)
- Yuling Jiao (Wuhan University)

**Abstract**: We construct a unit-time flow on Euclidean space, termed the Föllmer flow, whose flow map at time 1 pushes forward a standard Gaussian measure onto a general target measure. We study the well-posedness of the Föllmer flow and establish the Lipschitz property of the flow map at time 1. We apply the Lipschitz mapping to several rich classes of probability measures, deriving functional inequalities with dimension-free constants, and to sampling and generative learning.

**[02122] Stochastic systems via rough path theory: theory and numerics**

**Author(s)**:
- Ting Gao (Huazhong University of Science and Technology)
- Hoang Duc Luu (MPI MIS & IMH-VAST)

**Abstract**: This talk presents stochastic differential equations driven by Hölder noises, which can be solved in the pathwise sense using rough path theory. The asymptotic dynamics of the system can be studied within the framework of random dynamical systems, and results on the existence of random pullback attractors can be derived for dissipative systems. The numerical attractor of the discrete system is proved to converge to that of the continuous system as the time step tends to zero.

**[02123] Transition Phenomena in Non-Gaussian Stochastic Dynamical Systems**

**Author(s)**:
- Ting Gao (Huazhong University of Science and Technology)
- Jinqiao Duan (Illinois Institute of Technology and Great Bay University)

**Abstract**: Dynamical systems under non-Gaussian Lévy fluctuations manifest nonlocality at a certain "macroscopic" level. Transition phenomena are special events in the evolution from one metastable state to another. Examples of such events are phase transitions, pattern changes, gene transcription, climate change, abrupt shifts, extreme transitions, and other rare events. The most probable transition pathways are the maximally likely trajectories (in the sense of optimizing a probability or an action functional) between metastable states.

**[02124] Data-driven method to learn polymer dynamics**

**Author(s)**:
- Xiaoli Chen (National University of Singapore)

**Abstract**: We propose a machine learning approach where we construct reduced thermodynamic coordinates and interpret the dynamics of these coordinates directly from microscopic stochastic trajectory data. Our approach allows the creation of custom thermodynamics that elucidates macroscopic dynamical landscapes and facilitates subsequent analysis and control. We demonstrate our method on a long polymer chain in an externally applied field by showing that only three learnt thermodynamic coordinates are sufficient to build a dynamical landscape of unfolding.
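
The learned coordinates in the talk are custom and nonlinear; as a hypothetical stand-in, plain PCA below illustrates the basic step of projecting microscopic trajectory snapshots onto a few reduced coordinates:

```python
import numpy as np

def reduced_coordinates(trajectories, n_coords=3):
    """Project snapshots onto leading principal directions.
    trajectories: (n_snapshots, n_degrees_of_freedom) array."""
    X = trajectories - trajectories.mean(axis=0)   # center the data
    # SVD of the centered data gives the principal directions.
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    return X @ Vt[:n_coords].T                     # reduced coordinates

rng = np.random.default_rng(1)
# Fake "polymer" data: 500 snapshots of a correlated 60-dimensional chain.
data = rng.normal(size=(500, 60)) @ rng.normal(size=(60, 60))
coords = reduced_coordinates(data, n_coords=3)
```

The talk's three thermodynamic coordinates are learned rather than linear projections, but the input/output shape of the reduction step is the same.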

**[02126] Understanding the diffusion models by conditional expectations**

**Author(s)**:
- Ting Gao (Huazhong University of Science and Technology)
- Yubin Lu (Illinois Institute of Technology)

**Abstract**: We provide several mathematical analyses of the diffusion model in machine learning. The drift term of the backward sampling process is represented as a conditional expectation involving the data distribution and the forward diffusion. The training process aims to find such a drift function by minimizing the mean-squared residual related to the conditional expectation. We derive a new target function and associated loss, and illustrate the theoretical findings with several numerical examples.
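
As an illustration of the conditional-expectation viewpoint (our sketch, not the authors' code): for a forward Ornstein-Uhlenbeck diffusion started at the data, the backward drift involves $\mathbb{E}[X_0 \mid X_t = x]$. For Gaussian data this expectation is known in closed form, so a Monte Carlo estimate built from the forward transition density can be checked against it:

```python
import numpy as np

# Forward OU process dX = -X dt + sqrt(2) dW gives the transition
# X_t | X_0 ~ N(exp(-t) X_0, 1 - exp(-2t)).

def cond_exp_mc(x, t, s, n=200_000, seed=0):
    """Monte Carlo estimate of E[X_0 | X_t = x] for data X_0 ~ N(0, s^2),
    weighting data samples by the forward transition density."""
    rng = np.random.default_rng(seed)
    x0 = rng.normal(0.0, s, n)
    mean, var = np.exp(-t) * x0, 1 - np.exp(-2 * t)
    w = np.exp(-(x - mean) ** 2 / (2 * var))   # transition-density weights
    return np.sum(w * x0) / np.sum(w)

def cond_exp_exact(x, t, s):
    """Closed form: X_t is Gaussian, so the conditional mean is linear in x."""
    vt = np.exp(-2 * t) * s**2 + 1 - np.exp(-2 * t)   # Var(X_t)
    return np.exp(-t) * s**2 / vt * x

mc = cond_exp_mc(0.8, 0.5, s=2.0)
exact = cond_exp_exact(0.8, 0.5, s=2.0)
```

In training, this conditional expectation is not tractable and is instead approximated by a neural network minimizing the associated mean-squared loss.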

**[02127] Early-warning indicator of transition time for noise-induced critical transition of Atlantic Meridional Overturning Circulation**

**Author(s)**:
- Ting Gao (Huazhong University of Science and Technology)
- Yayun Zheng (Jiangsu University)

**Abstract**: We develop an effective and general early-warning indicator for critical transitions. The indicator, the most probable transition time, is derived via the Onsager-Machlup method from the probability that sample paths remain within a critical tube around the most probable path. The approach is applied to investigate the abrupt transition from the strong to the weak mode in a thermohaline circulation model. The indicator of the most probable transition time can provide important insights for predicting future abrupt climate transitions.
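
A minimal sketch of the most-probable-path idea (our construction, using a generic double-well drift rather than the circulation model): discretize the Onsager-Machlup action $\sum_k \big((x_{k+1}-x_k)/\Delta t - b(x_k)\big)^2 \Delta t$ and minimize it by gradient descent with the endpoints pinned at the two metastable states:

```python
import numpy as np

b  = lambda x: x - x**3          # double-well drift, wells at x = -1, +1
db = lambda x: 1 - 3 * x**2      # its derivative

def action(x, dt):
    """Discretized Onsager-Machlup action of a path x on time step dt."""
    r = (x[1:] - x[:-1]) / dt - b(x[:-1])
    return np.sum(r**2) * dt

def most_probable_path(n=200, T=10.0, steps=2000, lr=1e-3):
    """Gradient descent on the action with endpoints pinned at -1 and +1."""
    dt = T / n
    x = np.linspace(-1.0, 1.0, n + 1)             # straight-line initial guess
    for _ in range(steps):
        r = (x[1:] - x[:-1]) / dt - b(x[:-1])
        g = np.zeros_like(x)
        g[1:]  += 2 * r                           # residual k depends on x_{k+1}
        g[:-1] -= 2 * r * (1 + dt * db(x[:-1]))   # ... and on x_k
        g[0] = g[-1] = 0.0                        # keep endpoints fixed
        x = x - lr * g
    return x, dt

path, dt = most_probable_path()
```

The minimizer approximates the most probable transition path; the transition-time indicator in the talk is built on top of such paths via a critical-tube probability.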

**[02128] Solving the Non-local Fokker-Planck Equations by Physics-informed Neural Networks**

**Author(s)**:
- Ting Gao (Huazhong University of Science and Technology)
- Senbao Jiang (Illinois Institute of Technology)
- Xiaofan Li (Illinois Institute of Technology)

**Abstract**: We present trapz-PiNNs, physics-informed neural networks incorporating a recently developed modified trapezoidal rule for accurately evaluating the fractional Laplacian, and apply them to solve the space-fractional Fokker-Planck equations in 2D and 3D. We demonstrate that trapz-PiNNs have high expressive power by predicting solutions with low $L^2$ relative error on a variety of numerical examples. The trapz-PiNN is able to solve PDEs with a fractional Laplacian of arbitrary order $\alpha \in (0, 2)$ on rectangular domains, and it could be generalized to higher dimensions or other bounded domains.

**[02129] Emergent Short-range Memory in Stochastic Gradient Noise and Its Implications on Generalization**

**Author(s)**:
- Ting Gao (Huazhong University of Science and Technology)
- Jiangshe Zhang (Xi’an Jiaotong University)

**Abstract**: Investigating stochastic gradient descent (SGD) from the perspective of stochastic differential equations (SDEs) is quite popular in the deep learning community. In this talk, I will present an analytical result on modeling SGD with SDEs driven by fractional Brownian motion, which reveals the escaping efficiency of SGD from local minima. From the optimization point of view, I will further show how we can relate the smoothness of the optimization pathway to generalization ability.
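
Since fractional Gaussian noise is Gaussian with an explicit covariance, a small sketch (ours, not the talk's code) can simulate it exactly via a Cholesky factor and drive an Euler scheme with the resulting fractional Brownian increments:

```python
import numpy as np

def fgn(n, H, seed=0):
    """Exact fractional Gaussian noise on the unit grid: Gaussian with
    autocovariance gamma(k) = (|k+1|^{2H} - 2|k|^{2H} + |k-1|^{2H}) / 2."""
    k = np.arange(n)
    gamma = 0.5 * (np.abs(k + 1) ** (2 * H) - 2 * np.abs(k) ** (2 * H)
                   + np.abs(k - 1) ** (2 * H))
    cov = gamma[np.abs(k[:, None] - k[None, :])]   # Toeplitz covariance
    L = np.linalg.cholesky(cov)
    return L @ np.random.default_rng(seed).normal(size=n)

def euler_fbm(drift, x0, T, n, H):
    """Euler scheme for dX = drift(X) dt + dB^H_t, scaling unit-grid
    increments by dt**H (self-similarity of fBm)."""
    dt = T / n
    dB = dt ** H * fgn(n, H)
    x = np.empty(n + 1)
    x[0] = x0
    for k in range(n):
        x[k + 1] = x[k] + drift(x[k]) * dt + dB[k]
    return x

# Hypothetical example: Hurst index H = 0.3 gives negatively
# correlated (short-memory, rough) increments.
path = euler_fbm(lambda x: -x, 1.0, 1.0, 400, H=0.3)
```

The Cholesky approach is exact but O(n³); circulant-embedding methods scale better for long trajectories.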

**[03377] Modeling and learning methods applied to collective motion in biology**

**Author(s)**:
- James Greene (Clarkson University)
- Ming Zhong (Illinois Institute of Technology)

**Abstract**: From groups of cells to groups of humans, collective motion is ubiquitous in biological systems. Inspired by phototaxis, we develop minimal mathematical models that exhibit the emergence of social structure in Cucker-Smale type pairwise interaction models. Numerical and analytical results are provided, which show the emergence of linear spatial structures. We also present methods by which local interaction rules may be learned from trajectory data, and apply these techniques to cancer migration models.
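
As a hedged illustration of a Cucker-Smale-type pairwise interaction (a generic textbook form, not the talk's specific model), each agent aligns its velocity with the others under a communication kernel $\phi(r) = (1 + r^2)^{-\beta}$:

```python
import numpy as np

def cucker_smale(x, v, T=20.0, n=2000, beta=0.5):
    """Explicit Euler integration of dv_i/dt =
    (1/N) sum_j phi(|x_i - x_j|) (v_j - v_i), dx_i/dt = v_i."""
    dt = T / n
    N = len(x)
    for _ in range(n):
        r2 = ((x[:, None, :] - x[None, :, :]) ** 2).sum(-1)   # squared distances
        phi = 1.0 / (1.0 + r2) ** beta                        # N x N weights
        dv = (phi[..., None] * (v[None, :, :] - v[:, None, :])).sum(1) / N
        v = v + dt * dv
        x = x + dt * v
    return x, v

rng = np.random.default_rng(2)
# 20 agents in the plane with random positions and velocities.
x, v = cucker_smale(rng.normal(size=(20, 2)), rng.normal(size=(20, 2)))
spread = np.linalg.norm(v - v.mean(0), axis=1).max()
```

For this slowly decaying kernel the velocities align (flocking): `spread` shrinks toward zero as `T` grows.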

**[03399] Neural architectures for identifying stochastic differential equations**

**Author(s)**:
- Ali Hasan (Duke University)
- Joao Pereira (Instituto Nacional de Matemática Pura e Aplicada)
- Haoming Yang (Duke University)
- Sina Farsiu (Duke University)
- Vahid Tarokh (Duke University)

**Abstract**: In this work, we will describe a variational framework to recover the parameters of a latent stochastic differential equation (SDE) from high dimensional observations. We prove that, in the limit of infinite data, the true parameters can be recovered up to an isometry and numerically illustrate the efficacy of the method. We finally discuss connections to McKean-Vlasov SDEs when using neural network parameterizations of SDEs and present numerical examples in machine learning applications.

**[03687] Deep learning framework for solving Fokker-Planck equations with low-rank separation representation**

**Author(s)**:
- Yong Xu (Northwestern Polytechnical University)

**Abstract**: An insightful deep learning framework is proposed in this study for solving the well-known Fokker-Planck (FP) equations, which quantify the evolution of the probability density function. It efficiently reduces the demand for training data in acquiring precise integrations of special normalizing conditions when using neural networks (NNs), and it avoids the exponential increase in training data as the dimension increases. Instead of all hypercubic discrete points, the inputs of each NN require only one-dimensional discrete data. Without loss of generality, to solve a d-dimensional FP equation, d NNs are employed and assembled into a low-rank separation representation. The FP, boundary, and integral operators are then re-expressed in the separation representation. This enables the constructed loss function to perform simple vector operations, because the complicated d-dimensional operators are replaced by a set of one-dimensional operators. A tractable strategy is presented for selecting the separation rank, inspired by the system's potential function, although choosing an appropriate separation rank is still an open issue. Typical numerical examples reveal that the proposed algorithm is effective and superior for solving the FP equations. The suggested framework is applicable and has considerable potential in various areas of engineering and applied sciences.
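
The separation idea can be sketched in a few lines (our illustration; names hypothetical): a rank-$r$ separated representation evaluates $p(x_1,\dots,x_d) \approx \sum_{j=1}^{r} \prod_{i=1}^{d} f_{j,i}(x_i)$, so a $d$-dimensional tensor of grid values is assembled from $d$ sets of one-dimensional factors:

```python
import numpy as np

def separated_eval(factors):
    """Evaluate a separated representation on a tensor-product grid.
    factors[i] has shape (rank, n_i): values of the rank 1-D factor
    functions for coordinate i on its grid."""
    rank = factors[0].shape[0]
    out = 0.0
    for j in range(rank):
        term = factors[0][j]
        for i in range(1, len(factors)):
            term = np.multiply.outer(term, factors[i][j])  # build the product
        out = out + term                                   # sum over ranks
    return out

# Example: the 3-D Gaussian p = exp(-|x|^2) is exactly separable
# with rank 1, since it factors into 1-D Gaussians.
g = np.linspace(-1, 1, 11)
f = np.exp(-g**2)[None, :]          # one rank-1 factor per axis
p = separated_eval([f, f, f])
```

Only the d one-dimensional grids are stored as NN inputs; the full d-dimensional tensor appears only through these outer products, which is the source of the savings the abstract describes.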