Abstract : Recently there has been increased interest in applying data-driven methods to learn partial differential equations (PDEs). For example, operator learning has been developed to learn maps between infinite-dimensional function spaces and has shown success in the context of smooth PDEs. However, these methods perform poorly when PDEs are less well-behaved, for instance when equations are parameterized by non-smooth functions or when the PDE involves stochasticity. This mini-symposium invites experts on novel methods for learning stochastic and ill-conditioned multiscale PDEs. Topics will include numerical methods for SPDEs, learning in multiscale settings, and advances in operator learning.
Abstract : In this talk we will discuss a new approach to operator learning by regression, or discovery of the functional form of the PDE. A simple, three-step approach will be presented that can be implemented using convenient, off-the-shelf kernel regression tools. Our approach naturally accommodates PDEs with unknown and variable coefficients and achieves competitive accuracy even when training data are very scarce.
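A minimal sketch of such a three-step pipeline is given below, assuming synthetic heat-equation data, Gaussian kernels, and fixed regularization parameters; these choices are illustrative and not the speakers' implementation. Step 1 interpolates each observed time slice with a kernel, step 2 differentiates the interpolant analytically, and step 3 regresses the time derivative on the resulting features with a second kernel regression to recover the functional form of the PDE.

# Hypothetical three-step sketch: recover u_t = P(u, u_x, u_xx) from gridded observations
import numpy as np

# synthetic data: heat equation u_t = 0.1 * u_xx (explicit finite differences)
nx, nt, nu = 64, 200, 0.1
x = np.linspace(0, 1, nx); dx = x[1] - x[0]; dt = 1e-4
U = [np.exp(-50 * (x - 0.5) ** 2)]
for _ in range(nt - 1):
    u = U[-1]
    uxx = (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx ** 2
    U.append(u + dt * nu * uxx)
U = np.array(U)                                          # shape (nt, nx)

# step 1: kernel interpolation of each time slice (Gaussian kernel)
ell, lam = 0.05, 1e-8
r = x[:, None] - x[None, :]
K = np.exp(-r ** 2 / (2 * ell ** 2))
alpha = np.linalg.solve(K + lam * np.eye(nx), U.T)       # interpolation coefficients

# step 2: differentiate the interpolant analytically in x; finite differences in t
Kx = -(r / ell ** 2) * K
Kxx = (r ** 2 / ell ** 4 - 1 / ell ** 2) * K
Ux, Uxx = (Kx @ alpha).T, (Kxx @ alpha).T
Ut = np.gradient(U, dt, axis=0)

# step 3: kernel ridge regression of u_t on the features (u, u_x, u_xx)
feats = np.column_stack([U.ravel(), Ux.ravel(), Uxx.ravel()])
feats = (feats - feats.mean(0)) / feats.std(0)
target = Ut.ravel()
idx = np.random.default_rng(0).choice(len(target), 400, replace=False)
Z, y = feats[idx], target[idx]
G = np.exp(-((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1) / (2 * 0.5 ** 2))
beta = np.linalg.solve(G + 1e-6 * np.eye(len(y)), y)     # learned surrogate for the PDE's form
print("training RMSE of the recovered right-hand side:", np.sqrt(np.mean((G @ beta - y) ** 2)))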
[05235] Neural Operator for Discovering Physical Equations
Format : Online Talk on Zoom
Author(s) :
Paul Bogdan (USC)
Xiongye Xiao (USC)
Gaurav Gupta (USC)
Radu Victor Balan (UMD)
Abstract : We develop a multiwavelet-based neural operator learning architecture that compresses the associated operator’s kernel using fine-grained multiwavelets. For initial value problems, we propose an exponential neural operator scheme for efficiently learning the map between the initial condition and the activities at later times. To solve coupled partial differential equations, we propose a coupled multiwavelet operator learning scheme that decouples the coupled integral kernels during the decomposition and reconstruction procedures in the wavelet space.
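As a toy illustration of the band-by-band idea, the sketch below learns a 1D solution operator in a standard wavelet basis using PyWavelets and plain least squares; the multiwavelet filters, neural parameterization, and the coupled and exponential variants of the actual architecture are not reproduced here.

# Toy sketch: learn the solution map of -u'' = f (zero BCs) band-by-band in a wavelet basis:
# decompose the input, map coefficients per band, reconstruct the output.
import numpy as np
import pywt

rng = np.random.default_rng(0)
n, n_train = 256, 400
x = np.linspace(0, 1, n)

def solve_poisson(f):
    # crude sine-spectral solve of -u'' = f with homogeneous Dirichlet BCs
    k = np.arange(1, n + 1)
    S = np.sin(np.pi * np.outer(k, x))
    c = 2.0 / n * S @ f
    return S.T @ (c / (np.pi * k) ** 2)

F = rng.standard_normal((n_train, 8)) @ np.sin(np.pi * np.outer(np.arange(1, 9), x))
U = np.array([solve_poisson(f) for f in F])

wav, lev = "db4", 3
dec = lambda v: pywt.wavedec(v, wav, level=lev)
Fc = [dec(f) for f in F]; Uc = [dec(u) for u in U]

# learn one linear map per wavelet band by least squares
maps = []
for b in range(lev + 1):
    A = np.array([c[b] for c in Fc]); B = np.array([c[b] for c in Uc])
    W, *_ = np.linalg.lstsq(A, B, rcond=None)
    maps.append(W)

# apply the learned operator to a fresh input and reconstruct
f_test = np.sin(3 * np.pi * x) + 0.5 * np.sin(5 * np.pi * x)
pred_coeffs = [c @ maps[b] for b, c in enumerate(dec(f_test))]
u_pred = pywt.waverec(pred_coeffs, wav)[:n]
u_ref = solve_poisson(f_test)
print("relative error:", np.linalg.norm(u_pred - u_ref) / np.linalg.norm(u_ref))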
[05013] Neural Option Pricing for Rough Bergomi Model
Format : Talk at Waseda University
Author(s) :
Guanglian Li (HKU)
Abstract : This research investigates pricing financial options under the rough Bergomi model using neural SDEs. We propose an efficient approximation of sample paths based on a sum of exponentials and use the Wasserstein distance as the loss function for network training. Option pricing is based entirely on classical martingale theory. Our experimental results indicate that the error in the option price can be bounded by the Wasserstein distance attained during training.
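One ingredient of such approaches can be illustrated compactly: the fractional kernel t^(H-1/2) appearing in the rough Bergomi model admits a sum-of-exponentials approximation, which turns the non-Markovian dynamics into a finite number of Markovian auxiliary factors. The node placement and least-squares fit below are illustrative choices, not the authors' scheme.

# Sketch: approximate K(t) = t**(H - 0.5) by sum_i w_i * exp(-x_i * t)
import numpy as np

H = 0.1                                   # Hurst parameter (rough regime, H < 1/2)
t = np.geomspace(1e-4, 1.0, 400)
K_true = t ** (H - 0.5)

# log-spaced decay rates; weights fitted by least squares in relative error
x_nodes = np.geomspace(1e-1, 1e4, 20)
E = np.exp(-np.outer(t, x_nodes))
w, *_ = np.linalg.lstsq(E / K_true[:, None], np.ones_like(t), rcond=None)

rel_err = np.abs(E @ w - K_true) / K_true
print("max relative error of the 20-term approximation on [1e-4, 1]:", rel_err.max())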
[05221] One-shot learning of stochastic differential equations with kernel methods
Format : Talk at Waseda University
Author(s) :
Matthieu Darcy (California Institute of Technology)
Boumediene Hamzi (Johns Hopkins University)
Giulia Livieri (Scuola Normale Superiore)
Houman Owhadi (California Institute of Technology)
Peyman Tavallali (Jet Propulsion Laboratory, NASA)
Abstract : We consider the problem of learning a stochastic differential equation from a single sample trajectory, a challenging problem since one trajectory provides only indirect information on the unknown functions. We propose a kernel-based method that recovers the drift function $f$ and the diffusion function $\sigma$ via maximum a posteriori estimation given the data. Additionally, we learn the kernels from data by randomized cross-validation. Numerical examples illustrate the efficacy and robustness of our method.
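A minimal sketch of the underlying idea, assuming Gaussian kernels with fixed hyperparameters and Euler-Maruyama increments (the talk's method instead uses MAP estimation and learns the kernels by randomized cross-validation): regress the scaled increments on the state to estimate the drift, then regress the scaled squared residuals to estimate the squared diffusion.

# Sketch: recover drift f and diffusion sigma of dX = f(X) dt + sigma(X) dW
# from a single trajectory via kernel ridge regression on the increments.
import numpy as np

rng = np.random.default_rng(1)
f_true = lambda x: -x + np.sin(2 * x)            # drift used to generate the data
s_true = lambda x: 0.5 + 0.3 * np.cos(x)         # diffusion used to generate the data

# one sample trajectory (Euler-Maruyama)
dt, n = 1e-2, 2000
X = np.empty(n); X[0] = 0.5
for k in range(n - 1):
    X[k + 1] = X[k] + f_true(X[k]) * dt + s_true(X[k]) * np.sqrt(dt) * rng.standard_normal()

Xc, Y = X[:-1], X[1:]

def krr(inputs, targets, ell=0.3, lam=1e-3):
    # Gaussian-kernel ridge regression; returns the fitted function
    K = np.exp(-(inputs[:, None] - inputs[None, :]) ** 2 / (2 * ell ** 2))
    coef = np.linalg.solve(K + lam * np.eye(len(inputs)), targets)
    return lambda z: np.exp(-(z[:, None] - inputs[None, :]) ** 2 / (2 * ell ** 2)) @ coef

f_hat = krr(Xc, (Y - Xc) / dt)                    # drift from the increments
resid = Y - Xc - f_hat(Xc) * dt
s2_hat = krr(Xc, resid ** 2 / dt)                 # squared diffusion from squared residuals

z = np.linspace(X.min(), X.max(), 200)
print("drift RMSE:    ", np.sqrt(np.mean((f_hat(z) - f_true(z)) ** 2)))
print("diffusion RMSE:", np.sqrt(np.mean((np.sqrt(np.clip(s2_hat(z), 0, None)) - s_true(z)) ** 2)))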
Abstract : In this talk, we present a deep-learning-based reduced order modeling method for stochastic flow problems in highly heterogeneous media. We use supervised learning to build a reduced surrogate map from the stochastic parameter space that characterizes the possible highly heterogeneous media to the solution space of a stochastic flow problem. The research of Eric Chung is partially supported by the Hong Kong RGC General Research Fund (Projects 14305222 and 14304021).
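A schematic of such a surrogate pipeline, in which the random-field parameterization, the 1D Darcy-type solver, and the scikit-learn MLP are simple stand-ins for the actual heterogeneous-media model and network:

# Sketch: supervised surrogate from a stochastic-media parameter to the flow solution.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
n, d = 65, 6                                   # grid size, stochastic dimension
x = np.linspace(0, 1, n); h = x[1] - x[0]
modes = np.sin(np.pi * np.outer(np.arange(1, d + 1), x[:-1] + h / 2))   # at cell midpoints

def solve_flow(xi):
    a = np.exp(2.0 * xi @ modes)               # log-normal-style heterogeneous coefficient
    # finite-volume discretization of -(a u')' = 1 with u(0) = u(1) = 0
    main = (a[:-1] + a[1:]) / h ** 2
    off = -a[1:-1] / h ** 2
    A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
    u = np.zeros(n); u[1:-1] = np.linalg.solve(A, np.ones(n - 2))
    return u

Xi = rng.uniform(-1, 1, (800, d))
Usol = np.array([solve_flow(xi) for xi in Xi])

surrogate = MLPRegressor(hidden_layer_sizes=(128, 128), max_iter=2000, random_state=0)
surrogate.fit(Xi[:700], Usol[:700])

pred = surrogate.predict(Xi[700:])
err = np.linalg.norm(pred - Usol[700:]) / np.linalg.norm(Usol[700:])
print("relative test error of the reduced surrogate:", err)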
[03131] Multilevel Picard Approximation Algorithm for Semi-linear Integro-differential Equations
Format : Talk at Waseda University
Author(s) :
Ariel Neufeld (Nanyang Technological University)
Sizhou Wu (Nanyang Technological University)
Abstract : We introduce a multilevel Picard approximation algorithm for semi-linear parabolic partial integro-differential equations (PIDEs). We prove that the numerical approximation scheme converges to the unique viscosity solution of the PIDE under consideration. To that end, we derive a nonlinear Feynman-Kac formula. Furthermore, we show that the algorithm does not suffer from the curse of dimensionality, i.e., the computational complexity of the algorithm is bounded polynomially in the dimension and the reciprocal of the prescribed accuracy.
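A compact sketch of the multilevel Picard recursion for a semilinear heat equation is given below; the integro (jump) part of the PIDE setting is omitted, and the terminal condition and nonlinearity are illustrative choices.

# Sketch: multilevel Picard (MLP) approximation for u_t + Laplace(u) + f(u) = 0 on [0, T],
# u(T, x) = g(x): a Monte Carlo estimate of the terminal term plus telescoping Picard corrections.
import numpy as np

rng = np.random.default_rng(3)
d, T = 10, 0.3
g = lambda x: 1.0 / (1.0 + np.sum(x ** 2))     # terminal condition (illustrative)
f = lambda u: u - u ** 3                       # Allen-Cahn-type nonlinearity (illustrative)

def U(n, M, t, x):
    """Multilevel Picard estimate of u(t, x) with n levels and basis M."""
    if n == 0:
        return 0.0
    # terminal term: Monte Carlo estimate of E[g(x + sqrt(2(T-t)) Z)]
    Z = rng.standard_normal((M ** n, d))
    est = np.mean([g(x + np.sqrt(2.0 * (T - t)) * z) for z in Z])
    # telescoping Picard corrections over levels l = 0, ..., n-1
    for l in range(n):
        m = M ** (n - l)
        for _ in range(m):
            r = t + (T - t) * rng.uniform()                    # random intermediate time
            y = x + np.sqrt(2.0 * (r - t)) * rng.standard_normal(d)
            corr = f(U(l, M, r, y))
            if l > 0:
                corr -= f(U(l - 1, M, r, y))
            est += (T - t) * corr / m
    return est

x0 = np.zeros(d)
for n in range(1, 5):
    print(f"n = {n}:  U_n(0, x0) = {U(n, 3, 0.0, x0):.4f}")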
[03090] Exponentially Convergent Multiscale Finite Element Method
Format : Talk at Waseda University
Author(s) :
Yixuan Wang (California Institute of Technology)
Abstract : We propose the exponentially convergent multiscale finite element method (ExpMsFEM) for efficient model reduction of PDEs in heterogeneous media without scale separation and in high-frequency wave propagation. ExpMsFEM builds on the non-overlapping domain decomposition of the classical MsFEM while enriching the approximation space systematically to achieve a nearly exponential convergence rate with respect to the number of basis functions.
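For orientation, the sketch below shows the classical 1D MsFEM construction that ExpMsFEM builds on, with multiscale basis functions that are a(x)-harmonic on each coarse element; the systematic enrichment responsible for the exponential convergence is not reproduced here.

# Sketch: classical multiscale FEM in 1D for -(a(x) u')' = f with a rough coefficient.
# The a-harmonic nodal basis gives each coarse element the stiffness 1 / integral(1/a).
import numpy as np

rng = np.random.default_rng(4)
nf, H = 1024, 1 / 16                          # fine cells, coarse mesh size
xf = np.linspace(0, 1, nf + 1)
xm = 0.5 * (xf[:-1] + xf[1:]); hf = xf[1] - xf[0]
a = np.exp(np.interp(xm, np.linspace(0, 1, 200), rng.uniform(-2, 2, 200)))  # rough a(x)
f = np.ones(nf)                               # constant source

# coarse elements and element-wise harmonic-mean stiffness
xc = np.arange(0, 1 + H / 2, H); nc = len(xc) - 1
elem = np.minimum((xm / H).astype(int), nc - 1)
k_e = np.array([1.0 / np.sum(hf / a[elem == e]) for e in range(nc)])

# assemble coarse stiffness and (lumped) load for the multiscale hat basis
A = np.zeros((nc + 1, nc + 1)); b = np.zeros(nc + 1)
for e in range(nc):
    A[e:e + 2, e:e + 2] += k_e[e] * np.array([[1, -1], [-1, 1]])
    b[e:e + 2] += 0.5 * np.sum(f[elem == e] * hf)
u = np.zeros(nc + 1)
u[1:-1] = np.linalg.solve(A[1:-1, 1:-1], b[1:-1])        # homogeneous Dirichlet BCs

# reference fine-scale solve for comparison at the coarse nodes
main = (a[:-1] + a[1:]) / hf ** 2
off = -a[1:-1] / hf ** 2
Af = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
uf = np.zeros(nf + 1); uf[1:-1] = np.linalg.solve(Af, f[:-1])
print("coarse-node max error vs fine solve:", np.abs(u - uf[::nf // nc]).max())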
[05179] Learning Solutions to Elliptic PDEs with Discontinuous Multiscale Parameters
Format : Talk at Waseda University
Author(s) :
Margaret Katherine Trautner (California Institute of Technology)
Abstract : Elliptic partial differential equations with discontinuous coefficients arise in modeling the dynamics of solid materials. When these coefficients are also multiscale, homogenization theory eliminates the rapidly varying stiff variable. The bottleneck of this approach is solving an associated cell problem, whose discontinuous parameters make it computationally expensive to solve. We therefore aim to learn the cell problem solution by data-driven means. In this talk, we describe rigorous theory underpinning these learning methods and numerical experiments that validate the theory.
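In one space dimension the cell problem is explicit, which makes it easy to generate (coefficient, corrector) training pairs; the piecewise-constant coefficient model below is an illustrative assumption.

# Sketch: generate training data (A, chi) for learning cell-problem solutions in 1D
# periodic homogenization of -(A(y)(chi'(y) + 1))' = 0 on the unit cell.
# In 1D, A(y)(chi' + 1) is a constant Abar (the harmonic mean of A), so
# chi'(y) = Abar / A(y) - 1 and chi follows by integration.
import numpy as np

rng = np.random.default_rng(5)
ny, n_samples = 256, 1000
y = np.linspace(0, 1, ny, endpoint=False); dy = 1.0 / ny

def sample_coefficient():
    # piecewise-constant coefficient with random jump locations and values
    n_pieces = rng.integers(2, 6)
    edges = np.sort(np.concatenate(([0.0, 1.0], rng.uniform(0, 1, n_pieces - 1))))
    vals = rng.uniform(0.1, 10.0, n_pieces)
    return vals[np.searchsorted(edges, y, side="right") - 1]

coeffs, correctors, effective = [], [], []
for _ in range(n_samples):
    A = sample_coefficient()
    Abar = 1.0 / np.mean(1.0 / A)               # harmonic mean = effective coefficient
    dchi = Abar / A - 1.0                       # corrector derivative
    chi = np.cumsum(dchi) * dy
    chi -= chi.mean()                           # fix the additive constant (zero mean)
    coeffs.append(A); correctors.append(chi); effective.append(Abar)

coeffs, correctors = np.array(coeffs), np.array(correctors)
print("dataset shapes:", coeffs.shape, correctors.shape)
print("example effective coefficient:", effective[0])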
Abstract : Koopman operators are infinite-dimensional operators that globally linearize nonlinear dynamical systems, making their spectral information valuable for understanding dynamics. They have received considerable attention over the last decade, yet computing their spectral properties is a major challenge. I will present some recent advances in data-driven computation of Koopman spectral properties, including ResDMD and its analogue for stochastic dynamical systems. These new algorithms verifiably converge to the correct spectral properties (avoiding issues such as spectral pollution).
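A small illustration of the data-driven computation and of the residual idea behind ResDMD (the dynamical system and polynomial dictionary below are illustrative): run EDMD on snapshot pairs and attach to each candidate eigenpair the sample residual ||Psi_Y g - lambda Psi_X g|| / ||Psi_X g||, which flags spurious spectral content.

# Sketch: EDMD plus per-eigenpair residuals, in the spirit of ResDMD.
import numpy as np

rng = np.random.default_rng(6)
mu = 0.9                                           # simple nonlinear map on R^2
F = lambda z: np.array([mu * z[0], 0.5 * z[1] + (mu ** 2 - 0.5) * z[0] ** 2])

# snapshot pairs (x_m, y_m = F(x_m))
M = 500
X = rng.uniform(-1, 1, (M, 2))
Y = np.array([F(x) for x in X])

# dictionary of observables: monomials x1^i * x2^j up to total degree 3
def dictionary(Z):
    cols = [Z[:, 0] ** i * Z[:, 1] ** j for i in range(4) for j in range(4) if i + j <= 3]
    return np.column_stack(cols)

PsiX, PsiY = dictionary(X), dictionary(Y)
Kmat = np.linalg.lstsq(PsiX, PsiY, rcond=None)[0]  # EDMD approximation of the Koopman matrix
lams, V = np.linalg.eig(Kmat)

for i in np.argsort(-np.abs(lams)):
    g = V[:, i]
    res = np.linalg.norm(PsiY @ g - lams[i] * (PsiX @ g)) / np.linalg.norm(PsiX @ g)
    print(f"lambda = {lams[i].real:+.3f}{lams[i].imag:+.3f}i   residual = {res:.2e}")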
[05147] Solving path-dependent PDEs with signature kernels
Format : Talk at Waseda University
Author(s) :
Cristopher Salvi (Imperial College London)
Abstract : In this talk I will introduce a kernel framework for solving path-dependent PDEs (PPDEs) that leverages signature kernels, a recently introduced class of kernels indexed on path space. The proposed method recasts the original infinite-dimensional optimisation problem as an optimal recovery problem, approximating the solution of a PPDE by the element of minimal norm in the (signature) reproducing kernel Hilbert space constrained to satisfy the PPDE at a finite collection of collocation paths. By the representer theorem, the optimisation has a unique, analytic solution expressed entirely in terms of simple linear algebra operations. I will discuss some motivating examples from rough volatility and present numerical results on option pricing under a rough Bergomi model.
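For intuition, the sketch below evaluates a signature kernel from truncated signatures of piecewise-linear paths via Chen's identity; the talk's framework uses the untruncated signature kernel through its PDE characterization, so the truncation level here is an illustrative simplification.

# Sketch: truncated signature kernel of two piecewise-linear paths in R^d.
# The signature of a single linear segment with increment D is (1, D, D⊗D/2!, D⊗D⊗D/3!, ...);
# segments are combined with Chen's identity, and the kernel is the inner product of signatures.
import numpy as np

def segment_signature(delta, m=3):
    sig = [np.array(1.0), delta]
    for k in range(2, m + 1):
        sig.append(np.multiply.outer(sig[-1], delta) / k)
    return sig

def chen(a, b, m=3):
    # level-k term of the concatenated path: sum_i a_i ⊗ b_{k-i}
    return [sum(np.multiply.outer(a[i], b[k - i]) for i in range(k + 1)) for k in range(m + 1)]

def signature(path, m=3):
    # path: array of shape (N, d); truncated signature built segment by segment
    sig = [np.array(1.0)] + [np.zeros((path.shape[1],) * k) for k in range(1, m + 1)]
    for delta in np.diff(path, axis=0):
        sig = chen(sig, segment_signature(delta, m), m)
    return sig

def sig_kernel(path_x, path_y, m=3):
    sx, sy = signature(path_x, m), signature(path_y, m)
    return sum(float(np.sum(a * b)) for a, b in zip(sx, sy))

# two toy paths in R^2 (e.g., time-augmented price paths)
t = np.linspace(0, 1, 50)
X = np.column_stack([t, np.sin(2 * np.pi * t)])
Y = np.column_stack([t, np.cos(2 * np.pi * t) - 1])
print("k(X, X) =", sig_kernel(X, X))
print("k(X, Y) =", sig_kernel(X, Y))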
[05248] Kernel Methods for Rough PDEs
Format : Talk at Waseda University
Author(s) :
Edoardo Calvello (California Institute of Technology)
Ricardo Baptista (California Institute of Technology)
Matthieu Darcy (California Institute of Technology)
Houman Owhadi (California Institute of Technology)
Andrew Stuart (California Institute of Technology)
Xianjin Yang (California Institute of Technology)
Abstract : Following the promising success of kernel methods in solving nonlinear partial differential equations (PDEs), we investigate the application of Gaussian process methods to solving PDEs with rough right-hand sides. We introduce an optimal recovery scheme defined by a Reproducing Kernel Hilbert Space (RKHS) of functions of greater regularity than that of the PDE’s solution. We illustrate the resulting theoretical framework for recovering solutions of the PDE and present related numerical experiments.
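A minimal kernel-collocation sketch for a 1D elliptic model problem is given below, with a smooth right-hand side and a Gaussian kernel; handling genuinely rough right-hand sides is precisely the extension the talk addresses. The constraint functionals (interior: point evaluation composed with -d^2/dx^2; boundary: point evaluation) define a Gram matrix, and the minimal-norm RKHS element satisfying the constraints is obtained by a single linear solve.

# Sketch: optimal-recovery / Gaussian-process collocation for -u'' = f on (0,1), u(0)=u(1)=0.
import numpy as np

ell = 0.2
k    = lambda r: np.exp(-r ** 2 / (2 * ell ** 2))                                  # K(x, y)
d2k  = lambda r: (r ** 2 / ell ** 4 - 1 / ell ** 2) * k(r)                         # d^2 K / dx^2
d2d2 = lambda r: (3 / ell ** 4 - 6 * r ** 2 / ell ** 6 + r ** 4 / ell ** 8) * k(r)  # d^4 K / dx^2 dy^2

f_rhs   = lambda x: np.pi ** 2 * np.sin(np.pi * x)
u_exact = lambda x: np.sin(np.pi * x)

xi = np.linspace(0, 1, 17)[1:-1]        # interior collocation points
xb = np.array([0.0, 1.0])               # boundary points

# Gram matrix of the constraint functionals applied to both arguments of K
r_ii = xi[:, None] - xi[None, :]
r_ib = xi[:, None] - xb[None, :]
r_bb = xb[:, None] - xb[None, :]
Theta = np.block([[ d2d2(r_ii), -d2k(r_ib)],
                  [-d2k(r_ib).T,  k(r_bb)]])
rhs = np.concatenate([f_rhs(xi), np.zeros(2)])
alpha = np.linalg.solve(Theta + 1e-8 * np.eye(len(rhs)), rhs)

# recovered solution: u(x) = sum_j alpha_j * (functional_j applied to K(x, .))
xe = np.linspace(0, 1, 200)
Phi = np.hstack([-d2k(xe[:, None] - xi[None, :]), k(xe[:, None] - xb[None, :])])
u_hat = Phi @ alpha
print("max error of the recovered solution:", np.abs(u_hat - u_exact(xe)).max())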