Abstract : Machine learning techniques are becoming increasingly prominent in solving complex dynamical systems and are utilized in data-driven applications such as inverse problems and model discovery. Yet important geometric and physical structures have traditionally not been incorporated into such approaches, leading to a loss of accuracy in long-term predictions. This minisymposium aims to bring together researchers from diverse groups to improve machine learning techniques using ideas inspired by geometric integration.
Organizer(s) : Elena Celledoni, James Jackaman and Andy Wan
00908 (1/2) : 2E @E817 A715 [Chair: James Jackaman]
[03459] Geometric integration in machine learning
Format : Talk at Waseda University
Author(s) :
James Jackaman (NTNU)
Abstract : Here, as a primer, we give an overview of the role geometric integration (of Hamiltonian systems) has played in the design of neural networks in recent years, with an emphasis on the stability guarantees this provides and the structures incorporated into the networks. Time permitting, we will move on to discuss how convolutional networks can be understood through finite differences and the theoretical benefits this comparison yields.
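For readers unfamiliar with this design pattern, the following is a minimal sketch (not the speaker's code) of a Hamiltonian-inspired residual network in the spirit of the Stoermer-Verlet scheme; the tanh activation, random weights, and step size are illustrative assumptions.

```python
import numpy as np

def sigma(x):
    return np.tanh(x)

class VerletBlock:
    """One Verlet-style layer: the state is split into (y, z), and each
    half is updated using the freshly updated other half, mimicking the
    Stoermer-Verlet scheme for a separable Hamiltonian system."""

    def __init__(self, dim, h=0.1, seed=0):
        rng = np.random.default_rng(seed)
        self.K1 = rng.normal(scale=0.1, size=(dim, dim))  # placeholder weights
        self.K2 = rng.normal(scale=0.1, size=(dim, dim))
        self.b1 = np.zeros(dim)
        self.b2 = np.zeros(dim)
        self.h = h

    def forward(self, y, z):
        z = z - self.h * self.K1.T @ sigma(self.K1 @ y + self.b1)
        y = y + self.h * self.K2.T @ sigma(self.K2 @ z + self.b2)
        return y, z

# stack a few blocks into a toy network
dim = 4
net = [VerletBlock(dim, seed=k) for k in range(8)]
y, z = np.ones(dim), np.zeros(dim)
for block in net:
    y, z = block.forward(y, z)
```

Because each half-step is a shear map, every block is exactly invertible, which is one source of the stability guarantees such architectures can provide.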
[03490] Application of the Kernel Method to Learning Hamiltonian Equations
Format : Talk at Waseda University
Author(s) :
Taisei Ueda (Kobe University)
Takashi Matsubara (Osaka University)
Takaharu Yaguchi (Kobe University)
Abstract : Recently, methods for learning Hamiltonian systems from data have attracted much attention. While most existing methods are based on neural networks, neural networks have some drawbacks, such as the possibility of falling into a local optimum. In this talk, we propose a method based on the kernel method, thereby overcoming these problems.
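As a rough illustration of the general idea (a sketch under our own assumptions, not the authors' method), one can model the Hamiltonian as a kernel expansion and fit its coefficients by linear least squares; the Gaussian kernel, center choice, and pendulum test data below are illustrative.

```python
import numpy as np

# Learn H from samples of x = (q, p) and xdot = J grad H(x), modelling
# H(x) = sum_j alpha_j k(x, c_j) with a Gaussian kernel (assumed here).

def grad_k(x, c, ell=1.0):
    """Gradient in x of the Gaussian kernel k(x, c)."""
    d = x - c
    return -d / ell**2 * np.exp(-d @ d / (2 * ell**2))

J = np.array([[0.0, 1.0], [-1.0, 0.0]])   # canonical symplectic matrix

# toy data: pendulum H(q, p) = p^2/2 - cos(q)
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(200, 2))
Xdot = np.stack([X[:, 1], -np.sin(X[:, 0])], axis=1)

centers = X[:50]
Phi = np.array([[J @ grad_k(x, c) for c in centers] for x in X])  # (N, M, 2)
A = Phi.transpose(0, 2, 1).reshape(len(X) * 2, len(centers))      # (2N, M)
alpha, *_ = np.linalg.lstsq(A, Xdot.reshape(-1), rcond=None)

def H(x):
    """Learned Hamiltonian (up to an additive constant)."""
    return sum(a * np.exp(-(x - c) @ (x - c) / 2) for a, c in zip(alpha, centers))
```

Because the coefficients solve a linear least-squares problem, the fit has no spurious local optima, in contrast to the neural-network drawback mentioned above.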
[03180] Structured neural networks and some applications
Format : Online Talk on Zoom
Author(s) :
Davide Murari (Norwegian University of Science and Technology)
Elena Celledoni (Norwegian University of Science and Technology)
Brynjulf Owren (Norwegian University of Science and Technology)
Ferdia Sherry (University of Cambridge)
Carola-Bibiane Schönlieb (University of Cambridge)
Abstract : Neural networks have gained much interest because of their effectiveness in many applications related to high-dimensional function approximation problems. This success is often supported by experimental evidence, while the theoretical properties of these models remain to be better understood. When one knows that the target function to approximate, or the data being processed, has certain properties, it may be desirable to reproduce them in the neural network design. This talk presents a framework that makes ODEs and numerical methods work together to model neural networks with prescribed properties. We support this approach with particular applications to data-driven modelling and image analysis.
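A minimal sketch of the flavour of such constructions, under our own simplifying assumptions (explicit Euler steps of a gradient flow; the framework presented in the talk is more general):

```python
import numpy as np

def layer(x, A, b, h=0.05):
    """One explicit Euler step of the gradient flow x' = -A^T tanh(Ax + b).

    The vector field is the negative gradient of a convex potential, so
    for sufficiently small h (relative to ||A||^2) each layer map is
    nonexpansive and the whole network is 1-Lipschitz by construction.
    Weights here are random placeholders."""
    return x - h * A.T @ np.tanh(A @ x + b)

rng = np.random.default_rng(1)
dim, depth = 8, 20
params = [(rng.normal(size=(dim, dim)), rng.normal(size=dim)) for _ in range(depth)]

x, y = rng.normal(size=dim), rng.normal(size=dim)
for A, b in params:
    x, y = layer(x, A, b), layer(y, A, b)
# with small enough h, ||x - y|| never exceeds the initial distance
```

Prescribed properties such as this 1-Lipschitz bound are useful, for instance, for robustness in image analysis tasks.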
[02911] Auxiliary Functions as Koopman Observables
Format : Online Talk on Zoom
Author(s) :
Jason John Bramburger (Concordia University)
Abstract : Many important statements about dynamical systems can be proved by finding scalar-valued auxiliary functions whose time evolution along trajectories obeys certain pointwise inequalities that imply the desired result. The most familiar of these auxiliary functions is a Lyapunov function used to prove steady-state stability, but such functions can also be used to bound averages of ergodic systems, define trapping boundaries, and much more. In this talk I will highlight a method of identifying auxiliary functions from data using polynomial optimization. The method leverages recent advances in approximating the Koopman operator from data, namely extended dynamic mode decomposition, to provide system-level information without system identification. The result is a model-agnostic computational method that can be used to bound quantities of interest and develop optimal state-dependent feedback controllers, while also functioning as a pre-conditioner for discovering accurate and parsimonious dynamical models from data.
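A minimal sketch of the extended dynamic mode decomposition (EDMD) building block, with an illustrative monomial dictionary and logistic-map data that are our own assumptions rather than the examples of the talk:

```python
import numpy as np

def dictionary(x, degree=4):
    """Evaluate the monomials 1, x, ..., x^degree at the samples x."""
    return np.stack([x**j for j in range(degree + 1)], axis=1)

# snapshot pairs (x_k, x_{k+1}) of the logistic map (assumed toy system)
rng = np.random.default_rng(0)
x = rng.uniform(0, 1, size=2000)
y = 3.9 * x * (1 - x)

Psi_x, Psi_y = dictionary(x), dictionary(y)
# least-squares Koopman matrix K such that Psi_x @ K ~ Psi_y
K, *_ = np.linalg.lstsq(Psi_x, Psi_y, rcond=None)
print(np.round(np.linalg.eigvals(K), 3))   # approximate Koopman eigenvalues
```

The auxiliary functions themselves are then sought within the span of the dictionary via polynomial (e.g. sum-of-squares) optimization, a step this sketch omits.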
[03401] Conservative Hamiltonian Monte Carlo
Format : Online Talk on Zoom
Author(s) :
Geoffrey McGregor (University of Toronto)
Andy Wan (University of Northern British Columbia)
Abstract : Markov Chain Monte Carlo (MCMC) methods enable us to extract meaningful statistics from complex distributions which frequently appear in parameter estimation, Bayesian statistics, statistical mechanics and machine learning. However, as the dimensionality of the problem increases, the convergence rate of MCMC sequences toward the stationary distribution slows down dramatically. This has led to the development of Hamiltonian Monte Carlo (HMC) [Duane et al. ‘87, Neal ‘93], which improves performance by solving a Hamiltonian system using symplectic numerical methods. However, modern high-dimensional applications still pose a significant challenge for HMC.
In this talk, we introduce Conservative Hamiltonian Monte Carlo (CHMC), which instead utilizes an energy-preserving numerical method known as the Discrete Multiplier Method. We show that CHMC converges to the correct stationary distribution under appropriate conditions, and we provide numerical examples showcasing improvements in convergence rates over HMC on high-dimensional problems. Furthermore, we will also present numerical results on Bayesian parameter estimation using CHMC.
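For context, here is a minimal sketch of standard HMC with the leapfrog (Stoermer-Verlet) integrator on an assumed standard Gaussian target; CHMC as described above would replace the leapfrog step with an exactly energy-preserving scheme from the Discrete Multiplier Method, which is not implemented here.

```python
import numpy as np

def grad_U(q):
    """Gradient of the potential U(q) = 0.5 ||q||^2 (assumed target)."""
    return q

def leapfrog(q, p, h, L):
    """L leapfrog steps of the Hamiltonian H(q, p) = U(q) + 0.5 ||p||^2."""
    p = p - 0.5 * h * grad_U(q)
    for _ in range(L - 1):
        q = q + h * p
        p = p - h * grad_U(q)
    q = q + h * p
    p = p - 0.5 * h * grad_U(q)
    return q, p

def hmc(q0, n_samples, h=0.2, L=10, seed=0):
    rng = np.random.default_rng(seed)
    q, samples = q0, []
    for _ in range(n_samples):
        p = rng.normal(size=q.shape)                  # resample momentum
        H0 = 0.5 * p @ p + 0.5 * q @ q
        q_new, p_new = leapfrog(q, p, h, L)
        H1 = 0.5 * p_new @ p_new + 0.5 * q_new @ q_new
        if rng.uniform() < np.exp(H0 - H1):           # Metropolis correction
            q = q_new
        samples.append(q.copy())
    return np.array(samples)

samples = hmc(np.zeros(5), 1000)
```

Because the leapfrog integrator only approximately conserves the Hamiltonian, the Metropolis step rejects more proposals as dimension and step size grow; this energy drift is what motivates CHMC's energy-preserving integrator.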
[03462] Model Reduction of Hamiltonian Systems based on Nonlinear Approximation Methods
Format : Talk at Waseda University
Author(s) :
Silke Glas (University of Twente)
Abstract : In this talk we consider structure-preserving model reduction of Hamiltonian systems, such that the reduced model is again a Hamiltonian system. We extend classical linear-subspace model reduction methods, for which the best possible error is bounded by the Kolmogorov N-width, to reduced models constructed via nonlinear approximations. In particular, we choose symplectic quadratic embeddings as our nonlinear approximation functions.
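For contrast with the nonlinear approach of the talk, the following is a minimal sketch of the classical linear baseline: a symplectic reduced basis built from the cotangent lift of a POD basis. The random snapshot data is purely illustrative.

```python
import numpy as np

n, k = 100, 6
rng = np.random.default_rng(0)
Q_snapshots = rng.normal(size=(n, 40))   # position snapshots (toy data)
P_snapshots = rng.normal(size=(n, 40))   # momentum snapshots (toy data)

# POD basis from the stacked snapshots, then its cotangent lift
Phi, *_ = np.linalg.svd(np.hstack([Q_snapshots, P_snapshots]),
                        full_matrices=False)
Phi = Phi[:, :k]
A = np.block([[Phi, np.zeros((n, k))],
              [np.zeros((n, k)), Phi]])  # block-diagonal lift

def Jmat(m):
    """Canonical symplectic matrix J_{2m}."""
    return np.block([[np.zeros((m, m)), np.eye(m)],
                     [-np.eye(m), np.zeros((m, m))]])

# A is symplectic: A^T J_2n A = J_2k, so the reduced model
# z' = J_2k grad H_r(z), with H_r(z) = H(A z), is again Hamiltonian
assert np.allclose(A.T @ Jmat(n) @ A, Jmat(k))
```

Replacing the linear map A by a nonlinear (e.g. quadratic) symplectic embedding, as in the talk, allows the reduced model to beat the Kolmogorov N-width barrier while remaining Hamiltonian.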