Registered Data

[00793] SIAM Student Chapter Research Presentations

  • Session Time & Room :
    • 00793 (1/3) : 2E (Aug.22, 17:40-19:20) @E818
    • 00793 (2/3) : 3C (Aug.23, 13:20-15:00) @E818
    • 00793 (3/3) : 3D (Aug.23, 15:30-17:10) @E818
  • Type : Proposal of Minisymposium
  • Abstract : The SIAM Student Chapter Research Presentations minisymposium is designed to encourage student participation, to give students the opportunity to meet both peers and professionals in their field, and to promote interaction among the newly established Student Chapters in the East Asia Section. The presentations given by students cover recent advances in applied mathematics and computational science. The organizers also hope to encourage the learning community to establish new SIAM student chapters and to strengthen collaboration opportunities between students and the SIAM leadership.
  • Organizer(s) : Tulin Kaman, Eric Chung
  • Classification : 68Txx, 70Kxx, 62Dxx, Machine Learning, Nonlinear Dynamics, Statistical Sampling Theory
  • Minisymposium Program :
    • 00793 (1/3) : 2E @E818
      • [03324] Continuum Limit of Nonlocal Diffusion on Inhomogeneous Networks
        • Author(s) :
          • Itsuki Watanabe (Waseda University)
        • Abstract : We present two limit theorems for the zero-range process with nonlocal diffusion on inhomogeneous networks. The deterministic model is governed by a reaction–diffusion equation with an integral term in space instead of a Laplacian. By constructing a reproducing kernel Hilbert space that accounts for the inhomogeneities of the network structure, we prove that the law of large numbers and the central limit theorem hold for our models.
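        • Code sketch : A minimal Python sketch of the deterministic limit described above: forward-Euler time stepping of $du_i/dt = \sum_j W_{ij}(u_j - u_i) + f(u_i)$ on a weighted network. The kernel, reaction term, and parameters are illustrative assumptions rather than those of the talk, and the stochastic zero-range process itself is not simulated.
          ```python
          # Toy forward-Euler simulation of a nonlocal reaction-diffusion system on a
          # weighted network: du_i/dt = sum_j W_ij (u_j - u_i) + f(u_i).
          # The kernel W and reaction f are illustrative choices, not those of the talk.
          import numpy as np

          rng = np.random.default_rng(0)
          n = 100                                            # number of nodes
          x = np.sort(rng.uniform(0.0, 1.0, n))              # inhomogeneous node positions
          W = np.exp(-(x[:, None] - x[None, :]) ** 2 / 0.01) # nonlocal coupling kernel
          np.fill_diagonal(W, 0.0)
          W /= n                                             # assumed kernel normalization

          def f(u):                                          # logistic reaction term (assumed)
              return u * (1.0 - u)

          u = rng.uniform(0.0, 0.2, n)                       # initial state
          dt, steps = 1e-2, 2000
          for _ in range(steps):
              diffusion = W @ u - W.sum(axis=1) * u          # integral term replacing the Laplacian
              u = u + dt * (diffusion + f(u))

          print(u.min(), u.max())
          ```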
      • [03390] Computing the invariant measure of the N-vortex problem on the sphere by Hamiltonian Monte Carlo
        • Author(s) :
          • Kota Takeda (Kyoto University)
          • Takashi Sakajo (Kyoto University)
        • Abstract : We consider Hamiltonian Monte Carlo (HMC) to approximate a given target distribution on manifolds embedded in Euclidean space. Its efficiency is guaranteed by an exponential convergence property called geometric ergodicity. We have proven that HMC is geometrically ergodic for smooth distributions on compact manifolds. As an application, we compute the invariant measure of the N-vortex problem on the unit sphere by HMC.
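        • Code sketch : A minimal sketch of standard (Euclidean) HMC with leapfrog integration and a Metropolis correction, targeting a toy Gaussian density. The manifold-constrained sampler and the N-vortex Hamiltonian from the talk are not reproduced; the potential, step size, and trajectory length are assumptions.
          ```python
          # Minimal Hamiltonian Monte Carlo with leapfrog integration for an
          # unnormalized density exp(-U(q)) on R^d; the manifold-constrained variant
          # used for the N-vortex problem in the talk is not reproduced here.
          import numpy as np

          rng = np.random.default_rng(1)

          def U(q):                                     # toy potential: standard Gaussian (assumed)
              return 0.5 * q @ q

          def grad_U(q):
              return q

          def hmc_step(q, eps=0.1, L=20):
              p = rng.standard_normal(q.shape)          # resample momentum
              q_new, p_new = q.copy(), p.copy()
              p_new -= 0.5 * eps * grad_U(q_new)        # leapfrog half step
              for _ in range(L - 1):
                  q_new += eps * p_new
                  p_new -= eps * grad_U(q_new)
              q_new += eps * p_new
              p_new -= 0.5 * eps * grad_U(q_new)
              # Metropolis correction with H(q, p) = U(q) + |p|^2 / 2
              dH = (U(q_new) + 0.5 * p_new @ p_new) - (U(q) + 0.5 * p @ p)
              return q_new if np.log(rng.uniform()) < -dH else q

          q, samples = np.zeros(3), []
          for _ in range(5000):
              q = hmc_step(q)
              samples.append(q.copy())
          print(np.mean(samples, axis=0), np.var(samples, axis=0))   # approx. 0 and 1
          ```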
      • [03516] Regularized Adaptive Huber Matrix Regression and Distributed Learning
        • Author(s) :
          • YUE WANG (City University of Hong Kong)
        • Abstract : Matrix regression provides a powerful technique for analyzing matrix-type data, as exemplified by many contemporary applications. Despite rapid advances, distributed learning for robust matrix regression that can deal with heavy-tailed noise in the big-data regime remains untouched. In this paper, we first consider adaptive Huber matrix regression with a nuclear norm penalty, which enjoys insensitivity to heavy-tailed noise without losing statistical accuracy. To further enhance scalability in massive data applications, we employ the communication-efficient surrogate likelihood framework to develop distributed robust matrix regression, which can be efficiently implemented through ADMM algorithms. Under only a bounded $(1+\delta)$-th moment on the noise for some $\delta \in (0, 1]$, we provide upper bounds for the estimation error of the central estimator and the distributed estimator and prove that, when the noise has only a bounded second moment, they achieve the same rate as established under sub-Gaussian tails. Numerical studies verify the advantage of the proposed method over existing methods in heavy-tailed noise settings.
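        • Code sketch : A sketch, under assumed parameter choices, of nuclear-norm-penalized Huber trace regression $y_i \approx \langle X_i, \Theta \rangle$ solved by proximal gradient descent. The adaptive tuning of the robustification parameter and the distributed ADMM implementation described in the talk are not shown.
          ```python
          # Nuclear-norm-penalized Huber trace regression via proximal gradient descent.
          # Parameters tau, lam, and step are illustrative; the adaptive/distributed
          # machinery of the talk is not reproduced.
          import numpy as np

          rng = np.random.default_rng(2)
          n, p, q, r = 200, 10, 8, 2
          Theta_true = rng.standard_normal((p, r)) @ rng.standard_normal((r, q))
          X = rng.standard_normal((n, p, q))
          y = np.einsum('ipq,pq->i', X, Theta_true) + rng.standard_t(df=2, size=n)  # heavy-tailed noise

          tau, lam, step = 1.35, 0.5, 0.1

          def huber_grad(res):                          # derivative of the Huber loss
              return np.clip(res, -tau, tau)

          def prox_nuclear(M, t):                       # soft-threshold the singular values
              U, s, Vt = np.linalg.svd(M, full_matrices=False)
              return U @ np.diag(np.maximum(s - t, 0.0)) @ Vt

          Theta = np.zeros((p, q))
          for _ in range(300):
              res = np.einsum('ipq,pq->i', X, Theta) - y
              grad = np.einsum('i,ipq->pq', huber_grad(res), X) / n
              Theta = prox_nuclear(Theta - step * grad, step * lam)

          print(np.linalg.norm(Theta - Theta_true) / np.linalg.norm(Theta_true))
          ```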
      • [03650] Optimal Contextual Bandits with Knapsacks under Realizability via Regression Oracles
        • Author(s) :
          • Jialin ZENG (Hong Kong University of Science and Technology)
          • Yuxuan HAN (Hong Kong University of Science and Technology)
          • Yang WANG (Hong Kong University of Science and Technology)
          • Yang XIANG (Hong Kong University of Science and Technology)
          • Jiheng ZHANG (Hong Kong University of Science and Technology)
        • Abstract : We study the stochastic contextual bandit with knapsacks (CBwK) problem, where each action, taken upon a context, not only yields a random reward but also incurs a random, vector-valued resource consumption. The challenge is to maximize the total reward without violating the budget for each resource. We study this problem under a general realizability setting where the expected reward and expected cost are functions of contexts and actions in given general function classes F and G, respectively. Existing works on CBwK are restricted to the linear function class since they use UCB-type algorithms, which rely heavily on the linear form and are thus difficult to extend to general function classes. Motivated by online regression oracles that have been successfully applied to contextual bandits, we propose the first universal and optimal algorithmic framework for CBwK by reducing it to online regression. We also establish a regret lower bound showing the optimality of our algorithm for a variety of function classes.
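        • Code sketch : An illustrative primal-dual loop for a contextual bandit with a single knapsack constraint: least-squares regression oracles estimate reward and cost, and actions maximize estimated reward minus a dual price times estimated cost. This is a simplified stand-in under an assumed linear environment, not the algorithm analyzed in the talk.
          ```python
          # Simplified CBwK loop with ridge-regression oracles and a Lagrangian dual
          # price; the environment, learning rates, and budget are assumptions.
          import numpy as np

          rng = np.random.default_rng(3)
          d, K, T, budget = 5, 4, 2000, 400.0
          theta_r = rng.uniform(0, 1, (K, d))            # unknown reward parameters
          theta_c = rng.uniform(0, 1, (K, d))            # unknown cost parameters

          A_r = [np.eye(d) for _ in range(K)]; b_r = [np.zeros(d) for _ in range(K)]
          A_c = [np.eye(d) for _ in range(K)]; b_c = [np.zeros(d) for _ in range(K)]
          lam, eta, remaining, total_reward = 1.0, 0.05, budget, 0.0

          for t in range(T):
              if remaining <= 0:
                  break                                  # budget exhausted: stop playing
              x = rng.uniform(0, 1, d)                   # observed context
              r_hat = np.array([np.linalg.solve(A_r[a], b_r[a]) @ x for a in range(K)])
              c_hat = np.array([np.linalg.solve(A_c[a], b_c[a]) @ x for a in range(K)])
              a = int(np.argmax(r_hat - lam * c_hat))    # Lagrangian action selection
              reward = theta_r[a] @ x + 0.1 * rng.standard_normal()
              cost = max(theta_c[a] @ x + 0.1 * rng.standard_normal(), 0.0)
              A_r[a] += np.outer(x, x); b_r[a] += reward * x   # oracle updates
              A_c[a] += np.outer(x, x); b_c[a] += cost * x
              lam = max(lam + eta * (cost - budget / T), 0.0)  # dual price update
              remaining -= cost; total_reward += reward

          print(total_reward, remaining)
          ```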
    • 00793 (2/3) : 3C @E818
      • [03692] Differentially Private Confidence Interval for Extrema of Parameters
        • Author(s) :
          • Xiaowen Fu (Hong Kong University of Science and Technology)
          • Yang Xiang (Hong Kong University of Science and Technology)
          • Xinzhou Guo (Hong Kong University of Science and Technology)
        • Abstract : We aim to construct a valid and efficient confidence interval for the extrema of parameters under privacy protection. Standard statistical inference on the extrema of parameters often suffers from selection bias, and the problem becomes more acute because, in many applications involving extrema of parameters, the privacy of the data must also be protected. In this work, we focus on the exponential family of distributions and propose a privatized parametric bootstrap method to address selection bias in the extrema-of-parameters problem under the scheme of differential privacy. While the usual privatized parametric bootstrap does not address selection bias appropriately, we prove that, with a privatized bias correction term, the proposed parametric bootstrap method leads to a valid and efficient confidence interval for the extrema of parameters. We illustrate the proposed method in the Gaussian and regression cases and demonstrate its advantages via numerical experiments and real data examples.
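        • Code sketch : A sketch of a privatized parametric bootstrap for the maximum of several Gaussian means, with the means released through the Gaussian mechanism. The privatized bias correction term that makes the interval valid in the talk is deliberately omitted, and all noise scales and parameters are assumptions.
          ```python
          # Naive privatized parametric bootstrap for max_k mu_k: group means are
          # released with Gaussian-mechanism noise, then the privatized maximum is
          # bootstrapped from the fitted model.  No bias correction is applied here.
          import numpy as np

          rng = np.random.default_rng(4)
          K, n, sigma_dp = 5, 200, 0.05                  # groups, samples per group, DP noise scale
          mu = np.array([0.0, 0.1, 0.2, 0.2, 0.3])       # true group means (illustrative)
          data = mu + rng.standard_normal((n, K))

          means_priv = data.mean(axis=0) + sigma_dp * rng.standard_normal(K)  # Gaussian mechanism
          se = data.std(axis=0, ddof=1) / np.sqrt(n)

          B, boot_max = 2000, []
          for _ in range(B):
              # parametric bootstrap: redraw group means and DP noise from the fitted model
              m_star = (means_priv + se * rng.standard_normal(K)
                        + sigma_dp * rng.standard_normal(K))
              boot_max.append(m_star.max())

          lo, hi = np.percentile(boot_max, [2.5, 97.5])
          print(f"naive 95% interval for the largest mean: [{lo:.3f}, {hi:.3f}]")
          ```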
      • [03749] Spherical signal processing via framelets and convolutional neural networks
        • Author(s) :
          • Jianfei Li (City University of Hong Kong)
        • Abstract : In this talk, we introduce a general theoretical framework for constructing Haar-type tight framelets on any compact set with a hierarchical partition. In particular, we develop novel spherical framelets with directionality and combine them with CNNs for image denoising and inpainting tasks. Experimental results show that our proposed CNN model outperforms thresholding methods.
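        • Code sketch : A classical Euclidean analogue of the construction above: a one-level Haar two-channel decomposition of a 1-D signal with soft thresholding of the high-pass coefficients. The spherical, directional framelets and the CNN component of the talk are not reproduced; the signal and threshold are assumptions.
          ```python
          # One-level Haar analysis/synthesis with soft thresholding of the
          # high-pass channel; a Euclidean stand-in for the spherical framelets.
          import numpy as np

          rng = np.random.default_rng(5)
          t = np.linspace(0, 1, 512)
          clean = np.sign(np.sin(6 * np.pi * t))              # piecewise-constant signal
          noisy = clean + 0.3 * rng.standard_normal(t.size)

          low = (noisy[0::2] + noisy[1::2]) / np.sqrt(2)      # Haar low-pass channel
          high = (noisy[0::2] - noisy[1::2]) / np.sqrt(2)     # Haar high-pass channel

          thr = 0.3 * np.sqrt(2 * np.log(t.size))             # universal threshold (assumed)
          high = np.sign(high) * np.maximum(np.abs(high) - thr, 0.0)

          rec = np.empty_like(noisy)                          # inverse Haar transform
          rec[0::2] = (low + high) / np.sqrt(2)
          rec[1::2] = (low - high) / np.sqrt(2)
          print(np.mean((noisy - clean) ** 2), np.mean((rec - clean) ** 2))
          ```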
      • [03813] O(N) dense direct factorization with near-perfect weak scaling
        • Author(s) :
          • Sameer Satish Deshmukh (Tokyo Institute of Technology)
          • Rio Yokota (Tokyo Institute of Technology)
          • George Bosilca (University of Tennessee at Knoxville)
        • Abstract : Approximating the off-diagonal blocks of dense matrices arising from the Boundary Element Method can reduce the time of a dense direct factorization from $O(N^3)$ to $O(N)$ with controllable error. Distributed implementations of such factorizations are challenging due to the presence of small, irregular computations. In this talk, we show how asynchronous execution can achieve good weak scaling on Fugaku for a variety of Green's functions on a 2D geometry.
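        • Code sketch : A sketch of the compression step only: an off-diagonal block of a kernel matrix between two well-separated point clusters is replaced by a truncated SVD with a controllable error. The hierarchical $O(N)$ factorization and its asynchronous distributed execution are not reproduced; the kernel and tolerance are assumptions.
          ```python
          # Low-rank compression of an admissible (well-separated) kernel block via
          # truncated SVD; only the compression idea is illustrated here.
          import numpy as np

          rng = np.random.default_rng(6)
          n, tol = 512, 1e-8
          src = rng.uniform(0.0, 1.0, (n, 2))           # source cluster in the unit square
          trg = rng.uniform(0.0, 1.0, (n, 2)) + 3.0     # well-separated target cluster

          def kernel(P, Q):                             # 2-D Laplace Green's function (illustrative)
              d = np.linalg.norm(P[:, None, :] - Q[None, :, :], axis=-1)
              return -np.log(d) / (2.0 * np.pi)

          A_off = kernel(src, trg)                      # off-diagonal block of the dense matrix
          U, s, Vt = np.linalg.svd(A_off)
          rank = int(np.sum(s > tol * s[0]))            # numerical rank at the chosen tolerance
          A_lr = (U[:, :rank] * s[:rank]) @ Vt[:rank]   # low-rank replacement of the block
          print(rank, np.linalg.norm(A_off - A_lr) / np.linalg.norm(A_off))
          ```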
      • [03907] Distributed Optimization with Imperfect Communication
        • Author(s) :
          • Jie Liu (City University of Hong Kong)
        • Abstract : In distributed optimization over multi-agent networks, each agent is endowed with a local private objective function. The purpose of distributed optimization is to minimize the sum of all agents' local objective functions cooperatively through information exchange between different agents. However, communication channels in practice are not always perfect due to limited data rates, communication delays, noise, etc. In this presentation, we shall discuss distributed optimization with imperfect communication.
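        • Code sketch : A toy example of decentralized gradient descent over a ring of agents in which every exchanged message is corrupted by additive noise, illustrating consensus optimization under imperfect communication. The network, local objectives, and noise model are assumptions, not the schemes analyzed in the talk.
          ```python
          # Decentralized gradient descent on a ring with noisy message exchange.
          # Each agent i holds the local objective |x - targets[i]|^2 / 2.
          import numpy as np

          rng = np.random.default_rng(7)
          m, d, T, alpha, noise = 8, 3, 500, 0.05, 0.01
          targets = rng.standard_normal((m, d))
          optimum = targets.mean(axis=0)                # minimizer of the summed objectives

          W = np.zeros((m, m))                          # doubly stochastic ring mixing matrix
          for i in range(m):
              W[i, i] = 0.5
              W[i, (i - 1) % m] = 0.25
              W[i, (i + 1) % m] = 0.25

          X = np.zeros((m, d))                          # local iterates, one row per agent
          for _ in range(T):
              received = X + noise * rng.standard_normal(X.shape)   # imperfect channel
              grads = X - targets                                   # local gradients
              X = W @ received - alpha * grads

          print(np.linalg.norm(X - optimum, axis=1))    # per-agent distance to the optimum
          ```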
    • 00793 (3/3) : 3D @E818
      • [04039] Spherical Framelets from Spherical Designs
        • Author(s) :
          • Yuchen XIAO (City University of Hong Kong)
        • Abstract : In this paper, we investigate in detail the structures of the variational characterization $A_{N,t}$ of the spherical $t$-design, its gradient $\nabla A_{N,t}$, and its Hessian $H(A_{N,t})$ in terms of fast spherical harmonic transforms. Moreover, we propose solving the minimization problem of $A_{N,t}$ using the trust-region method to provide spherical $t$-designs with large values of $t$. Based on the obtained spherical $t$-designs, we develop (semi-discrete) spherical tight framelets as well as their truncated systems and their fast spherical framelet transforms for practical spherical signal/image processing. Thanks to the large spherical $t$-designs and the localization property of our spherical framelets, we are able to provide signal/image denoising using local thresholding techniques based on a fine-tuned spherical cap restriction. Many numerical experiments are conducted to demonstrate the efficiency and effectiveness of our spherical framelets, including Wendland function approximation, ETOPO data processing, and spherical image denoising.
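        • Code sketch : A small-scale sketch that evaluates the standard variational quantity $A_{N,t}(X_N) = \frac{1}{N^2}\sum_{l=1}^{t}(2l+1)\sum_{j,k} P_l(\langle x_j, x_k\rangle)$, which vanishes exactly at spherical $t$-designs, and minimizes it with a generic SciPy optimizer. The trust-region method with fast spherical harmonic transforms used in the talk is not reproduced, and the values of $N$ and $t$ are illustrative.
          ```python
          # Direct minimization of the variational characterization of spherical
          # t-designs on a tiny instance (N = 8, t = 3, where the cube vertices attain
          # the minimum); a generic optimizer replaces the talk's trust-region method.
          import numpy as np
          from scipy.optimize import minimize
          from scipy.special import eval_legendre

          t, N = 3, 8
          rng = np.random.default_rng(8)

          def A_Nt(flat):
              X = flat.reshape(N, 3)
              X = X / np.linalg.norm(X, axis=1, keepdims=True)   # project points onto the sphere
              G = np.clip(X @ X.T, -1.0, 1.0)                    # pairwise inner products
              return sum((2 * l + 1) * eval_legendre(l, G).sum()
                         for l in range(1, t + 1)) / N ** 2

          res = minimize(A_Nt, rng.standard_normal(3 * N), method='L-BFGS-B')
          print(res.fun)        # nonnegative; zero exactly when the points form a t-design
          ```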
      • [04179] Introducing Deep Unfolding: Incorporating Prior Knowledge into Deep Learning Models
        • Author(s) :
          • Yumeng REN (City University of Hong Kong)
        • Abstract : Deep unfolding (DU) methods accelerate iterative algorithms for inverse problems by incorporating prior knowledge into deep learning models; the unrolled networks can be built on backbone iterations such as the ADMM algorithm for reconstruction tasks or a PDE discretization scheme for learning PDEs from data. In this talk, I will introduce the basics of DU methods, ADMM-type backbone iterations, and recent advances.
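        • Code sketch : A forward pass of an unrolled (deep-unfolded) ISTA network for sparse recovery, where each layer is one iteration with its own step size and threshold. Here the per-layer parameters are fixed to classical ISTA values rather than learned by backpropagation, and the ADMM- and PDE-based backbones mentioned in the talk are not implemented.
          ```python
          # Forward pass of an unrolled ISTA network: one soft-thresholded gradient
          # step per "layer".  In deep unfolding the per-layer parameters would be
          # trained; here they are fixed to classical values for illustration.
          import numpy as np

          rng = np.random.default_rng(9)
          m, n, k, layers = 40, 100, 5, 16
          A = rng.standard_normal((m, n)) / np.sqrt(m)
          x_true = np.zeros(n)
          x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
          y = A @ x_true

          def soft(z, thr):
              return np.sign(z) * np.maximum(np.abs(z) - thr, 0.0)

          L = np.linalg.norm(A, 2) ** 2                 # Lipschitz constant of the data term
          steps = np.full(layers, 1.0 / L)              # would be learnable parameters
          thresholds = np.full(layers, 0.01 / L)        # would be learnable parameters

          x = np.zeros(n)
          for step, thr in zip(steps, thresholds):      # one unrolled iteration per layer
              x = soft(x - step * (A.T @ (A @ x - y)), thr)

          print(np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
          ```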
      • [05002] Koopman analysis and Dynamic mode decomposition of Elementary Cellular Automata
        • Author(s) :
          • Keisuke Taga (Waseda University)
          • Yuzuru Kato (Future University Hakodate)
          • Yoshihiro Yamazaki (Waseda University)
          • Hiroya Nakao (Tokyo Institute of Technology)
        • Abstract : We perform Koopman spectral analysis and dynamic mode decomposition (DMD) for elementary cellular automata (ECA). The Koopman operator is a linear operator describing the time evolution of observables of a dynamical system, and DMD is a data-driven approach to Koopman analysis. ECA are among the simplest finite-state systems that produce spatiotemporal patterns and are thus a good testbed for assessing the performance of DMD. We report the reproducibility of the dynamics and spectral properties of ECA by different DMD algorithms and explain the linear-algebraic background.
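        • Code sketch : A minimal DMD computation on an elementary cellular automaton: binary states are collected into snapshot matrices, a linear operator $A$ with $X' \approx A X$ is fitted by least squares, and its eigenvalues are inspected. The rule number, lattice size, and plain-DMD variant are illustrative assumptions.
          ```python
          # Plain DMD of an elementary cellular automaton (rule 90, single seed):
          # fit X' ~ A X by least squares and inspect the spectrum of A.
          import numpy as np

          rule, width, steps = 90, 64, 200
          table = np.array([(rule >> i) & 1 for i in range(8)], dtype=np.uint8)

          state = np.zeros(width, dtype=np.uint8)
          state[width // 2] = 1                         # single-seed initial condition
          snapshots = [state.copy()]
          for _ in range(steps):
              left, right = np.roll(state, 1), np.roll(state, -1)
              state = table[4 * left + 2 * state + right]   # apply the local rule
              snapshots.append(state.copy())

          S = np.array(snapshots, dtype=float).T        # columns are time snapshots
          X, Y = S[:, :-1], S[:, 1:]
          A = Y @ np.linalg.pinv(X)                     # DMD / Koopman approximation
          eigvals = np.linalg.eigvals(A)
          print(np.sort(np.abs(eigvals))[-5:])          # leading DMD eigenvalue magnitudes
          ```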