Abstract : Mathematical neuroscience exploits tools from applied mathematics, e.g., modeling, analysis, and scientific computing, to understand the structure, dynamics, and function of the brain. Many neuroscience phenomena are intriguing but extremely complicated, featuring high dimensionality, nonlinearity, multiscale structure, and complex dynamics. Developing effective theoretical and computational methods is therefore increasingly important, both for understanding the mechanisms underlying these phenomena and for advancing experimental neuroscience. This mini-symposium focuses on novel ideas and advanced approaches in mathematical neuroscience, with an emphasis on prominent neuroscience phenomena including hierarchical structure, oscillatory and attractor dynamics, and the functions of learning and memory.
[04042] Learning optimal models of statistical events in spontaneous neural activity
Format : Talk at Waseda University
Author(s) :
Toshitake Asabuki (Imperial College London)
Tomoki Fukai (Okinawa Institute of Science and Technology)
Abstract : The brain is thought to learn an internal model of the statistical environment for improved cognitive performance. Evidence suggests that spontaneous cortical activity represents such a model, or prior distribution, by replaying stimulus-evoked activity patterns with the probabilities that these stimuli were experienced. Here, we present a principle to robustly learn replay activity patterns in spiking recurrent neural networks and demonstrate how such spontaneous replay biases animals' perceptual decision making.
[03841] The hierarchical organization of the Drosophila connectome
Format : Talk at Waseda University
Author(s) :
Kresimir Josic (University of Houston)
Alexander B. Kunin (Creighton University)
Jiahao Guo (University of Houston)
Kevin E. Bassler (University of Houston)
Xaq Pitkow (Rice University)
Abstract : The Hemibrain is the largest published connectome to date: a dense reconstruction of over twenty thousand neurons and ten million synapses spanning the central brain of the fruit fly Drosophila. I will describe a novel approach to uncovering the hierarchical community structure of this connectome. This approach allows us to recover previously known features of the fly brain's organization and to reveal novel ones. Given the size and complexity of forthcoming connectomics data, methods such as these will be essential for its interpretation.
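The abstract does not specify the authors' hierarchical method, but the first step of many community-detection pipelines can be illustrated with classic spectral bisection via the Fiedler vector of the graph Laplacian. The toy graph below (two triangles joined by one bridge edge) and all parameters are hypothetical, not drawn from the Hemibrain.

```python
# Illustrative sketch (not the authors' method): spectral bisection of a
# toy graph via the Fiedler vector of its Laplacian, a classic building
# block of hierarchical community detection.
import numpy as np

# Adjacency of two triangles joined by one bridge edge (nodes 0-5),
# standing in for two densely connected modules in a connectome.
A = np.zeros((6, 6))
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
for i, j in edges:
    A[i, j] = A[j, i] = 1.0

L = np.diag(A.sum(axis=1)) - A     # graph Laplacian
_, vecs = np.linalg.eigh(L)        # eigenvectors, ascending eigenvalues
fiedler = vecs[:, 1]               # eigenvector of 2nd-smallest eigenvalue
partition = fiedler > 0            # sign pattern splits the graph in two
print(partition)                   # one triangle vs. the other
```

Applying the same bisection recursively within each part is one standard way to obtain a community hierarchy rather than a single cut.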
[01437] The mechanism of abnormal beta oscillations generated in the striatum
Format : Talk at Waseda University
Author(s) :
Douglas Zhou (Shanghai Jiao Tong University)
Abstract : Combining simulations of a neural network model with analysis of the corresponding reduced neural mass model, we demonstrate how the cellular architecture and network dynamics of the ChAT-iMSN closed loop in the striatum efficiently yield exaggerated beta oscillations. We find that beta oscillations can emerge from inhibitory interactions among iMSNs, and that slow inhibitory dynamics in iMSNs could be their underpinning.
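As a back-of-envelope illustration of how a slow inhibitory variable can place an oscillatory mode in the beta band, consider a generic linear sketch of a fast population coupled to slower inhibition. This is not the speaker's striatal model; the time constants and gain below are invented for illustration.

```python
# Hedged illustration (not the speaker's model): a fast population u
# coupled to a slower inhibitory variable s has a complex eigenvalue
# pair; these invented parameters put its frequency in the beta band.
import numpy as np

tau_u, tau_s, J = 0.01, 0.05, 8.0          # fast (10 ms), slow (50 ms), gain
A = np.array([[-1 / tau_u, -J / tau_u],    # linearized dynamics:
              [ 1 / tau_s, -1 / tau_s]])   # d(u, s)/dt = A (u, s)
lam = np.linalg.eigvals(A)
freq_hz = np.abs(lam.imag).max() / (2 * np.pi)
print(freq_hz)                             # ~19 Hz, within the beta band
```

Sweeping the gain J or the slow time constant tau_s moves the frequency in and out of the 13-30 Hz beta range, which is the kind of parameter dependence such models are used to probe.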
[03947] Maturation of neurons reconciles flexibility and stability of memory: dual structural plasticity in the olfactory system
Format : Talk at Waseda University
Author(s) :
Bennet Sakelaris (Northwestern University)
Hermann Riecke (Northwestern University)
Abstract : It is essential for the brain to flexibly form new memories without overwriting existing ones and jeopardizing their stability. Using a computational model of the olfactory bulb (OB) that captures several experimental observations, we investigate how the OB's characteristic structural plasticity addresses this flexibility-stability tradeoff. We demonstrate that the evolution of the timescales of synaptic plasticity associated with the aging of adult-born cells allows the OB to strike a balance between the competing demands of flexibility and stability.
[01409] Mathematical mechanism underlying hierarchical timescales in the primate neocortex
Format : Talk at Waseda University
Author(s) :
Songting Li (Shanghai Jiao Tong University)
Abstract : In the neocortex, while early sensory areas encode and process external inputs rapidly, higher-association areas are endowed with slow dynamics that benefit information accumulation over time. Such a hierarchy of temporal response windows along the cortical hierarchy naturally emerges in an anatomically based model of primate cortex. This emergent property raises the question of why diverse temporal modes are well segregated rather than mixed up across the cortex, despite high connection density and an abundance of feedback loops. In this talk, we will address this question by mathematically analyzing the primate cortical model and identifying crucial conditions on synaptic excitation and inhibition that give rise to timescale segregation in a hierarchy. In addition, we will discuss the mathematical relation between timescale segregation and signal propagation in the cortex.
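The flavor of such an analysis can be sketched with a toy two-area linear rate model. The connection weights below are illustrative inventions, not the anatomically based primate model from the talk: stronger local recurrence in the higher area stretches its effective time constant, yielding one fast and one slow eigenmode.

```python
# Toy two-area linear rate model (invented numbers, not the talk's model):
# strong local recurrence in area 2 produces a slow eigenmode alongside
# area 1's fast mode, i.e., a hierarchy of timescales.
import numpy as np

tau = 0.01                               # 10 ms intrinsic time constant
W = np.array([[0.2, 0.1],                # weak local recurrence + feedback
              [0.4, 0.9]])               # feedforward + strong local recurrence
A = (W - np.eye(2)) / tau                # linear dynamics dr/dt = A r
eigvals = np.linalg.eigvals(A)
timescales = -1.0 / eigvals.real         # effective time constants (s)
print(np.sort(timescales))               # ~12 ms fast mode, ~210 ms slow mode
```

In such linear models, the question of timescale segregation becomes whether each eigenmode stays localized to one area, which depends on the balance of excitation, inhibition, and inter-areal coupling.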
[04184] Computation with Adaptive Continuous Attractor Neural Networks
Format : Online Talk on Zoom
Author(s) :
SI WU (Peking University)
Abstract : Continuous attractor neural networks (CANNs) are a canonical model for the representation, storage, retrieval, and manipulation of neural information. Adaptation is a general feature of neural systems: a negative feedback process engaged when neuronal activity is high. When the two are combined, the neural network exhibits rich dynamical behaviors. In this talk, I will introduce the rich dynamical properties of adaptive CANNs and their potential roles in brain functions.
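A minimal numerical sketch of a CANN with adaptation is given below, assuming a generic Gaussian ring connectivity, divisive normalization, and a simple linear adaptation variable. All parameters are invented for illustration and are not from the talk.

```python
# Minimal sketch of a 1-D ring CANN with an adaptation variable
# (illustrative parameters only, not from the talk).
import numpy as np

N = 128                                  # neurons on a ring
x = np.linspace(-np.pi, np.pi, N, endpoint=False)
d = np.abs(x[:, None] - x[None, :])
d = np.minimum(d, 2 * np.pi - d)         # distance on the ring
J = np.exp(-d**2 / (2 * 0.5**2))         # Gaussian recurrent kernel
J /= J.sum(axis=1, keepdims=True)

u = np.exp(-x**2)                        # initial bump of activity
v = np.zeros(N)                          # adaptation variable
dt, tau_u, tau_v, beta = 0.1, 1.0, 10.0, 0.3
for _ in range(500):
    r = np.maximum(u, 0.0)**2
    r = r / (1.0 + 0.1 * r.sum())        # divisive normalization
    du = (-u + J @ r - beta * v) / tau_u # recurrent drive minus adaptation
    dv = (-v + u) / tau_v                # slow negative feedback
    u += dt * du
    v += dt * dv
print(x[np.argmax(u)])                   # location of peak activity after the run
```

With parameters in the right regime, the slow negative feedback from v can destabilize a static bump into a traveling one; exploring such regimes is the kind of dynamics the talk concerns.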
Abstract : First-principles-based models have been extremely successful in providing crucial insights and predictions for complex biological functions and phenomena. However, they can be hard to build and expensive to simulate for complex living systems. On the other hand, modern data-driven methods thrive at modeling many types of high-dimensional and noisy data, yet their training and interpretation remain challenging. Here, we combine the two types of methods to model stochastic neuronal network oscillations. Specifically, we develop a class of first-principles-based artificial neural networks to provide faithful surrogates to the high-dimensional, nonlinear oscillatory dynamics produced by neural circuits in the brain. Furthermore, when the training data set is enlarged within a range of parameter choices, the artificial neural networks become generalizable to these parameters, covering cases in distinctly different dynamical regimes. Altogether, our work opens a new avenue for modeling complex neuronal network dynamics with artificial neural networks.
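The surrogate-modeling idea can be caricatured with a far simpler example than the talk's networks: fit a one-step-ahead linear surrogate to trajectories generated by a "first-principles" damped oscillator, then roll the surrogate out on its own. Everything below is a hypothetical toy, not the authors' method.

```python
# Toy surrogate-modeling sketch (much simpler than the talk's networks):
# learn the one-step map of a 2-D damped oscillator from simulated data,
# then roll the learned map out and compare with the true trajectory.
import numpy as np

dt = 0.05
def f(z):                                   # "first-principles" model:
    x, y = z                                # weakly damped oscillator
    return np.array([y, -x - 0.1 * y])

# Generate a training trajectory with forward Euler.
Z = [np.array([1.0, 0.0])]
for _ in range(400):
    Z.append(Z[-1] + dt * f(Z[-1]))
Z = np.array(Z)

# Least-squares fit of the one-step map z_{t+1} ~ z_t @ M (linear surrogate).
M, *_ = np.linalg.lstsq(Z[:-1], Z[1:], rcond=None)

# Roll the surrogate out from the initial condition.
z = Z[0]
for _ in range(400):
    z = z @ M
print(np.abs(z - Z[-1]).max())              # rollout error vs. the true model
```

Because this toy system is exactly linear, the fit is essentially perfect; the interest in the talk's setting is that nonlinear, stochastic oscillations require far richer surrogate architectures and careful choices of training data.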
[01458] Reconstruction of Evolving Percepts in Binocular Rivalry Using Novel Model Network Dynamics
Format : Online Talk on Zoom
Author(s) :
Victor Barranca (Swarthmore College)
Abstract : When the two eyes are presented with distinct stimuli, our percept irregularly switches between the monocular images, giving rise to binocular rivalry. We investigate mechanisms for rivalry through stimulus reconstructions based on the activity of a two-layer neuronal network model with competing downstream pools driven by disparate monocular images. To estimate the dynamic percept, we derive an embedded input-output mapping and iteratively apply compressive sensing techniques, generating percept reconstructions that agree with key experimental observations.
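The compressive-sensing step can be illustrated in isolation: recover a sparse vector from undersampled linear measurements by iterative soft thresholding (ISTA). The measurement matrix and sparse "percept" below are generic inventions, not the talk's network-derived input-output mapping.

```python
# Hedged illustration of compressive sensing (a generic toy, not the
# talk's embedded mapping): recover a sparse signal from undersampled
# measurements via iterative soft thresholding (ISTA).
import numpy as np

rng = np.random.default_rng(0)
n, m = 50, 25                                 # 50-D signal, 25 measurements
x_true = np.zeros(n)
x_true[[5, 20, 33]] = [1.5, -2.0, 1.0]        # sparse "percept" to recover
A = rng.standard_normal((m, n)) / np.sqrt(m)  # random measurement matrix
y = A @ x_true                                # undersampled observations

step = 1.0 / np.linalg.norm(A, 2)**2          # 1 / Lipschitz constant of gradient
lam = 0.01                                    # sparsity penalty
x = np.zeros(n)
for _ in range(2000):
    g = x - step * A.T @ (A @ x - y)          # gradient step on 0.5*||Ax - y||^2
    x = np.sign(g) * np.maximum(np.abs(g) - step * lam, 0.0)  # soft threshold

support = np.flatnonzero(np.abs(x) > 0.1)
print(support)                                # indices of the recovered nonzeros
```

In the rivalry setting described above, the analogue of y would be downstream network activity and the analogue of A the derived input-output mapping; iterating such reconstructions over time yields the evolving percept estimate.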