Pathways to the 2023 IHP thematic program Random Processes in the Brain/Seminars
| style="text-align:center; padding:30px 0 30px 0"|[[File:Logo - Neuromat - Horizontal - EN v2.svg|link=https://neuromat.numec.prp.usp.br|200px]]
|-
| style="width:65%; padding:-10px" |{{Pathways to the 2023 IHP thematic program Random Processes in the Brain/Section|Priscilla Greenwood|2}}
;Building a stochastic neural circuit of cortical-pulvinar interaction
[[File:Building a stochastic neural circuit of cortical-pulvinar interaction.jpg|200px|right]]
[[File:Building a stochastic neural circuit of cortical-pulvinar interaction.webm|thumb|Seminar video recording.]]
[[File:An interview with - Priscilla Greenwood.webm|thumb|Warmup interview.]]
* Speaker: Priscilla Greenwood, University of British Columbia
* Date: Tuesday, December 6, 2022
* Abstract: Phase coherence between oscillating areas of cortex is associated with attention and information transmission. A growing literature is devoted to how this might work. We use a rate model of noise-driven quasi-cycle oscillators to build a neural circuit representing cortical-pulvinar interactions. Modelling in terms of rates or densities rather than firing allows computation of a result about optimal phase relationships. This is joint work with Lawrence Ward of the UBC Psychology Department.
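For readers unfamiliar with quasi-cycles, the mechanism the abstract alludes to can be sketched in a few lines: a linear system whose deterministic part is a damped spiral (so oscillations die out on their own) sustains noisy, irregular oscillations when driven by white noise. This is an illustrative sketch only, with invented parameters, not the speakers' model:

```python
import numpy as np

# Minimal quasi-cycle sketch (illustrative, not the speakers' code).
# The deterministic dynamics are a damped rotation (spiral sink); white
# noise repeatedly re-excites the decaying oscillation, producing a
# sustained "quasi-cycle" with a well-defined phase.

rng = np.random.default_rng(0)

lam, omega = -0.1, 2 * np.pi * 10.0   # decay rate and angular frequency (10 Hz)
A = np.array([[lam, -omega],
              [omega, lam]])          # linearized two-population rate dynamics
sigma = 0.5                           # noise amplitude
dt, T = 1e-3, 5.0                     # time step and duration (seconds)

n = int(T / dt)
x = np.zeros((n, 2))                  # state: two rate deviations from baseline
for k in range(n - 1):
    drift = A @ x[k]
    noise = sigma * np.sqrt(dt) * rng.standard_normal(2)
    x[k + 1] = x[k] + drift * dt + noise   # Euler-Maruyama step

# Without noise x decays to zero; with noise its envelope fluctuates around
# a nonzero level, and an oscillation phase can be read off directly:
phase = np.arctan2(x[:, 1], x[:, 0])
```

Phase relationships between two such coupled oscillators are what a circuit-level model of cortical-pulvinar interaction would then analyze.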
{{Pathways to the 2023 IHP thematic program Random Processes in the Brain/Section|Gilles Laurent|2}}
;Exploring the space of neural systems dynamics
[[File:Exploring the space of neural systems dynamics.jpg|200px|right]]
[[File:Exploring the space of neural systems dynamics.webm|thumb|Seminar video recording.]]
[[File:An interview with - Gilles Laurent.webm|thumb|Warmup interview.]]
* Speaker: Gilles Laurent, Max Planck Institute for Brain Research
* Date: Tuesday, November 22, 2022
* Abstract: I will illustrate the diversity of dynamical properties that neural systems can express, in a search for mechanistic and functional understanding of “the brain”, using examples from a variety of systems and animal species.
{{Pathways to the 2023 IHP thematic program Random Processes in the Brain/Section|Wojciech Szpankowski|2}}
;Structural and Temporal Information
[[File:Structural and Temporal Information.jpg|200px|right]]
[[File:Structural and Temporal Information.webm|thumb|Seminar video recording.]]
[[File:An interview with - Wojciech Szpankowski.webm|thumb|Warmup interview.]]
* Speaker: Wojciech Szpankowski, Purdue University
* Date: Tuesday, November 8, 2022
* Abstract: Shannon's information theory has served as a bedrock for advances in communication and storage systems over the past five decades. However, this theory does not handle well higher order structures (e.g., graphs, geometric structures), temporal aspects (e.g., real-time considerations), or semantics. We argue that these are essential aspects of data and information that underlie a broad class of current and emerging data science applications. In this talk, we present some recent results on structural and temporal information. We first show how to extract temporal information in dynamic networks (arrival of nodes) from their structure (unlabeled graphs). We then proceed to establish fundamental limits on information content for some data structures, and present asymptotically optimal lossless compression algorithms achieving these limits for various graph models.
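A rough sense of the gap between labeled and structural (unlabeled) information can be illustrated with an Erdős–Rényi random graph. This is a hedged, back-of-the-envelope sketch with arbitrary choices of n and p: the labeled graph has entropy C(n,2)·h(p) bits, and because almost all such graphs have no nontrivial symmetries, describing the structure alone saves roughly log₂(n!) bits, a gap that structural compression can exploit:

```python
import math

# Back-of-the-envelope illustration (hedged; n and p are arbitrary choices).
# For an Erdos-Renyi graph G(n, p), each of the C(n,2) possible edges is an
# independent Bernoulli(p) bit, so the labeled entropy is C(n,2)*h(p) bits.
# An unlabeled graph (structure only) needs roughly log2(n!) fewer bits,
# since almost all such graphs are asymmetric.

def h(p):
    """Binary entropy in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

n, p = 100, 0.5
labeled_bits = math.comb(n, 2) * h(p)                          # entropy of labeled G(n, p)
structural_bits = labeled_bits - math.log2(math.factorial(n))  # approximate structural entropy
```

For n = 100 and p = 0.5 the labeled entropy is exactly 4950 bits, while the symmetry discount log₂(100!) is on the order of 525 bits.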
{{Pathways to the 2023 IHP thematic program Random Processes in the Brain/Section|Olivier Faugeras|2}}
;Mathematical Neuroscience
[[File:Mathematical neuroscience.jpg|200px|right]]
[[File:Mathematical Neuroscience.webm|thumb|Seminar video recording.]]
[[File:An interview with - Olivier Faugeras.webm|thumb|Warmup interview.]]
* Speaker: Olivier Faugeras, Inria Sophia Antipolis
* Date: Tuesday, October 25, 2022
* Abstract: Why is it important to ground neuroscience in mathematics? What kind of mathematics are relevant in this scientific area where biology, perception, action and cognition are closely intermingled? What kind of relationships should be entertained with experimentalists and computationalists? In this lecture I will try to answer these questions through examples drawn from the analysis of the activity of large populations of neurons by mathematical methods from probability, statistics, and geometry.
{{Pathways to the 2023 IHP thematic program Random Processes in the Brain/Section|Tilo Schwalger|2}}
;Current challenges for mesoscopic neural population dynamics and metastability
[[File:Current challenges for mesoscopic neural population dynamics and metastability.jpg|200px|right]]
[[File:Current challenges for mesoscopic neural population dynamics and metastability.webm|thumb|Seminar video recording.]]
* Speaker: Tilo Schwalger, Institut für Mathematik, Technische Universität Berlin
* Date: Tuesday, October 11, 2022
* Abstract: Mesoscopic neuronal population dynamics deals with emergent neural activity and computations at a coarse-grained spatial scale at which fluctuations due to a finite number of neurons should not be neglected. A prime example is metastable dynamics in cortical and hippocampal circuits, in which fluctuations likely play a critical role. In this lecture, I will discuss recent advances and current challenges for mean-field descriptions of computations and metastable dynamics at the mesoscopic scale. Firstly, I will discuss fundamental differences between external noise and intrinsic "finite-size noise" in population models, and their distinct impact on metastable dynamics. Is it possible to infer the type of metastability and noise from mesoscopic population data? Secondly, I will address the question of how to treat single-neuron dynamics (e.g. refractory mechanisms, adaptation) and synaptic dynamics (e.g. short-term depression) at the level of mesoscopic populations. Is it possible to derive (low-dimensional) bottom-up mesoscopic models that link back to the microscopic properties of spiking neural networks? And thirdly, I will address the fundamental problem of heterogeneity in biological neural networks. An important source of heterogeneity is non-homogeneous network structure. The synaptic connectivity of any neural network that performs computations is structured, e.g. as a result of learning. How can mesoscopic mean-field theories, which so far assumed homogeneous (unstructured) connectivity, be generalized to heterogeneous, structured connectivity?
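The distinction between external noise and intrinsic "finite-size noise" raised in the abstract can be made concrete with a toy calculation. This is a hedged illustration of the scaling argument only, not the speaker's model: if N neurons each fire independently with probability rate·dt in a time bin, the binned population activity is binomial, so its fluctuations shrink like 1/√N, whereas external noise would enter as an added term whose size is independent of N.

```python
import numpy as np

# Hedged illustration (not the speaker's model): intrinsic "finite-size
# noise" in a mesoscopic population. The per-bin spike count of N
# independent neurons is binomial, so the population activity fluctuates
# with standard deviation proportional to 1/sqrt(N). External noise, by
# contrast, would be an added term whose magnitude does not depend on N.

rng = np.random.default_rng(1)

def population_activity(N, steps=20000, dt=1e-3, rate=5.0):
    """Population activity in spikes per neuron per second, one value per bin."""
    p = rate * dt                              # per-neuron spike probability per bin
    spikes = rng.binomial(N, p, size=steps)    # total spikes across the population
    return spikes / (N * dt)                   # normalize to an activity in Hz

sd_small = population_activity(100).std()      # strong finite-size fluctuations
sd_large = population_activity(10000).std()    # roughly 10x weaker (sqrt(100) ratio)
```

Both traces have the same mean rate (5 Hz here), so only the fluctuation statistics distinguish the population sizes, which is exactly why inferring noise type from mesoscopic data is delicate.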
+ | |||
+ | {{Pathways to the 2023 IHP thematic program Random Processes in the Brain/Section|Thibaud Taillefumier|2}} | ||
+ | ;Replica-mean field limits of metastable dynamics | ||
+ | [[File:Replica-mean field limits of metastable dynamics.jpg|200px|right]] | ||
+ | [[File:Replica-mean field limits of metastable dynamics.webm|thumb|Seminar video recording.]] | ||
+ | * Speaker: Thibaud Taillefumier, University of Texas at Austin | ||
+ | * Date: Tuesday, September 27, 2022 | ||
+ | * Abstract: In this talk, we propose to decipher the activity of neural networks via a “multiply and conquer” approach. This approach considers limit networks made of infinitely many replicas with the same basic neural structure. The key point is that these so-called replica-mean-field networks are in fact simplified, tractable versions of neural networks that retain important features of the finite network structure of interest. The finite size of neuronal populations and synaptic interactions is a core determinant of neural dynamics, being responsible for non-zero correlation in the spiking activity, but also for finite transition rates between metastable neural states. Theoretically, we develop our replica framework by expanding on ideas from the theory of communication networks rather than from statistical physics to establish Poissonian mean-field limits for spiking networks. Computationally, we leverage this replica approach to characterize the stationary spiking activity emerging in the replica mean-field limit via reduction to tractable functional equations. We conclude by discussing perspectives about how to predict transition rates in metastable networks from the characterization of their replica mean-field limit. | ||
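The replica construction can be sketched in simulation. This is an illustrative toy version under strong simplifying assumptions (discrete time, a small linear intensity-based network, invented parameters), not the speaker's code: M copies of the same K-neuron network evolve in parallel, and each spike's synaptic output is routed to the target neurons in a uniformly chosen *other* replica, which is what decorrelates inputs and yields Poissonian behavior as M grows.

```python
import numpy as np

# Toy replica construction (illustrative only, not the speaker's code).
# M identical copies of a K-neuron intensity-based network run in discrete
# time; every spike's output is delivered to a uniformly random OTHER
# replica, so inputs to any fixed replica decorrelate as M grows.

rng = np.random.default_rng(2)

def simulate(M, K=3, steps=5000, dt=1e-3, mu=10.0, w=0.3, tau=0.01):
    """Return spike counts with shape (steps, M, K)."""
    h = np.zeros((M, K))                     # synaptic input, per replica and neuron
    counts = np.zeros((steps, M, K), dtype=int)
    W = w * (np.ones((K, K)) - np.eye(K))    # all-to-all excitation, no self-loops
    for t in range(steps):
        lam = mu + h                         # firing intensity (nonnegative here)
        spikes = rng.poisson(lam * dt)       # spikes emitted in this time bin
        counts[t] = spikes
        h -= h * dt / tau                    # exponential synaptic decay
        for m in range(M):
            for i in range(K):
                for _ in range(spikes[m, i]):
                    # route the spike to a uniformly random replica other than m
                    target = (m + 1 + rng.integers(M - 1)) % M
                    h[target] += W[i]        # increment inputs of all neurons j != i
    return counts

c = simulate(M=20)
```

In a single (M = 1) network the same spikes would feed back into their own replica, creating the correlations that the replica limit deliberately washes out.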
+ | |||
+ | <u>Related publications:</u> | ||
+ | * Yu, Luyan and Taillefumier, Thibaud. (2022). Metastable spiking networks in the replica-mean-field limit. PLoS Computational Biology. 18, no. 6 (2022): e1010215. | ||
+ | * Baccelli, François, Michel Davydov, and Taillefumier Thibaud (2022). Replica-mean-field limits of fragmentation-interaction-aggregation processes. Journal of Applied Probability 59, no. 1 (2022): 38-59 | ||
+ | * Baccelli, François, and Taillefumier, Thibaud. The pair-replica-mean-field limit for intensity-based neural networks. SIAM Journal on Applied Dynamical Systems 20, no. 1 (2021): 165-207. | ||
+ | * Baccelli, François, and Taillefumier, Thibaud. Replica-mean-field limits for intensity-based neural networks. SIAM Journal on Applied Dynamical Systems 18, no. 4 (2019): 1756-1797. | ||
{{Pathways to the 2023 IHP thematic program Random Processes in the Brain/Section|Markus Diesmann|2}}
Latest revision as of 19:42, 19 December 2022