Analyzing the Dynamics of Biological and Artificial Neural Networks with Applications to Machine Learning

Date

2024

Abstract

The study of the brain has profoundly shaped the evolution of computational learning models and the history of neural networks. This journey began in the 1940s with Warren McCulloch and Walter Pitts’ groundbreaking work on the first mathematical model of a neuron, laying the foundation for artificial neural networks. The 1950s and 1960s saw a significant milestone in Frank Rosenblatt’s development of the perceptron, which showcased the potential of neural networks for complex computational tasks. Since then, the field has grown explosively, and terms like “Artificial Intelligence” and “Machine Learning” have become commonplace across diverse fields, including finance, medicine, and science.

This dissertation explores the symbiotic parallels between neuroscience and machine learning, focusing on the dynamics of biological and artificial neural networks. We begin with artificial neural networks, in particular the problem of predicting the dynamics of large, complex networks, a regime in which traditional machine learning algorithms often struggle. To address this, we propose a novel approach: a parallel architecture that mirrors the network’s structure, achieving scalable and accurate predictions.
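To make the idea concrete (this is an illustrative sketch, not the dissertation’s actual implementation), the following Python fragment assigns one small echo-state reservoir to each group of nodes in a toy network and trains it to predict that group’s next state from the group and its immediate neighbors. All function names, sizes, and parameters here are hypothetical.

import numpy as np

rng = np.random.default_rng(0)

def make_reservoir(n_in, n_res=200, rho=0.9, seed=0):
    # Random echo-state reservoir rescaled to spectral radius rho (assumed setup).
    r = np.random.default_rng(seed)
    A = r.uniform(-1.0, 1.0, (n_res, n_res))
    A *= rho / np.max(np.abs(np.linalg.eigvals(A)))
    W_in = r.uniform(-0.5, 0.5, (n_res, n_in))
    return A, W_in

def run_reservoir(A, W_in, inputs):
    # Drive the reservoir with an input time series; collect its state history.
    state = np.zeros(A.shape[0])
    states = []
    for u in inputs:
        state = np.tanh(A @ state + W_in @ u)
        states.append(state.copy())
    return np.array(states)

# Toy "large network": each group of nodes is predicted from itself plus its
# neighboring groups, so the reservoirs can be trained independently.
n_groups, group_size, T = 4, 3, 500
data = rng.standard_normal((T, n_groups * group_size))  # stand-in trajectory

readouts = []
for g in range(n_groups):
    lo = max(0, (g - 1) * group_size)
    hi = min(n_groups * group_size, (g + 2) * group_size)
    local_input = data[:-1, lo:hi]                          # group + neighbors
    target = data[1:, g * group_size:(g + 1) * group_size]  # own next state
    A, W_in = make_reservoir(local_input.shape[1], seed=g)
    states = run_reservoir(A, W_in, local_input)
    W_out, *_ = np.linalg.lstsq(states, target, rcond=None)  # linear readout
    readouts.append(W_out)

Because each reservoir sees only a local neighborhood of the network, the reservoirs can be trained and run independently, which is what lets the scheme scale to networks far larger than a single reservoir could handle.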

Shifting our focus to biological neuronal networks, we turn to the theory of critical systems. This theory posits that the brain, viewed as a complex dynamical system, operates near a critical point, a state ideal for efficient information processing. A key experimental signature of this criticality is neuronal avalanches, scale-free cascades of neuronal activity, which have been documented both in vitro (in neuronal cultures and acute brain slices) and in vivo (in the brains of awake animals). Recent advances in experimental techniques, such as multi-photon imaging and genetically encoded fluorescent markers, allow activity to be measured in living organisms with unparalleled single-cell resolution. Nevertheless, when only a fraction of neurons can be recorded with sufficient resolution, estimates of the power-law exponents for avalanche size, duration, and the size-duration scaling relation become inaccurate. By analyzing simulated critical neuronal networks alongside real two-photon imaging data, we demonstrate that temporal coarse-graining can recover the critical value of the mean size versus duration scaling of neuronal avalanches, enabling more accurate estimates of critical brain dynamics even from subsampled data.
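As a rough illustration of this kind of analysis (an assumed pipeline with hypothetical bin widths and toy data, not the dissertation’s exact procedure), the sketch below bins a spike train at progressively coarser temporal resolutions, extracts avalanches as maximal runs of nonempty bins, and fits the exponent gamma in <S>(D) ~ D^gamma by least squares in log-log space.

import numpy as np

def extract_avalanches(counts):
    # An avalanche is a maximal run of consecutive nonempty time bins;
    # its size is the total activity and its duration the number of bins.
    sizes, durations, s, d = [], [], 0, 0
    for c in counts:
        if c > 0:
            s += c
            d += 1
        elif d > 0:
            sizes.append(s)
            durations.append(d)
            s = d = 0
    if d > 0:
        sizes.append(s)
        durations.append(d)
    return np.array(sizes), np.array(durations)

def size_duration_exponent(spike_times, t_max, bin_width):
    # Bin spikes at the chosen resolution and fit gamma in <S>(D) ~ D**gamma
    # by least squares on log(mean size) versus log(duration).
    edges = np.arange(0.0, t_max + bin_width, bin_width)
    counts, _ = np.histogram(spike_times, bins=edges)
    sizes, durations = extract_avalanches(counts)
    unique_d = np.unique(durations)
    if len(unique_d) < 2:
        return np.nan  # not enough distinct durations to fit a slope
    mean_s = np.array([sizes[durations == d].mean() for d in unique_d])
    slope, _ = np.polyfit(np.log(unique_d), np.log(mean_s), 1)
    return slope

# Sweeping the bin width is the temporal coarse-graining step.
spikes = np.sort(np.random.default_rng(1).uniform(0.0, 1000.0, 2000))  # toy data
for dt in (0.25, 0.5, 1.0, 2.0):
    print(f"bin width {dt}: gamma ~ {size_duration_exponent(spikes, 1000.0, dt):.2f}")

For subsampled recordings, the fitted slope would typically be expected to approach the true scaling exponent only once the bins are wide enough, which is the effect the coarse-graining analysis exploits.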

Finally, we bridge the gap between machine learning and neuroscience by exploring the concept of excitatory-inhibitory balance, a crucial feature of neuronal networks in the brain, within the framework of reservoir computing. We emphasize the stabilizing role of inhibition in reservoir computers (RCs), mirroring its function in the brain. We propose a novel inhibitory adaptation mechanism that allows RCs to autonomously adjust inhibitory connections to achieve a specific firing rate target, motivated by the firing rate homeostasis observed in biological neurons.
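One plausible minimal form of such a mechanism (simplified here to a single global inhibitory gain rather than per-connection adaptation, with hypothetical parameters) is sketched below: the gain is nudged upward whenever the network’s mean rate exceeds the homeostatic target and downward when it falls short.

import numpy as np

rng = np.random.default_rng(2)
n, n_inh = 300, 60                       # 20% of units are inhibitory
target_rate, eta = 0.1, 0.05             # homeostatic target and adaptation rate

W_exc = np.abs(rng.normal(0.0, 0.1, (n, n - n_inh)))  # excitatory columns (>= 0)
W_inh = -np.abs(rng.normal(0.0, 0.1, (n, n_inh)))     # inhibitory columns (<= 0)

x = rng.uniform(0.0, 1.0, n)             # firing rates of the reservoir units
g_inh = 1.0                              # global inhibitory gain (adapted below)
for step in range(2000):
    W = np.hstack([W_exc, g_inh * W_inh])
    drive = W @ x + rng.normal(0.0, 0.05, n)           # recurrent input + noise
    x = 1.0 / (1.0 + np.exp(-10.0 * (drive - 0.5)))    # sigmoidal rate units
    # Homeostatic feedback: too much activity -> strengthen inhibition.
    g_inh = max(0.0, g_inh + eta * (x.mean() - target_rate))

print(f"mean rate {x.mean():.3f} (target {target_rate}), inhibitory gain {g_inh:.2f}")

The negative feedback is the point: excess activity strengthens inhibition, which pulls the network’s rate back toward the target, loosely mirroring the firing rate homeostasis observed in biological neurons.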

Overall, this dissertation strives to deepen the ongoing collaboration between neuroscience and machine learning, fostering advancements that will benefit both fields.
