Analyzing the Dynamics of Biological and Artificial Neural Networks with Applications to Machine Learning

dc.contributor.advisor: Girvan, Michelle
dc.contributor.author: Srinivasan, Keshav
dc.contributor.department: Biophysics (BIPH)
dc.contributor.publisher: Digital Repository at the University of Maryland
dc.contributor.publisher: University of Maryland (College Park, Md.)
dc.date.accessioned: 2024-09-23T06:13:12Z
dc.date.available: 2024-09-23T06:13:12Z
dc.date.issued: 2024
dc.description.abstract: The study of the brain has profoundly shaped the evolution of computational learning models and the history of neural networks. This journey began in the 1940s with Warren McCulloch and Walter Pitts' groundbreaking work on the first mathematical model of a neuron, laying the foundation for artificial neural networks. The 1950s and 60s saw a further milestone in Frank Rosenblatt's development of the perceptron, which showcased the potential of neural networks for complex computational tasks. Since then, the field of neural networks has grown explosively, and terms like "Artificial Intelligence" and "Machine Learning" have become commonplace across diverse fields, including finance, medicine, and science. This dissertation explores the symbiotic parallels between neuroscience and machine learning, focusing on the dynamics of biological and artificial neural networks. We begin by examining artificial neural networks, particularly for predicting the dynamics of large, complex networks, a regime where traditional machine learning algorithms often struggle. To address this, we propose a novel approach utilizing a parallel architecture that mimics the network's structure, achieving scalable and accurate predictions. Shifting our focus to biological neuronal networks, we turn to the theory of critical systems, which posits that the brain, viewed as a complex dynamical system, operates near a critical point, a state ideal for efficient information processing. A key experimental signature of this criticality is neuronal avalanches (scale-free cascades of neuronal activity), which have been documented both in vitro (in neuronal cultures and acute brain slices) and in vivo (in the brains of awake animals).
Recent advances in experimental techniques, such as multi-photon imaging and genetically encoded fluorescent markers, allow activity to be measured in living organisms with unparalleled single-cell resolution. Despite these advances, significant challenges remain when only a fraction of neurons can be recorded with sufficient resolution, leading to inaccurate estimates of the power-law relationships in the size, duration, and scaling of neuronal avalanches. By analyzing simulated critical neuronal networks alongside real two-photon imaging data, we demonstrate that temporal coarse-graining can recover the critical value of the mean size vs. duration scaling of neuronal avalanches, allowing more accurate estimates of critical brain dynamics even from subsampled data. Finally, we bridge the gap between machine learning and neuroscience by exploring excitatory-inhibitory balance, a crucial feature of neuronal networks in the brain, within the framework of reservoir computing. We emphasize the stabilizing role of inhibition in reservoir computers (RCs), mirroring its function in the brain, and propose a novel inhibitory adaptation mechanism that allows RCs to autonomously adjust their inhibitory connections to reach a target firing rate, motivated by the firing-rate homeostasis observed in biological neurons. Overall, this dissertation strives to deepen the ongoing collaboration between neuroscience and machine learning, fostering advances that will benefit both fields.
dc.identifier: https://doi.org/10.13016/fbaf-9coy
dc.identifier.uri: http://hdl.handle.net/1903/33409
dc.language.iso: en
dc.subject.pqcontrolled: Biophysics
dc.subject.pqcontrolled: Neurosciences
dc.subject.pqcontrolled: Computational physics
dc.subject.pquncontrolled: Criticality
dc.subject.pquncontrolled: Machine Learning
dc.subject.pquncontrolled: Neuronal Avalanches
dc.subject.pquncontrolled: Reservoir Computing
dc.subject.pquncontrolled: Subsampling
dc.title: Analyzing the Dynamics of Biological and Artificial Neural Networks with Applications to Machine Learning
dc.type: Dissertation
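To make the avalanche analysis described in the abstract concrete, the sketch below (illustrative Python, not code from the dissertation; all function names, the toy spike counts, and bin sizes are assumptions) detects avalanches in a binned spike-count series, applies temporal coarse-graining by merging adjacent time bins, and estimates the exponent of the mean size vs. duration scaling relation.

```python
import numpy as np
from collections import defaultdict

def temporal_coarse_grain(counts, k):
    """Temporal coarse-graining: sum consecutive groups of k time bins."""
    n = len(counts) // k * k
    return counts[:n].reshape(-1, k).sum(axis=1)

def avalanches(counts):
    """Split a 1-D array of per-bin event counts into avalanches:
    maximal runs of nonzero bins bounded by empty bins.
    Returns a list of (size, duration) pairs."""
    events, size, dur = [], 0, 0
    for c in counts:
        if c > 0:
            size += c
            dur += 1
        elif dur > 0:
            events.append((size, dur))
            size, dur = 0, 0
    if dur > 0:
        events.append((size, dur))
    return events

def size_duration_exponent(events):
    """Slope of log <S|T> vs. log T, i.e. gamma in <S>(T) ~ T^gamma."""
    by_dur = defaultdict(list)
    for s, t in events:
        by_dur[t].append(s)
    durs = np.array(sorted(by_dur))
    mean_sizes = np.array([np.mean(by_dur[t]) for t in durs])
    slope, _ = np.polyfit(np.log(durs), np.log(mean_sizes), 1)
    return slope

# Toy example: two avalanches of (size, duration) = (3, 2) and (5, 3)
counts = np.array([0, 1, 2, 0, 0, 3, 1, 1, 0])
print(avalanches(counts))                       # [(3, 2), (5, 3)]
print(avalanches(temporal_coarse_grain(counts, 2)))  # coarser bins merge them
```

In the subsampling setting studied in the dissertation, the idea is that increasing the bin width (the `k` above) before extracting avalanches can restore the mean size vs. duration scaling that subsampling distorts; this sketch only shows the mechanics of binning and exponent estimation on toy data.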
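The inhibitory adaptation idea from the final part of the abstract can be illustrated with a minimal rate-model sketch. Everything here is an assumption for illustration (network size, weight scaling, the tanh rate function, and the specific adaptation rule), not the dissertation's model: each unit carries an inhibitory gain that is strengthened when the unit fires above a target rate and weakened when it fires below, analogous to firing-rate homeostasis.

```python
import numpy as np

rng = np.random.default_rng(0)
N, p = 100, 0.1          # units and connection probability (illustrative)
r_target = 0.2           # homeostatic firing-rate target (illustrative)
eta = 0.05               # inhibitory adaptation rate

# Sparse excitatory recurrent weights, scaled so row sums are ~0.5
W_exc = rng.random((N, N)) * (rng.random((N, N)) < p) / (p * N)
w_inh = np.full(N, 0.5)  # per-unit inhibitory gain, adapted online
bias = 0.3               # constant external drive

x = np.full(N, r_target)
for _ in range(3000):
    drive = W_exc @ x + bias - w_inh * x.mean()
    x = np.maximum(np.tanh(drive), 0.0)  # nonnegative firing rates
    # Homeostatic rule: strengthen inhibition onto units firing above
    # target, weaken it for units firing below target.
    w_inh = np.maximum(w_inh + eta * (x - r_target), 0.0)

print(x.mean())  # settles near r_target
```

The design point this sketch tries to convey is the one highlighted in the abstract: inhibition acts as a stabilizer, and letting each unit tune its own inhibitory gain drives the whole network toward the target rate without any global controller.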

Files

Original bundle

Name: Srinivasan_umd_0117E_24597.pdf
Size: 26.16 MB
Format: Adobe Portable Document Format