The dynamics of large systems of many nonlinearly evolving units is a general research area of great importance for many areas in science and technology, including biology, computation by artificial neural networks, statistical mechanics, flocking in animal groups, the dynamics of coupled neurons in the brain, and many others. While universal principles and techniques are largely lacking in this broad area of research, one particular phenomenon seems to be broadly applicable. This is the idea of emergence, by which is meant macroscopic behaviors that “emerge” from a large system of many “smaller or simpler entities such that ... large entities” [i.e., macroscopic behaviors] arise which “exhibit properties the smaller/simpler entities do not exhibit” [1]. In this thesis we investigate mechanisms and manifestations of emergence in four dynamical systems, each consisting of many nonlinearly evolving units. These four systems are as follows.

(a) We first study the motion of a large ensemble of noninteracting particles in a slowly changing Hamiltonian system that undergoes a separatrix crossing. In such systems, we find that separatrix crossing induces a counterintuitive effect. Specifically, numerical simulation of two sets of densely sprinkled initial conditions on two energy curves appears to suggest that the two energy curves, one originally enclosing the other, interchange their positions, which is topologically forbidden. We resolve this paradox by introducing a numerical simulation method we call “robust” and study its consequences.
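The setting above can be illustrated with a minimal sketch: an ensemble of noninteracting pendula with Hamiltonian H = p²/2 − ε(t) cos q, where the well depth ε(t) changes slowly so that the separatrix sweeps across initial energy curves. The function names and the leapfrog scheme here are illustrative assumptions; this is the standard setup, not the thesis's “robust” method.

```python
import numpy as np

def sprinkle_on_energy_curve(E, eps, n):
    """Sample n points (q, p) on the upper branch of the energy contour
    H = p^2/2 - eps*cos(q) = E of a pendulum (illustrative setup only)."""
    q = np.linspace(-0.9 * np.pi, 0.9 * np.pi, n)
    p_sq = 2.0 * (E + eps * np.cos(q))
    mask = p_sq >= 0.0                      # keep only points where the contour exists
    return q[mask], np.sqrt(p_sq[mask])

def evolve(q, p, eps_of_t, t_end, dt=1e-3):
    """Leapfrog (symplectic) integration of many noninteracting pendula
    with a slowly varying well depth eps(t): H = p^2/2 - eps(t)*cos(q)."""
    t = 0.0
    while t < t_end:
        p = p - 0.5 * dt * eps_of_t(t) * np.sin(q)        # half kick
        q = q + dt * p                                    # drift
        p = p - 0.5 * dt * eps_of_t(t + dt) * np.sin(q)   # half kick
        t += dt
    return q, p
```

With a constant ε the leapfrog scheme conserves energy to high accuracy; letting ε(t) grow slowly (e.g. `eps_of_t = lambda t: 1.0 + 0.001 * t`) drives the separatrix crossing of interest.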

(b) We next study the collective dynamics of oscillatory pacemaker neurons in the suprachiasmatic nucleus (SCN), which, through synchrony, govern the circadian rhythm of mammals. We start from a high-dimensional description of the many coupled oscillatory neuronal units within the SCN, based on a forced Kuramoto model. We then reduce the system dimensionality using the Ott-Antonsen ansatz and obtain a low-dimensional macroscopic description. Using this reduced macroscopic system, we explain the east-west asymmetry of jet-lag recovery and discuss the consequences of our findings.
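For concreteness, one standard form of the forced Kuramoto model and its Ott-Antonsen reduction is sketched below; the thesis's specific SCN model may differ in details (e.g., coupled subpopulations), so this is illustrative only. Each oscillator phase obeys

```latex
\dot{\theta}_i = \omega_i + \frac{K}{N}\sum_{j=1}^{N}\sin(\theta_j - \theta_i)
  + F\sin(\sigma t - \theta_i),
```

where $F$ and $\sigma$ are the forcing (light-dark cycle) strength and frequency. For the complex order parameter $z(t) = \frac{1}{N}\sum_j e^{i\theta_j}$ and a Lorentzian frequency distribution with mean $\omega_0$ and width $\Delta$, the Ott-Antonsen ansatz closes the dynamics into a single complex ODE:

```latex
\dot{z} = -(\Delta - i\omega_0)\,z
  + \frac{K}{2}\left(z - \bar{z}\,z^2\right)
  + \frac{F}{2}\left(e^{i\sigma t} - z^2 e^{-i\sigma t}\right).
```

The magnitude $|z|$ measures synchrony of the SCN neurons, and the phase of $z$ relative to the forcing is what shifts during jet-lag recovery.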

(c) Thirdly, we study neuron firing in integrate-and-fire neural networks. We build a discrete-state/discrete-time model with both excitatory and inhibitory neurons and find a phase transition between avalanching dynamics and ceaseless firing dynamics. Power-law firing avalanche size/duration distributions are observed at critical parameter values. Furthermore, in this critical regime we find the same power-law exponents as those observed in experiments and in previous, more restricted, simulation studies. We also employ a mean-field method and show that inhibitory neurons in this system promote robustness of the criticality (i.e., an enhanced range of system parameters over which power-law avalanche statistics apply).
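A toy version of such a network can be sketched as follows. All names, the update rule, and the parameters (`frac_inhib`, the coupling scale `sigma`) are assumptions for illustration, not the thesis's exact model: firing neurons deliver probabilistic input to the others, inhibitory neurons contribute negatively, and an avalanche is the total firing triggered by a single seeded neuron.

```python
import numpy as np

def simulate_avalanches(n=200, frac_inhib=0.2, sigma=1.0, n_avalanches=100, seed=0):
    """Toy discrete-time avalanche model (illustrative sketch, no refractory
    period). Returns a list of avalanche sizes, one per seeded firing."""
    rng = np.random.default_rng(seed)
    sign = np.where(rng.random(n) < frac_inhib, -1.0, 1.0)  # -1 marks inhibitory
    w = sigma * rng.random((n, n)) / n                      # w[i, j]: strength j -> i
    sizes = []
    for _ in range(n_avalanches):
        active = np.zeros(n, dtype=bool)
        active[rng.integers(n)] = True                      # seed one firing neuron
        size = 0
        while active.any() and size < 10_000:               # cap runaway cascades
            size += int(active.sum())
            drive = (w * sign[None, :]) @ active            # net input to each neuron
            active = rng.random(n) < drive                  # fire probabilistically
        sizes.append(size)
    return sizes
```

Sweeping the coupling scale `sigma` moves this toy model from a subcritical regime (small, short avalanches) toward ceaseless activity; near the transition the avalanche-size histogram broadens toward a power law.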

(d) Lastly, we study the dynamics of “reservoir computing networks” (RCNs), a recurrent neural network (RNN) scheme for machine learning. The advantage of RCNs over traditional RNNs is that training is done only on the output layer, usually via a simple least-squares method. We show that RCNs are very effective for inferring unmeasured state variables of dynamical systems whose system state is only partially measured. Using the examples of the Lorenz system and the Rössler system, we demonstrate the potential of an RCN to perform as a universal model-free “observer”.
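A minimal sketch of such an observer for the Lorenz system, assuming illustrative (untuned) hyperparameters: a fixed random tanh reservoir is driven by the measured variable x, and only a linear output layer is fit by ridge regression to reproduce the unmeasured y and z.

```python
import numpy as np

def lorenz_traj(n_steps, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Integrate the Lorenz system with RK4; returns an (n_steps, 3) array."""
    def f(s):
        x, y, z = s
        return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])
    s = np.array([1.0, 1.0, 1.0])
    out = np.empty((n_steps, 3))
    for i in range(n_steps):
        k1 = f(s); k2 = f(s + 0.5 * dt * k1)
        k3 = f(s + 0.5 * dt * k2); k4 = f(s + dt * k3)
        s = s + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
        out[i] = s
    return out

def reservoir_observer(traj, n_res=300, spectral_radius=0.9,
                       train_frac=0.7, ridge=1e-6, seed=1):
    """Infer (y, z) from the measured x alone: drive a random tanh reservoir
    with x, then fit only the linear readout by ridge regression."""
    rng = np.random.default_rng(seed)
    A = rng.uniform(-1, 1, (n_res, n_res))
    A *= spectral_radius / np.max(np.abs(np.linalg.eigvals(A)))  # echo-state scaling
    w_in = rng.uniform(-0.1, 0.1, n_res)                         # input weights
    r = np.zeros(n_res)
    states = np.empty((len(traj), n_res + 1))
    for i, x in enumerate(traj[:, 0]):                           # only x is "measured"
        r = np.tanh(A @ r + w_in * x)
        states[i, :-1] = r
        states[i, -1] = 1.0                                      # bias term
    n_train = int(train_frac * len(traj))
    R, Y = states[:n_train], traj[:n_train, 1:]                  # targets: y and z
    w_out = np.linalg.solve(R.T @ R + ridge * np.eye(n_res + 1), R.T @ Y)
    return states[n_train:] @ w_out, traj[n_train:, 1:]          # (inferred, true)
```

Only `w_out` is trained, via a single linear solve; the reservoir itself stays fixed. The same recipe applies to the Rössler system by swapping in its vector field.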