PROCESSING INFORMATION ON INTERMEDIATE TIMESCALES WITHIN RECURRENT NEURAL NETWORKS
Butts, Dan A
The cerebral cortex has remarkable computational abilities; it can solve problems that remain beyond the most advanced man-made systems. This complexity arises from the structure of the neural network, which governs how the neurons interact. One surprising fact about this network is the dominance of ‘recurrent’ and ‘feedback’ connections. For example, only 5–10% of the connections into the earliest stage of visual processing are ‘feedforward’, in that they carry information from the eyes (via the Lateral Geniculate Nucleus). One possible role for these connections is to preserve information within the network: the underlying ‘causes’ of sensory stimuli usually persist for much longer than the timescales of neural processing, so understanding them requires the continued aggregation of information within the sensory cortices. In this dissertation, I investigate several models of such sensory processing via recurrent connections. I introduce the transient attractor network, which depends on recurrent plastic connectivity, and demonstrate in simulations how it might be involved in short-term memory, signal de-noising, and temporal coherence analysis. I then show how a certain recurrent network structure might allow transient associative learning to occur on timescales of seconds using presynaptic facilitation. Finally, I consider how auditory scene analysis might occur through ‘gamma partitioning’. This process uses recurrent excitatory and inhibitory connections to preserve information within the neural network about its recent state, allowing auditory sources to be separated into different perceptual cycles.
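The role of recurrent plastic connectivity in pattern maintenance and de-noising can be illustrated with a toy model. The following is a minimal Hopfield-style sketch, not the dissertation's actual transient attractor network; the network size, noise level, and Hebbian weight rule are all illustrative assumptions. Recurrent connections whose weights are set by a Hebbian (plasticity-like) rule pull a corrupted input state back toward the stored pattern:

```python
# Hypothetical illustration only: a small Hopfield-style recurrent network.
# Hebbian ("plastic") weights let the recurrent dynamics restore a noisy
# version of a stored pattern, a minimal form of signal de-noising.
import numpy as np

rng = np.random.default_rng(0)
n = 64

# A single stored pattern of +/-1 unit states.
pattern = rng.choice([-1.0, 1.0], size=n)

# Hebbian weight matrix: W_ij proportional to x_i * x_j, no self-connections.
W = np.outer(pattern, pattern) / n
np.fill_diagonal(W, 0.0)

# Corrupt 15% of the units, then let the recurrent dynamics settle.
state = pattern.copy()
flip = rng.choice(n, size=int(0.15 * n), replace=False)
state[flip] *= -1.0

for _ in range(10):  # synchronous recurrent updates
    state = np.sign(W @ state)
    state[state == 0] = 1.0

# Overlap of 1.0 means the stored pattern was perfectly recovered.
overlap = float(state @ pattern) / n
```

With a single stored pattern and this noise level, the recurrent dynamics recover the pattern in one update; the point of the sketch is only that information held in the connections, rather than in the instantaneous activity, lets the network clean up and briefly retain its input.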