Institute for Systems Research

Permanent URI for this community: http://hdl.handle.net/1903/4375

Search Results

Now showing 1 - 10 of 41
  • Dynamics of Neural Responses in Ferret Primary Auditory Cortex: I. Spectro-Temporal Response Field Characterization by Dynamic Ripple Spectra
    (1999) Depireux, Didier A.; Simon, J.Z.; Klein, David J.; Shamma, S.A.; ISR; CAAR
    To understand the neural representation of broadband, dynamic sounds in Primary Auditory Cortex (AI), we characterize responses using the Spectro-Temporal Response Field (STRF). The STRF describes and predicts the linear response of neurons to sounds with rich spectro-temporal envelopes. It is calculated here from the responses to elementary "ripples," a family of sounds with drifting, sinusoidal spectral envelopes--the complex spectro-temporal envelope of any broadband, dynamic sound can be expressed as the linear sum of individual ripples.

    The collection of responses to all elementary ripples is the spectro-temporal transfer function. Previous experiments using ripples with downward drifting spectra suggested that the transfer function is separable, i.e., it is reducible into a product of purely temporal and purely spectral functions.

    Here we compare the responses to upward and downward drifting ripples, assuming separability within each direction, to determine if the total bi-directional transfer function is fully separable. In general, the combined transfer function for two directions is not symmetric, and hence units in AI are not, in general, fully separable. Consequently, many AI units have complex response properties such as sensitivity to direction of motion, though most inseparable units are not strongly directionally selective.

    We show that for most neurons the lack of full separability stems from differences between the upward and downward spectral cross-sections, not from the temporal cross-sections; this places strong constraints on the neural inputs of these AI units.
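The ripple decomposition described above can be sketched numerically. The parameter values below (ripple velocity, ripple frequency, modulation depth) are illustrative assumptions, not figures from the paper:

```python
import numpy as np

def moving_ripple(t, x, omega=4.0, Omega=0.8, phase=0.0, amplitude=0.9):
    """Envelope value at time t (s) and log-frequency x (octaves).

    omega : ripple velocity in Hz (temporal drift rate) -- assumed value
    Omega : ripple frequency in cycles/octave (spectral density) -- assumed value
    """
    return 1.0 + amplitude * np.sin(2 * np.pi * (omega * t + Omega * x) + phase)

t = np.linspace(0, 1, 1000)          # 1 s of time
x = np.linspace(0, 5, 128)           # 5 octaves of log-frequency
T, X = np.meshgrid(t, x)             # spectro-temporal grid
envelope = moving_ripple(T, X)       # shape (128, 1000)

# Any dynamic spectral envelope can be approximated as a weighted sum of
# such ripples (its 2-D Fourier decomposition over time and log-frequency).
```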

  • Robust Spectro-Temporal Reverse Correlation for the Auditory System: Optimizing Stimulus Design
    (1999) Klein, David J.; Depireux, Didier A.; Simon, J.Z.; Shamma, S.A.; ISR; CAAR
    The spectro-temporal receptive field (STRF) is a functional descriptor of the linear processing of time-varying acoustic spectra by the auditory system. By cross-correlating sustained neuronal activity with the "dynamic spectrum" of a spectro-temporally rich stimulus ensemble, one obtains an estimate of the STRF.

    In this paper, the relationship between the spectro-temporal structure of any given stimulus and the quality of the STRF estimate is explored and exploited. Invoking the Fourier theorem, arbitrary dynamic spectra are described as sums of basic sinusoidal components, i.e., "moving ripples." Accurate estimation is found to be especially reliant on the prominence of components whose spectral and temporal characteristics are of relevance to the auditory locus under study, and is sensitive to the phase relationships between components with identical temporal signatures.

    These and other observations have guided the development and use of stimuli with deterministic dynamic spectra composed of the superposition of many "temporally orthogonal" moving ripples having a restricted, relevant range of spectral scales and temporal rates.

    The method, termed sum-of-ripples, is similar in spirit to the "white-noise approach," but enjoys the same practical advantages--which equate to faster and more accurate estimation--attributable to the time-domain sum-of-sinusoids method previously employed in vision research. Application of the method is exemplified with both modeled data and experimental data from ferret primary auditory cortex (AI).
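As a rough illustration of spectro-temporal reverse correlation, the sketch below cross-correlates a response with the preceding stimulus spectrum. The stimulus is synthetic white noise and the "neuron" is a purely linear model; both are stand-ins for real data, chosen so that the estimate provably converges to the true STRF:

```python
import numpy as np

rng = np.random.default_rng(0)
n_chan, n_time, n_lags = 16, 5000, 50

# Synthetic dynamic spectrum S(x, t): white across channels and time.
spectrum = rng.standard_normal((n_chan, n_time))

# Synthetic "true" STRF and a rate response generated by linear filtering.
true_strf = rng.standard_normal((n_chan, n_lags))
rate = np.zeros(n_time)
for lag in range(n_lags):
    rate[lag:] += true_strf[:, lag] @ spectrum[:, :n_time - lag]

# Reverse correlation: average the stimulus preceding each point of the
# response. For an uncorrelated (white) stimulus ensemble this recovers
# the STRF up to a scale factor.
est = np.zeros((n_chan, n_lags))
for lag in range(n_lags):
    est[:, lag] = spectrum[:, :n_time - lag] @ rate[lag:] / n_time

r = np.corrcoef(est.ravel(), true_strf.ravel())[0, 1]
```

For structured (non-white) stimuli such as ripple sums, the estimate must instead be normalized by the stimulus autocorrelation, which is part of what the paper's stimulus design addresses.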

  • The Case of the Missing Pitch Templates: How Harmonic Templates Emerge in the Early Auditory System
    (1999) Shamma, Shihab; Klein, David J.; ISR
    Periodicity pitch is the most salient and important of all pitch percepts. Psycho-acoustical models of this percept have long postulated the existence of internalized harmonic templates against which incoming resolved spectra can be compared, and pitch determined according to the best matching templates (Goldstein).

    However, it has been a mystery where and how such harmonic templates can come about. Here we present a biologically plausible model for how such templates can form in the early stages of the auditory system. The model demonstrates that any broadband stimulus, such as noise or random click trains, suffices for generating the templates, and that there is no need for any delay-lines, oscillators, or other neural temporal structures.

    The model consists of two key stages: cochlear filtering followed by coincidence detection. The cochlear stage provides responses analogous to those seen on the auditory nerve and cochlear nucleus. Specifically, it performs moderately sharp frequency analysis via a filter-bank with tonotopically ordered center frequencies (CFs); the rectified and phase-locked filter responses are further enhanced temporally to resemble the synchronized responses of cells in the cochlear nucleus.

    The second stage is a matrix of coincidence detectors that compute the average pair-wise instantaneous correlation (or product) between responses from all CFs across the channels. Model simulations show that for any broadband stimulus, high coincidences occur between cochlear channels that are exactly harmonic distances apart. Accumulating coincidences over time results in the formation of harmonic templates for all fundamental frequencies in the phase-locking frequency range.

    The model explains the critical role played by three subtle but important factors in cochlear function: the nonlinear transformations following the filtering stage; the rapid phase-shifts of the traveling wave near its resonance; and the spectral resolution of the cochlear filters. Finally, we discuss the physiological correlates and location of such a process and its resulting templates.
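The two-stage architecture can be sketched in miniature. The FFT-mask "filters", the click-train stimulus, and all parameters below are crude stand-ins for the paper's cochlear model; the sketch shows only the structure (tonotopic filter-bank, rectification, pairwise coincidence matrix), not the full emergence result:

```python
import numpy as np

fs = 8000
t = np.arange(0, 1.0, 1 / fs)
rng = np.random.default_rng(1)
stimulus = (rng.random(t.size) < 0.01).astype(float)   # random click train

spec = np.fft.rfft(stimulus)
freqs = np.fft.rfftfreq(t.size, 1 / fs)

# Stage 1: bandpass each channel around its CF (crude FFT masking in
# place of a cochlear filter), then half-wave rectify to mimic the
# rectified, phase-locked responses described above.
cfs = np.arange(100, 1600, 100)                        # center frequencies, Hz
channels = np.array([
    np.maximum(np.fft.irfft(spec * (np.abs(freqs - cf) < 30), n=t.size), 0)
    for cf in cfs
])

# Stage 2: coincidence matrix of average instantaneous products between
# all CF pairs; accumulated over time, its peaks at harmonically spaced
# CF pairs are what seed the harmonic templates.
coinc = channels @ channels.T / t.size
```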

  • Commodity Trading Using Neural Networks: Models for the Gold Market
    (1997) Brauner, Erik; Dayhoff, Judith E.; Sun, Xiaoyun; ISR
    Essential to building a good financial forecasting model is having a realistic trading model with which to evaluate forecasting performance. Using gold trading as a test platform, we present a profit-based model which we use to evaluate a number of different approaches to forecasting. Using novel training techniques, we show that neural network forecasting systems are capable of generating returns far above those of classical regression models.
  • Network Monitoring by Observing Message Codes
    (1996) De Almeida, Fernando A.; Baras, John S.; ISR; CSHCN
    The objective of this research is to develop methods to increase the reliability of a hybrid communications network. The main effort will be on early detection, isolation, and diagnosis of critical trends in parameters that could lead the network to a potential failure.

    The work is initially planned to be divided into five parts: assessment of the raw data (measurements from the network), feature extraction (data preprocessing), classification (data clustering according to the network's potential problems), system training (tuning methods), and self-improvement (module learning capability).

    Raw data analysis: The network database will be accessed to identify the measurements being performed. The data context and problem definition are also part of the data analysis process. The problem, the solution to the problem, and strategies for solving such problems will be defined according to network management standards. The data organization, the parameters being monitored, the data context, and its amount are the targets at this level of the research.

    Feature extraction: The data is preprocessed in such a way that events in the network are converted into a vector of parameters, called the "feature vector". The objective at this level is to transform data into information to be used further in the network monitoring system.

  • Convergence Analysis and Analog Circuit Applications for a Class of Networks of Nonlinear Coupled Oscillators
    (1996) Justh, Eric W.; Krishnaprasad, Perinkulam S.; Kub, Francis J.; ISR
    The physical motivation and rigorous proof of convergence for a particular network of nonlinear coupled oscillators are reviewed. Next, the network and convergence proof are generalized in several ways, to make the network more applicable to actual engineering problems. It is argued that such coupled oscillator circuits are more natural to implement in analog hardware than other types of dynamical equations because the signal levels tend to remain at sufficiently large values that effects of offsets and mismatch are minimized. Examples of how analog implementations of these networks are able to address actual control problems are given. The first example shows how a pair of coupled oscillators can be used to compensate for the feedback path phase shift in a complex LMS loop, and has potential application for analog adaptive antenna arrays or linear predictor circuits. The second example shows how a single oscillator circuit with feedback could be used for continuous wavelet transform applications. Finally, analog CMOS implementation of the coupled oscillator dynamics is briefly discussed.
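A generic pair of coupled phase oscillators (a Kuramoto-type toy model, not the paper's specific circuit dynamics or its convergence proof) illustrates the kind of phase-locking behavior such networks exhibit:

```python
import numpy as np

# Two phase oscillators with slightly different natural frequencies,
# symmetrically coupled; forward-Euler integration. All values are
# illustrative assumptions.
dt, steps = 0.001, 20000
omega = np.array([10.0, 10.5])       # natural frequencies (rad/s)
k = 5.0                              # coupling gain
theta = np.array([0.0, 2.0])         # initial phases

for _ in range(steps):
    dtheta = omega + k * np.sin(theta[::-1] - theta)
    theta = theta + dtheta * dt

# When |omega difference| < 2k, the pair locks to a constant phase offset
# satisfying sin(offset) = (omega[0] - omega[1]) / (2 k).
phase_gap = np.abs(np.sin(theta[0] - theta[1]))
```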
  • Neural Modelling with Wavelets and Application in Adaptive/Learning Control
    (1995) Kugarajah, Tharmarajah; Krishnaprasad, P.S.; ISR
    Spatio-spectral properties of the Wavelet Transform provide a useful theoretical framework to investigate the structure of neural networks. A few researchers (Pati & Krishnaprasad, Zhang & Benveniste) have investigated the connection between neural networks and wavelet transforms. However, a number of issues remain unresolved, especially when the connection is considered in the multidimensional case. In our work, we resolve these issues by extensions based on some theorems of Daubechies related to wavelet frames, and provide a framework to analyze local learning in neural networks.

    We also provide a constructive procedure to build networks based on wavelet theory. Moreover, cognizant of the problems usually encountered in practical implementations of these ideas, we develop a heuristic methodology, inspired by similar work in the area of Radial Basis Function (RBF) networks (Moody & Darken, Platt), to build a network sequentially on-line as well as off-line.

    We show some connections of our method to existing methods such as Projection Pursuit Regression (Friedman), Hyper Basis Functions (Poggio & Girosi), and other methods that have been proposed in the literature on neural networks as well as statistics. In particular, some classes of wavelets can also be derived from the regularization-theoretic framework given by Poggio & Girosi.

    Finally, we choose direct nonlinear adaptive control to demonstrate the utility of the network in the context of local learning. Stability analysis is carried out within a standard Lyapunov formulation. Simulation studies show the effectiveness of these methods. We compare and contrast these methods with some recent results obtained by other researchers using Back Propagation (Feed-Forward) Networks, and Gaussian Networks.
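A minimal sketch of the wavelet-network idea in one dimension: a linear combination of dilated and translated "Mexican hat" wavelets fit to data by least squares. The ad hoc grid of scales and translations below is an assumption for illustration; the thesis instead derives frame-based and sequential construction procedures:

```python
import numpy as np

def mexican_hat(x):
    # Second derivative of a Gaussian, a common wavelet choice.
    return (1 - x ** 2) * np.exp(-x ** 2 / 2)

x = np.linspace(-4, 4, 200)
target = np.sin(x) * np.exp(-x ** 2 / 8)          # function to approximate

# Design matrix: wavelets on a small grid of scales and translations.
columns = []
for scale in (1.0, 2.0, 4.0):
    for shift in np.arange(-4, 4.5, 0.5):
        columns.append(mexican_hat((x - shift) / scale))
Phi = np.array(columns).T                          # (200, 51)

# Output weights by linear least squares (no nonlinear training needed
# when the wavelet grid is fixed).
weights, *_ = np.linalg.lstsq(Phi, target, rcond=None)
approx = Phi @ weights
err = np.max(np.abs(approx - target))
```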

  • Analysis of Dynamic Spectra in Ferret Primary Auditory Cortex: II. Prediction of Unit Responses to Arbitrary Dynamic Spectra
    (1995) Kowalski, Nina; Depireux, Didier A.; Shamma, S.A.; ISR
    Responses of single units and unit clusters were recorded in the ferret primary auditory cortex (AI) using broadband complex dynamic spectra. Previous work (Kowalski et al. 1995) demonstrated that simpler spectra consisting of single moving ripples (i.e., sinusoidally modulated spectral profiles that travel at a constant velocity along the logarithmic frequency axis) could be used effectively to characterize the response fields and transfer functions of AI cells. An arbitrary complex dynamic spectral profile can be thought of conceptually as being composed of a weighted sum of moving ripple spectra. Such a decomposition can be computed from a two-dimensional spectro-temporal Fourier transform of the dynamic spectral profile with moving ripples as the basis functions. Therefore, if AI units were essentially linear, satisfying the superposition principle, then their responses to arbitrary dynamic spectra could be predicted from the responses to single moving ripples, i.e., from the units' response fields and transfer functions. This conjecture was tested and confirmed with data from 293 combinations of moving ripples, involving complex spectra composed of up to 15 moving ripples of different ripple frequencies and velocities. For each case, response predictions based on the unit transfer functions were compared to measured responses. The correlation between predicted and measured responses was found to be consistently high (84% with rho > 0.6). The distribution of response parameters suggests that AI cells may encode the profile of a dynamic spectrum by performing a multiscale spectro-temporal decomposition of the dynamic spectral profile in a largely linear manner.
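The superposition conjecture rests on simple linearity. A synthetic sketch (with hypothetical complex transfer-function values, not measured ones) shows why, for a linear unit, the prediction assembled from single-ripple responses matches the response to the ripple combination exactly:

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0, 1, 1000)

ripple_velocities = [4.0, 8.0, 12.0]                 # Hz, illustrative
# Hypothetical complex transfer function: one gain/phase per ripple.
transfer = {w: rng.standard_normal() + 1j * rng.standard_normal()
            for w in ripple_velocities}

def response(velocities):
    """Linear unit's rate response to a sum of moving ripples."""
    return sum((transfer[w] * np.exp(2j * np.pi * w * t)).real
               for w in velocities)

# Response to the ripple combination vs. sum of single-ripple responses:
# identical for a linear unit, which is what the 293 ripple-combination
# tests above probe in real neurons.
combined = response(ripple_velocities)
summed = sum(response([w]) for w in ripple_velocities)
```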

  • Analysis of Dynamic Spectra in Ferret Primary Auditory Cortex: I. Characteristics of Single Unit Responses to Moving Ripple Spectra
    (1995) Kowalski, Nina; Depireux, Didier A.; Shamma, S.A.; ISR
    Auditory stimuli referred to as moving ripples are used to characterize the responses of both single and multiple units in the ferret primary auditory cortex (AI). Moving ripples are broadband complex sounds with sinusoidal spectral profiles that drift along the tonotopic axis at a constant velocity. Neuronal responses to moving ripples are locked to the phase of the ripple, i.e., they exhibit the same periodicity as that of the moving ripple profile. Neural responses are characterized as a function of ripple velocity (temporal property) and ripple frequency (spectral property). Transfer functions describing the response to these temporal and spectral modulations are constructed. Temporal transfer functions are inverse Fourier transformed to obtain impulse response functions that reflect the cell's temporal characteristics. Ripple transfer functions are inverse Fourier transformed to obtain the response field, characterizing the cell's response area along the tonotopic axis. These operations assume linearity in the cell's response to moving ripples. Separability of the temporal and ripple transfer functions is established by comparing transfer functions across different test parameters. Response fields measured with either stationary ripples or moving ripples are shown to be similar. Separability implies that the neuron can be modeled as processing spatio-temporal information in two distinct stages. The assumption of linearity implies that each of these stages is a linear operation.

    The ripple parameters that characterize cortical cells are distributed somewhat evenly, with characteristic ripple frequencies ranging from 0.2 to over 2 cycles/octave and characteristic angular frequencies typically ranging from 2 to 20 Hz. Many responses exhibit periodicities not found in the spectral envelope of the stimulus. These periodicities are of two types. Slow rebounds with a period of about 150 ms appear with various strengths in about 30% of the cells. Fast regular firings, with interspike intervals on the order of 10 ms, are much less common and may reflect the ability of certain cells to follow the fine structure of the stimulus.
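The two inverse transforms described above can be sketched with synthetic transfer functions. The bandpass shapes, latency slope, and sampling grids below are illustrative assumptions, not fitted values:

```python
import numpy as np

# Temporal transfer function: response gain/phase vs. ripple velocity.
velocities = np.arange(0, 32, 2.0)                # ripple velocities, Hz
gain = np.exp(-((velocities - 8.0) ** 2) / 20.0)  # bandpass near 8 Hz
phase = -2 * np.pi * velocities * 0.03            # ~30 ms latency slope
T = gain * np.exp(1j * phase)

# Inverse Fourier transform -> impulse response (temporal dynamics).
impulse_response = np.fft.irfft(T)

# Ripple transfer function: gain/phase vs. ripple frequency (cyc/oct).
ripple_freqs = np.arange(0, 4, 0.25)
R = (np.exp(-((ripple_freqs - 1.0) ** 2) / 0.5)
     * np.exp(-1j * np.pi * ripple_freqs * 0.2))

# Inverse Fourier transform -> response field along the tonotopic axis.
response_field = np.fft.irfft(R)
```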

  • Dynamic Attractors and Basin Class Capacity in Binary Neural Networks
    (1995) Dayhoff, Judith E.; Palmadesso, Peter J.; ISR
    The wide repertoire of attractors and basins of attraction that appear in dynamic neural networks not only serves as a model of brain activity patterns but also creates possibilities for new computational paradigms that use attractors and their basins. To develop such computational paradigms, it is first critical to assess neural network capacity for attractors and for differing basins of attraction, depending on the number of neurons and the weights. In this paper we analyze the attractors and basins of attraction for recurrent, fully-connected, single-layer binary networks. We utilize the network transition graph - a graph that shows all transitions from one state to another for a given neural network - to show all oscillations and fixed-point attractors, along with the basins of attraction. Conditions are shown whereby pairs of transitions are possible from the same neural network. We derive a lower bound of 2^(n^2 - n) on the number of possible transition graphs for an n-neuron network. Simulation results show a wide variety of transition graphs and basins of attraction, and networks sometimes have more attractors than neurons. We count thousands of basin classes - networks with differing basins of attraction - in networks with as few as five neurons. Dynamic networks show promise for overcoming the limitations of static neural networks through the use of dynamic attractors and their basins. We show that dynamic networks have a high capacity for basin classes, can have more attractors than neurons, and have more stable basin boundaries than the Hopfield associative memory.
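The transition-graph analysis can be sketched directly for a small network. The random weights and the sign-threshold synchronous update below are assumed conventions for illustration:

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(3)
n = 5
W = rng.standard_normal((n, n))      # fully-connected random weights

# All 2^n binary (+1/-1) states of the n-neuron network.
states = [np.array(s) for s in product([-1, 1], repeat=n)]

def step(s):
    """Synchronous sign-threshold update: s' = sign(W s)."""
    return np.where(W @ s >= 0, 1, -1)

# Transition graph: each state maps to exactly one successor.
graph = {tuple(s): tuple(step(s)) for s in states}

# Follow transitions from every state until a repeat: the repeating
# cycle is an attractor (a fixed point if its length is 1), and the
# states funneling into it form its basin.
attractors = set()
for s in graph:
    seen = []
    while s not in seen:
        seen.append(s)
        s = graph[s]
    cycle = seen[seen.index(s):]
    i = cycle.index(min(cycle))                    # canonical rotation
    attractors.add(tuple(cycle[i:] + cycle[:i]))
```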