The Adaptive Time-Delay Neural Network: Characterization and Applications to Pattern Recognition, Prediction and Signal Processing

Files

PhD_94-12.pdf (7.21 MB)
No. of downloads: 596

Date

1994

Abstract

Dynamic analysis of temporally changing signals is a key issue in real-time signal processing and understanding. Such changing signals may arise from moving objects in visual images, spoken words, target trajectories, medical recordings, and other kinds of sensor data in a wide variety of applications. Neural network architectures with dynamic and temporal capabilities are promising for signal analysis. In this dissertation, we describe a solution to temporal problems with a flexible architecture: the Adaptive Time-Delay Neural Network (ATNN). A rich repertoire of the ATNN's capabilities is characterized in this thesis, and a series of novel applications is implemented that shows, for the first time, the capabilities and promise of the ATNN architecture on classes of engineering problems.

Inspired by the time delays that occur in neurobiological signal transmission, the dynamic properties of this network are formulated on-line through the adaptation of its internal parameters by gradient descent rules, according to the evolution of observed inputs and outputs. An arbitrary number of interconnections with different time delays can be placed between any two processing units, and the time delays are adapted without constraint with respect to one another. The network thus automatically attempts to optimize its configuration, which overcomes limitations of previous work.
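To illustrate this mechanism, the following is a minimal sketch, not the dissertation's implementation, of a single linear unit with several delayed taps on one input signal, where both the tap weights and the tap delays are updated by gradient descent. Handling fractional delays by linear interpolation, the central-difference slope estimate, and all variable names are assumptions made for illustration.

```python
import numpy as np

def delayed(x, t, d):
    """Linearly interpolated value of signal x at time t - d (d may be fractional)."""
    tau = t - d
    i = int(np.floor(tau))
    if i < 0:
        return x[0]
    if i + 1 >= len(x):
        return x[-1]
    frac = tau - i
    return (1 - frac) * x[i] + frac * x[i + 1]

rng = np.random.default_rng(0)
K = 3                                  # number of delayed taps
w = rng.normal(scale=0.5, size=K)      # tap weights
d = np.array([1.0, 2.0, 3.0])          # tap delays, each adapted independently
lr_w, lr_d = 0.05, 0.01

# Toy task: reproduce x[t] from delayed samples of the same signal.
x = np.sin(0.3 * np.arange(200))
for epoch in range(100):
    for t in range(5, len(x)):
        taps = np.array([delayed(x, t, dk) for dk in d])
        err = taps @ w - x[t]
        # d(output)/d(delay_k) = -w_k * x'(t - d_k); estimate the local
        # slope x'(t - d_k) by a central difference over the delay.
        slope = np.array([delayed(x, t, dk - 0.5) - delayed(x, t, dk + 0.5)
                          for dk in d])
        grad_w = err * taps            # dL/dw for squared error L
        grad_d = -err * w * slope      # dL/dd
        w -= lr_w * grad_w
        d -= lr_d * grad_d
        d = np.clip(d, 0.5, 10.0)      # keep delays in a valid range
```

The key point the sketch mirrors is that each delay is a free parameter, adjusted by the same gradient-descent machinery as the weights, rather than being fixed in advance.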

We have proposed a network analysis tool, the Network Unfolding Algorithm (NUA), and through a complexity analysis of the algorithm, demonstrated that the ATNN reduces network complexity by at least a factor of O(n) compared to a standard backpropagation network. We apply the theorems of Funahashi and of Hornik et al., together with the Stone-Weierstrass theorem, to establish the general function approximation ability of the ATNN. We have also evaluated the behavior of internal hidden units over a series of applications.

The ATNN is capable of spatiotemporal trajectory learning, trajectory generation, and pattern completion with different topologies, different training arcs, various data sampling rates, and any number of dimensions. Relationships have been discovered between the complexity of the topology and the number of hidden units, and between pattern generation abilities and the choice of training arcs and data sampling rates. It is shown that the ATNN is resilient to perturbations in the data and is capable of recovering a spatiotemporal signal from noisy data. With a special recurrent configuration, the ATNN can learn closed trajectories; these trajectories become trained attractors of the dynamic network.

The ATNN is applied to chaotic time series prediction, and we demonstrate that this complicated prediction problem can be accomplished by a simple ATNN architecture. Furthermore, the ATNN can detect the embedded dynamics. A real-world problem, distinguishing multiple types of components of exoatmospheric targets, is also addressed with the ATNN. The ATNN performs a highly complex mapping on a limited set of training data and generalizes well to overall trends in the data. Its performance outstrips that of related neural networks.
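The flavor of the prediction task can be conveyed with a toy stand-in: the sketch below, which is not the dissertation's architecture or data, generates a chaotic sequence from the logistic map and trains a small tanh network to predict one step ahead from two fixed delayed taps. The map, the network size, and all names are assumptions for illustration; the ATNN additionally adapts the delays themselves.

```python
import numpy as np

rng = np.random.default_rng(1)

# Chaotic sequence from the logistic map (r = 4), a standard stand-in
# for a chaotic time series.
N = 500
x = np.empty(N)
x[0] = 0.3
for n in range(N - 1):
    x[n + 1] = 4.0 * x[n] * (1.0 - x[n])

# One-step-ahead prediction from two fixed delay taps (delays 1 and 2)
# with a small tanh network trained by per-sample gradient descent.
X = np.stack([x[1:-1], x[:-2]], axis=1)  # inputs: x[n], x[n-1]
y = x[2:]                                # target: x[n+1]
H = 8                                    # hidden units
W1 = rng.normal(scale=0.5, size=(H, 2))
b1 = np.zeros(H)
W2 = rng.normal(scale=0.5, size=H)
b2 = 0.0
lr = 0.05

for epoch in range(300):
    for i in rng.permutation(len(X)):
        h = np.tanh(W1 @ X[i] + b1)
        err = (W2 @ h + b2) - y[i]
        gh = err * W2 * (1 - h ** 2)     # backpropagate through tanh
        W2 -= lr * err * h
        b2 -= lr * err
        W1 -= lr * np.outer(gh, X[i])
        b1 -= lr * gh
```

Even this small fixed-delay network can capture the map's one-step dynamics; the ATNN's contribution is to discover appropriate delays automatically, which matters when the relevant embedding is not known in advance.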
