Analysis and Synthesis of Feedforward Neural Networks Using Discrete Affine Wavelet Transformations
Abstract
In this paper we develop a theoretical description of standard feedforward neural networks in terms of discrete affine wavelet transforms. This description aids in establishing a rigorous understanding of the behavior of feedforward neural networks based upon the properties of wavelet transforms. Time-frequency localization properties of wavelet transforms are shown to be crucial to our formulation. In addition to providing a solid mathematical foundation for feedforward neural networks, this theory may prove useful in explaining some of the empirically obtained results in the field of neural networks. Among the more practical implications of this work are the following: (1) Simple analysis of the training data provides a complete topological definition for a feedforward neural network. (2) Faster and more efficient learning algorithms are obtained by reducing the dimension of the parameter space in which interconnection weights are searched. This reduction of the weight space is obtained via the same analysis used to configure the network. Global convergence of the iterative training procedure discussed here is assured. Moreover, it is possible to arrive at a non-iterative training procedure which involves solving a system of linear equations. (3) Every feedforward neural network constructed using our wavelet formulation is equivalent to a 'standard feedforward network.' Hence properties of neural networks which have prompted the study of VLSI implementations of such networks are retained.
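The non-iterative training idea in point (2) can be illustrated with a minimal sketch (not the paper's exact construction): a one-hidden-layer network whose hidden units are dilated and translated copies of a mother wavelet, with the output weights obtained by solving a linear least-squares system rather than by iterative search. The choice of mother wavelet (a "Mexican hat" here), the dilation/translation grid, and the target function are all illustrative assumptions.

```python
import numpy as np

def wavelet(x):
    # "Mexican hat" (second derivative of a Gaussian); the choice of
    # mother wavelet is an assumption for this sketch.
    return (1.0 - x**2) * np.exp(-x**2 / 2.0)

def design_matrix(x, scales, shifts):
    # Each column is one hidden unit: psi((x - b) / a) for dilation a
    # and translation b, i.e. a discrete affine family of wavelets.
    cols = [wavelet((x - b) / a) for a in scales for b in shifts]
    return np.stack(cols, axis=1)

# Target function to approximate on [-4, 4].
x = np.linspace(-4.0, 4.0, 400)
y = np.sin(2.0 * x) * np.exp(-0.1 * x**2)

# Dilation/translation grid chosen by inspection of the data; in the
# paper's framework this grid would come from analyzing the training data.
scales = [0.5, 1.0, 2.0]
shifts = np.linspace(-4.0, 4.0, 17)
Phi = design_matrix(x, scales, shifts)

# Non-iterative "training": the output weights solve a linear
# least-squares system, so no gradient iteration is needed.
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
approx = Phi @ w
rmse = np.sqrt(np.mean((approx - y) ** 2))
print(f"hidden units: {Phi.shape[1]}, RMSE: {rmse:.4f}")
```

Because only the linear output layer is solved for, the approximation quality depends entirely on how well the fixed dilation/translation grid covers the time-frequency content of the target, which is why the localization analysis emphasized in the abstract matters.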