Convergence of a Neural Network Classifier

dc.contributor.author: Baras, John S.
dc.contributor.author: LaVigna, Anthony
dc.contributor.department: ISR
dc.date.accessioned: 2007-05-23T09:45:11Z
dc.date.available: 2007-05-23T09:45:11Z
dc.date.issued: 1990
dc.description.abstract: Kohonen's Learning Vector Quantization (LVQ) is a neural network architecture that performs nonparametric classification. It classifies observations by comparing them to k templates called Voronoi vectors. The locations of these vectors are determined from past labeled data through a learning algorithm. When learning is complete, the class of a new observation is the class of the closest Voronoi vector. Hence LVQ is similar to nearest-neighbor classification, except that only the k Voronoi vectors, rather than all of the past observations, are searched. In this paper, we show that the LVQ learning algorithm performs stochastic approximation and converges to locally asymptotically stable equilibria of an ordinary differential equation. Convergence of the Voronoi vectors is guaranteed under appropriate conditions on the underlying statistics of the classification problem. We also present a modification to the learning algorithm which, we argue, results in convergence of the LVQ error to the Bayesian optimal error as the appropriate parameters become large.
dc.format.extent: 937978 bytes
dc.format.mimetype: application/pdf
dc.identifier.uri: http://hdl.handle.net/1903/4963
dc.language.iso: en_US
dc.relation.ispartofseries: ISR; TR 1990-12
dc.subject: Systems Integration
dc.title: Convergence of a Neural Network Classifier
dc.type: Technical Report
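
The full derivation is in the report itself; as a rough illustration of the LVQ1-style scheme the abstract describes (attract the closest Voronoi vector toward a correctly classified sample, repel it otherwise), here is a minimal sketch in Python with NumPy. The function names, the decreasing step-size schedule, and the interfaces are illustrative assumptions, not taken from the paper; the per-sample update is the stochastic-approximation step whose averaged behavior corresponds to the ordinary differential equation the abstract refers to.

```python
import numpy as np

def lvq1_fit(X, y, codebook, codebook_labels, lr=0.1, epochs=20):
    """Train Voronoi (codebook) vectors with an LVQ1-style update rule."""
    W = codebook.astype(float).copy()
    for epoch in range(epochs):
        alpha = lr * (1.0 - epoch / epochs)  # decreasing step size (assumed schedule)
        for x, c in zip(X, y):
            i = np.argmin(np.linalg.norm(W - x, axis=1))  # closest Voronoi vector
            if codebook_labels[i] == c:
                W[i] += alpha * (x - W[i])  # attract toward a same-class sample
            else:
                W[i] -= alpha * (x - W[i])  # repel from an other-class sample
    return W

def lvq_classify(x, W, codebook_labels):
    """Class of a new observation = class of the closest Voronoi vector."""
    return codebook_labels[np.argmin(np.linalg.norm(W - x, axis=1))]
```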

Files

Original bundle
Name: TR_90-12.pdf
Size: 915.99 KB
Format: Adobe Portable Document Format