Show simple item record

dc.contributor.author	Baras, John S.	en_US
dc.contributor.author	LaVigna, Anthony	en_US
dc.date.accessioned	2007-05-23T09:45:11Z
dc.date.available	2007-05-23T09:45:11Z
dc.date.issued	1990	en_US
dc.identifier.uri	http://hdl.handle.net/1903/4963
dc.description.abstract	Kohonen's Learning Vector Quantization (LVQ) is a neural network architecture that performs nonparametric classification. It classifies observations by comparing them to k templates called Voronoi vectors. The locations of these vectors are determined from past labeled data through a learning algorithm. When learning is complete, the class of a new observation is the same as the class of the closest Voronoi vector. Hence LVQ is similar to nearest neighbors, except that instead of searching all of the past observations, only the k Voronoi vectors are searched. In this paper, we show that the LVQ learning algorithm converges to locally asymptotically stable equilibria of an ordinary differential equation. We show that the learning algorithm performs stochastic approximation. Convergence of the Voronoi vectors is guaranteed under appropriate conditions on the underlying statistics of the classification problem. We also present a modification to the learning algorithm that, we argue, results in convergence of the LVQ error to the Bayesian optimal error as the appropriate parameters become large.	en_US
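The abstract describes LVQ's decision rule (classify by the nearest Voronoi vector) and its competitive learning update. A minimal sketch of that scheme, in the style of Kohonen's LVQ1, is below; the function names, the fixed learning rate `alpha`, and the NumPy implementation are illustrative assumptions, not taken from the report itself:

```python
import numpy as np

def classify(x, vectors, labels):
    """Return the class of the Voronoi vector closest to observation x."""
    # Nearest-neighbor search over the k Voronoi vectors only,
    # rather than over all past observations.
    i = np.argmin(np.linalg.norm(vectors - x, axis=1))
    return labels[i]

def lvq1_step(x, y, vectors, labels, alpha):
    """One LVQ1-style learning step on labeled observation (x, y).

    The winning Voronoi vector moves toward x if its label agrees
    with y, and away from x otherwise. (alpha is an illustrative
    fixed step size; convergence analyses use a decreasing gain.)
    """
    i = np.argmin(np.linalg.norm(vectors - x, axis=1))
    sign = 1.0 if labels[i] == y else -1.0
    vectors[i] += sign * alpha * (x - vectors[i])
    return vectors
```

Viewing the update as a stochastic approximation, the Voronoi vectors track an associated ordinary differential equation, which is the basis for the convergence result stated in the abstract.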
dc.format.extent	937978 bytes
dc.format.mimetype	application/pdf
dc.language.iso	en_US	en_US
dc.relation.ispartofseries	ISR; TR 1990-12	en_US
dc.subject	Systems Integration	en_US
dc.title	Convergence of a Neural Network Classifier	en_US
dc.type	Technical Report	en_US
dc.contributor.department	ISR	en_US

