Convergence of a Neural Network Classifier

Files

TR_90-12.pdf (915.99 KB)

Date

1990

Abstract

Kohonen's Learning Vector Quantization (LVQ) is a neural network architecture that performs nonparametric classification. It classifies observations by comparing them to k templates called Voronoi vectors. The locations of these vectors are determined from past labeled data through a learning algorithm. When learning is complete, the class of a new observation is the class of the closest Voronoi vector. Hence LVQ is similar to nearest neighbors, except that instead of searching all of the past observations, only the k Voronoi vectors are searched. In this paper, we show that the LVQ learning algorithm converges to locally asymptotically stable equilibria of an ordinary differential equation. We show that the learning algorithm performs stochastic approximation. Convergence of the Voronoi vectors is guaranteed under appropriate conditions on the underlying statistics of the classification problem. We also present a modification to the learning algorithm which, we argue, results in convergence of the LVQ error to the Bayesian optimal error as the appropriate parameters become large.
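
For concreteness, the following is a minimal sketch of an LVQ1-style learning rule in Python. The function names, the per-class vector count, the random initialization, and the specific decreasing gain schedule are illustrative assumptions, not the exact formulation analyzed in the report; the decreasing gain is included only to reflect the stochastic approximation setting described above.

import numpy as np

def lvq1_fit(X, y, k_per_class=2, lr=0.05, epochs=30, seed=0):
    """LVQ1-style training: move the nearest Voronoi vector toward a
    correctly labeled observation and away from a mislabeled one."""
    rng = np.random.default_rng(seed)
    classes = np.unique(y)
    # Initialize each class's Voronoi vectors at randomly chosen samples of that class.
    W = np.vstack([X[rng.choice(np.flatnonzero(y == c), size=k_per_class, replace=False)]
                   for c in classes]).astype(float)
    labels = np.repeat(classes, k_per_class)
    for t in range(epochs):
        gain = lr / (1.0 + t)  # decreasing gain, in the spirit of stochastic approximation
        for i in rng.permutation(len(X)):
            j = np.argmin(((W - X[i]) ** 2).sum(axis=1))  # closest Voronoi vector
            sign = 1.0 if labels[j] == y[i] else -1.0
            W[j] += sign * gain * (X[i] - W[j])
    return W, labels

def lvq1_predict(W, labels, X):
    # A new observation gets the class of its closest Voronoi vector.
    dists = ((X[:, None, :] - W[None, :, :]) ** 2).sum(axis=2)
    return labels[np.argmin(dists, axis=1)]

Note that classification here touches only the k Voronoi vectors per query, which is the computational saving over nearest neighbors that the abstract points out.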
