A Structured Fixed-Rate Vector Quantizer Derived from Variable-Length Encoded Scalar Quantizers
The well-known error propagation problem inherent in any variable-length coding operation limits the usefulness of variable-length encoded entropy-constrained scalar quantizers when the quantizer outputs are to be transmitted over a noisy channel. In the absence of channel noise, however, these quantizers are known to outperform error-minimizing fixed-rate Lloyd-Max quantizers for a wide class of memoryless sources. Motivated by this observation, in this paper we develop a fixed-rate vector quantization scheme that achieves performance close to that of optimum entropy-constrained scalar quantizers; because the encoder is fixed-rate, channel error propagation is no longer an issue. An algorithm for the design of this scheme is described, and procedures for codebook search and codevector encoding are developed. We show that codebooks significantly larger than those in conventional vector quantizers can be designed. Numerical results demonstrating the efficacy of this scheme, along with comparisons against Lloyd-Max quantizers and optimal entropy-constrained quantizers, are presented.
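The Lloyd-Max quantizer used as the fixed-rate baseline above is designed by alternating two optimality conditions: nearest-neighbor partitioning of the source samples, then recentering each level at the centroid (conditional mean) of its cell. The sketch below illustrates this iteration on samples from a unit Gaussian source; the function name and sample-based (training-set) formulation are illustrative assumptions, not the paper's own design procedure.

```python
import numpy as np

def lloyd_max(samples, levels, iters=100):
    """Sample-based Lloyd-Max design of a scalar quantizer (illustrative sketch)."""
    # Initialize levels at evenly spaced quantiles of the training samples.
    codebook = np.quantile(samples, (np.arange(levels) + 0.5) / levels)
    for _ in range(iters):
        # Nearest-neighbor condition: assign each sample to its closest level.
        idx = np.argmin(np.abs(samples[:, None] - codebook[None, :]), axis=1)
        # Centroid condition: move each level to the mean of its cell.
        for j in range(levels):
            cell = samples[idx == j]
            if cell.size:
                codebook[j] = cell.mean()
    # Final assignment with the converged codebook.
    idx = np.argmin(np.abs(samples[:, None] - codebook[None, :]), axis=1)
    return codebook, codebook[idx]

rng = np.random.default_rng(0)
x = rng.standard_normal(20000)
codebook, xq = lloyd_max(x, levels=4)
mse = np.mean((x - xq) ** 2)
print(f"4-level Lloyd-Max MSE on unit Gaussian samples: {mse:.4f}")
```

For a unit-variance Gaussian source, the empirical mean-squared error of a 4-level (2-bit) design should fall near the known Lloyd-Max distortion of roughly 0.12; an entropy-constrained scalar quantizer followed by variable-length coding can do better at the same average rate, which is the gap the paper's fixed-rate scheme aims to close.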