Extension of the Fixed-Rate Structured Vector Quantizer to Vector Sources

Files

TR_91-106.pdf (1.23 MB)

Date

1991

Abstract

The fixed-rate structured vector quantizer (SVQ), derived from a variable-length scalar quantizer, was originally proposed for quantizing stationary memoryless sources. In this paper, the SVQ is extended to a specific class of vector sources in which each component is a stationary memoryless scalar subsource independent of the other components. The algorithms for the design and implementation of the original SVQ are modified to apply to this case. The resulting SVQ, referred to as the extended SVQ (ESVQ), is then used to quantize stationary sources with memory (with known autocorrelation function). This is done by first applying a linear orthonormal block transformation, such as the Karhunen-Loeve transform, to decorrelate a block of source samples. The transform output vectors, which can be approximated as the output of an independent-component vector source, are then quantized using the ESVQ. Numerical results are presented for the quantization of first-order Gauss-Markov sources with this scheme. It is shown that the ESVQ-based scheme performs very close to entropy-coded transform quantization while maintaining a fixed-rate output, and that it outperforms the fixed-rate scheme that uses scalar Lloyd-Max quantization of the transform coefficients. Finally, it is shown that this scheme also performs better than implementable vector quantizers, especially at high rates.
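
As a rough illustration of the decorrelation step described above, the following sketch (not part of the report) builds the Karhunen-Loeve transform from the known autocorrelation of a first-order Gauss-Markov source and applies it block-wise. The block size N = 8, correlation coefficient rho = 0.9, and the function names are illustrative assumptions; the ESVQ quantization stage itself is not shown.

# Minimal sketch of KLT-based decorrelation of a first-order Gauss-Markov
# source, as a front end to a quantizer such as the ESVQ (not implemented here).
import numpy as np

def klt_from_gauss_markov(rho, block_size):
    """KLT (eigenvector) matrix for a source with autocorrelation R[k] = rho**|k|."""
    idx = np.arange(block_size)
    R = rho ** np.abs(idx[:, None] - idx[None, :])   # Toeplitz autocorrelation matrix
    eigvals, eigvecs = np.linalg.eigh(R)             # symmetric eigendecomposition
    order = np.argsort(eigvals)[::-1]                # sort by decreasing variance
    return eigvecs[:, order].T                       # rows are the KLT basis vectors

def generate_gauss_markov(n_samples, rho, rng):
    """x[n] = rho * x[n-1] + w[n], scaled for unit-variance stationary output."""
    w = rng.standard_normal(n_samples) * np.sqrt(1.0 - rho ** 2)
    x = np.empty(n_samples)
    x[0] = rng.standard_normal()
    for n in range(1, n_samples):
        x[n] = rho * x[n - 1] + w[n]
    return x

rho, N = 0.9, 8                                      # assumed parameters
rng = np.random.default_rng(0)
T = klt_from_gauss_markov(rho, N)
x = generate_gauss_markov(10_000 * N, rho, rng).reshape(-1, N)   # blocks of N samples
y = x @ T.T                                          # approximately uncorrelated coefficients
print(np.round(np.var(y, axis=0), 3))                # per-coefficient variances (decreasing)

Each row of y would then be fed to the ESVQ (or, in the baselines mentioned above, to per-coefficient scalar quantizers); the decreasing coefficient variances are what make unequal bit allocation across components worthwhile.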
