Vector Quantization (矢量量化)
赵胜辉

Scalar Quantization

Scalar Quantization vs. Vector Quantization
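
The slide itself is a figure; as a minimal illustration (not from the slides, names are mine), scalar quantization maps each sample independently through a 1-D codebook, while vector quantization maps a whole k-dimensional block to the nearest codeword of a k-dimensional codebook:

```python
import numpy as np

def scalar_quantize(samples, levels):
    """Quantize each sample independently to its nearest 1-D reproduction level."""
    samples = np.asarray(samples, dtype=float)
    levels = np.asarray(levels, dtype=float)
    idx = np.abs(samples[:, None] - levels[None, :]).argmin(axis=1)
    return levels[idx]

def vector_quantize(block, codebook):
    """Quantize a whole k-dimensional block to the nearest codeword (squared error)."""
    codebook = np.asarray(codebook, dtype=float)
    i = int(((codebook - block) ** 2).sum(axis=1).argmin())
    return codebook[i]
```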

Why VQ? (figure: input samples in the x1-x2 plane)

Why VQ?

Why VQ?
• Memory advantage
  – Dependency between input samples
  – Vanishes if the input samples are independent
• Shape advantage
  – Better adaptation of the VQ quantization point density to the PDF of the input
  – Vanishes in the case of entropy-constrained quantization
• Space-filling advantage
  – Greater freedom of VQ in selecting quantization cell shapes
  – The advantage of an infinite-dimension VQ is 0.255 bits per dimension for the squared-error distortion

How to do VQ?
• S. P. Lloyd, "Least squares quantization in PCM," IEEE Trans. Inform. Theory, vol. IT-28, pp. 129-137, 1982
  – Lloyd algorithm (k-means algorithm)
• Y. Linde, A. Buzo, and R. Gray, "An algorithm for vector quantizer design," IEEE Trans. Comm., vol. COM-28, pp. 84-95, 1980
  – Generalized Lloyd algorithm (GLA)
• An iterative method that guarantees only local optimality

How to do VQ?
• Two optimality conditions
  – Optimizing the encoder: nearest-neighbor rule (最近邻准则), i.e., map X to the cell V_i whose codeword Y_i gives the smallest distortion d(X, Y_i)
  – Optimizing the decoder: centroid condition
    Y_i = argmin_{Z ∈ R^k} E[ d(X, Z) | X ∈ V_i ]
    Y_i = E[ X | X ∈ V_i ] for the squared-error distortion
    Y_i = (1/N_i) Σ_{X ∈ V_i} X on a training set (the k-means update)

Discrete GLA
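
The slide body is a flow diagram; as a minimal sketch (not from the slides, function and parameter names are my own assumptions), the discrete GLA alternates the two optimality conditions above over a finite training set:

```python
import numpy as np

def train_gla(training_vectors, codebook_size, n_iters=50, seed=0):
    """Minimal discrete GLA sketch (squared-error distortion)."""
    rng = np.random.default_rng(seed)
    X = np.asarray(training_vectors, dtype=float)          # shape (N, k)
    # Initialize the codebook with randomly chosen training vectors.
    codebook = X[rng.choice(len(X), size=codebook_size, replace=False)].copy()
    for _ in range(n_iters):
        # Encoder optimization: nearest-neighbor rule over the whole codebook.
        dists = ((X[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
        labels = dists.argmin(axis=1)
        # Decoder optimization: centroid (sample mean) of every non-empty cell.
        for i in range(codebook_size):
            members = X[labels == i]
            if len(members):
                codebook[i] = members.mean(axis=0)
    return codebook
```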

Some implementation problems
• Large computational complexity due to the exhaustive codebook search
• Codebook storage
• Large computational complexity due to codebook training
All of these grow with the dimension (codeword length), the codebook size, and the size of the training data.
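
For a rough feel of the first two problems (my own back-of-the-envelope model, not from the slides): a fixed-rate VQ at R bits per sample in dimension k needs N = 2^(R·k) codewords, so full-search complexity and storage grow exponentially with the dimension.

```python
def full_search_cost(rate_per_sample, dimension):
    """Back-of-the-envelope cost of unstructured full-search VQ (assumed model)."""
    n_codewords = 2 ** (rate_per_sample * dimension)
    mults_per_input_vector = n_codewords * dimension   # one k-dim distance per codeword
    codebook_storage = n_codewords * dimension         # values needed to store the codebook
    return n_codewords, mults_per_input_vector, codebook_storage

# Example: 2 bits/sample in dimension 10 already needs about a million codewords.
print(full_search_cost(2, 10))
```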

Structured VQ
• Tree-structured VQ
• Multi-stage VQ
• Split VQ
• Gain-Shape VQ
• Mean-Removed VQ

Tree-structured VQ
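
As a hedged sketch of the search side (the tree layout and names are my own assumptions): a binary tree-structured VQ makes one two-way comparison per level, so encoding costs on the order of `depth` distance computations instead of a full search over 2^depth codewords.

```python
import numpy as np

def tsvq_encode(x, node_codewords, depth):
    """Binary TSVQ search sketch: node_codewords[d] lists the 2**(d+1) codewords
    of tree level d+1; the leaf index reached after `depth` levels is transmitted."""
    index = 0
    for d in range(depth):
        left = node_codewords[d][2 * index]
        right = node_codewords[d][2 * index + 1]
        # One binary comparison per level instead of an exhaustive search.
        go_right = np.sum((x - right) ** 2) < np.sum((x - left) ** 2)
        index = 2 * index + int(go_right)
    return index          # `depth` bits identify the chosen leaf codeword
```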

Multi-stage VQ
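
A minimal two-stage sketch (my own names, not from the slides): the second stage quantizes the residual left by the first stage, so two small codebooks stand in for one large one.

```python
import numpy as np

def _nearest(codebook, v):
    """Index of the codeword closest to v under squared error."""
    return int(((codebook - v) ** 2).sum(axis=1).argmin())

def msvq_encode(x, stage1_codebook, stage2_codebook):
    i1 = _nearest(stage1_codebook, x)
    residual = x - stage1_codebook[i1]          # error left by the first stage
    i2 = _nearest(stage2_codebook, residual)
    return i1, i2

def msvq_decode(i1, i2, stage1_codebook, stage2_codebook):
    return stage1_codebook[i1] + stage2_codebook[i2]
```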

Split VQ
X: (x1, x2, x3, …, x8) is split into
X1: (x1, x2, x3, x4) and X2: (x5, x6, x7, x8)
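
Following the split on the slide (8 dimensions into two 4-dimensional halves), a minimal sketch with assumed codebook names:

```python
import numpy as np

def split_vq_encode(x, codebook_x1, codebook_x2):
    """Quantize the two 4-dimensional halves of an 8-dimensional vector separately."""
    x1, x2 = x[:4], x[4:8]
    i1 = int(((codebook_x1 - x1) ** 2).sum(axis=1).argmin())
    i2 = int(((codebook_x2 - x2) ** 2).sum(axis=1).argmin())
    return i1, i2

def split_vq_decode(i1, i2, codebook_x1, codebook_x2):
    return np.concatenate([codebook_x1[i1], codebook_x2[i2]])
```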

Gain-Shape VQ
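
A minimal sketch under the usual gain-shape factorization (names are mine; the scalar gain codebook is an assumption): the gain is the vector's norm, the shape is the unit-norm direction, and the two are quantized separately.

```python
import numpy as np

def gain_shape_encode(x, gain_codebook, shape_codebook):
    gain = np.linalg.norm(x)
    shape = x / gain if gain > 0 else np.zeros_like(x)   # unit-norm direction
    gi = int(((gain_codebook - gain) ** 2).argmin())     # scalar gain codebook
    si = int(((shape_codebook - shape) ** 2).sum(axis=1).argmin())
    return gi, si

def gain_shape_decode(gi, si, gain_codebook, shape_codebook):
    return gain_codebook[gi] * shape_codebook[si]
```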

Mean-Removed VQ
Mean vector codebook + mean-removed vector codebook
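
A minimal sketch matching the slide's decomposition (names are mine; the scalar mean codebook is an assumption): the sample mean is coded with the mean codebook and the zero-mean remainder with the mean-removed vector codebook.

```python
import numpy as np

def mean_removed_encode(x, mean_codebook, residual_codebook):
    m = x.mean()
    mi = int(((mean_codebook - m) ** 2).argmin())        # quantize the mean
    residual = x - m                                      # mean-removed vector
    ri = int(((residual_codebook - residual) ** 2).sum(axis=1).argmin())
    return mi, ri

def mean_removed_decode(mi, ri, mean_codebook, residual_codebook):
    return mean_codebook[mi] + residual_codebook[ri]
```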

Rate-constrained VQ vs. Entropy-Constrained VQ
• The optimal quantizer minimizes the average distortion.
• The resolution constraint limits the size of the codebook, i.e., fixed rate.
• The entropy constraint limits the entropy of the quantization indices, i.e., variable rate.
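
As a hedged illustration of how the entropy constraint changes the encoder (not stated on the slide; `lam` and the index-length table are my own assumptions): the plain nearest-neighbor rule is replaced by a Lagrangian cost of distortion plus lambda times the codeword's variable index length.

```python
import numpy as np

def ecvq_encode(x, codebook, index_lengths_bits, lam):
    """Entropy-constrained encoding sketch: minimize distortion + lam * rate."""
    distortions = ((codebook - x) ** 2).sum(axis=1)
    costs = distortions + lam * index_lengths_bits   # e.g. -log2(codeword probabilities)
    return int(costs.argmin())
```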