Distributed Source Coding Using Syndromes (DISCUS): Design and Construction
S. Sandeep Pradhan, Kannan Ramchandran
IEEE Transactions on Information Theory, vol. 49, no. 3, pp. 626-643, Mar. 2003
2020/10/7
Outline
- Introduction
- Preliminaries
- Encoding with a Fidelity Criterion
  - Problem Formulation
  - Design Algorithm
  - Constructions based on Trellis Codes
- Simulation Results
- Conclusion
Introduction
- Slepian-Wolf theorem: knowing only the joint distribution of X and Y, without explicitly knowing Y, an encoder of X can perform as well as an encoder that knows Y.
  (a) Both encoder and decoder have access to side information Y
  (b) Only the decoder has access to side information Y
Introduction
- Wyner-Ziv problem: if the decoder knows Y, the information-theoretic rate-distortion performance for coding X is the same whether or not the encoder knows Y (X and Y jointly Gaussian).
  - Source: discrete-alphabet → continuous-valued
  - Compression: lossless → lossy
- Prior work on source quantizer design.
- Contributions:
  - Construction of a framework resting on algebraic channel-coding principles
  - Performance analysis on Gaussian signals
Outline
- Introduction
- Preliminaries
- Encoding with a Fidelity Criterion
  - Problem Formulation
  - Design Algorithm
  - Constructions based on Trellis Codes
- Simulation Results
- Conclusion
Preliminaries
- Example: X and Y are equiprobable 3-bit binary words whose Hamming distance is at most 1; Y is available to the decoder. Solution?
- Cosets: {000, 111}, {100, 011}, {010, 101}, {001, 110}
- Transmit only the coset index (syndrome): 2 bits instead of 3.
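The 3-bit example above can be sketched in a few lines. The four cosets are exactly the cosets of the (3,1) repetition code {000, 111}; the parity-check matrix H below is one standard choice for that code (my illustration, not taken from the slides):

```python
# Sketch of the slide's 3-bit example. The four cosets are the cosets
# of the (3,1) repetition code {000, 111}; H is one standard
# parity-check matrix for it. The encoder sends only the 2-bit
# syndrome; the decoder picks the coset member nearest to Y.
H = [[1, 1, 0],
     [0, 1, 1]]

def all_words():
    return [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]

def syndrome(x):
    """2-bit syndrome s = Hx (mod 2) identifying x's coset."""
    return tuple(sum(h * b for h, b in zip(row, x)) % 2 for row in H)

def hamming(a, b):
    return sum(u != v for u, v in zip(a, b))

def decode(s, y):
    """The two members of each coset differ in all 3 bits, so any Y
    within Hamming distance 1 of X picks out X uniquely."""
    coset = [x for x in all_words() if syndrome(x) == s]
    return min(coset, key=lambda x: hamming(x, y))

# X = 101, Y = 100 (distance 1): 2 transmitted bits recover X exactly.
assert decode(syndrome((1, 0, 1)), (1, 0, 0)) == (1, 0, 1)
```

Checking all (X, Y) pairs at Hamming distance at most 1 confirms the rate saving the slide describes: 2 bits suffice where plain coding of X needs 3.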
Preliminaries
- Quantization: digitizes an analog signal. Two parameters: a partition and a codebook.
- Example: partition thresholds -0.5, 1, +3; codebook [-2, 0.4, 2.3, 6]
Preliminaries
- Lloyd-Max quantization:
  - Partition: the thresholds a_i are midpoints of adjacent codewords.
  - Codebook: the codewords y_i are centroids of their cells.
  - Optimal scalar quantization.
(figure: codewords y_i and thresholds a_i alternating on the real line)
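The two conditions on the slide alternate naturally into an iterative design. A minimal empirical sketch of that iteration (my own illustration, run on samples of the source rather than its true density):

```python
# Empirical Lloyd-Max sketch (assumption: squared-error distortion,
# design on samples). Alternates the two slide conditions:
# thresholds a_i = midpoints of adjacent codewords,
# codewords y_i = centroids (means) of their cells.
import random

def lloyd_max(samples, levels, iters=50):
    samples = sorted(samples)
    # initialize codewords at evenly spaced sample quantiles
    y = [samples[(2 * k + 1) * len(samples) // (2 * levels)]
         for k in range(levels)]
    for _ in range(iters):
        a = [(y[k] + y[k + 1]) / 2 for k in range(levels - 1)]  # midpoints
        cells = [[] for _ in range(levels)]
        for s in samples:
            k = sum(s > t for t in a)            # cell index of sample s
            cells[k].append(s)
        y = [sum(c) / len(c) if c else y[k]      # centroids of the cells
             for k, c in enumerate(cells)]
    return a, y

random.seed(0)
data = [random.gauss(0, 1) for _ in range(20000)]
thresholds, codebook = lloyd_max(data, 4)
```

For a zero-mean unit-variance Gaussian source and 4 levels, the iteration settles near the known Lloyd-Max codebook (approximately ±0.45, ±1.51), up to sampling noise.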
Preliminaries
- Trellis Coded Quantization (TCQ) [24]:
  - Dual of TCM
  - Example: uniformly distributed source in [-A, A]
  - Implemented by the Viterbi algorithm
[24] M. W. Marcellin and T. R. Fischer, "Trellis coded quantization of memoryless and Gauss-Markov sources," IEEE Trans. Commun., vol. 38, pp. 82-93, Jan. 1990.
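A toy TCQ to make the Viterbi connection concrete. Assumptions of this sketch: a 4-state trellis with a simple, not-necessarily-optimal branch labeling (not the optimized labeling of [24]), and an 8-point codebook split into the subsets D0..D3 used later in the deck's Rs = 2 example. Viterbi search finds the trellis-reachable codeword sequence with minimum total squared distortion:

```python
# Toy TCQ sketch: 4-state trellis, 8-point codebook r0..r7 split into
# subsets D0..D3. Each trellis branch is labeled with a subset; the
# best codeword inside the subset is the "uncoded" bit. Viterbi
# minimizes total squared distortion over reachable paths.
CODEBOOK = [-3.5, -2.5, -1.5, -0.5, 0.5, 1.5, 2.5, 3.5]   # r0..r7
SUBSETS = [[0, 4], [1, 5], [2, 6], [3, 7]]                # D0..D3

def step(state, bit):
    """Toy state machine (illustrative labeling, not Ungerboeck-optimal):
    returns (next_state, subset index) for one trellis branch."""
    next_state = ((state << 1) | bit) & 3
    subset = 2 * bit + (state & 1)
    return next_state, subset

def tcq_encode(samples):
    """Viterbi search over the trellis; returns minimum total distortion."""
    INF = float("inf")
    cost = [0.0] + [INF] * 3                  # start in state 0
    for x in samples:
        new_cost = [INF] * 4
        for s in range(4):
            if cost[s] == INF:
                continue
            for bit in (0, 1):
                ns, d = step(s, bit)
                # best codeword within subset D_d (the uncoded bit)
                branch = min((x - CODEBOOK[i]) ** 2 for i in SUBSETS[d])
                new_cost[ns] = min(new_cost[ns], cost[s] + branch)
        cost = new_cost
    return min(cost)
```

From state 0 only subsets D0 and D2 are reachable on the first step, so a sample at r4 = 0.5 costs nothing, while one at r3 = -0.5 must pay the distortion to the nearest reachable codeword; that state-dependent reachability is exactly what distinguishes TCQ from plain scalar quantization.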
Outline
- Introduction
- Preliminaries
- Encoding with a Fidelity Criterion
  - Problem Formulation
  - Design Algorithm
  - Constructions based on Trellis Codes
- Simulation Results
- Conclusion
Encoding with a Fidelity Criterion
- Problem formulation:
  - X, Y: correlated, memoryless, i.i.d. sequences with Y_i = X_i + N_i
  - X_i, Y_i, N_i: continuous-valued
  - N_i: i.i.d., independent of X
  - X_i, N_i: zero-mean Gaussian random variables with known variance
  - Only the decoder has access to Y.
  - Goal: form the best approximation to X given R bits per sample
  - Encoding in blocks of length L
  - Distortion measure: mean squared error
  - Minimize R, subject to the reconstruction distortion being at most a given value D.
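For this jointly Gaussian setup, the information-theoretic benchmark the Wyner-Ziv result gives is the conditional rate-distortion function D(R) = var(X|Y) · 2^(−2R), with var(X|Y) = σ²_X σ²_N / (σ²_X + σ²_N). A small sketch of that benchmark (a standard result stated here for context; the variances in the test are illustrative, not from the paper):

```python
# Wyner-Ziv / conditional rate-distortion benchmark for Y = X + N with
# X, N independent zero-mean Gaussians:
#   D(R) = var(X|Y) * 2**(-2R),  var(X|Y) = sx2*sn2/(sx2+sn2).
# Variances used in examples are illustrative, not from the paper.
def wyner_ziv_distortion(rate_bits, sx2, sn2):
    cond_var = sx2 * sn2 / (sx2 + sn2)   # var(X | Y)
    return cond_var * 2.0 ** (-2.0 * rate_bits)
```

Each extra bit of rate cuts the achievable distortion by a factor of 4, starting from var(X|Y) rather than var(X), which is the gain that decoder side information provides.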
Encoding with a Fidelity Criterion
- System model: encoder and decoder.
- Interplay of source coding, channel coding, and estimation.
Encoding with a Fidelity Criterion
- Design algorithm:
  - Source coding (M1, M2):
    - Partition the source space
    - Define the source codebook S
    - Characterize the active codeword by the random variable W
  - Estimation (M3): form the best estimate of X (minimizing distortion) conditioned on the outcome of Y and the element of S.
  - Channel coding (M4, M5):
    - Transmit over an error-free channel with rate R (less than Rs)
    - Doable because I(W; Y) > 0, so H(W|Y) = H(W) - I(W; Y)
    - Build a channel code with rate Rc on the channel P(Y|W)
    - R = Rs - Rc
Encoding with a Fidelity Criterion
- Summary of the design algorithm:
  - M1 and M3: minimize Rs, subject to the reconstruction distortion meeting the given criterion.
  - M2: maximize I(W; Y).
  - M4: maximize Rc, subject to the error probability meeting a desired tolerance level.
  - M5: minimize decoding computational complexity.
Encoding with a Fidelity Criterion
- Scalar Quantization and Memoryless Coset Construction (C1):
  - Lloyd-Max (memoryless) quantizer
  - Memoryless coset partition (M4)
  - Example: L = 1 (sample by sample)
    - Quantization codebook: {r0, r1, ..., r7} (Rs = 3)
    - Channel-coding cosets: {r0, r2, r4, r6}, {r1, r3, r5, r7} (Rc = 2)
    - R = Rs - Rc = 1 bit/sample
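The C1 example above can be sketched end to end. Assumptions of this sketch: a uniform 8-level quantizer stands in for the Lloyd-Max one, and the coset partition is index parity, matching the slide's {r0, r2, r4, r6} / {r1, r3, r5, r7} split; the noise level is illustrative:

```python
# Toy sketch of construction C1: quantize X to 8 levels (Rs = 3),
# transmit only the 1-bit coset index (index parity), and let the
# decoder pick the coset member nearest to the side information Y.
import random

CODEBOOK = [-3.5, -2.5, -1.5, -0.5, 0.5, 1.5, 2.5, 3.5]   # r0..r7

def c1_encode(x):
    """Quantize x, send only the coset index: R = Rs - Rc = 1 bit."""
    idx = min(range(8), key=lambda i: abs(x - CODEBOOK[i]))
    return idx % 2                 # parity splits {r0..r7} into 2 cosets

def c1_decode(coset, y):
    """Pick the codeword in the signalled coset closest to Y."""
    members = [CODEBOOK[i] for i in range(8) if i % 2 == coset]
    return min(members, key=lambda r: abs(y - r))

random.seed(1)
x = random.uniform(-3, 3)
y = x + random.gauss(0, 0.2)       # strong correlation: small noise
x_hat = c1_decode(c1_encode(x), y)
```

Within a coset the codewords are 2 apart instead of 1, so as long as the correlation noise stays well below half that spacing, the decoder recovers the quantized value from a single transmitted bit.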
Encoding with a Fidelity Criterion
- Scalar Quantization and Trellis-Based Coset Construction (C2):
  - Scalar quantizer for {X_i}, i = 1, ..., L
  - Coset partition (M4) by a trellis code
  - Codebook of size 8^L, Rs = 3 bits/sample, two cosets shown
Encoding with a Fidelity Criterion
- Example: computing the syndrome (Rs = 3, Rc = 2)
  - Let the quantization outcome be 7, 3, 2, 1, 4, with L = 5.
  - The syndrome for the 5 samples is 10110.
Encoding with a Fidelity Criterion
- Trellis-Based Quantization and Memoryless Coset Construction (C3):
  - Trellis coded quantizer
  - Memoryless coset partition
  - Example: quantization codebook with Rs = 2:
    - D0 = {r0, r4}, D1 = {r1, r5}, D2 = {r2, r6}, D3 = {r3, r7}
    - Memoryless channel code: Rc = 1
    - 1 coded bit, combined with 1 uncoded bit (recovered from Y), identifies D_i.
Encoding with a Fidelity Criterion
- Trellis-Based Quantization and Trellis-Based Coset Construction (C4):
  - Trellis coded quantizer
  - Trellis coded coset partition
- Comparison between C3 and C4.
Encoding with a Fidelity Criterion
- Distance property:
  - Given a uniform partition, all four coset constructions have the same distance property.
  - For a non-uniform quantizer, performance is analyzed by simulation.
Outline
- Introduction
- Preliminaries
- Encoding with a Fidelity Criterion
  - Problem Formulation
  - Design Algorithm
  - Four Constructions
- Simulation Results
- Conclusion
Simulation Results
- Correlation-SNR: ratio of the variance of X to the variance of N.
- More quantization levels decrease distortion (C1).
Simulation Results
- More quantization levels increase the probability of error (C1).
Simulation Results
- Error-probability comparison of C1 and C2 (3-4 dB gain).
Simulation Results
- Error probability of C4 codes.
Conclusions
- A constructive, practical framework based on algebraic trellis codes.
- Promising performance.