Title: Multiple Description Coding and Distributed Source Coding: Unexplored Connections in Information Theory and Coding Theory
1. Multiple Description Coding and Distributed Source Coding: Unexplored Connections in Information Theory and Coding Theory
- S. Sandeep Pradhan
- Department of EECS
- University of Michigan, Ann Arbor
- (joint work with R. Puri and K. Ramchandran of
University of California, Berkeley)
2. Transmission of sources over packet networks
[Figure: source X enters an encoder producing packets 1, 2, ..., n, sent over a packet erasure network; the decoder reconstructs X from the packets it receives]
- Best-effort networks modeled as packet erasure channels: User Datagram Protocol (UDP)
- Example: the Internet
- Multimedia over the Internet is growing fast
3. Multiple Description Source Coding
[Figure: the MD encoder sends Description 1 at rate R1 and Description 2 at rate R2; Side Decoder 1 achieves distortion D1, Side Decoder 2 achieves distortion D2, and the Central Decoder, receiving both descriptions, achieves distortion D0]
Find the set of all achievable tuples
(R1,R2,D1,D2,D0)
4. Prior Work
Information theory (incomplete list):
- Cover-El Gamal 1980: achievable rate region for 2-channel MD
- Ozarow 1981: converse for Gaussian sources
- Berger, Ahlswede, Zhang: 1980s-90s
- Venkataramani et al. 2001: extension of the Cover-El Gamal region to n channels
Finite-block-length codes (incomplete list):
- Vaishampayan 1993: MD scalar and vector quantizers
- Wang-Orchard-Reibman 1997: MD transform codes
- Goyal-Kovacevic 1998: frames for MD
- Puri-Ramchandran 2001: FEC for MD
5. Main idea in random codes for 2-channel MD (Cover-El Gamal)
- Fix p(x1), p(x2).
- Find a pair of codewords that are jointly typical with the source word with respect to p(x, x1, x2).
- Possible if the rates are large enough: R1 >= I(X;X1), R2 >= I(X;X2), R1 + R2 >= I(X;X1,X2) + I(X1;X2).
6. Possible ideas for n-channel MD?
- Extend Cover-El Gamal random codes from 2 to n (Venkataramani et al.)
- Use maximum distance separable (MDS) erasure codes (Albanese et al. 1995)
7. Erasure codes
- Erasure codes (n, k, d): add (n - k) parity symbols
- MDS codes: d = n - k + 1
- MDS => any k channel symbols recover the k source symbols
[Figure: source packets enter the encoder; a subset of the n channel packets reaches the decoder, which recovers the source]
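The MDS property on this slide can be illustrated with a minimal sketch: a (3,2) code over bytes with a single XOR parity symbol, so d = n - k + 1 = 2 and any k = 2 of the 3 packets recover both source symbols. (The symbol values and function names here are illustrative, not from the talk.)

```python
# Toy (3,2) MDS erasure code over bytes: two source symbols, one XOR parity.
# Any 2 of the 3 channel packets recover both source symbols.
def encode(s1, s2):
    return [s1, s2, s1 ^ s2]

def decode(received):
    """received: dict {packet index: value} with at least k = 2 entries."""
    r = dict(received)
    if 0 in r and 1 in r:
        return r[0], r[1]
    if 0 in r and 2 in r:
        return r[0], r[0] ^ r[2]          # s2 = s1 XOR parity
    return r[1] ^ r[2], r[1]              # s1 = s2 XOR parity
```

Each two-packet subset yields the same source pair, which is exactly the "any k channel symbols => k source symbols" claim.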
8. Fix: use many MDS codes
Example for 3 channels (Albanese et al. 1995, Puri-Ramchandran 1999):
[Figure: a successively refinable source-encoded bit stream is split into layers, and the layers are spread across Descriptions 1-3 using (3,1), (3,2), and (3,3) MDS erasure codes]
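A hedged sketch of this layered layout, assuming one codeword per layer and XOR-based MDS codes: layer j of the refinable stream is spread over the 3 descriptions with a (3, j) code, so receiving any m descriptions recovers layers 1 through m. (The packing function and symbol layout are illustrative.)

```python
# Layer 1: (3,1) repetition; layer 2: (3,2) single XOR parity; layer 3: (3,3)
# no parity. Description i carries the i-th channel symbol of every layer.
def pack_descriptions(layer1, layer2, layer3):
    """layer_j is a list of j source symbols (one MDS codeword per layer)."""
    d1 = [layer1[0], layer2[0],             layer3[0]]
    d2 = [layer1[0], layer2[1],             layer3[1]]
    d3 = [layer1[0], layer2[0] ^ layer2[1], layer3[2]]
    return d1, d2, d3
```

Any one description carries layer 1; any two carry enough to solve for layer 2 (directly or via the XOR parity); all three carry layer 3.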
9. What is new in our work?
- Symmetric problem, # of descriptions > 2
- Explore a fundamental connection between MD coding and distributed source coding
- New rate region for MD: random binning inspired by distributed source coding
- Constructions for MD: extension of our earlier work (DISCUS) on coset codes for distributed source coding
10. Idea 1: a new look at (n,1,n) MDS codes
- (n, 1, n) bit code: all packets are identical (repetition)
- Reception of any one packet enables reconstruction
- Reception of more than one packet does not give better quality
- Parity bits wasted
11. Idea 1 (contd.): (n,1,n) source-channel erasure code
- Independently quantized versions of X on every packet
- Reception of any one packet enables reconstruction
- Reception of more packets enables better reconstruction (estimation gains due to multiple looks!)
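The "multiple looks" gain can be checked with a small Monte Carlo sketch, under the simplifying assumption that each packet's quantized look behaves like Y_i = X + N_i with independent noise of variance v; simple averaging of m received looks then gives MSE roughly v/m. (This additive-noise model and the function below are illustrative, not the talk's analysis.)

```python
import random

def mse_with_m_looks(m, v=1.0, trials=20000, seed=1):
    """Empirical MSE of averaging m independent noisy looks at a Gaussian X."""
    rng = random.Random(seed)
    err = 0.0
    for _ in range(trials):
        x = rng.gauss(0.0, 1.0)
        looks = [x + rng.gauss(0.0, v ** 0.5) for _ in range(m)]
        xhat = sum(looks) / m            # simple averaging (not MMSE)
        err += (xhat - x) ** 2
    return err / trials
```

With v = 1, one look gives MSE near 1 while three looks give MSE near 1/3: more packets, better reconstruction.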
12. Extensions to (n,k) source-channel codes
- Can we generalize this to (n,k) source-channel codes?
- Yes: a random binning (coset code) approach, using the Slepian-Wolf and Wyner-Ziv theorems
A conceptual leap using binning: (n,1) code -> (n,k) code
13. Idea 2: consider a (3,2,2) MDS code
There is inherent uncertainty at the encoder about which packets are received by the decoder. This calls for a coding strategy where the decoder has access to some information that the encoder does not: distributed source coding.
14. Background: distributed source coding (Slepian-Wolf 1973, Wyner-Ziv 1976, Berger 1977)
- X and Y: correlated sources
[Figure: X and Y are compressed by separate encoders; a joint decoder reconstructs (X, Y)]
- Exploiting correlation without direct communication
- Optimal rate region: Slepian-Wolf 1973
15. Distributed source coding (contd.)
- Rate region: R_X >= H(X|Y), R_Y >= H(Y|X), R_X + R_Y >= H(X,Y)
- Random partitions of typical sets
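The random-partition (binning) idea has a classic finite toy version, a sketch of the coset construction later used in DISCUS: X and Y are 3-bit words within Hamming distance 1 (the correlation), and the encoder sends only the 2-bit syndrome of X with respect to the (3,1) repetition code {000, 111}; the decoder picks the coset member closest to Y. Two bits suffice instead of three. (Function names here are illustrative.)

```python
from itertools import product

def syndrome(x):
    """Parity checks of the (3,1) repetition code: the coset index of x."""
    return (x[0] ^ x[1], x[1] ^ x[2])

def encode_syndrome(x):
    return syndrome(x)          # 2 bits sent instead of 3

def decode_with_side_info(s, y):
    # The two words in a coset differ in all 3 bits, so side information
    # y within Hamming distance 1 of x singles out the right one.
    candidates = [x for x in product((0, 1), repeat=3) if syndrome(x) == s]
    return min(candidates, key=lambda x: sum(a ^ b for a, b in zip(x, y)))
```

Exhaustively checking every (x, y) pair with distance at most 1 confirms lossless recovery, which is the binning mechanism the following slides lift into the MD setting.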
16. Idea 2 (contd.): are there any telltale signs of symmetric overcomplete partitioning in (3,2,2) MDS codes?
17. Idea 2 (contd.)
Instead of a single codebook, build 3 different codebooks (quantizers) and then partition them (overcompletely).
18. Problem Formulation
(n,k) source-channel erasure code
[Figure: X is encoded into packets 1, 2, ..., n, sent over a packet erasure channel; the decoder reconstructs X]
- Decoder starts reconstruction with m >= k packets
- Rate of transmission of every packet is the same
- Distortion is only a function of the number of received packets
- Symmetric formulation, n > 2
19. Problem Formulation: Notation
- Source X ~ q(x), alphabet, blocklength L
- Bounded distortion measure
- Encoder
- Decoder
- Distortion with h packets
20. Problem Statement (contd.)
What is the best distortion tuple for a rate of R
bits/sample/packet?
21. Example: (3,2) Code
- (3,2) code: the quantized versions (Yi) have the same p.d.f.
- 3 codebooks, each of rate I(X;Yi), are constructed randomly.
- Each is partitioned into exp2(LR) bins; the number of codewords in a bin is exponential in (L/2) I(Y1;Y2).
- Thus 2R = I(X;Y1) + I(X;Y2) - I(Y1;Y2).
22. Example: Gaussian source, (3,2,2) code
[Plot: distortion vs. number of received packets at 1 bit/sample/packet]
23. n-Channel Symmetric MD
Idea 3: concatenation of (n,1), (n,2), ..., (n,n) source-channel erasure codes
24. Key Concepts
- Multiple quantizers which can introduce correlated quantization noise: MD lattice VQ (Vaishampayan, Sloane, Diggavi 2001)
- Computationally efficient multiple binning schemes: symmetric distributed source coding using coset codes (Pradhan-Ramchandran 2000; Schonberg, Pradhan, Ramchandran 2003)
- Note: different from single binning schemes (Zamir-Shamai 1998, Pradhan-Ramchandran 1999)
25. A (3,2) Source-Channel Lattice Code
26. A (3,2) Source-Channel Lattice Code
- A code of minimum distance 5 overcomes correlation noise of 2.
27. A (3,2) Source-Channel Lattice Code
28. A (3,2) Source-Channel Lattice Code
- Partitioning through cosets: the constructive counterpart of random bins.
29. A (3,2) Source-Channel Lattice Code
Suppose 2 observations, Y1 and Y2.
Asymmetric case: Y2 is available at the decoder. A code that combats the correlation noise ensures decoding.
30. A (3,2) Source-Channel Lattice Code
Suppose 2 observations, Y1 and Y2.
Symmetric case: split the generator vectors of the code; 1 gets rows, 2 gets columns.
33. A (3,2) Source-Channel Lattice Code
34. A (3,2) Source-Channel Lattice Code
- Find 3 generator vectors such that any two generate the code.
- 1 gets rows, 2 gets columns, 3 gets the diagonal.
35. A (3,2) Source-Channel Lattice Code
- Find 3 generator vectors such that any two are linearly independent.
- 1 gets rows, 2 gets columns, 3 gets the diagonal.
36. Constructions for general n and k
- Choose a code (generator matrix G) that combats the correlation noise, e.g., G = [[5, 0], [0, 5]].
- Split the rows of G into k submatrices (k generator sets S1, ..., Sk), e.g., G1 = [5 0] and G2 = [0 5].
- Need a way to generate n generator sets out of the k, such that any k of them are equivalent to G.
- Choose a generator matrix M (dim. k x n) of an (n,k) MDS block code; it has the property that any k columns are independent.
37. Constructions for general n and k
- Using weights from the n columns of M, one column at a time, linearly combine the k generator sets (S1, ..., Sk) to come up with n encoding matrices, e.g., G1 = [5 0], G2 = [0 5], G3 = [5 5].
- Efficient algorithms for encoding and decoding using the coset code framework (Forney 1991).
38. Conclusions
- New rate region for the n-channel MD problem
- A new connection between the MD problem and the distributed source coding problem
- A new application of multiple binning schemes
- Constructions based on coset codes
- A nice synergy between quantization and MDS erasure codes