1
Distributed Source Coding
  • By Raghunadh K Bhattar, EE Dept, IISc
  • Under the guidance of Prof. K. R. Ramakrishnan

2
Outline of the Presentation
  • Introduction
  • Why Distributed Source Coding
  • Source Coding
  • How Source Coding Works
  • How Channel Coding Works
  • Distributed Source Coding
  • Slepian-Wolf Coding
  • Wyner-Ziv Coding
  • Applications of DSC
  • Conclusion

3
Why Distributed Source Coding ?
  • Low Complexity Encoders
  • Error Resilience: robust to transmission errors
  • The above two attributes make DSC an enabling
    technology for wireless communications

4
Low Complexity Wireless Handsets
Courtesy Nicolas Gehrig
5
Distributed Source Coding (DSC)
  • Compression of Correlated Sources: Separate
    Encoding, Joint Decoding

6
Source Coding (Data Compression)
  • Exploit the redundancy in the source to reduce
    the data required for storage or for transmission
  • Highly complex encoders are required for
    compression (MPEG, H.264)
  • However, the decoders are simple!
  • The highly complex encoders lead to
  • Bulky handsets
  • Higher power consumption
  • Shorter battery life

7
How Source Coding Works
  • Types of redundancy
  • Spatial redundancy - Transform or predictive
    coding
  • Temporal redundancy - Predictive coding
  • In predictive coding, the next value in the
    sequence is predicted from the past values and
    the predicted value is subtracted from the actual
    value
  • Only the difference is sent to the decoder
  • Let the past values be C and the predicted value
    be y = f(C). If the actual value is x, then (x - y)
    is sent to the decoder.

8
  • The decoder, knowing the past values C, can also
    predict the value y
  • With the knowledge of (x - y), the decoder recovers
    x = (x - y) + y, which is the desired value
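As an illustration of this loop, here is a minimal sketch in Python; the predictor f (here simply the previous value) is an illustrative choice, not something specified in the slides.

```python
# Minimal sketch of the predictive coding loop described above.
# The predictor f(C) used here (the previous value) is only illustrative.

def predict(past):
    """Hypothetical predictor y = f(C): here simply the previous value."""
    return past[-1] if past else 0

def encode(samples):
    """Encoder: send only the prediction residuals (x - y)."""
    past, residuals = [], []
    for x in samples:
        y = predict(past)
        residuals.append(x - y)      # (x - y) goes to the decoder
        past.append(x)
    return residuals

def decode(residuals):
    """Decoder: forms the same prediction and adds the residual back."""
    past, out = [], []
    for r in residuals:
        y = predict(past)
        x = r + y                    # x = (x - y) + y
        out.append(x)
        past.append(x)
    return out

samples = [10, 11, 13, 12, 14, 15]
assert decode(encode(samples)) == samples
```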

9
[Block diagram: the encoder predicts y from the past values C and sends the
difference x - y; the decoder forms the same prediction y and adds it back
to recover x]
10
(No Transcript)
11
Compression Toy Example
Suppose, X and Y Two uniformly distributed
i.i.d Sources. X, Y X ? 3bits If they are
related (i.e., correlated) 000 Y ? 3bits Can
we reduce the data rate? 001 Let the relation
be 010 X and Y differ at most by one
bit 011 i.e., the Hamming distance
100 between X and Y is maximum
one 101 110 111
12
H(X) = H(Y) = 3 bits. Let Y = 101; then X is one of 101, 100, 111, 001.
Code = X ⊕ Y, so H(X/Y) = 2 bits. We need 2 bits to transmit X and 3 bits
for Y, a total of 5 bits for both X and Y instead of 6 bits. Here we must
know the outcome of Y before coding X with 2 bits. Decoding: X = Y ⊕ Code,
where Code 0 = 000, 1 = 001, 2 = 010, 3 = 100.
13
  • Now assume that we don't know the outcome of Y
    (but it is sent to the decoder using 3 bits). Can we
    still transmit X using 2 bits?
  • The answer is YES (surprisingly!)
  • How?

14
Partition
  • Group all the symbols into four sets, each
    consisting of two members
  • {(000),(111)} index 0
  • {(001),(110)} index 1
  • {(010),(101)} index 2
  • {(100),(011)} index 3
  • Trick: partition so that each set has Hamming
    distance 3 between its members

15
  • The encoding of X is simply done by sending the
    index of the set that actually contains X
  • Let X = (100); then the index for X is 3
  • Let the decoder have already received a correlated
    Y = (101)
  • How do we recover X, knowing Y = (101) (from now
    on called the side information) at the decoder and
    the index (3) sent for X?

16
  • Since the index is 3, we know that the value of X is
    either (100) or (011)
  • Measure the Hamming distance between the two
    possible values of X and the side information Y
  • (100) ⊕ (101) = (001), Hamming distance 1
  • (011) ⊕ (101) = (110), Hamming distance 2
  • ⇒ X = (100)
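A short sketch of this encode/decode rule in Python (the coset table and function names are mine, following slides 14-16):

```python
# Toy DSC example from the slides: 3-bit X, side information Y that
# differs from X in at most one bit position.

COSETS = [("000", "111"),   # index 0
          ("001", "110"),   # index 1
          ("010", "101"),   # index 2
          ("100", "011")]   # index 3

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def encode(x):
    """Encoder: send only the index of the coset containing x (2 bits)."""
    return next(i for i, c in enumerate(COSETS) if x in c)

def decode(index, y):
    """Decoder: pick the coset member closest to the side information y."""
    return min(COSETS[index], key=lambda w: hamming(w, y))

assert encode("100") == 3
assert decode(3, "101") == "100"   # correlated Y: correct decoding
assert decode(3, "111") == "011"   # uncorrelated Y: erroneous decoding
```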

17
Source Coding
  Y     Code   Y ⊕ Code   Recovers X = 100?
  000   001    001        no
  001   001    000        no
  010   001    011        no
  011   001    010        no
  100   001    101        no
  101   001    100        yes
  110   001    111        no
  111   001    110        no

  • Y = 101, X = 100
  • Code = (100) ⊕ (101) = (001) = 1
  • Decoding: Y ⊕ Code
  • (101) ⊕ (001) = (100) = X

18
Distributed Source Coding
  X = 100, Code (coset index) = 3, coset = {(011), (100)}

  Side information Y   d(011, Y)   d(100, Y)   Decoded X
  000                  2           1           100
  110                  2           1           100
  101                  2           1           100
  100                  3           0           100
  111                  1           2           011

  Correlated side information: no error in decoding
  Uncorrelated side information (Y = 111): erroneous decoding
19
  • How do we partition the input sample space? Do we
    always have to find some trick? If the input sample
    space is large (even a few hundred symbols), can we
    still find such a trick?
  • The trick is matrix multiplication: we need a matrix
    that partitions the input space.
  • For the above toy example the matrix is the
    parity-check matrix of the (3,1) repetition code

Index = X·H^T in the GF(2) field; H is the parity-check
matrix in error-correction terminology
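A small sketch of this matrix form. The slide's own matrix is not visible in the transcript, so the H below is an assumed parity-check matrix of the (3,1) repetition code, chosen so that X·H^T reproduces the indices of slide 14:

```python
import numpy as np

# One possible parity-check matrix of the (3,1) repetition code
# (assumed here; the slide's own matrix is not shown in the transcript).
H = np.array([[1, 1, 0],
              [1, 0, 1]])

def coset_index(x_bits):
    """Index = x . H^T over GF(2), read as a 2-bit number."""
    x = np.array(x_bits)
    s = x @ H.T % 2
    return 2 * s[0] + s[1]

# Reproduces the partition on slide 14:
assert coset_index([0, 0, 0]) == 0 and coset_index([1, 1, 1]) == 0
assert coset_index([0, 0, 1]) == 1 and coset_index([1, 1, 0]) == 1
assert coset_index([0, 1, 0]) == 2 and coset_index([1, 0, 1]) == 2
assert coset_index([1, 0, 0]) == 3 and coset_index([0, 1, 1]) == 3
```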
20
Coset Partition
  • Now let us look at the partitions again
  • {(000),(111)} index 0
  • {(001),(110)} index 1
  • {(010),(101)} index 2
  • {(100),(011)} index 3
  • The first set is the repetition code (in
    error-correction terminology); the other sets are the
    cosets of the repetition code induced by the elements
    of the sample space of X

21
Channel Coding
  • In channel coding, controlled redundancy is added
    to the information bits to protect them from
    channel noise
  • We can classify channel coding or error control
    coding into two categories
  • Error Detection
  • Error Correction
  • In Error Detection, the introduced redundancy is
    just enough to detect errors
  • In Error Correction, we need to introduce more
    redundancy.
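As a reminder of how much redundancy each task needs (a standard fact, not stated explicitly on the slide):

```latex
\[
  \text{errors detectable} \le d_{\min} - 1,
  \qquad
  \text{errors correctable} \le \left\lfloor \frac{d_{\min} - 1}{2} \right\rfloor ,
\]
```

where d_min is the minimum Hamming distance of the code.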

22
[Figure: codeword spacing for dmin = 1, dmin = 2, and dmin = 3]
23
Parity Check
  X     Parity
  000   0
  001   1
  010   1
  011   0
  100   1
  101   0
  110   0
  111   1

Minimum Hamming distance = 2. How do we make the Hamming
distance 3? It is not clear (or not easy) how to make the
minimum Hamming distance 3.
24
(No Transcript)
25
(No Transcript)
26
Slepian-Wolf theorem: correlated sources that do not
communicate with each other can be coded at a total rate
equal to the rate at which they would be coded jointly; no
performance loss occurs, provided they are decoded jointly.
  • When correlated sources are coded independently
    but decoded jointly, the minimum data rate for each
    source is lower bounded as follows
  • The total data rate must be at least equal to (or
    greater than) H(X,Y), and the individual data rates
    must be at least equal to (or greater than) H(X/Y)
    and H(Y/X) respectively
  • D. Slepian and J. K. Wolf, "Noiseless coding of
    correlated information sources," IEEE Trans. Inf.
    Theory, vol. 19, pp. 471-480, July 1973.
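In symbols, the Slepian-Wolf achievable region is:

```latex
\[
  R_X \ge H(X \mid Y), \qquad
  R_Y \ge H(Y \mid X), \qquad
  R_X + R_Y \ge H(X, Y).
\]
```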

27
DISCUS (DIstributed Source Coding Using Syndromes)
  • The first constructive realization of the
    Slepian-Wolf bound using practical channel codes was
    proposed, where single-parity-check codes were used
    with the binning scheme.
  • Wyner first proposed using a capacity-achieving
    binary linear channel code to solve the SW
    compression problem for a class of joint
    distributions
  • DISCUS extended Wyner's idea to the distributed
    rate-distortion (lossy compression) problem using
    channel codes
  • S. Pradhan and K. Ramchandran, "Distributed source
    coding using syndromes (DISCUS)," in IEEE Data
    Compression Conference (DCC), Snowbird, UT, 1999.

28
Distributed Source Coding (Compression with
Side Information)
[Diagram: X and Y are statistically dependent; the side
information Y is available only at the decoder; X is
reconstructed losslessly]
29
Achievable Rate Region - SWC
[Figure: Slepian-Wolf achievable rate region in the
(Rx, Ry) plane. Separate coding with no errors requires
Rx > H(X), Ry > H(Y). With Slepian-Wolf coding, point A is
Rx = H(X/Y), Ry = H(Y), point B is Rx = H(X), Ry = H(Y/X),
and the segment between them satisfies Rx + Ry = H(X,Y),
the rate of joint encoding and decoding.]
30
(No Transcript)
31
How Compression Works ?
[Diagram: redundant (correlated) data passes through a
redundancy-removal stage, giving compressed (decorrelated)
data]
How Channel Coding Works ?
[Diagram: decorrelated data passes through a redundant-data
generator, giving correlated data]
32
Duality Between Source Coding and Channel Coding
  Source Coding              Channel Coding
  Compresses the data        Expands the data
  De-correlates the data     Correlates the data
  Complex encoder            Simple encoder
  Simple decoder             Complex decoder
33
Channel Coding or Error Correction Coding
[Diagram: k information bits are encoded into an n-bit
codeword by adding n - k parity bits; the codeword passes
through a channel with additive noise and is recovered by
channel decoding]
34
Channel Codes for DSC
[Diagram: compression of X is performed by a channel
coding operation; decompression is channel decoding,
treating the correlation between X and the side
information as additive noise]
35
Turbo Coder for Slepian-Wolf Encoding
Courtesy Anne Aaron and Bernd Girod
36
Turbo Decoder for Slepian-Wolf Decoding
[Diagram: two SISO decoders iteratively exchange extrinsic
probabilities, each used as a priori probabilities by the
other decoder, and combine them with the channel
probabilities to produce a posteriori probabilities]
Courtesy Anne Aaron and Bernd Girod
37
Wyners Scheme
  • Use a linear block code and send the syndrome
  • For an (n,k) block code there are 2^(n-k) syndromes,
    each corresponding to a set of 2^k words of length n
  • Each set is a coset code
  • Compression ratio of n : (n-k) (a small sketch
    follows)
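A minimal sketch of this syndrome scheme, using the (7,4) Hamming code as an example choice of linear block code (so n = 7, k = 4, and the encoder sends 3 syndrome bits per 7 source bits); the brute-force coset search is for clarity only:

```python
import numpy as np
from itertools import product

# Parity-check matrix of the (7,4) Hamming code (one standard choice).
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

def syndrome(x):
    return tuple(H @ x % 2)

def compress(x):
    """Encoder: send only the (n - k) = 3 syndrome bits."""
    return syndrome(x)

def decompress(s, y):
    """Decoder: closest word to side information y whose syndrome is s."""
    candidates = (np.array(w) for w in product([0, 1], repeat=7))
    in_coset = [w for w in candidates if syndrome(w) == s]
    return min(in_coset, key=lambda w: int(np.sum(w ^ y)))

x = np.array([1, 0, 1, 1, 0, 0, 1])
y = x.copy(); y[4] ^= 1            # side information: x with one bit flipped
assert np.array_equal(decompress(compress(x), y), x)
```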

A D Wyner, "Recent Results in the Shannon Theory
in IEEE Transactions On Information Theory, VOL.
IT-20, NO. 1, JANUARY 1974 A. D. Wyner, On
source coding with side information at the
decoder, IEEE Trans. Inf. Theory, vol. 21, no.
3, pp. 294300, May 1975.
38
Linear Block Codes for DSC
[Diagram: the syndrome former H compresses the n-bit input
x into an (n - k)-bit syndrome (the compressed data); the
decoder treats the side information, modeled as x corrupted
by correlation noise, as a corrupted codeword and recovers
x by syndrome decoding; compression ratio n : (n - k)]
39
LDPC Encoder (Syndrome Former Generator)
[Diagram: the LDPC encoder acts as a syndrome former,
mapping X to a syndrome s that is entropy coded (the
compressed data); the LDPC decoder combines s with the
side information Y to produce the decompressed data]
40
Correlation Model
41
The Wyner-Ziv theorem
  • Wyner and Ziv extended the work of Slepian and
    Wolf by studying the lossy case in the same
    scenario, where the signals X and Y are
    statistically dependent.
  • Y is transmitted at a rate equal to its entropy
    (Y is then called the side information), and what
    needs to be found is the minimum transmission
    rate for X that introduces no more than a certain
    distortion D.
  • This minimum rate is given by the Wyner-Ziv
    rate-distortion function, the lower bound for Rx.
  • For MSE distortion and Gaussian statistics, the
    rate-distortion functions of the two systems are
    the same.
  • A. D. Wyner and J. Ziv, "The rate-distortion
    function for source coding with side information
    at the decoder," IEEE Transactions on Information
    Theory, vol. 22, no. 1, pp. 1-10, January 1976.
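For the quadratic-Gaussian case mentioned above, both rate-distortion functions take the same closed form (a standard result, quoted here for reference):

```latex
\[
  R^{\mathrm{WZ}}_X(D) \;=\; R_{X \mid Y}(D)
  \;=\; \max\!\left( \tfrac{1}{2} \log_2 \frac{\sigma^2_{X \mid Y}}{D},\; 0 \right),
\]
```

where σ²_{X|Y} is the conditional variance of X given Y.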

42
Wyner-Ziv Codec
  • A codec that separately encodes signals X and Y
    but jointly decodes them, and does not aim at
    recovering them perfectly (it allows some distortion
    D in the reconstruction), is called a Wyner-Ziv
    codec.

43
Wyner-Ziv Coding: Lossy Compression with Side Information
[Diagram: two systems are compared. In the first, the side
information is available at both encoder and decoder, with
rate-distortion function RX/Y(d). In the second, the side
information is available only at the decoder, with
rate-distortion function R(d).]
For MSE distortion and Gaussian statistics, the
rate-distortion functions of the two systems are the same.
The rate loss R(d) - RX/Y(d) is bounded.
44
  • The structure of the Wyner-Ziv encoding and
    decoding
  • Encoding consists of quantization followed by a
    binning operation, encoding U into a bin (coset)
    index.

45
  • Structure of the distributed decoders. Decoding
    consists of de-binning followed by estimation.
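A toy sketch of this quantize-then-bin structure for a single scalar sample; the uniform quantizer, the modulo binning, and the parameter values are illustrative choices, not the codec described later in the slides:

```python
# Toy Wyner-Ziv style encoder/decoder for a scalar sample:
# quantize, send only the bin (coset) index of the quantizer level,
# then de-bin at the decoder using side information and estimate.

STEP = 1.0      # quantizer step size (illustrative)
NUM_BINS = 4    # number of cosets the quantizer indices are folded into

def wz_encode(x):
    q = round(x / STEP)          # quantization to index U
    return q % NUM_BINS          # binning: only the coset index is sent

def wz_decode(bin_index, y):
    """De-binning: pick the quantizer index in the bin closest to y,
    then estimate the reconstruction (here simply the level centre)."""
    qy = round(y / STEP)
    candidates = [qy + d for d in range(-NUM_BINS, NUM_BINS + 1)
                  if (qy + d) % NUM_BINS == bin_index]
    q_hat = min(candidates, key=lambda q: abs(q * STEP - y))
    return q_hat * STEP

x, y = 5.2, 4.9                  # correlated source and side information
assert abs(wz_decode(wz_encode(x), y) - x) <= STEP / 2
```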

46
Wyner-Ziv Coding (WZC) - A joint source-channel
coding problem
47
Pixel-Domain Wyner-Ziv Residual Video Codec
[Diagram: in the Wyner-Ziv encoder, WZ frames X are
quantized by a scalar quantizer and Slepian-Wolf coded with
an LDPC encoder whose output is buffered and sent upon
request. In the Wyner-Ziv decoder, side information Y is
generated by interpolation/extrapolation from the frame
memory, the LDPC decoder and inverse quantizer Q-1
reconstruct the WZ frames, and the key frames I are coded
and decoded with conventional intraframe coding.]
48
Distributed Video Coding
  • Distributed coding is a new paradigm for video
    compression, based on Slepian and Wolf's (lossless
    coding) and Wyner and Ziv's (lossy coding)
    information-theoretic results.
  • It enables low-complexity video encoding where the
    bulk of the computation is shifted to the decoder.
  • A second architectural goal is to allow far
    greater robustness to packet and frame drops.
  • Useful for wireless video applications by means
    of a transcoding architecture.

49
PRISM
  • PRISM (Power-efficient, Robust, hIgh-compression,
    Syndrome-based Multimedia)
  • PRISM is a practical video coding framework
    built on distributed source coding principles.
  • Flexible encoding/decoding complexity
  • High compression efficiency
  • Superior robustness to packet/frame drops
  • Light yet rich encoding syntax
  • R. Puri, A. Majumdar, and K. Ramchandran, "PRISM:
    A video coding paradigm with motion estimation at
    the decoder," IEEE Transactions on Image
    Processing, vol. 16, no. 10, pp. 2436-2448,
    October 2007.

50
DIStributed COding for Video sERvices (DISCOVER)
  • DISCOVER is a new video coding scheme with strong
    potential for new applications, targeting new
    advances in coding efficiency, error resilience
    and scalability
  • At the encoder side the video is split into two
    parts.
  • The first set of frames, called key frames, is
    encoded with a conventional H.264/AVC encoder.
  • The remaining frames, known as Wyner-Ziv frames,
    are coded using distributed coding principles
  • X. Artigas, J. Ascenso, M. Dalai, D. Kubasov, and
    M. Ouaret, "The DISCOVER codec: Architecture,
    techniques and evaluation," Picture Coding
    Symposium, 2007.
  • www.discoverdvc.org

51
A Typical Distributed Video coding
52
Side Information from Motion-Compensated
Interpolation
[Diagram: WZ frames W are coded by the Wyner-Ziv residual
encoder, which sends WZ parity bits, with the previous key
frame used as the encoder reference; the Wyner-Ziv residual
decoder forms the side information Y by interpolating
between decoded key frames; frame pattern I I I WZ I I I WZ]
53
Wyner-Ziv DCT Video Codec
[Diagram: WZ frames W are transformed by the DCT, quantized
to q_k, and intraframe encoded bit-plane by bit-plane (M_k)
for each transform band k, with bits sent upon request; the
interframe decoder forms side information Y_k by applying
the DCT to frames interpolated/extrapolated from the key
frames K, decodes q_k, reconstructs X_k, and applies the
IDCT to obtain the decoded WZ frames; key frames are coded
with conventional intraframe coding]
54
Foreman sequence
[Sample frames: side information vs. after Wyner-Ziv
coding, 16-level quantization (1 bpp)]
55
Sample Frame (Foreman)
[Sample frames: side information vs. after Wyner-Ziv
coding, 16-level quantization (1 bpp)]
56
Carphone Sequence
Wyner-Ziv codec: 384 kbps
H.263 intraframe coding: 410 kbps
57
Salesman sequence at 10 fps
DCT-based intracoding: 247 kbps, PSNR_Y = 33.0 dB
Wyner-Ziv DCT codec: 256 kbps, PSNR_Y = 39.1 dB, GOP = 16
58
Salesman sequence at 10 fps
H.263 I-P-P-P: 249 kbps, PSNR_Y = 43.4 dB, GOP = 16
Wyner-Ziv DCT codec: 256 kbps, PSNR_Y = 39.1 dB, GOP = 16
59
Hall Monitor sequence at 10 fps
DCT-based intracoding: 231 kbps, PSNR_Y = 33.3 dB
Wyner-Ziv DCT codec: 227 kbps, PSNR_Y = 39.1 dB, GOP = 16
60
Hall Monitor sequence at 10 fps
H.263 I-P-P-P: 212 kbps, PSNR_Y = 43.0 dB, GOP = 16
Wyner-Ziv DCT codec: 227 kbps, PSNR_Y = 39.1 dB, GOP = 16
61
Facsimile Image Compression with DSC (CCITT 8 Image)
62
Reconstructed Image with 30 errors
63
Fax4 Reconstructed Image with 8 errors
64
Applications
  • Very low complexity encoders
  • Compression for networks of cameras
  • Error-resilient transmission of signal waveforms
  • Digitally enhanced analog transmission
  • Unequal error protection without layered coding
  • Image authentication
  • Random access
  • Compression of encrypted signals

65
Thank You
Any Questions ?
66
Cosets
67
  • Let G = {000, 001, ..., 111}, the set of all 3-bit
    words
  • Let H = {000, 111}, a subgroup of G
  • The cosets are
  • 001 ⊕ 000 = 001
  • 001 ⊕ 111 = 110
  • Hence, {001, 110} is one coset
  • 010 ⊕ 000 = 010
  • 010 ⊕ 111 = 101
  • {010, 101} is another coset, and so on
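The same coset construction in a few lines of Python (illustrative):

```python
from itertools import product

G = ["".join(bits) for bits in product("01", repeat=3)]   # all 3-bit words
H = {"000", "111"}                                          # subgroup of G

def xor(a, b):
    return "".join(str(int(x) ^ int(y)) for x, y in zip(a, b))

cosets = {frozenset(xor(g, h) for h in H) for g in G}
print(sorted(sorted(c) for c in cosets))
# [['000', '111'], ['001', '110'], ['010', '101'], ['011', '100']]
```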

68
Hamming Distance
  • Hamming distance is a distance measure defined as
    the number of bit positions in which two binary
    sequences differ
  • Let X and Y be two binary sequences; the Hamming
    distance between X and Y is defined as
  • Hamming distance = sum over all positions of
    (x_i ⊕ y_i)
  • Example: Let X = 0 0 1 1 1 0 1 0 1 0
  •          Let Y = 0 1 0 1 1 1 0 0 1 1
  • Hamming distance = sum(0 1 1 0 0 1 1 0 0 1) = 5
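The same computation as a couple of lines of Python (illustrative):

```python
def hamming_distance(x, y):
    """Number of bit positions in which two equal-length sequences differ."""
    return sum(a != b for a, b in zip(x, y))

assert hamming_distance("0011101010", "0101110011") == 5
```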