Slepian-Wolf Coding over Broadcast Channels - PowerPoint PPT Presentation

Transcript and Presenter's Notes

Title: Slepian-Wolf Coding over Broadcast Channels


1
Slepian-Wolf Coding over Broadcast Channels
  • Ertem Tuncel

UC San Diego, December 1, 2006
2
Outline
  • Problem definition and motivation
  • Separate source-channel coding
  • Optimal combination of source and channel codes.
  • Joint source-channel coding
  • Complete region of achievable rates.
  • Operational separation.
  • Comparison between separate and joint schemes
  • No informational separation in general.

3
Problem setting and motivation
4
Problem setting and motivation
[Figure: a sender with local observation Xt broadcasts to receivers with local observations Y1t, Y2t, Y3t, Y4t.]
5
Formal description
Discrete memoryless source and channel
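The formal description itself is only on the slide image. A hedged reconstruction of the setup (the symbol names below are my assumption, chosen to match the X and Y_k notation used elsewhere in the deck): a discrete memoryless source emits i.i.d. tuples (X, Y_1, ..., Y_K); the transmitter observes X^m and maps it to n uses of a discrete memoryless broadcast channel p(z_1, ..., z_K | w); receiver k observes the channel output Z_k^n together with its side information Y_k^m.

\[
W^n = f(X^m), \qquad
\hat{X}_k^m = g_k(Z_k^n, Y_k^m), \quad k = 1,\dots,K, \qquad
\kappa = \frac{n}{m}\ \text{(channel uses per source symbol)},
\]

and every receiver must recover X^m with vanishing error probability.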
6
Questions
  • What combination of coding schemes is optimal?
  • How does the optimal joint scheme work?
  • Is joint coding superior to combination of
    separate codes?

7
PART I: SEPARATE SOURCE-CHANNEL CODING
8
The most general separate scheme
For K = 3
9
A suboptimal separate coding method
  • Worst-case Slepian-Wolf coding over compound
    channels. Transmit only W(1, 2, …, K).

10
Trivial outer bound for source codes
  • Is this outer bound exactly the achievable
    region?
  • If so, what is the mechanism to achieve it?
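The bound itself is not reproduced in the transcript. My assumption is that it is the usual Slepian-Wolf bound with decoder side information: the rate delivered to receiver k can be no smaller than the conditional entropy of the source given that receiver's side information,

\[
R_k \;\ge\; H(X \mid Y_k), \qquad k = 1, \dots, K .
\]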

11
Slepian-Wolf coding revisited
  • It is well-known that to achieve this trivial
    lower bound, we use binning.

12
Multiple binning
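The binning and multiple-binning slides are figures that did not survive the transcript. As an illustration only, here is a minimal Python sketch of the idea: every possible source word is hashed into an independent random bin for each receiver, the transmitter sends the bin indices, and each receiver picks the word in its bin that best matches its own side information. All block lengths, crossover probabilities, and rates are hypothetical and deliberately tiny so that brute-force decoding is feasible; the rates therefore sit well above the Slepian-Wolf limits H(X | Y_k).

import itertools
import numpy as np

# Toy sketch of (multiple) random binning; every parameter here is a hypothetical
# illustration, not the construction from the slides.
m = 12                       # source block length (tiny, so brute-force decoding is feasible)
qs = [0.05, 0.15]            # BSC(q_k) between the source X and side information Y_k
rates = [2 / 3, 5 / 6]       # bin-index rate R_k for receiver k, in bits per source symbol

rng = np.random.default_rng(0)
words = list(itertools.product((0, 1), repeat=m))      # all possible source words

# Multiple binning: each source word gets an independent random bin index per receiver.
bins = [{x: int(rng.integers(0, int(2 ** (m * R)))) for x in words} for R in rates]

def decode(k, bin_idx, y):
    """Receiver k: among the words in its bin, pick the one closest to the side
    information y (a maximum-likelihood stand-in for joint-typicality decoding)."""
    cands = [x for x in words if bins[k][x] == bin_idx]
    return min(cands, key=lambda x: int(np.sum(np.array(x) != y)))

errors, trials = [0, 0], 200
for _ in range(trials):
    x = tuple(int(v) for v in rng.integers(0, 2, m))   # source word
    for k, q in enumerate(qs):
        y = (np.array(x) + (rng.random(m) < q)) % 2    # side information at receiver k
        errors[k] += decode(k, bins[k][x], y) != x
print("empirical error rate per receiver:", [e / trials for e in errors])

The maximum-likelihood search inside the bin is a stand-in for the joint-typicality decoder analyzed on the following slides.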
13
Error analysis
  • Define X1, X2, …, XL, X.

14
Optimal source coding rates
15
Why this is good news
  • In other words: to characterize the minimum
    achievable bandwidth expansion κ in separate coding,
    it suffices to find the set of achievable total rates that
    the broadcast channel can deliver to each receiver
    (formalized below).
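Formalized (an assumption on how the statement above translates, not a formula taken from the deck): separate coding achieves a bandwidth expansion κ if and only if the broadcast channel can deliver total rates (R_1, ..., R_K), one per receiver, with

\[
\kappa \, R_k \;\ge\; H(X \mid Y_k) \qquad \text{for every } k .
\]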

16
Why this is good news
17
Why this is good news
  • Therefore, assuming WLOG that …
  • this implies that κ is achievable in separate
    coding with K = 2 if and only if …

18
Comparison
19
Infinite gains possible
20
Now the bad news
  • What about K > 2?
  • Would the generalization of the degraded message
    sets scenario cover all the achievable total
    rates?

21
Now the bad news
  • The framework of degraded message sets is not
    sufficient for the analysis of the minimum
    achievable κ.

22
PART II: JOINT SOURCE-CHANNEL CODING
23
Joint source-channel coding
  • Easy to prove that this is a convex region.
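The achievable region itself is given on the slides only graphically. As an assumption on my part (stated in the κ notation used above), the single-letter condition for joint source-channel coding should take the form: κ is achievable if and only if there exists a channel input distribution p(w) such that

\[
H(X \mid Y_k) \;\le\; \kappa \, I(W; Z_k) \qquad \text{for every receiver } k ,
\]

with a single p(w) serving all receivers simultaneously.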

24
Proof sketch for the converse
Fano's inequality
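The displayed chain of inequalities is missing from the transcript; one standard way it could go (a reconstruction, with ε_m → 0 supplied by Fano's inequality) is, for each receiver k,

\[
m\,H(X \mid Y_k)
\;\le\; I(X^m; Z_k^n \mid Y_k^m) + m\,\varepsilon_m
\;\le\; I(W^n; Z_k^n) + m\,\varepsilon_m
\;\le\; \sum_{t=1}^{n} I(W_t; Z_{k,t}) + m\,\varepsilon_m
\;\le\; n\, I(W; Z_k) + m\,\varepsilon_m ,
\]

where W is distributed as the average of the marginals of W_1, ..., W_n; since the same W works for every k, dividing by m recovers H(X | Y_k) ≤ κ I(W; Z_k).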
25
Proof sketch for the direct result
Decoding: find i such that …
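The condition itself is not in the transcript; a guess at its form, consistent with the virtual-binning picture later in the deck, is a joint-typicality test on both the source side and the channel side: declare index i if it is the unique one with

\[
\bigl(x^m(i),\, y_k^m\bigr) \in T_\varepsilon^{(m)}(X, Y_k)
\quad\text{and}\quad
\bigl(w^n(i),\, z_k^n\bigr) \in T_\varepsilon^{(n)}(W, Z_k) .
\]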
26
Probability of decoding error
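The error analysis is again only a figure. A hedged union-bound sketch consistent with the decoding rule above: roughly 2^{m H(X|Y_k)} wrong indices survive the source-side test, and each of their independently drawn codewords survives the channel-side test with probability about 2^{-n I(W;Z_k)}, so

\[
P_e^{(k)} \;\lesssim\; 2^{\,m H(X \mid Y_k)} \cdot 2^{-n\, I(W; Z_k)} \;\longrightarrow\; 0
\qquad\text{whenever}\qquad H(X \mid Y_k) \;<\; \kappa\, I(W; Z_k) .
\]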
27
Is this separation?
  • Not in the classical sense
  • There are no stand-alone source or channel
    codes here.
  • However, there is no interplay between (X, Y1, Y2, …, YK)
    and (U, V1, V2, …, VK), unlike in other studied joint
    source-channel coding scenarios.
  • Operational separation!

28
Virtual binning
[Figure: the channel codebook laid out against the typical X^n sequences.]
29
Virtual binning
[Figure: the same codebook / typical-set picture, annotated:]
For better channels, worse side information is OK
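To make the picture concrete, here is a toy Python sketch of virtual binning (my own illustration, with hypothetical block lengths and crossover probabilities): every possible source word is mapped straight to its own random channel codeword, so the channel codebook itself plays the role of the bins, and the decoder scores each candidate jointly against the side information Y_k and the channel output Z_k. Lowering the channel crossover p_k lets the scheme tolerate a larger side-information crossover q_k, which is the "better channel, worse side information" trade-off above.

import itertools
import math
import numpy as np

# Toy sketch of "virtual binning" with hypothetical parameters: each possible source
# word is mapped straight to its own random channel codeword, so the channel codebook
# itself induces the "bins"; decoding scores every candidate jointly against the side
# information Y_k (source side) and the channel output Z_k (channel side).
m, n = 10, 30                 # source symbols and channel uses, i.e. kappa = n / m = 3
q_k = 0.20                    # BSC(q_k): source X -> side information Y_k
p_k = 0.05                    # BSC(p_k): channel input -> channel output Z_k

rng = np.random.default_rng(1)
words = list(itertools.product((0, 1), repeat=m))
codebook = {x: rng.integers(0, 2, n) for x in words}   # one codeword per source word

# Log-likelihood weight of a disagreement on the source side vs. on the channel side.
w_src = math.log((1 - q_k) / q_k)
w_ch = math.log((1 - p_k) / p_k)

def decode(y, z):
    """Pick the source word whose (word, codeword) pair best explains (y, z)."""
    return min(words, key=lambda x: w_src * np.sum(np.array(x) != y)
                                    + w_ch * np.sum(codebook[x] != z))

errors, trials = 0, 100
for _ in range(trials):
    x = tuple(int(v) for v in rng.integers(0, 2, m))
    y = (np.array(x) + (rng.random(m) < q_k)) % 2      # side information at receiver k
    z = (codebook[x] + (rng.random(n) < p_k)) % 2      # broadcast channel output at receiver k
    errors += decode(y, z) != x
print("empirical error rate:", errors / trials)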
30
Suboptimality of separate coding
  • Consider a binary symmetric broadcast channel
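The actual example and its numbers are not in the transcript. Purely as a hypothetical illustration of the joint-coding condition sketched earlier (and using the fact that a uniform input maximizes I(W; Z_k) for every binary symmetric component channel simultaneously), a few lines of Python show how the minimum bandwidth expansion for joint coding would be computed for made-up crossover probabilities:

import math

def h(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Hypothetical binary symmetric broadcast channel with binary side information:
# receiver k sees the channel through BSC(p_k) and the source through BSC(q_k).
ps = [0.02, 0.15]   # channel crossover probabilities (receiver 1 has the better channel)
qs = [0.30, 0.05]   # side-information crossover probabilities (receiver 1 has worse Y_k)

# With a uniform input, I(W; Z_k) = 1 - h(p_k) for every receiver at once, so under the
# condition H(X | Y_k) <= kappa * I(W; Z_k) the smallest bandwidth expansion is:
requirements = [h(q) / (1 - h(p)) for p, q in zip(ps, qs)]
print("per-receiver kappa requirements:", requirements)
print("minimum kappa for joint coding:", max(requirements))

The separate-coding requirement would instead be read off the total rates deliverable by the same channel (the degraded-message-sets framework above), which is what the comparison on the following slides is about.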

31
Suboptimality of separate coding
[Figure: comparison of the achievable rate regions in the (R1, R2) plane.]
32
Joint coding gains
[Figure: joint-coding gains in the (R1, R2) plane.]
We observed gains of up to a factor of ≈ 2.
33
Summary and conclusions
  • Separate source-channel coding
  • We have a single-letter characterization for K = 2.
  • For K > 2, it suffices to characterize the total
    channel coding rates deliverable to each receiver.
  • Joint source-channel coding
  • Single-letter characterization for all K.
  • Effective capacity region R.
  • Operational separation via virtual binning.
  • No informational separation in general.

34
Open questions
  • Is there a single-letter characterization for the
    total channel coding rates?
  • Any hint for how practical SW-coding over
    broadcast channels should work?
  • Any other multi-terminal problem for which our
    simple joint coding technique works?
  • Any implications for lossy coding (w or w/o side
    information)?