Transcript and Presenter's Notes

Title: Adaptive Network Coding for Decoding Delay Minimization in Broadcast Erasure Channels


1
Seminar @ ANU, 16 July 2009
Adaptive Network Coding for Decoding Delay Minimization in Broadcast Erasure Channels
Parastoo Sadeghi, The Australian National University
Joint work with Ramtin Shams, Danail Traskov and Ralf Koetter (TUM)
2
Network Coding- Canonical Example 1
  • Two unicast wireless sessions (s1 wants to talk
    to t1 and s2 wants to talk to t2)
  • The information has to pass through the
    bottleneck link C–D

(Figure: cross network with sources s1, s2, relay nodes C and D, and destinations t1, t2)
3
Network Coding- Canonical Example 1
  • A simple routing approach is time sharing between
    the two (yellow and blue) streams

(Figure: the yellow and blue streams time sharing the bottleneck link C–D)
4
Network Coding- Canonical Example 1
  • Assuming that each link has the capacity of B
    bits/sec
  • With routing the achievable rates satisfy the
    following
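Assuming the standard time-sharing argument for this example, with R1 and R2 denoting the rates of the two sessions sharing the C–D link, the constraint is presumably

\[ R_1 + R_2 \le B \]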

5
Network Coding- Canonical Example 1
  • Network coding allows mixing of the two streams at
    the middle nodes so that they can pass through the
    bottleneck concurrently
  • They can still be recovered at the destinations

(Figure: cross network, highlighting the broadcast nature of the wireless channel)
6
Network Coding- Canonical Example 1
  • If streams are XORed together, the original
    flows can be recovered at the destinations

(Figure: s1 sends m1 and s2 sends m2; the relay broadcasts m1 ⊕ m2, and each destination XORs it with the packet it overheard to recover its own message)
7
Network Coding- Canonical Example 1
  • With network coding we can achieve the maximum
    rates allowed by link capacities
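With the same notation, the network-coded rate region is presumably

\[ R_1 \le B, \qquad R_2 \le B, \]

i.e. both sessions can run at the full link capacity simultaneously.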

8
Network Coding- Canonical Example 2
  • A multicast session (s1 wants to talk to t1 and
    t2)
  • Each edge has a unit capacity (1 bit/channel use)

(Figure: butterfly network with source s1, intermediate nodes A, B, C, F and sinks t1, t2; the min-cut to each sink is 2)
10
Network Coding- Canonical Example 2
  • With routing only a maximum of 1.5 bits/ch use is
    possible

(Figure: routing in the butterfly network; the bottleneck edge from C to F can carry only one of the two streams per channel use)
11
Network Coding- Canonical Example 2
  • With network coding the min-cut bound can be
    achieved. This is generally true for multicast
    sessions

(Figure: network coding in the butterfly network; node C forwards m1 ⊕ m2 over the bottleneck, and t1 and t2 each combine it with the uncoded packet they receive directly to recover both m1 and m2)
12
Random Linear Network Coding (RLNC)
  • In the case of multicast in a general network,
    random linear network coding is sufficient for
    almost sure recovery of information
  • If α, β, γ, δ are chosen uniformly and
    independently at random from a finite field of
    sufficiently large size, t1 will almost surely
    receive two linearly independent combinations of
    m1 and m2 and can recover them
  • t1 needs to wait for both coded packets to arrive
    → decoding delay

(Figure: relays C and D forward the random linear combinations αm1 + βm2 and γm1 + δm2 to t1)
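To illustrate the recovery step, here is a minimal Python sketch of random linear network coding over a small prime field; it is an illustrative example rather than the presenters' implementation, and the field size 257 and the helper names are arbitrary choices.

import random

P = 257  # small prime field for illustration; a real system might use GF(2^8)

def rlnc_encode(packets, coeffs):
    # Random linear combination sum_i coeffs[i] * packets[i], symbol-wise, mod P.
    return [sum(c * pkt[j] for c, pkt in zip(coeffs, packets)) % P
            for j in range(len(packets[0]))]

def rlnc_decode_two(c1, y1, c2, y2):
    # Solve the 2x2 system [c1; c2] [m1; m2] = [y1; y2] over GF(P).
    det = (c1[0] * c2[1] - c1[1] * c2[0]) % P
    inv = pow(det, P - 2, P)  # modular inverse (P prime); det != 0 almost surely
    m1 = [((c2[1] * a - c1[1] * b) * inv) % P for a, b in zip(y1, y2)]
    m2 = [((c1[0] * b - c2[0] * a) * inv) % P for a, b in zip(y1, y2)]
    return m1, m2

# t1 receives two independently coded packets, with coefficients (alpha, beta)
# and (gamma, delta) drawn uniformly at random.
m1, m2 = [10, 20, 30], [7, 8, 9]
c1 = [random.randrange(P) for _ in range(2)]
c2 = [random.randrange(P) for _ in range(2)]
y1 = rlnc_encode([m1, m2], c1)
y2 = rlnc_encode([m1, m2], c2)
print(rlnc_decode_two(c1, y1, c2, y2))  # recovers (m1, m2) whenever det != 0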
13
Network Coding for Combating Erasures
  • Consider the following broadcast erasure channel

(Figure: broadcast erasure channel with one source and multiple receivers)
14
Network Coding for Combating Erasures
  • We can mix all the packets to be sent. This
    guarantees throughput optimality.
  • They all add to the degrees of freedom at the
    destination

15
Disadvantage of block-based RLNC
  • The destinations can, in general, start decoding
    only after the whole block of coded packets has
    arrived.
  • There is often a throughput-decoding delay
    tradeoff involved.

16
Application Type 1
  • The notion of delay is highly entangled with the
    application
  • Some applications cannot use individual decoded
    packets even if they become available sooner than
    others, e.g. a simple file download for software
    updates. No real incentive for partial decoding.

17
Application Type 2
  • Some applications can benefit from partial
    decoding, as long as decoded packets are passed
    on to them in a temporal order, e.g. audio/video
    streaming.

18
Application Type 3
  • Some applications may benefit from partial
    decoding (more or less) irrespective of
    sequential order of the packets originally
    designated at the transmitter
  • Multiple agents in a sensor network, each has to
    process/execute one or more data/commands
  • For coordination, they must know the other agents'
    commands
  • In-order processing/execution is not a real issue;
    what matters is that agents get the commands as
    soon as possible

19
Application Type 4
  • There are other situations where partial decoding
    is critically better than batch decoding
  • Bushfires or other emergencies.
  • Emergency crews may have a map of the area, but
    the dynamics usually change very rapidly
  • Updates of different parts of the map can arrive
    in any order and still be helpful

20
Application Type 5
  • Some applications may be designed in such a way
    that they are insensitive to in-order delivery [1]
  • Noisy and unreliable transport medium/protocol
  • Natural to use multiple description (MD) codes
  • Each packet brings new information to the
    receiver regardless of its order
  • We can apply MD coding in the previous map update
    situation

(Figure: comparison of non-progressive coding with retransmission, progressive coding with retransmission, and multiple description coding without retransmission)
21
Classification Input Traffic
  • Rateless transmission, e.g. file download with a
    fixed number of packets K [2-6]
  • Stochastic process with a certain arrival rate λ
    [7-8]

22
Classification Channel State Information (CSI)-1
  • No CSI available at the source, except for the
    completion ACK of a file download or a coded
    block [2]

(Figure: the only feedback is a completion acknowledgement: "Got it!")
23
Classification CSI-2
  • Causal CSI per slot about who received the last
    transmission [5-8]

(Figure: per-slot ACK/NAK feedback from the receivers)
24
Classification Coding Techniques-1
  • Random linear network coding (RLNC), block-based
    transmission [2-4]
  • Works on blocks of packets
  • In general, cannot start decoding until K coded
    packets have arrived
  • Suitable when CSI is limited (cases 1 and 2)
  • Suitable for file downloads, e.g. software
    updates
  • Can be applied to sub-blocks of data in streaming

25
Classification Coding Techniques-2
  • Adaptive linear network coding [5-8]
  • Suitable for partial decoding applications
  • Uses causal feedback to decide which packets to
    mix at any time
  • The aim is to minimize delay (whatever the metric
    is) and maximize throughput
  • Deciding on what to transmit next is non-trivial!

26
Classification Delay-1
  • Expected completion time [2-4]
  • Suitable where the application layer needs the
    whole data (e.g. full K packets) to work and we
    use RLNC with limited CSI.

(Figure: illustration of completion times, with labels t1 and t2)
27
Classification Delay-2
  • Decoding delay [7]
  • Average number of slots between (stochastic)
    arrival of a packet at the transmitter queue and
    its decoding at the receiver.
  • Decoding delay is often calculated irrespective
    of potential packet order needed by higher layers

(Figure: packet p_n arrives at the transmitter queue at time t_a and is decoded at the receiver at time t_d)
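In symbols, using the figure's labels, the decoding delay of packet p_n is presumably averaged as

\[ D = \mathbb{E}\big[\, t_d(p_n) - t_a(p_n) \,\big]. \]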
28
Classification Delay-3
  • A different notion of delay for order-insensitive
    partial decoding [5-6]
  • A receiver experiences a unit delay every time it
    successfully receives a packet that either
  • does not contain any new information or
  • is not instantaneously decodable (contains more
    than one new packet)
  • In the context of the discussed applications 3, 4,
    and 5, it should be clear why such a notion of
    delay (especially the second condition) is
    significant

29
Example of Delay-4
  • Receiver already has m1, m2, m4

(Figure: two example coded transmissions, each marked as causing a unit delay for this receiver)
30
Classification Throughput-1
  • Throughput optimality: guarantee innovative
    packets by making sure that every successful
    reception increases the dimension of the knowledge
    space of every receiver.
  • This optimality condition places no requirement on
    immediate decoding or in-order delivery.
  • An appropriate goal for complete file downloads,
    or for making sure that queues remain stable even
    when the load is close to the system capacity

31
Classification Throughput-2
  • In the context of the order-insensitive notion of
    delay 4 (and more generally), we can define the
    normalized throughput in every packet transmission
    round as below,
  • where N is the total number of receivers and M is
    the number of receivers for which the packet was
    not innovative
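Given these definitions of N and M, the per-round normalized throughput is presumably

\[ \eta \;=\; \frac{N - M}{N}, \]

where the symbol η is used here only as a label for this quantity.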

32
Classification Erasures-1
  • Most have considered memoryless erasures between
    transmission slots [2-5, 7-8]

(Figure: independent erasures across transmission slots)
33
Classification Erasures-2
  • Considering memory in erasures, Gilbert-Elliott
    channels [6]

(Figure: erasures with memory in the channel)
34
Basic Model Assumptions
  • Rateless transmission with a fixed number of
    packets K denoted by m1 to mK.
  • Broadcast transmission to N users denoted by R1
    to RN.
  • Slotted transmission. One packet per slot is sent
    out to the destinations.
  • Instantaneously decodable coded packet
    transmission. Binary XOR is sufficient.

35
Analysis System State at the tx
  • At time t, define the receiver-packet incidence
    matrix A of size N × K as

(Figure: an example matrix A with rows indexed by receivers and columns by packets; the highlighted entry indicates that receiver 1 still needs packet 2)
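A plausible reading of the definition, consistent with the annotation above:

\[
A_{ij} \;=\;
\begin{cases}
1, & \text{if receiver } R_i \text{ still needs packet } m_j,\\
0, & \text{otherwise,}
\end{cases}
\qquad 1 \le i \le N,\; 1 \le j \le K.
\]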
36
Analysis Packet Importance
  • The weight vector W shows how important each
    packet is.
  • First and most obvious example:

(Figure: the weight of each packet is the sum of its column in A, i.e. the number of receivers that still need it)
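Consistent with the column-sum description, the first weight choice is presumably

\[ W_j \;=\; \sum_{i=1}^{N} A_{ij}, \]

i.e. the number of receivers that still need packet m_j.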
37
Analysis Packets to Code
  • Packet vector X shows which packets should be
    combined
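One natural convention, assumed here and consistent with the later ILP, is to take X = (x_1, ..., x_K) as a binary selection vector:

\[ x_j \in \{0,1\}, \qquad x_j = 1 \;\Longleftrightarrow\; m_j \text{ is included in the transmitted XOR.} \]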

38
Analysis Instantaneous Decodability
  • Instantaneous decodability condition: in each row
    of A, at most one of the selected packets may be 1
    for that receiver

(Figure: an example row of A; "If I send packet 2, then I cannot send packet 1")
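With A and X as defined above, this condition presumably corresponds to the row constraint

\[ \sum_{j=1}^{K} A_{ij}\, x_j \;\le\; 1 \qquad \text{for every receiver } i. \]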
39
Analysis Optimal Problem Formulation
  • Maximizing the number of receivers for which a
    transmission is innovative, subject to
    instantaneous decodability can be posed as the
    following integer linear program (ILP)
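A plausible reconstruction of the program, using the weights W, incidence matrix A and selection vector X from the previous slides:

\[
\begin{aligned}
\max_{x_1,\dots,x_K}\quad & \sum_{j=1}^{K} W_j\, x_j\\
\text{subject to}\quad & \sum_{j=1}^{K} A_{ij}\, x_j \;\le\; 1, \qquad i = 1,\dots,N,\\
& x_j \in \{0,1\}, \qquad j = 1,\dots,K.
\end{aligned}
\]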

The row constraint ensures instantaneous decodability.
  • Recently, we have been able to find the optimal
    solution (efficiently in many cases) by proposing
    a recursive pseudo Boolean constraint solver

40
Analysis Idea of Optimal Solution
  • The instantly decodable condition is strong and
    hence, can be used to reduce the potentially
    combinatorial search size dramatically in many
    practical situations

This packet should always be part of our solution
because it does not conflict with any other packet
41
Flowchart of Optimal Solution
42
Analysis Heuristics-1 [5]
  • Choose a packet at random from the set of packets
    needed by at least one receiver
  • Find its constraints with all other packets
  • From the remaining (non-conflicting) packets,
    choose another packet at random
  • Continue until no other packet can be added
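A minimal Python sketch of this heuristic (an illustrative implementation, not the presenters' code); A is the 0/1 receiver-packet needs matrix defined earlier, given as a list of rows:

import random

def conflicts(A, j, k):
    # Packets j and k conflict if some receiver still needs both of them:
    # sending both would break instantaneous decodability for that receiver.
    return any(row[j] and row[k] for row in A)

def heuristic_1(A, rng=random):
    K = len(A[0])
    # packets needed by at least one receiver
    candidates = [j for j in range(K) if any(row[j] for row in A)]
    chosen = []
    while candidates:
        j = rng.choice(candidates)   # pick one at random
        chosen.append(j)
        # keep only packets that do not conflict with the newly chosen one
        candidates = [k for k in candidates if k != j and not conflicts(A, j, k)]
    return chosen  # indices of the packets to XOR together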

43
Heuristics-1 Example
Each receiver needs the following packets
  • If we pick m1 first
  • We cannot add any other message
  • R3 will experience delay
  • If we pick m2/m3 first
  • We can add m3/m2
  • But we still cannot add m4
  • R3 will experience delay
  • If we pick m4 first
  • Everyone will be happy!

44
Heuristic-2
  • Choose the column with the largest number of ones
    (a local maximum)
  • We cannot choose columns 1 or 5 due to the
    instantaneous decodability condition
  • We can still code either packet 3 or 4 with
    packet 2
  • We choose packet 4 because it is needed by 2
    receivers
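A corresponding Python sketch of heuristic-2, under the same assumptions (A is the 0/1 needs matrix; ties between equally heavy columns are broken arbitrarily):

def heuristic_2(A):
    K = len(A[0])
    weight = [sum(row[j] for row in A) for j in range(K)]  # column sums of A
    chosen = []
    for j in sorted(range(K), key=lambda j: -weight[j]):   # heaviest column first
        if weight[j] == 0:
            break  # remaining packets are not needed by anyone
        # add packet j only if no receiver would then need two chosen packets
        if all(sum(row[k] for k in chosen + [j]) <= 1 for row in A):
            chosen.append(j)
    return chosen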
45
Analysis Heuristic-3
  • Set a maximum number of recursions for the
    optimal search.
  • If the search reaches the maximum recursions and
    has not finished yet, switch to heuristic-2 for
    the remaining unresolved variables (packets) and
    return the largest objective value among the
    feasible solutions found.
  • As a rule of thumb, setting the max recursions to
    K (number of packets) is a reasonable choice.

46
Results for a Very Bad Channel
(Figure: results for a very bad channel, showing reasonable performance)
  • Note that this delay is actually the overhead
    delay beyond the minimum number of required
    transmissions. The overhead delay is due to the
    instantaneous decodability constraint.

47
Results for a Better Channel
(Figure: results for a better channel, again showing reasonable performance)
  • Even for up to 40 receivers, we get only about 10%
    additional delay because of the instantly
    decodable criterion
  • Heuristic-2 is almost as good as the optimal search

48
Analysis Memory Model
  • Gilbert-Elliott Channel (GEC)
  • Good state (G): successful reception (no erasure)
  • Bad state (B): packet lost (erasure)

(Figure: two-state Markov model of the channel, annotated "channel memory content")
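For concreteness, a small Python sketch of a Gilbert-Elliott erasure process; the transition probabilities below are hypothetical placeholders (chosen so that the average state duration is 100 slots, the value mentioned on the later results slide), not parameters taken from the talk:

import random

def gilbert_elliott_erasures(num_slots, p_gb=0.01, p_bg=0.01, start_good=True, rng=random):
    # Two-state Markov chain: Good = packet received, Bad = packet erased.
    # With p_gb = p_bg = 0.01 the average state duration is 1/0.01 = 100 slots.
    good = start_good
    erased = []
    for _ in range(num_slots):
        erased.append(not good)          # the slot is lost whenever the state is Bad
        if good:
            good = rng.random() >= p_gb  # leave Good with probability p_gb
        else:
            good = rng.random() < p_bg   # return to Good with probability p_bg
    return erased

# One independent erasure pattern per receiver (a common modelling choice).
patterns = [gilbert_elliott_erasures(1000) for _ in range(3)]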
49
Analysis Generalization to GEC
  • For receiver Ri, we have
  • Define the following matrix B

(Figure: the matrix B, with rows indexed by receivers and columns by packets)
  • Define the following reward/weight for
    transmission of each packet

50
Analysis General Optimal Formulation
  • With the newly defined weights, the optimization
    problem is again given by

51
Analysis Heuristics
  • We can apply the same heuristic methods as
    before, especially heuristics 2 or 3.

52
Results Effects of Memory Content
  • There are N = 3 receivers
  • In the special method, we allow violation of the
    instantly decodable rule if one receiver is likely
    to be in erasure

53
Results Effects of # of Packets
We have enough packets to send and guarantee zero
delay irrespective of channel conditions
  • We observe the non-ergodic effect of number of
    packets on the delay when the number is not so
    much smaller than the average state duration
    (here average state duration is 100 slots)

54
Future Work Application-Dependent Weights-1
  • In the map update example, part of the map closer
    to the physical location of an emergency team
    gets priority (weights are proportional to
    distance)

(Figure: N = 4 teams and 5 map areas (5 packets); the annotated decision is "Send m2")
55
Future Work Application-Dependent Weights-2
  • The adaptive weight idea may also be applied to
    streaming. The weight of each packet for a
    particular receiver can be a function of
  • Remaining packets in its streaming buffer
  • Its next channel state
  • Its delay (experienced so far)

(Figure: the weight depends on the receiver's current buffer size, next channel state, and waiting time so far)
56
Future Work L-step Decodability
  • Instantaneous decodability is a bit restrictive
    and may not result in the best throughput-delay
    tradeoff. Can we extend it to L-step decodability?

57
Future Work Other Types of Decoding Delay
  • We have done some recent work on in-order delay
    optimizations (while at MIT), but it still
    requires more research.

58
Selected References (not Comprehensive)
  1. V. K. Goyal, "Multiple description coding:
    compression meets the network," IEEE Signal
    Processing Magazine, vol. 18, pp. 74-93, Sept.
    2001.
  2. A. Eryilmaz, A. Ozdaglar, and M. Médard, "On delay
    performance gains from network coding," invited
    paper, Proc. Conference on Information Sciences
    and Systems (CISS), Princeton, 2006.
  3. D. E. Lucani, M. Médard, and M. Stojanovic,
    "Random linear network coding for time division
    duplexing: energy analysis," Proc. ICC
    Communication Theory Workshop, 2009.
  4. D. E. Lucani, M. Stojanovic, and M. Médard,
    "Random linear network coding for time division
    duplexing: when to stop talking and start
    listening," Proc. INFOCOM, 2009.
  5. L. Keller, E. Drinea, and C. Fragouli, "Online
    broadcasting with network coding," in Proc.
    Fourth Workshop on Network Coding, Theory and
    Applications (NetCod '08), Hong Kong, Jan. 2008.
  6. P. Sadeghi, D. Traskov, and R. Koetter, "Adaptive
    network coding for broadcast channels," Proc.
    NetCod, 2009.
  7. J. K. Sundararajan, D. Shah, and M. Médard,
    "Feedback-based online network coding," submitted
    to IEEE Transactions on Information Theory.
  8. Y. E. Sagduyu and A. Ephremides, "On network
    coding for stable multicast communication," in
    Proc. IEEE Military Communications Conference
    (MILCOM), 2007.
  9. J. Lacan and E. Lochin, "On-the-fly coding to
    enable full reliability without retransmission,"
    ISAE, LAAS-CNRS, France, Tech. Rep., 2008.