1
A Comparison between Network Coding (NC) and ARQ
in a Converge-Casting Scenario
  • NC: a relatively new topic in Sensor Networks
  • No previous works on NC + converge casting

2
Converge Casting
  • Central sink node
  • Sink = destination of all packets

3
Network Coding (NC)
  • Trick: linear combinations of packets
  • Goal: reduce the total no. of packets flowing in
    the network

4
NC - encoding vector
  • Error probability α on the link
  • Node A has I (incoming) packets
  • To compute the i-th outgoing packet:
  • it chooses an encoding vector
    v_i = [v_{1,i}, v_{2,i}, ..., v_{I,i}]^T, with v_{j,i} ∈ GF(2) = {0, 1}

5
NC - encoding vector
  • i-th outgoing packet: linear combination of the
    original packets (x) with v_i as the vector of
    coefficients
  • Therefore, the k-th bit of the i-th encoded
    packet is y_{k,i} = Σ_{j=1}^{I} v_{j,i} · x_{k,j}   (operations in GF(2))
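
A minimal sketch of this encoding step, assuming packets are represented as lists of bits; the names random_encoding_vector and encode_packet are illustrative, not from the original work:

```python
import random

def random_encoding_vector(I):
    """Pick a random encoding vector v_i with I coefficients in GF(2) = {0, 1}."""
    return [random.randint(0, 1) for _ in range(I)]

def encode_packet(packets, v):
    """Linear combination of the I incoming packets with coefficients v over GF(2).

    packets: list of I packets, each a list of L bits (0/1)
    v:       encoding vector of I coefficients in GF(2)
    Returns the encoded packet (L bits).
    """
    L = len(packets[0])
    y = [0] * L
    for v_ji, x_j in zip(v, packets):
        if v_ji:  # multiplication by 0/1 in GF(2)
            y = [yk ^ xk for yk, xk in zip(y, x_j)]  # addition in GF(2) is XOR
    return y

# Example: I = 3 incoming packets of L = 8 bits each
packets = [[1, 0, 1, 1, 0, 0, 1, 0],
           [0, 1, 1, 0, 1, 0, 0, 1],
           [1, 1, 0, 0, 1, 1, 0, 0]]
v = random_encoding_vector(len(packets))
print(v, encode_packet(packets, v))
```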

6
NC - matrix form
  • Matrix form: X_{L×I} · V_{I×M} = Y_{L×M}, where
  • a column of X is an incoming packet (L bits)
  • I = number of incoming packets
  • a column of V is an encoding vector (I elements
    in GF(2))
  • M = number of encoding vectors
  • a column of Y is an encoded packet (L bits)
  • M = number of encoded packets
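
The matrix form maps directly onto a couple of NumPy lines, reducing the product modulo 2 to stay in GF(2); the dimensions below are illustrative values, not taken from the slides:

```python
import numpy as np

L, I, M = 8, 3, 5                      # bits per packet, incoming packets, encoded packets
X = np.random.randint(0, 2, (L, I))    # columns of X: incoming packets (L bits each)
V = np.random.randint(0, 2, (I, M))    # columns of V: encoding vectors (I GF(2) coefficients)
Y = (X @ V) % 2                        # columns of Y: the M encoded packets
print(Y.shape)                         # (L, M)
```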

7
NC - at the receiver
  • Assumption: node A sends the encoding vector
    together with packet y
  • At node B (receiver): M equations in I unknowns
  • System of equations: we can solve it when rank(V) = I
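
A sketch of the receiver side under the same assumptions: Gaussian elimination over GF(2) recovers the original packets X from the received Y and V whenever rank(V) = I. The function gf2_solve is illustrative, not from the original work:

```python
import numpy as np

def gf2_solve(V, Y):
    """Recover X (L x I) from Y = (X @ V) % 2, given V (I x M) and Y (L x M).

    Solves V.T @ X.T = Y.T over GF(2) by Gaussian elimination.
    Returns X, or None if rank(V) < I (not enough independent packets yet).
    """
    I, M = V.shape
    A = np.concatenate([V.T, Y.T], axis=1) % 2   # M x (I + L) augmented matrix
    row = 0
    for col in range(I):
        pivot = next((r for r in range(row, M) if A[r, col]), None)
        if pivot is None:
            return None                          # rank(V) < I: cannot decode yet
        A[[row, pivot]] = A[[pivot, row]]        # bring the pivot row up
        for r in range(M):
            if r != row and A[r, col]:
                A[r] ^= A[row]                   # eliminate: addition in GF(2) is XOR
        row += 1
    return A[:I, I:].T                           # the first I rows now hold X.T
```

With the X, V, Y from the matrix-form example above, gf2_solve(V, Y) returns X whenever the random V happens to have full rank, and None otherwise.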

8
NC - Pinv
  • rank(V) = I means I linearly independent columns
    (among the M received packets)
  • With what probability?
  • If the v_{j,i} are in GF(2), and M = I > 10: P_inv ≈ 0.289 [MacKay]
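
For the square case M = I, the probability that a random binary matrix is invertible has the closed form prod_{k=1}^{I} (1 - 2^{-k}); a few lines of Python reproduce the figure quoted on the slide:

```python
# P(random I x I matrix over GF(2) is invertible) = prod_{k=1}^{I} (1 - 2^{-k})
def p_inv_gf2(I):
    p = 1.0
    for k in range(1, I + 1):
        p *= 1.0 - 2.0 ** (-k)
    return p

print(p_inv_gf2(20))   # ~0.2888 -- essentially at the limit value for I > 10
```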

9
NC - Pinv
  • We need M > I !
  • See the graph (I = 20)
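
The same product formula generalizes to M ≥ I received combinations and shows why a few extra packets are enough; the sketch below evaluates it for the I = 20 case of the graph (the chosen values of M are illustrative):

```python
# P(the random I x M GF(2) matrix V has full rank I), for M >= I:
#   P(rank = I) = prod_{j=0}^{I-1} (1 - 2^{j - M})
def p_full_rank(I, M):
    p = 1.0
    for j in range(I):
        p *= 1.0 - 2.0 ** (j - M)
    return p

for M in (20, 22, 25, 30):
    print(M, round(p_full_rank(20, M), 4))
# The probability climbs quickly from ~0.29 at M = I towards 1 as M grows.
```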

10
NC - assumptions
  • v_{j,i} chosen in GF(8) = {0, 1, 2, 3, 4, 5, 6, 7}
  • Higher P_inv (about 0.85), but more computation
    time
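
The same kind of product gives the invertibility probability over a larger field GF(q), consistent with the "about 0.85" figure quoted for GF(8); a minimal check:

```python
# P(random I x I matrix over GF(q) is invertible) = prod_{k=1}^{I} (1 - q^{-k})
def p_inv_gfq(q, I=50):
    p = 1.0
    for k in range(1, I + 1):
        p *= 1.0 - q ** (-k)
    return p

print(p_inv_gfq(2))   # ~0.289
print(p_inv_gfq(8))   # ~0.86, in line with the ~0.85 quoted on the slide
```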

11
Old-Fashioned K-ARQ
  • Max of K transmissions for every packet (in case
    of error)
  • Assumption: noiseless, error-free feedback channel
  • Probability that B doesn't receive all I packets!
  • This probability is our constraint

12
Old-Fashioned K-ARQ
  • ε = 0.05: failure probability (one or more discarded
    packets)
  • Let N = number of packets to send across the
    link (in this case N = I)
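
A small sketch of the K-ARQ budget under the stated assumptions (per-transmission loss probability α, independent losses, error-free feedback): the smallest K that keeps the probability of losing one or more of the N packets below ε, and the expected number of transmissions it costs. The numeric values of α and N are illustrative only:

```python
def min_retx_limit(alpha, N, eps=0.05):
    """Smallest K such that P(one or more of the N packets is discarded) <= eps,
    assuming each transmission fails independently with probability alpha:
        1 - (1 - alpha**K)**N <= eps
    """
    K = 1
    while 1.0 - (1.0 - alpha ** K) ** N > eps:
        K += 1
    return K

def expected_transmissions(alpha, K, N):
    """Expected total number of transmissions for N packets under K-ARQ
    (truncated geometric: E[T] per packet = (1 - alpha**K) / (1 - alpha))."""
    return N * (1.0 - alpha ** K) / (1.0 - alpha)

alpha, N = 0.1, 20          # illustrative link error probability and packet count
K = min_retx_limit(alpha, N)
print(K, expected_transmissions(alpha, K, N))
```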

13
General Assumptions
  • Focus on the last hop
  • One shared (e.g. wireless) channel
  • No collisions (magic scheduler)
  • Exclusive zones: 4; common zones: 28

14
Scenario 1: 2 neighbors
15
Scenario 2: 3 neighbors
16
Scenario 3: 4 neighbors
17
Conclusions
  • Guess: does NC perform better on a high-loss medium?
  • Even better than K-ARQ in the worst case (wc)
  • But worse than K-ARQ in the average case (ac)
  • How to improve performance?
  • choose v_{j,i} non-randomly
  • increase the cardinality of the GF

18
Future work
  • Use of more realistic MAC protocols
  • More neighbors