Transcript and Presenter's Notes

Title: The Transmission-Switching Duality of Communication Networks


1
The Transmission-Switching Duality of
Communication Networks

  • Tony T. Lee
  • Shanghai Jiao Tong University
  • The Chinese University of Hong Kong
  • Xidian University, June 21, 2011

2
A Mathematical Theory of Communication
BSTJ, 1948
  • C. E. Shannon

3
(No Transcript)
4
(No Transcript)
5
Contents
  • Introduction
  • Routing and Channel Coding
  • Scheduling and Source Coding

6
Reliable Communication
  • Circuit switching network
  • Reliable communication requires noise-tolerant
    transmission
  • Packet switching network
  • Reliable communication requires both
    noise-tolerant transmission and
    contention-tolerant switching

7
Quantization of Communication Systems
  • Transmission: from analog channel to digital channel
  • Sampling theorem of bandlimited signals
  • (Whittaker 1915; Nyquist 1928; Kotelnikov 1933; Shannon 1948)
  • Switching: from circuit switching to packet switching
  • Doubly stochastic traffic matrix decomposition
  • (Hall 1935; Birkhoff-von Neumann 1946)

8
Noise vs. Contention
  • Transmission channel with noise
  • Source information is a function of time, errors
    corrected by providing more signal space
  • Noise is tamed by error correcting code
  • Packet switching with contention
  • Source information f(i) is a function of space,
    errors corrected by providing more time
  • Contention is tamed by delay, buffering or
    deflection

[Figure: the message 0101 and its Hamming-distance-1 neighbors 1101, 0111, 0001, 0100 (transmission errors corrected by extra signal space), contrasted with a connection request f(i) = j whose contention is resolved by delay due to buffering or deflection]
9
Transmission vs. Switching
Shannon's general communication system: source → transmitter → channel (capacity C, with a noise source) → receiver → destination; the information source is a temporal function f(t) of time t.

Clos network C(m,n,k): source → k input modules (n×m) → m central modules (k×k) → k output modules (m×n) → destination; the information source is a spatial function f(i) of space i = 0, 1, …, N−1; internal contention; channel capacity m.
10
Communication Channel ↔ Clos Network
  • Noise ↔ Contention
  • Channel coding ↔ Routing
  • Source coding ↔ Scheduling

11
Apple vs. Orange
  • 350mg Vitamin C
  • 1.5g/100g Sugar
  • 500mg Vitamin C
  • 2.5g/100g Sugar

12
Contents
  • Introduction
  • Routing and Channel Coding
  • Scheduling and Source Coding
  • Rate Allocation
  • Boltzmann Principle of Networking

13
Output Contention and Carried Load
  • Nonblocking N×N switch with uniformly distributed destination addresses
  • ρ: offered load; ρ′: carried load
  • The difference between the offered load and the carried load reflects the degree of contention
14
Proposition on Signal Power of Switch
  • (V. Benes 63) The energy of a connecting network is the number of calls in progress (carried load)
  • The signal power Sp of an N×N crossbar switch is the number of packets carried by its outputs, and the noise power is Np = N − Sp
  • Pseudo signal-to-noise ratio (PSNR): PSNR = Sp / Np
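A minimal reconstruction of the PSNR from the definitions above, expressing the ratio of expectations through the carried load ρ′ of the previous slide (an assumed step, not spelled out on the slide):

```latex
% Reconstruction under the stated definitions; the last step assumes E[S_p] = N*rho'.
\mathrm{PSNR} = \frac{S_p}{N_p} = \frac{S_p}{N - S_p},
\qquad
\mathbb{E}[S_p] = N\rho'
\;\Longrightarrow\;
\frac{\mathbb{E}[S_p]}{\mathbb{E}[N_p]} = \frac{\rho'}{1 - \rho'} .
```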

15
Boltzmann Statistics
Output ports ↔ particles; packets ↔ energy quanta

Example (micro state): four packets a, b, c, d destined to eight outputs; packets a and d each occupy an output alone and packets b, c share one output, so n0 = 5, n1 = 2, n2 = 1.

  • Energy level of an output = number of packets destined for that output
  • ni = number of outputs with energy level i
  • Packets are distinguishable; the total number of states is,
  • Number of outputs at energy levels 0, 1, …, L: n0, n1, …, nL, with n0 + n1 + … + nL = N
16
Boltzmann Statistics (contd)
  • From the Boltzmann entropy equation S = C log W
  • Maximizing the entropy by Lagrange multipliers
  • Using Stirling's approximation for factorials
  • Taking the derivative with respect to ni yields
  • S: entropy; W: number of states; C: Boltzmann constant

17
Boltzmann Statistics (contd)
  • If the offered load on each input is ρ, then under the uniform loading condition the number of packets destined for a given output is Binomial(N, ρ/N)
  • Probability that there are i packets destined for the output: P(i) = e^(−ρ) ρ^i / i!  (Poisson distribution, in the large-N limit)
  • Carried load of an output: ρ′ = 1 − P(0) = 1 − e^(−ρ)
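A small simulation sketch (not from the talk; the switch size N, offered load ρ, and slot count are illustrative choices) to check the Poisson approximation and the carried load against 1 − e^(−ρ):

```python
import math
import random
from collections import Counter

def simulate_output_load(N=64, rho=0.8, slots=20000, seed=1):
    """Each input independently sends a packet with probability rho to a
    uniformly chosen output; tally packets per output over many time slots."""
    rng = random.Random(seed)
    packets_per_output = Counter()   # histogram of "i packets destined for an output"
    busy_outputs = 0                 # outputs carrying at least one packet
    for _ in range(slots):
        load = Counter(rng.randrange(N) for _ in range(N) if rng.random() < rho)
        for out in range(N):
            packets_per_output[load[out]] += 1
        busy_outputs += len(load)
    total = slots * N
    print(f"carried load (simulated): {busy_outputs / total:.4f}")
    print(f"carried load 1 - e^-rho : {1 - math.exp(-rho):.4f}")
    for i in range(4):
        poisson = math.exp(-rho) * rho**i / math.factorial(i)
        print(f"P({i} packets): sim {packets_per_output[i] / total:.4f}  Poisson {poisson:.4f}")

simulate_output_load()
```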
18
Clos Network C(m,n,k)
  • D = nQ + R, where D is the destination address
  • Q = ⌊D/n⌋ --- the output module in the output stage
  • R = D mod n --- the output link in that output module
  • G is the central module
  • Routing tag: (G, Q, R)

[Figure: three-stage Clos network with k input modules (n×m), m central modules (k×k), and k output modules (m×n); input I reaches destination D = nQ + R through central module G, output module Q, and output link R]

Slepian-Duguid condition: m ≥ n
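A short sketch (helper name assumed, not from the talk) of the routing-tag computation, checked against the route-assignment example on slide 28 where n = 2:

```python
def routing_tag(D, n, G):
    """Routing tag (G, Q, R) for destination D in a Clos network C(m, n, k):
    Q = D // n is the output module, R = D % n is the link inside it, and
    G is the central module chosen by the route-assignment algorithm."""
    return (G, D // n, D % n)

# Permutation and central-module choices taken from slide 28 (n = 2).
destinations    = [1, 3, 2, 0, 6, 4, 7, 5]
central_modules = [0, 2, 0, 2, 2, 1, 0, 2]
for source, (D, G) in enumerate(zip(destinations, central_modules)):
    print(f"S={source}  D={D}  routing tag (G, Q, R) = {routing_tag(D, 2, G)}")
```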
19
Clos Network as a Noisy Channel
  • Source state is a perfect matching
  • Central modules are randomly assigned to input
    packets
  • Offered load on each input link of central module
  • Carried load on each output link of central
    module
  • Pseudo signal-to-noise ratio (PSNR)

20
Noisy Channel Capacity Theorem
  • Capacity of the additive white Gaussian noise (AWGN) channel
  • The maximum data rate C that can be sent through a channel subject to Gaussian noise is C = W log2(1 + S/N)
  • C: channel capacity in bits per second
  • W: bandwidth of the channel in hertz
  • S/N: signal-to-noise ratio
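As a quick worked instance of the capacity formula (the numbers are illustrative and not from the talk):

```latex
C = W \log_2\!\Bigl(1 + \tfrac{S}{N}\Bigr),
\qquad
W = 1\,\text{MHz},\ \tfrac{S}{N} = 15
\;\Rightarrow\;
C = 10^{6}\log_2 16 = 4\,\text{Mb/s}.
```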

21
                                                 
  
Planck's law can be written in terms of the
spectral energy density per unit volume of
thermodynamic equilibrium cavity radiation.
22
Clos Network with Deflection Routing
  • Route the packets in C(n,n,k) and C(k,k,n)
    alternately

Routing Tag (Q1,R1, Q2,R2)
23
Loss Probability versus Network Length
  • The loss probability of the deflection Clos network is an exponential function of the network length
24
Shannon's Noisy Channel Coding Theorem
  • Given a noisy channel with information capacity C and information transmitted at rate R
  • If R < C, there exists a coding technique which allows the probability of error at the receiver to be made arbitrarily small.
  • If R > C, the probability of error at the receiver cannot be made arbitrarily small.

25
Binary Symmetric Channel
  • The binary symmetric channel (BSC) with crossover probability q = 1 − p < ½ has capacity C = 1 − H(q)
  • There exist encoding E and decoding D functions such that:
  • If the rate R = k/n ≤ C − δ for some δ > 0, the error probability is bounded by a quantity that decays exponentially in the code length n
  • If R = k/n ≥ C + δ for some δ > 0, the error probability cannot be made arbitrarily small
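A small sketch (not part of the talk) evaluating the BSC capacity C = 1 − H(q) quoted above:

```python
import math

def binary_entropy(q):
    """H(q) = -q log2 q - (1-q) log2(1-q), with H(0) = H(1) = 0."""
    if q in (0.0, 1.0):
        return 0.0
    return -q * math.log2(q) - (1 - q) * math.log2(1 - q)

def bsc_capacity(q):
    """Capacity of the binary symmetric channel with crossover probability q."""
    return 1.0 - binary_entropy(q)

for q in (0.0, 0.05, 0.11, 0.5):
    print(f"q = {q:4.2f}  C = {bsc_capacity(q):.3f} bits per channel use")
```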
26
Parallels Between Noise and Contention
Binary symmetric channel ↔ Deflection Clos network
  • Crossover probability q < ½ ↔ Deflection probability q < ½
  • Random coding ↔ Deflection routing
  • R < C ↔ R < n
  • Exponential error probability ↔ Exponential loss probability
  • Complexity increases with code length n ↔ Complexity increases with network length L
  • Typical set decoding ↔ Equivalent set of outputs
27
Edge Coloring of Bipartite Graph
  • A regular bipartite graph G with vertex degree m satisfies Hall's condition
  • Let A ⊆ V_I be a set of inputs and N(A) = {b : (a,b) ∈ E, a ∈ A}. Since every edge terminating on a vertex in A must terminate on N(A) at the other end, m|N(A)| ≥ m|A|, so
  • |N(A)| ≥ |A|
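A brute-force check of Hall's condition for a small bipartite graph; the example graph and the helper name below are illustrative assumptions, not from the talk:

```python
from itertools import combinations

def satisfies_hall(adj):
    """adj maps each input to the set of outputs it is connected to.
    Hall's condition: |N(A)| >= |A| for every subset A of the inputs."""
    inputs = list(adj)
    for r in range(1, len(inputs) + 1):
        for A in combinations(inputs, r):
            neighborhood = set().union(*(adj[a] for a in A))
            if len(neighborhood) < len(A):
                return False
    return True

# A 2-regular bipartite graph on 3 + 3 vertices (every vertex has degree 2).
adj = {0: {0, 1}, 1: {1, 2}, 2: {2, 0}}
print(satisfies_hall(adj))   # True: by Hall's theorem a perfect matching exists
```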

28
Route Assignment in Clos Network
[Figure: route assignment for a permutation in a Clos network with eight inputs and outputs (n = 2), showing the central module assigned to each call]

Computation of routing tag (G, Q, R):
S (input)            0  1  2  3  4  5  6  7
D (output)           1  3  2  0  6  4  7  5
G (central module)   0  2  0  2  2  1  0  2
Q (output module)    0  1  1  0  3  2  3  2
R (output link)      1  1  0  0  0  0  1  1
29
Rearrangeable Clos Network and Channel Coding Theorem
  • (Slepian-Duguid) Every Clos network with m ≥ n is rearrangeably nonblocking
  • The bipartite graph with vertex degree n can be edge colored with m colors if m ≥ n
  • There is a route assignment for any permutation
  • Shannon's noisy channel coding theorem
  • It is possible to transmit information without error up to a limit C.
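The route-assignment claim can be illustrated by edge coloring the module-level demand graph: each color class is a perfect matching that a single central module can carry. This is a minimal sketch, assuming the demand forms an n-regular bipartite multigraph between input and output modules; the function names and the slide-28 demand used below are illustrative:

```python
from collections import defaultdict

def maximum_matching(capacity, left_nodes):
    """Kuhn's augmenting-path algorithm on a bipartite multigraph.
    capacity[u][v] is the number of remaining (u, v) edges.
    Returns a dict mapping each matched right node to its left node."""
    match_right = {}

    def augment(u, visited):
        for v, cnt in capacity[u].items():
            if cnt > 0 and v not in visited:
                visited.add(v)
                if v not in match_right or augment(match_right[v], visited):
                    match_right[v] = u
                    return True
        return False

    for u in left_nodes:
        augment(u, set())
    return match_right

def edge_color(requests, n):
    """Color the edges of an n-regular bipartite multigraph with n colors by
    repeatedly peeling off perfect matchings (each color = one central module)."""
    capacity = defaultdict(lambda: defaultdict(int))
    for u, v in requests:
        capacity[u][v] += 1
    left = sorted(capacity)
    colored = []
    for color in range(n):
        for v, u in maximum_matching(capacity, left).items():
            capacity[u][v] -= 1
            colored.append((u, v, color))
    return colored

# Module-level demand for the slide-28 permutation (input module = S // 2,
# output module = D // 2), a 2-regular multigraph, so 2 colors suffice.
destinations = [1, 3, 2, 0, 6, 4, 7, 5]
requests = [(s // 2, d // 2) for s, d in enumerate(destinations)]
for u, v, color in edge_color(requests, 2):
    print(f"input module {u} -> output module {v} via central module {color}")
```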

30
LDPC Codes
  • Low Density Parity Checking (Gallager 60)
  • Bipartite Graph Representation (Tanner 81)
  • Approaching Shannon Limit (Richardson 99)

VL: n variable nodes; VR: m constraint nodes; the code is closed under mod-2 addition

Variable assignment: x0 = 0, x1 = 1, x2 = 0, x3 = 0, x4 = 1, x5 = 1, x6 = 0, x7 = 1
  • x1 + x3 + x4 + x7 = 1 → unsatisfied
  • x0 + x1 + x2 + x5 = 0 → satisfied
  • x2 + x5 + x6 + x7 = 0 → satisfied
  • x0 + x3 + x4 + x6 = 1 → unsatisfied
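A tiny sketch evaluating the parity checks above; the assignment and check sets are copied from the figure, and the helper name is assumed:

```python
def check_parity(assignment, checks):
    """For each check (a list of variable indices), report whether the
    mod-2 sum of the involved variables is zero (check satisfied)."""
    return [sum(assignment[i] for i in check) % 2 == 0 for check in checks]

x = [0, 1, 0, 0, 1, 1, 0, 1]                  # x0 ... x7 from the figure
checks = [[1, 3, 4, 7], [0, 1, 2, 5], [2, 5, 6, 7], [0, 3, 4, 6]]
print(check_parity(x, checks))                # [False, True, True, False]
```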
31
Benes Network
Bipartite graph of call requests G(VL × VR, E)

[Figure: call requests between inputs 1-8 and outputs 1-8 of a Benes network]

Input module constraints: x1 + x2 = 1, x3 + x4 = 1, x5 + x6 = 1, x7 + x8 = 1
Output module constraints: x1 + x3 = 1, x6 + x8 = 1, x4 + x7 = 1, x2 + x5 = 1
(Not closed under mod-2 addition)
32
Flip Algorithm
  • Initially assign x1 = 0, x2 = 1, x3 = 0, x4 = 1, … to satisfy all input module constraints
  • Unsatisfied vertices divide each cycle into segments. Label the segments α and β alternately and flip the values of all variables in the α segments

[Figure: with the initial assignment x1 = 0, x2 = 1, x3 = 0, x4 = 1, x5 = 0, x6 = 1, x7 = 0, x8 = 1, all input module constraints hold (x1+x2 = x3+x4 = x5+x6 = x7+x8 = 1); the output module constraints x4+x7 = 1 and x2+x5 = 1 hold as well, while x1+x3 = 0 and x6+x8 = 0 are unsatisfied and mark the segment boundaries to be flipped]
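Each constraint above says that two variables must differ, so the constraint graph is a union of even cycles and any proper 2-coloring satisfies every constraint. The sketch below uses plain breadth-first 2-coloring rather than the Flip algorithm itself (a deliberate simplification; the constraint list is the slide-31 example):

```python
from collections import defaultdict, deque

def two_color(constraints):
    """constraints: pairs (a, b) meaning x_a + x_b = 1, i.e. x_a != x_b.
    Returns an assignment satisfying all constraints; the constraint graph of a
    Benes route-assignment instance is a union of even cycles, so this works."""
    graph = defaultdict(list)
    for a, b in constraints:
        graph[a].append(b)
        graph[b].append(a)
    value = {}
    for start in list(graph):
        if start in value:
            continue
        value[start] = 0
        queue = deque([start])
        while queue:
            u = queue.popleft()
            for v in graph[u]:
                if v not in value:
                    value[v] = 1 - value[u]
                    queue.append(v)
    return value

# Input and output module constraints from slide 31.
constraints = [(1, 2), (3, 4), (5, 6), (7, 8),      # input modules
               (1, 3), (6, 8), (4, 7), (2, 5)]      # output modules
assignment = two_color(constraints)
print(assignment)
print(all(assignment[a] != assignment[b] for a, b in constraints))   # True
```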
33
Bipartite Matching and Route Assignments
[Figure: call requests between inputs 1-8 and outputs 1-8, the resulting bipartite multigraph between input and output modules 1-4, and its matchings; bipartite matching and edge coloring give the route assignment]
34
Contents
  • Introduction
  • Routing and Channel Coding
  • Scheduling and Source Coding

35
Concept of Path Switching
  • Traffic signal at cross-road
  • Use predetermined conflict-free states in cyclic
    manner
  • The duration of each state in a cycle is
    determined by traffic loading
  • Distributed control

[Figure: cross-road with north-south traffic loading 2λ and east-west loading λ; each cycle alternates an NS phase and an EW phase with durations proportional to the loading]
36
Connection Matrix
[Figure: call requests between inputs and outputs grouped into three input modules and three output modules (0, 1, 2), and the corresponding connection matrix counting the requests between each module pair]
37
Path Switching of Clos Network
[Figure: path switching of the Clos network; the connection matrix is decomposed into connection patterns that are applied in time slot 1 and time slot 2]
38
Capacity of Virtual Path
  • Capacity equals average number of edges

[Figure: connection pattern G1 in time slot 0 and G2 in time slot 1 between three input and three output modules; the capacity of a virtual path is determined by G1 ∪ G2 averaged over the frame]
39
Contention-free Clos Network
  • Input module: input-queued switch; central module: nonblocking switch; output module: output-queued switch
  • Predetermined connection pattern in every time slot
  • Scheduling to combat channel noise; buffering to combat source noise

[Figure: source → input buffer and scheduler at module i → virtual path (rate λij) → buffer and scheduler at module j → destination]
40
Complexity Reduction of Permutation Space
  • Subspace spanned by K base states Pi
  • Convex hull of doubly stochastic matrices
  • Reduces the complexity of the permutation space from N! to K
  • K ≤ min{F, N² − 2N + 2}, the base dimension of C
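A minimal sketch of the Birkhoff-von Neumann construction behind this reduction, assuming the input matrix is doubly stochastic (plain Python, illustrative names; Hall's theorem guarantees that the perfect matching sought at every step exists):

```python
def birkhoff_von_neumann(C, tol=1e-9):
    """Decompose a doubly stochastic matrix C into a convex combination
    sum_k phi_k * P_k of permutation matrices.  Returns (phi_k, perm_k) pairs,
    where perm_k[i] is the column that permutation k assigns to row i."""
    N = len(C)
    C = [row[:] for row in C]          # work on a copy
    terms = []
    while True:
        match_col = [-1] * N           # match_col[j] = row matched to column j

        def augment(i, seen):
            for j in range(N):
                if C[i][j] > tol and j not in seen:
                    seen.add(j)
                    if match_col[j] == -1 or augment(match_col[j], seen):
                        match_col[j] = i
                        return True
            return False

        if sum(augment(i, set()) for i in range(N)) < N:
            break                      # residual matrix is (numerically) zero
        perm = [0] * N
        for j, i in enumerate(match_col):
            perm[i] = j
        phi = min(C[i][perm[i]] for i in range(N))
        for i in range(N):
            C[i][perm[i]] -= phi
        terms.append((phi, perm))
    return terms

# A 3x3 doubly stochastic example (rows and columns all sum to 1).
C = [[0.5, 0.3, 0.2],
     [0.2, 0.5, 0.3],
     [0.3, 0.2, 0.5]]
for phi, perm in birkhoff_von_neumann(C):
    print(f"phi = {phi:.2f}  permutation {perm}")
```

Running the sketch on this example yields five permutation matrices, within the bound N² − 2N + 2 = 5 for N = 3.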
41
BvN Capacity Decomposition and Sampling Theorems
Packet switching ↔ Digital transmission
  • Network environment: time-slotted switching system ↔ time-slotted transmission system
  • Bandwidth limitation: capacity-limited traffic matrix ↔ bandwidth-limited signal function
  • Samples: complete matchings, (0,1) permutation matrices ↔ entropy, (0,1) binary sequences
  • Expansion: Birkhoff decomposition (Hall's matching theorem) ↔ Fourier series
42
BvN Capacity Decomposition and Sampling Theorems
Packet switching ↔ Digital transmission
  • Inversion by weighted sum of samples: reconstruction of the capacity by running sum ↔ reconstruction of the signal by interpolation
  • Complexity reduction: reduce the number of permutations from N! to O(N²), to O(N) if bandwidth is limited, or to a constant F if a truncation error of order O(1/F) is acceptable ↔ reduce the infinite-dimensional signal space to 2tW dimensions in any duration t
  • QoS: buffering and scheduling, capacity guarantee, delay bound ↔ pulse code modulation (PCM), error-correcting codes, data compression, DSP
43
Source Coding and Scheduling
  • Source coding: a mapping from the code book to source symbols to reduce redundancy
  • Scheduling: a mapping from predetermined connection patterns to incoming packets to reduce delay jitter

44
Smoothness of Scheduling
  • Scheduling of the set of permutation matrices generated by the decomposition, with frame size F
  • The sequence t1^i, t2^i, …, t_{ni}^i of inter-state distances of state Pi within a period of F satisfies t1^i + t2^i + … + t_{ni}^i = F
  • Smoothness of state Pi

[Figure: the appearances of state Pi within a frame of size F and the inter-state distances between them]
45
Entropy of Decomposition and Smoothness of
Scheduling
  • Any scheduling of the capacity decomposition
  • Entropy inequality (Kraft's inequality)
  • The equality holds when
46
Smoothness of Scheduling
  • A special case: if K = F, φi = 1/F, and ni = 1 for all i, then … for all i = 1, …, F
  • Another example

[Figure: smoothness of the input set and the expected optimal scheduling result]
47
Optimal Smoothness of Scheduling
  • Smoothness of random scheduling
  • The Kullback-Leibler distance reaches its maximum when
  • It is always possible to devise a scheduling within 1/2 of the entropy

48
Source Coding Theorem
  • Necessary and sufficient condition to prefix encode values x1, x2, …, xN of X with respective lengths n1, n2, …, nN: Σi 2^(−ni) ≤ 1 (Kraft's inequality)
  • Any prefix code that assigns ni bits to xi has average length at least the entropy H(X)
  • It is always possible to devise a prefix code within 1 of the entropy
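A small sketch, using the weights of the Huffman tree on the next slide, that builds a Huffman code, checks Kraft's inequality, and compares the average codeword length with the entropy (helper name assumed, not from the talk):

```python
import heapq
import math

def huffman_code_lengths(probs):
    """Return the Huffman codeword length for each symbol probability."""
    heap = [(p, [i]) for i, p in enumerate(probs)]    # (weight, symbols in subtree)
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, s1 = heapq.heappop(heap)
        p2, s2 = heapq.heappop(heap)
        for i in s1 + s2:                             # each merge adds one bit
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, s1 + s2))
    return lengths

probs = [0.5, 0.125, 0.125, 0.125, 0.125]             # P1 ... P5 from the next slide
lengths = huffman_code_lengths(probs)
entropy = -sum(p * math.log2(p) for p in probs)
average = sum(p * n for p, n in zip(probs, lengths))
print("code lengths:", lengths)                       # [1, 3, 3, 3, 3]
print("Kraft sum   :", sum(2.0 ** -n for n in lengths))        # 1.0 (<= 1)
print(f"entropy {entropy:.3f}  average length {average:.3f}")  # both 2.000
```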
49
Huffman Round Robin (HuRR) Algorithm
Step 1: Initially set the root to be the temporary node Px, and let S = Px…Px be the temporary sequence.
Step 2: Apply WFQ to the two successors of Px to produce a sequence T, and substitute T for the subsequence Px…Px of S.
Step 3: If there is no intermediate node left in the sequence S, terminate the algorithm. Otherwise select an intermediate node Px appearing in S and go to Step 2.
[Figure: Huffman tree over the states P1, …, P5 with weights 0.5, 0.125, 0.125, 0.125, 0.125 and intermediate nodes PX, PY, PZ; the logarithm of the inter-state time corresponds to the length of the Huffman codeword]
50
Performance of Scheduling Algorithms
P1 P2 P3 P4 Random WFQ WF2Q HuRR Entropy
0.1 0.1 0.1 0.7 1.628 1.575 1.414 1.414 1.357
0.1 0.1 0.2 0.6 1.894 1.734 1.626 1.604 1.571
0.1 0.1 0.3 0.5 2.040 1.784 1.724 1.702 1.686
0.1 0.2 0.2 0.5 2.123 1.882 1.801 1.772 1.761
0.1 0.1 0.4 0.4 2.086 1.787 1.745 1.745 1.722
0.1 0.2 0.3 0.4 2.229 1.903 1.903 1.884 1.847
0.2 0.2 0.2 0.4 2.312 2.011 1.980 1.933 1.922
0.1 0.3 0.3 0.3 2.286 1.908 1.908 1.908 1.896
0.2 0.2 0.3 0.3 2.370 2.016 2.016 1.980 1.971
Better Performance
51
Routing vs. Coding
Transmission channel ↔ Clos network
  • Noisy channel capacity theorem ↔ Random routing
  • Noisy channel coding theorem ↔ Deflection routing
  • Error-correcting code ↔ Route assignment
  • Sampling theorem ↔ BvN decomposition
  • Noiseless channel ↔ Path switching
  • Noiseless coding theorem ↔ Scheduling

52
Transmission-Switching Duality
Switching: Clos network, permutation matrix, route assignment, Hall's matching theorem (BvN decomposition), scheduling and buffering
Transmission: communication system, noisy channel, channel coding, bandlimited sampling theorem, source coding
Common foundation: Boltzmann equation S = k log W, entropy
53
(No Transcript)
54
Law of Probability
  • The input signal to a transmission channel is a function of time
  • The main theorem on noisy channel coding is proved by the law of large numbers
  • The input signal to a switch is a function of space
  • The theorems on deflection routing and on the smoothness of scheduling are both proved by probabilistic arguments

55
(No Transcript)
56
(No Transcript)
57
  • Thank You!