Quantity of Information (noiseless system)
1
A review of important points from Part I
  • Quantity of Information (noiseless system)

a) It depends on the probability of the event. b) It
depends on the length of the message. For a single
event of probability $p$, the quantity of information is

$I = \log_2(1/p) = -\log_2 p$ bits

where $p$ is the probability of the event.
  • Average Information - Entropy

For a source producing many symbols with
probabilities $p_1, p_2, p_3,$ etc., the average
information per symbol (the entropy) is

$H = -\sum_i p_i \log_2 p_i$ bits/symbol
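As an illustration of the two definitions above, here is a minimal Python sketch (not from the original slides); the five-symbol distribution is the one used in the coding examples later in the deck:

```python
import math

def information(p):
    """Quantity of information of an event with probability p, in bits."""
    return -math.log2(p)

def entropy(probs):
    """Average information H = -sum(p * log2 p), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(information(0.5))                     # 1.0 bit
print(entropy([0.5, 0.2, 0.1, 0.1, 0.1]))   # ~1.96 bits/symbol
```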
2
Maximum entropy

Entropy is greatest when all $n$ symbols are equally
probable: $H_{max} = \log_2 n$. For a binary source,
$H = -p \log_2 p - (1-p) \log_2 (1-p)$, which reaches
its maximum of 1 bit at $p = 0.5$.
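A quick numerical check of the binary case (a sketch using only the standard definition):

```python
import math

def binary_entropy(p):
    """H = -p log2 p - (1-p) log2 (1-p), in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

for p in (0.1, 0.3, 0.5, 0.7, 0.9):
    print(f"p = {p}: H = {binary_entropy(p):.3f}")
# H peaks at 1.000 bit when p = 0.5
```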
3
  • Redundancy

$R = 1 - H/H_{max}$

  • Conditional entropy $H(j|i)$

If there is intersymbol influence, the average
information is given by

$H(j|i) = -\sum_i \sum_j p(i,j) \log_2 p(j|i)$

where $p(j|i)$ is the conditional probability (the
probability of $j$ given $i$) and $p(i,j)$ is the
joint probability.
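To make the two probabilities concrete, here is a small sketch that computes $H(j|i)$ from a joint probability table; the two-symbol source below is a made-up example, not from the slides:

```python
import math

# joint[(i, j)] = p(i, j): a hypothetical source with intersymbol influence
joint = {("a", "a"): 0.4, ("a", "b"): 0.1,
         ("b", "a"): 0.1, ("b", "b"): 0.4}

# p(i), found by summing the joint probabilities over j
p_i = {}
for (i, j), p in joint.items():
    p_i[i] = p_i.get(i, 0.0) + p

# H(j|i) = -sum_{i,j} p(i,j) log2 p(j|i), with p(j|i) = p(i,j) / p(i)
h_cond = -sum(p * math.log2(p / p_i[i]) for (i, j), p in joint.items() if p > 0)
print(f"H(j|i) = {h_cond:.3f} bits")  # 0.722, below the 1 bit of an independent source
```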
4
  • Coding in a noiseless channel - source coding

(Speed of transmission is the main consideration.)
  • Important properties of codes
  1. Uniquely decodable (all combinations of code
    words are distinct)
  2. Instantaneous (no code word is a prefix of
    another; see the sketch after this list)
  3. Compact (shorter code words are given to more
    probable symbols)
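Property 2 can be tested mechanically, as in this sketch (the code tables are only examples):

```python
def is_instantaneous(codes):
    """True if no code word is a prefix of another (prefix-free)."""
    words = sorted(codes)  # in sorted order, a prefix lands right before its extensions
    return all(not b.startswith(a) for a, b in zip(words, words[1:]))

print(is_instantaneous(["0", "100", "101", "110", "111"]))  # True
print(is_instantaneous(["0", "01", "11"]))                  # False: "0" prefixes "01"
```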

5
Important parameters

Average code word length: $L = \sum_i p_i l_i$, where
$l_i$ is the length (in binary digits) of the code
word for symbol $i$. Coding efficiency: $E = H/L$.
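A sketch computing both parameters for the five-symbol source coded on the following slides (the code word lengths are taken from the Fano-Shannon example there):

```python
import math

probs   = [0.5, 0.2, 0.1, 0.1, 0.1]
lengths = [1, 3, 3, 3, 3]   # code word lengths from the Fano-Shannon example

L = sum(p * l for p, l in zip(probs, lengths))
H = -sum(p * math.log2(p) for p in probs)
print(f"L = {L:.2f} digits/symbol, H = {H:.2f} bits, E = H/L = {H / L:.2f}")
# L = 2.00, H = 1.96, E = 0.98
```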
  • Coding methods
  • Fano-Shannon method
  • Huffman's Method

6
  • Coding methods
  • Fano-Shannon method

1. Write the symbols in a table in descending
order of probability.
2. Insert dividing lines to successively divide
the probabilities into halves, quarters, etc.
(or as near as possible).
3. At each division, add a 0 to the code words of
one group and a 1 to those of the other.
4. The final code for each symbol is obtained by
reading the digits from the first division towards
the symbol. (A sketch implementing the procedure
follows these steps.)
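Here is a compact recursive sketch of the procedure. Splitting "as near as possible" to halves is implemented as minimising the difference between the two groups, with ties broken toward the later dividing line so that the output matches the worked example on the next slides; both choices are interpretations, not part of the original method statement:

```python
def fano_shannon(symbols):
    """symbols: list of (name, probability) in descending order of probability.
    Returns a dict {name: binary code string}."""
    codes = {name: "" for name, _ in symbols}

    def split(group):
        if len(group) < 2:
            return
        total = sum(p for _, p in group)
        running, best_k, best_diff = 0.0, 1, float("inf")
        for k in range(1, len(group)):       # try every possible dividing line
            running += group[k - 1][1]
            diff = abs(2 * running - total)  # |upper group - lower group|
            if diff <= best_diff:            # ties go to the later line
                best_diff, best_k = diff, k
        for name, _ in group[:best_k]:       # 0 above the line...
            codes[name] += "0"
        for name, _ in group[best_k:]:       # ...1 below it
            codes[name] += "1"
        split(group[:best_k])
        split(group[best_k:])

    split(symbols)
    return codes

print(fano_shannon([("s1", 0.5), ("s2", 0.2), ("s3", 0.1), ("s4", 0.1), ("s5", 0.1)]))
# {'s1': '0', 's2': '100', 's3': '101', 's4': '110', 's5': '111'}
```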
7
(No Transcript)
8
Symbol   Probability   Code
s1       0.5           0
s2       0.2           100
s3       0.1           101
s4       0.1           110
s5       0.1           111

$L = 0.5(1) + 0.2(3) + 3(0.1)(3) = 2.0$, $H = 1.96$, $E = H/L = 0.98$
9
  • Coding methods
  • Huffman's Method

1. Write the symbols in a table in descending
order of probability.
2. Add the two lowest probabilities, insert the
sum into the table, and reorder.
3. Place a 0 or a 1 at each branch.
4. The final code for each symbol is obtained by
reading back from the final column towards the
symbol. (A sketch follows; the combination stages
are tabulated on the next slides.)
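A minimal heap-based sketch of the procedure. Tie-breaking among equal probabilities can yield a different, equally optimal code than the one tabulated on the next slides; the average length comes out the same:

```python
import heapq
from itertools import count

def huffman(probs):
    """probs: dict {symbol: probability}. Returns {symbol: binary code}."""
    tick = count()  # unique tie-breaker so heapq never compares the dicts
    heap = [(p, next(tick), {s: ""}) for s, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)  # the two least probable groups...
        p2, _, c2 = heapq.heappop(heap)  # ...are combined, as in step 2
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (p1 + p2, next(tick), merged))
    return heap[0][2]

probs = {"S1": 0.5, "S2": 0.2, "S3": 0.1, "S4": 0.1, "S5": 0.1}
codes = huffman(probs)
print(codes)
print("L =", sum(p * len(codes[s]) for s, p in probs.items()))  # 2.0, as on the slides
```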

10
Stage 1:  S5 0.1   S4 0.1   S3 0.1   S2 0.2   S1 0.5
Stage 2:  S3 0.1   (S5,S4) 0.2   S2 0.2   S1 0.5
11
Stage 3:  S2 0.2   (S3,S5,S4) 0.3   S1 0.5
Stage 4:  (S2,S3,S5,S4) 0.5   S1 0.5
12
Codes: S1 = 0, S2 = 11, S3 = 101, S4 = 1000, S5 = 1001
13
$L = 0.5(1) + 0.2(2) + 0.1(3) + 2(0.1)(4) = 2.0$, $H = 1.96$, $E = H/L = 0.98$
14
  • Shannon's first theorem

Shannon proved formally that if the source
symbols are coded in groups of n, then the
average length per symbol tends to the source
entropy H as n tends to infinity. Consequently,
a further increase in efficiency can be obtained
by grouping the source symbols (in pairs, threes,
etc.) and applying the coding procedure to the
probabilities of the groups, as the sketch below
illustrates.
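The effect of grouping can be shown numerically. This sketch uses Shannon code lengths $l = \lceil \log_2(1/p) \rceil$ for each group, a simple stand-in for the Fano-Shannon procedure (an assumption, not the deck's method), applied to the A, B, C source of the next slide:

```python
import math
from itertools import product

probs = {"A": 0.8, "B": 0.15, "C": 0.05}
H = -sum(p * math.log2(p) for p in probs.values())

for n in (1, 2, 3, 4):
    L_n = 0.0
    for group in product(probs, repeat=n):      # every n-symbol group
        p = math.prod(probs[s] for s in group)  # group probability
        L_n += p * math.ceil(-math.log2(p))     # Shannon code length for the group
    print(f"n = {n}: average length per symbol = {L_n / n:.3f}  (H = {H:.3f})")
```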
  • Matching source to channel

The coding process is sometimes known as
matching the source to the channel, that is,
making the output of the coder as suitable as
possible for the channel.
15
Example: An information source produces a long
sequence of three independent symbols A, B, C
with probabilities 16/20, 3/20 and 1/20
respectively; 100 such symbols are produced per
second. The information is to be transmitted via
a noiseless binary channel which can transmit up
to 100 binary digits per second. Design a
suitable compact instantaneous code and find the
probabilities of the binary digits produced.

source --(100 symbol/s)--> coder --(0, 1)--> channel --> decoder

$P(A) = 16/20$, $P(B) = 3/20$, $P(C) = 1/20$
Coding singly, using the Fano-Shannon method:

Symbol   Probability   Code
A        16/20         0
B        3/20          10
C        1/20          11

Per 20 source symbols the coder emits $16(1) + 3(2) + 1(2) = 24$ digits, of which 19 are 0s, so
$P(0) = 19/24 \approx 0.79$, $P(1) = 5/24 \approx 0.21$
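A sketch verifying the digit probabilities for this single-symbol code by counting expected 0s and 1s per transmitted digit:

```python
from fractions import Fraction

# (probability, code) taken from the table above
code = {"A": (Fraction(16, 20), "0"),
        "B": (Fraction(3, 20), "10"),
        "C": (Fraction(1, 20), "11")}

digits = sum(p * len(c) for p, c in code.values())        # expected digits per symbol
zeros  = sum(p * c.count("0") for p, c in code.values())  # expected 0s per symbol
print(f"L    = {digits} = {float(digits)} digits/symbol")                # 6/5 = 1.2
print(f"P(0) = {zeros / digits} = {float(zeros / digits):.2f}")          # 19/24 = 0.79
print(f"P(1) = {1 - zeros / digits} = {float(1 - zeros / digits):.2f}")  # 5/24 = 0.21
```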
16
Coding in pairs:

Pair   Probability   Code
AA     0.64          0
AB     0.12          10
BA     0.12          110
AC     0.04          11100
CA     0.04          11101
BB     0.0225        11110
BC     0.0075        111110
CB     0.0075        1111110
CC     0.0025        1111111

$L = 1.8675$ digits per pair, so the digit rate is
$R = 50 \times 1.8675 \approx 93.4$ binary digits/s,
within the channel's 100 digits/s. $P(0) \approx 0.556$,
$P(1) \approx 0.444$. The entropy of the output stream is
$-(P(0)\log_2 P(0) + P(1)\log_2 P(1)) \approx 0.99$ bits,
close to the maximum value of 1 bit (attained when
$P(0) = P(1) = 0.5$).
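And the same digit-count check applied to the pair code above:

```python
import math

# (probability, code) from the pair table above
pair_code = {"AA": (0.64,   "0"),       "AB": (0.12,   "10"),
             "BA": (0.12,   "110"),     "AC": (0.04,   "11100"),
             "CA": (0.04,   "11101"),   "BB": (0.0225, "11110"),
             "BC": (0.0075, "111110"),  "CB": (0.0075, "1111110"),
             "CC": (0.0025, "1111111")}

L = sum(p * len(c) for p, c in pair_code.values())            # digits per pair
zeros = sum(p * c.count("0") for p, c in pair_code.values())  # expected 0s per pair
p0 = zeros / L
h_out = -(p0 * math.log2(p0) + (1 - p0) * math.log2(1 - p0))
print(f"L = {L:.4f} digits/pair, rate = {50 * L:.3f} digits/s")  # 1.8675, 93.375
print(f"P(0) = {p0:.3f}, P(1) = {1 - p0:.3f}, output entropy = {h_out:.3f} bits")
```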