1
Information Theory
Ch2 Basic Concepts
  • Instructor
  • Email: cychen07@nuk.edu.tw
  • Office: 401
  • Web: http://www.csie.nuk.edu.tw/cychen/

2
Ch2 Basic Concepts
2.1 Self-information
Let S be a system of events $E_1, E_2, \ldots, E_n$,
in which the event $E_k$ occurs with probability $p_k$, where $p_k \ge 0$ and $\sum_{k=1}^{n} p_k = 1$.
Def: The self-information of the event $E_k$ is written $I(E_k)$:
$$I(E_k) = -\log p_k$$
The base of the logarithm: 2 (log) or e (ln).
Units: bit (base 2), nat (base e).
3
Ch2 Basic Concepts
2.1 Self-information
  • $I(E_k) = 0$ when $p_k = 1$
  • $I(E_k) \to \infty$ when $p_k \to 0$
  • $I(E_k) = 1$ bit when $p_k = 1/2$
  • $I(E_j) > I(E_k)$ when $p_j < p_k$
The less probable an event is, the more self-information its occurrence carries.
4
Ch2 Basic Concepts
2.1 Self-information
Ex1. A letter is chosen at random from the English alphabet:
$$I = -\log \frac{1}{26} = \log 26 \approx 4.70 \text{ bits}$$
Ex2. A binary number of m digits is chosen at random:
$$I = -\log 2^{-m} = m \text{ bits}$$
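A quick numeric check of these two examples (a minimal sketch in Python; the helper name is ours, not the course's):

```python
import math

def self_information(p: float, base: float = 2) -> float:
    """Self-information I(E) = -log(p) of an event with probability p."""
    return -math.log(p, base)

# Ex1: one letter out of 26, chosen uniformly at random
print(self_information(1 / 26))   # ~4.70 bits

# Ex2: one m-digit binary number out of 2^m equally likely values
m = 8
print(self_information(2 ** -m))  # 8.0 bits
```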
5
Ch2 Basic Concepts
2.1 Self-information
Ex3. 64 points are arranged in a square grid (8 × 8).
Let E_j be the event that a point picked at random is in the jth column:
$$I(E_j) = -\log \frac{1}{8} = 3 \text{ bits}$$
Let E_k be the event that a point picked at random is in the kth row:
$$I(E_k) = -\log \frac{1}{8} = 3 \text{ bits}$$
$$I(E_j \cap E_k) = -\log \frac{1}{64} = 6 \text{ bits} = I(E_j) + I(E_k)$$
Why? The column and row of a uniformly chosen point are independent events, so their probabilities multiply and their self-informations add.
6
Ch2 Basic Concepts
2.2 Entropy
Let f be a function on the events, $f: E_k \mapsto f_k$, and let E(f) be the expectation (average, mean) of f:
$$E(f) = \sum_{k=1}^{n} p_k f_k$$
Let S be the system with events $E_1, \ldots, E_n$, the associated probabilities being $p_1, \ldots, p_n$.
7
Ch2 Basic Concepts
2.2 Entropy
Def: The entropy of S, called H(S), is the average of the self-information:
$$H(S) = E(I) = -\sum_{k=1}^{n} p_k \log p_k$$
Self-information of an event increases as its uncertainty grows; entropy is the expected self-information over the whole system.
Let some $p_k = 1$ (certainty): then $H(S) = 0$.
An entropy of 0 means the outcome is certain; the system carries no uncertainty at all.
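A minimal sketch of this definition in Python (the helper name is ours):

```python
import math

def entropy(probs, base: float = 2) -> float:
    """H(S) = -sum p_k log p_k; terms with p_k = 0 contribute nothing."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # 1.0 bit: two equally likely events
print(entropy([1.0, 0.0]))   # 0.0: a certain outcome carries no information
print(entropy([0.25] * 4))   # 2.0 bits: uniform over four events
```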
8
Ch2 Basic Concepts
2.2 Entropy
Thm:
$$H(S) \le \log n,$$
with equality only when $p_k = 1/n$ for all k.
Proof:
9
Ch2 Basic Concepts
2.2 Entropy
Thm 2.2: For x > 0,
$$\ln x \le x - 1,$$
with equality only when x = 1.
Assume that $p_k \ne 0$ for all k.
10
Ch2 Basic Concepts
2.2 Entropy
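The bound $H(S) \le \log n$ follows from Thm 2.2 applied with $x = 1/(n p_k)$; a sketch of the standard argument (our reconstruction):

```latex
\begin{align*}
H(S) - \log n
  &= \sum_{k=1}^{n} p_k \log \frac{1}{n p_k}
   = \frac{1}{\ln 2} \sum_{k=1}^{n} p_k \ln \frac{1}{n p_k} \\
  &\le \frac{1}{\ln 2} \sum_{k=1}^{n} p_k \left( \frac{1}{n p_k} - 1 \right)
   = \frac{1}{\ln 2} \left( \sum_{k=1}^{n} \frac{1}{n} - \sum_{k=1}^{n} p_k \right)
   = 0,
\end{align*}
with equality only when $1/(n p_k) = 1$ for every $k$, i.e. $p_k = 1/n$.
```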
11
Ch2 Basic Concepts
2.2 Entropy
Exercise: In the system S, the probabilities p1 and p2, where p2 > p1, are replaced by p1 + e and p2 - e respectively, under the proviso 0 < 2e < p2 - p1. Prove that H(S) is increased.
We know that entropy H(S) can be viewed as a measure of _____ about S. Please list 3 items for this blank:
  • information
  • uncertainty
  • randomness
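A quick numeric check of the exercise's claim, with a perturbation satisfying the proviso (the specific values are ours, chosen for illustration):

```python
import math

def entropy(probs, base: float = 2) -> float:
    return -sum(p * math.log(p, base) for p in probs if p > 0)

p1, p2, e = 0.1, 0.6, 0.2              # proviso: 0 < 2e = 0.4 < p2 - p1 = 0.5
before = entropy([p1, p2, 0.3])        # remaining mass 0.3 is untouched
after = entropy([p1 + e, p2 - e, 0.3])
print(before, after, after > before)   # True: evening out p1 and p2 raises H(S)
```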
12
Ch2 Basic Concepts
2.3 Mutual information
Let S1 be the system with events $E_1, E_2, \ldots, E_n$,
the associated probabilities being $p_1, p_2, \ldots, p_n$.
Let S2 be the system with events $F_1, F_2, \ldots, F_m$,
the associated probabilities being $q_1, q_2, \ldots, q_m$.
13
Ch2 Basic Concepts
2.3 Mutual information
Two systems S1 and S2 observed jointly have joint probabilities
$$p_{jk} = P(E_j \cap F_k),$$
satisfying the normalization relation
$$\sum_{j=1}^{n} \sum_{k=1}^{m} p_{jk} = 1.$$
14
Ch2 Basic Concepts
2.3 Mutual information
The joint probabilities are related to the marginal probabilities by
$$p_j = \sum_{k=1}^{m} p_{jk}, \qquad q_k = \sum_{j=1}^{n} p_{jk}.$$
15
Ch2 Basic Concepts
2.3 Mutual information
conditional probability:
$$P(F_k \mid E_j) = \frac{p_{jk}}{p_j}$$
conditional self-information:
$$I(F_k \mid E_j) = -\log P(F_k \mid E_j)$$
mutual information:
$$I(E_j; F_k) = \log \frac{p_{jk}}{p_j q_k}$$
NOTE: $I(E_j; F_k) = I(F_k; E_j)$; the definition is symmetric in the two events.
16
Ch2 Basic Concepts
2.3 Mutual information
conditional entropy:
$$H(S_2 \mid S_1) = -\sum_{j=1}^{n} \sum_{k=1}^{m} p_{jk} \log P(F_k \mid E_j)$$
mutual information of the two systems:
$$I(S_1; S_2) = \sum_{j=1}^{n} \sum_{k=1}^{m} p_{jk} \log \frac{p_{jk}}{p_j q_k}$$
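A minimal sketch of the mutual-information sum in Python (the joint-matrix representation and helper name are ours):

```python
import math

def mutual_information(joint, base: float = 2) -> float:
    """I(S1;S2) = sum_jk p_jk * log( p_jk / (p_j * q_k) ) for a joint matrix."""
    p = [sum(row) for row in joint]          # marginals p_j of S1
    q = [sum(col) for col in zip(*joint)]    # marginals q_k of S2
    return sum(
        pjk * math.log(pjk / (p[j] * q[k]), base)
        for j, row in enumerate(joint)
        for k, pjk in enumerate(row)
        if pjk > 0
    )

print(mutual_information([[0.4, 0.1], [0.1, 0.4]]))      # > 0: dependent systems
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0: independent
```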
17
Ch2 Basic Concepts
2.3 Mutual information
conditional self-information and mutual information:
$$I(E_j; F_k) = I(F_k) - I(F_k \mid E_j) = I(E_j) - I(E_j \mid F_k)$$
If E_j and F_k are statistically independent, $p_{jk} = p_j q_k$, so
$$I(E_j; F_k) = 0.$$
18
Ch2 Basic Concepts
2.3 Mutual information
joint entropy:
$$H(S_1, S_2) = -\sum_{j=1}^{n} \sum_{k=1}^{m} p_{jk} \log p_{jk}$$
joint entropy and conditional entropy:
$$H(S_1, S_2) = H(S_1) + H(S_2 \mid S_1) = H(S_2) + H(S_1 \mid S_2)$$
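The second identity follows by writing $p_{jk} = p_j \, P(F_k \mid E_j)$ inside the logarithm; a short derivation sketch:

```latex
\begin{align*}
H(S_1, S_2) &= -\sum_{j,k} p_{jk} \log p_{jk}
             = -\sum_{j,k} p_{jk} \log \left( p_j \, P(F_k \mid E_j) \right) \\
            &= -\sum_{j} p_j \log p_j - \sum_{j,k} p_{jk} \log P(F_k \mid E_j)
             = H(S_1) + H(S_2 \mid S_1).
\end{align*}
```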
19
Ch2 Basic Concepts
2.3 Mutual information
mutual information and conditional entropy:
$$I(S_1; S_2) = H(S_2) - H(S_2 \mid S_1) = H(S_1) - H(S_1 \mid S_2) = H(S_1) + H(S_2) - H(S_1, S_2)$$
20
Ch2 Basic Concepts
2.3 Mutual information
Thm:
$$I(S_1; S_2) \le H(S_1) + H(S_2)$$
The mutual information of two systems cannot exceed the sum of their separate entropies.
21
Ch2 Basic Concepts
2.3 Mutual information
Systems independent: If S1 and S2 are statistically independent, then $p_{jk} = p_j q_k$ for all j, k, and
$$H(S_1, S_2) = H(S_1) + H(S_2).$$
The joint entropy of two statistically independent systems is the sum of their separate entropies.
22
Ch2 Basic Concepts
2.3 Mutual information
Thm:
$$I(S_1; S_2) \ge 0,$$
with equality only if S1 and S2 are statistically independent.
Proof: Assume that $p_{jk} \ne 0$.
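A sketch of the standard completion via Thm 2.2, taking $x = p_j q_k / p_{jk}$ (our reconstruction):

```latex
\begin{align*}
-I(S_1; S_2)\,\ln 2
  &= \sum_{j,k} p_{jk} \ln \frac{p_j q_k}{p_{jk}}
   \le \sum_{j,k} p_{jk} \left( \frac{p_j q_k}{p_{jk}} - 1 \right)
   = \sum_{j,k} p_j q_k - \sum_{j,k} p_{jk} = 1 - 1 = 0,
\end{align*}
so $I(S_1; S_2) \ge 0$, with equality only when $p_{jk} = p_j q_k$ for all $j, k$,
i.e. when S1 and S2 are statistically independent.
```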
23
Ch2 Basic Concepts
2.3 Mutual information
Thm:
$$H(S_1, S_2) \le H(S_1) + H(S_2),$$
with equality only if S1 and S2 are statistically independent.
Proof: This restates the previous theorem, since $I(S_1; S_2) = H(S_1) + H(S_2) - H(S_1, S_2)$.
24
Ch2 Basic Concepts
2.3 Mutual information
Ex: A binary symmetric channel with crossover probability e.
Let S1 be the input {E0 = 0, E1 = 1} and S2 be the output {F0 = 0, F1 = 1}.
25
Ch2 Basic Concepts
2.3 Mutual information
Assume that the input probabilities are $P(E_0) = p$ and $P(E_1) = 1 - p$.
Then the transition probabilities of the channel are
$$P(F_0 \mid E_0) = P(F_1 \mid E_1) = 1 - e, \qquad P(F_1 \mid E_0) = P(F_0 \mid E_1) = e.$$
26
Ch2 Basic Concepts
2.3 Mutual information
Compute the output probabilities. Then
$$q_0 = P(F_0) = p(1 - e) + (1 - p)e, \qquad q_1 = P(F_1) = pe + (1 - p)(1 - e).$$
If p = 1/2, then
$$q_0 = q_1 = 1/2.$$
27
Ch2 Basic Concepts
2.3 Mutual information
Compute the mutual information between individual input and output events, for example
$$I(E_0; F_0) = \log \frac{P(F_0 \mid E_0)}{P(F_0)} = \log \frac{1 - e}{q_0}.$$
28
Ch2 Basic Concepts
2.3 Mutual information
Compute the mutual information of the two systems. With p = 1/2,
$$I(S_1; S_2) = H(S_2) - H(S_2 \mid S_1) = 1 - H(e), \quad \text{where } H(e) = -e \log e - (1 - e)\log(1 - e).$$
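A numeric sketch of the whole BSC example in Python (helper names are ours; p = 1/2 reproduces the 1 - H(e) result above):

```python
import math

def h2(e: float) -> float:
    """Binary entropy H(e) in bits; H(0) = H(1) = 0."""
    return -sum(p * math.log2(p) for p in (e, 1 - e) if p > 0)

def bsc_mutual_information(e: float, p: float = 0.5) -> float:
    """I(S1;S2) for a BSC with crossover probability e and input P(E0) = p."""
    joint = [[p * (1 - e), p * e],              # rows: inputs E0, E1
             [(1 - p) * e, (1 - p) * (1 - e)]]  # cols: outputs F0, F1
    q = [sum(col) for col in zip(*joint)]       # output probabilities q_k
    total = 0.0
    for row in joint:
        pj = sum(row)                           # input probability p_j
        for k, pjk in enumerate(row):
            if pjk > 0:
                total += pjk * math.log2(pjk / (pj * q[k]))
    return total

print(bsc_mutual_information(0.1))   # ~0.531 bits
print(1 - h2(0.1))                   # matches: I = 1 - H(e) for p = 1/2
print(bsc_mutual_information(0.5))   # 0.0: output tells nothing about input
```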
29
Ch2 Basic Concepts
2.3 Mutual information
Ex: The following messages may be sent over a binary symmetric channel with crossover probability e, and they are equally probable at the input. What is the mutual information between M1 and the first output digit being 0? What additional mutual information is conveyed by the knowledge that the second output digit is also 0?
30
Ch2 Basic Concepts
2.3 Mutual information
For the output 00:
The extra mutual information:
31
Ch2 Basic Concepts
2.4 Data processing theorem
Data processing theorem:
If S1 and S3 are statistically independent when conditioned on S2, then
$$I(S_1; S_3) \le I(S_1; S_2) \quad \text{and} \quad I(S_1; S_3) \le I(S_2; S_3).$$
convexity theorem
32
Ch2 Basic Concepts
2.4 Data processing theorem
Data processing theorem:
If S1 and S3 are statistically independent when conditioned on S2, then
$$I(S_1; S_3) \le I(S_1; S_2).$$
Proof:
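A sketch of the standard chain-rule argument (our reconstruction; the hypothesis gives $I(S_1; S_3 \mid S_2) = 0$):

```latex
% Expand I(S1; (S2, S3)) by the chain rule in two different orders:
\begin{align*}
I(S_1; S_2, S_3) &= I(S_1; S_2) + I(S_1; S_3 \mid S_2) = I(S_1; S_2), \\
I(S_1; S_2, S_3) &= I(S_1; S_3) + I(S_1; S_2 \mid S_3) \ge I(S_1; S_3),
\end{align*}
% Combining the two expansions gives I(S1; S3) <= I(S1; S2);
% the bound I(S1; S3) <= I(S2; S3) follows symmetrically.
```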
33
Ch2 Basic Concepts
2.5 Uniqueness theorem
Def: Let
$$f(p_1, p_2, \ldots, p_n)$$
be a continuous function of its arguments, in which $p_k \ge 0$ and $\sum_{k} p_k = 1$, satisfying:
(a) f takes its largest value at $p_k = 1/n$
(b) f is unaltered if an impossible event is added to the system
(c) f is consistent under composition of systems (the grouping axiom): $f(S_1, S_2) = f(S_1) + \sum_{j} p_j \, f\big(P(F_1 \mid E_j), \ldots, P(F_m \mid E_j)\big)$
34
Ch2 Basic Concepts
2.5 Uniqueness theorem
Uniqueness theorem: Any f satisfying (a)-(c) must be of the form
$$f(p_1, \ldots, p_n) = -C \sum_{k=1}^{n} p_k \log p_k$$
for a positive constant C.
Proof: