Basic Concepts of Information Theory

1
Basic Concepts of Information Theory
  • Entropy for Two-dimensional Discrete Finite
    Probability Schemes.
  • Conditional Entropy.
  • Communication Network.
  • Noise Characteristics of a Communication Channel.

2
Entropy. Basic Properties
  • Continuity: if the probabilities of occurrence of
    the events are changed slightly, the entropy
    changes only slightly as well.
  • Symmetry: the entropy does not depend on the order
    in which the probabilities of the events are listed.
  • Extremal property: when all the events are equally
    likely, the average uncertainty (entropy) takes its
    largest value (see the sketch below).
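    A compact sketch of these properties in standard notation (the
    usual Shannon entropy and symbols are assumed here, not quoted
    from the slide images):

    H(p_1, \dots, p_n) = -\sum_{i=1}^{n} p_i \log p_i

    Symmetry:           H(p_1, \dots, p_n) = H(p_{\sigma(1)}, \dots, p_{\sigma(n)})
                        for any permutation \sigma of \{1, \dots, n\}

    Extremal property:  H(p_1, \dots, p_n) \le \log n, with equality
                        if and only if p_1 = \dots = p_n = 1/n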

3
Entropy. Basic Properties
  • Additivity. Let H(p1, p2, ..., pn) be the entropy
    associated with a complete set of events E1, E2,
    ..., En with probabilities p1, p2, ..., pn, and let
    the event En be divided into k disjoint subsets with
    probabilities q1, ..., qk, so that pn = q1 + ... + qk.
  • Additivity relates the entropy of the refined scheme
    to H(p1, ..., pn), as sketched below.
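    A sketch of the additivity (grouping) property in standard
    notation (this usual form is assumed here):

    H(p_1, \dots, p_{n-1}, q_1, \dots, q_k)
        = H(p_1, \dots, p_n)
          + p_n \, H\!\left(\frac{q_1}{p_n}, \dots, \frac{q_k}{p_n}\right),
    \qquad p_n = \sum_{j=1}^{k} q_j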

4
Entropy. Basic Properties
  • In general, the entropy H(p1, ..., pn) = -Σ pi log pi
    is continuous in each pi for all 0 ≤ pi ≤ 1.


5
Entropy for Two-dimensional Discrete Finite
Probability Schemes
6
Entropy for Two-dimensional Discrete Finite
Probability Schemes
  • The two-dimensional probability scheme provides
    the simplest mathematical model for a
    communication system with a transmitter and a
    receiver.
  • Consider two finite discrete sample spaces O1
    (the transmitter space) and O2 (the receiver space),
    and their product space O = O1 × O2.

7
Entropy for Two-dimensional Discrete Finite
Probability Schemes
  • In O1 and O2 we select complete sets of events
    E1, E2, ..., En and F1, F2, ..., Fm, respectively.
  • Each event Ei may occur in conjunction with any
    event Fj. Thus, for the product space O = O1 × O2
    we obtain the complete set of joint events EiFj
    (i = 1, ..., n; j = 1, ..., m), as sketched below.
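    In standard notation this complete set of joint events can be
    sketched as (the symbols are assumed, not quoted):

    \{ E_i F_j : i = 1, \dots, n; \; j = 1, \dots, m \},
    \qquad \sum_{i=1}^{n} \sum_{j=1}^{m} P(E_i F_j) = 1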

8
Entropy for Two-dimensional Discrete Finite
Probability Schemes
  • We may consider the following three complete
    probability schemes: the scheme of the events Ei,
    the scheme of the events Fj, and the scheme of the
    joint events EiFj.
  • Each of them is, by assumption, a finite complete
    probability scheme, i.e. its probabilities are
    non-negative and sum to one.

9
Entropy for Two-dimensional Discrete Finite
Probability Schemes
  • The joint probability matrix for the random
    variables X and Y associated with the spaces O1 and
    O2 collects the probabilities p(xi, yj) of the joint
    events.
  • Respectively, the marginal probabilities are the row
    and column sums: p(xi) = Σj p(xi, yj) and
    p(yj) = Σi p(xi, yj).

10
Entropy for Two-dimensional Discrete Finite
Probability Schemes
  • Complete probability scheme: the joint events (xi, yj)
    together with their probabilities p(xi, yj), which sum
    to one.
  • Entropy of this scheme: the joint entropy H(X,Y), as
    sketched below.
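    A sketch of the scheme and its entropy in standard notation (the
    usual definitions are assumed):

    \left\{ (x_i, y_j), \; p(x_i, y_j) \right\},
    \qquad \sum_{i=1}^{n} \sum_{j=1}^{m} p(x_i, y_j) = 1

    H(X, Y) = -\sum_{i=1}^{n} \sum_{j=1}^{m} p(x_i, y_j) \log p(x_i, y_j)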

11
Entropy for Two-dimensional Discrete Finite
Probability Schemes
  • If all the marginal probabilities p(xi) and p(yj)
    are known, then the marginal entropies H(X) and H(Y)
    can be expressed according to the entropy definition,
    as sketched below.
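    A sketch in standard notation (the usual definitions are assumed):

    H(X) = -\sum_{i=1}^{n} p(x_i) \log p(x_i),
    \qquad p(x_i) = \sum_{j=1}^{m} p(x_i, y_j)

    H(Y) = -\sum_{j=1}^{m} p(y_j) \log p(y_j),
    \qquad p(y_j) = \sum_{i=1}^{n} p(x_i, y_j)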

12
Conditional Entropies
  • Let now an event Fj occur not independently, but in
    conjunction with the events Ei, i.e. consider the
    conditional probabilities p(xi | yj).

13
Conditional Entropies
  • Consider the following complete probability scheme:
    for a fixed received symbol yj, the events xi with
    the conditional probabilities p(xi | yj), which sum
    to one over i.
  • Hence its entropy is the conditional entropy
    H(X | yj) = -Σi p(xi | yj) log p(xi | yj).

14
Conditional Entropies
  • Averaging this conditional entropy over all
    admissible yj, we obtain a measure of the average
    conditional entropy of the system:
    H(X | Y) = Σj p(yj) H(X | yj).
  • Respectively, exchanging the roles of X and Y,
    H(Y | X) = Σi p(xi) H(Y | xi).

15
Conditional Entropies
  • Since p(xi | yj) p(yj) = p(xi, yj) and
    p(yj | xi) p(xi) = p(xi, yj),
  • the conditional entropies can finally be written as
    H(X | Y) = -Σi Σj p(xi, yj) log p(xi | yj) and
    H(Y | X) = -Σi Σj p(xi, yj) log p(yj | xi).

16
Five Entropies Pertaining to Joint Distribution
  • Thus we have considered
  • Two conditional entropies H(X|Y), H(Y|X)
  • Two marginal entropies H(X), H(Y)
  • The joint entropy H(X,Y)
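    These five entropies are connected by the usual chain-rule
    identities, which follow from the definitions above (a standard
    fact, stated here as a sketch rather than quoted from the slides):

    H(X, Y) = H(X) + H(Y \mid X) = H(Y) + H(X \mid Y)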

17
Communication Network. Noise characteristics of
a channel
18
Communication Network
  • Consider a source of communication with a given
    alphabet. The source is linked to the receiver
    via a channel.
  • The system may be described by a joint
    probability matrix by giving the probability of
    the joint occurrence of two symbols, one at the
    input and another at the output.

19
Communication Network
  • xi - a symbol which was sent; yj - a symbol which
    was received.
  • The joint probability matrix collects the
    probabilities p(xi, yj) of sending xi and receiving
    yj, as sketched below.
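    A sketch of the matrix in standard notation (the layout is
    assumed, not reproduced from the slide image):

    [P(X, Y)] =
    \begin{pmatrix}
      p(x_1, y_1) & p(x_1, y_2) & \cdots & p(x_1, y_m) \\
      p(x_2, y_1) & p(x_2, y_2) & \cdots & p(x_2, y_m) \\
      \vdots      & \vdots      & \ddots & \vdots      \\
      p(x_n, y_1) & p(x_n, y_2) & \cdots & p(x_n, y_m)
    \end{pmatrix}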

20
Communication Network Probability Schemes
  • There are the following five probability schemes of
    interest in the product space of the random
    variables X and Y (a computational sketch follows
    this list):
  • P(X,Y) - the joint probability matrix
  • P(X) - the marginal probability matrix of X
  • P(Y) - the marginal probability matrix of Y
  • P(X|Y) - the conditional probability matrix of X given Y
  • P(Y|X) - the conditional probability matrix of Y given X
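    A minimal computational sketch (not taken from the presentation)
    of how the five corresponding entropies can be obtained from a
    joint probability matrix; the helper name entropies and the
    example matrix below are hypothetical.

    import numpy as np

    def entropies(P):
        """Compute H(X,Y), H(X), H(Y), H(Y|X), H(X|Y) from a joint
        probability matrix P, where P[i, j] = p(x_i, y_j)."""
        P = np.asarray(P, dtype=float)
        px = P.sum(axis=1)            # marginal probabilities of X (row sums)
        py = P.sum(axis=0)            # marginal probabilities of Y (column sums)

        def H(p):
            p = p[p > 0]              # convention: 0 * log 0 = 0
            return -np.sum(p * np.log2(p))

        HXY = H(P.ravel())            # joint entropy H(X,Y)
        HX, HY = H(px), H(py)         # marginal entropies H(X), H(Y)
        HY_given_X = HXY - HX         # conditional entropy H(Y|X), by the chain rule
        HX_given_Y = HXY - HY         # conditional entropy H(X|Y), by the chain rule
        return HXY, HX, HY, HY_given_X, HX_given_Y

    # Example: a noisy binary channel with a uniform source
    P = np.array([[0.45, 0.05],
                  [0.05, 0.45]])
    print(entropies(P))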

21
Communication Network Entropies
  • The five entropies corresponding to these five
    probability schemes have the following
    interpretation:
  • H(X,Y) - average information per pair of
    transmitted and received characters (the entropy of
    the system as a whole)
  • H(X) - average information per character of the
    source (the entropy of the source)
  • H(Y) - average information per character at the
    destination (the entropy at the receiver)
  • H(Y|X) - average uncertainty about which of the
    permissible yj is received when a specific character
    xk is transmitted (a measure of information about
    the receiver when it is known what was transmitted)
  • H(X|Y) - average uncertainty about which of the xk
    was transmitted when a specific character yj is
    received (a measure of information about the source
    when it is known what was received)

22
Communication Network Entropies Meaning
  • H(X) and H(Y) give indications of the probabilistic
    nature of the transmitter and the receiver,
    respectively.
  • H(X,Y) describes the probabilistic nature of the
    communication channel as a whole.
  • H(Y|X) gives an indication of the noise (errors) in
    the channel.
  • H(X|Y) gives a measure of the equivocation (how well
    the input content can be recovered from the output).

23
Communication Network. Derivation of the Noise
Characteristics
  • In general, the joint probability matrix is not
    given for the communication system.
  • It is customary to specify the noise
    characteristics of a channel and the source
    alphabet probabilities.
  • From these data the joint and the output
    probability matrices can be derived.

24
Communication Network. Derivation of the Noise
Characteristics
  • Let us suppose that we have derived the joint
    probability matrix P(X,Y) from the channel matrix
    P(Y|X) and the source probabilities.

25
Communication Network. Derivation of the Noise
Characteristics
  • In other words, P(X,Y) = P(X)d P(Y|X),
  • where P(X)d is the diagonal matrix formed from the
    source probabilities p(x1), ..., p(xn).

26
Communication Network. Derivation of the Noise
Characteristics
  • If P(X) is taken not as a diagonal matrix but as a
    row matrix (an n-dimensional vector), then
    P(Y) = P(X) P(Y|X),
  • where P(Y) is also a row matrix (an m-dimensional
    vector) designating the probabilities of the output
    alphabet, as sketched below.
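    A sketch of both relations in standard matrix notation (assumed,
    not reproduced from the slide images):

    [P(X, Y)] = [P(X)]_d \, [P(Y \mid X)],
    \qquad [P(X)]_d = \mathrm{diag}\big(p(x_1), \dots, p(x_n)\big)

    [P(Y)] = [P(X)] \, [P(Y \mid X)],
    \qquad p(y_j) = \sum_{i=1}^{n} p(x_i) \, p(y_j \mid x_i)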

27
Communication Network. Derivation of the Noise
Characteristics
  • Two discrete channels are of particular interest
    (their meaning is sketched below):
  • the discrete noise-free channel (an ideal channel);
  • the discrete channel with independent input and
    output (errors occur in the channel, i.e. noise is
    present).
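    A sketch of what these two extremes mean in the notation above
    (standard facts, stated here as assumptions rather than quoted
    from later slides):

    Noise-free channel: each x_i is received as exactly one y_j, so
    p(y_j \mid x_i) \in \{0, 1\} and H(Y \mid X) = 0.

    Independent input and output: p(x_i, y_j) = p(x_i) \, p(y_j), so
    H(Y \mid X) = H(Y) and H(X \mid Y) = H(X).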