Communication Theory (EC 2252): Presentation Transcript

1
Communication Theory (EC 2252)
  • Prof. J. B. Bhattacharjee
  • K. Senthil Kumar
  • ECE Department
  • Rajalakshmi Engineering College

2
Review of Spectral characteristics
  • Periodic and Non-periodic Signals: A signal x(t) is
    said to be periodic if it satisfies
  • x(t + T) = x(t), for all values of t.
  • A periodic signal has the property that it is
    unchanged by a time shift of T. A signal that
    does not satisfy the above periodicity property
    is called a non-periodic signal.
  • Periodic signals can be represented using the
    Fourier Series. Non-periodic signals can be
    represented using the Fourier Transform.
  • Both Fourier series and Fourier Transform deal
    with the representation of the signals as a
    combination of sine and cosine waves.

3
Fourier Series
  • Fourier series: a complicated waveform analyzed
    into a number of harmonically related sine and
    cosine functions.
  • A continuous periodic signal x(t) with a period T
    may be represented by
  • x(t) = A0 + Σ (k = 1 to ∞) (Ak cos kωt + Bk sin kωt),
    where ω = 2π/T.
  • Dirichlet conditions must be placed on x(t) for
    the series to be valid: the integral of the
    magnitude of x(t) over a complete period must be
    finite, and the signal can only have a finite
    number of discontinuities in any finite interval.
    (A numerical sketch of the coefficients follows below.)

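As a quick numerical illustration of the series above, the following Python sketch (assuming numpy is available; the square-wave example and the five-harmonic limit are arbitrary choices) estimates the trigonometric coefficients Ak and Bk by averaging over one period:

    import numpy as np

    # Estimate trigonometric Fourier series coefficients A_k, B_k of a
    # periodic signal by numerical integration (averaging) over one period.
    T = 1.0                      # period (arbitrary choice)
    w0 = 2 * np.pi / T           # fundamental angular frequency
    t = np.linspace(0.0, T, 10000, endpoint=False)
    x = np.sign(np.sin(w0 * t))  # example periodic signal: a square wave

    A0 = np.mean(x)              # dc term
    for k in range(1, 6):
        Ak = 2 * np.mean(x * np.cos(k * w0 * t))
        Bk = 2 * np.mean(x * np.sin(k * w0 * t))
        # For this odd square wave, Bk should approach 4/(k*pi) for odd k
        # and 0 for even k, while every Ak should be near zero.
        print(f"k={k}: Ak={Ak:+.4f}, Bk={Bk:+.4f}")
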
4
Fourier Series Equations
  • The Fourier series represents a periodic signal
    of period Tp in terms of its frequency components.
  • We get the complex exponential Fourier series
    coefficients as follows:
  • ck = (1/Tp) ∫Tp x(t) e^(-jkω0t) dt, where ω0 = 2π/Tp.
  • The complex exponential Fourier coefficients are
    a sequence of complex numbers representing the
    frequency component kω0.

5
  • Periodic signals represented by Fourier Series
    have Discrete spectra.

6
The Fourier Transform
  • The Fourier transform is used for non-periodic
    signals. A Fourier transform converts the signal
    from the time domain to the spectral domain.
  • Continuous Fourier Transform:
    X(f) = ∫ x(t) e^(-j2πft) dt and x(t) = ∫ X(f) e^(j2πft) df
    (a numerical approximation is sketched below)

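A minimal numerical sketch of the transform just defined, assuming numpy: it approximates X(f) of an illustrative rectangular pulse by a Riemann sum and checks the result against the known sinc-shaped spectrum.

    import numpy as np

    # Approximate the continuous Fourier transform of a non-periodic signal
    # (a unit rectangular pulse of width 1) by a discretized integral.
    dt = 1e-3
    t = np.arange(-2.0, 2.0, dt)
    x = np.where(np.abs(t) <= 0.5, 1.0, 0.0)

    f = np.linspace(-8.0, 8.0, 801)
    # X(f) = integral of x(t) * exp(-j 2 pi f t) dt, as a Riemann sum
    X = np.array([np.sum(x * np.exp(-2j * np.pi * fk * t)) * dt for fk in f])

    # For a unit-width rect, |X(f)| should follow |sinc(f)| = |sin(pi f)/(pi f)|
    print("max deviation from sinc:",
          np.max(np.abs(np.abs(X) - np.abs(np.sinc(f)))))
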
7
  • Non-periodic signals represented by Fourier
    transform have Continuous spectra.

8
Fourier Transform Pairs
Note: Π stands for the rectangular function and Λ stands
for the triangular function.
9
Introduction to Communication Systems
  • Communication: the basic process of exchanging
    information from one location (source) to a
    destination (receiving end).
  • It refers to the process of sending, receiving and
    processing information/signals from one
    point to another point.

Figure 1: A simple communication system (source →
flow of information → destination)
10
  • Electronic Communication System: defined as the
    whole mechanism of sending, receiving and
    processing information electronically from
    source to destination.
  • Examples: radiotelephony, broadcasting,
    point-to-point and mobile communications, computer
    communications, radar and satellite systems.

11
Objectives
  • The objective of a communication system is to
    transfer information between two or more points
    (destinations) through a communication channel
    with minimum error, producing an accurate replica
    of the transmitted information at the destination.

12
NEED FOR COMMUNICATION
  • Interaction purposes: enables people to interact
    in a timely fashion on a global level in social,
    political, economic and scientific areas, through
    telephones, electronic mail and video conferencing.
  • Transfer of information: transmission in the form of
    audio, video, text, computer data and pictures, through
    facsimile, telegraph or telex and the internet.
  • Broadcasting: broadcast information to the masses,
    through radio, television or teletext.

13
Terms Related To Communications
  • Message: the physical manifestation produced by the
    information source, converted to an electrical
    signal by the transducer in the transmitter
    before transmission.
  • Transducer: a device that converts one form of
    energy into another form.
  • Input Transducer: placed at the transmitter;
    converts the input message into an electrical
    signal.
  • Example: a microphone, which converts sound energy
    to electrical energy.

(Figure: message → input transducer → electrical signal.)
14
  • Output Transducer: placed at the receiver;
    converts the electrical signal back into the original
    message.
  • Example: a loudspeaker, which converts electrical
    energy into sound energy.
  • Signal: an electrical voltage or current which
    varies with time and is used to carry a message or
    information from one point to another.

(Figure: electrical signal → output transducer → message.)
15
Elements of a Communication System
  • The basic elements are Source, Transmitter,
    Channel, Receiver and Destination.

Figure: Basic block diagram of a communication system
(information source → transmitter → channel / transmission
medium, with added noise → receiver → destination)
16
Function of each Element.
  • Information Source: the communication system
    exists to send messages. Messages come from
    voice, data, video and other types of
    information.
  • Transmitter: converts the input message into
    electrical signals such as voltage or current, or
    into electromagnetic waves such as radio waves or
    microwaves, that are suitable for transmission and
    compatible with the channel. Besides this, the
    transmitter also performs modulation and encoding
    (for digital signals).

17
Block Diagram of a Transmitter
(Figure: modulating signal → audio amplifier → modulator,
with the carrier signal applied to the modulator; modulator
output → RF amplifier → transmitting antenna.)
  • 5-minute exercise
  • Describe the sequence of events that happen at
    the radio station during a news broadcast.
18
  • Channel/Medium: the link or path over which
    information flows from the source to the destination.
    Many links combined establish a
    communication network.
  • There are 5 criteria of a transmission system:
    capacity, performance, distance, security and
    cost, which includes installation, operation
    and maintenance.
  • The 2 main categories of channel commonly used
    are line (guided media) and free space (unguided
    media).

19
  • Receiver: receives the electrical signals or
    electromagnetic waves that are sent by the
    transmitter through the channel. It also
    separates the information from the received signal
    and sends the information to the destination.
  • Basically, a receiver consists of several stages
    of amplification, frequency conversion and
    filtering.

20
Block Diagram of a Receiver
(Figure: receiving antenna → RF amplifier → mixer (with
local oscillator) → intermediate frequency amplifier →
demodulator → audio amplifier → destination.)
  • Destination: where the user receives the
    information, such as a loudspeaker, visual
    display, computer monitor, plotter or printer.
21
Analog Modulation
  • Baseband Transmission
  • The baseband signal is the information, either in
    digital or analogue form.
  • Transmission of the original information, whether
    analogue or digital, directly into the transmission
    medium is called baseband transmission.
  • Example: intercom (figure below)

(Figure: intercom. Voice → microphone → audio amplifier →
wire → audio amplifier → speaker → voice.)
22
Baseband signal is not suitable for long distance
communication.
  • Hardware limitations
  • Requires very long antenna
  • A baseband signal is an audio signal of low
    frequency. For voice, for example, the frequency range
    is 0.3 kHz to 3.4 kHz. The length of the antenna
    required to transmit a signal is at least 1/10 of
    its wavelength (λ). For the 0.3 kHz component,
    λ = c/f = 1000 km, so L ≈ 100 km
    (impossible!). See the calculation sketched below.
  • Interference with other waves
  • Simultaneous transmission of audio signals would
    cause interference with each other. This is because
    the audio signals occupy the same frequency range,
    so receiving stations cannot distinguish the
    signals.

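The antenna-length argument above can be checked with a few lines of Python; the tenth-of-a-wavelength rule and the 0.3 to 3.4 kHz voice band are taken from the slide, the rest is plain arithmetic.

    # Minimum antenna length ~ wavelength/10 for baseband (audio) frequencies.
    c = 3e8                      # speed of light, m/s
    for f in (300.0, 3400.0):    # voice band edges, Hz
        wavelength = c / f
        L = wavelength / 10.0
        print(f"f = {f/1e3:.1f} kHz -> wavelength = {wavelength/1e3:.0f} km, "
              f"antenna length ~ {L/1e3:.0f} km")
    # At 0.3 kHz the wavelength is 1000 km, so the antenna would need to be
    # about 100 km long, which is clearly impractical.
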
23
Modulation
  • Modulation: defined as the process of modifying
    a carrier wave (radio wave) systematically by the
    modulating signal.
  • This process makes the signal suitable for
    transmission and compatible with the channel.
  • The resultant signal is called the modulated signal.
  • 2 types of modulation: analog modulation and
    digital modulation.
  • Analogue modulation: used to transfer an analogue
    low pass signal over an analogue bandpass channel.
  • Digital modulation: used to transfer a digital bit
    stream; the carrier is a periodic pulse train and one
    of the pulse parameters (amplitude, width or
    position) changes according to the modulating signal.

24
Purpose of Modulation Process in Communication
Systems
  • To generate a modulated signal that is suitable for
    transmission and compatible with the channel.
  • To allow efficient transmission: increase
    transmission speed and distance, e.g.
  • By using a high frequency carrier signal, the
    information (voice) can travel and propagate
    through the air over greater distances and with
    shorter transmission time.
  • Also, a high frequency signal is less prone to
    noise and interference. Certain types of
    modulation have the useful property of
    suppressing both noise and interference.
  • For example, FM uses a limiter to reduce noise and
    keep the signal's amplitude constant. PCM systems
    use repeaters to regenerate the signal along the
    transmission path.

25
Amplitude Modulation (AM)
  • Objectives:
  • Recognize an AM signal in the time domain, frequency
    domain and trigonometric equation form
  • Calculate the percentage of modulation (modulation index)
  • Calculate the upper sideband, lower sideband
    and bandwidth of an AM signal, given the
    carrier and modulating signal frequencies
  • Calculate the power relations in an AM signal
  • Define the terms DSBSC, SSB and VSB
  • Understand modulator and demodulator
    operation

26
Introduction
  • Modulation
  • The alteration of the amplitude, phase or
    frequency of an oscillator in accordance with
    another signal.
  • Input signal is encoded in a format suitable for
    transmission
  • A low frequency information signal is encoded
    over a higher frequency signal
  • Carrier Signal
  • Sinusoidal wave,
  • Modulating Signal/Base band
  • Information signal,
  • Modulated Wave
  • Higher frequency signal which is being modulated
  • Modulation Schemes
  • To counter the effects of multipath fading and
    time-delay spread

27
Modulation Schemes
(Figure: carrier signal Vc and modulating signal Vm, and
the resulting modulated signals VAM, VPM and VFM.)
28
Amplitude Modulation
  • Time Domain: vAM(t) = Vc (1 + m cos 2πfmt) cos 2πfct
    (for a single-tone modulating signal)
  • Frequency Domain: a carrier component at fc and two
    sidebands at fc - fm and fc + fm, each of amplitude
    mVc/2 (a numerical sketch follows below)

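A minimal sketch of single-tone AM, assuming numpy and the standard expression v(t) = Vc (1 + m cos 2πfmt) cos 2πfct; the carrier, message and sampling frequencies are arbitrary illustrative values.

    import numpy as np

    fs, fc, fm = 100_000, 10_000, 1_000   # sample rate, carrier, message (Hz)
    Vc, m = 1.0, 0.5
    t = np.arange(0, 0.05, 1 / fs)
    v_am = Vc * (1 + m * np.cos(2 * np.pi * fm * t)) * np.cos(2 * np.pi * fc * t)

    # Spectrum: expect a carrier line at fc and sidebands at fc - fm and fc + fm
    V = np.abs(np.fft.rfft(v_am)) / len(t)
    freqs = np.fft.rfftfreq(len(t), 1 / fs)
    for target in (fc - fm, fc, fc + fm):
        k = np.argmin(np.abs(freqs - target))
        print(f"{freqs[k]:7.0f} Hz : amplitude {2 * V[k]:.3f}")  # ~0.25, 1.0, 0.25
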
29
AM Modulator
(Figure: the information signal and the carrier signal
enter the modulator; the AM output appears at the
modulator output.)
30
Amplitude Modulation
(Figure: carrier waveform between +Vc and -Vc, modulating
waveform between +Vm and -Vm, and the resulting AM
waveform with envelope between +Vam and -Vam.)
31
Modulation Index
  • Modulation Index, m
  • Indicates the amount by which the carrier signal is
    modulated.
  • It is an expression of the amount of power in the
    sidebands.
  • Modulation level ranges from 0 to 1, where
  • 0 = no modulation
  • 1 = full modulation
  • > 1 = distortion (overmodulation); see the numerical
    sketch below

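A small numerical sketch of the modulation index (illustrative voltages only), using the two standard expressions m = Vm/Vc and m = (Vmax - Vmin)/(Vmax + Vmin):

    # Two equivalent ways of computing the AM modulation index.
    Vc, Vm = 10.0, 6.0
    m = Vm / Vc                                      # 0.6

    Vmax = Vc + Vm                                   # envelope maximum, 16 V
    Vmin = Vc - Vm                                   # envelope minimum, 4 V
    m_from_envelope = (Vmax - Vmin) / (Vmax + Vmin)  # also 0.6

    print(m, m_from_envelope, f"{100 * m:.0f}% modulation")
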
32
Modulation Index
m = Vm / Vc
33
Modulation Index
(Figure: AM envelope showing Vmax and Vmin, also measured
peak-to-peak as Vmax(p-p) and Vmin(p-p);
m = (Vmax - Vmin) / (Vmax + Vmin).)
34
Modulation Index
(Figure: AM waveforms for m = 0, m = 0.5 and m = 1.)
35
Bandwidth
(Figure: AM spectrum with the carrier VC at fc and the
sidebands at fc - fm and fc + fm.)
  • Bandwidth for an AM signal:
    BW = (fc + fm) - (fc - fm) = 2 fm
36
Power Distributions
(Figure: AM power spectrum with components at fc - fm, fc
and fc + fm.)
  • Total transmitted power, PT = Pc (1 + m²/2), where
    Pc = Vc²/(2R) is the carrier power and each sideband
    carries Pc m²/4 (see the sketch below).
  • If R = 1 Ω, PT = (Vc²/2)(1 + m²/2).

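A short sketch of the AM power split, assuming the standard relations PT = Pc (1 + m²/2) and PSB = Pc m²/4 per sideband; the voltage, index and load values are arbitrary examples.

    # AM power distribution between carrier and sidebands.
    Vc, m, R = 100.0, 1.0, 1.0    # carrier peak voltage, index, load (ohm)
    Pc = Vc**2 / (2 * R)          # carrier power: 5000 W
    Psb = Pc * m**2 / 4           # each sideband: 1250 W
    Pt = Pc * (1 + m**2 / 2)      # total transmitted power: 7500 W
    print(Pc, Psb, Pt, f"sideband fraction = {2 * Psb / Pt:.2f}")  # 0.33 at m = 1
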
37
Double Side Band Suppressed Carrier (DSBSC)
  • A technique in which both sidebands are transmitted
    without the carrier (the carrier is
    suppressed).
  • Characteristics:
  • Less power content
  • Same bandwidth as standard AM
  • Disadvantage: the receiver is complex and expensive.

38
Single Side Band (SSB)
  • An improvement over DSBSC and standard AM, which
    waste power and occupy a large bandwidth.
  • SSB is a process of transmitting only one of the
    sidebands of the standard AM signal; the
    carrier and the other sideband are suppressed.
  • Advantages:
  • Saves power
  • Reduces bandwidth by 50%
  • Increases efficiency and SNR
  • Disadvantages:
  • Complex circuits are needed for frequency stability

39
Vestigial Side Band (VSB)
  • VSB is mainly used in TV broadcasting for their
    video transmissions.
  • TV signal consists of
  • Audio signal transmitted by FM
  • Video signal transmitted by VSB
  • A video signal consists of a range of frequencies
    with fmax = 4.5 MHz.
  • If it were transmitted using conventional AM, the
    required BW would be 9 MHz (BW = 2 fm). But according
    to the standard, the TV signal is limited to 7 MHz only.
  • So, to reduce the BW, a part of the LSB of the
    picture signal is not fully transmitted.

40
Vestigial Side Band (VSB)
  • The frequency spectrum of the TV / VSB signal:
(Figure: the video carrier is at 1.25 MHz with a vestigial
lower video band below it; the upper video band extends
4.5 MHz, up to 5.75 MHz; the audio carrier is at 6.75 MHz
with lower and upper audio bands between about 6.25 MHz and
7.0 MHz; the total TV signal bandwidth is 7 MHz.)
41
Modulator Circuits
42
Modulator Circuits
A. Modulating Signal
B. Carrier
C. Sum of carrier and modulating signal
D. Diode current
E. AM output across tuned circuit
43
Demodulator
44
Demodulator
A. AM signal
B. Current pulses through diode
C. Demodulating signal
D. Modulating signal
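
The waveforms labelled above belong to a diode envelope detector. Below is a minimal simulation sketch, assuming numpy and idealizing the detector as a half-wave rectifier followed by a first-order RC low-pass filter; all parameter values are arbitrary.

    import numpy as np

    fs, fc, fm, m = 200_000, 20_000, 500, 0.5
    t = np.arange(0, 0.02, 1 / fs)
    am = (1 + m * np.cos(2 * np.pi * fm * t)) * np.cos(2 * np.pi * fc * t)

    rectified = np.maximum(am, 0.0)   # ideal diode passes positive half-cycles
    # First-order RC low-pass with cutoff between fm and fc (here 2 kHz)
    rc = 1.0 / (2 * np.pi * 2_000)
    alpha = (1 / fs) / (rc + 1 / fs)
    env = np.zeros_like(rectified)
    for i in range(1, len(rectified)):
        env[i] = env[i - 1] + alpha * (rectified[i] - env[i - 1])

    # env follows (1 + m*cos(2*pi*fm*t)) up to a 1/pi scale factor and some
    # carrier ripple; a DC block and audio amplifier would recover the message.
    print(env[2000:].max(), env[2000:].min())
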
45
Frequency Modulation (FM)
  • Objectives:
  • Recognize an FM signal in the time domain, frequency
    domain and trigonometric equation form
  • Calculate the percentage of modulation (modulation index)
  • Calculate the upper sidebands, lower sidebands
    and bandwidth of an FM signal using Carson's Rule
    and the Bessel Function Table
  • Calculate the power relations in an FM signal
  • Understand FM modulators and demodulators

46
Introduction
  • FM is the process of varying the frequency of a
    carrier wave in proportion to a modulating
    signal.
  • The amplitude of the carrier is kept constant
    while its frequency is varied by the amplitude of
    the modulating signal.
  • In all types of modulation, the carrier wave is
    varied by the AMPLITUDE of the modulating signal.
  • An FM signal does not have an envelope, therefore
    the FM receiver does not have to respond to
    amplitude variations, so it can ignore noise to
    some extent.

47
Frequency Modulation
48
Frequency Modulation
  • The important features of FM waveforms are:
  • The frequency varies
  • The rate at which the carrier frequency changes
    is the same as the frequency of the information
    signal
  • The amount of carrier frequency change is
    proportional to the amplitude of the information
    signal
  • The amplitude is constant

49
Frequency Modulation
  • Carrier Signal
  • Sinusoidal wave
  • Modulating Signal/Base band
  • Information signal
  • Modulated Wave
  • Higher frequency signal which is being modulated
  • Where

50
Frequency Modulation
  • Time Domain: vFM(t) = Vc cos(2πfct + β sin 2πfmt)
    (for a single-tone modulating signal)
  • Frequency Domain: a carrier plus sideband pairs at
    fc ± n fm whose amplitudes are given by the Bessel
    functions Jn(β)

51
FM Modulator
52
FM Modulator
(Figure: the information signal and the carrier signal
enter the modulator; the FM output appears at the
modulator output.)
53
Frequency
  • Carrier Frequency
  • The carrier frequency in an FM system
    must be higher than the information signal
    frequency.
  • Maximum Frequency: fmax = fc + Δf
  • Minimum Frequency: fmin = fc - Δf
  • Carrier Swing: fmax - fmin = 2Δf

54
Modulation Index
  • Modulation Index, m or β
  • Indicates the amount by which the carrier signal is
    modulated.
  • It is an expression of the amount of power in the
    sidebands.
  • Modulation level ranges from 0 upwards: β = Δf / fm
  • Where
  • Δf = fd = frequency deviation (proportional to Vm)
  • fm = modulating frequency
  • Vm = amplitude of the modulating signal

55
Modulation Index
(Figure: FM spectra for β = 1 and β = 5.)
56
Modulation Index
(Figure: FM spectrum for β = 25.)
57
Modulation Index
58
Bandwidth
  • Using the Bessel function table, the bandwidth for an
    FM signal is BW = 2 n fm, where
  • n = number of pairs of significant
    sidebands
  • fm = the frequency of the modulating signal

59
Bandwidth
  • Using Carsons Rule, to estimate the bandwidth
    for an FM signal transmission.
  • ?f peak frequency deviation
  • fm(max) highest modulating signal frequency

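A sketch comparing the two bandwidth estimates above, assuming scipy is available for the Bessel functions; the broadcast-FM-like numbers and the 1% significance threshold are illustrative choices.

    import numpy as np
    from scipy.special import jv

    delta_f = 75e3        # peak frequency deviation (Hz)
    fm = 15e3             # highest modulating frequency (Hz)
    beta = delta_f / fm   # modulation index = 5

    bw_carson = 2 * (delta_f + fm)       # Carson's rule: 180 kHz
    # Count sideband pairs whose Bessel amplitude |Jn(beta)| is at least 1%
    n = np.arange(1, 50)
    n_sig = n[np.abs(jv(n, beta)) >= 0.01].max()
    bw_bessel = 2 * n_sig * fm
    print(f"Carson: {bw_carson/1e3:.0f} kHz, "
          f"Bessel ({n_sig} significant pairs): {bw_bessel/1e3:.0f} kHz")
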
60
Power Distributions
  • FM transmitted power: PFM = Vc²/(2R), equal to the
    unmodulated carrier power because the FM envelope is
    constant; where
  • Vc = peak carrier voltage and R = load resistance

61
Narrowband FM and Wideband FM
  • Narrowband FM has only a single pair of
    significant sidebands. The value of the modulation
    index is β < 1.
  • Wideband FM has a large (theoretically
    infinite) number of sidebands. The value of the
    modulation index is β > 1.

62
Generation of Narrowband FM (NBFM)
(Figure: NBFM generator. The modulating wave passes through
an integrator into a product modulator driven by the -90
degree phase-shifted carrier; a summer forms the difference
between the direct carrier path and the product-modulator
output, giving the NBFM wave.)
  • The modulator splits the carrier into two paths.
    One path is direct. The other path contains a -90
    degree phase shift unit and a product modulator.
    The difference between the signals in the two
    paths produces the NBFM signal (see the sketch
    below).
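
A minimal simulation sketch of this NBFM generator, assuming numpy; all parameter values are arbitrary. It also checks the narrowband approximation against the exact FM wave.

    import numpy as np

    fs, fc, fm = 1_000_000, 100_000, 1_000   # sample rate, carrier, message (Hz)
    Ac, kf = 1.0, 200.0                      # carrier amplitude, Hz per volt
    t = np.arange(0, 0.01, 1 / fs)
    m = np.cos(2 * np.pi * fm * t)           # modulating wave

    phase_dev = 2 * np.pi * kf * np.cumsum(m) / fs   # integrator output
    carrier = Ac * np.cos(2 * np.pi * fc * t)        # direct path
    shifted = Ac * np.sin(2 * np.pi * fc * t)        # -90 degree shifted carrier
    nbfm = carrier - phase_dev * shifted             # summer (difference) output

    # Compare with the exact FM wave; they agree closely because
    # beta = kf/fm = 0.2 << 1 (narrowband condition).
    exact = Ac * np.cos(2 * np.pi * fc * t + phase_dev)
    print("max error:", np.max(np.abs(nbfm - exact)))  # on the order of beta**2/2
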
63
Frequency Modulators
  • A frequency modulator is a circuit that varies
    carrier frequency in accordance with the
    modulating signal.
  • There are two types of frequency modulator
    circuits.
  • (1) Direct FM: the carrier frequency is directly
    varied by the message through a voltage-controlled
    oscillator.
  • E.g. varactor diode modulator.
  • (2) Indirect FM: generate NBFM first, then the NBFM
    is frequency multiplied to reach the targeted Δf.
  • E.g. Armstrong modulator

64
FM Varactor Modulator
65
The Operation of the Varactor Modulator
  • The info signal is applied to the base of the
    input transistor and appears amplified and
    inverted at the collector.
  • This low freq signal passes through the RF choke
    (L1) and is applied across the varactor diode.
  • Varactor diode behaves as voltage controlled
    capacitor.
  • When a low reverse bias voltage is applied, more
    capacitance is produced, which decreases the
    frequency.

66
  • When a high reverse bias voltage is applied, less
    capacitance is produced, which increases the
    frequency.
  • The varactor diode changes its capacitance in
    sympathy with the info signal and therefore
    changes the total value of the capacitance in the
    tuned circuit.
  • The changing value of capacitance causes the
    oscillator freq to increase and decrease under
    the control of the information signal.
  • The output is therefore an FM signal.

67
Armstrong method of indirect FM generation
  • In this method the message signal is first
    applied to an NBFM modulator that uses a
    crystal-controlled oscillator to generate the
    carrier.
  • Crystal control provides frequency stability.
  • The NBFM wave is next multiplied in frequency by
    using a frequency multiplier so as to produce the
    desired wideband FM.

68
Frequency Demodulator
  • FM demodulating circuits are used to recover the
    original modulating signal.
  • Any circuit that converts a frequency
    variation in the carrier back into a proportional
    voltage variation can be used to demodulate or
    detect FM signals.
  • A popular method used for FM demodulation is the
    frequency discriminator.

69
Frequency discriminator
Output of the Frequency discriminator
70
  • The frequency discriminator circuit consists of
    a slope circuit followed by an envelope
    detector.
  • The slope circuit converts the instantaneous
    frequency variations of the FM input signal into
    instantaneous amplitude variations.
  • These amplitude variations are rectified by the
    envelope detector to provide a DC output voltage
    which varies in amplitude and polarity with the
    input signal frequency (see the sketch below).

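A minimal sketch of the discriminator idea, assuming numpy and scipy: a discrete-time differentiator stands in for the slope circuit and a Hilbert-transform envelope stands in for the envelope detector. These are software stand-ins for the analog circuits, and all values are arbitrary.

    import numpy as np
    from scipy.signal import hilbert

    fs, fc, fm, delta_f = 1_000_000, 100_000, 1_000, 5_000
    t = np.arange(0, 0.01, 1 / fs)
    m = np.cos(2 * np.pi * fm * t)
    phase = 2 * np.pi * fc * t + 2 * np.pi * delta_f * np.cumsum(m) / fs
    fm_wave = np.cos(phase)

    slope_out = np.diff(fm_wave) * fs      # amplitude now tracks instantaneous frequency
    envelope = np.abs(hilbert(slope_out))  # envelope detection
    recovered = envelope - envelope.mean() # remove the DC term due to the carrier

    # The recovered signal is proportional to m(t), apart from edge effects.
    m_trim = m[:-1]                        # align lengths after np.diff
    corr = np.corrcoef(recovered[1000:-1000], m_trim[1000:-1000])[0, 1]
    print(f"correlation with the message: {corr:.3f}")
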
71
FM vs AM
Advantages:
  • Better noise immunity
  • Rejection of interfering signals because of the capture effect
  • Better transmitter efficiency
Disadvantages:
  • Excessive use of spectrum
  • More complex and costly circuits
72
  • Review of Probability
  • Sample Space: the space of all possible outcomes (S)
  • Event: a collection of outcomes; a subset of S
  • Probability: a measure assigned to the events of
    a sample space with the following properties:
  • 0 ≤ P(A) ≤ 1 for every event A in S, and P(S) = 1
  • If A and B are mutually exclusive,
    P(A ∪ B) = P(A) + P(B)
  • Theorem:
  • The conditional probability of an event A given
    the occurrence of event B is
    P(A|B) = P(A ∩ B) / P(B)
73
  • Two events A and B are independent if
    P(A ∩ B) = P(A) P(B)
  • Random Variables
  • A rule which assigns a numerical value to each
    possible outcome of a chance experiment.
  • If the experiment is flipping a coin, then a
    random variable X can be defined as

S1 = H : X(S1) = +1
S2 = T : X(S2) = -1
74
  • Cumulative Distribution Function (CDF)
  • FX(x) = P(X ≤ x)
  • Properties of CDF:
  • 1. 0 ≤ FX(x) ≤ 1
  • 2. FX(x) is a non-decreasing function of x
  • 3. FX(-∞) = 0 and FX(∞) = 1
  • Probability Density Function (PDF)
  • fX(x) = dFX(x)/dx
  • Properties of PDF: fX(x) ≥ 0, the total area under
    fX(x) is 1, and P(a < X ≤ b) = FX(b) - FX(a)

75
  • Random Processes A random process is a mapping
    from the sample space to an ensemble of time
    functions.

76
Gaussian process
  • A random process X(t) is a Gaussian process if
    for all n and for all (t1 t2 ... tn), the
    sequence of random variables X(t1), X(t2)...
    X(tn) has a jointly Gaussian density function.
  • Central limit theorem
  • The sum of a large number of independent and
    identically distributed (i.i.d.) random variables
    approaches a Gaussian distribution.
  • Thermal noise can be closely modeled by a Gaussian
    process.

77
  • Property 1
  • For Gaussian process, knowledge of the mean(m)
    and covariance(C) provides a complete statistical
    description of process.
  • Property 2
  • If a Gaussian process X(t) is passed through a
    LTI system, the output of the system is also a
    Gaussian process. The effect of the system on
    X(t) is simply reflected by the change in mean(m)
    and covariance(C) of X(t).

78
Noise Theory
  • Shot noise: results from the shot effect in
    amplifying and other active devices. It is
    caused by random variations in the arrival of
    electrons (or holes) at the output of the
    devices.
  • For a diode, the rms shot noise current is given by
    In = √(2 q I B), where q is the electron charge,
    I the DC current and B the bandwidth.

79
  • Thermal noise is the electrical noise arising
    from the random motion of electrons in a
    conductor. The available noise power generated by a
    resistor is P = kTB, and the corresponding
    open-circuit rms noise voltage is Vn = √(4kTRB)
    (see the sketch below).

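A short numerical sketch of the thermal-noise relations P = kTB and Vn = sqrt(4kTRB); the temperature, bandwidth and resistance are arbitrary example values.

    import math

    k = 1.380649e-23          # Boltzmann's constant, J/K
    T = 290.0                 # temperature, K
    B = 1e6                   # bandwidth, Hz
    R = 50.0                  # resistance, ohm

    P = k * T * B                       # available noise power, ~4e-15 W
    P_dbm = 10 * math.log10(P / 1e-3)   # about -114 dBm in 1 MHz at 290 K
    Vn = math.sqrt(4 * k * T * R * B)   # open-circuit rms noise voltage
    print(f"P = {P:.3e} W ({P_dbm:.1f} dBm), Vn = {Vn * 1e6:.2f} uV rms")
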
80
  • White noise: the idealized form of noise,
    whose spectrum is independent of the operating
    frequency. The power spectral density of white
    noise w(t) is Sw(f) = N0/2. The autocorrelation
    Rw(τ) of white noise is an impulse,
    Rw(τ) = (N0/2) δ(τ).

81
  • Narrowband noise (ideal case)
  • White noise w(t) passed through an ideal band-pass
    filter gives the filtered noise n(t), which is
    narrow-band noise:
  • n(t) = nI(t) cos(2πfCt) - nQ(t) sin(2πfCt)
  •         where nI(t) is the in-phase and nQ(t) the
    quadrature component
  • The filtered signal is x(t) = s(t) + n(t)
  • Average noise power = N0 BT

82
  • Noise Figure
  • Consider a signal source. The signal-to-noise
    ratio (SNR) available from the source is
    (SNR)i = Si / Ni.
  • Consider that the source is connected to an
    amplifier with gain G. Since all amplifiers
    contribute noise, the available output SNR will
    be less than the SNR of the source.

83
  • The noise power at the output of the amplifier
    will be the amplified source noise plus the noise
    Pna added by the amplifier itself.
  • The noise factor F is defined as
    F = (SNR at the input) / (SNR at the output).
  • When the noise factor is expressed in decibels, it
    is called the noise figure:
  • Noise figure (dB) = 10 log10 F

84
  • The noise power expressed in terms of a
    temperature is called the noise temperature.
  • If the amplifier noise (referred to its input) is
    Pna, then the equivalent noise temperature Te of the
    amplifier is given by Pna = k Te B, i.e.
    Te = Pna / (k B) (see the sketch below).

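A short sketch relating noise factor, noise figure and equivalent noise temperature, assuming the standard relations F = 1 + Te/T0 with T0 = 290 K and noise figure = 10 log10 F; the 3 dB value and 40 dB input SNR are arbitrary examples.

    import math

    T0 = 290.0
    NF_dB = 3.0
    F = 10 ** (NF_dB / 10)            # noise factor, about 2.0
    Te = (F - 1) * T0                 # equivalent noise temperature, about 289 K

    snr_in_dB = 40.0                  # example input SNR (source noise at T0)
    snr_out_dB = snr_in_dB - NF_dB    # SNR degradation through the amplifier
    print(f"F = {F:.2f}, Te = {Te:.0f} K, output SNR = {snr_out_dB:.1f} dB")
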
85
AM SUPERHETERODYNE RECEIVER
86
  • RF section: generally consists of a
    pre-selector and an amplifier stage. The
    pre-selector is a broadly tuned band-pass filter
    with an adjustable center frequency that is tuned
    to the desired carrier frequency. The other
    functions of the RF section are detecting, band
    limiting and amplifying the received RF signals.
  • Mixer/converter section: the stage that
    down-converts the received RF frequencies to
    intermediate frequencies (IF), which are simply
    frequencies that fall somewhere between the RF
    and information frequencies, hence the name
    intermediate. This section also includes a local
    oscillator (LO).

87
  • IF section: the intermediate frequency section
    is the stage whose primary functions are
    amplification and selectivity.
  • AM detector section: the stage that demodulates
    the AM wave and converts it back to the original
    information signal.
  • Audio section: the stage that
    amplifies the recovered information.

88
Performance of CW Modulation Systems
  • Introduction
  • Receiver noise (channel noise) is modeled as
    additive, white, and Gaussian
  • Receiver Model
  • 1. RX Model
  • N0 = kTe, where k = Boltzmann's constant and
    Te = equivalent noise temperature
  • N0 is the average noise power per unit
    bandwidth

89
SNR
  • The signal x(t) available for demodulation is
    defined by x(t) = s(t) + n(t), where s(t) is the
    modulated signal and n(t) the filtered channel noise.
  • The output signal-to-noise ratio (SNR)O is
    defined as the ratio of the average power of the
    demodulated message signal to the average power
    of the noise, both measured at the receiver
    output.
  • The channel signal-to-noise ratio (SNR)C is
    defined as the ratio of the average power of the
    modulated signal to the average power of the
    channel noise in the message bandwidth, both
    measured at the receiver input.
  • For the purpose of comparing different CW
    modulation systems, we normalize the receiver
    performance by dividing (SNR)O by (SNR)C. This
    ratio is called the figure of merit for the
    receiver and is defined as
    figure of merit = (SNR)O / (SNR)C.

90
Noise in DSB-SC Receivers
Let us consider the case of DSB-SC. The expression
for the modulated signal is s(t) = Ac cos(2πfct) m(t).
The carrier wave is statistically independent of the
message signal. The average power of the DSB-SC modulated
component of s(t) is Ac² Pm / 2, where Pm is the average
power of the message m(t).
91
  • With a noise PSD of N0/2, the average noise power
    in the message bandwidth W equals W N0 (baseband
    scenario).
  • Pm is the power of the message. Hence we have
    (SNR)C,DSB = Ac² Pm / (2 W N0).
  • Finding an expression for (SNR)O, we proceed as
    follows.

92
  • The output of the LPF (after coherent detection) is
    y(t) = (1/2) Ac m(t) + (1/2) nI(t).
  • The power of the signal component at the
    receiver output is Ac² Pm / 4. The average
    power of the filtered noise is 2 W N0.
  • The average noise power at the receiver output is
    (1/4)(2 W N0) = W N0 / 2.
  • Hence we have (SNR)O,DSB = Ac² Pm / (2 W N0), so the
    figure of merit of the DSB-SC receiver is 1.

93
Noise in AM receiver using envelope detection
  • The expression for the AM signal is
    s(t) = Ac [1 + ka m(t)] cos(2πfct),
  • where it is assumed that |ka m(t)| < 1.
  • The average power of the carrier in the AM signal
    s(t) is Ac² / 2.
  • The average power of the information-bearing
    component Ac ka m(t) cos(2πfct)
    is Ac² ka² Pm / 2.
  • The average power of the full AM signal s(t) is
    (Ac² / 2)(1 + ka² Pm).

94
  • Hence, the channel signal-to-noise ratio for AM
    is (SNR)C,AM = Ac² (1 + ka² Pm) / (2 W N0).
  • Finding an expression for (SNR)O, we have
    (SNR)O,AM ≈ Ac² ka² Pm / (2 W N0), so the figure of
    merit is ka² Pm / (1 + ka² Pm), which is less than 1.

95
Threshold Effect
  • When the carrier-to-noise ratio is small compared
    to unity, the noise term dominates and the
    performance of the envelope detector is completely
    different. Representing the narrowband noise n(t)
    in terms of its envelope and phase, we have
    n(t) = r(t) cos[2πfct + ψ(t)].
  • The phasor diagram for x(t) = s(t) + n(t) becomes

96
  • The noise envelope is used as a reference here
    due to its dominance. Here it is assumed that Ac
    is small compared to r(t). If we neglect the
    quadrature component of the signal with respect
    to the noise, we have
    y(t) ≈ r(t) + Ac cos ψ(t) + Ac ka m(t) cos ψ(t).
  • Hence, when the carrier-to-noise ratio is small,
    the detector output has no component that is
    strictly proportional to the message signal m(t).
    Recall that ψ(t) is uniformly distributed
    over 2π radians. Hence, it follows that we have a
    complete loss of information at the detector
    output (as the expected value will be zero). This
    loss of the information m(t) at the output of the
    envelope detector is called the threshold effect.

97
Pre-emphasis and De-emphasis
  • FM results in an unacceptably low SNR at the high
    frequency end of the message spectrum. To offset
    this undesirable occurrence, the pre-emphasis and
    de-emphasis technique is used.
  • Pre-emphasis consists of artificially boosting
    the spectral components in the higher part of the
    message spectrum. This is accomplished by passing
    the message signal m(t) through the pre-emphasis
    filter, denoted Hpe(f). The pre-emphasized
    signal is used to frequency modulate the carrier
    at the transmitting end.
  • In the receiver, the inverse operation,
    de-emphasis, is performed. This is accomplished
    by passing the discriminator output through a
    filter, called the de-emphasis filter, denoted
    Hde(f).

98
  • Pre-emphasis and de-emphasis in FM
  • (Figure: P.S.D. of noise at the FM receiver output
    and P.S.D. of a typical message signal.)

99
Information theory
  • What is information theory?
  • Information theory enables the
    communication system to carry information
    (signals) from sender to receiver over a
    communication channel.
  • It deals with mathematical modelling and analysis
    of a communication system.
  • Its major task is to answer questions about
    signal compression and data transfer rate.
  • Those answers are provided by entropy
    and channel capacity.

100
  • Information is a measure of uncertainty. The lower
    the probability of occurrence of a certain
    message, the higher is the information it carries.
  • Since the information is closely associated with
    the uncertainty of the occurrence of a particular
    symbol, when the symbol occurs the information
    associated with its occurrence is defined as
    I = log2(1 / p) = -log2 p bits, where p is the
    probability of the symbol.

101
Entropy
  • Entropy is defined in terms of the probabilistic
    behaviour of a source of information.
  • In information theory the source outputs are
    discrete random variables that have a certain
    fixed finite alphabet with certain probabilities.
  • Entropy is the average information content for the
    given source symbols (bits/message):
    H = -Σ pk log2 pk (see the sketch below).

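A minimal sketch computing the per-symbol information I = -log2 p, the source entropy H and the information rate R = rH for an arbitrary example source.

    import math

    probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}   # example source

    for sym, p in probs.items():
        print(f"I({sym}) = {-math.log2(p):.1f} bits")        # information per symbol

    H = -sum(p * math.log2(p) for p in probs.values())       # entropy, bits/message
    r = 1000.0                                               # messages per second
    print(f"H = {H:.3f} bits/message, R = r*H = {r * H:.0f} bits/s")
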
102
  • Rate of information
  • If a source generates at a rate of r messages
    per second, the rate of information R is
    defined as the average number of bits of
    information per second.
  • H is the average number of bits of information
    per message. Hence
  • R = rH bits/sec

103
Source Coding
  • Source coding (a.k.a. lossless data compression)
    means that we remove redundant information
    from the signal prior to transmission.
  • Basically this is achieved by assigning short
    descriptions to the most frequent outcomes of the
    source output, and vice versa.
  • Common source-coding schemes are prefix
    coding, Huffman coding and Lempel-Ziv coding.

104
Source Coding Theorem
  • Source coding theorem states that the output of
    any information source having entropy H units per
    symbol can be encoded into an alphabet having N
    symbols in such a way that the source symbols are
    represented by code words having a weighted
    average length not less than H/logN.
  • Hence the source coding theorem says that encoding
    of messages from a source with entropy H can be
    done, bounded by the fundamental information-
    theoretic limitation that the minimum average
    number of symbols per message is H / log N.

105
Source coding example
  • Prefix coding has the important feature that it is
    always uniquely decodable and it also satisfies the
    Kraft-McMillan inequality (see formula 10.22 p. 624),
    Σ 2^(-lk) ≤ 1.
  • Prefix codes can also be referred to as
    instantaneous codes, meaning that the decoding
    process is achieved immediately.

106
  • Shannon-Fano Coding: In Shannon-Fano coding, the
    symbols are arranged in order from most probable
    to least probable, and then divided into two sets
    whose total probabilities are as close as
    possible to being equal. All symbols then have
    the first digits of their codes assigned: symbols
    in the first set receive "0" and symbols in the
    second set receive "1".
  • As long as any sets with more than one member
    remain, the same process is repeated on those
    sets, to determine successive digits of their
    codes. When a set has been reduced to one symbol,
    of course, this means the symbol's code is
    complete and will not form the prefix of any
    other symbol's code.

107
  • Huffman Coding: create a list of the symbols, in
    decreasing order of probability. The two symbols with
    the lowest probabilities are assigned a "0" and a
    "1".
  • These two symbols are combined into a new symbol
    with probability equal to the sum of their
    individual probabilities. The new symbol is
    placed in the list as per its probability value.
  • The procedure is repeated until we are left with
    2 symbols only, to which "0" and "1" are assigned.
  • The Huffman code is the bit sequence obtained by
    working backwards and tracking the sequence of "0"s
    and "1"s assigned to that symbol and its
    successors (see the sketch below).

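A minimal Huffman-coding sketch following the procedure above (repeatedly merging the two least-probable symbols); the example probabilities and the tie-breaking rule are arbitrary choices.

    import heapq

    def huffman_code(probs):
        # Each heap item: (probability, tie-breaker, {symbol: code-so-far})
        heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
        heapq.heapify(heap)
        counter = len(heap)
        while len(heap) > 1:
            p1, _, c1 = heapq.heappop(heap)   # lowest probability -> prepend '0'
            p2, _, c2 = heapq.heappop(heap)   # next lowest        -> prepend '1'
            merged = {s: "0" + code for s, code in c1.items()}
            merged.update({s: "1" + code for s, code in c2.items()})
            heapq.heappush(heap, (p1 + p2, counter, merged))
            counter += 1
        return heap[0][2]

    probs = {"a": 0.4, "b": 0.3, "c": 0.2, "d": 0.1}
    codes = huffman_code(probs)
    avg_len = sum(probs[s] * len(c) for s, c in codes.items())
    print(codes, f"average length = {avg_len:.2f} bits/symbol")
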
108
  • Lempel-Ziv Coding: a drawback of the Huffman code is
    that knowledge of the probability model of the source
    is needed. Lempel-Ziv coding is used to overcome
    this drawback.
  • While Huffman's algorithm encodes blocks of fixed
    size into binary sequences of variable length,
    Lempel-Ziv encodes blocks of varying length into
    blocks of fixed size.
  • Lempel-Ziv coding is performed by parsing the
    source data into segments that are the shortest
    subsequences not encountered before.

109
Mutual Information
(Figure: source X → channel → receiver Y.)
  • Consider a communication system with a source of
    entropy H(X). Let the entropy on the receiver side be
    H(Y).
  • H(X|Y) and H(Y|X) are the conditional entropies,
    and H(X,Y) is the joint entropy of X and Y.
  • Then the mutual information between the source X
    and the receiver Y can be expressed as
  • I(X,Y) = H(X) - H(X|Y)
  • H(X) is the uncertainty of source X and H(X|Y) is
    the uncertainty of X given Y. Hence the quantity
    H(X) - H(X|Y) represents the reduction in
    uncertainty of X given the knowledge of Y. Hence
    I(X,Y) is termed the mutual information.

110
Channel Capacity
  • Capacity of a channel is defined as the
    intrinsic ability of the channel to convey
    information.
  • Using mutual information, the channel capacity of
    a discrete memoryless channel is the maximum
    average mutual information in any single use of the
    channel, over all possible input probability
    distributions.
  • Thus channel capacity C = max I(X,Y).

111
  • Shannon's Channel Coding Theorem
  • The Shannon theorem states that, given a noisy
    channel with channel capacity C and information
    transmitted at a rate R, if R < C there
    exist codes that allow the probability of error
    at the receiver to be made arbitrarily small.
    This means that, theoretically, it is possible to
    transmit information nearly without error at any
    rate below a limiting rate, C.
  • The converse is also important. If R > C, an
    arbitrarily small probability of error is not
    achievable. All codes will have a probability of
    error greater than a certain positive minimal
    level, and this level increases as the rate
    increases. So, information cannot be guaranteed
    to be transmitted reliably across a channel at
    rates beyond the channel capacity.

112
  • Shannon-Hartley theorem or Information Capacity
    Theorem
  • An application of the channel capacity concept to
    an additive white Gaussian noise channel with B
    Hz bandwidth and signal-to-noise ratio S/N is the
    Information Capacity Theorem.
  • It states that, for a band-limited Gaussian
    channel operating in the presence of additive
    Gaussian noise, the channel capacity is given by
  • C = B log2(1 + S/N)
  • where C is the capacity in bits per second, B
    is the bandwidth of the channel in hertz, and S/N
    is the signal-to-noise ratio (see the sketch below).

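A short numerical sketch of C = B log2(1 + S/N); the telephone-like bandwidth and the 30 dB SNR are arbitrary example values.

    import math

    B = 3100.0                       # bandwidth, Hz
    snr_dB = 30.0
    snr = 10 ** (snr_dB / 10)        # linear S/N = 1000
    C = B * math.log2(1 + snr)       # about 30.9 kbit/s
    print(f"C = {C / 1e3:.1f} kbit/s")
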
113
  • Band width and SNR tradeoff
  • As the bandwidth of the channel increases, it is
    possible to make faster changes in the
    information signal, thereby increasing the
    information rate.
  • However, as B → ∞, the channel capacity does not
    become infinite since, with an increase in
    bandwidth, the noise power also increases; the
    capacity approaches the finite limit (S/N0) log2 e.
  • As S/N increases, one can increase the
    information rate while still preventing errors
    due to noise.
  • For no noise, S/N → ∞ and an infinite information
    rate is possible irrespective of bandwidth.

114
Implications of the Information Capacity Theorem
115
  • Rate distortion theory
  • Rate distortion theory is the branch of
    information theory addressing the problem of
    determining the minimal amount of entropy or
    information that should be communicated over a
    channel such that the source can be reconstructed
    at the receiver with a given distortion.
  • Rate distortion theory can be used in the
    following situations:
  • 1. Source coding in which the coding alphabet
    cannot exactly represent the source information.
  • 2. When the information is to be transmitted at a
    rate greater than the channel capacity.

116
Lower the bit rate R by allowing some acceptable
distortion D of the signal
117
  • Rate Distortion Function
  • The function that relates the rate and distortion
    is found as the solution of the following
    minimization problem:
    R(D) = min I(X,Y), where the minimum is taken over
    all conditional distributions p(y|x) for which the
    expected distortion does not exceed D.
  • In the above equation, I(X,Y) is the mutual
    information.

118
Rate distortion function for Gaussian memory-less
source
  • If pX(x) is Gaussian with variance σ², and if we
    assume that successive samples of the signal x
    are stochastically independent, we find the
    following analytical expression for the rate
    distortion function:
    R(D) = (1/2) log2(σ² / D) for 0 < D ≤ σ², and
    R(D) = 0 for D > σ² (see the sketch below).

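A minimal sketch of this rate distortion function, R(D) = (1/2) log2(σ²/D) for 0 < D ≤ σ² and R(D) = 0 otherwise; the variance and the distortion values are arbitrary examples.

    import math

    def rate_distortion_gaussian(sigma2, D):
        if D <= 0:
            raise ValueError("distortion must be positive")
        return max(0.0, 0.5 * math.log2(sigma2 / D))

    sigma2 = 1.0
    for D in (0.01, 0.1, 0.5, 1.0, 2.0):
        R = rate_distortion_gaussian(sigma2, D)
        print(f"D = {D:4.2f} -> R(D) = {R:.3f} bits/sample")
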
119
A Plot of the Rate distortion function for
Gaussian source
120
Lossy Source Coding
  • Lossy source coding is the representation of the
    source in digital form with as few bits as
    possible while maintaining an acceptable loss of
    information.
  • In lossy source coding, the source output is
    encoded at a rate less than the source entropy.
  • Hence there is reduction in the information
    content of the source.
  • E.g. it is not possible to digitally encode an
    analog signal with a finite number of bits
    without producing some distortion.