Transcript and Presenter's Notes

Title: Chapter 7 Generating and Processing Random Signals


1
Chapter 7: Generating and Processing Random Signals
  • Presenters: B93902016, B93902076

2
Outline
  • Stationary and Ergodic Processes
  • Uniform Random Number Generator
  • Mapping Uniform RVs to an Arbitrary pdf
  • Generating Uncorrelated Gaussian RVs
  • Generating Correlated Gaussian RVs
  • PN Sequence Generators
  • Signal Processing

3
Random Number Generator
  • Needed to simulate noise and interference
  • Random Number Generator: a computational or physical device designed to
    generate a sequence of numbers or symbols that lacks any pattern, i.e.
    appears random; in software this is a pseudo-random sequence
  • MATLAB: rand(m,n), randn(m,n) (see the sketch below)
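A minimal MATLAB sketch of the built-in generators named above; the matrix sizes and the sanity check are illustrative choices, not from the slides:

  % Uniform (0,1) samples in a 2-by-4 matrix
  U = rand(2,4);
  % Zero-mean, unit-variance Gaussian samples in a 1-by-1000 vector
  G = randn(1,1000);
  % Sanity check: sample mean and variance of the Gaussian draw
  disp([mean(G) var(G)])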

4
Stationary and Ergodic Process
  • Strict-sense stationary (SSS)
  • Wide-sense stationary (WSS)
  • SSS implies WSS; WSS does not imply SSS in general, but for a Gaussian
    process WSS implies SSS
  • Time average vs. ensemble average
  • The ergodicity requirement is that the ensemble average coincide with the
    time average
  • Sample functions generated to represent signals, noise, and interference
    should be ergodic

5
Time average vs. ensemble average
  • Time average: <x> = lim_{N→∞} (1/N) Σ_{n=1}^{N} x[n]
  • Ensemble average: E{X} = ∫ x f_X(x) dx

6
Example 7.1 (N = 100)
7
Uniform Random Number Generator
  • Generate a random variable that is uniformly distributed on the interval (0,1)
  • Generate a sequence of integers between 0 and M and then divide each
    element of the sequence by M
  • The most common technique is the linear congruential generator (LCG)

8
Linear Congruence
  • The LCG is defined by the operation
  • x_{i+1} = (a x_i + c) mod m
  • x_0 is the seed of the generator
  • a, c, m, and x_0 are integers
  • Desirable property: full period

9
Technique A: The Mixed Congruence Algorithm
  • The mixed linear congruential algorithm takes the form
  • x_{i+1} = (a x_i + c) mod m
  • Full period is obtained if:
  •   - c ≠ 0 and c is relatively prime to m
  •   - a - 1 is a multiple of every prime factor p of m
  •   - a - 1 is a multiple of 4 if m is a multiple of 4

10
Example 7.4
  • m = 5000 = (2^3)(5^4)
  • c = (3^3)(7^2) = 1323
  • a - 1 must be a multiple of 2, a multiple of 5, and a multiple of 4
  • so a - 1 = 4 * 2 * 5 * k = 40k
  • With k = 6, we have a = 241
  • x_{i+1} = (241 x_i + 1323) mod 5000
  • We can verify that the period is 5000, so the generator is full period
    (see the sketch below)
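A short MATLAB sketch (an assumed implementation, using the Example 7.4 parameters) that counts the steps needed to return to the seed, confirming the full period of 5000:

  a = 241; c = 1323; m = 5000;
  x0 = 1;                      % seed; any integer in 0..m-1 works for a full-period LCG
  x = x0;
  period = 0;
  while true
      x = mod(a*x + c, m);     % x_{i+1} = (a x_i + c) mod m
      period = period + 1;
      if x == x0, break, end
  end
  disp(period)                 % prints 5000, i.e. full period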

11
Technique B: The Multiplication Algorithm With Prime Modulus
  • The multiplicative generator is defined as
  • x_{i+1} = a x_i mod m
  •   - m is prime (usually large)
  •   - a is a primitive element mod m, i.e.
  •     (a^{m-1} - 1)/m = k, an integer, but
  •     (a^{i} - 1)/m is not an integer for i = 1, 2, 3, ..., m - 2

12
Technique C: The Multiplication Algorithm With Nonprime Modulus
  • The most important case of this generator has m equal to a power of two:
  • x_{i+1} = a x_i mod 2^n
  • The maximum period is 2^n / 4 = 2^{n-2}
  • This period is achieved if:
  •   - The multiplier a is 3 or 5 (mod 8)
  •   - The seed x_0 is odd

13
Example of the Multiplication Algorithm With Nonprime Modulus
a = 3, c = 0, m = 16, x_0 = 1
14
Testing Random Number Generators
  • Chi-square test, spectral test
  • Testing the randomness of a given sequence
  • Scatterplots: a plot of x_{i+1} as a function of x_i
  • Durbin-Watson Test (developed on the following slides)

15
Scatterplots: Example 7.5
(i) rand(1,2048), (ii) x_{i+1} = (65 x_i + 1) mod 2048, (iii) x_{i+1} = (1229 x_i + 1) mod 2048
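A MATLAB sketch of how these scatterplots can be produced; the generator loop is an assumed implementation of the stated recursions, and the seed is arbitrary:

  N = 2048; m = 2048; a = 65; c = 1;    % case (ii); set a = 1229 for case (iii)
  x = zeros(1,N); x(1) = 1;             % arbitrary seed
  for i = 1:N-1
      x(i+1) = mod(a*x(i) + c, m);      % x_{i+1} = (a x_i + c) mod m
  end
  plot(x(1:end-1), x(2:end), '.')       % x_{i+1} plotted against x_i
  xlabel('x_i'), ylabel('x_{i+1}')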
16
Durbin-Watson Test (1)
Let X Xn
Y Xn-1
Assume Xn and Xn-1 are correlated and Xn
is an ergodic process
Let
17
Durbin-Watson Test (2)
Write Y = ρX + Z, where X and Z are uncorrelated and zero mean; then D = 2(1 - ρ).
D > 2: negative correlation
D = 2: uncorrelated (most desired)
D < 2: positive correlation
18
Example 7.6
  • rand(1,2048): the value of D is 2.0081 and ρ is 0.0041
  • x_{i+1} = (65 x_i + 1) mod 2048: the value of D is 1.9925 and ρ is 0.0037273
  • x_{i+1} = (1229 x_i + 1) mod 2048: the value of D is 1.6037 and ρ is 0.19814
    (a sketch for computing D follows)
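A minimal sketch for estimating D, and the implied ρ ≈ 1 - D/2, from a zero-mean version of a sample sequence; this is an assumed implementation of the statistic defined on the previous slides:

  x = rand(1,2048);                     % sequence under test (case (i) above)
  x = x - mean(x);                      % remove the mean
  D = sum(diff(x).^2) / sum(x.^2);      % D = sum((x_n - x_{n-1})^2) / sum(x_n^2)
  rho = 1 - D/2;                        % implied correlation coefficient
  disp([D rho])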

19
Minimum Standards
  • Full period
  • Passes all applicable statistical tests for randomness
  • Easily transportable from one computer to another
  • Lewis, Goodman, and Miller Minimum Standard (prior to MATLAB 5):
  • x_{i+1} = 16807 x_i mod (2^31 - 1)  (a sketch follows)
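A sketch (assumed implementation) of the Lewis, Goodman, and Miller generator, with the output divided by m to give samples in (0,1):

  a = 16807; m = 2^31 - 1;      % minimum-standard parameters
  N = 1000;
  x = zeros(1,N); x(1) = 1;     % seed must be an integer in 1..m-1
  for i = 1:N-1
      x(i+1) = mod(a*x(i), m);  % x_{i+1} = 16807 x_i mod (2^31 - 1)
  end
  u = x / m;                    % approximately uniform on (0,1)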

20
Mapping Uniform RVs to an Arbitrary pdf
  • If the CDF of the target random variable is known in closed form:
    Inverse Transform Method
  • If the pdf of the target random variable is known in closed form but the
    CDF is not known in closed form: Rejection Method
  • If neither the pdf nor the CDF is known in closed form: Histogram Method

21
Inverse Transform Method
  • The CDF F_X(x) is known in closed form
  • Set U = F_X(X), where F_X(x) = Pr{X ≤ x}
  • Then X = F_X^{-1}(U)
  • Check: Pr{X ≤ x} = Pr{F_X^{-1}(U) ≤ x} = Pr{U ≤ F_X(x)} = F_X(x)

22
Example 7.8 (1)
  • Rayleigh random variable with pdf
  •   f_R(r) = (r/σ^2) exp(-r^2/(2σ^2)), r ≥ 0
  • The CDF is F_R(r) = 1 - exp(-r^2/(2σ^2))
  • Setting F_R(R) = U gives 1 - exp(-R^2/(2σ^2)) = U

23
Example 7.8 (2)
  • The RV 1 - U is equivalent to U (they have the same pdf)
  • Solving for R gives R = sqrt(-2σ^2 ln U)
  • [n, xout] = hist(Y, nbins): returns the bin counts n and bin centers xout
  • bar(xout, n): plots the histogram (a sketch follows)
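A sketch of this example in MATLAB; σ = 1 and the sample size are assumed values, and U is used directly in place of 1 - U as argued above:

  sigma = 1;                     % assumed Rayleigh parameter
  U = rand(1,10000);
  R = sigma * sqrt(-2*log(U));   % inverse transform: R = F_R^{-1}(U)
  [n, xout] = hist(R, 50);       % bin counts and bin centers
  bar(xout, n)                   % plot the histogram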

24
Example 7.8 (3)
25
The Histogram Method
  • The CDF and pdf are unknown; a histogram of sample data gives a
    piecewise-constant estimate of the pdf
  • P_i = Pr{x_{i-1} < x < x_i} = c_i (x_i - x_{i-1})
  • F_X(x) = F_{i-1} + c_i (x - x_{i-1}), for x_{i-1} < x ≤ x_i
  • Setting F_X(X) = U = F_{i-1} + c_i (X - x_{i-1}) and solving for X gives
    the sample value
  • More samples give more accuracy!

26
Rejection Methods (1)
  • Given a target pdf f_X(x), find a bounding function g_X(x) and constant M such that
  •   M g_X(x) ≥ f_X(x), for all x

27
Rejection Methods (2)
  • Generate U1 and U2, uniform in (0,1)
  • Generate V1, uniform in (0,a), where a is the maximum value of X
  • Generate V2, uniform in (0,b), where b is at least the maximum value of f_X(x)
  • If V2 ≤ f_X(V1), set X = V1. If the inequality is not satisfied, V1 and V2
    are discarded and the process is repeated from step 1 (a sketch follows)
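A sketch of these steps using an assumed target pdf f_X(x) = 2x on (0,1), chosen purely for illustration, so that a = 1 and b = 2:

  fX = @(x) 2*x;                 % assumed target pdf on (0,1)
  a = 1; b = 2;                  % support bound and pdf bound
  N = 10000;
  X = zeros(1,N);
  for k = 1:N
      accepted = false;
      while ~accepted
          V1 = a*rand;           % uniform on (0,a)
          V2 = b*rand;           % uniform on (0,b)
          if V2 <= fX(V1)        % accept V1 as a sample of X
              X(k) = V1;
              accepted = true;
          end
      end
  end
  hist(X, 50)                    % histogram should follow the ramp shape of 2x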

28
Example 7.9 (1)
29
Example 7.9 (2)
30
Generating Uncorrelated Gaussian RV
  • The Gaussian CDF cannot be written in closed form, so the inverse
    transform method cannot be used, and rejection methods are not efficient
  • Other techniques:
  • 1. The sum of uniforms method
  • 2. Mapping a Rayleigh RV to Gaussian RVs
  • 3. The polar method

31
The Sum of Uniforms Method (1)
  • 1. Central limit theorem: a sum of many independent RVs tends to a Gaussian
  • 2. Form the sum
  •    Y = B Σ_{i=1}^{N} (U_i - 1/2)
  • 3. The U_i represent independent uniform RVs, B is a constant that
     determines the variance of Y, and Y converges to a Gaussian RV as N grows
32
The Sum of Uniforms Method (2)
  • Expectation and variance: E{Y} = 0 and Var(Y) = N B^2 / 12
  • We can set Var(Y) to any desired value σ^2 by choosing B = sqrt(12 σ^2 / N)
  • The pdf of Y is nonzero only at |y| ≤ NB/2, so the tails are truncated

33
The Sum of Uniforms Method (3)
  • The result only approximates a Gaussian
  • The truncated tails may not be a realistic situation (a sketch follows)
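A sketch (assumed implementation) of the sum-of-uniforms approximation; N = 12 and the target variance are illustrative choices:

  N = 12; sigma = 1;                   % number of uniforms and target std deviation
  B = sqrt(12*sigma^2/N);              % scale factor giving Var(Y) = sigma^2
  M = 10000;                           % number of approximate Gaussian samples
  Y = B*(sum(rand(N,M),1) - N/2);      % Y = B*sum(U_i - 1/2), columnwise
  disp([mean(Y) var(Y)])               % should be near 0 and sigma^2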

34
Mapping a Rayleigh to Gaussian RV (1)
  • A Rayleigh RV can be generated by R = sqrt(-2σ^2 ln U)
  • U is a uniform RV on (0,1)
  • Assume X and Y are independent Gaussian RVs with joint pdf
  •   f_{XY}(x,y) = (1/(2πσ^2)) exp(-(x^2 + y^2)/(2σ^2))

35
Mapping a Rayleigh to Gaussian RV (2)
  • Transform to polar coordinates: let X = R cos Θ and Y = R sin Θ
  • Then x^2 + y^2 = r^2 and the Jacobian of the transformation is r
  • The joint pdf becomes
  •   f_{RΘ}(r,θ) = (r/(2πσ^2)) exp(-r^2/(2σ^2)), r ≥ 0, 0 ≤ θ < 2π

36
Mapping a Rayleigh to Gaussian RV (3)
  • Examine the marginal pdfs:
  •   f_R(r) = (r/σ^2) exp(-r^2/(2σ^2)) and f_Θ(θ) = 1/(2π)
  • So R is a Rayleigh RV and Θ is a uniform RV on (0, 2π)

37
The Polar Method
  • From the previous result, X = R cos Θ and Y = R sin Θ are independent Gaussian RVs
  • We may transform a pair of uniform RVs into R and Θ, and hence into a pair of Gaussian RVs

38
The Polar Method Algorithm
  • 1. Generate two uniform RVs, U1 and U2, both on the interval (0,1)
  • 2. Let V1 = 2U1 - 1 and V2 = 2U2 - 1, so they are independent and uniform on (-1,1)
  • 3. Let S = V1^2 + V2^2; if S < 1 continue, else go back to step 2
  • 4. Form the factor sqrt(-2σ^2 ln(S)/S)
  • 5. Set X = V1 sqrt(-2σ^2 ln(S)/S) and Y = V2 sqrt(-2σ^2 ln(S)/S) (a sketch follows)
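A MATLAB sketch of the algorithm above (an assumed implementation; σ = 1, and rejected pairs are simply redrawn):

  sigma = 1; M = 10000;
  X = zeros(1,M); Y = zeros(1,M);
  for k = 1:M
      S = 1;
      while S >= 1 || S == 0
          V1 = 2*rand - 1;                 % uniform on (-1,1)
          V2 = 2*rand - 1;
          S  = V1^2 + V2^2;                % keep only points inside the unit circle
      end
      F = sqrt(-2*sigma^2*log(S)/S);       % common factor from step 4
      X(k) = V1*F;                         % independent N(0, sigma^2) pair
      Y(k) = V2*F;
  end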

39
Establishing a Given Correlation Coefficient (1)
  • Assume two Gaussian RVs X and Y that are zero mean and uncorrelated
  • Define a new RV
  •   Z = ρX + sqrt(1 - ρ^2) Y
  • Since Z is a linear combination of Gaussian RVs, Z is also a Gaussian RV
  • Show that ρ is the correlation coefficient relating X and Z

40
Establishing a Given Correlation Coefficient (2)
  • Mean, variance, and correlation coefficient:
  •   E{Z} = 0 and Var(Z) = ρ^2 σ_X^2 + (1 - ρ^2) σ_Y^2, which equals σ^2 when σ_X = σ_Y = σ

41
Establishing a Given Correlation Coefficient (3)
  • Covariance between X and Z:
  •   E{XZ} = E{X(ρX + sqrt(1 - ρ^2) Y)} = ρ E{X^2} = ρ σ^2
  • So the correlation coefficient between X and Z is ρ, as desired (a sketch follows)
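A sketch generating a pair of Gaussian sequences with a target correlation coefficient; ρ = 0.7 is an assumed example value:

  rho = 0.7;                        % assumed target correlation coefficient
  N = 100000;
  X = randn(1,N);                   % zero-mean, uncorrelated Gaussian inputs
  Y = randn(1,N);
  Z = rho*X + sqrt(1 - rho^2)*Y;    % Z is Gaussian and correlated with X
  r = corrcoef(X, Z);
  disp(r(1,2))                      % sample estimate, close to 0.7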

42
Pseudonoise (PN) Sequence Generators
  • A PN generator produces a periodic sequence that appears to be random
  • Generated by an algorithm using an initial seed
  • Although not random, the sequence can pass many tests of randomness
  • Unless the algorithm and seed are known, the sequence is impractical to predict

43
PN Generator implementation
44
Property of the Linear Feedback Shift Register (LFSR)
  • Nearly random with a long period
  • An m-stage register may have the maximum period 2^m - 1
  • If the output has period 2^m - 1, it is called a maximal-length sequence or m-sequence
  • We define the generator polynomial as
  •   g(D) = 1 + g_1 D + g_2 D^2 + ... + g_m D^m
  • Coefficients that generate an m-sequence can always be found (a sketch follows)
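A sketch (assumed implementation) of a small LFSR using the primitive polynomial 1 + D + D^4, which gives an m-sequence of period 2^4 - 1 = 15:

  m = 4;
  reg = [1 0 0 0];                        % nonzero seed: [a(n-1) a(n-2) a(n-3) a(n-4)]
  N = 2^m - 1;                            % period of the m-sequence
  seq = zeros(1,N);
  for n = 1:N
      newbit = mod(reg(1) + reg(4), 2);   % a(n) = a(n-1) + a(n-4) mod 2
      seq(n) = newbit;
      reg = [newbit reg(1:3)];            % shift the register
  end
  disp(seq)                               % contains 8 ones and 7 zeros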

45
Example of PN generator
46
Different seed for the PN generator
47
Family of M-sequences
48
Property of m-sequences
  • An m-sequence of period N = 2^m - 1 has 2^{m-1} ones and 2^{m-1} - 1 zeros
  • The periodic autocorrelation of an m-sequence is
  •   R[k] = 1 for k = 0, ±N, ±2N, ..., and R[k] = -1/N otherwise
  • If the PN sequence has a large period, the autocorrelation function
    approaches an impulse, and the PSD is approximately white, as desired

49
PN Autocorrelation Function
50
Signal Processing
  • Relationships between the input and output of a linear system:
  • 1. Mean of input and output
  • 2. Variance of input and output
  • 3. Input-output cross-correlation
  • 4. Autocorrelation and PSD

51
Input/Output Means
  • Assume the system is linear, so the output is a convolution:
  •   y[n] = Σ_k h[k] x[n-k]
  • Under the stationarity assumption, E{x[n]} = E{x} for all n
  • Taking the expectation of the convolution gives
  •   E{y} = E{x} Σ_k h[k]  (a sketch follows)
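A quick numerical sketch (the impulse response and input are assumed examples) checking that the output mean equals the input mean times the sum of the impulse response:

  h = [0.5 0.3 0.2];                 % assumed impulse response of a linear system
  x = 1 + randn(1,100000);           % stationary input with mean 1
  y = filter(h, 1, x);               % y[n] = sum_k h[k] x[n-k]
  disp([mean(y) mean(x)*sum(h)])     % the two values should agree closely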

52
Input/Output Cross-Correlation
  • The cross-correlation is defined by
  •   R_{xy}[m] = E{x[n] y[n+m]}
  • This is used in the development of a number of performance estimators,
    which are developed in Chapter 8

53
Output Autocorrelation Function (1)
  • Autocorrelation of the output:
  •   R_y[m] = E{y[n] y[n+m]} = Σ_k Σ_l h[k] h[l] R_x[m + k - l]
  • This cannot be simplified without knowledge of the statistics of the input x[n]

54
Output Autocorrelation Function (2)
  • If the input is delta-correlated (i.e. white noise), R_x[m] = σ_x^2 δ[m]
  • Substituting into the previous equation gives
  •   R_y[m] = σ_x^2 Σ_k h[k] h[k+m]

55
Input/Output Variances
  • By definition, the output variance is σ_y^2 = R_y[0] (zero-mean case)
  • Letting m = 0 in the previous result gives σ_y^2 = Σ_k Σ_l h[k] h[l] R_x[k - l]
  • But if x[n] is a white noise sequence, σ_y^2 = σ_x^2 Σ_k h[k]^2

56
  • The End
  • Thanks for listening