PART 3 Random Processes
1
PART 3Random Processes
Huseyin Bilgekul
Eeng571 Probability and Stochastic Processes
Department of Electrical and Electronic Engineering
Eastern Mediterranean University
2
Random Processes
3
Kinds of Random Processes
4
Random Processes
  • A RANDOM VARIABLE X is a rule for assigning to every outcome ω of an experiment a number X(ω).
  • Note: X denotes a random variable and X(ω) denotes a particular value.
  • A RANDOM PROCESS X(t) is a rule for assigning to every ω a function X(t, ω).
  • Note: for notational simplicity we often omit the dependence on ω.

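The distinction above (fixed ω gives a time function; fixed t gives a random variable) can be illustrated with a small sketch. This is a hypothetical toy example, not from the slides: the outcome ω of a coin flip selects the amplitude of a cosine waveform.

```python
import numpy as np

rng = np.random.default_rng(0)

def X(t, w):
    """Random process: a rule assigning a waveform X(t, w) to each outcome w.

    Toy example (assumption): the outcome w (0 or 1, e.g. a coin flip)
    selects the amplitude of a cosine.
    """
    amplitude = 1.0 if w == 0 else 2.0
    return amplitude * np.cos(t)

t = np.linspace(0, 2 * np.pi, 100)

# For fixed w, X(t, w) is a specific time function (one sample function).
sample_fn = X(t, w=1)

# For fixed t, X(t, w) over random w is a random variable.
w_samples = rng.integers(0, 2, size=1000)
values_at_t0 = np.array([X(0.0, w) for w in w_samples])
print(values_at_t0[:5])
```

At t = 0 the cosine equals 1, so the random variable X(0, ω) takes only the two amplitude values, each with probability 1/2.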
5
Conceptual Representation of RP
6
Ensemble of Sample Functions
The set of all possible functions is called the
ENSEMBLE.
7
Random Processes
  • A general Random or Stochastic Process can be
    described as
  • Collection of time functions (signals)
    corresponding to various outcomes of random
    experiments.
  • Collection of random variables observed at
    different times.
  • Examples of random processes in communications
  • Channel noise,
  • Information generated by a source,
  • Interference.

8
Random Processes
Let ζ denote the random outcome of an experiment. To every such outcome suppose a waveform X(t, ζ) is assigned. The collection of such waveforms forms a stochastic process. The set {ζ_k} and the time index t can be continuous or discrete (countably infinite or finite) as well. For fixed ζ ∈ S (the set of all experimental outcomes), X(t, ζ) is a specific time function. For fixed t, X(t, ζ) is a random variable. The ensemble of all such realizations X(t, ζ) over time represents the stochastic process X(t).
9
Random Process for a Continuous Sample Space
10
Random Processes
11
Wiener Process Sample Function
12
(No Transcript)
13
Sample Sequence for Random Walk
14
Sample Function of the Poisson Process
15
Random Binary Waveform
16
Autocorrelation Function of the Random Binary
Signal
17
Example
18
(No Transcript)
19
Random Processes Introduction (1)
20
Introduction
  • A random process is a process (i.e., variation in time or one-dimensional space) whose behavior is not completely predictable and can be characterized by statistical laws.
  • Examples of random processes
  • Daily stream flow
  • Hourly rainfall of storm events
  • Stock index

21
Random Variable
  • A random variable is a mapping that assigns outcomes of a random experiment to real numbers. Occurrence of the outcome follows a certain probability distribution. Therefore, a random variable is completely characterized by its probability density function (PDF).

22
STOCHASTIC PROCESS
23
STOCHASTIC PROCESS
24
STOCHASTIC PROCESS
25
STOCHASTIC PROCESS
  • The term "stochastic process" appears mostly in statistics textbooks, while the term "random process" is frequently used in books on engineering applications.

26
STOCHASTIC PROCESS
27
DENSITY OF STOCHASTIC PROCESSES
  • First-order densities of a random process
  • A stochastic process is defined to be completely or totally characterized if the joint densities for the random variables X(t1), X(t2), …, X(tn) are known for all times t1, t2, …, tn and all n.
  • In general, a complete characterization is practically impossible, except in rare cases. As a result, it is desirable to define and work with various partial characterizations. Depending on the objectives of applications, a partial characterization often suffices to ensure the desired outputs.

28
DENSITY OF STOCHASTIC PROCESSES
  • For a specific t, X(t) is a random variable with distribution F(x, t) = P[X(t) ≤ x].
  • The function F(x, t) is defined as the first-order distribution of the random variable X(t). Its derivative with respect to x,
    f(x, t) = ∂F(x, t)/∂x,
  • is the first-order density of X(t).

29
DENSITY OF STOCHASTIC PROCESSES
  • If the first-order densities defined for all time t, i.e. f(x, t), are all the same, then f(x, t) does not depend on t and we call the resulting density the first-order density of the random process; otherwise, we have a family of first-order densities.
  • The first-order densities (or distributions) are
    only a partial characterization of the random
    process as they do not contain information that
    specifies the joint densities of the random
    variables defined at two or more different times.

30
MEAN AND VARIANCE OF RP
  • Mean and variance of a random process
  • The first-order density of a random process, f(x, t), gives the probability density of the random variables X(t) defined for all time t. The mean of a random process, mX(t), is thus a function of time specified by
    mX(t) = E[X(t)] = ∫ x f(x, t) dx
  • For the case where the mean of X(t) does not depend on t, we have
    mX(t) = E[X(t)] = mX (a constant)
  • The variance of a random process, also a function of time, is defined by
    σX²(t) = E[(X(t) − mX(t))²] = E[X²(t)] − [mX(t)]²

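The ensemble averages above can be estimated numerically. The sketch below (a hypothetical example, not from the slides) builds an ensemble of realizations of a process with constant mean 2 and unit-variance noise, then estimates mX(t) and σX²(t) by averaging across the ensemble at each t.

```python
import numpy as np

rng = np.random.default_rng(1)

# Ensemble of sample functions: N realizations of X(t) = 2 + W(t),
# where W(t) is zero-mean, unit-variance white noise (toy assumption).
N, T = 5000, 50
ensemble = 2.0 + rng.normal(0.0, 1.0, size=(N, T))

# Mean of the process m_X(t): average across the ensemble at each t.
m_X = ensemble.mean(axis=0)

# Variance of the process sigma_X^2(t): also an ensemble average at each t.
var_X = ensemble.var(axis=0)

print(m_X[:3], var_X[:3])
```

Because this toy process is stationary, both estimates hover near the constants 2 and 1 for every t; for a nonstationary process they would trace out genuine functions of time.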
31
HIGHER ORDER DENSITY OF RP
  • Second-order densities of a random process
  • For any pair of random variables X(t1) and X(t2), we define the second-order densities of a random process as f(x1, x2; t1, t2) or F(x1, x2; t1, t2).
  • Nth-order densities of a random process
  • The nth-order density functions for X(t1), X(t2), …, X(tn) at times t1, t2, …, tn are given by
    f(x1, x2, …, xn; t1, t2, …, tn) or F(x1, x2, …, xn; t1, t2, …, tn).

32
Autocorrelation function of RP
  • Given two random variables X(t1) and X(t2), a measure of the linear relationship between them is specified by E[X(t1)X(t2)]. For a random process, t1 and t2 go through all possible values; therefore E[X(t1)X(t2)] can change and is a function of t1 and t2. The autocorrelation function of a random process is thus defined by
    RX(t1, t2) = E[X(t1)X(t2)]

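The expectation E[X(t1)X(t2)] can be approximated by averaging over many realizations. The sketch below uses a hypothetical process X(t) = A·cos(t) with A ~ N(0, 1), for which RX(t1, t2) = cos(t1)·cos(t2) analytically, and compares that to the ensemble estimate.

```python
import numpy as np

rng = np.random.default_rng(2)

# Estimate R_X(t1, t2) = E[X(t1) X(t2)] by ensemble averaging.
# Toy process (assumption): X(t) = A * cos(t) with A ~ N(0, 1),
# so R_X(t1, t2) = E[A^2] * cos(t1) * cos(t2) = cos(t1) * cos(t2).
N = 20000
t1, t2 = 0.5, 1.2
A = rng.normal(0.0, 1.0, size=N)
x_t1 = A * np.cos(t1)
x_t2 = A * np.cos(t2)

R_est = np.mean(x_t1 * x_t2)       # ensemble average of the product
R_true = np.cos(t1) * np.cos(t2)   # analytical autocorrelation
print(R_est, R_true)
```

With 20,000 realizations the two numbers agree to a few decimal places, which illustrates why RX is a function of the pair (t1, t2) rather than a single number.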
33
Autocovariance Functions of RP
34
Stationarity of Random Processes
  • Strict-sense stationarity seldom holds for random
    processes, except for some Gaussian processes.
    Therefore, weaker forms of stationarity are
    needed.

35
Stationarity of Random Processes
36
Wide Sense Stationarity (WSS) of Random Processes
37
Equality and Continuity of RP
  • Equality
  • Note that x(t, ωi) = y(t, ωi) for every ωi is not the same as x(t, ωi) = y(t, ωi) with probability 1.

38
Equality and Continuity of RP
39
Mean Square Equality of RP
  • Mean square equality: x(t) and y(t) are equal in the mean square sense if E[|x(t) − y(t)|²] = 0 for every t.

40
Equality and Continuity of RP
41
(No Transcript)
42
Random Processes Introduction (2)
43
Stochastic Continuity
44
Stochastic Continuity
45
Stochastic Continuity
46
Stochastic Continuity
47
Stochastic Continuity
48
Stochastic Continuity
49
Stochastic Convergence
  • A random sequence or a discrete-time random process is a sequence of random variables X1(ω), X2(ω), …, Xn(ω), …, ω ∈ Ω.
  • For a specific ω, {Xn(ω)} is a sequence of numbers that might or might not converge. The notion of convergence of a random sequence can be given several interpretations.

50
Sure Convergence (Convergence Everywhere)
  • The sequence of random variables {Xn(ω)} converges surely to the random variable X(ω) if the sequence of functions Xn(ω) converges to X(ω) as n → ∞ for all ω ∈ Ω, i.e.,
  • Xn(ω) → X(ω) as n → ∞ for all ω ∈ Ω.

51
Stochastic Convergence
52
Stochastic Convergence
53
Almost-sure convergence (Convergence with
probability 1)
54
Almost-sure Convergence (Convergence with
probability 1)
55
Mean-square Convergence
56
Convergence in Probability
57
Convergence in Distribution
58
Remarks
  • Convergence with probability one applies to the
    individual realizations of the random process.
    Convergence in probability does not.
  • The weak law of large numbers is an example of
    convergence in probability.
  • The strong law of large numbers is an example of
    convergence with probability 1.
  • The central limit theorem is an example of
    convergence in distribution.

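The weak law of large numbers, cited above as an example of convergence in probability, can be checked empirically. The sketch below (a hypothetical experiment, not from the slides) measures the fraction of realizations whose sample mean deviates from the true mean by more than ε, for a small and a large sample size n.

```python
import numpy as np

rng = np.random.default_rng(3)

# WLLN: the sample mean of iid variables converges in probability to the
# true mean, i.e. P(|mean_n - mu| > eps) -> 0 as n -> infinity.
mu, eps = 0.5, 0.05
trials = 2000

def deviation_fraction(n):
    """Fraction of realizations whose sample mean misses mu by more than eps."""
    samples = rng.random(size=(trials, n))  # iid Uniform(0, 1), mean 0.5
    means = samples.mean(axis=1)
    return np.mean(np.abs(means - mu) > eps)

frac_small_n = deviation_fraction(10)
frac_large_n = deviation_fraction(1000)
print(frac_small_n, frac_large_n)
```

The deviation fraction is substantial for n = 10 but essentially vanishes for n = 1000, which is exactly the "in probability" mode of convergence: a statement about probabilities of deviation, not about any individual realization.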
59
Weak Law of Large Numbers (WLLN)
60
Strong Law of Large Numbers (SLLN)
61
The Central Limit Theorem
62
Venn Diagram of Relation of Types of Convergence
Note that even sure convergence may not imply
mean square convergence.
63
Example
64
Example
65
Example
66
Example
67
Ergodic Theorem
68
Ergodic Theorem
69
The Mean-Square Ergodic Theorem
70
The Mean-Square Ergodic Theorem
  • The above theorem shows that one can expect a sample average to converge to a constant in the mean square sense if and only if the average of the means converges and the memory dies out asymptotically, that is, if the covariance decreases as the lag increases.

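The "memory dies out" condition can be illustrated numerically. The sketch below (a hypothetical example under the stated assumptions) simulates an AR(1) process X[n] = a·X[n−1] + W[n] with |a| < 1, whose covariance decays geometrically with lag, and checks that a single long time average lands near the ensemble mean of 0.

```python
import numpy as np

rng = np.random.default_rng(4)

# AR(1) process: X[n] = a * X[n-1] + W[n], W[n] iid N(0, 1), |a| < 1.
# Its covariance decays as a**lag, so the memory dies out and the
# time average should converge (in mean square) to the ensemble mean 0.
a, T = 0.8, 200000
w = rng.normal(0.0, 1.0, size=T)
x = np.empty(T)
x[0] = w[0]
for n in range(1, T):
    x[n] = a * x[n - 1] + w[n]

time_average = x.mean()
print(time_average)
```

With a = 0.8 the correlations are strong but summable, so the time average still settles near 0; if a were taken to 1 (a random walk), the memory would not die out and the time average would not converge to a constant.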
71
Mean-Ergodic Process
72
Strong or Individual Ergodic Theorem
73
Strong or Individual Ergodic Theorem
74
Strong or Individual Ergodic Theorem
75
Examples of Stochastic Processes
  • iid random process
  • A discrete-time random process X(t), t = 1, 2, …, is said to be independent and identically distributed (iid) if any finite number, say k, of the random variables X(t1), X(t2), …, X(tk) are mutually independent and have a common cumulative distribution function FX(x).

76
iid Random Stochastic Processes
  • The joint cdf for X(t1), X(t2), …, X(tk) is given by
    F(x1, x2, …, xk) = FX(x1) FX(x2) ⋯ FX(xk)
  • It also yields
    p(x1, x2, …, xk) = p(x1) p(x2) ⋯ p(xk)
  • where p(x) represents the common probability mass function.

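The product form of the joint pmf can be verified by simulation. The sketch below (a hypothetical check using a Bernoulli(0.3) process) compares the empirical probability of one joint outcome against the product of the marginal probabilities.

```python
import numpy as np

rng = np.random.default_rng(5)

# For an iid process the joint pmf factors:
# P(X1 = x1, ..., Xk = xk) = p(x1) * ... * p(xk).
# Check with a Bernoulli(p) process and the joint outcome (1, 0, 1).
p = 0.3
N, k = 200000, 3
samples = rng.random(size=(N, k)) < p   # N realizations of (X1, X2, X3)

# Empirical probability of the joint event (1, 0, 1).
joint_emp = np.mean((samples == [True, False, True]).all(axis=1))
joint_theory = p * (1 - p) * p          # product of marginals
print(joint_emp, joint_theory)
```

The empirical frequency matches p·(1−p)·p = 0.063 to within sampling error, which is the factorization the slide states.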
77
Bernoulli Random Process
78
Random walk process
79
Random walk process
  • Let π0 denote the probability mass function of X0. The joint probability of X0, X1, …, Xn is
    P(X0 = x0, X1 = x1, …, Xn = xn) = π0(x0) P(X1 = x1 | X0 = x0) ⋯ P(Xn = xn | Xn−1 = xn−1)

80
Random walk process
81
Random walk process
  • The property
    P(Xn+1 = xn+1 | Xn = xn, …, X0 = x0) = P(Xn+1 = xn+1 | Xn = xn)
  • is known as the Markov property.
  • A special case of the random walk is the Brownian motion.

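The random walk described above can be simulated in a few lines. This is a minimal sketch: ±1 steps with probability 1/2 each, starting from the origin, so each new position depends only on the current one (the Markov property).

```python
import numpy as np

rng = np.random.default_rng(6)

# Random walk starting at 0: X[n] = X[n-1] + S[n], where the steps
# S[n] are +1 or -1 with probability 1/2 each (iid).
n_steps = 1000
steps = rng.choice([-1, 1], size=n_steps)
walk = np.concatenate([[0], np.cumsum(steps)])

print(walk[:10])
```

Every increment of the path is exactly ±1, and the full trajectory is determined by the cumulative sum of the steps, which is why conditioning on the entire past gives no more information than conditioning on the current position.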
82
Gaussian process
  • A random process X(t) is said to be a Gaussian random process if all finite collections of the random process, X1 = X(t1), X2 = X(t2), …, Xk = X(tk), are jointly Gaussian random variables for all k and all choices of t1, t2, …, tk.
  • Joint pdf of jointly Gaussian random variables X1, X2, …, Xk:
    f(x) = (2π)^(−k/2) |C|^(−1/2) exp(−(1/2)(x − m)ᵀ C⁻¹ (x − m))
  • where m is the mean vector and C is the covariance matrix.

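The joint Gaussian pdf above can be evaluated directly from the mean vector and covariance matrix. The sketch below implements the standard multivariate normal density formula (the function name is mine, not from the slides) and sanity-checks it against the known one-dimensional standard normal value at x = 0.

```python
import numpy as np

def gaussian_joint_pdf(x, mean, cov):
    """Joint pdf of k jointly Gaussian random variables:
    f(x) = (2*pi)**(-k/2) * det(C)**(-1/2)
           * exp(-0.5 * (x - m)^T C^{-1} (x - m))
    """
    x = np.atleast_1d(x).astype(float)
    mean = np.atleast_1d(mean).astype(float)
    cov = np.atleast_2d(cov).astype(float)
    k = x.size
    diff = x - mean
    norm = (2 * np.pi) ** (-k / 2) * np.linalg.det(cov) ** (-0.5)
    return norm * np.exp(-0.5 * diff @ np.linalg.inv(cov) @ diff)

# Sanity check: for k = 1, a standard normal at x = 0 has density
# 1 / sqrt(2*pi) ~ 0.3989.
val = gaussian_joint_pdf([0.0], [0.0], [[1.0]])
print(val)
```

For larger k the same function evaluates the joint density of any finite collection X(t1), …, X(tk), given its mean vector and covariance matrix.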
83
Gaussian process
84
Time series AR random process
85
The Brownian motion (one-dimensional, also known
as random walk)
  • Consider a particle that moves randomly on the real line.
  • Suppose at small time intervals τ the particle jumps a small distance Δ, randomly and equally likely to the left or to the right.
  • Let XΔ(t) be the position of the particle on the real line at time t.

86
The Brownian motion
  • Assume the initial position of the particle is at the origin, i.e. XΔ(0) = 0.
  • The position of the particle at time t can be expressed as
    XΔ(t) = Δ (X1 + X2 + ⋯ + X⌊t/τ⌋)
  • where X1, X2, … are independent random variables, each having probability 1/2 of equaling 1 and −1.
  • (⌊t/τ⌋ represents the largest integer not exceeding t/τ.)

87
Distribution of XΔ(t)
  • Let the step length equal Δ = √τ; then
    XΔ(t) = √τ (X1 + X2 + ⋯ + X⌊t/τ⌋).
  • For fixed t, if τ is small then the distribution of XΔ(t) is approximately normal with mean 0 and variance t, i.e., XΔ(t) ~ N(0, t).

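The scaling Δ = √τ is exactly what makes the variance come out to t, and this is easy to confirm by simulation. The sketch below generates many realizations of the scaled walk XΔ(t) for t = 2 and checks that the sample mean is near 0 and the sample variance is near t.

```python
import numpy as np

rng = np.random.default_rng(7)

# Scaled random walk: X_delta(t) = sqrt(tau) * (X1 + ... + X_floor(t/tau)),
# with Xi = +1 or -1 equally likely. For small tau, X_delta(t) ~ N(0, t).
tau, t = 0.001, 2.0
n = int(t / tau)                       # floor(t / tau) steps
N = 20000                              # number of realizations

steps = rng.choice([-1.0, 1.0], size=(N, n))
X_delta_t = np.sqrt(tau) * steps.sum(axis=1)

print(X_delta_t.mean(), X_delta_t.var())
```

Each ±1 step has variance 1, so the sum of ⌊t/τ⌋ = 2000 steps has variance 2000, and multiplying by √τ scales the variance by τ to give exactly t = 2; the simulation reproduces this, and a histogram of X_delta_t would look Gaussian, consistent with the central limit theorem.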
88
Graphical Illustration of the Distribution of XΔ(t)
89
  • If t and h are fixed and τ is sufficiently small, then the displacement XΔ(t + h) − XΔ(t) is approximately normal with mean 0 and variance h.
90
Graphical Distribution of the Displacement XΔ(t + h) − XΔ(t)
  • The random variable XΔ(t + h) − XΔ(t) is normally distributed with mean 0 and variance h, i.e. XΔ(t + h) − XΔ(t) ~ N(0, h).

91
  • The variance of XΔ(t) depends on t, while the variance of the displacement XΔ(t + h) − XΔ(t) does not.
  • If (t1, t2) and (t3, t4) are non-overlapping time intervals, then
    XΔ(t2) − XΔ(t1) and XΔ(t4) − XΔ(t3)
  • are independent random variables.

92
(No Transcript)
93
Covariance and Correlation Functions of RP