STAT131 W12L1a Markov Chains - PowerPoint PPT Presentation


Transcript and Presenter's Notes

1
STAT131 W12L1a Markov Chains
  • by
  • Anne Porter
  • alp@uow.edu.au

2
Lecture Outline
  • Naming conventions
  • Matrices
    • Definition
    • Multiplication
  • Probability
  • Markov Chains
    • Definition
    • Examples

3

Naming Conventions
[Slide figure: a square in a grid named by its row m and column n, e.g. square m=1, n=2]
4

Naming Conventions
[Slide figure: shorthand names for the same square, e.g. square12, s12 or c12]
5
Definition 1: Matrix
  • An m × n matrix is a rectangular array of
    elements, with m rows and n columns, written
    with the element aij in row i and column j.
6
Example Matrix elements
  • An m x n matrix is a rectangular array of
    elements, with m rows and n columns.
  • Given
  • (a) What is b32 ?

b32 = 1 (the element in row 3, column 2)
7
Example Matrix elements
  • An m x n matrix is a rectangular array of
    elements, with m rows and n columns.
  • Given
  • (a) What is b13 ?

b13 = 1 (the element in row 1, column 3)
8
Definition 2: Order of a matrix
  • An m x n matrix is said to be of order (or size)
    m x n.

Example: If A and B are the matrices shown on the slide,
(a) What is the size of A? (b) What is the size of B?

A is 2×3
B is 3×3
9
Matrix multiplication
  • Two matrices A and B can be multiplied together only
    if the number of columns of A is equal to the
    number of rows of B.
  • An example: A of order 2×3 and B of order 3×3

A is of order 2×3 (2 rows × 3 columns)
B is of order 3×3 (3 rows × 3 columns)
The number of columns of A equals the number of rows of B,
hence these matrices can be multiplied
10
Definition 3
  • If the (i,j)th elements of A and B are aij and
    bij respectively, then the (i,j)th element of C = AB is
  • cij = ai1 b1j + ai2 b2j + ... + ain bnj

11
Evaluating C = A × B
  • where A and B are the matrices above

C11 = a11 b11 + a12 b21 + a13 b31
    = 1×2 + 2×1 + 3×3 = 13
12
Evaluating C = A × B
  • where A and B are the matrices above

C11 = a11 b11 + a12 b21 + a13 b31 = 13
C21 = a21 b11 + a22 b21 + a23 b31
    = 2×2 + 3×1 + 1×3 = 10
13
Evaluating C = A × B
  • where A and B are the matrices above

C21 = a21 b11 + a22 b21 + a23 b31 = 10
C12 = a11 b12 + a12 b22 + a13 b32
    = 1×0 + 2×1 + 3×1 = 5
14
Evaluating C = A × B
  • where A and B are the matrices above

C22 = a21 b12 + a22 b22 + a23 b32
    = 2×0 + 3×1 + 1×1 = 4
C21 = a21 b11 + a22 b21 + a23 b31 = 10
C12 = a11 b12 + a12 b22 + a13 b32 = 5
15
Evaluating C = A × B
  • where A and B are the matrices above

C22 = 2×0 + 3×1 + 1×1 = 4
C21 = a21 b11 + a22 b21 + a23 b31 = 10
C13 = a11 b13 + a12 b23 + a13 b33
    = 1×1 + 2×1 + 3×2 = 9
C12 = a11 b12 + a12 b22 + a13 b32 = 5
16
Evaluating C = A × B
  • where A and B are the matrices above

C23 = a21 b13 + a22 b23 + a23 b33
    = 2×1 + 3×1 + 1×2 = 7
17
Evaluating C = A × B: collecting the elements gives the 2×3 matrix
C = [ 13  5  9 ]
    [ 10  4  7 ]
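As a check, the element-by-element calculation above can be reproduced with a short plain-Python sketch (a minimal illustration; the helper name matmul is my own):

```python
# The 2x3 and 3x3 matrices used in the slides above
A = [[1, 2, 3],
     [2, 3, 1]]
B = [[2, 0, 1],
     [1, 1, 1],
     [3, 1, 2]]

def matmul(A, B):
    """C[i][j] = sum over k of A[i][k] * B[k][j]."""
    assert len(A[0]) == len(B), "columns of A must equal rows of B"
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

C = matmul(A, B)
print(C)  # [[13, 5, 9], [10, 4, 7]]
```

Each entry agrees with the values computed one cell at a time on the slides.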
18
Multiply
  • Size: 2×2 times 2×2

The product is a new 2×2 matrix
19
Multiply
  • Size: 1×2 and 3×2

Cannot be multiplied: the number of columns of the first matrix
is not equal to the number of rows of the second
20
Probability Example: Weather (Source: Griffiths, additional notes)
  • Starting on a Wednesday the weather has an
    initial probability of 0.8 of being 'fine' and
    0.2 of 'rain'. If the weather is fine on any day
    then the conditional probability that it will be
    fine the next day is 0.7, whereas if it rains on
    one day the conditional probability of it being
    fine the next is 0.4.

(1) Represent this information in a tree diagram.
(2) Determine P(fine on Thursday).
(3) Determine P(rain on Thursday).
21
Probability Using Tree Diagrams
  • Wednesday → Thursday

[Slide figure: tree diagram]
P(fineW) = 0.8
  P(fineT | fineW) = 0.7 → P(fineW and fineT)
  P(rainT | fineW) = 0.3 → P(fineW and rainT)
P(rainW) = 0.2
  P(fineT | rainW) = 0.4 → P(rainW and fineT)
  P(rainT | rainW) = 0.6 → P(rainW and rainT)
22
Using the definition of conditional probability
  • Given P(fineW) and P(fineT | fineW),
  • how do we find P(fineW and fineT)?
  • P(fineT | fineW) = P(fineW and fineT) / P(fineW)
  • Hence P(fineW) × P(fineT | fineW) = P(fineW and fineT)

23
Probability Using Tree Diagrams
Wednesday → Thursday

[Slide figure: tree diagram with joint probabilities]
P(fineW) = 0.8
  P(fineT | fineW) = 0.7 → P(fineW and fineT) = 0.8 × 0.7 = 0.56
  P(rainT | fineW) = 0.3 → P(fineW and rainT) = 0.8 × 0.3 = 0.24
P(rainW) = 0.2
  P(fineT | rainW) = 0.4 → P(rainW and fineT) = 0.2 × 0.4 = 0.08
  P(rainT | rainW) = 0.6 → P(rainW and rainT) = 0.2 × 0.6 = 0.12
24
Probability Using Tree Diagrams
  • What is the probability that it will rain on
    Thursday?

0.24 + 0.12 = 0.36

  • What is the probability that
  • it will be fine on Thursday?

0.56 + 0.08 = 0.64

  • What is the probability it will
  • rain or be fine on Thursday?

0.36 + 0.64 = 1.00
25
Probability: Law of Total Probability
  • What is the probability that it will rain on
    Thursday?

P(rainT) = P(fineW) × P(rainT | fineW)
         + P(rainW) × P(rainT | rainW)
         = 0.24 + 0.12 = 0.36
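The same total-probability calculation can be written out in a few lines of Python (numbers taken from the slides; the variable names are my own):

```python
# Wednesday probabilities and the day-to-day conditionals from the slides
p_fineW, p_rainW = 0.8, 0.2
p_fine_after_fine = 0.7   # P(fineT | fineW)
p_fine_after_rain = 0.4   # P(fineT | rainW)

# Law of total probability, summing over the two Wednesday states
p_rainT = p_fineW * (1 - p_fine_after_fine) + p_rainW * (1 - p_fine_after_rain)
p_fineT = p_fineW * p_fine_after_fine + p_rainW * p_fine_after_rain
print(round(p_rainT, 2), round(p_fineT, 2))  # 0.36 0.64
```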
26
Markov Chains: Context
  • In contrast to coin tossing, which is a sequence
    of independent events, there are processes that
    change over time. Stochastic processes (or
    random or chance processes) can often be
    modelled by a sequence of dependent experiments.
    Here we will consider one special case of
    experimental dependence.

27
Definition: Markov Chain
  • A Markov Chain (or Markov Process) exists if the
    following conditions are satisfied:
  • 1. There is a finite number of states of the
    experimental system, and the system is in exactly
    one of these states after each repetition of the
    experiment. The different states are denoted by
    E1, E2, ..., En, where each repetition of the
    experiment has to result in one of these states.
  • 2. The state of the process after a repetition of
    the experiment depends (probabilistically) only on
    the state of the process immediately after the
    previous experiment, but not on the states after
    earlier experiments. That is, the process has no
    memory of the past beyond the previous experiment.
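The "no memory beyond the previous experiment" condition can be illustrated with a small simulation of the weather chain from these slides (a sketch; the helper names step and simulate are my own):

```python
import random

# One-step transition probabilities from the weather example:
# the next state depends only on the current state (the Markov property)
P = {"fine": {"fine": 0.7, "rain": 0.3},
     "rain": {"fine": 0.4, "rain": 0.6}}

def step(state, rng):
    # The draw uses only the current state's row of P,
    # never the earlier history of the path
    return "fine" if rng.random() < P[state]["fine"] else "rain"

def simulate(start, n_days, seed=0):
    rng = random.Random(seed)
    path = [start]
    for _ in range(n_days):
        path.append(step(path[-1], rng))
    return path

print(simulate("fine", 5))
```

Each simulated day is drawn from the row of P for the current state alone, which is exactly condition 2 of the definition.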

28
Example 1: Markov Chain (Source: Griffiths weather example, additional notes)
  • Starting on a Wednesday the weather has an
    initial probability of 0.8 of being 'fine' and
    0.2 of 'rain'. If the weather is fine on any day
    then the conditional probability that it will be
    fine the next day is 0.7, whereas if it rains on
    one day the conditional probability of it being
    fine the next is 0.4.
  • (1) What are the states of this system?

S = {fine, rain}
29
Example 2 Markov Chain
  • Rules for Snakes and Ladders
  • If you land on the bottom of the ladder,
    automatically go to the top
  • If you land on the snake head automatically slide
    down to its tail
  • You must land exactly on square 7 to finish; if
    your move would take you beyond square 7, then
    you cannot take the move, so you remain on the
    same square.
  • (1) What is the state space?

S = {0, 1, 3, 5, 7}
30
To describe a Markov Chain
  • Two sets of probabilities must be known:
  • the initial probability vector, and
  • the transition probability matrix

31
Initial probability vector
  • The initial probability vector p0 describes the
    initial state (S) of the process:

p0 = [ P(initial state is E1), P(initial state is E2), ...,
       P(initial state is En) ]

  • If the initial state is known, the initial vector
  • will have one of the probabilities equal to 1 and
  • the rest equal to 0.

32
Example 1: Markov Chain (Source: Griffiths weather example, additional notes)
  • Starting on a Wednesday the weather has an
    initial probability of 0.8 of being 'fine' and
    0.2 of 'rain'. If the weather is fine on any day
    then the conditional probability that it will be
    fine the next day is 0.7, whereas if it rains on
    one day the conditional probability of it being
    fine the next is 0.4.
  • (1) What is the initial probability vector to
    start?

       Fine  Rain
p0 = [ 0.8   0.2 ]
33
Example 2 Markov Chain
  • Rules
  • If you land on the bottom of the ladder,
    automatically go to the top
  • If you land on the snake head automatically slide
    down to its tail
  • You must land exactly on square 7 to finish; if
    your move would take you beyond square 7, then
    you cannot take the move, so you remain on the
    same square.
  • (2) If we start on 0 in snakes and ladders what
    is the initial vector?

States:  0  1  3  5  7
p0 = [   1  0  0  0  0 ]
34
Transition probability matrix
  • The (conditional) probability that the process
    moves from state i to state j is called a
    (one-step) transition probability, and is
    denoted by pij; that is,
  • pij = P(Ej next | Ei before)
  • It is usual to display the values in an m (rows)
    × m (columns) matrix, that is, a square matrix.

35
Transition probability matrix
[Slide figure: m × m matrix with rows labelled "before
state" 1, 2, ..., m and columns labelled "after state"
1, 2, ..., m]

pij = P(Ej next | Ei before)
36
Example 1: Markov Chain (Source: Griffiths weather example, additional notes)
  • Starting on a Wednesday the weather has an
    initial probability of 0.8 of being 'fine' and
    0.2 of 'rain'. If the weather is fine on any day
    then the conditional probability that it will be
    fine the next day is 0.7, whereas if it rains on
    one day the conditional probability of it being
    fine the next is 0.4.
  • (1) What is the
  • transition matrix?

              End: Fine  Rain
Start Fine:        0.7   0.3
Start Rain:        0.4   0.6
37
Example 2 Markov Chains
  • Rules for Snakes and Ladders
  • If you land on the bottom of the ladder,
    automatically go to the top
  • If you land on the snake head automatically slide
    down to its tail
  • You must land exactly on square 7 to finish; if
    your move would take you beyond square 7, then
    you cannot take the move, so you remain on the
    same square.
  • (3) Represent the conditional probabilities of
    end states given the starting states.

38
Example 2 Markov Chains
  • Transition Matrix [shown on the slide as a 5 × 5 matrix over the states 0, 1, 3, 5, 7]

39
Example 1: Markov Chain (Source: Griffiths weather example, additional notes)
  • Starting on a Wednesday the weather has an
    initial probability of 0.8 of being 'fine' and
    0.2 of 'rain'. If the weather is fine on any day
    then the conditional probability that it will be
    fine the next day is 0.7, whereas if it rains on
    one day the conditional probability of it being
    fine the next is 0.4.
  • This can be represented in Matrix notation (we
    previously did it as a tree diagram). To do this
    we use the Law of Total Probability.

40
Probability Using Tree Diagrams
Wednesday
Thursday
P(fineT) = 0.64, P(rainT) = 0.36
41
Probability: Law of Total Probability
  • What is the probability that it will be fine on
    Thursday?
  • Wet on Thursday?

Represented in matrix form: PB = PA · PB|A, where
PB = [ P(B)  P(not B) ],
PA = [ P(A1)  P(A2) ... P(Am) ],
and PB|A is the matrix of conditional probabilities P(B | Ai).
42
Probability: Law of Total Probability
  • What is the probability that it will be fine on
    Thursday?
  • Wet on Thursday?
  • Initial probability matrix
  • Transition Matrix

43
Probability: Law of Total Probability
Represented in matrix form: PB = PA · PB|A
  • What is the probability that it will be fine on
    Thursday?
  • Wet on Thursday?
  • Initial probability matrix
  • Transition Matrix

PB = [ 0.8  0.2 ] × [ 0.7  0.3 ]
                    [ 0.4  0.6 ]
   = [ 0.64  0.36 ]
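The row-vector-times-matrix product can be checked directly (a minimal sketch using plain lists; variable names are my own):

```python
# Initial probability vector (Wednesday) and the transition matrix
p0 = [0.8, 0.2]        # [P(fine), P(rain)]
P = [[0.7, 0.3],       # row: from fine
     [0.4, 0.6]]       # row: from rain

# p1[j] = sum over i of p0[i] * P[i][j]  (law of total probability)
p1 = [sum(p0[i] * P[i][j] for i in range(2)) for j in range(2)]
print([round(x, 2) for x in p1])  # [0.64, 0.36]
```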
44
Now predict P(fine) and P(Rain on Friday)
  • What was the probability of fine and rain on
    Thursday?

[ 0.64  0.36 ]
  • What is the initial probability vector starting
  • on Thursday?

[ 0.64  0.36 ]
  • What else do we need?

Transition matrix
So [ P(fineF)  P(rainF) ]
   = [ 0.64  0.36 ] × [ 0.7  0.3 ]
                      [ 0.4  0.6 ]
45
Now predict P(fine) and P(Rain on Friday)
  • [ P(fineF)  P(rainF) ] = [ 0.64  0.36 ] × [ 0.7  0.3 ]
                                              [ 0.4  0.6 ]

The size of the resulting matrix will be
1×2
That is
[ P(fineF)  P(rainF) ]

P(fineF) = 0.64×0.7 + 0.36×0.4
46
Now predict P(fine) and P(Rain on Friday)
  • [ P(fineF)  P(rainF) ] = [ 0.64  0.36 ] × [ 0.7  0.3 ]
                                              [ 0.4  0.6 ]

The size of the resulting matrix will be
1×2
That is
[ P(fineF)  P(rainF) ]

P(fineF) = 0.64×0.7 + 0.36×0.4 = 0.592
P(rainF) = 0.64×0.3 + 0.36×0.6 = 0.408
[ 0.592  0.408 ]
The sum of these two values P(fineF) and P(rainF)
should equal 1
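Thursday's result can be fed back in as the new initial vector; iterating the same product step by step gives Friday (a sketch in plain Python; the helper name one_step is my own):

```python
p = [0.8, 0.2]         # Wednesday: [P(fine), P(rain)]
P = [[0.7, 0.3],
     [0.4, 0.6]]

def one_step(p, P):
    """Multiply the 1x2 row vector p by the 2x2 transition matrix P."""
    return [sum(p[i] * P[i][j] for i in range(len(p))) for j in range(len(P[0]))]

for day in ("Thursday", "Friday"):
    p = one_step(p, P)
    print(day, [round(x, 3) for x in p])
# prints Thursday [0.64, 0.36] then Friday [0.592, 0.408]
```

Note that the two Friday values sum to 1, as the slide requires.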
47
Probability: n-step transition
PB = [ 0.8  0.2 ] × P = [ 0.64  0.36 ] = [ P(fineT)  P(rainT) ]
then
[ P(fineT)  P(rainT) ] × P = [ P(fineF)  P(rainF) ]
48
Probability Using Tree Diagrams
Wednesday → Thursday → Friday

[Slide figure: three-day tree diagram; from each state the
fine/rain branches carry probabilities 0.7/0.3 after a fine
day and 0.4/0.6 after a rainy day]

And we would multiply through each branch, then
add all probabilities for fineF and then rainF
49
Example 2 Markov Chain
  • Rules for Snakes and Ladders
  • If you land on the bottom of the ladder,
    automatically go to the top
  • If you land on the snake head automatically slide
    down to its tail
  • You must land exactly on square 7 to finish; if
    your move would take you beyond square 7, then
    you cannot take the move, so you remain on the
    same square.
  • (1) What is the state space?

S = {0, 1, 3, 5, 7}
50
Example 2 Markov Chain
  • Rules
  • If you land on the bottom of the ladder,
    automatically go to the top
  • If you land on the snake head automatically slide
    down to its tail
  • You must land exactly on square 7 to finish; if
    your move would take you beyond square 7, then
    you cannot take the move, so you remain on the
    same square.
  • (2) If we start on 0 in snakes and ladders what
    is the initial vector?

States:  0  1  3  5  7
p0 = [   1  0  0  0  0 ]
51
Example 2 Markov Chains
  • Rules for Snakes and Ladders
  • If you land on the bottom of the ladder,
    automatically go to the top
  • If you land on the snake head automatically slide
    down to its tail
  • You must land exactly on square 7 to finish; if
    your move would take you beyond square 7, then
    you cannot take the move, so you remain on the
    same square.
  • (3) Represent the conditional probabilities of
    end states given the starting states.

52
Example 2 Markov Chains
  • Transition Matrix - homework

53
We will continue...
  • With a musical interlude!