Title: Incorporating New Information into Decision Trees (posterior probabilities)
1. Incorporating New Information into Decision Trees
(posterior probabilities)
- MGS3100 - Chapter 6
- Part 3
2. How We Will Use Bayes' Theorem
Prior information can be based on the results of previous experiments, or on expert opinion, and can be expressed as probabilities. If it is desirable to improve on this state of knowledge, an experiment can be conducted. Bayes' Theorem is the mechanism used to update the state of knowledge with the results of the experiment to provide a posterior distribution.
3. Bayes' Theorem
- Used to revise probabilities based upon new data; the revised probabilities are called posterior probabilities
4. How Bayes' Theorem Works
Let the experiment be A and the prediction be B, and assume that both have occurred. The probability of both A and B together is P(A∩B), or simply P(AB). The law of conditional probability says that this probability can be found as the product of the conditional probability of one, given the other, times the probability of the other. That is:

P(AB) = P(A|B) P(B) = P(B|A) P(A)

Simple algebra shows that

P(B|A) = P(A|B) P(B) / P(A)

This is Bayes' Theorem.
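The identity above can be checked numerically. A minimal sketch in Python (the probability values here are illustrative, not taken from the Sonorola problem):

```python
def bayes(p_a_given_b: float, p_b: float, p_a: float) -> float:
    """Bayes' Theorem: return P(B|A) = P(A|B) * P(B) / P(A)."""
    return p_a_given_b * p_b / p_a

# Illustrative values: P(A|B) = 0.9, P(B) = 0.5, P(A) = 0.6
print(bayes(0.9, 0.5, 0.6))  # 0.75
```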
5. Sequential Decisions
- Would you hire a market research group or a consultant (or a psychic) to get more info about states of nature?
- How would additional info cause you to revise your probabilities of states of nature occurring?
- Draw a new tree depicting the complete problem.
6. Problem: Marketing Cellular Phones
The design and product-testing phase has just been completed for Sonorola's new line of cellular phones.
Three alternatives are being considered for a marketing/production strategy for this product:
1. Aggressive (A)
2. Basic (B)
3. Cautious (C)
Management decides to categorize the level of demand as either strong (S) or weak (W).
7. Here, we reproduce the last slide of the Sonorola problem from lecture slides part 2.
Of the three expected values, choose 12.85, the branch associated with the Basic strategy.
This decision is indicated in TreePlan by the number 2 in the decision node.
8. Marketing Department
- Reports on the state of the market
  - Encouraging
  - Discouraging
9. First, find out the reliability of the source of information (in this case, the marketing research group). Find the conditional probability based on the prior track record.
For two events A and B, the conditional probability P(A|B) is the probability of event A given that event B will occur. For example, P(E|S) is the conditional probability that marketing gives an encouraging report given that the market is in fact going to be strong.
10. If marketing were perfectly reliable, P(E|S) = 1. However, marketing has the following track record in predicting the market:

P(D|S) = 1 - P(E|S) = 0.4  (so P(E|S) = 0.6)
P(E|W) = 1 - P(D|W) = 0.3  (so P(D|W) = 0.7)

Here is the same information displayed in tabular form:

                   Strong (S)   Weak (W)
Encouraging (E)       0.6         0.3
Discouraging (D)      0.4         0.7
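This track record can be held in a small lookup table. A sketch (the variable names are my own), which also checks that each column sums to 1, since given the true market state marketing must issue one of the two reports:

```python
# Reliability of the marketing report, conditioned on the true market state.
# Rows: report (E = encouraging, D = discouraging); columns: state (S, W).
reliability = {
    "E": {"S": 0.6, "W": 0.3},  # P(E|S), P(E|W)
    "D": {"S": 0.4, "W": 0.7},  # P(D|S), P(D|W)
}

# Each column must sum to 1: given the true state, exactly one report occurs.
for state in ("S", "W"):
    total = reliability["E"][state] + reliability["D"][state]
    assert abs(total - 1.0) < 1e-9
```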
11. Calculating the Posterior Probabilities
Suppose that marketing has come back with an encouraging report. Knowing this, what is the probability that the market is in fact strong, P(S|E)?
Note that probabilities such as P(S) and P(W) are initial estimates called prior probabilities. Conditional probabilities such as P(S|E) are called posterior probabilities.
Management has already estimated the prior probabilities as P(S) = 0.45 and P(W) = 0.55.
Now, use Bayes' Theorem (see the appendix for a formal description) to determine the posterior probabilities.
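Using the priors P(S) = 0.45 and P(W) = 0.55 together with the reliability figures P(E|S) = 0.6 and P(E|W) = 0.3 given earlier, the posterior probabilities after an encouraging report can be sketched in Python:

```python
# Priors and reliability values from the slides.
p_S, p_W = 0.45, 0.55                 # P(S), P(W)
p_E_given_S, p_E_given_W = 0.6, 0.3   # P(E|S), P(E|W)

# Total probability of an encouraging report (law of total probability).
p_E = p_E_given_S * p_S + p_E_given_W * p_W   # 0.27 + 0.165 = 0.435

# Bayes' Theorem: P(S|E) = P(E|S) * P(S) / P(E)
p_S_given_E = p_E_given_S * p_S / p_E
p_W_given_E = 1 - p_S_given_E

print(round(p_E, 3), round(p_S_given_E, 3), round(p_W_given_E, 3))
# 0.435 0.621 0.379
```

So an encouraging report raises the probability of a strong market from the prior 0.45 to a posterior of about 0.62.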
13. Appendix: Bayes' Theorem
- Bayes' theorem is a result in probability theory which gives the conditional probability distribution of a random variable A given B in terms of the conditional probability distribution of variable B given A and the marginal probability distribution of A alone.
- In the context of Bayesian probability theory and statistical inference, the marginal probability distribution of A alone is usually called the prior probability distribution, or simply the prior. The conditional distribution of A given the "data" B is called the posterior probability distribution, or just the posterior.