1
Chapter 4, Section 6: Continuous Random Variables
  • The Gamma Distributions
  • The Exponential Distributions

© John J Currano, 03/29/2009
2
Definition. Γ(α) = ∫₀^∞ x^(α−1) e^(−x) dx, defined for α > 0, is called the gamma function.
Properties. 1. If α > 1, then Γ(α) = (α − 1) Γ(α − 1). The proof uses integration by parts: in the integral, let
    u = x^(α−1),   dv = e^(−x) dx,
    du = (α − 1) x^(α−2) dx,   v = −e^(−x).
3
Definition. Γ(α) = ∫₀^∞ x^(α−1) e^(−x) dx, defined for α > 0, is called the gamma function.
Properties.
1. If α > 1, then Γ(α) = (α − 1) Γ(α − 1). Equivalently, if α > 0, then Γ(α + 1) = α Γ(α).
2. If α > 0 and k is a positive integer, then by repeated application of (1),
   Γ(α + k) = (α + k − 1)(α + k − 2) · · · (α + 1) α Γ(α).
3. Γ(1) = 1.
4. Γ(n) = (n − 1)(n − 2) · · · (2)(1) Γ(1) = (n − 1)! for n ≥ 2, n ∈ Z.
   Proof. Let α = 1 and k = n − 1 in (2), and notice that α + k = n.
5. Γ(1/2) = √π.
6. (formula not captured in the transcript)
4
Why is Γ(1/2) = √π? In Γ(1/2) = ∫₀^∞ x^(−1/2) e^(−x) dx, substitute x = z²/2 (so dx = z dz and x^(−1/2) = √2 / z) to obtain
    Γ(1/2) = √2 ∫₀^∞ e^(−z²/2) dz.
Does the last integrand look familiar? It is the variable part of the standard normal density function, (1/√(2π)) e^(−z²/2), which integrates to 1 over (−∞, ∞) and, because of symmetry, to ½ over (0, ∞). So, if we multiply inside the integral by 1/√(2π), and outside by its reciprocal, √(2π), (a standard trick) we obtain
    Γ(1/2) = √2 · √(2π) ∫₀^∞ (1/√(2π)) e^(−z²/2) dz = 2√π · ½ = √π.
5
Gamma(α, β) Distribution. The random variable Y has a gamma distribution with parameters α > 0 and β > 0, written Y ~ Gamma(α, β), if its pdf is
    f(y) = y^(α−1) e^(−y/β) / (β^α Γ(α))  for y > 0,
and f(y) = 0 otherwise.
6
Gamma(α, β) Distribution.
If Y ~ Gamma(α, β), then for 0 < c < d,
    P(c ≤ Y ≤ d) = ∫_c^d y^(α−1) e^(−y/β) / (β^α Γ(α)) dy.
This integral must be approximated using tables or numerical methods unless α is an integer.
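In practice such probabilities are computed numerically; a minimal sketch using scipy's gamma distribution (the parameter values below are illustrative, not from the slides — scipy's `a` is this presentation's α and `scale` is β):

```python
# P(c <= Y <= d) for Y ~ Gamma(alpha, beta), evaluated numerically.
# scipy parameterization: a = alpha (shape), scale = beta.
from scipy.stats import gamma

alpha, beta = 2.5, 3.0        # illustrative, non-integer alpha
c, d = 2.0, 10.0
prob = gamma.cdf(d, a=alpha, scale=beta) - gamma.cdf(c, a=alpha, scale=beta)
print(round(prob, 4))
```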
7
To compute the mean and the variance, we first calculate E(Y^k) for all positive integers k in one fell swoop:
    E(Y^k) = ∫₀^∞ y^k · y^(α−1) e^(−y/β) / (β^α Γ(α)) dy
           = (1 / (β^α Γ(α))) ∫₀^∞ y^(α+k−1) e^(−y/β) dy.
The integrand is the variable part of the Gamma(α + k, β) pdf, so the integral equals β^(α+k) Γ(α + k).
8
Therefore E(Y^k) = β^k Γ(α + k) / Γ(α) = β^k (α + k − 1) · · · (α + 1) α. In particular,
    E(Y) = αβ   and   E(Y²) = α(α + 1) β²,
so  V(Y) = E(Y²) − [E(Y)]² = α(α + 1)β² − α²β² = αβ².
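The moment formula can be checked numerically; a quick sketch with hypothetical parameter values (not from the slides), using scipy's exact moments:

```python
# Check E(Y^k) = beta^k * (alpha+k-1)...(alpha+1)*alpha for Y ~ Gamma(alpha, beta),
# and hence E(Y) = alpha*beta and V(Y) = alpha*beta^2.
from scipy.stats import gamma

alpha, beta = 2.5, 3.0                  # hypothetical parameters
Y = gamma(a=alpha, scale=beta)

for k in (1, 2, 3):
    prod = 1.0
    for j in range(k):                  # rising product (alpha)(alpha+1)...(alpha+k-1)
        prod *= alpha + j
    formula = beta**k * prod
    print(k, round(Y.moment(k), 6), round(formula, 6))

print(round(Y.mean(), 6), alpha * beta)      # E(Y) = alpha*beta
print(round(Y.var(), 6), alpha * beta**2)    # V(Y) = alpha*beta^2
```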
9
Two special cases of the Gamma Distributions are:
1. The Exponential Distribution with parameter β > 0:
   Exponential(β) = Gamma(1, β)    [α = 1].
2. The Chi-Square Distribution with ν degrees of freedom (ν ∈ Z, ν > 0):
   χ²(ν) = Gamma(ν/2, 2)    [α = ν/2 and β = 2].
10
Exponential(β) Distribution
The pdf of Y ~ Exponential(β) is f(y) = (1/β) e^(−y/β) for y > 0. The graph at the left is a typical picture of an exponential pdf.
Note that some authors use the parameter λ = 1/β (λ > 0). With this notation, the pdf is written f(y) = λ e^(−λy) for y > 0.
11
Exponential(β) Distribution
Theorem. If Y ~ Exponential(β), then
1. its cdf is F(y) = P(Y ≤ y) = 1 − e^(−y/β) for y ≥ 0, and F(y) = 0 for y < 0;
2. its survival function is S(y) = P(Y > y) = 1 − F(y) = e^(−y/β) for y ≥ 0.
Note. Exponential distributions are among the few common continuous distributions for which probabilities can be calculated by actually integrating the pdf, so expect to see one on the SOA/CAS exams.
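The closed form really does come from integrating the pdf; a small sketch comparing it with numerical integration (β = 100 is illustrative):

```python
# F(y) = 1 - exp(-y/beta) versus numerical integration of the pdf
# f(y) = (1/beta) * exp(-y/beta) over [0, y].
from math import exp
from scipy.integrate import quad

beta = 100.0
pdf = lambda y: (1.0 / beta) * exp(-y / beta)

for y in (50.0, 100.0, 200.0):
    closed = 1.0 - exp(-y / beta)
    numeric, _ = quad(pdf, 0.0, y)
    print(y, round(closed, 6), round(numeric, 6))
```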
12
Exponential(β) Distribution
Theorem. If Y ~ Exponential(β), then E(Y) = β and V(Y) = β². This follows since Y ~ Gamma(1, β). Using the λ = 1/β parameter, E(Y) = 1/λ and V(Y) = 1/λ².
13
Exponential(β) Distribution
An exponential random variable is memoryless.
Theorem. If Y ~ Exponential(β) and a > 0 and b > 0, then P(Y > a + b | Y > a) = P(Y > b).
Proof. In the text, Example 4.10, page 188. Remember that P(Y > y) = e^(−y/β), so
    P(Y > a + b | Y > a) = P(Y > a + b) / P(Y > a) = e^(−(a + b)/β) / e^(−a/β) = e^(−b/β) = P(Y > b).
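A numerical illustration of memorylessness (the values of β, a, b here are illustrative):

```python
# P(Y > a+b | Y > a) should equal P(Y > b) for an exponential Y.
from math import exp, isclose

beta, a, b = 100.0, 30.0, 50.0
S = lambda y: exp(-y / beta)        # survival function P(Y > y)

conditional = S(a + b) / S(a)       # P(Y > a+b | Y > a)
print(round(conditional, 6), round(S(b), 6))
print(isclose(conditional, S(b)))   # → True
```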
14
  • Two applications of exponential distributions are
    modeling
  • lifetimes of items that do not remember how
    old they are, such as electronic components,
    and
  • the waiting time until the first occurrence,
    or between occurrences, in a Poisson process.
  • The second application leads to a relation
    between the gamma family of distributions and the
    Poisson distributions.
  • Consider events that occur over a period of time,
    such as
  • people entering a particular store
  • alpha particles from a radioactive source
    hitting a Geiger counter
  • telephone calls entering a telephone exchange
  • data packets arriving at a network switch, etc.

15
A counting process is a collection of random variables {N(t) : t ≥ 0}, where N(t) represents the number of events that occur by time t (and thus in the time interval [0, t]). A counting process {N(t) : t ≥ 0} is said to be a Poisson process with rate λ, where λ > 0, if
a. N(0) = 0;
b. the numbers of events occurring in disjoint time intervals are independent; and
c. the number of events in any time interval of length t has a Poisson distribution with mean λt.
Therefore, in a Poisson process with rate λ, the mean number of events in any time interval is proportional to the length of the interval, and the constant of proportionality is λ.
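Property (c) can be illustrated by simulation: generate arrivals with exponential interarrival times and count how many fall in [0, t]; the counts should average about λt. (The rate, horizon, and trial count below are illustrative.)

```python
# Simulate a Poisson process via exponential interarrival times.
import random

def poisson_process_count(rate, t, rng):
    """Count arrivals in [0, t] when interarrival times are Exponential(1/rate)."""
    arrivals, clock = 0, 0.0
    while True:
        clock += rng.expovariate(rate)   # next interarrival time
        if clock > t:
            return arrivals
        arrivals += 1

rng = random.Random(0)
rate, t, trials = 2.0, 5.0, 20000
mean_count = sum(poisson_process_count(rate, t, rng) for _ in range(trials)) / trials
print(round(mean_count, 2))              # should be near rate * t = 10
```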
16
Exponential(β) Distribution
Let Y = the time until the first occurrence in a Poisson process with rate λ. Then if y ≥ 0, the number of occurrences in the interval [0, y] has a Poisson distribution with mean λy, so
    F(y) = P(Y ≤ y) = 1 − P(Y > y)
         = 1 − P(no occurrence in [0, y])
         = 1 − P(Poisson(λy) = 0)
         = 1 − e^(−λy),
while F(y) = 0 if y < 0, since no events occur before time 0. Therefore, Y ~ Exponential(β) with β = 1/λ.
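The chain of equalities above can be checked directly (λ and the y values are illustrative): 1 − P(Poisson(λy) = 0) should match the Exponential(1/λ) cdf.

```python
# 1 - poisson.pmf(0, lam*y) versus the Exponential(beta = 1/lam) cdf.
from scipy.stats import poisson, expon

lam = 2.0
for y in (0.5, 1.0, 3.0):
    via_poisson = 1.0 - poisson.pmf(0, lam * y)
    via_expon = expon.cdf(y, scale=1.0 / lam)   # scale = beta = 1/lambda
    print(y, round(via_poisson, 6), round(via_expon, 6))
```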
17
Thus the relation between the exponential distribution and Poisson processes is analogous to the relation between the geometric distribution and binomial trials. In fact, more is true. If k is a positive integer and if we let W be the waiting time until there have been exactly k occurrences in a Poisson process with rate λ, then W has a gamma distribution with parameters α = k and β = 1/λ. This is similar to the relation between the negative binomial distribution and binomial trials. Thus there are similarities between the geometric and exponential distributions (both are memoryless and model waiting times), and between the negative binomial and gamma distributions. There are more details about the Gamma-Poisson Relation on the course web site.
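A simulation sketch of the waiting-time relation (k, λ, and the evaluation points are illustrative): the time W until the k-th occurrence is a sum of k independent exponential interarrival times, so its empirical cdf should match the Gamma(k, 1/λ) cdf.

```python
# Empirical cdf of W = sum of k Exponential(1/lam) interarrival times,
# compared with gamma.cdf(w, a=k, scale=1/lam).
import random
from scipy.stats import gamma

rng = random.Random(1)
k, lam, trials = 3, 2.0, 20000
samples = [sum(rng.expovariate(lam) for _ in range(k)) for _ in range(trials)]

for w in (0.5, 1.5, 3.0):
    empirical = sum(s <= w for s in samples) / trials
    theoretical = gamma.cdf(w, a=k, scale=1.0 / lam)
    print(w, round(empirical, 3), round(theoretical, 3))
```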
18
Chi-Square(ν) Distribution
If Y ~ χ²(ν) = Gamma(ν/2, 2), then E(Y) = ν and V(Y) = 2ν, since E(Y) = αβ = (ν/2)(2) = ν and V(Y) = αβ² = (ν/2)(2²) = 2ν. One place where the χ² distributions arise is in Goodness-of-Fit Tests (next semester), and one reason is the following:
Theorem. If Z ~ N(0, 1), then Z² ~ χ²(1).
Proof. (A foretaste of Chapter 6.) Suppose that Z ~ N(0, 1), so that its pdf is f(z) = (1/√(2π)) e^(−z²/2).
Let X = Z², and let G(x) and g(x) denote the cdf and pdf of X, respectively. Then if x ≤ 0, G(x) = P(X ≤ x) = 0, since X = Z² can only assume nonnegative values.
19
Theorem. If Z ~ N(0, 1), then Z² ~ χ²(1). Let F(z) and f(z) denote the cdf and pdf of Z, respectively, and let G(x) and g(x) denote the cdf and pdf of X = Z², respectively.
If x ≤ 0, G(x) = P(X ≤ x) = 0, so g(x) = G′(x) = 0. If x > 0,
    G(x) = P(X ≤ x) = P(Z² ≤ x) = P(−x^½ ≤ Z ≤ x^½)
         = F(x^½) − F(−x^½)
         = F(x^½) − [1 − F(x^½)]        by the symmetry of the normal pdf
         = 2 F(x^½) − 1,
so  g(x) = G′(x) = 2 F′(x^½) · (1/2) x^(−½) = f(x^½) x^(−½).
20
Theorem. If Z ~ N(0, 1), then Z² ~ χ²(1). G(x) and g(x) denote the cdf and pdf of X = Z², respectively.
Then, if x ≤ 0, g(x) = 0, while if x > 0,
    g(x) = f(x^½) x^(−½) = (1/√(2π)) x^(−½) e^(−x/2) = x^(1/2 − 1) e^(−x/2) / (2^(1/2) Γ(1/2)),
since Γ(1/2) = √π. This is the Gamma(1/2, 2) = χ²(1) pdf, which proves the theorem.
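The theorem can be verified numerically: for x > 0 the derivation gives G(x) = 2Φ(√x) − 1, which should agree with the χ²(1) cdf (the evaluation points are illustrative).

```python
# 2 * Phi(sqrt(x)) - 1 versus the chi-square(1) cdf.
from math import sqrt
from scipy.stats import norm, chi2

for x in (0.5, 1.0, 2.0, 5.0):
    via_normal = 2.0 * norm.cdf(sqrt(x)) - 1.0
    via_chi2 = chi2.cdf(x, df=1)
    print(x, round(via_normal, 6), round(via_chi2, 6))
```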

21
Example. (p. 190, # 4.91) The demand, Y, for water has an approximately exponential distribution with mean β = 100 cfs. Thus, the pdf of Y is f(y) = (1/100) e^(−y/100) for y > 0.
a. Find the probability that the demand will exceed 200 cfs.
    P(Y > 200) = ∫_200^∞ (1/100) e^(−y/100) dy. Substitute u = y/100, du = dy/100:
    P(Y > 200) = ∫_2^∞ e^(−u) du = e^(−2) ≈ 0.1353.
Alternatively, P(Y > 200) = S(200) = 1 − F(200) = e^(−200/100) = e^(−2).
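Part (a) computed three ways, which should all agree: the closed form e^(−2), scipy's survival function, and numerical integration of the pdf.

```python
from math import exp
from scipy.stats import expon
from scipy.integrate import quad

beta = 100.0
closed = exp(-200.0 / beta)               # e^(-2)
survival = expon.sf(200.0, scale=beta)    # S(200) = 1 - F(200)
numeric, _ = quad(lambda y: (1 / beta) * exp(-y / beta), 200.0, float("inf"))

print(round(closed, 4), round(survival, 4), round(numeric, 4))   # 0.1353 0.1353 0.1353
```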
22
Example. (p. 190, # 4.91) The demand, Y, for water has an approximately exponential distribution with mean β = 100 cfs. Thus, the pdf of Y is f(y) = (1/100) e^(−y/100) for y > 0.
b. Find the capacity, c, such that the probability that demand exceeds capacity is only 0.01. We need to solve P(Y > c) = 0.01 for c, so we require
    e^(−c/100) = 0.01, i.e., −c/100 = ln(0.01), so c = −100 ln(0.01) = 100 ln(100) ≈ 460.5 cfs.
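Part (b) in code: the closed form c = −β ln(0.01), checked against scipy's inverse survival function.

```python
from math import log
from scipy.stats import expon

beta, p = 100.0, 0.01
closed = -beta * log(p)              # = 100 * ln(100)
via_isf = expon.isf(p, scale=beta)   # inverse survival function: P(Y > c) = p

print(round(closed, 1), round(via_isf, 1))   # both about 460.5
```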
23
Example. (p. 193, # 4.110) Given the pdf of Y (shown on the slide), find E(Y) and V(Y) by inspection. From the form of the pdf, we know that Y has a gamma distribution: the variable part of the Gamma(α, β) pdf is y^(α−1) e^(−y/β), and matching the variable part of the given pdf to it, we read off α = 3 and β (the value of β was shown on the slide but is not captured in this transcript). Thus E(Y) = αβ and V(Y) = αβ².
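A sketch of the "by inspection" method with α = 3 (from the slide) and a hypothetical β = 2 (assumed here, since the slide's β was not captured): once α and β are read off the pdf, no integration is needed.

```python
# E(Y) = alpha*beta and V(Y) = alpha*beta^2, confirmed by scipy.
from scipy.stats import gamma

alpha, beta = 3, 2.0                 # beta = 2.0 is an assumed value
Y = gamma(a=alpha, scale=beta)

print(Y.mean(), alpha * beta)        # E(Y) = alpha*beta
print(Y.var(), alpha * beta**2)      # V(Y) = alpha*beta^2
```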