Chapter 7 Functions of Random Variables


1
Chapter 7 Functions of Random Variables
  • Wen-Hsiang Lu (???)
  • Department of Computer Science and Information
    Engineering,
  • National Cheng Kung University
  • 2005/05/2

2
Introduction
  • The moment-generating function is helpful in
    learning about distributions of linear functions
    of random variables
  • Statistical hypothesis testing, estimation, or
    even statistical graphics involve functions of
    one or more random variables
  • The use of averages of random variables
  • The distribution of sums of squares of random
    variables

3
Transformations of Variables
  • Theorem 7.1 Suppose that X is a discrete random
    variable with probability distribution f(x). Let
    Y = u(X) define a one-to-one transformation
    between the values of X and Y so that the
    equation y = u(x) can be uniquely solved for x in
    terms of y, say x = w(y). Then the probability
    distribution of Y is g(y) = f[w(y)].
  • Example 7.1 Let X be a geometric random variable
    with probability distribution f(x). Find the
    probability distribution of the random variable
    Y = X².
  • Solution (see the sketch below)
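The worked solution is an image on the original slide; the following is a minimal sketch, assuming the generic geometric form f(x) = p(1 - p)^(x-1), x = 1, 2, 3, ... (the particular value of p used on the slide is not recoverable from this transcript).

```latex
% y = x^2 is one-to-one on x = 1, 2, 3, ..., with inverse x = w(y) = \sqrt{y}.
% Theorem 7.1 then gives
\[
  g(y) = f\bigl(\sqrt{y}\bigr) = p\,(1-p)^{\sqrt{y}-1},
  \qquad y = 1, 4, 9, \ldots
\]
```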

4
Transformations of Variables
  • Theorem 7.2 Suppose that X1 and X2 are discrete
    random variables with joint probability
    distribution f(x1, x2). Let Y1 = u1(X1, X2) and
    Y2 = u2(X1, X2) define a one-to-one
    transformation between the points (x1, x2) and
    (y1, y2) so that the equations y1 = u1(x1, x2),
    y2 = u2(x1, x2) may be uniquely solved for x1
    and x2 in terms of y1 and y2, say x1 = w1(y1, y2)
    and x2 = w2(y1, y2). Then the joint probability
    distribution of Y1 and Y2 is
    g(y1, y2) = f[w1(y1, y2), w2(y1, y2)].
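As a quick illustration of Theorem 7.2 (not on the original slide), the common choice Y1 = X1 + X2, Y2 = X2 yields the distribution of a sum:

```latex
% With y_1 = x_1 + x_2 and y_2 = x_2, the inverses are x_1 = y_1 - y_2 and x_2 = y_2, so
\[
  g(y_1, y_2) = f(y_1 - y_2,\, y_2),
  \qquad
  h(y_1) = \sum_{y_2} f(y_1 - y_2,\, y_2),
\]
% where h(y_1) is the marginal distribution of the sum Y_1.
```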

5
Transformations of Variables
6
Transformations of Variables
  • Theorem 7.3 Suppose that X is a continuous
    random variable with probability distribution
    f(x). Let Y = u(X) define a one-to-one
    transformation between the values of X and Y so
    that the equation y = u(x) can be uniquely solved
    for x in terms of y, say x = w(y). Then the
    probability distribution of Y is
    g(y) = f[w(y)]|J|, where J = w'(y) and is called
    the Jacobian of the transformation.
  • Solution
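To make Theorem 7.3 concrete, here is a minimal simulation sketch (not from the slides; the exponential distribution, the transformation Y = X², and all names are assumptions for illustration):

```python
import numpy as np

# Minimal simulation sketch of Theorem 7.3 (illustration only; the choice of
# distribution and transformation below is an assumption, not from the slides).
# Take X ~ Exponential(1), so f(x) = exp(-x) for x > 0, and Y = u(X) = X**2,
# which is one-to-one on x > 0.  The inverse is x = w(y) = sqrt(y), the Jacobian
# is J = w'(y) = 1 / (2*sqrt(y)), and Theorem 7.3 gives
#     g(y) = f(sqrt(y)) * |J| = exp(-sqrt(y)) / (2*sqrt(y)),   y > 0.

rng = np.random.default_rng(0)
y = rng.exponential(scale=1.0, size=200_000) ** 2

# Empirical density of Y versus the density predicted by Theorem 7.3.
edges = np.linspace(0.5, 9.0, 50)
counts, _ = np.histogram(y, bins=edges)
empirical = counts / (y.size * np.diff(edges))
mid = 0.5 * (edges[:-1] + edges[1:])
predicted = np.exp(-np.sqrt(mid)) / (2 * np.sqrt(mid))

print("max abs deviation:", np.max(np.abs(empirical - predicted)))  # should be small
```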

7
Transformations of Variables
  • Proof
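The slide's proof is an image; a sketch for the case where u is increasing (the decreasing case is analogous and leads to the absolute value of the Jacobian):

```latex
% For increasing u, the events (Y <= y) and (X <= w(y)) are equivalent, so
\[
  G(y) = P(Y \le y) = P\bigl(X \le w(y)\bigr)
       = \int_{-\infty}^{w(y)} f(x)\,dx = F\bigl(w(y)\bigr),
\]
% and differentiating with respect to y gives
\[
  g(y) = \frac{d}{dy} F\bigl(w(y)\bigr) = f\bigl(w(y)\bigr)\, w'(y) = f\bigl(w(y)\bigr)\, J .
\]
```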

8
Transformations of Variables
  • Example 7.3 Let X be a continuous random
    variable with probability distribution f(x).
    Find the probability distribution of Y = 2X + 3.
  • Solution (see the sketch below)
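The density on the slide and the worked solution are images; a minimal sketch applying Theorem 7.3 with a generic density f(x):

```latex
% y = 2x + 3 has inverse x = w(y) = (y - 3)/2 and Jacobian J = w'(y) = 1/2, so
\[
  g(y) = f\!\left(\frac{y-3}{2}\right) \left|\frac{1}{2}\right|
       = \frac{1}{2}\, f\!\left(\frac{y-3}{2}\right),
\]
% for y in the image of the support of X (and g(y) = 0 elsewhere).
```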

9
Transformations of Variables
  • Theorem 7.4 Suppose that X1 and X2 are
    continuous random variables with joint
    probability distribution f(x1, x2). Let
    Y1 = u1(X1, X2) and Y2 = u2(X1, X2) define a
    one-to-one transformation between the points
    (x1, x2) and (y1, y2) so that the equations
    y1 = u1(x1, x2), y2 = u2(x1, x2) may be uniquely
    solved for x1 and x2 in terms of y1 and y2, say
    x1 = w1(y1, y2) and x2 = w2(y1, y2). Then the
    joint probability distribution of Y1 and Y2 is
    g(y1, y2) = f[w1(y1, y2), w2(y1, y2)]|J|, where
    the Jacobian J is the 2 × 2 determinant of
    partial derivatives written out below, and
    ∂x1/∂y1 is simply the derivative of x1 = w1(y1,
    y2) with respect to y1 with y2 held constant,
    referred to in calculus as the partial derivative
    of x1 with respect to y1. The other partial
    derivatives are defined in a similar manner.
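Written out, the Jacobian in Theorem 7.4 is the 2 × 2 determinant of partial derivatives:

```latex
\[
  J =
  \begin{vmatrix}
    \dfrac{\partial x_1}{\partial y_1} & \dfrac{\partial x_1}{\partial y_2}\\[1.5ex]
    \dfrac{\partial x_2}{\partial y_1} & \dfrac{\partial x_2}{\partial y_2}
  \end{vmatrix},
  \qquad
  g(y_1, y_2) = f\bigl(w_1(y_1, y_2),\, w_2(y_1, y_2)\bigr)\,\lvert J \rvert .
\]
```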

10
Transformations of Variables
  • Example 7.4 Let X1 and X2 be two continuous
    random variables with joint probability
    distribution f(x1, x2). Find the joint
    probability distribution of Y1 = X1² and
    Y2 = X1X2.
  • Solution (see the sketch below)
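The joint density on the slide and the worked solution are images; a minimal sketch of the transformation step for a generic f(x1, x2), assuming x1 > 0:

```latex
% Inverses: x_1 = \sqrt{y_1}  and  x_2 = y_2/\sqrt{y_1}.
\[
  J =
  \begin{vmatrix}
    \frac{1}{2\sqrt{y_1}} & 0\\[1ex]
    -\frac{y_2}{2 y_1^{3/2}} & \frac{1}{\sqrt{y_1}}
  \end{vmatrix}
  = \frac{1}{2 y_1},
  \qquad
  g(y_1, y_2) = f\!\left(\sqrt{y_1},\, \frac{y_2}{\sqrt{y_1}}\right) \frac{1}{2 y_1},
\]
% on the region of (y_1, y_2) corresponding to the support of (X_1, X_2).
```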

11
Transformations of Variables
  • Theorem 7.5 Suppose that X is a continuous
    random variable with probability distribution
    f(x). Let Y = u(X) define a transformation
    between the values of X and Y that is not
    one-to-one. If the interval over which X is
    defined can be partitioned into k mutually
    disjoint sets such that each of the inverse
    functions x1 = w1(y), x2 = w2(y),
    ..., xk = wk(y) of y = u(x) defines a
    one-to-one correspondence, then the probability
    distribution of Y is
    g(y) = f[w1(y)]|J1| + f[w2(y)]|J2| + ... + f[wk(y)]|Jk|,
    where Ji = wi'(y), i = 1, 2, ..., k.

12
Transformations of Variables
  • Example 7.5 Show that Y = (X - μ)²/σ² has a
    chi-squared distribution with 1 degree of freedom
    when X has a normal distribution with mean μ and
    variance σ².
  • Solution

13
Transformations of Variables
  • Solution
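The derivation on these two slides is an image; a sketch of the standard argument via Theorem 7.5:

```latex
% Let Z = (X - \mu)/\sigma, so Z is standard normal and Y = Z^2.  The map y = z^2 is
% not one-to-one; its two inverse branches z = \pm\sqrt{y} each have |J| = 1/(2\sqrt{y}).
\[
  g(y) = 2 \cdot \frac{1}{\sqrt{2\pi}}\, e^{-y/2} \cdot \frac{1}{2\sqrt{y}}
       = \frac{1}{\sqrt{2\pi}}\, y^{1/2 - 1} e^{-y/2},
  \qquad y > 0,
\]
% which is the chi-squared density with v = 1 degree of freedom,
% since 2^{1/2}\,\Gamma(1/2) = \sqrt{2\pi}.
```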

14
Moments and Moment-Generating Functions
  • Definition 7.1 The rth moment about the origin
    of the random variable X is given by μ'r = E(X^r).
  • The first and second moments about the origin are
    given by μ'1 = E(X) and μ'2 = E(X²), so the mean
    is μ = μ'1 and the variance is σ² = μ'2 - μ².
  • Definition 7.2 The moment-generating function of
    the random variable X is given by E(e^tX) and is
    denoted by MX(t); both definitions are written
    out below.
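Written out (the slide shows these formulas as images), Definitions 7.1 and 7.2 read:

```latex
\[
  \mu'_r = E(X^r) =
  \begin{cases}
    \sum_x x^r f(x), & X \text{ discrete},\\[0.5ex]
    \int_{-\infty}^{\infty} x^r f(x)\,dx, & X \text{ continuous},
  \end{cases}
  \qquad
  M_X(t) = E\bigl(e^{tX}\bigr) =
  \begin{cases}
    \sum_x e^{tx} f(x), & X \text{ discrete},\\[0.5ex]
    \int_{-\infty}^{\infty} e^{tx} f(x)\,dx, & X \text{ continuous}.
  \end{cases}
\]
```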

15
Moments and Moment-Generating Functions
  • Theorem 7.6 Let X be a random variable with
    moment-generating function MX(t). Then
    μ'r = d^r MX(t)/dt^r evaluated at t = 0.
  • Proof (see the sketch below)
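The slide's proof is an image; a one-line sketch via the series expansion of e^tX:

```latex
\[
  M_X(t) = E\bigl(e^{tX}\bigr)
         = E\!\left(\sum_{k=0}^{\infty} \frac{(tX)^k}{k!}\right)
         = \sum_{k=0}^{\infty} \frac{\mu'_k}{k!}\, t^k
  \;\Longrightarrow\;
  \left.\frac{d^r M_X(t)}{dt^r}\right|_{t=0} = \mu'_r .
\]
```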

16
Moments and Moment-Generating Functions
  • Example 7.6 Find the moment-generating function
    of the binomial random variable X and then use it
    to verify that μ = np and σ² = npq.
  • Proof (see the sketch below)
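The derivation on the slide is an image; a sketch:

```latex
\[
  M_X(t) = \sum_{x=0}^{n} e^{tx} \binom{n}{x} p^x q^{n-x}
         = \sum_{x=0}^{n} \binom{n}{x} (p e^t)^x q^{n-x}
         = (p e^t + q)^n .
\]
% With p + q = 1:  M_X'(0) = np, so \mu = np;  M_X''(0) = np + n(n-1)p^2,
% so \sigma^2 = M_X''(0) - \mu^2 = np(1 - p) = npq.
```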

17
Moments and Moment-Generating Functions
  • Example 7.7 Show that the moment-generating
    function of the random variable X having a normal
    probability distribution with mean μ and variance
    σ² is given by MX(t) = exp(μt + σ²t²/2).
  • Proof (see the sketch below)
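The derivation on the slide is an image; a sketch by completing the square in the exponent:

```latex
\[
  M_X(t) = \int_{-\infty}^{\infty} e^{tx}\,
           \frac{1}{\sigma\sqrt{2\pi}}\, e^{-(x-\mu)^2/(2\sigma^2)}\,dx .
\]
% Since tx - (x-\mu)^2/(2\sigma^2)
%   = \mu t + \sigma^2 t^2/2 - \bigl(x - (\mu + \sigma^2 t)\bigr)^2/(2\sigma^2),
% the remaining integral is that of a N(\mu + \sigma^2 t, \sigma^2) density, hence 1, so
\[
  M_X(t) = e^{\mu t + \sigma^2 t^2/2}.
\]
```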

18
Moments and Moment-Generating Functions
  • Example 7.8 Show that the moment-generating
    function of the random variable X having a
    chi-squared distribution with v degrees of
    freedom is MX(t) = (1 - 2t)^(-v/2).
  • Proof (see the sketch below)
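The derivation on the slide is an image; a sketch using the chi-squared (gamma) density, valid for t < 1/2:

```latex
\[
  M_X(t) = \int_{0}^{\infty} e^{tx}\,
           \frac{x^{v/2-1} e^{-x/2}}{2^{v/2}\,\Gamma(v/2)}\,dx
         = \frac{1}{2^{v/2}\,\Gamma(v/2)}
           \int_{0}^{\infty} x^{v/2-1} e^{-x(1-2t)/2}\,dx
         = (1 - 2t)^{-v/2},
\]
% using \int_0^\infty x^{a-1} e^{-bx}\,dx = \Gamma(a)/b^a with a = v/2 and b = (1-2t)/2.
```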

19
Moments and Moment-Generating Functions
  • Theorem 7.7 (Uniqueness Theorem) Let X and Y be
    two random variables with moment-generating
    functions MX(t) and MY(t), respectively. If
    MX(t) = MY(t) for all values of t, then X and Y
    have the same probability distribution.
  • Theorem 7.8
  • Proof
  • Theorem 7.9
  • Proof
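The statements and proofs of Theorems 7.8 and 7.9 appear only as images in the original slides. They are most likely the standard translation and scaling properties of moment-generating functions; the reconstruction below is an assumption based on that usual presentation:

```latex
% Theorem 7.8 (assumed statement):  M_{X+a}(t) = e^{at} M_X(t).
% Theorem 7.9 (assumed statement):  M_{aX}(t)  = M_X(at).
% Both follow directly from the definition:
\[
  M_{X+a}(t) = E\bigl(e^{t(X+a)}\bigr) = e^{at} E\bigl(e^{tX}\bigr) = e^{at} M_X(t),
  \qquad
  M_{aX}(t) = E\bigl(e^{(at)X}\bigr) = M_X(at).
\]
```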

20
Moments and Moment-Generating Functions
  • Theorem 7.10 If X1, X2, ..., Xn are independent
    random variables with moment-generating functions
    MX1(t), MX2(t), ..., MXn(t), respectively, and
    Y = X1 + X2 + ... + Xn, then
    MY(t) = MX1(t) MX2(t) ... MXn(t).
  • Proof (see the sketch below)
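The proof on the slide is an image; a sketch using independence:

```latex
\[
  M_Y(t) = E\bigl(e^{t(X_1 + X_2 + \cdots + X_n)}\bigr)
         = E\!\left(\prod_{i=1}^{n} e^{tX_i}\right)
         = \prod_{i=1}^{n} E\bigl(e^{tX_i}\bigr)
         = M_{X_1}(t)\, M_{X_2}(t) \cdots M_{X_n}(t),
\]
% where the expectation factors because X_1, ..., X_n are independent.
```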

21
Moments and Moment-Generating Functions
  • Example The sum of two independent random
    variables having Poisson distributions with
    parameters λ1 and λ2 has a Poisson distribution
    with parameter λ1 + λ2.
  • Solution The two independent Poisson random
    variables X1 and X2 have moment-generating
    functions (Exercise 19)
    MX1(t) = exp[λ1(e^t - 1)] and MX2(t) = exp[λ2(e^t - 1)],
    respectively. According to Theorem 7.10, the
    moment-generating function of Y1 = X1 + X2 is
    MY1(t) = exp[λ1(e^t - 1)] exp[λ2(e^t - 1)]
           = exp[(λ1 + λ2)(e^t - 1)].
  • So, Y1 has a Poisson distribution with
    parameter λ1 + λ2. (A simulation check follows
    below.)
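A minimal simulation sketch of this example (not from the slides; the parameter values and variable names are assumptions for illustration):

```python
import numpy as np

# Minimal simulation check of the Poisson-sum example (illustration only; the
# parameter values lam1, lam2 and all variable names are assumptions).
rng = np.random.default_rng(0)
lam1, lam2 = 2.0, 3.5
n = 500_000

y1 = rng.poisson(lam1, n) + rng.poisson(lam2, n)   # Y1 = X1 + X2
z = rng.poisson(lam1 + lam2, n)                    # Poisson(lam1 + lam2) directly

# If the MGF argument is right, the two samples come from the same distribution;
# compare the first two moments (both should be close to lam1 + lam2 = 5.5).
print(y1.mean(), z.mean())
print(y1.var(), z.var())
```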

22
Moments and Moment-Generating Functions
  • Theorem 7.11 If X1, X2, ..., Xn are independent
    random variables having normal distributions with
    means μ1, μ2, ..., μn and variances σ1², σ2², ...,
    σn², respectively, then the random variable
    Y = a1X1 + a2X2 + ... + anXn
    has a normal distribution with mean
    μY = a1μ1 + a2μ2 + ... + anμn
    and variance
    σY² = a1²σ1² + a2²σ2² + ... + an²σn².
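No proof is shown on the slide; a sketch of the standard MGF argument, combining Theorems 7.9 and 7.10 with Example 7.7:

```latex
\[
  M_Y(t) = \prod_{i=1}^{n} M_{X_i}(a_i t)
         = \prod_{i=1}^{n} \exp\!\left(\mu_i a_i t + \tfrac{1}{2}\sigma_i^2 a_i^2 t^2\right)
         = \exp\!\left(\mu_Y t + \tfrac{1}{2}\sigma_Y^2 t^2\right),
\]
% which, by the uniqueness theorem (Theorem 7.7), is the MGF of a normal random
% variable with mean \mu_Y = \sum_i a_i\mu_i and variance \sigma_Y^2 = \sum_i a_i^2\sigma_i^2.
```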

23
Moments and Moment-Generating Functions
  • Theorem 7.12 If X1, X2, ..., Xn are mutually
    independent random variables that have,
    respectively, chi-squared distributions with v1,
    v2, ..., vn degrees of freedom, then the random
    variable Y = X1 + X2 + ... + Xn
    has a chi-squared distribution with
    v = v1 + v2 + ... + vn degrees of freedom.
  • Proof (see the sketch below)
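The proof on the slide is an image; a sketch combining Example 7.8 with Theorem 7.10:

```latex
\[
  M_Y(t) = \prod_{i=1}^{n} M_{X_i}(t)
         = \prod_{i=1}^{n} (1 - 2t)^{-v_i/2}
         = (1 - 2t)^{-(v_1 + v_2 + \cdots + v_n)/2},
\]
% which, by the uniqueness theorem (Theorem 7.7), is the MGF of a chi-squared
% random variable with v = v_1 + v_2 + \cdots + v_n degrees of freedom.
```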

24
Moments and Moment-Generating Functions
  • Corollary If X1, X2, ..., Xn are independent
    random variables having identical normal
    distributions with mean μ and variance σ², then
    the random variable
    Y = Σ (Xi - μ)²/σ², i = 1, ..., n,
    has a chi-squared distribution with v = n degrees
    of freedom.
  • Example 7.5 states that each of the n independent
    random variables (Xi - μ)²/σ² has a chi-squared
    distribution with 1 degree of freedom.
  • This establishes a relationship between the
    chi-squared distribution and the normal
    distribution.
  • If Z1, Z2, ..., Zn are independent standard normal
    random variables, then Y = Z1² + Z2² + ... + Zn²
    has a chi-squared distribution, and its single
    parameter v, the degrees of freedom, is n, the
    number of standard normal variates. (A simulation
    check follows below.)
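A minimal simulation sketch of the corollary (not from the slides; n, reps, and the variable names are assumptions for illustration):

```python
import numpy as np

# Minimal simulation sketch of the corollary (illustration only).
rng = np.random.default_rng(0)
n, reps = 5, 200_000

z = rng.standard_normal((reps, n))   # reps draws of (Z1, ..., Zn)
y = (z ** 2).sum(axis=1)             # Y = Z1^2 + ... + Zn^2

# A chi-squared random variable with v = n degrees of freedom has mean n and
# variance 2n, so these should be close to 5 and 10, respectively.
print(y.mean(), y.var())
```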

25
Exercise
  • 17, 19, 21