Generating functions - PowerPoint PPT Presentation

Transcript and Presenter's Notes

1
Generating functions
2
Generating functions
  • Definition: Let a0, a1, …, an, … be a sequence of
    real numbers and let A(x) = Σ_{j≥0} a_j x^j.
  • If the series converges in some real interval
    (−x0, x0), |x| < x0, the function A(x) is called a
    generating function of the sequence {a_j}.

3
  • The generating function may be regarded as a
    transformation which carries the sequence {a_j}
    into A(x). In general x will be a real number;
    however, it is possible to work with complex
    numbers as well.
  • If the sequence {a_j} is bounded, then a
    comparison with the geometric series shows that
    A(x) converges at least for |x| < 1.
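The comparison with the geometric series can be sketched numerically. The bounded sequence below is a hypothetical example, not from the slides:

```python
def a(j):
    """A bounded sequence with |a_j| <= 3 (hypothetical example)."""
    return [1.0, -2.0, 3.0][j % 3]

def partial_sum(x, n):
    """Partial sum of the generating function A(x) = sum_j a_j x^j."""
    return sum(a(j) * x ** j for j in range(n))

x = 0.5                                                        # any |x| < 1
assert abs(partial_sum(x, 200) - partial_sum(x, 100)) < 1e-12  # has converged
assert abs(partial_sum(x, 200)) <= 3 / (1 - abs(x))            # geometric-series bound
```

Each term is dominated by 3|x|^j, so the partial sums are trapped under the convergent geometric series 3/(1 − |x|).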

4
Probability Generating function
  • If we have the additional properties that
  • a_j ≥ 0 and Σ_j a_j = 1,
  • then A(x) is called a probability generating
    function.

5
Proposition
  • A generating function uniquely determines its
    sequence.
  • This single function A(x) can be used to
    represent the whole collection of individual
    items aj.
  • Uses of probability generating functions
  • To find the density/mass function
  • To find the moments in stochastic models
  • To calculate limit distributions
  • In difference equations or recursions

6
Example
  • Let us consider a random variable X, where the
    probability Pr[X = j] = p_j.
  • Suppose X is an integer-valued random variable
    with values 0, 1, 2, ….
  • Then we can define the tail probabilities as
  • Pr[X > j] = q_j = p_{j+1} + p_{j+2} + ….
  • The distribution function is thus
  • Pr[X ≤ j] = 1 − q_j.

7
  • The probability generating function is
    P(x) = Σ_{j≥0} p_j x^j.
  • The generating function of the tail
    probabilities, Q(x) = Σ_{j≥0} q_j x^j, is not a
    p.g.f., since Σ_j q_j = Q(1) equals E[X] rather
    than 1 in general.

8
Some useful results
  • 1) 1 − P(x) = (1 − x) Q(x)
  • 2) P′(1) = Q(1)
  • 3) P″(1) = 2 Q′(1)
  • 4) V(X) = P″(1) + P′(1) − [P′(1)]²
  •    = 2 Q′(1) + Q(1) − [Q(1)]²
  • 5) The rth factorial moment about the origin is
  • µ_(r) = E[X(X − 1)(X − 2) … (X − r + 1)]
  •    = P^(r)(1) = r Q^(r−1)(1)
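Results (1) and (2) can be checked numerically for a concrete distribution; the pmf below is a hypothetical example:

```python
p = [0.1, 0.2, 0.3, 0.4]                     # p_j = Pr[X = j], j = 0..3 (example)
q = [sum(p[j + 1:]) for j in range(len(p))]  # q_j = Pr[X > j], tail probabilities

def poly(coeffs, x):
    """Evaluate a generating function from its coefficient sequence."""
    return sum(c * x ** j for j, c in enumerate(coeffs))

x = 0.7
# result 1: 1 - P(x) = (1 - x) Q(x)
assert abs((1 - poly(p, x)) - (1 - x) * poly(q, x)) < 1e-12
# result 2: Q(1) = P'(1), i.e. the tails sum to the mean
mean = sum(j * pj for j, pj in enumerate(p))
assert abs(poly(q, 1.0) - mean) < 1e-12
```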

9
Convolutions
  • Consider two non-negative independent integer-
    valued random variables X and Y, having the
    probability distributions
  • Pr[X = j] = a_j and Pr[Y = k] = b_k.
  • By independence, the probability of the event
    (X = j, Y = k) is therefore
  • Pr[X = j, Y = k] = a_j b_k.

10
  • Suppose we form a new random variable S = X + Y.
  • Then the event {S = r} comprises the mutually
    exclusive events
  • (X = 0, Y = r), (X = 1, Y = r − 1), …, (X = m, Y = r − m),
    …, (X = r − 1, Y = 1), (X = r, Y = 0).
  • If the distribution of S is given by Pr[S = r] = c_r,
  • then it follows that
  • c_r = a_0 b_r + a_1 b_{r−1} + … + a_r b_0.
  • This method of compounding two sequences
  • of numbers is called a convolution: {c_j} = {a_j} ∗ {b_j}.
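The convolution sum above can be sketched directly; the coin pmfs here are hypothetical examples:

```python
def convolve(a, b):
    """c_r = sum_j a_j b_{r-j}: the convolution of two pmfs."""
    c = [0.0] * (len(a) + len(b) - 1)
    for j, aj in enumerate(a):
        for k, bk in enumerate(b):
            c[j + k] += aj * bk   # Pr[X = j] Pr[Y = k] contributes to c_{j+k}
    return c

a = [0.5, 0.5]          # X: one fair coin (hypothetical example)
b = [0.25, 0.5, 0.25]   # Y: sum of two fair coins
c = convolve(a, b)      # S = X + Y: three fair coins
assert abs(sum(c) - 1.0) < 1e-12   # the result is still a pmf
assert abs(c[1] - 0.375) < 1e-12   # c_1 = a_0 b_1 + a_1 b_0
```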

11
Generating functions and Convolutions
  • Proposition: The generating function of a
    convolution {c_j} is the product of the
    generating functions of {a_j} and {b_j}.

12
  • Define the associated generating functions of
    the sequences defined earlier: A(x) = Σ_j a_j x^j,
    B(x) = Σ_k b_k x^k and C(x) = Σ_r c_r x^r.

13
  • Proof: Suppose we form a new random variable
    S = X + Y, and let Pr[S = n] = c_n.
  • Then the corresponding generating function is
  • C(x) = Σ_n c_n x^n = Σ_n (Σ_j a_j b_{n−j}) x^n
    = (Σ_j a_j x^j)(Σ_k b_k x^k) = A(x) B(x).

14
(No Transcript)
15
  • Extensions to convolution
  • More generally, the generating function of
    {a_j} ∗ {b_j} ∗ {c_j} ∗ … is the product of the
    generating functions of {a_j}, {b_j}, {c_j}, …:
  • F(x) = A(x) · B(x) · C(x) …
  • Also let X1, X2, …, Xn be i.i.d. r.v.s with
    common p.g.f. P(x), and Sn = X1 + X2 + … + Xn.
  • Then the g.f. of Sn is [P(x)]^n.
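A numerical check that the g.f. of Sn is [P(x)]^n; the Bernoulli pmf is a hypothetical example:

```python
def gf(coeffs, x):
    """Evaluate a generating function from its coefficient sequence."""
    return sum(c * x ** j for j, c in enumerate(coeffs))

def convolve(a, b):
    """Convolution of two pmfs on the non-negative integers."""
    c = [0.0] * (len(a) + len(b) - 1)
    for j, aj in enumerate(a):
        for k, bk in enumerate(b):
            c[j + k] += aj * bk
    return c

p = [0.3, 0.7]                     # Bernoulli(0.7) pmf (hypothetical example)
s3 = convolve(convolve(p, p), p)   # pmf of S_3 = X1 + X2 + X3
x = 0.4
assert abs(gf(s3, x) - gf(p, x) ** 3) < 1e-12   # g.f. of S_3 equals [P(x)]^3
```

Here s3 is in fact the Binomial(3, 0.7) pmf, which is exactly the binomial-as-sum-of-Bernoullis statement on the slides.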

16
Some properties of convolution
  • 1. The convolution of two probability mass
    functions on the non-negative integers is a
    probability mass function.
  • 2. Convolution is a commutative operation:
  • X + Y and Y + X have the same distribution.
  • 3. It is an associative operation (order is
    immaterial): X + (Y + Z) and (X + Y) + Z have the
    same distribution.

17
Examples
  • Find the p.g.f., the mean and the variance of:
  • 1. Bernoulli distribution, where
    p = Pr[success] = Pr[X = 1] and q = Pr[X = 0];
    X = number of successes in a single trial.
  • 2. Poisson distribution: Pr[X = r] = e^{−µ} µ^r / r!
  • e.g. X = number of phone calls in a unit time
    interval, where µ is the average number of phone
    calls per unit.
  • 3. Geometric distribution: Pr[X = j] = p q^j,
    j = 0, 1, 2, …, where X = number of failures
    before the first success.
  • 4. Binomial distribution (X = number of successes
    in n trials): Pr[X = r] = nCr p^r q^{n−r},
  • where p = Pr[success] and q = Pr[failure].

18
  • 1. For the Bernoulli r.v.,
  • P(s) = q + ps
  • the mean = P′(1) = p
  • the variance = P″(1) + P′(1) − [P′(1)]²
  • = 0 + p − p² = p(1 − p)
  • = qp
  • 4. For the Binomial r.v., which is the sum of n
    independent Bernoulli r.v.s, P(s) = (q + ps)^n.
  • Mean = np
  • Variance = npq
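The mean and variance can be recovered from the p.g.f. by numerical differentiation at s = 1 (a sketch; the parameters and the central-difference step h are arbitrary choices):

```python
n, p = 10, 0.3                          # hypothetical parameters
P = lambda s: (1 - p + p * s) ** n      # binomial p.g.f. (q + ps)^n

h = 1e-5                                # step for central differences
P1 = (P(1 + h) - P(1 - h)) / (2 * h)             # ~ P'(1)
P2 = (P(1 + h) - 2 * P(1) + P(1 - h)) / h ** 2   # ~ P''(1)
mean = P1
var = P2 + P1 - P1 ** 2                 # result 4 from the earlier slide
assert abs(mean - n * p) < 1e-4         # mean = np
assert abs(var - n * p * (1 - p)) < 1e-3  # variance = npq
```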

19
  • 2. For the Poisson r.v.,
  • P(x) = e^{µ(x−1)}
  • The mean = P′(1) = µ
  • And the variance = P″(1) + P′(1) − [P′(1)]²
  • = µ² + µ − µ² = µ

20
  • 3. For the Geometric r.v.,
  • P(x) = p/(1 − qx)
  • The mean = P′(x)|x=1 = pq/(1 − q)² = q/p

21
  • To find the variance, note P″(x) = 2pq²/(1 − qx)³,
    so P″(1) = 2q²/p².
  • The variance = P″(1) + P′(1) − [P′(1)]²
    = 2q²/p² + q/p − q²/p² = q/p²
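The same numerical-differentiation sketch applied to the geometric p.g.f. (hypothetical parameter p):

```python
p = 0.4
q = 1 - p
P = lambda x: p / (1 - q * x)           # geometric p.g.f.

h = 1e-5                                # central-difference step
P1 = (P(1 + h) - P(1 - h)) / (2 * h)             # ~ P'(1)
P2 = (P(1 + h) - 2 * P(1) + P(1 - h)) / h ** 2   # ~ P''(1)
assert abs(P1 - q / p) < 1e-4                        # mean = q/p
assert abs((P2 + P1 - P1 ** 2) - q / p ** 2) < 1e-3  # variance = q/p^2
```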

22
Some more examples
  • The Negative Binomial random variable Y:
  • where Pr[success] = p and Pr[failure] = q.
  • Y_k = number of independent trials up to and
    including the kth success; equivalently Y = number
    of failures before the kth success, so Y_k = Y + k.
  • This distribution is called the negative binomial
    because the probabilities correspond to the
    successive terms of the binomial expansion of
    p^k (1 − q)^{−k}.
23
  • Let Y_k = n and y = n − k, so that
    Pr[Y = y] = C(k + y − 1, y) p^k q^y, y = 0, 1, 2, ….
  • The m.g.f. is M(θ) = [p/(1 − q e^θ)]^k
  • and the p.g.f. is P(x) = [p/(1 − q x)]^k.

24
  • Note that when k = 1, the probability mass
    function becomes q^y p, which is the Geometric
    distribution.
  • It can be shown that P(x) = [p/(1 − qx)]^k,
  • which is the kth power of the p.g.f. of the
    geometric distribution.
  • Then it is clear that the negative binomial r.v. is
    the convolution of k geometric random variables.
  • Its mean is therefore k times the geometric mean:
    E[Y] = kq/p.
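The identification of the negative binomial p.g.f. as the kth power of the geometric p.g.f. can be checked term by term (hypothetical parameters; the series is truncated where the tail is negligible):

```python
import math

p, q, k = 0.4, 0.6, 3   # hypothetical parameters
x = 0.5
# pmf of the number of failures before the kth success: C(k+y-1, y) p^k q^y
series = sum(math.comb(k + y - 1, y) * p ** k * q ** y * x ** y
             for y in range(200))
# kth power of the geometric p.g.f. p/(1 - qx)
assert abs(series - (p / (1 - q * x)) ** k) < 1e-12
```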

25
Compound Distributions
  • Consider the sum of N independent random
    variables, where the number of r.v.s contributing
    to the sum is itself a r.v.
  • Suppose S_N = X1 + X2 + … + X_N
  • Pr[X_i = j] = f_j, Pr[N = n] = g_n, Pr[S_N = k] = h_k,
  • with the corresponding p.g.f.s F(x), G(x) and H(x);
    then H(x) = G(F(x)).
26
This property is mainly used in discrete
branching processes
27
Moment generating function
  • The moment generating function (m.g.f.) of a r.v. Y
    is defined as M(θ) = E[e^{θY}].
  • If Y is a discrete integer-valued r.v. with
    probabilities Pr[Y = j] = p_j,
  • then M(θ) = Σ_j p_j e^{θj} = P(e^θ).

28
  • The Taylor expansion of M(θ) generates the
  • moments:
  • M(θ) = Σ_r µ′_r θ^r / r!,
  • where µ′_r = E[Y^r] is the rth moment about the origin,
  • M′(θ)|θ=0 = E(Y) and M″(θ)|θ=0 = E(Y²); the
    factorial moment E[Y(Y − 1)] is obtained instead
    from the p.g.f. as P″(1).

29
  • If X is a continuous r.v. with a density function
    f(u), then we have M(θ) = ∫ e^{θu} f(u) du,
  • and all the other properties hold as in the
    discrete case.

30
M.g.f of the Binomial distribution
  • M(θ) = (q + p e^θ)^n
  • That is, replace x in the p.g.f. P(x) by e^θ.
  • M′(θ) = n(q + p e^θ)^{n−1} p e^θ
  • E[X] = M′(θ)|θ=0 = n(q + p e^θ)^{n−1} p e^θ |θ=0
  • = n(q + p)^{n−1} p, since e^θ = 1 when θ = 0
  • = np
  • Similarly it can be shown that V(X) = npq.
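The same numerical-differentiation sketch works for the m.g.f., this time at θ = 0 (hypothetical parameters; the step h is arbitrary):

```python
import math

n, p = 12, 0.25   # hypothetical parameters
q = 1 - p
M = lambda t: (q + p * math.exp(t)) ** n   # binomial m.g.f.

h = 1e-5
mean = (M(h) - M(-h)) / (2 * h)              # M'(0)  = E[X]
second = (M(h) - 2 * M(0) + M(-h)) / h ** 2  # M''(0) = E[X^2]
var = second - mean ** 2
assert abs(mean - n * p) < 1e-4     # E[X]  = np
assert abs(var - n * p * q) < 1e-3  # V(X) = npq
```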