Multidimensional Integration Part I - PowerPoint PPT Presentation

Transcript and Presenter's Notes

1
Multidimensional Integration, Part I
  • Harrison B. Prosper
  • Florida State University

2
Outline
  • Do we need it?
  • Markov Chain Monte Carlo
  • Adaptive Methods
  • Summary

3
Do we need it?
  • Most analyses in high energy physics are done
    using frequentist methods.
  • The more sophisticated ones typically involve the
    minimization of negative log-likelihoods using
    programs such as the celebrated MINUIT.
  • These methods in general do not need
    multidimensional integration.

4
But we may need it if
  • We wish to do analyses using Bayesian methods.
    Here are a few examples:
  • Limit-setting
  • DØ Single Top Analysis
  • Luminosity estimation
  • SUSY/Higgs Workshop
  • Jet energy scale corrections

5
Jet Energy Scale Corrections
1. Assume we have a pure sample of
2. Assume
6
A single event
Likelihood, prior, posterior (formulas shown on the slide)
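As a guide, here is the standard Bayesian relation these labels
normally refer to (an assumption; the slide's own formula is not
reproduced in the transcript):

    \[
      \underbrace{\mathrm{Post}(a \mid D)}_{\text{posterior}}
      \;=\;
      \frac{\overbrace{p(D \mid a)}^{\text{likelihood}}\;
            \overbrace{\pi(a)}^{\text{prior}}}
           {\int p(D \mid a')\,\pi(a')\,da'} .
    \]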
7
But we need lots of events in practice!
Posterior
Number of dimensions = 2N + m, where N is the
number of events used and m is the number of a
parameters. If N = 1000 and m = 3, we have
Ndim ≈ 2000!!
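Assuming each event contributes two per-event parameters on top of
the m shared parameters a (which is what makes the count come out
near 2000), the arithmetic is

    \[
      N_{\mathrm{dim}} \;=\; 2N + m \;=\; 2(1000) + 3 \;=\; 2003 \;\approx\; 2000 .
    \]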
8
Multidimensional Integration
  • Low dimensions, that is, less than about 20:
  • Adaptive Numerical Integration
  • Recursively partition the space, at each step
    reducing the integration error on the partition
    that currently has the largest error (see the
    sketch after this list).
  • High dimensions, that is, greater than about 20:
  • Markov Chain Monte Carlo
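
A toy Python sketch of this error-driven partitioning (not code from
the talk; the box representation, the midpoint-rule estimate, and the
crude error estimate are all assumptions made for illustration):

    import heapq
    import math
    from itertools import count

    _tiebreak = count()   # avoids comparing box coordinates when two errors are equal

    def _midpoint(f, lo, hi):
        """Midpoint-rule estimate of the integral of f over the box [lo, hi]."""
        vol = math.prod(b - a for a, b in zip(lo, hi))
        mid = [0.5 * (a + b) for a, b in zip(lo, hi)]
        return f(mid) * vol

    def _make_box(f, lo, hi):
        """Box record; the crude error is |refined estimate - coarse estimate|."""
        coarse = _midpoint(f, lo, hi)
        d = max(range(len(lo)), key=lambda i: hi[i] - lo[i])      # longest edge
        m = 0.5 * (lo[d] + hi[d])
        left_hi, right_lo = list(hi), list(lo)
        left_hi[d], right_lo[d] = m, m
        fine = _midpoint(f, lo, left_hi) + _midpoint(f, right_lo, hi)
        return (-abs(fine - coarse), next(_tiebreak), fine, lo, hi)

    def adaptive_integrate(f, lo, hi, tol=1e-4, max_splits=50000):
        """Repeatedly bisect the box that currently has the largest error estimate."""
        heap = [_make_box(f, list(lo), list(hi))]
        for _ in range(max_splits):
            if sum(-rec[0] for rec in heap) < tol:                # total error estimate
                break
            _, _, _, blo, bhi = heapq.heappop(heap)               # worst box
            d = max(range(len(blo)), key=lambda i: bhi[i] - blo[i])
            m = 0.5 * (blo[d] + bhi[d])
            left_hi, right_lo = list(bhi), list(blo)
            left_hi[d], right_lo[d] = m, m
            heapq.heappush(heap, _make_box(f, blo, left_hi))
            heapq.heappush(heap, _make_box(f, right_lo, bhi))
        return sum(rec[2] for rec in heap), sum(-rec[0] for rec in heap)

    # Example: integrate exp(-(x^2 + y^2)) over the unit square in 2 dimensions
    value, error = adaptive_integrate(lambda p: math.exp(-sum(x * x for x in p)),
                                      [0.0, 0.0], [1.0, 1.0])

Real adaptive cubature codes use higher-order rules and much better
error estimates, but the control flow (split whichever region
currently looks worst) is the idea described on the slide.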

9
The Basic Idea
  • Generate a sequence of parameter values a_i from
    the posterior distribution Post(a|D) and compute
    averages (see the formula below).
  • In general, it is very difficult to sample
    directly from a complicated distribution.
    Gaussians, of course, are easy!
  • We must use an indirect method to generate the
    sequence.
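
In symbols, for any function g of the parameters (the slide's own
formula is an image and is not reproduced here):

    \[
      \langle g \rangle \;=\; \int g(a)\,\mathrm{Post}(a \mid D)\,da
      \;\approx\; \frac{1}{n}\sum_{i=1}^{n} g(a_i) ,
      \qquad a_i \sim \mathrm{Post}(a \mid D) .
    \]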

10
The Basic Idea, cont.
  • If the values a_i in the sequence are statistically
    independent, the uncertainty in the estimate of
    the integral is just the error on the mean (see
    below).
  • Important: the error decreases slowly, but it does
    so in a manner that is independent of the
    dimensionality of the space.
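
For n statistically independent samples this is the familiar error on
the mean,

    \[
      \delta\langle g \rangle \;\approx\; \frac{s}{\sqrt{n}} ,
      \qquad
      s^{2} \;=\; \frac{1}{n-1}\sum_{i=1}^{n}
                  \bigl( g(a_i) - \langle g \rangle \bigr)^{2} ,
    \]

which shrinks like 1/sqrt(n) no matter how many dimensions a has.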

11
Markov Chain Monte Carlo
  • State, x: a vector of real-valued quantities.
  • Transition probability, T: the probability to get
    state x(t+1) given state x(t).
  • Proposal probability, q: the probability to propose
    a new state y(t+1) given state x(t).
  • Acceptance probability, A: the probability to
    accept the proposed state.
  • Markov chain: a random sequence of states x(t) with
    the property that the probability to get state
    x(t+1) depends only on the previous state x(t)
    (see the sketch after this list).
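
How T, q, and A are actually chosen is deferred to Part II. Purely as
a preview, here is a minimal Python sketch of one standard choice, a
Metropolis random walk with a symmetric Gaussian proposal; the
three-parameter Gaussian stand-in for Post(a|D) and the names
log_post, propose, and mcmc are illustrative assumptions, not
anything defined in the talk:

    import math
    import random

    def log_post(a):
        """Stand-in log-posterior: an uncorrelated 3-parameter Gaussian.
        In a real analysis this would be log Post(a | D)."""
        return -0.5 * sum(x * x for x in a)

    def propose(x, scale=0.5):
        """Proposal q: a symmetric Gaussian random-walk step about the current state."""
        return [xi + random.gauss(0.0, scale) for xi in x]

    def mcmc(log_post, x0, n_steps=10000):
        """Markov chain: each new state x(t+1) depends only on the previous state x(t)."""
        chain = [list(x0)]
        lp = log_post(x0)
        for _ in range(n_steps):
            x = chain[-1]
            y = propose(x)              # propose y(t+1) given x(t)
            lp_y = log_post(y)
            # Acceptance probability A = min(1, Post(y)/Post(x)) for a symmetric proposal
            if lp_y >= lp or random.random() < math.exp(lp_y - lp):
                chain.append(y)
                lp = lp_y
            else:
                chain.append(list(x))   # rejected: the old state is repeated
        return chain

    chain = mcmc(log_post, x0=[0.0, 0.0, 0.0])
    # Posterior averages <a_k> estimated as simple means over the chain
    means = [sum(a[k] for a in chain) / len(chain) for k in range(3)]

Successive states of such a chain are correlated rather than
independent, so the simple error-on-the-mean formula of slide 10
applies only after that correlation is accounted for.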

12
MCMC 1
  • Let p_{t+1}(x) be the probability of state x at
    time step t+1 and p_t(x) be the probability of
    state x at time step t. Then the chain evolves as
    in the first relation written below.
  • The goal is to produce the condition in the second
    relation below as the time step t goes to
    infinity, that is, to arrive at a stationary (or
    invariant, or equilibrium) distribution p(x).
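
A reconstruction of the two relations referred to above, written in
the standard form for a transition probability T (an assumption,
since the slide's own equations are not reproduced here):

    \[
      p_{t+1}(x) \;=\; \int T(x \mid x')\,p_t(x')\,dx' ,
      \qquad\text{and, as } t \to \infty ,\qquad
      p(x) \;=\; \int T(x \mid x')\,p(x')\,dx' .
    \]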

13
MCMC 2
  • Next time we shall see how that condition can be
    achieved!