Sample variance and sample error - PowerPoint PPT Presentation

Slides: 13
Provided by: Phys157

Transcript and Presenter's Notes

Title: Sample variance and sample error


1
Sample variance and sample error
  • We learned recently how to determine the sample
    variance using the sample mean.
  • How do we translate this to an unbiased estimate
    of the error on a single point?
  • We can't just take the square root! That would
    introduce a bias.
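A quick Monte Carlo sketch of this bias (illustrative only; the values of sigma, N and the trial count are arbitrary choices, not from the slides):

```python
import numpy as np

# Check that S = sqrt(S^2) systematically underestimates sigma,
# even though S^2 itself is unbiased.
rng = np.random.default_rng(0)
sigma, N, trials = 1.0, 5, 100_000

x = rng.normal(0.0, sigma, size=(trials, N))
s2 = x.var(axis=1, ddof=1)   # unbiased sample variance S^2
s = np.sqrt(s2)              # naive "just take the square root" estimator

print(f"mean of S^2 = {s2.mean():.3f}  (close to sigma^2 = {sigma**2})")
print(f"mean of S   = {s.mean():.3f}  (systematically below sigma = {sigma})")
```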

2
Mean and variance of S²
  • Like any other statistic, S² has its own mean and
    variance.
  • We need to know these to compute the bias in S.
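The standard results for Gaussian samples, which these bullets refer to (the slide's own equations are images and not in the transcript), are:

```latex
E[S^2] = \sigma^2,
\qquad
\operatorname{Var}(S^2) = \frac{2\sigma^4}{N-1}.
```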

3
Bias in √(S²)
  • Define the square-root function g.
  • Write down g(X) and its derivatives.
  • Hence compute the bias.
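A sketch of the Taylor-expansion steps for Gaussian samples, using the standard result Var(S²) = 2σ⁴/(N−1):

```latex
g(u) = \sqrt{u}, \qquad
g'(u) = \frac{1}{2\sqrt{u}}, \qquad
g''(u) = -\frac{1}{4u^{3/2}},
```
so, expanding about \(u = \sigma^2\),
```latex
E[S] = E[g(S^2)]
\approx g(\sigma^2) + \tfrac{1}{2}\, g''(\sigma^2)\, \operatorname{Var}(S^2)
= \sigma - \frac{1}{8\sigma^3}\cdot\frac{2\sigma^4}{N-1}
= \sigma\left[1 - \frac{1}{4(N-1)}\right].
```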

4
Unbiased estimator for σ
  • Re-define a bias-corrected estimator for σ.
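One form consistent with the bias computed on the previous slide (a sketch; the slide's own equation is not in the transcript):

```latex
\hat{\sigma} = S\left[1 - \frac{1}{4(N-1)}\right]^{-1}
\approx S\left[1 + \frac{1}{4(N-1)}\right].
```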

5
Conditional probabilities
  • Consider 2 random variables X and Y with a joint
    p.d.f. P(X,Y) that looks like
  • To get P(X) or P(Y), project P(X,Y) on to X or Y
    axis and normalise.
  • Can also determine P(X|Y) (the probability of X
    given Y), which is a normalised slice through
    P(X,Y) at a fixed value of Y, or vice versa.
  • At any point along each slice, we can get P(X,Y)
    from P(X,Y) = P(X|Y) · P(Y).
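These projections and slices can be sketched numerically with a toy discrete joint p.d.f. (the 3x3 grid of values is arbitrary, chosen to sum to 1):

```python
import numpy as np

# Toy joint p.d.f. on a 3x3 grid; rows index X, columns index Y.
P_xy = np.array([[0.10, 0.05, 0.05],
                 [0.05, 0.20, 0.15],
                 [0.05, 0.15, 0.20]])

P_x = P_xy.sum(axis=1)   # project onto the X axis
P_y = P_xy.sum(axis=0)   # project onto the Y axis

# P(X|Y=j): a normalised slice through the joint at fixed Y.
j = 1
P_x_given_y = P_xy[:, j] / P_y[j]

# At every point of the slice, P(X,Y) = P(X|Y) P(Y).
print(P_x_given_y)
print(P_x_given_y * P_y[j], P_xy[:, j])
```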

6
Bayes' Theorem and Bayesian inference
  • Bayes' Theorem:
    P(model|data) = P(data|model) · P(model) / P(data)
  • This leads to the method of Bayesian inference.
  • We can determine the evidence P(data|model)
    using goodness-of-fit statistics.
  • We can often determine P(model) using prior
    knowledge about the models.
  • This allows us to make inferences about the
    relative probabilities of different models, given
    the data.
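A minimal numerical illustration of this inference over two models (all probabilities here are hypothetical numbers):

```python
# Bayes' theorem for two competing models.
prior = {"m1": 0.5, "m2": 0.5}       # P(model), from prior knowledge
evidence = {"m1": 0.08, "m2": 0.02}  # P(data|model), from goodness of fit

# P(data) is the normalising sum over models.
P_data = sum(evidence[m] * prior[m] for m in prior)
posterior = {m: evidence[m] * prior[m] / P_data for m in prior}

print(posterior)  # relative probabilities of the models, given the data
```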

7
Choice of prior
  • Suppose our model of a set of data X is
    controlled by a parameter θ.
  • Our knowledge about θ before X is measured is
    quantified by the prior p.d.f. P(θ).
  • The choice of P(θ) is arbitrary, subject to common
    sense!
  • After measuring X we get the posterior p.d.f.
    P(θ|X) ∝ P(X|θ) · P(θ).
  • Different priors P(θ) lead to different
    inferences P(θ|X)!

8
Examples
  • Suppose v is the Doppler shift of a star.
  • Adopting a search range −200 < v < +200
    km/s in uniform velocity increments implicitly
    assumes a uniform prior.
  • Alternatively, suppose we scale an emission-line
    profile of known shape by a factor A.
  • If you know A > 0, you can enforce this by
    constructing the pdf in uniform increments of
    log A, so that P(A) ∝ 1/A.
  • The posterior distributions are skewed differently
    according to the choice of prior.
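A sketch of how the two priors skew the posterior, for a positive scale parameter A measured once with a Gaussian error (all numbers illustrative; the grid and measurement are made up):

```python
import numpy as np

# One Gaussian measurement X = 2 of a scale parameter A > 0, unit error.
A = np.linspace(0.1, 10.0, 1000)
like = np.exp(-0.5 * (2.0 - A) ** 2)   # likelihood P(X|A)

def posterior(prior):
    p = like * prior
    return p / p.sum()                 # normalise on the grid

post_uniform = posterior(np.ones_like(A))  # uniform prior in A
post_log = posterior(1.0 / A)              # uniform in log A: P(A) ~ 1/A

# The 1/A prior pulls the inference toward smaller A.
mean_uniform = (A * post_uniform).sum()
mean_log = (A * post_log).sum()
print(mean_uniform, mean_log)
```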

9
Relative probabilities of models
  • Two models m1, m2.
  • From Bayes' Theorem,
    P(m1|data) / P(m2|data) =
    [P(m1) / P(m2)] × [P(data|m1) / P(data|m2)]
  • The relative probabilities depend on:
  • the ratio of prior probabilities;
  • the relative ability to fit the data.
  • Note that P(data) cancels.

10
Maximum likelihood fits
  • Suppose we try to fit a spectral line plus continuum
    using a set of data points Xi, i = 1...N.
  • Suppose our model is a parametrised line-plus-continuum
    profile.
  • The parameters are C, A, ν0, Δν.
  • The νi, σi are assumed known.
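A hypothetical sketch of such a model: a Gaussian emission line of amplitude A, centre ν0 and width Δν on a constant continuum C. The slide's equation is an image and not in the transcript, so this functional form is an assumption suggested by the parameter list:

```python
import numpy as np

# Assumed model: constant continuum C plus a Gaussian line of
# amplitude A, centre nu0, width dnu (not confirmed by the slides).
def model(nu, C, A, nu0, dnu):
    return C + A * np.exp(-0.5 * ((nu - nu0) / dnu) ** 2)

nu = np.linspace(0.0, 10.0, 101)
mu = model(nu, C=1.0, A=3.0, nu0=5.0, dnu=0.8)
print(mu.max())   # peak value = C + A at nu = nu0
```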

11
Likelihood of a model
  • The likelihood of a particular set θ of model
    parameters (i.e. the probability of getting this set
    of data given model θ) is
    L(θ) = ∏i P(Xi|θ).
  • If the errors are gaussian, then
    L(θ) ∝ ∏i exp[ −(Xi − μi(θ))² / (2σi²) ].
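A minimal numerical sketch of this Gaussian log-likelihood (the data, model predictions and errors below are made up):

```python
import math

# ln L = sum_i ln P(X_i|theta) for independent Gaussian errors.
X = [1.2, 0.9, 1.5]    # data points X_i
mu = [1.0, 1.0, 1.0]   # model predictions mu_i(theta)
sig = [0.3, 0.3, 0.3]  # known errors sigma_i

lnL = sum(-0.5 * ((x - m) / s) ** 2 - math.log(s * math.sqrt(2 * math.pi))
          for x, m, s in zip(X, mu, sig))
print(lnL)
```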

12
Estimating σ
  • Data points Xi with no error estimates.
  • To find A, minimise χ².
  • Can't use χ² minimisation to estimate σ, because
    χ² decreases monotonically as σ increases.
  • Instead, minimise
    −2 ln L = χ² + 2N ln σ + const.
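A sketch of why the normalisation term matters when estimating σ (made-up data; the brute-force grid search is purely illustrative):

```python
import numpy as np

# chi^2 = sum((X - mu)^2) / sigma^2 falls monotonically as sigma grows,
# so minimising it drives sigma to the edge of any search range.
# Keeping the 2N ln(sigma) term from -2 ln L gives a sensible estimate.
rng = np.random.default_rng(1)
mu, sigma_true, N = 0.0, 2.0, 1000
X = rng.normal(mu, sigma_true, N)

sig = np.linspace(0.5, 5.0, 500)          # trial values of sigma
chi2 = ((X - mu) ** 2).sum() / sig ** 2
m2lnL = chi2 + 2 * N * np.log(sig)        # -2 ln L up to a constant

print(sig[np.argmin(chi2)])    # runs to the top of the grid
print(sig[np.argmin(m2lnL)])   # close to sigma_true
```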