Time domain targeted pulsar search: Algorithm and Results

Transcript and Presenter's Notes
1
Time domain targeted pulsar search: Algorithm and Results
  • Réjean Dupuis Graham Woan
  • IGR Glasgow

2
Reasoning behind a time-domain analysis
  • Searches targeted at known radio pulsars are not
    computationally expensive, so we can trade some
    efficiency for clarity and flexibility.
  • We know the IFO sensitivity is non-stationary and
    that there are gaps and dropouts. This evolution
    is handled naturally in a time domain analysis.
  • The data handling/transport problem can be
    efficiently reduced by heterodyning (mixing) the
    raw h(t) channel at a frequency near the expected
    pulsar signal frequency, and band-limiting the
    result to just a few Hz (compressing the data by
    a factor of 1000).
  • Pulsars with complex phase evolutions (especially
    the Crab) can be processed relatively simply.

3
The signal: a reminder
  • We use the standard model for the detected strain
    signal from a non-precessing neutron star:
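The equation on this slide did not survive the transcript; as a
reconstruction (not the slide's own image), the standard model
referred to is usually written

  h(t) = \tfrac{1}{2} F_{+}(t;\psi)\, h_0 (1+\cos^2\iota)\cos\Phi(t)
       + F_{\times}(t;\psi)\, h_0 \cos\iota\, \sin\Phi(t)

where F_+ and F_x are the detector antenna-pattern functions,
\iota is the inclination of the spin axis to the line of sight,
\psi the polarisation angle, and \Phi(t) the slowly evolving
signal phase (carrying the initial phase \phi_0).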

4
Data heterodyning
  • Data are heterodyned in two stages
  • A coarse complex heterodyne at a fixed frequency,
    to reduce the effective sample rate to 4 Hz and
    allow for easy data transportation.
  • A fine complex heterodyne to take account of
    pulsar slowdown and Doppler shift, reducing its
    apparent frequency to 0 Hz, and reducing the data
    rate to (e.g.) 1 sample per 60 s.
  • Accomplished with the LAL routines
    LALCoarseHeterodyneToPulsar and
    LALFineHeterodyneToPulsar, which include robust
    low-pass filtering routines to protect from
    strong out-of-band signals (a rough sketch of the
    idea follows below).
  • Between these stages, the noise level is
    estimated from the variance of the data over each
    60 s period (assumed constant over this period).
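The LAL code itself is not shown in the slides; the following is
only a rough single-stage sketch, in Python, of the heterodyne
idea: mix the data down by a fixed frequency, low-pass filter,
then form one complex sample Bk and one noise-variance estimate
per 60 s block. The function name, filter choice and block
averaging are illustrative assumptions, not the LAL
implementation.

  import numpy as np
  from scipy.signal import butter, filtfilt

  def heterodyne_and_decimate(h, fs, f_het, f_band, block_s=60.0):
      """Mix h(t) down by f_het, band-limit to ~f_band, then form one
      complex sample B_k and one variance estimate per block."""
      t = np.arange(len(h)) / fs
      # Complex heterodyne: move the expected signal frequency to ~0 Hz.
      x = h * np.exp(-2j * np.pi * f_het * t)
      # Simple low-pass filter standing in for the more careful
      # multi-stage filtering in the real pipeline.
      b, a = butter(4, f_band / (fs / 2.0))
      x = filtfilt(b, a, x)
      # Average each block: one heterodyned sample B_k per block, plus
      # a noise variance estimate from the scatter within that block.
      n = int(block_s * fs)
      nblk = len(x) // n
      blocks = x[: nblk * n].reshape(nblk, n)
      B = blocks.mean(axis=1)
      # Crude variance of each block mean, assuming independent samples.
      var = blocks.var(axis=1) / n
      return B, var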

5
Heterodyne output
  • Once fully heterodyned, the complex time series
    simply evolves with the IFO antenna pattern as
    the pulsar moves across the sky, and we fit a
    model to this signal of the form shown below,
  • where a is the vector of the 4 unknown
    parameters.
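The model equation itself was not transcribed; in the form
usually used for this analysis (a reconstruction), the
heterodyned data are fitted with

  y(t_k; a) = \tfrac{1}{4} F_{+}(t_k;\psi)\, h_0 (1+\cos^2\iota)\, e^{i\phi_0}
            - \tfrac{i}{2} F_{\times}(t_k;\psi)\, h_0 \cos\iota\, e^{i\phi_0}

with a = (h_0, \iota, \psi, \phi_0).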

[Plot: Re y(t) over 1 day, against GPS seconds]
6
Model fitting 1
  • The data from the heterodyne code, Bk, are
    modelled as Gaussian, with variances estimated
    from each nominal 60 s stretch. This is fair
    provided the central limit theorem holds and the
    data are stationary over the period.
  • We take a Bayesian approach, and determine the
    joint posterior probability distribution of the
    unknown parameters, using uniform priors on them
    over their accessible values, i.e.

posterior ∝ prior × likelihood
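Schematically (the slide's own equation was not transcribed),
with a = (h_0, \iota, \psi, \phi_0) and data {B_k}, this reads

  p(a \mid \{B_k\}) \propto p(a)\, p(\{B_k\} \mid a).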
7
Model fitting 2
  • The likelihood (that the data are consistent with
    a given set of model parameters) is proportional
    to exp(−χ²/2), where χ² is the weighted sum of
    squared residuals given below. The sum runs only
    over valid data, so dropouts and gaps are dealt
    with simply.
  • Finally we marginalize over the uninteresting
    parameters to leave the posterior probability
    distribution for h0, also given below.
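Neither expression survived the transcript; written out in the
usual way (a reconstruction):

  \chi^2(a) = \sum_k \frac{|B_k - y(t_k;a)|^2}{\sigma_k^2},
  \qquad
  p(h_0 \mid \{B_k\}) \propto \iiint p(a \mid \{B_k\})\, d\iota\, d\psi\, d\phi_0.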

8
Upper limit definition
  • The 90% confidence upper limit is set by the
    value h90 satisfying the condition below.
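The defining condition (reconstructed, since the slide equation
is missing) is

  0.90 = \int_0^{h_{90}} p(h_0 \mid \{B_k\})\, dh_0.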

9
Detection
  • A detection would appear as a maximum
    significantly offset from zero. Note that an
    upper limit can still be defined(!)

[Plot: marginalised posterior for h0, with the most probable h0
and h90 marked]
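As an illustration of how h90 can be read off a marginalised
posterior even when it peaks away from zero, here is a small
Python sketch; the function and variable names are hypothetical
and this is not the analysis code.

  import numpy as np

  def upper_limit(h0_grid, posterior_h0, credible=0.90):
      """Return h90 such that the normalised posterior integrates to
      `credible` between 0 and h90 (trapezoid CDF, then inversion)."""
      p = np.asarray(posterior_h0, dtype=float)
      p = p / np.trapz(p, h0_grid)              # normalise to unit area
      cdf = np.concatenate(([0.0],
          np.cumsum(0.5 * (p[1:] + p[:-1]) * np.diff(h0_grid))))
      return np.interp(credible, cdf, h0_grid)  # h0 where CDF reaches 0.90

  # Toy example: a posterior peaked away from zero still gives a limit.
  h0 = np.linspace(0.0, 1e-20, 2000)
  post = np.exp(-0.5 * ((h0 - 3e-21) / 1e-21) ** 2)
  print("h90 =", upper_limit(h0, post))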
10
Validation 1
  • Marginalised posterior pdf for h0, resulting from
    an end-to-end test using 24 h of stationary,
    fake, Gaussian noise.
  • signal: h0 = 0, f0 = 1234 Hz, RA = dec = 0, the
    two orientation angles set to 0 and π/4; noise:
    Sh^1/2 = 9 × 10^-17 Hz^-1/2 (σ = 10^-18 at
    16384 samples/s)
  • Note that a naïve calculation would give a 1σ
    upper limit of … The apparent loss of sensitivity
    of 12% is due to the precise definition of h0 and
    the attenuation from the mean beam for this
    (low dec) test source.

1σ upper limit = 3.1 × 10^-22
11
Validation 2
  • Marginalized posterior pdf for h0, using 24 h of
    fake data with Gaussian noise.
  • signal: h0 = 6 × 10^-21, f0 = 1234 Hz,
    RA = dec = 0, the two orientation angles set to
    0 and π/4; noise: Sh^1/2 = 9 × 10^-17 Hz^-1/2
    (σ = 10^-18 at 16384 samples/s)
  • (The multiple peaks are a temporary artefact of
    the fake data generation method.)

12
GEO E7 results 1
  • Within a 4 Hz band around 1283 Hz (PSR
    J1939+2134), the noise is highly non-stationary,
  • but the standard-deviation-weighted data appear
    Gaussian over 60 s (above). In fact we need to
    drop to 10 s before these data can be treated as
    stationary.

13
GEO E7 results 2
  • Assuming just 10 s stationarity, the noise is
    time-resolved and a consistent upper limit for
    PSR J1939+2134 can be determined:

90% confidence upper limit: h0 < 4.5 × 10^-20
[Plot: marginalised posterior probability for h0]
14
Prospects for GEO S1
  • S1 is more sensitive, but possibly less
    stationary, than E7.

15
Morals and intentions
  • Method works and can handle some truly horrific
    conditions.
  • A good understanding of the noise is vital to the
    definition of a reliable upper limit.
  • Monte Carlo runs will give a final check.
  • Method still needs to be applied to LIGO data.