An Overview of ASKAP Computing - PowerPoint PPT Presentation

1
An Overview of ASKAP Computing Pipelines
  • Matthew Whiting
  • Australia Telescope National Facility

2
ASKAP Data Flow
3
Processing model for ASKAP Computing
  • Complex system with large data rates
  • Limit number of observing modes
  • Wrap up as Software / Science Instruments,
    matched to reduction capabilities
  • Data flow is large. Images created are large
  • Need to process the data straight away
  • Need a powerful computer to do so
  • Imaging algorithms, gridding in particular, have
    largest call on resources
  • Do all imaging and analysis within the Central
    Processor
  • This includes cataloguing!
  • We do not assume that this can be done later
  • Have staged deployment of capabilities to manage
    risk

4
Pipelines overview
  • Continuum mode
  • Average visibility stream to 256 channels. Keep
    full Stokes
  • Able to do full imaging with deconvolution over
    full 30 sq.deg.
  • Always have a good continuum model of the sky
  • Automatic cataloguing
  • Keep continuum model up to date for calibration
  • Create science catalogues
  • Have a well developed prototype that is
    undergoing testing
  • Transient mode
  • Correlator outputs every 5 seconds
  • Average visibilities to 16 channels
  • Remove sky model and look for new / varied
    sources
  • Raise alerts and maintain light-curves
  • Search on longer timescales as well
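The averaging step in both modes is essentially a block mean along the frequency axis (16384 raw channels down to 256 for continuum, down to 16 for transients). A minimal numpy sketch, assuming an illustrative (baseline, channel, polarisation) array layout rather than the actual ASKAP data format:

```python
import numpy as np

def average_channels(vis, n_out):
    """Average a visibility spectrum down to n_out channels.

    vis: complex array of shape (n_baselines, n_chan, n_pol);
    n_chan must divide evenly into n_out blocks. The polarisation
    axis is untouched, i.e. full Stokes is kept.
    """
    n_bl, n_chan, n_pol = vis.shape
    assert n_chan % n_out == 0
    width = n_chan // n_out
    return vis.reshape(n_bl, n_out, width, n_pol).mean(axis=2)

# 16384 raw channels averaged to 256 for the continuum pipeline
vis = np.ones((3, 16384, 4), dtype=complex)
print(average_channels(vis, 256).shape)  # (3, 256, 4)
```

The same call with `n_out=16` gives the transient-mode averaging.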

5
Spectral-line Pipeline overview
  • Keep full 16384 channels
  • Two polarisations only
  • Continuum model subtracted prior to imaging
  • This is a previously-made continuum model
  • Always have a good sky model available
  • No deconvolution in imaging
  • Maybe for a few nearby galaxies, or Galactic
    objects, but single major cycle only
  • Array configuration means PSF is good enough
  • Cataloguing done immediately
  • Measure as much as we can while we can
  • Create image cutouts, moment maps and spectral
    plots
  • Will have data quality evaluation built in to
    pipeline
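The 0th-moment maps listed above are just the cube summed over its spectral axis and scaled by the channel width. A minimal sketch; units and axis ordering are illustrative assumptions, not the pipeline's actual conventions:

```python
import numpy as np

def moment0(cube, channel_width):
    """Integrated-intensity (0th moment) map of a spectral cube.

    cube: (n_chan, ny, nx) array of flux densities (e.g. Jy/beam);
    channel_width: channel width in velocity units (e.g. km/s).
    Returns an (ny, nx) map summed over the spectral axis.
    """
    return cube.sum(axis=0) * channel_width

cube = np.full((100, 8, 8), 0.01)      # flat 0.01 Jy/beam spectrum
m0 = moment0(cube, channel_width=4.0)  # map in Jy/beam km/s
print(m0.shape)  # (8, 8)
```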

6
Data Products and Volumes (Spectral-line)
  • Spectral-line cube 4096 x 4096 x 16384 x 2 pixels
    (~2 TB)
  • Catalogues of all sources
  • Integrated spectra for all sources
  • Cutout 0th moment maps for all sources
  • Higher-moment maps for resolved sources
  • Visibilities too large to keep long term
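The ~2 TB figure is consistent with 4-byte (single-precision) pixels; the pixel depth is an assumption here, as the slide does not state it:

```python
# Cube size implied by 4096 x 4096 x 16384 channels x 2 polarisations
nx, ny, nchan, npol = 4096, 4096, 16384, 2
bytes_per_pixel = 4  # assumed float32
size_tib = nx * ny * nchan * npol * bytes_per_pixel / 2**40
print(size_tib)  # 2.0
```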

7
Accessing the Data
  • Astronomers access ASKAP data via the Science
    Data Archive
  • All data necessary for ASKAP operation will be
    kept at the Processing Centre
  • Science data staged for periodic transfer to
    Science Archive
  • Located separately from the APC, with management
    delegated to an external party
  • All survey data will be public (after
    verification period)
  • Data sent to Archive
  • Images, cubes and visibilities (continuum, not
    spectral-line)
  • Associated metadata
  • Transient time series (and images?) plus all-sky
    catalogues a few TB
  • All catalogues and cutout images from analysis
    software
  • All-sky continuum survey has 70M entries, 17 GB
    catalogues, 17 TB with cutouts
  • All-sky HI survey has 500K entries, 140 MB
    catalogues, 32 TB with cutouts
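As a rough consistency check on the quoted catalogue sizes (treating GB and MB as decimal units, an assumption), the implied per-entry footprints of the two surveys are similar:

```python
# Bytes per catalogue entry implied by the quoted totals
continuum_bytes_per_entry = 17e9 / 70e6  # 17 GB over 70M entries
hi_bytes_per_entry = 140e6 / 500e3       # 140 MB over 500K entries
print(round(continuum_bytes_per_entry), round(hi_bytes_per_entry))  # 243 280
```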

8
Spectral-line Source Finding
  • Key aspect for HI surveys
  • Intention is to do source finding in Central
    Processor
  • Do not assume that it is possible to do
    cataloguing later
  • Want to process as much as possible while data is
    in memory
  • Processing requirements not great, but IO
    overheads are
  • Do source finding as soon as image made
  • Have spare processing capacity, so there is scope
    for several approaches
  • Places strong requirements on quality of source
    extraction
  • Need thorough tests of completeness and accuracy
    of extraction
  • Lots of testing through Design Phase and BETA
    commissioning
  • Good simulations essential
  • Alternative scenarios?
  • Leave source-finding until afterwards, controlled
    by the SST?
  • Produce a baseline list that is refined by the
    relevant SST?
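ASKAP's analysis software builds on existing source-finding code; purely to illustrate the kind of operation whose completeness and accuracy must be tested, here is a toy sigma-threshold island finder (not the pipeline's actual algorithm):

```python
import numpy as np

def find_sources(image, nsigma=5.0):
    """Toy source finder: flag pixels above nsigma times the image
    rms, then merge 4-connected bright pixels into islands.
    Returns a list of (peak_flux, (y, x)) per island.
    """
    mask = image > nsigma * image.std()
    seen = np.zeros_like(mask)
    sources = []
    for y, x in zip(*np.nonzero(mask)):
        if seen[y, x]:
            continue
        # Depth-first search over this island of bright pixels
        stack, island = [(y, x)], []
        seen[y, x] = True
        while stack:
            cy, cx = stack.pop()
            island.append((cy, cx))
            for ny_, nx_ in ((cy+1, cx), (cy-1, cx), (cy, cx+1), (cy, cx-1)):
                if (0 <= ny_ < mask.shape[0] and 0 <= nx_ < mask.shape[1]
                        and mask[ny_, nx_] and not seen[ny_, nx_]):
                    seen[ny_, nx_] = True
                    stack.append((ny_, nx_))
        peak = max(island, key=lambda p: image[p])
        sources.append((float(image[peak]), peak))
    return sources

# Two bright pixels injected into Gaussian noise
rng = np.random.default_rng(1)
img = rng.normal(0.0, 1.0, (64, 64))
img[10, 10] += 50
img[40, 40] += 50
print(len(find_sources(img)))  # 2
```

Real extraction (e.g. Duchamp-style searches) adds growing of islands, spectral smoothing and reliability statistics on top of this basic idea.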

9
Spectral-line Visibility Data
  • Full spectral-line visibility data too large to
    keep long-term
  • Will be able to store for days-weeks, but will
    not permanently archive.
  • Averaging?
  • Possible to average visibilities to reduce
    volume.
  • Would do after continuum-subtraction, to reduce
    smearing
  • Would this be useful?
  • Post-processing considerations
  • What would one do with stored spectral-line
    visibilities?
  • How to process?
  • If processed using non-ASKAP resources, won't
    have ASKAP software (to do parallel processing)
  • Central Processor's specs are for it to be used
    100% of the time

10
High-angular resolution
  • Full spectral-line cubes will be made at
    resolution from 2km array
  • Do not have resources (processors) to do full
    spectral-line data for full 16K channels at high
    resolution
  • Postage stamps?
  • High-resolution cutout images of sources of
    interest
  • May be possible to extract postage stamps if ~10
    sources/field
  • Easier if source list is known beforehand
  • Currently not part of the prototype pipeline
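Extracting a postage stamp is cheap once positions are known, which is why a prior source list helps. A minimal sketch of a cutout around a known position, with names and shapes purely illustrative:

```python
import numpy as np

def cutout(cube, y, x, half=2):
    """Extract a (2*half+1)-pixel square postage stamp around
    (y, x) from every channel of a (n_chan, ny, nx) cube,
    clipping at the image edges."""
    y0, y1 = max(0, y - half), min(cube.shape[1], y + half + 1)
    x0, x1 = max(0, x - half), min(cube.shape[2], x + half + 1)
    return cube[:, y0:y1, x0:x1]

cube = np.arange(2 * 10 * 10).reshape(2, 10, 10)
print(cutout(cube, 5, 5, half=2).shape)  # (2, 5, 5)
```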

11
Current and Future Work
  • Software development
  • Have imaging and analysis pipelines working on a
    parallel system
  • Making use of existing libraries e.g.
    source-finding code, casacore
  • Adding basic functionality e.g. profile-fitting
  • Simulation and testing
  • Have end-to-end simulation suite running to test
    software
  • Using local cluster to test parallel processing
  • Currently using continuum simulations only
  • Move to spectral-line later this year
  • Making use of different simulations, including
    SKADS Simulated Skies

12
Current and Future Work
  • Will work closely with SSTs post-August
  • Define specifics of algorithms and catalogue
    structure
  • Develop appropriate simulations to test
    reliability / completeness of analysis pipelines
  • Make use of end-to-end simulation suite
  • There will be postdocs who will work closely with
    us here
  • ASKAP postdocs for highest-ranked Survey Projects
  • Baerbel has a postdoc advertised currently to
    work on HI survey preparation
  • BETA commissioning (from late 2010) will be
    crucial
  • BETA Boolardy Engineering Test Array
  • 6 antennas with phased-array feeds
  • Early observations essential in testing pipelines
  • Make use of data to test algorithms and
    completeness of cataloguing.

13
Thank you
Australia Telescope National Facility
Matthew Whiting, ASKAP Computing Science Applications Lead
Phone: 02 9372 4683
Email: matthew.whiting@csiro.au
Web: http://www.atnf.csiro.au/projects/askap/