HDF and HDF-EOS Workshop


1
The Homogenization and Reporting of Groundbased
Atmospheric Datasets for the Validation of Earth
Observing Satellite Instruments - the Aura and
Envisat validation approach using HDF
  • B. R. Bojkov, NASA-GSFC (USA)
  • R. M. Koopman, ESA-ESRIN (I)
  • M. De Maziere, BIRA-IASB (B)
  • I. Boyd, NIWA (NZ)

2
Outline
  • Context (and some history)
  • Implementation for Envisat and Aura
  • Future activities

3
  • Why did we do this for atmospheric sciences?
  • (from the perspective of validation)

4
Near-real-time ozonesondes
  • Arising from the European Arctic campaigns of the
    1990s
  • 35 stations reporting
  • operational NRT and scientific data
  • All stations use the NASA-Ames format
  • simple ASCII
  • More than 18 file variations
  • in nomenclature and formulation
  • Clear need for homogenization

5
The problem is accentuated
  • Different measurement networks
  • LIDAR, MWR, FTIRs, UV-Vis instruments, etc.
  • Different platforms
Ground-based, aircraft, balloon, ship, in-situ,
    etc.
  • Different agencies
  • Dialogue across time zones, reporting facilities,
    etc.

6
So, in 1996/1997
  • Nearly impossible to use effectively for research
    or for tools/RDB/web implementations
  • Need to resolve this file reporting problem by
    investigating and understanding the state of
    affairs in the field
  • First objective: satellite validation of
    atmospheric chemistry instruments
  • on the upcoming ESA and NASA missions

7
COSE
  • COSE - Compilation of atmospheric Observations in
    support of Satellite measurements over Europe
  • European Commission (EC)-funded project
  • Consortium of 25 ground-based and satellite teams
  • Timeline: 1998-2000

8
COSE (2)
  • Approach
  • Investigated existing file formats, usage, and
    community needs in consultation with the
    different stakeholders
  • Recommendations
  • Defined a fixed file formulation in HDF4 (SDS)
  • Rigid metadata, including strict guidelines for
    attribute requirements, variable naming, etc.
  • Make files truly homogeneous and self-explanatory
  • Final document by Bojkov et al., 2002 (available
    through the AVDC)

9
Basic HDF4 SDS file layout
  • Global attributes: 31 attributes
  • Data source / contact information
  • Dataset contents / location
  • File generation date / version / caveats
  • Attributes for each SDS: 19 attributes
  • Variable description / notes / caveats
  • Dependencies / dimensions
  • Units / SI conversion factor / valid min./max.
    / fill values
  • Display attributes (label, axis, etc.); a minimal
    write sketch follows below
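A minimal sketch of writing such a file, assuming the pyhdf package; the
attribute names are illustrative placeholders for the 31 global and 19
per-SDS attributes defined in the COSE document, not the authoritative list:

    from pyhdf.SD import SD, SDC
    import numpy as np

    # Create a new HDF4 file and set a few global attributes
    # (placeholder names for the required global attributes).
    sd = SD("groundbased_o3_example.hdf", SDC.WRITE | SDC.CREATE)
    sd.PI_NAME = "Doe; Jane"                      # data source / contact
    sd.DATA_LOCATION = "EXAMPLE.STATION"          # dataset location
    sd.FILE_GENERATION_DATE = "20060101T000000Z"  # generation date

    # One SDS carrying a few of the per-variable attributes.
    sds = sd.create("O3_CONCENTRATION", SDC.FLOAT32, (50,))
    sds.VAR_UNITS = "ppmv"
    sds.VAR_VALID_MIN = 0.0
    sds.VAR_VALID_MAX = 20.0
    sds.setfillvalue(-99999.0)
    sds[:] = np.zeros(50, dtype=np.float32)
    sds.endaccess()
    sd.end()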

10
(Slide shows the tables of global attributes and variable attributes)
11
Variable naming convention
  • Variable names use a three-part construction:
  • VARIABLE_NAME, VARIABLE_MODE, and an optional
    VARIABLE_DESCRIPTOR, joined by underscores
  • Example
  • The variable names for O3 measured by an
    ozonesonde and by a Brewer spectrometer are
  • O3_CONCENTRATION for the ozonesonde
  • O3_VERTICAL.SOLAR for the Brewer in direct-sun
    mode
  • O3_VERTICAL.ZENITH for the Brewer in Umkehr mode
  • The RMS uncertainty of an FTIR ozone measurement
    can be expressed as
  • O3_VERTICAL.ZENITH_UNCERTAINTY.RMS (a naming
    sketch follows below)
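A small illustrative helper (hypothetical, not part of the COSE/AVDC
tooling) showing how the three parts combine, assuming underscores join
them as in the examples above:

    def build_variable_name(name, mode, descriptor=None):
        # Join VARIABLE_NAME, VARIABLE_MODE and the optional
        # VARIABLE_DESCRIPTOR with underscores.
        parts = [name, mode]
        if descriptor:
            parts.append(descriptor)
        return "_".join(parts)

    # Examples from this slide:
    assert build_variable_name("O3", "CONCENTRATION") == "O3_CONCENTRATION"
    assert build_variable_name("O3", "VERTICAL.SOLAR") == "O3_VERTICAL.SOLAR"
    assert build_variable_name("O3", "VERTICAL.ZENITH", "UNCERTAINTY.RMS") \
        == "O3_VERTICAL.ZENITH_UNCERTAINTY.RMS"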

12
Similar conventions for
  • Attribute entries
  • Instrument naming
  • File names

13
First Implementation: ESA
  • Spring 2000: the ESA Envisat mission was in its
    final validation planning
  • 10 instruments, with a focus on land, atmosphere
    and oceans
  • ESA required:
  • A fully relational database for calibration and
    validation
  • File QA/QC on all incoming files
  • Specifically, the COSE concept, including HDF4, to
    be implemented for atmosphere and ocean instrument
    validation

14
Envisat Cal/Val
  • Implemented in October 2000
  • Meets all ESA requirements, including QA and the
    RDB
  • 200 users, intercontinental participation
  • Backbone of the coordinated validation efforts of
    3 atmospheric and 2 ocean instruments
  • Data sources collected range from ship-based
    platforms to satellites
  • Extended to other Envisat/ESA validation
    activities (2004)
  • One issue: heterogeneous reporting
  • the ozonesonde problem was back!

15
2004-present
  • EOS-Aura validation activities
  • The Network for the Detection of Atmospheric
    Composition Change (NDACC, formerly the NDSC)

16
Aura Validation Data Center
  • Support the NASA EOS-Aura mission
  • 4 atmospheric instruments
  • troposphere to mesosphere measurements
  • Share validation communities with Envisat
  • AVDC uses the same concept/approach as Envisat
  • Maintains Envisat file compatibility
  • Operational since October 2004
  • 300 users - many new participants
  • 2 TB of validation data

17
AVDC modifications
  • Refined file and metadata requirements through
    thorough analysis of Envisat Cal/Val in early
    2004
  • Eliminated common user errors and misconceptions
  • Added an HDF5 implementation (user request, driven
    by HDF5 satellite data; see the sketch below)
  • Implemented rigid measurement reporting templates
  • Collaborative effort with NDACC
  • Measurement-specific and defined by the expert
    community
  • Results in truly homogeneous files
  • Document describing the changes by Bojkov et al.,
    2006, available through the AVDC web site
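A minimal sketch of the equivalent HDF5 write, assuming the h5py package;
the attribute names mirror the illustrative ones in the earlier HDF4
sketch and are not the authoritative AVDC list:

    import numpy as np
    import h5py

    with h5py.File("groundbased_o3_example.h5", "w") as f:
        # Global (file-level) attributes -- placeholder names.
        f.attrs["PI_NAME"] = "Doe; Jane"
        f.attrs["DATA_LOCATION"] = "EXAMPLE.STATION"
        f.attrs["FILE_GENERATION_DATE"] = "20060101T000000Z"

        # One dataset, named per the variable naming convention.
        data = np.zeros(50, dtype=np.float32)
        ds = f.create_dataset("O3_CONCENTRATION", data=data)
        ds.attrs["VAR_UNITS"] = "ppmv"
        ds.attrs["VAR_VALID_MIN"] = np.float32(0.0)
        ds.attrs["VAR_VALID_MAX"] = np.float32(20.0)
        ds.attrs["VAR_FILL_VALUE"] = np.float32(-99999.0)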

18
Future of the AVDC/Envisat implementation
  • AVDC refinements are being implemented in
    Envisat Cal/Val
  • The AVDC/Envisat format is being phased into the
    NDACC network
  • LIDAR, MWR, FTIR completed
  • The AVDC concept is to be extended to the NASA
    A-train (and most probably to NPP with respect to
    validation data)
  • The format is an ESA requirement for the ESA/EC
    GMES program
  • To be extended to land and radar altimetry
  • NASA and ESA to remain synchronized

19
Closing remarks
  • This type of work takes time
  • Began in 1998, and will remain ongoing
  • Requires proactive involvement by all parties
  • Validation teams, instrument PIs, satellite
    teams, etc.
  • Requires rigid guidelines - down to the
    measurement level
  • Once implemented, it is extremely useful for all
    involved, as experienced with the Envisat and Aura
    missions, where validation relies on very
    different data sources

20
  • bojan.bojkov@gsfc.nasa.gov
  • http://avdc.gsfc.nasa.gov/