1
 Assessment of NPP Sea Ice Products and Product Continuity for Regional and Global Studies
  • Jim Maslanik
  • University of Colorado
  • Nov 4-6, 2003 

2
EDR/CDR List
  •      Target EDRs
  • ice edge location (EDR 1.2.3.2.2.1)
  • ice concentration (EDR 1.2.3.2.2.2)
  • sea ice characterization (EDR 1.7.8)
  •     Related EDRs
  • sea surface temperature near and within the ice
    pack (EDR 1.2.4)
  • sea ice albedo (EDR 1.5.2)
  • snow cover/depth over sea ice (EDR 1.6.3)
  • ice surface temperature (EDR 1.7.3)
  • basic radiance imagery, sea ice (EDR 1.2.3)
  •      Other EDRs (over sea ice-covered areas)
  • Cloud parameters, net heat flux

3
Researcher Roles
  • Principal Investigator: Jim Maslanik
  • - Algorithm and cal/val assessments,
    design/generation of proxy and synthetic data
    sets, other high-level research and reporting
    activities.
  • Research Assistants: Steve Hart, Charles Fowler
  • - Assistance with data processing, organization
    of comparison data sets and other general
    analysis tasks.

4
PI Background/Expertise
  • Studies of local, regional and hemispheric
    climate variability using a range of data types,
    process models, and climate models.
  • Algorithm development and product generation
    (AVHRR Polar Pathfinder, MODIS, SSM/I, SAR,
    AMSR-E).
  • Algorithm refinement via R/T modeling,
    coefficient/tie point selection, synthetic data.
  • Validation of satellite-derived sea ice and other
    products via collection and use of field,
    aircraft and satellite data (PI for field and
    modeling portion of the EOS AMSR-E sea ice
    product validation).
  • User of operational sea ice / weather / ocean
    data sets.

  • Consultant for CMIS ice EDR algorithms
    (Ball Aerospace and Raytheon).
5
Assessment Criteria
  • How well do the proposed EDRs represent key
    climate parameters? Are the EDRs measuring the
    most important and/or sensitive sea ice
    conditions or processes?
  • Are the EDRs able to extend existing long-term
    climate records? Level of reliance on sensor
    calibration vs. statistical matching?
  • How applicable are the EDRs to climate studies
    on a range of space and time scales, i.e., are
    they suitable for local and regional studies as
    well as hemispheric and global analyses?
  • Applicability to modeling, assimilation?
  • How well do the EDRs represent secondary sea
    ice parameters that may offer new climate change
    information?

6
Assessment Criteria (cont.)
  • How well do the EDRs perform under the full
    range of polar conditions? What capabilities
    (confidence flags, documentation, etc.) will be
    provided to indicate performance for users?
  • How sensitive are the EDRs to sensor
    calibration, degradation of in-orbit sensor
    performance, and to changes in external
    environmental factors? How do the magnitudes of
    these sensitivities compare to the expected
    climate-change signal?
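
To make this last criterion concrete, the minimal sketch below propagates an assumed brightness-temperature calibration bias through a simple linear tie-point retrieval (a heritage-style approach used only for illustration, not the NPP algorithm) and compares the resulting concentration error against an assumed climate trend. The tie-point values, bias, and trend magnitude are illustrative placeholders, not values from this presentation.

# Hypothetical sensitivity check: how a brightness-temperature calibration
# bias propagates into retrieved ice concentration, compared against an
# assumed climate-trend magnitude. All numbers are illustrative placeholders.

TB_OPEN_WATER = 177.0      # K, assumed open-water tie point
TB_CONSOLIDATED_ICE = 258.0  # K, assumed consolidated-ice tie point

def ice_concentration(tb: float) -> float:
    """Linear tie-point retrieval: 0 = open water, 1 = full ice cover."""
    c = (tb - TB_OPEN_WATER) / (TB_CONSOLIDATED_ICE - TB_OPEN_WATER)
    return min(max(c, 0.0), 1.0)

tb_scene = 220.0   # K, illustrative scene brightness temperature
cal_bias = 1.0     # K, assumed calibration uncertainty
dC = ice_concentration(tb_scene + cal_bias) - ice_concentration(tb_scene)

# Compare against an assumed climate signal of ~0.5% concentration per year
climate_trend_per_year = 0.005
years_equivalent = abs(dC) / climate_trend_per_year

print(f"Concentration change from a {cal_bias} K bias: {dC:+.4f}")
print(f"Equivalent to ~{years_equivalent:.1f} years of the assumed trend")

Under these assumptions, a 1 K bias maps to roughly a 1% concentration error, i.e., a few years' worth of the assumed trend, which is the kind of magnitude comparison the criterion calls for.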

7
Assessment Criteria (cont.)
  • How well do the sea ice EDRs integrate with
    other EDRs relevant for ice-ocean-atmosphere
    studies, such as cloud, flux, and SST products?
  • What types of metadata and documentation are
    included with the EDRs?
  • What calibration/validation approaches have been
    considered for the EDRs? How realistic and
    feasible are the cal/val plans in terms of cost,
    logistics, risk? Sufficient to assess claimed
    performance?

8
Approach
  • Review ATBDs, algorithms, cal/val plans,
    proxy and synthetic data sets
  • Independent performance assessment using
    observations and imagery, proxy and synthetic
    data, sensitivity studies
  • Performance assessment relative to heritage
    algorithms and advanced algorithms (see the
    comparison sketch at the end of this slide)
  • Document EDR performance relative to science
    user expectations
  • Assess data packaging issues and
    polar-specific aspects

(Within budget limits)
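
As a concrete illustration of the heritage-comparison step above, the following minimal sketch computes agreement statistics between a candidate EDR ice-concentration field and a heritage-algorithm field on a common grid. The function name, fill value, 5% agreement threshold, and synthetic test fields are illustrative assumptions, not part of the actual assessment plan.

# A minimal sketch of comparing a candidate EDR ice-concentration field
# against a heritage-algorithm field on a common grid, reporting simple
# agreement statistics. Names and thresholds are illustrative assumptions.
import numpy as np

def compare_to_heritage(edr_conc: np.ndarray,
                        heritage_conc: np.ndarray,
                        fill_value: float = -999.0) -> dict:
    """Return bias, RMSE, correlation, and fraction of cells within 5%."""
    valid = (edr_conc != fill_value) & (heritage_conc != fill_value)
    diff = edr_conc[valid] - heritage_conc[valid]
    return {
        "n_cells": int(valid.sum()),
        "bias": float(diff.mean()),
        "rmse": float(np.sqrt((diff ** 2).mean())),
        "correlation": float(np.corrcoef(edr_conc[valid],
                                         heritage_conc[valid])[0, 1]),
        "within_5pct": float((np.abs(diff) <= 0.05).mean()),
    }

# Illustrative usage with synthetic 0-1 concentration fields on a 100x100 grid
rng = np.random.default_rng(0)
heritage = np.clip(rng.random((100, 100)), 0.0, 1.0)
candidate = np.clip(heritage + rng.normal(0.0, 0.03, heritage.shape), 0.0, 1.0)
print(compare_to_heritage(candidate, heritage))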
9
Resources Offered/Requested
Offered
  • Validation/comparison data sets (satellite and
    aircraft imagery, atmospheric profiles, in-situ
    data).
  • Synthetic data of typical and extreme polar
    surface and atmospheric conditions.

Requested
  • Alternative/additional radiance models and
    other scene generation tools for comparison.

10
Contributions to Deliverables
  • Algorithm analysis report
  • Cal/val planning: logistics issues,
    coordination with other cal/val and field
    efforts, etc.
  • Others (data documentation, user-accessible
    docs., QC flags, etc.)

11
Questions?