Transcript and Presenter's Notes

Title: Splinter Session


1
Splinter Session: Space Weather Metrics, Verification & Validation.
Thursday 20th Nov., 16:30 - 18:00
2
Agenda
16:30  Introduction.
16:35  Towards a verification framework for forecasts of different centres, Andy Devos, J. Andries, C. Verbeeck, D. Berghmans (STCE-ROB), Belgium.
16:45  Verification of extreme event forecasts, Peter Wintoft, Swedish Institute of Space Physics (IRF), Sweden.
16:50  SWENET Index Quality Statistics Database Assessment, Alexi Glover, ESA.
16:55  Performance Verification of Solar Flare Prediction Techniques: Present Status, Caveats, and Expectations, Manolis Georgoulis, Academy of Athens, Greece.
17:00  The use of modified Taylor diagrams for comparing geophysical models, Matthew Angling, University of Birmingham, UK.
17:10  Translating verification experience from Meteorology to Space Weather, Suzy Bingham, Met Office, UK.
17:20  Lessons learned from CCMC-led community-wide model validation challenges. Outlook on international coordination of MV activities, Maria Kuznetsova, CCMC, USA.
17:30  Discussion.
18:00  Finish.
3
Aims
  • To understand current techniques used in space weather validation, verification & metrics.
  • To discuss key physical parameters & products which require validation / verification.
  • To ascertain the validation, verification & metric techniques required to move forward.
  • To determine the support required by the community.

4
  • Definitions
  • Validation: the process of determining the degree to which a product or service (including potentially software & associated data) accurately represents the real world from the perspective of the intended use(s).
  • E.g. the accuracy of the output of a model compared with truth data.
  • Verification: the process of determining that a system or service (including potentially software, implementation & associated data) performs as expected.
  • E.g. for redeployment of a model in a different location, to confirm that the model accepts the full range of inputs, etc.
  • Metrics: statistical parameters. Scientific metrics are related to specific key parameters, e.g. skill scores, Probability Of Detection (POD), False Alarm Ratio (FAR); see the sketch after this list.
  • Application metrics include scientific metrics but extend to overall service performance; they are essentially KPIs (Key Performance Indicators), e.g. accuracy & confidence in a service.
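
To make the validation and metrics definitions above concrete, the short Python sketch below compares hypothetical model output against truth data with a root-mean-square error and computes POD and FAR from a yes/no forecast contingency table. The data values and function names are illustrative assumptions, not taken from the presentation.

# Minimal sketch (assumed example, not from the presentation):
# validation as comparison of model output against truth data, plus the
# scientific metrics POD and FAR from a categorical contingency table.
import math

def rmse(model_output, truth):
    """Root-mean-square error of model output against truth data
    (a simple validation-style comparison)."""
    return math.sqrt(sum((m - t) ** 2 for m, t in zip(model_output, truth)) / len(truth))

def contingency_table(forecast_events, observed_events):
    """Count hits, misses, false alarms and correct negatives for
    paired yes/no forecasts and observations."""
    hits = misses = false_alarms = correct_negatives = 0
    for f, o in zip(forecast_events, observed_events):
        if f and o:
            hits += 1
        elif o:
            misses += 1
        elif f:
            false_alarms += 1
        else:
            correct_negatives += 1
    return hits, misses, false_alarms, correct_negatives

def pod(hits, misses):
    """Probability Of Detection: fraction of observed events that were forecast."""
    return hits / (hits + misses) if (hits + misses) else float("nan")

def far(hits, false_alarms):
    """False Alarm Ratio: fraction of forecast events that did not occur."""
    return false_alarms / (hits + false_alarms) if (hits + false_alarms) else float("nan")

if __name__ == "__main__":
    # Hypothetical index-like model output vs. observed values (validation example).
    model = [3.2, 4.1, 5.6, 2.8, 6.3]
    truth = [3.0, 4.5, 5.0, 3.1, 6.8]
    print(f"RMSE vs truth data: {rmse(model, truth):.2f}")

    # Hypothetical daily event / no-event forecasts vs. observations (metrics example).
    forecasts    = [True, True, False, True, False, False, True, False]
    observations = [True, False, False, True, True, False, True, False]
    h, m, fa, cn = contingency_table(forecasts, observations)
    print(f"POD = {pod(h, m):.2f}, FAR = {far(h, fa):.2f}")

Running it prints the RMSE and the two categorical scores; the same contingency-table counts also feed standard skill scores such as the Heidke Skill Score.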

5
  • Topics covered in last year's plenary session on Forecast Verification:
  • Forecast verification applied by forecast providers
  • Future development of forecast verification & communicating verification results and requirements between users and providers
  • Metaverification: evaluating performance measures
  • Magnetic and other space weather indices
  • Ground effects of space weather
  • Ionospheric and magnetospheric processes
  • Solar and interplanetary data.

6
Presentations...
7
  • Discussion topics
  • What metrics and validation techniques are
    required in the current space weather landscape?
  • What are the key challenges currently in model
    and forecast benchmarking?
  • What direction should the space weather community
    be taking?
  • What actions can agencies and organisations take
    in order to support a wider space weather
    validation effort?
  • How to establish agreed realistic model/service
    targets to encourage targeted development and
    prototyping?
  • What targeted actions would encourage groups not
    currently involved to further participate in
    space weather validation activities?