Title: Monitoring the Quality of Observations - Data Monitoring at ECMWF
1 Monitoring the Quality of Observations - Data
Monitoring at ECMWF
- Quality Control
- Monitoring
2 Data extraction
- Blacklist
- Data skipped because of systematically bad
performance or for other reasons (e.g. data
being assessed in passive mode)
- Departures and flags remain available for
further assessment
- Check for duplicate reports
- Ship track check
- Hydrostatic check
- Thinning
- Data skipped to avoid oversampling
- Even so, departures from FG and AN are
generated, together with usage flags
- 4D-Var QC
- Rejections
- Used data → increments
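One of the checks listed above, the hydrostatic check, verifies that the reported thickness of a layer between two pressure levels agrees with the thickness implied by the layer mean temperature via the hypsometric equation. A minimal sketch; the function name, tolerance, and sample heights are illustrative, not the operational ECMWF code:

```python
import math

G = 9.80665   # gravity (m s^-2)
RD = 287.05   # gas constant for dry air (J kg^-1 K^-1)

def hydrostatic_residual(p_bottom, p_top, z_bottom, z_top, t_mean):
    """Difference (m) between the reported thickness of a layer and the
    thickness implied by the hypsometric equation for its mean temperature."""
    implied = (RD * t_mean / G) * math.log(p_bottom / p_top)
    return (z_top - z_bottom) - implied

# A roughly consistent 850-700 hPa layer (illustrative heights):
res = hydrostatic_residual(850.0, 700.0, 1457.0, 3012.0, 273.0)
suspect = abs(res) > 50.0   # flag the level pair beyond a tolerance
```

A large residual points at a transcription error in either the geopotential or the temperature of one of the levels.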
3 Data input
Data assimilation
- Raw observations
- Departures (FG, AN)
- Flags (data used, thinned, rejected)
- Feedback files (BUFR)
- ODB
Monthly BUFR files for different Obs types
Long term statistics
4 Monthly feedback files
- Interactive tools for all obs types allowing
selection of
- Layers/Areas
- Time window
- FG/AN
- All data/used/rejected
- Sondes VSTAT files
- RS
- Pilot/Profiler
- Dropsondes
- Monthly stats written to binary files for
- Surface obs
- Aircraft
5 Data Monitoring (Procedures)
- The basic information is included in the feedback
files
- The statistics are normally computed by comparing
the observations with a FG (6- or 12-hour
forecast)
- But the quality of those forecasts is not the
same everywhere → no fixed criteria should be
applied when assessing data quality
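The usual way to handle this spatially varying forecast quality is to normalise each departure by its expected spread before applying a single threshold. A minimal sketch, assuming Gaussian observation and background errors; the names and values are illustrative:

```python
import statistics

def normalized_departures(obs, fg, sigma_o, sigma_fg):
    """Scale each obs-minus-first-guess departure by the expected spread
    sqrt(sigma_o^2 + sigma_fg^2) so that one rejection threshold can be
    applied even where the forecast quality differs."""
    denom = (sigma_o ** 2 + sigma_fg ** 2) ** 0.5
    return [(o - f) / denom for o, f in zip(obs, fg)]

deps = normalized_departures([2.1, -0.5, 7.9], [1.8, -0.2, 2.0], 1.0, 0.8)
bias = statistics.mean(deps)
suspect = [abs(d) > 3.0 for d in deps]   # e.g. a 3-sigma screening
```

With sigma_fg varied by region, the same 3-sigma screen becomes looser where the forecast is known to be poorer.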
6 Blacklists
- The blacklist at ECMWF is flexible enough to
allow partial blacklisting depending on
- Parameters
- Areas
- Layers
- Time cycles
- And of course different observation types
- MetOps Data Monitoring elaborates a proposal to
update the blacklist, which is then discussed
with HMOS and HDA. For major changes,
sensitivity experiments are carried out before
implementing the new blacklist
7 Quality problems in Asia and Russia
8 50 of the UA stations blacklisted for wind are
PILOT, particularly in North Africa and Asia
9 Techniques used to assess the quality of
observations
Techniques
- Comparison of obs with BG fields
- Consistency between different data sources
- Self-consistency
Tools
- Check departures from FG and AN
- Co-locations
- Run time series for data types or individual
platforms
10 Radiosondes Monitoring
11 Daily Monitoring
- The Met Analyst on duty checks
- Observations
- Deterministic models
- EPS systems
- Troubleshooting
- The result is a Daily Report, available on our
internal web site, highlighting the important
issues of the day
12 Radiosondes Monitoring (1)
- ECMWF is the Lead Centre for Radiosondes
Monitoring
- We produce
- Monthly Global Reports, now available in pdf and
html format on our public web site
- http://www.ecmwf.int/products/forecasts/monitoring/mmr/
- http://www.ecmwf.int/products/forecasts/monitoring/mmr/mmr.pdf
- Support to GUAN, available on our web site (free
access)
- Twice a year, the Consolidated Report for
Suspect Stations
- Once a year, the list of the best stations to be
used as a reference for assessing the
performance of NWP models
13 Radiosondes Monitoring (2)
- There is an important exchange of information
among the main Monitoring Centres
- The results are fed back to WMO
15 OMEGA wind finding system cessation
16 Radiosondes Monitoring (3)
- One example showing the need for unified
criteria (sonde wind direction statistics →
used to detect bad antenna orientations)
- One example showing the benefits of information
exchange
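A wind-direction statistic like the one mentioned above has to be averaged on the circle, otherwise directions near the 360/0 degree wrap corrupt the mean. A hedged sketch of such a circular bias estimate; the function name and sample values are illustrative:

```python
import math

def mean_direction_bias(obs_dirs, fg_dirs):
    """Circular mean of (observed - first-guess) wind direction, in degrees.
    Averaging via unit vectors avoids the 359/1 degree wrap-around."""
    s = c = 0.0
    for o, f in zip(obs_dirs, fg_dirs):
        d = math.radians(o - f)
        s += math.sin(d)
        c += math.cos(d)
    return math.degrees(math.atan2(s, c))

# A station whose antenna is mis-set by about -20 degrees:
bias = mean_direction_bias([340.0, 2.0, 158.0], [0.0, 22.0, 178.0])
```

A persistent non-zero value of this statistic for one station, against unbiased neighbours, is the antenna-orientation signature the slide refers to.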
17 JMA and UKMO show a -20 degree bias
18 Site's directional setting changed by 6.5
degrees
19 Sondes height monitoring (1)
- Standard-level heights are computed by data
producers
- Significant-level heights are computed by data
users, using the station height included in
their UA catalogues
20 Sondes height monitoring (2)
- There are different techniques to assess the
actual height of RS stations
- Alduchov and Eskridge
- By computing separate (OBS-FG) statistics for
standard and significant levels, wrong station
heights can be detected
- The technique detects stations with systematic
SIGNIFICANT-STD level biases with a small
spread (std)
- The software is run at ECMWF on a monthly basis,
and the height catalogue is then corrected if
needed
- Example
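The significant-versus-standard level technique above can be sketched as follows; the thresholds, departure lists, and function name are illustrative assumptions, not the operational software:

```python
import statistics

def height_catalogue_suspect(std_dep, sig_dep, bias_tol=20.0, spread_tol=10.0):
    """Flag a station whose significant-level height departures (m) are
    offset from its standard-level ones with a small spread - the
    signature of a wrong station height in the user's catalogue."""
    offset = statistics.mean(sig_dep) - statistics.mean(std_dep)
    spread = statistics.pstdev(sig_dep)
    return abs(offset) > bias_tol and spread < spread_tol, offset

# Standard levels look fine, significant levels sit ~100 m off:
flagged, offset = height_catalogue_suspect([1.0, -2.0, 0.5],
                                           [99.0, 101.0, 100.0])
```

The sign and size of `offset` suggest the correction to apply to the catalogued station height.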
21 Full red line → observed temperature profile
Dashed red line → FG temperature profile
Dashed blue line → observed dew point profile
Dotted blue line → FG dew point profile
The profile looks OK, but the whole geopotential
profile (used at that time) was systematically
rejected by the analysis
23 Standard levels
Significant levels
24 Height correction applied
25 RS temperature bias correction at ECMWF
- Depends on solar elevation
- Depends on RS type
- Wide range of equipment used in the RS network
- Vaisala → very reliable (not corrected)
- Meisei (Japan) → very reliable (not corrected)
- VIZ → reliable (but needs TCORR in the
stratosphere)
- AVK-MRZ Meteorit (Russia) → not very reliable
in the stratosphere (needs TCORR in the
stratosphere)
- Indian → unreliable
- Chinese → not very reliable in the stratosphere
(needs TCORR in the stratosphere)
- A lot of RS don't report the sort of equipment
they are using, so we have to guess and use an
external list based on contact points and
long-term statistics
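A per-type, solar-elevation-dependent correction of this kind can be sketched as a simple lookup. The table values and function below are purely illustrative (the real TCORR values are not given in the slides):

```python
# Hypothetical correction table: (RS type, day/night) -> stratospheric
# temperature correction in K. The values are illustrative only.
TCORR = {
    ("AVK-MRZ", "day"): -0.8,
    ("AVK-MRZ", "night"): -0.3,
    ("VIZ", "day"): -0.5,
}

def corrected_temperature(t_obs, rs_type, solar_elev, pressure_hpa):
    """Apply a correction only in the stratosphere and only for types that
    need one; reliable types (Vaisala, Meisei) are left untouched."""
    if rs_type in ("VAISALA", "MEISEI") or pressure_hpa > 100.0:
        return t_obs
    phase = "day" if solar_elev > 0.0 else "night"
    return t_obs + TCORR.get((rs_type, phase), 0.0)
```

A real table would be keyed by solar elevation bands and pressure layers rather than a single day/night switch, but the dispatch-by-type logic is the point of the slide.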
27 A recent example
- Russian radiosondes troubleshooting
28 Huge coverage gap
29 The network is recovering now
30 Radiosondes humidity
- Currently the humidity from radiosondes is
blacklisted above 300 hPa
- We have plans to use the humidity from
radiosondes above that level, with some
restrictions
- RS80 (VAISALA) up to -60 °C
- RS90, RS92 (VAISALA) up to -80 °C
- MEISEI (Japan) up to -40 °C
31 Proposal of RS to be used for humidity above
300 hPa
32 PROFILERS Monitoring (Doppler radar atmospheric
wind profiles → high temporal and vertical
resolution)
- http://w3ec2.ecmwf.int/metops/d/inspect/catalog/Data_Monitoring/PROFILERS/
33 Profilers
- Doppler radar wind profiles
- 3 different networks
- EU: mixed quality, some of them show a poor
performance
- USA: good and consistent network
- Japan: very good and consistent network
37 Dropsondes monitoring
- Temperature and wind used
- Humidity blacklisted (under assessment)
- Daily monitoring done when they show up
38 Surface observations
- The problem of the station height catalogue
42 SGN  REG     LAT     LON    HP    HA  STATION       COUNTRY
60320    1    35.90   -5.32     4     2  CEUTA         SPAIN (CEUTA AND MELILLA) / ES
60710    1    36.95    8.75    21    20  TABARKA       TUNISIA / TUNISIE
62801    1    11.75   32.78   282   282  RENK          SUDAN / SOUDAN
63881    1    -7.97   31.63  1923  -999  SUMBAWANGA    UNITED REPUBLIC OF TANZANIA /
65355    1     9.77    1.10   343   342  NIAMTOUGOU    TOGO
68014    1   -19.60   18.12  1400  1411  GROOTFONTEIN  NAMIBIA / NAMIBIE
68030    1   -18.53   25.63  1071  -999  PANDAMATENGA  BOTSWANA
68343    1   -27.65   25.62  1128  1228  BLOEMHOF      SOUTH AFRICA / AFRIQUE DU SUD
68821    1   -33.62   19.47   270   269  WORCESTER     SOUTH AFRICA / AFRIQUE DU SUD
43 Surface observations
- The WMO catalogue should be the basic source of
information, but
- Different Centres use different station
catalogues, with significant differences
- See the next examples
44 UKMO applies a bias correction of 6.8 hPa
Source: Doc. 4/Add.1, submitted by Mr I. Gitonga
46 Another example
- Tunisia Synop 60718
- Not blacklisted at UKMO
- Not bias corrected at UKMO
- Strongly biased in the ECMWF statistics, with a
small std → Kalman filtering should be done;
in the mean time this Synop is blacklisted
47 Kalman filter bias correction
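A scalar Kalman filter of the kind referred to here can be sketched in a few lines; `q` and `r` are illustrative tuning values, not ECMWF's:

```python
def kalman_bias(departures, q=0.01, r=1.0):
    """Scalar Kalman filter tracking a slowly varying station bias from a
    stream of obs-minus-background departures: q is the assumed process
    noise (how fast the bias may drift), r the departure noise variance."""
    b, p = 0.0, 1.0            # initial bias estimate and its variance
    for d in departures:
        p += q                 # predict: the bias persists, uncertainty grows
        k = p / (p + r)        # Kalman gain
        b += k * (d - b)       # update with the new departure
        p *= 1.0 - k
    return b

# A station with a steady ~6.8 hPa pressure departure converges toward it:
est = kalman_bias([6.8] * 50)
```

Because the gain shrinks as the estimate settles, a station with a stable bias and small std is corrected smoothly instead of being blacklisted outright.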
48 Surface data monitoring
- Why we cannot rely on automatic monitoring alone
- Very often a meteorological assessment is needed
49 An event of mesoscale increments in the Rockies
Automatic station
50 The monitoring of METAR data
- We have been assimilating METAR data in passive
mode since March 29th, 2004
- As for any other new data source, monitoring
statistics are run for a few months. If the data
quality is considered acceptable, sensitivity
experiments are run to assess the impact of the
new data (this includes the development of a new
blacklist)
- If everything is OK, the new data source is
switched to active in the data assimilation
- A few comparisons of Metar vs Synop data follow
51 Very often Synops from Colombia, Peru, Ecuador
and Mexico are missing, but we regularly receive
a good number of Metar reports from these
countries
53 Much smaller random deviation in the METAR
data, pointing to a much smaller number of very
bad reports
54 Impressive difference in quality in Central and
South America
55 Drifters Monitoring → the most relevant
problems are related to
- Bad locations
- Sudden deterioration
56 Gross errors
57 Biased
Kalman correction
Sudden deterioration
58 Summary: surface data
- A substantial number of Synops showing pressure
biases with small std values are turning up in
the monthly statistics. These results point to a
station height catalogue which is not correct
- We need a correct and unified station height
catalogue; otherwise
- Bias correction and/or Kalman filtering is
needed
- Why blacklist a station that could be useful?
- The usage of Metar data will hopefully improve
the situation (aeronautical operations require a
tighter QC than Synop)
- We'll have to remove Metar-Synop duplicates
- A new blacklist including Metar has already been
proposed
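The Metar-Synop duplicate removal mentioned above could be sketched as follows; the tuple layout and the preference for the Synop report are illustrative assumptions:

```python
def drop_metar_synop_duplicates(reports):
    """Keep one surface report per (station, time); when both a Synop and
    a Metar are present, prefer the Synop. 'reports' is a list of
    (station_id, time, source, value) tuples - a hypothetical layout."""
    kept = {}
    for stn, t, src, val in reports:
        key = (stn, t)
        if key not in kept or (src == "SYNOP" and kept[key][0] == "METAR"):
            kept[key] = (src, val)
    return kept

obs = [("60710", "12Z", "METAR", 1013.2),
       ("60710", "12Z", "SYNOP", 1013.0),
       ("62801", "12Z", "METAR", 1009.5)]
kept = drop_metar_synop_duplicates(obs)
```

The result keeps the Synop for station 60710 and falls back to the Metar where no Synop was received, matching the intent of using Metar to fill Synop gaps.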
59 Aircraft
- They provide good-quality temperature and wind
data, in particular the automated observations
(AMDAR, ACARS)
- More and more ascent and descent profiles are
available on the GTS
- Humidity sensors are under assessment
60 Aircraft Monitoring
- The quality of this data source is comparable to
radiosondes
- Lots of vertical profiles are available over
North America and Europe
- The quality of these data is the reason why we
don't use AMVs over North America or Europe
- See the next comparison, ACARS vs. RS
63 The 9-degree problem in American ACARS
Used data
All data
64 AMV
- Atmospheric Motion Vectors (formerly SATOB)
67 Wind speed negative bias removed by QC
68 The negative speed bias on jets overcorrected
by data producers
69 Co-locations
- AMV vs AIRCRAFT
- AMV vs RADIOSONDES
70 Satellite data monitoring (G. van der Grijn's
lecture)
- Just one example, extracted from the Daily
Monitoring Reports available on the internal web
71 ATOVS
- Daily monitoring
- Time series displayed on the web help to
identify problems in different channels and
instruments
- See this example from December 2002
- Longer-term monitoring includes bias correction