1
(No Transcript)
2
Outline
  • PRISM
  • goals and benefits
  • FP5 project and the Support Initiative
  • organisation
  • the PRISM Areas of Expertise
  • OASIS
  • historical background
  • the community today
  • some key notes
  • The OASIS4 coupler
  • model adaptation
  • component model description
  • coupled model configuration
  • communication
  • regridding/transformations
  • grids supported
  • future developments and conclusions

3
  • Increase what Earth system modellers have in
    common today (compilers, message passing
    libraries, algebra libraries, etc.)
  • Share the development, maintenance and support of
    a wider set of Earth System Modelling software
    tools and standards.

4
  • reduce the technical development efforts of each
    individual research team
  • facilitate assembling, running, monitoring, and
    post-processing of ESMs based on state-of-the-art
    component models

Help climate modellers spend more time on science
  • promote key scientific diversity
  • increase scientific collaboration
  • stimulate computer manufacturer contribution
    (tool portability, optimization of next
    generation of platforms for ESM needs)

5
  • 2001-2004: the PRISM EU project
  • a European project funded with €4.8M by the EC
  • 22 partners
  • 2005-2008: the PRISM Support Initiative
  • 7 partners: CERFACS, CGAM, CNRS, ECMWF, MPI-MD,
    UK MetOffice, NEC-CCRLE
  • 9 associate partners: CSC, CRAY, IPSL,
    Météo-France, MPI-M, NEC-HPCE, SGI, SMHI, SUN
  • 8 person-years per year for 3 years

6
(No Transcript)
7
  • PRISM is organised around 5 PRISM Areas of
    Expertise
  • Promotion and, if needed, development of software
    tools for ESM
  • Organisation of related network of experts
  • Technology watch
  • Promotion of community standards
  • Coordination with other international efforts

8
  • Leader: S. Valcke (CERFACS)
  • development and support of tools for coupling
    climate modelling component codes
  • OASIS3 and OASIS4 couplers
  • technology watch on coupling tools developed
    outside PRISM
  • PALM coupler (CERFACS)
  • Bespoke Framework Generator (U. of Manchester)
  • CCSM (NCAR)
  • relations with projects involving code coupling
  • UK Met Office FLUME project
  • US ESMF project, GENIE project
  • ACCESS

9
  • Leader: M. Carter (UK MetOffice)
  • source version control for software development
  • code extraction and compilation
  • job configuration and running
  • Subversion for source version control (PRISM SVN
    server in Hamburg)
  • Standard Compiling and Running Environments
    (SCE/SRE, MPI-MD)
  • integrated environment to compile and run
    different coupled models based on different
    component models (standard directory structure,
    coding rules)
  • Flexible Configuration Management (FCM, UK Met
    Office)
  • version control, compilation
  • prepIFS (ECMWF)
  • tailored graphical interface for model
    configuration (Web services tech.)
  • prepOASIS4: GUI to configure a coupled model
    using OASIS4
  • Supervisor Monitor Scheduler (SMS, ECMWF)
  • management of networks of jobs across a number of
    platforms

10
  • Leader: M. Lautenschlager (MPI-MD)
  • data processing, visualisation, archiving and
    exchange for Earth system research
  • standards and infrastructure for networking
    between geographically distributed archives
  • CDO (Climate Data Operators, from MPI-M)
  • CDAT (Climate Data Analysis Tools, from PCMDI)
  • CERA-2 data model (World Climate Data Centre)
  • detection, browsing and use of geo-referenced
    climate data
  • MARS (ECMWF)
  • meteorological data access and manipulation (for
    major NWP sites)

11
  • Leader: L. Steenman-Clark (CGAM)
  • Metadata: data about data, models, runs, ...
  • a hot topic in the last few years
  • exchange and use of data
  • interchangeability of Earth system models or
    modelling components
  • forum to discuss, develop, and coordinate
    metadata issues
  • Numerical Model Metadata (U. of Reading):
    numerical code bases, simulations
  • CURATOR project (USA): data, codes,
    simulations
  • Numerical grid metadata (GFDL, USA): grid
  • netCDF CF convention (PCMDI and BADC): climate
    and forecast data files
  • OASIS4 metadata: coupling and I/O interface
  • UK Met Office FLUME project: management of
    model configuration

12
  • Leaders: M.-A. Foujols (IPSL), R. Redler
    (NEC-CCRLE)
  • Computing aspects are highly important for Earth
    system modelling
  • Computer vendors have to be kept informed about
    requirements emerging from the climate modelling
    community
  • Earth system modellers have to be informed about
    computing issues to anticipate difficulties and
    evolutions
  • file IO and data storage
  • algorithmic development
  • portable software to fit the needs of parallel
    and vector systems
  • sharing of experience (e.g. work on the Earth
    Simulator)
  • establishment of links with computing projects
    (e.g. DEISA)
  • information about important conferences and
    workshops.

13
OASIS has been developed since 1991 to couple existing
GCMs: OASIS1 → OASIS2 → OASIS3 (from 1991), then
OASIS4 within PRISM (from 2001)
  • OASIS1, OASIS2, OASIS3
  • low resolution, low number of 2D fields, low
    coupling frequency
  • flexibility very important, efficiency not so
    much!
  • OASIS4
  • higher resolution parallel models, massively
    parallel platforms, 3D fields
  • need to optimise and parallelise the coupler

14
  • CERFACS (France): ARPEGE3 - ORCA2-LIM,
    ARPEGE4 - NEMO-LIM - TRIP
  • METEO-FRANCE (France): ARPEGE4 - ORCA2,
    ARPEGE medias - OPAmed,
    ARPEGE3 - OPA8.1-GELATO
  • IPSL - LODYC, LMD, LSCE (France):
    LMDz - ORCA2LIM, LMDz - ORCA4
  • MERCATOR (France): (for interpolation only)
  • MPI-MD (Germany): ECHAM5 - MPI-OM, ECHAM5 - C-HOPE,
    PUMA - C-HOPE, EMAD - E-HOPE,
    ECHAM5 - E-HOPE, ECHAM4 - E-HOPE
  • ECMWF: IFS - CTM, IFS - ORCA2

15
(No Transcript)
16
  • Developers: CERFACS, NEC CCRLE, CNRS, SGI, (NEC
    HPCE)
  • Public domain, open source license (LGPL)
  • Programming languages: Fortran 90 and C
  • Public domain libraries (vendor-optimized
    versions may exist):
    MPI1 and/or MPI2, NetCDF/parallel NetCDF, libXML,
    mpp_io, SCRIP
  • Static coupling

17
The OASIS3 coupler
  • Coupler developed for more than 15 years at
    CERFACS
  • Stable, well-debugged, but limited
  • Latest version, oasis3_prism_2-5, delivered in
    September 2006
  • User support provided, but most development
    efforts go to OASIS4
  • Mono-process coupler + parallel coupling library
    (PSMILe)
  • synchronisation of the component models
  • coupling fields exchange
  • I/O actions
  • mono-process interpolation
  • Platforms
  • Fujitsu VPP5000, NEC SX5-6-8, Linux PC, IBM
    Power4, CRAY XD1, Compaq, SGI Origin, SGI O3400

18
OASIS3 model adaptation
PRISM System Model Interface Library (PSMILe) API
  • Initialization
  • Global grid definition (master process only)
  • Local partition definition
  • Coupling or I/O field declaration
  • Coupling or I/O field sending and receiving
  • in model time stepping loop
  • depending on the user's specifications in namcouple
  • user-defined source or target (end-point
    communication)
  • coupling or I/O sending or receiving at
    appropriate times
  • automatic averaging/accumulation
  • automatic writing of coupling restart file at end
    of run
  • call prism_put (var_id, time, var_array, ierr)
  • call prism_get (var_id, time, var_array, ierr)
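As an illustration, a minimal sketch of how these two calls might sit in a component model's time-stepping loop; all names, array sizes and the time step below are hypothetical, and the PSMILe module use statements, the initialisation/declaration phase and error handling are omitted.

    ! Minimal sketch (hypothetical names): OASIS3 PSMILe exchanges inside the
    ! time-stepping loop; initialisation, partition definition and field
    ! declaration (which return il_var_taux / il_var_sst) are assumed done.
    integer, parameter :: nlon = 96, nlat = 72        ! local partition size (hypothetical)
    integer :: istep, nb_steps, dt, time, ierr
    integer :: il_var_taux, il_var_sst                ! field IDs from the declaration phase
    real(kind=8), dimension(nlon,nlat) :: taux_array, sst_array

    do istep = 1, nb_steps
       time = (istep - 1) * dt                        ! model time in seconds since the start of the run
       ! receive a coupling or I/O field; the PSMILe acts only at the
       ! times requested in the namcouple and returns immediately otherwise
       call prism_get (il_var_taux, time, taux_array, ierr)
       ! ... one model time step ...
       ! send a coupling or I/O field; averaging or accumulation over the
       ! coupling period is performed automatically if asked for in the namcouple
       call prism_put (il_var_sst, time, sst_array, ierr)
    end do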

19
  • In the text file namcouple, read by the OASIS3 main
    process and distributed to the component model
    PSMILes at the beginning of the run:
  • total run time
  • component models
  • number of coupling fields
  • for each coupling field
  • source and target names (end-point communication)
    (var_name)
  • grid acronym (grid_name)
  • coupling and/or I/O status
  • coupling or I/O period
  • transformations/interpolations
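To make this concrete, a schematic outline of the information a namcouple carries for one coupling field; this illustrates the items listed above, not the literal OASIS3 syntax, and the model, field and grid names are invented.

    # schematic namcouple content (illustrative only, not exact OASIS3 syntax)
    # total run time                 e.g. 432000 seconds
    # component models               e.g. 2: toyatm, toyoce
    # number of coupling fields      e.g. 1
    # for each coupling field:
    #   source and target names      e.g. SOSSTSST (source) -> SISUTESU (target)
    #   grid acronyms                e.g. toce (source), atmo (target)
    #   coupling and/or I/O status   e.g. EXPORTED
    #   coupling or I/O period       e.g. 86400 seconds
    #   transformations              e.g. LOCTRANS (time average), SCRIPR (bilinear regridding)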

20
PSMILe based on MPI1 or MPI2 message passing
21
  • separate sequential process
  • neighbourhood search
  • weight calculation
  • interpolation per se during the run
  • on 2D scalar or vector fields
  • Interfacing with RPN Fast Scalar INTerpolator
    package
  • nearest-neighbour, bilinear, bicubic for regular
    Lat-Lon grids
  • Interfacing with SCRIP1.4 library
  • nearest-neighbour, 1st and 2nd order conservative
    remapping for all grids
  • bilinear and bicubic interpolation for
    logically-rectangular grids
  • Bilinear and bicubic interpolation for reduced
    atmospheric grids
  • Other spatial transformations: flux correction,
    merging, etc.
  • General algebraic operations
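All of the SCRIP-type remappings above come down, at interpolation time, to applying precomputed weights and addresses that link source and target grid points. A minimal sketch of that step, with hypothetical array names, assuming the weights and addresses have already been computed:

    ! Applying precomputed remapping weights and addresses (SCRIP style);
    ! names are hypothetical and allocation/filling of the arrays is assumed.
    ! src_address(l), dst_address(l) : source/target point indices of link l
    ! remap_weight(l)                : weight of that link
    integer :: l, num_links
    integer, dimension(:), allocatable :: src_address, dst_address
    real(kind=8), dimension(:), allocatable :: remap_weight, field_src, field_dst

    field_dst(:) = 0.0d0
    do l = 1, num_links
       field_dst(dst_address(l)) = field_dst(dst_address(l)) &
                                 + remap_weight(l) * field_src(src_address(l))
    end do

Higher-order schemes (2nd order conservative, bicubic) add further terms involving field gradients, but follow the same weights-and-addresses pattern.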

22
  • A parallel central Driver/Transformer
  • launches models at the beginning of the run
    (MPI2)
  • reads the user-defined configuration information
    and distributes it to the component PSMILes
  • performs parallel transformations of the coupling
    fields during the run
  • A parallel model interface library (PSMILe) that
    performs
  • weight-and-address calculation for the coupling
    field interpolations
  • MPI-based coupling exchanges between components
  • component I/O (GFDL mpp_io library)

23
  • Initialization
  • call prism_init_comp (comp_id, comp_name, ierr)
  • call prism_get_localcomm (comp_id, local_comm,
    ierr)
  • Definition of grid (3D)
  • call prism_def_grid (grd_id, grd_name, comp_id,
    grd_shape, type, ierr)
  • call prism_set_corners(grd_id, nbr_crnr,
    crnr_shape, crnr_array, ierr)
  • Placement of scalar points and mask on the grid
  • call prism_set_points (pt_id, pt_name, grd_id,
    pt_shape, pt_lon, pt_lat, pt_vert, ierr)
  • call prism_set_mask (msk_id, grd_id, msk_shape,
    msk_array, ierr)
  • Function overloading to keep the interface
    concise and flexible
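A minimal sketch of how this definition phase might look in a component model; the character strings, array contents and grid type are placeholders, and the PSMILe module use statements, array declarations and error handling are omitted.

    ! Hypothetical sketch of the OASIS4 PSMILe definition phase
    call prism_init_comp (comp_id, 'toyatm', ierr)          ! register this component
    call prism_get_localcomm (comp_id, local_comm, ierr)    ! MPI communicator local to it

    ! grd_shape holds the index bounds of the local partition of the 3D grid,
    ! grd_type identifies one of the supported grid types (see the grid list below)
    call prism_def_grid (grd_id, 'atm3d', comp_id, grd_shape, grd_type, ierr)
    call prism_set_corners (grd_id, nbr_crnr, crnr_shape, crnr_array, ierr)

    ! place the scalar points (e.g. cell centres) and the mask on that grid
    call prism_set_points (pt_id, 'atm3d_centres', grd_id, pt_shape, &
                           pt_lon, pt_lat, pt_vert, ierr)
    call prism_set_mask (msk_id, grd_id, msk_shape, msk_array, ierr)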

24
(No Transcript)
25
  • Coupling or I/O field declaration
  • call prism_def_var (var_id, var_name, grd_id, pt_id,
    msk_id, var_nbrdims, var_shape, var_type, ierr)
  • End of definition
  • call prism_enddef (ierr)
  • Coupling or I/O field sending and receiving
  • in model time stepping loop
  • depending on the user's specifications in the SMIOC
  • user-defined source or target, component or
    file (end-point communication)
  • coupling or I/O sending or receiving at
    appropriate times
  • averaging/accumulation
  • call prism_put (var_id, date, date_bounds,
    var_array, info, ierr)
  • call prism_get (var_id, date, date_bounds,
    var_array, info, ierr)
  • Termination
  • call prism_terminate (ierr)
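Continuing the sketch of the definition phase, the declaration, exchange and termination calls might be arranged as follows; the names are again hypothetical, declarations are omitted, and the date and date_bounds arguments stand for the PSMILe date information filled from the model's calendar.

    ! Hypothetical sketch: field declaration, exchange loop and termination.
    ! One prism_def_var call is made per coupling or I/O field; var_id_in and
    ! var_id_out below are assumed to come from two such calls.
    call prism_def_var (var_id_out, 'SOSSTSST', grd_id, pt_id, msk_id, &
                        var_nbrdims, var_shape, var_type, ierr)
    call prism_enddef (ierr)                  ! end of the definition phase

    do istep = 1, nb_steps
       ! date and date_bounds describe the current model date and the bounds
       ! of the time step (filled from the model calendar, details omitted)
       call prism_get (var_id_in,  date, date_bounds, taux_array, info, ierr)
       ! ... one model time step ...
       call prism_put (var_id_out, date, date_bounds, sst_array, info, ierr)
    end do

    call prism_terminate (ierr)               ! clean shutdown of the PSMILe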

26
  • Application and component description (XML
    files)
  • For each application (code): one Application
    Description (AD)
  • possible number of processes
  • components included
  • For each component in the application:
  • one Potential Model Input and Output Description
    (PMIOD)
  • component general characteristics: name,
    component simulated, ...
  • grid information: domain, resolution(s), grid
    type, ...
  • potential I/O or coupling variables:
  • local name, standard name
  • units, valid min and max
  • numerical type
  • associated grid and points
  • intent: input and/or output
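Purely as an illustration of the kind of information a PMIOD gathers, a small XML-like fragment; the element and attribute names below are invented for this sketch and do not reproduce the actual OASIS4 schema.

    <!-- hypothetical PMIOD-like fragment; element and attribute names invented -->
    <component name="toyatm" simulated="atmosphere">
      <grid name="atm3d" domain="global" resolution="T42 L19" type="gaussreduced_regvrt"/>
      <variable local_name="sst_in" standard_name="sea_surface_temperature"
                units="K" valid_min="270." valid_max="320."
                numerical_type="double" grid="atm3d" points="atm3d_centres"
                intent="input"/>
    </component>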

27
  • (Through a GUI), the user produces:
  • a Specific Coupling Configuration (SCC)
  • experiment and run start date and end date
  • applications, components for each application
  • host(s), number of processes per host, ranks for
    each component
  • For each component,
  • a Specific Model Input and Output Configuration
    (SMIOC)
  • grid information: chosen resolution, ...
  • I/O or coupling variables:
  • name, units, valid min/max, numerical type, grid
  • activated intent: input and/or output
  • source and/or target (component and/or file)
  • coupling or I/O dates
  • transformations/interpolations/combinations

28
(No Transcript)
29
OASIS4 communication (1/2)
  • Model interface library PSMILe, based on MPI1 or
    MPI2
  • Parallel communication including repartitioning
  • based on geographical description of the
    partitions
  • parallel calculation of communication patterns in
    source PSMILe

30
OASIS4 communication (2/2)
  • end-point communication (the source doesn't know
    the target and vice-versa)
  • parallel 3D neighbourhood search, based on
    efficient multigrid algorithm, in each source
    process PSMILe
  • extraction of useful part of source field only
  • one-to-one, one-to-many
  • parallel I/O (vector, bundles, vector bundles):
    GFDL mpp_io, parNetCDF

31
  • source time transformations (prism_put)
  • average, accumulation
  • target time transformations (prism_get)
  • time interpolation (for I/O only)
  • statistics
  • local transformations
  • addition/multiplication by scalar
  • interpolation/regridding (3D)
  • nearest-neighbour 2D: in the horizontal, none
    in the vertical
  • nearest-neighbour 3D
  • bilinear: in the horizontal, none in the
    vertical
  • bicubic (gradient, 16 neighbours): in the
    horizontal, none in the vertical
  • trilinear
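As a conceptual illustration of the source time average (something the PSMILe performs internally between two coupling dates, not code the user has to write), a sketch with hypothetical names:

    ! Conceptual sketch of the source-side time average over one coupling period
    accum(:,:) = accum(:,:) + field(:,:)             ! accumulate at every prism_put call
    nb_calls   = nb_calls + 1
    if (coupling_date_reached) then
       accum(:,:) = accum(:,:) / real(nb_calls, 8)   ! average over the coupling period
       ! ... the averaged field is what is actually sent or written ...
       accum(:,:) = 0.0d0                            ! reset for the next coupling period
       nb_calls   = 0
    end if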

32
  • Regridding, repartitioning, I/O:
  • Regular in lon, lat, vert (Reglonlatvrt)
  • lon(i), lat(j), height(k)
  • Irregular in lon and lat, regular in the vert
    (irrlonlat_regvrt)
  • lon(i,j), lat(i,j), height(k)
  • Irregular in lon, lat, and vert (irrlonlatvrt)
    (not fully tested)
  • lon(i,j,k), lat(i,j,k), height(i,j,k)
  • Gaussian Reduced in lon and lat, regular in the
    vert (Gaussreduced_regvrt)
  • lon(nbr_pt_hor), lat(nbr_pt_hor), height(k)
  • Repartitioning and I/O only:
  • Non-geographical fields
  • no geographical information attached
  • local partitions described in the global index
    space (prism_def_partition)
  • I/O only:
  • Unstructured grids (unstructlonlatvrt)
  • lon(npt_tot), lat(npt_tot), height(npt_tot)

33
  • OASIS4 tested and run with toy examples on
  • NEC SX6 and SX8
  • IBM Power4
  • SGI O3000/2000
  • SGI IA64 Linux server Altix 3000/4000
  • Intel Xeon Infiniband and Myrinet Clusters
  • Linux PC DELL
  • OASIS4 is now being used in a small number of real
    applications
  • EU project GEMS: atmospheric dynamics and
    chemistry coupling
  • SMHI: regional ocean-atmosphere coupling
  • UK Met Office: global ocean-atmosphere coupling
    (currently prototyping)
  • IFM-GEOMAR (Kiel): in pseudo-models to interpolate
    high-resolution fields
  • Current developments:
  • 2D conservative remapping
  • Parallel global search for the interpolation
  • Transformer efficiency
  • Full validation of current transformations
  • Full public release planned for the beginning of
    2007.

34
Conclusions
  • PRISM provides
  • framework promoting common software tools for
    Earth system modelling
  • some (more or less) standard tools (OASIS,
    source management, compiling, ...)
  • network allowing ESM developers to share
    expertise and ideas
  • visible entry point for international
    coordination
  • metadata definition
  • WCRP white paper with ESMF on Common Modeling
    Infrastructure for the International Community
  • PRISM's current decentralised organisation
    (bottom-up approach)
  • allows best-of-breed tools to emerge naturally
  • relies on the developments done in the different
    partner groups
  • Additional funding needed for:
  • more networking and coordination activities
  • specific technical developments within the
    partner groups
  • Additional contributors are most welcome to join!

35
The end