1
Status of the International Lattice Data Grid
  • Karl Jansen
  • For the ILDG Community
  • http://www.lqcd.org

Recommended: live demo at the poster session (B. Joo,
C. Maynard, D. Pleiter, C. DeTar)
2
Participating Countries
  • Australia, http://www.cs.adelaide.edu.au/users/paulc
  • France, http://www-zeuthen.desy.de/latfor/ldg
  • Germany, http://www-zeuthen.desy.de/latfor/ldg
  • UK, http://www.gridpp.ac.uk/qcdgrid/index.html
  • Italy, http://www-zeuthen.desy.de/latfor/ldg
  • Japan, http://www.jldg.org
  • USA, http://www.usqcd.org
  • Common aspect: all are (more or less successful)
    participants in the soccer Championship
  • Expect new ILDG members: Spain, Portugal, Ghana, …

3
Lattice QCD in Australia
  • Adelaide: South Australian Partnership for
    Advanced Computing (SAPAC), 1.1 Tflops dedicated
    to QCD (future: >50% of 4 Tflops)
  • Canberra: Australian Partnership for Advanced
    Computing (APAC), 0.4 Tflops dedicated to QCD
4
Lattice QCD in France
  • 2 racks apeNEXT, 1.2 Tflops
  • To be installed at Rome I, La Sapienza

5
Lattice QCD in Germany
  • 10 racks apeNEXT, 6 Tflops (4 racks in Zeuthen, 6
    racks in Bielefeld), dedicated to LGT
  • National Supercomputer Centers,
    20% available for LGT
  • NIC BG/L system at FZ-Jülich, 45 Tflops
  • NIC IBM Regatta system at FZ-Jülich, 10 Tflops
  • Altix system at LRZ Munich, 35 Tflops

6
Lattice QCD in the UK
  • UKQCD organization
  • 14K node QCDOC at Edinburgh, 12 Tflops

7
Lattice QCD in Italy
  • 12 racks apeNEXT, 7.2 Tflops, dedicated to
    lattice QCD
  • Installed at Rome I, La Sapienza,
    maintained by INFN

8
Lattice QCD in Japan
  • PACS-CS, 14.3 Tflops in Tsukuba
  • BG/L 57.3 Tflops at KEK
  • Smaller O(1) Tflops installations at Hiroshima,
    KEK, Kyoto

(Map of lattice QCD sites in Japan: Hokkaido, Kanazawa, Tsukuba, KEK, Tokyo, Kyoto, Osaka, Hiroshima)
9
Lattice QCD in the US
  • QCDOC, 3.4 Tflops (sustained)
  • Clusters at Fermilab and JLab,
    2 Tflops (sustained)
  • National Supercomputer Centers:
    NERSC, ORNL, Pittsburgh,
    expected 0.5 Tflops
  • Plan of several BG/Ps,
    O(10) Tflops each year

10
The overall picture
  • O(150) Tflops available for lattice calculations
  • Large collaborations: UKQCD, RBC, JLQCD, MILC,
    QCDSF, CSSM, PACS-CS, ETMC
  • Already now:
    sharing configurations within collaborations
    using ILDG

11
Near Future Configurations on the Grid
  • RBC/UKQCD: DWF, Nf = 2+1,
    a = 0.12 fm, m = 0.02, 0.03, 0.04, V = 16³×32, 24³×48
  • MILC: rooted staggered on Asqtad, Nf = 2+1,
    a = 0.06-0.125 fm, ml/ms = 0.1, 0.2, 0.4,
    V = 40³×96, 48³×144
  • PACS-CS: Iwasaki + NP improved Wilson, Nf = 2+1,
    a = 0.07-0.12 fm, V = 16³×32, 20³×40, 28³×56
  • CSSM: FLIC, Nf = 2+1, a = 0.12 fm, V = 16³×32
  • QCDSF: Wilson + NP improved Wilson, Nf = 2,
    a = 0.05-0.11 fm, mPS = 0.25-1 GeV, V up to 32³×64
  • ETMC: tlSym + maximal tmQCD, Nf = 2,
    a = 0.07-0.12 fm, mPS = 250-500 MeV, V = 24³×48, 32³×64

12
Configurations from Older Ensembles
  • Gauge Connection, staggered, Nf = 2
  • SESAM/GRAL/TXL, Wilson, Nf = 2
  • CP-PACS, improved Wilson, Nf = 2

13
Policies to use configurations
  • Dependent on the collaboration:
  • Acknowledgment,
  • waiting period of 6 months,
  • waiting for publication of key paper,
  • draft in advance,
  • immediate access,
  • cite certain papers,
  • ask for collaboration

14
Two key Elements to make this work
  • Metadata
  • G. Andronico, P. Coddington, R. Edwards, B.
    Joo,
  • C. Maynard, D. Pleiter, J. Simone, T. Yoshie
  • Middleware
  • G. Beckett, D. Byrne, M. Ernst, B. Joo, M.
    Sato,
  • C. Watson

15
Metadata Catalogue
  • Stable version available: QCDml 1.3
  • Allows querying web sites for configurations
  • Implementations exist for:
  • JLDG (Japan),
  • LatFor DataGrid (Germany/France/Italy),
  • QCDgrid (UK),
  • USQCD

16
An example (from LDG)
17
To download
  • Grid certificate (only once)
  • (become a member of the VO ildg)
  • Get free software (e.g. ltools for Germany) (only
    once)
  • lget <dataLFN> (see the sketch below), e.g.
  • lfn://ldg/gral/wilson_nf2/b5p6kp158-16x32/config.00.000108.dat
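A minimal sketch of this workflow on LDG, assuming the ltools are installed and the certificate is already registered with the VO ildg; the proxy step uses the standard VOMS client, not an ltools command, and how the proxy is created locally is an assumption:

    # create a short-lived VOMS proxy for the ildg VO (once per session)
    voms-proxy-init --voms ildg
    # download one configuration by its logical file name (LFN)
    lget lfn://ldg/gral/wilson_nf2/b5p6kp158-16x32/config.00.000108.dat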

18
To upload
  • Convert/write out the configuration in ILDG format
  • Add a LIME record with a (unique) logical file name
    (LFN)
  • (download and install the free LIME library)
  • Create metadata:
  • - ensemble XML file
  • - configuration XML file
  • - glossary file
  • - configuration checksum
  • lput (see the sketch below)
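A minimal sketch of the upload side with the same LDG ltools; the LFN shown is hypothetical, the argument order of lput (local file first, target LFN second) is an assumption, and the LIME/QCDml preparation steps are only indicated as comments:

    # configuration already converted to ILDG format, with a LIME record
    # carrying its unique LFN (added using the free LIME library)
    # ensemble XML, configuration XML, glossary file and checksum prepared
    # transfer and register the file (argument order assumed)
    lput config.00.000108.dat lfn://ldg/myCollab/myEnsemble/config.00.000108.dat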

19
(No Transcript)
20
The engines behind (Middleware)
  • Regional grid solutions: highly non-trivial, many
    components have to work together
  • All sites are ready/testing prototypes

21
Germany/France/Italy (LDG)
  • user software: ltools
    (lget, lput, lls, lupdate, …)
  • part of the LDG software, can be installed on
    many Linux platforms
  • Metadata catalogue: query and download metadata,
    uploading, management
  • Storage elements: dCache based
  • Germany: 50 Terabytes, Orsay: 5 Terabytes

22
UK (QCDgrid)
  • user software: command line tools
    put-file-on-qcdgrid, get-file-from-qcdgrid
  • also a GUI
  • Metadata catalogue: complete, deployed on the UKQCD
    development system, access to metadata through ILDG
    sample clients
  • Storage elements: Edinburgh (tier 1), Liverpool,
    RAL, Southampton, Swansea
  • 50 Terabytes for storage

23
Japan (JLDG)
  • user software: Gfarm file system,
    Gftp with GSI authentication
  • Metadata catalogue: completed,
    Web-service enabled, eXist XML database,
    generates download scripts for the data,
    executed off-line
  • Storage elements: 50 Terabytes for storage

24
US (USQCD)
  • user software: not yet developed
  • Metadata catalogue: ready and working
  • Storage elements: NERSC, BNL, FNAL
  • 50 Terabytes for storage

25
Australia
  • user software: web portal
  • Metadata catalogue: ready
  • Storage elements: 25 Terabytes for storage

26
Interoperability
  • Operation of interoperable Metadata catalogue
    services is a big step forward
  • web service description language (WSDL)
  • behavioural specification
  • Test suite to define tests of ILDG
    compatibility
  • can browse (almost all of) each other's MDCs
  • However, significant efforts are required to achieve
    interoperability of the other components:
  • - Security
  • - File catalogues
  • - Storage elements

27
Interoperability
  • Security:
  • global VO ILDG managed by VOMS; file
    transfer between storage elements (SEs)
  • Successful file transfers between
  • LDG (DESY), QCDgrid (EPCC), USQCD (Fermilab/JLab)

28
Questions
  • Can I determine access rights myself?
  • This very desirable feature is currently
    only supported at the regional grid level
  • (define groups, and rights for each group and
    ensemble)
  • Will the data be replicated?
  • Currently, replication is possible only
    within the regional grids. It is planned to make
    replication beyond grid boundaries possible.
  • How can I check that I got the right
    configuration?
  • Plaquette value and checksum are provided
  • Can I delete configurations?
  • Yes, configurations can be removed

29
Questions
  • Is the schema used to describe the data extensible?
  • Yes, more information can be built in
  • What about algorithm information?
  • Rather limited information, since it is too
    complicated; however, it is possible to add a namespace
  • What about propagators?
  • A simplified version could be envisaged; otherwise it
    seems to be too complicated, under discussion

30
Summary
  • Much progress has been made in building up the
    regional grid infrastructure … which is actually
    used!
  • thanks to the very hard work of many people
  • Interoperability of regional grids: next
    milestones
  • make use of global Grid developments
  • It is time to become a member of the VO ILDG
  • Get your Grid certificate at your local
    authorization site
  • The proposed policy for VO membership is presently
    being discussed