1
LHC Computing Grid Project - LCG
  • Status Report
  • Computing Resource Review Board
  • CERN, 15 April 2003
  • Les Robertson LCG Project Leader
  • Information Technology Division
  • CERN - European Organization for Nuclear Research
  • Geneva, Switzerland
  • les.robertson@cern.ch

2
Project Organisation
[Organisation chart: LHCC - technical review; Computing RRB - resources; Project Overview Board (POB); SC2 - Software and Computing Committee - requirements, monitoring.]
3
Project Execution Board
[Board membership chart: Project Management (project leader, area managers, SC2 and GDB chairs, resource planning officers); Experiment Delegates; External Projects (EDG, VDT, GridPP, INFN Grid); Other Resource Suppliers (IN2P3, Germany, CERN-IT).]
4
Applications
5
Applications Area Organisation
6
Staffing
  • Staff at CERN working in the Applications area of
    the LCG project are now consolidated in a single
    group in EP Division
  • Software Group SFT - leader: John Harvey
  • Identification of common activities between
    experiments has been more successful than
    anticipated when the project was proposed
  • Staff from external institutes and CERN are
    joining the applications area projects through
    the experiments
  • Persistency (POOL) and Core Services (SEAL)
  • More participation expected in Physics Interfaces
    (PI) and Simulation projects

7
Grid Deployment
8
Timeline for the LCG computing service
[Timeline chart. Elements, roughly in order: VDT and EDG tools building up to basic functionality; LCG-1 - stable 1st generation middleware, used for simulated event productions, developing management and operations tools; LCG-2 - principal service for the LHC data challenges (batch analysis and simulation), more stable 2nd generation middleware; computing model TDRs and validation of the computing models; LCG-3; Phase 2 TDR; very stable full-function middleware; acquisition, installation and commissioning of the Phase 2 service (for LHC startup); validation of the computing service; Phase 2 service in production.]
9
Grid Deployment Organisation
[Organisation chart: grid deployment board (delegates from ALICE, ATLAS, CMS and LHCb) - policies, strategy, scheduling, standards, recommendations; grid deployment manager and Grid Resource Coordinator - proposals for policies and strategies, infrastructure operation; LCG security group; infrastructure teams at certain regional centres and CERN.]
10
Good Progress
  • Certification process defined (January)
  • This has been done - an agreed common process with EDG
  • A joint project has been agreed with VDT (US)
  • VDT provide the basic-level (Globus, Condor) testing suites
  • We provide the higher-level testing
  • but much more effort is needed on devising and writing basic and application-level tests (see the sketch after this list)
  • Packaging/configuration mechanism defined
  • The group (EDG, LCG, VDT) has documented an agreed common approach
  • Implementation will now proceed in stages
  • basic for LCG-1 in July, more developed later
  • Roll-out of the pilot service on schedule
  • tests the process, helps to define the procedures
  • then we will have to formalise agreements on efforts
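To illustrate the kind of application-level test referred to above, here is a minimal sketch in Python (the language and harness are chosen for illustration only and are not specified by the project). It submits a trivial echo job through the standard globus-job-run client shipped with the Globus Toolkit as packaged by VDT, and checks the output. The gatekeeper contact string and expected string are placeholders, not part of any actual LCG test suite.

    #!/usr/bin/env python
    """Hypothetical application-level grid test: run a trivial echo job
    through the Globus GRAM client and verify the round trip.
    The gatekeeper contact string below is a placeholder."""
    import subprocess
    import sys

    GATEKEEPER = "lxshare0001.cern.ch/jobmanager-pbs"  # placeholder contact string
    EXPECTED = "hello-from-grid"

    def run_echo_test():
        # globus-job-run is the GRAM job-submission client in Globus Toolkit 2.x (as packaged by VDT)
        cmd = ["globus-job-run", GATEKEEPER, "/bin/echo", EXPECTED]
        try:
            result = subprocess.run(cmd, capture_output=True, text=True, timeout=300)
        except FileNotFoundError:
            print("SKIP: globus-job-run not installed on this node")
            return False
        except subprocess.TimeoutExpired:
            print("FAIL: job submission timed out")
            return False
        if result.returncode != 0:
            print("FAIL: submission returned %d: %s" % (result.returncode, result.stderr.strip()))
            return False
        if result.stdout.strip() != EXPECTED:
            print("FAIL: unexpected output %r" % result.stdout.strip())
            return False
        print("PASS: echo job round trip succeeded")
        return True

    if __name__ == "__main__":
        sys.exit(0 if run_echo_test() else 1)

A real suite would extend the same pattern to data management and higher-level services, but this shows the shape of a basic test: submit, wait, compare against an expected result.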

11
LCG Ramp-up Schedule
Tier 2 centres will be brought on-line in parallel, once the local Tier 1 is up and able to provide support
12
Slow Progress
  • Delivery of middleware for LCG-1 - the milestone was 1 March
  • Late by a few months in getting an agreement on the middleware
  • Not the functionality we would have hoped for
  • We have a working set (LCG-0) that is in use now
  • VDT 1.1.7 (Globus 2.2.4), EDG release 1
  • Deadline for delivery of the new EDG middleware (release 2) is end April
  • Identify operations and call centres - the milestone was 1 February
  • Operations centre - agreement hoped for soon: centre at RAL, with collaboration on tool development with IN2P3
  • Support centre - in discussion with FZK

13
Deployment Staffing
  • Staffing of Grid Deployment at CERN was left too late - we now have a serious lack of effort
  • EDG testbeds are absorbing more effort at CERN than planned
  • Infrastructure support and experiment support (both grid experts and production adaptation) are understaffed
  • The testing group is badly understaffed
  • we had expected to find more tests from EDG
  • we had hoped that EDG WP8 and GAG would provide packaged tests
  • it is urgent to find at least 3 more full-time people to contribute here
  • INFN is recruiting now, but arrivals are not expected before July
  • The scheduled German recruitment would largely solve the problem, but there are administrative difficulties at present

14
Grid Technology
planning the evolution of the grid middleware
used by the project
15
Grid Technology Organisation
[Organisation chart: grid technology manager; STAG (strategic technical advisory group) - recommendations; GAG (grid applications group) - requirements, consultation; negotiation of deliverables with the US projects and other middleware providers.]
16
Where are we with Grid Technology?
  • We now have practical experience of deploying grid software from a number of projects
  • Expectations were much too high
  • it is more difficult than expected
  • only very basic functionality is available now
  • the current development projects will not meet the basic SC2 requirements
  • and the reliability, scalability and manageability of all middleware are still to be tackled
  • We are lobbying for increased emphasis on production in future grid projects, but these are generally short-term and research-focused
  • The relevance of industrial developments is still unclear
  • We are starting to elaborate contingency plans

17
CERN Fabric
  • Organisation
  • Fabric and Grid Deployment re-organisation at CERN
  • LCG / EDG consolidation
  • hardware resource allocation and planning improved
  • Funding
  • No further investment in tape infrastructure for Phase 1
  • all CERN tape drives upgraded to STK 9940B
  • if necessary, Computing Data Challenges may take equipment from the production systems outside beam time
  • HEP-wide availability of Oracle licenses
  • Re-assessment of Phase 2 costs at CERN being prepared
  • review in the project and with the LHCC
  • then to the C-RRB in October

18
Milestone Summary (period 1 October 2002 - 31 March 2003)
19
Staffing Summary
20
(No Transcript)
21
[Chart: Applications area staffing, April 2003 - 50.1 experience-weighted FTEs, broken down by funding source and by sub-project.]
22
LCG Staffing - Grid and Fabric
23
LCG Staffing: 127.4 experience-weighted FTEs - April 2003
[Pie charts: staffing by project area and by funding source. By project area: LCG Management 6, Applications 39, CERN Fabric 30, Grid Technology at CERN 14, Grid Deployment 11.]
24
(No Transcript)
25
(No Transcript)
26
(No Transcript)
27
(No Transcript)
28
Formal agreements for collaboration by external institutes
  • Agreement for collaboration on computing developments signed with the DAE (India)
  • 10 FTEs per year for 5 years
  • initial projects
  • maths library
  • fabric monitoring and error handling
  • grid operation monitoring tools
  • OpenAFS support
  • persistency database under discussion
  • Agreement being finalised with Russia
  • initial projects
  • MC generators
  • fabric automation
  • grid middleware application
  • Under active discussion
  • Grid Operations Centre - RAL, IN2P3
  • Grid Support Centre - FZK (Karlsruhe)

29
Major points
  • Applications area scoped out
  • consolidation into 5 major projects
  • PI and Simulation still to be fleshed out
  • Maybe a further project on analysis
  • Persistency solution on track
  • Roll-out of the LCG-0 pilot on track
  • Middleware is later than expected; we may
    have to reduce functionality
  • Long-term source of solid, supported middleware
    not yet clear