Transcript and Presenter's Notes

Title: DOE/NSF Review of the U.S. LHC Software and Computing Projects


1
DOE/NSF Review of the U.S. LHC Software and Computing Projects
  • November 27-30, 2001 at Fermilab

2
Agenda Tuesday, Nov 27, 2001

3
Agenda Wednesday, Nov 28, 2001

4
Guide to Documentation
  • On the Web: http://www.uscms.org/sc/reviews/doe-nsf/2001-11/
  • Project Documentation in your folder
  • Handouts of slides
  • User Facilities and CAS
  • WBS, Schedules and Milestones
  • XProject-based U.S. CMS WBS
  • Project Management Plan v. 1.01
  • Proposal to the NSF
  • Project Cost Tables and Charts
  • Status Reports for FY2001
  • Additional Documentation
  • Additional Technical Documentation, e.g. CHEP papers, Grid requirements document, etc.

5
Project Overview: U.S. CMS Software and Computing Project
Lothar A. T. Bauerdick, Fermilab, Project Manager
  • DOE/NSF Review Nov 27-30, 2001

6
Focus for this Review
  • The goal of the DOE/NSF review is to assess the
    scope, cost and schedule baselines for the U.S.
    LHC Software and Computing Projects, and their
    management structures.
  • The Projects are expected to present
    self-consistent baseline plans targeted to the
    funding guidance received from DOE and NSF, and
    separately address how they would use incremental
    funds.
  • G.Crawford, Chairman DOE/NSF Review
  • "I think we are relatively insulated from these fluctuations (at CERN), in the following sense: the computing projects will still be targeted at the official LHC start date, as that is CERN's current party line. We also have much more information from CERN than we did last year about what they expect to contribute, and what they expect the member states to contribute, to computing. Given those assumptions, I think it is fair to ask the committee to judge if the US project is ready for baselining, with the usual caveats about what that means for a software project."
  • "... DAMN THE TORPEDOES, FULL SPEED AHEAD!"

7
Specific Issues for this Review
  • (and outline of this talk)
  • Introduction: Scope and Goals of the Project
  • General and Financial Status
  • User Facilities Performance
  • Interactions with Grid Projects
  • Involvement in International Efforts
  • Project Plan: Scope, Schedule, Budget
  • NSF Proposal
  • Conclusions

8
It is the task of the U.S. CMS Software and Computing Project to make sure that CMS and the U.S. take full advantage of this unique opportunity
  • LHC is the most exciting accelerator ever
  • EW Symmetry Breaking, Origin of Mass,
  • First Scalars: Discovery of the Higgs Boson
  • Discovery of Supersymmetry (SUSY)
  • Big step in Energy and Luminosity → Much more
  • U.S. CMS will scoop the full physics potential of
    CMS
  • Critical mass (500 U.S. physicists).
  • Good facilities (at FNAL and universities).
  • New technology for collaboration
    (distributed analysis, video conference)

9
HEP Discovery is Through Software and Computing
  • LHC Computing Unprecedented in Scale and
    Complexity

10
Software and Computing Deliverables
  • Deliverables of the S&C Project
  • CORE SOFTWARE: Engineered infrastructure such as architecture, software development, analysis tools and processes, distributed data management
  • MAJOR USER FACILITIES and Services: Tier-1 and Tier-2 Regional Centers, production management systems for distributed analysis, interface to wide-area networks
  • Non-Deliverables of the S&C Project
  • Sub-detector and Physics Software
  • Local Computing (workgroup servers, desktops) at home institutions and domestic/international networking infrastructure
  • NB: Physicists working on computing R&D and management are off-project

Will be covered by the S&C MOU
HEP base program
11
CAS Subproject
  • To support the design, development, modeling, optimization and commissioning of software related to detectors being constructed by U.S. CMS
  • To provide its share of the framework and
    infrastructure software required to support data
    analysis and simulation for CMS
  • For remote collaborative tools to enable the
    distributed model of computing that permits
    members of U.S. CMS to carry out analysis whether
    they are at home in the U.S. or visiting or
    resident at CERN
  • To satisfy any specialized needs required to
    carry out data analysis activities of interest to
    members of U.S. CMS

In addition to developing software, this
subproject will also provide expert programming
personnel to assist the physicists in developing
reconstruction and physics analysis programs by
serving as mentors, reviewers, advisers and,
where appropriate, as software tool-writers. This
will ensure that the software produced will be
easy to integrate into the whole system, will be
efficient in its use of hardware resources, and
will be maintainable and adaptable for the full
life of the CMS data analysis.
12
UF Subproject
  • The goal of the User Facilities Subproject is to
    provide the enabling infrastructure of software
    and computing that will allow US physicists to
    fully participate in the physics program of CMS.
  • To this end the subproject will acquire, develop, install, integrate, commission and operate the hardware and software for the facilities required to support the development and data analysis activities of U.S. CMS.
  • This subproject will include a major Tier-1 regional computing center at Fermilab to support U.S. physicists working on CMS. It is appropriately sized to support this community, which comprises 25% of the full CMS collaboration.
  • Tier 2 Centers are part of the User Facilities
    Subproject

13
Project Organization

Organization chart (boxes reproduced as a list):
  • JOG, LHC Program Office, LHC Project Office
  • DOE/NSF Review: Chair Glen Crawford
  • Project Management Group (PMG): Chair Ken Stanfield; PMG for S&C: Chair Mike Shaevitz
  • All oversight bodies (PMG, SCOP, ASCB) are in place and functioning
  • Fermilab Project Oversight; Mgmt: L.A.T. Bauerdick
  • Software and Computing Project, L1 Project Manager: L.A.T. Bauerdick/Fermilab; Deputy: L. Taylor/NEU
  • SCOP: Chair Ed Blucher
  • U.S. CMS Advisory Software and Computing Board (USASCB): Chair I. Gaines/Fermilab; U.S. ASCB: D. Green
  • Core Applications Software (CAS) Project, L2 Manager: I. Fisk/UCSD
  • User Facilities (UF) Project, L2 Manager: V. O'Dell/Fermilab
  • CCS: D. Stickland, Princeton U.
  • Physics Reconstruction and Selection (PRS) Detector Software Groups: J. Branson, UCSD
14
Project Resources are in 3 Categories
  • User Facilities Equipment
  • Tier-1 center at Fermilab
  • Five Tier-2 centers at U.S. institutions
  • Prototypes and testbeds to develop and verify the system
  • User Facilities Staff
  • Computing Professionals for computing-related R&D
  • Staff at the Fermilab Tier-1 facility, including support for U.S. infrastructure
  • Staff at Tier-2 centers for Maintenance and Operations
  • Core Application Engineers
  • Computing Professionals/Software Engineers for CMS Core Software
  • Support for specific U.S. activities
  • (plus Project Office support, management reserve)

15
User Facilities: 3 Phases
  • Tier-1 and Tier-2 regional centers: R&D, Equipment, Staff
  • Prototyping has started in 2000
  • Computing R&D
  • Computing hardware prototyping and test-beds
  • Computing for Physics Reconstruction and Selection
  • Deployment 2005-2007
  • Assumes LHC startup in 2006 and design luminosity in 2007
  • Procurement Model: start deployment in 2005, with 30%, 30%, 40% of costs
  • Ramp-up of User Facility Staff
  • Maintenance and Operations from 2007 on
  • Constant staff level
  • Rolling replacement of hardware components, yearly budget 1/3 of initial investment
  • Moore's law takes care of Evolution and Upgrades (a toy capacity model is sketched after this slide)

Prototype Facilities: 5% DC in 2003 and 20% DC in 2004
Fully Functional Facilities at 40% Capacity in 2006
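To make the procurement model concrete, here is a minimal, purely illustrative sketch of how the staged 30%/30%/40% spending plus rolling replacement yields growing capacity at constant budget. The price/performance doubling time (1.5 years), the normalized budget, and the retirement rule are assumptions for illustration, not project numbers.

```python
# Toy model of the staged procurement plan described above.
# Assumptions (illustrative only, not official project figures):
#   - total deployment budget B = 1.0, spent 30% / 30% / 40% in 2005-2007
#   - price/performance doubles every 1.5 years (Moore's-law proxy)
#   - from 2008 on, a yearly budget of 1/3 of the initial investment
#     replaces the equipment bought three years earlier

def price_performance(year, ref_year=2005, doubling_time=1.5):
    """Relative capacity bought per dollar, normalized to ref_year."""
    return 2.0 ** ((year - ref_year) / doubling_time)

def deployment_capacity(total_budget=1.0):
    """Capacity installed by the 30/30/40 deployment in 2005-2007."""
    spend = {2005: 0.30, 2006: 0.30, 2007: 0.40}
    return sum(total_budget * frac * price_performance(y)
               for y, frac in spend.items())

if __name__ == "__main__":
    cap = deployment_capacity()
    print(f"capacity in 2007 (arbitrary units): {cap:.2f}")
    for year in range(2008, 2011):
        cap += (1.0 / 3.0) * price_performance(year)       # new purchase
        cap -= (1.0 / 3.0) * price_performance(year - 3)   # retired share
        print(f"capacity in {year}: {cap:.2f}")
```

The point of the toy model is simply that, under these assumptions, a constant M&O budget buys ever faster hardware, so installed capacity keeps growing after deployment ends.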
16
Software Engineering
  • U.S. contributes its share of 25% to the total Core Software Engineering effort
  • Main U.S. contributions in
  • Software Architecture
  • Interactive Analysis
  • Distributed Computing
  • 8 Engineers now, ramping to 13

(Chart: FTE profile for CMS Core Software and Computing — total CAS FTE, U.S. contribution to CCS, and U.S.-specific support. Current U.S. contribution: 6 FTE.)
17
Funding Status FY2001
  • Received $1,500k from DOE in April
  • $500k labeled as Equipment funds
  • Tier-1 upgrade, see V. O'Dell's talk
  • $500k loan from the Construction Project
  • Charge PPD budget code by reporting CAS engineering efforts
  • Received an additional $285k from DOE in August
  • Allocated to Tier-1 equipment procurements
  • Received funding from NSF for a 3rd engineer in October 2001

18
Project Status
  • UF status and successes
  • Tier-1 facility: R&D and user systems, CPU farms, disk and tape R&D
  • CMS software installation and distribution in place
  • Juggling R&D vs. support; rave reviews for UF from the PRS user community
  • Tier-2 prototype centers operational, R&D program, active in production efforts
  • Collaboration with U.S. ATLAS on facility issues, e.g. disks
  • CAS status and successes
  • Released and in use: Functional Prototype Software for physics studies
  • Modularization and re-use of CMS code in the creation of the COBRA project
  • Visualization of all relevant reconstructed physics objects
  • CMS distributed production environment
  • Very significant Geant4 progress
  • Project Office started
  • New hire: an experienced person at the level of a Project Engineer
  • Help with WBS and Schedule, Budget, Reporting, Documenting
  • Hope to catch up with the mechanics of managing the project
  • Including MOUs, SOWs, subcontracts, invoicing; see the Management Session!

See I. Fisk's and V. O'Dell's talks, the software demo and parallel sessions tomorrow!
19
Interaction With Grid Projects
  • Since the last meeting, two new Grid projects have started:
  • PPDG and iVDGL

20
Interaction With Grid Projects
  • The PPDG project has received funding and has
    started to form a CMS team at Caltech, UCSD and
    Fermilab.
  • The iVDGL project was recently approved by the
    NSF. U.S. CMS is part of the project with its
    prototyping efforts for Tier-2 centers at
    Caltech/UCSD and U.Florida.
  • We need to ensure a coherent effort
  • → U.S. CMS management is involved in Grid management
  • Push for development of practical Grid tools to meet the experiments' real needs
  • Break down Grid deliverables into well-defined components
  • → Track them
  • Monitor the progress of all those development efforts, assuming that most will succeed
  • In case some fail, we will need to supply replacements using project resources
  • This needs some contingency in manpower or scope

21
Current U.S. CMS Data Grid
  • Fermilab Tier-1 Regional Center
  • Caltech/UCSD Tier-2 prototype
  • U. Florida Tier-2 prototype
  • U. Wisconsin PPDG site
  • CMS software installation on Tier-2 prototype sites
  • R&D on Distributed Job Scheduling, Robust File Movement, ...
  • (see H. Newman's talk)
  • Deliverables of the Grid Projects become useful in the real world
  • e.g. at the SC2001 CMS-PPDG Demo
  • Demonstrator for distributed CMS production between Tier-1 and Tier-2 centers using a Grid-enabled version of the CMS standard production environment
  • MOP (Grid remote job execution) and GDMP (Grid file replication), CMS-PPDG
  • Repeated tonight for you in the software demo! (A toy sketch of this pattern follows below.)
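The demo pattern itself is simple: a coordinator farms production jobs out to remote Tier-1/Tier-2 sites, and the output files are replicated back to the Tier-1. The sketch below is a self-contained, purely illustrative toy of that pattern; it does not use the real MOP or GDMP interfaces, and every name in it (Site, submit_production, replicate_outputs) is hypothetical.

```python
# Illustrative toy of the distributed-production pattern shown in the
# SC2001 demo: jobs run at Tier-1/Tier-2 sites, outputs are replicated
# back to the Tier-1. Not the real MOP/GDMP interfaces; all names are
# hypothetical.
from dataclasses import dataclass, field

@dataclass
class Site:
    name: str
    tier: int
    files: set = field(default_factory=set)   # toy replica catalogue

def submit_production(jobs, sites):
    """Round-robin 'remote execution': each job produces one output file
    at the site it ran on (stand-in for MOP-style job submission)."""
    outputs = []
    for i, job in enumerate(jobs):
        site = sites[i % len(sites)]
        out = f"{job}.root"
        site.files.add(out)
        outputs.append((site, out))
    return outputs

def replicate_outputs(outputs, tier1):
    """Copy every remotely produced file into the Tier-1 replica
    catalogue (stand-in for GDMP-style file replication)."""
    for _site, filename in outputs:
        tier1.files.add(filename)

if __name__ == "__main__":
    fnal = Site("FNAL-Tier1", 1)
    tier2s = [Site("Caltech/UCSD-pT2", 2), Site("UFlorida-pT2", 2)]
    produced = submit_production([f"cmsim_job_{n}" for n in range(6)], tier2s)
    replicate_outputs(produced, fnal)
    print(sorted(fnal.files))
```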

22
CMS-PPDG SuperComputing 2001 Demo
23
CMS Data Grid Requirements
  • Formulate CMS's requirements, and develop architectural models, for the Grid projects (GriPhyN, PPDG, EDG)
  • Ensure CMS (object-collection-oriented) requirements are fully accommodated in the developing Grid architecture
  • Coordination between WP units of EDG and their counterpart groups in GriPhyN and PPDG
  • Release of a Data Grid Overview and Requirements Document
  • Consensus of CMS CCS (reached during the Catania CMS week, June 2001)
  • Description of the current view of the 'CMS Data Grid System' that CMS will operate in 2003
  • List of 'tasks for the grid components', to be delivered in 2001-2003
  • CMS application and architectural constraints that these Grid components need to take into account
  • This document is now official and available to you in the documentation.
  • See H. Newman's talk
  • CCS has defined the project structure (Grid Integration Task) to coordinate CMS Grid efforts; see D. Stickland's talk

24
NSF Funding through Grid Projects
  • Budget for the GriPhyN and iVDGL projects funded
    by the NSF. Shown is the total project budget and
    the part that is going directly to Universities
    involved in CMS.
  • GriPhyN (NSF-ITR) and iVDGL (NSF-MPS)

25
Involvement in the International Efforts
  • CERN has defined and launched an "LHC Computing Grid Project" to address the computing needs of the LHC experiments.
  • The U.S. CMS User Facilities will be an integral
    part of the LHC Computing Grid for CMS

26
These Plans are fully in line with the U.S. CMS Plans!
Les Robertson, HepCCC Open Day Mtg, Bologna, June
15, 2001
27
Issues w/ the LCG Project
  • The U.S. will need to make sure that the project and oversight structure (in the process of being established by CERN) will support effective working relationships and efficient decision-making processes, and allow the U.S. to be involved in the decision processes in Europe in an adequate way.
  • This will be essential to protect the substantial U.S. investments in R&D, hardware and software systems, and to guarantee interoperability with the U.S. facilities.
  • The institutions that provide Tier-1 and Tier-2 services to the LHC Computing Grid should be truly involved in the effort
  • Requirements, Work Plan, Policy Decisions
  • The experiments should take strong ownership of
    the LHC Computing Grid.

Through the U.S. membership in the SC2 and POP, we should be able to address these issues!
28
NSF Proposal
  • The Project, together with the U.S. Construction Project and on behalf of the U.S. CMS Collaboration, has put forward a Proposal to the NSF for the preparation of the LHC research program, covering both Software and Computing and detector Maintenance and Operations. The November review is part of the reviewing process for this proposal.

29
NSF Proposal
  • CMS has submitted a proposal to the NSF; PIs: S. Reucroft and J. Swain
  • "Empowering Universities: Preparation of the LHC Research Program"
  • Means to get NSF funding of S&C (and M&O) to CMS
  • Scope: S&C and M&O/upgrade R&D, $8M in 2006
  • Unclear to which program we will be submitting
  • Timetable unclear (start of funding supposed to be in 2002)
  • DOE/NSF review in November is part of the reviewing process (of the S&C part)
  • Split CMS/ATLAS assumed to be 50/50, also for M&O
  • Unclear funding profile, but use existing advice, scaled to $8M in 2006
  • Putting S&C and M&O into one proposal gives two projects addressing ...
  • Includes Education and Outreach, broad impact
  • This is a big step forward for CMS, thanks to our friends at the NSF
  • General contents:
  • "We propose a five-year program (2002-2006) to strengthen the software, computing and collaborative infrastructure of U.S. universities working on the Compact Muon Solenoid (CMS) experiment. This is an essential part of a broad effort to maximize the scientific return at the energy frontier at the Large Hadron Collider (LHC) at CERN."

30
Physics Discoveriesby Researchers at
Universities
  • CMS is Committed to Empower the Universities
  • to do Research on LHC Physics Data
  • The enabling technology: Tier-2 facilities and the Grid

31
This Model Requires Substantial Additional R&D
  • Location Independence of Complex Processing Environments
  • Location Transparency of Massive Data Collections
  • Scheduling and Optimization on a Heterogeneous Grid of Computing Facilities and Networks; Monitoring, Integration Issues
  • We rely on the Grid Projects to deliver most of that! (A toy scheduling sketch follows below.)
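The scheduling and location-transparency items above can be made concrete with a toy example: choose where to run a job by trading off data locality against available CPU. This is only an illustrative sketch of the kind of decision the Grid middleware has to make; the cost model, the numbers and all names are assumptions, not project software.

```python
# Toy illustration of scheduling on a heterogeneous Grid: pick the site
# that minimizes an estimated completion time, trading data-transfer
# cost against available CPU. Cost model and names are illustrative only.

def estimated_time(job, site, network_mbps=100.0):
    """Very rough completion-time estimate in seconds."""
    # transfer cost only if the site does not already host the dataset
    transfer_gb = 0.0 if job["dataset"] in site["datasets"] else job["input_gb"]
    transfer_s = transfer_gb * 8000.0 / network_mbps        # GB -> Mbit
    compute_s = job["cpu_hours"] * 3600.0 / site["free_cpus"]
    return transfer_s + compute_s

def choose_site(job, sites):
    """Location-transparent choice: the user never names a site."""
    return min(sites, key=lambda s: estimated_time(job, s))

if __name__ == "__main__":
    sites = [
        {"name": "FNAL-Tier1", "free_cpus": 40, "datasets": {"mu_sample"}},
        {"name": "Caltech-pT2", "free_cpus": 120, "datasets": set()},
    ]
    job = {"dataset": "mu_sample", "input_gb": 200, "cpu_hours": 50}
    print("run at:", choose_site(job, sites)["name"])
```

Even this toy shows why monitoring matters: the decision is only as good as the site and network information fed into the cost estimate.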

32
The Universities have a Big Impact on R&D. Deployment and M&O need a different environment; in the U.S. this environment exists at the Labs. Role of centralized support, Tier-1 center
  • Software and Computing for Physics Discovery needs
  • Research and Development, Deployment
  • Maintenance and Operations

33
It will empower U.S. Universities to be a strong component of the CMS Collaboration: Physics Discovery and Research at Universities
  • This model takes advantage of the significant strengths of U.S. universities in the areas of CS and IT
  • Draw Strength and Exploit Synergy Between U.S. Universities and Fermilab, Software Professionals and Physicists, CS and HEP

34
Request to NSF
  • Core Application Software and Physics Support
  • DOE is already providing funding for 6 FTE
  • NSF should significantly ramp up its contribution, to 7 FTE in 2006
  • User Facilities: Tier-2 Regional Center facilities and operations
  • Tier-2 prototyping (complementing iVDGL funding)
  • Tier-2 R&D mostly out of base and other projects (GriPhyN, iVDGL, PPDG)
  • Deployment of 5 Tier-2 centers in total, starting with a pilot implementation in 2004
  • Maintenance and Operations
  • Contribution to U.S. central services
  • Mainly a centralized special operations group to provide help and expertise to the Tier-2 centers

35
NSF Proposal Funding Profile
  • Numbers for each category are in thousand FY2002 dollars; an escalation of 3% per year is shown separately. Total costs are $22.8M. (A minimal sketch of the escalation arithmetic follows below.)
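Since the table quotes FY2002 dollars with escalation shown separately, the then-year cost of any line item is just the FY2002 number scaled by 1.03 per year. A minimal sketch of that conversion, assuming the 3% rate from the slide; the amount and years below are examples only, not proposal line items.

```python
# Convert a cost quoted in FY2002 dollars to then-year dollars with a
# flat 3% yearly escalation (rate taken from the slide; the 1000 k$
# example amount is illustrative, not a proposal line item).

def escalate(fy2002_k_dollars, fiscal_year, rate=0.03, base_year=2002):
    """Then-year cost (k$) of an amount quoted in FY2002 k$."""
    return fy2002_k_dollars * (1.0 + rate) ** (fiscal_year - base_year)

if __name__ == "__main__":
    for fy in range(2002, 2007):
        print(fy, round(escalate(1000.0, fy), 1))   # 1000 k$ in FY2002 dollars
```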

36
Changes in Project Schedule
  • The project plan was modified to accommodate the back-loaded funding profiles given by the funding agencies and to adapt to the LHC schedule as defined in spring 2001, leaving the scope unchanged.

37
Schedule and Milestones
  • Physics driven: make systems available in time for data challenges and data taking
  • 3 waves of equipment procurements, installations, commissioning
  • R&D system, funded in FY2002 and FY2003
  • Used for the 5% data challenge
  • → release CCS TDR
  • Prototype Tier-1/2 systems, funded in FY2004
  • for the 20% data challenge
  • → end Phase 1, start deployment
  • Fully Functional Tier-1/2 at 10-30% capacity, funded in FY2005 and FY2006
  • for the LHC pilot and physics runs
  • → find Higgs and Supersymmetry.

38
Tier-1/2 R&D Systems, S&C TDR
  • Milestones: CCS TDR, U.S. CMS UF R&D Systems (FY2002/3)

39
Prototype Tier-1/2, 20% Data Challenge
  • Milestone: 20% DC, U.S. CMS UF pTier-1 System (5% system), FY2004

40
40% Capacity Tier-1/2, LHC Pilot/Physics Run
  • U.S. CMS UF Milestones: fully functional Tier-1 / Tier-2 Systems (FY2005 on)

41
UF Costs
  • Detailed Information on Tier-1 Facility Costing
  • See Document in your Handouts

42
Installed Capacity, Tier-1 Facility
(Chart: 5% DC R&D Tier-1 System; 20% DC Prototype Tier-1 System; Fully Functional Facilities at 40% Capacity for Physics)
43
UF Manpower Profile
44
Funding Profile

45
Cost Variance vs Guidance
46
Schedule and Milestones
  • The schedule is tight and contingency is in scope.
  • Series of milestones, data challenges to monitor progress
  • Slippage of milestones will impact readiness of systems for physics, which would impact physics scope
  • Budget is very tight, and contingency is in scope and time
  • Build to cost
  • Possible delays of the LHC would have a positive effect on this
  • A 40% fully functional system looks very feasible

47
Discovery Potential for Higgs and SUSY in 2006!
(Plots: Higgs and SUSY discovery reach)
48
Conclusions
  • Many new developments since May
  • NSF Proposal and iVDGL approval
  • Additional $285k for Tier-1 equipment from DOE
  • Grid workshops and CMS consensus on requirements to the Grid
  • Database discussion and evaluation of alternatives to Objectivity started
  • Approval of the LHC Computing Grid Project at CERN
  • We are presenting a coherent project plan and project baseline
  • Considerable detail for the User Facilities and CAS plans
  • Robust against delays of the LHC schedule at the level of several months
  • NSF "Empowering Universities" for LHC physics discovery and research: a compelling case for the NSF to fund Tier-2 centers at U.S. universities, starting with the prototype Tier-2s!
  • Close collaboration with the LHC Computing Grid Project
  • Strong R&D phase to make the right technical choices w.r.t. database, Grid, etc.
  • The project is ready to be baselined

49
  • THE END