1
Physics Data Processing at NIKHEF
  • Jeff Templon

WAR 7 May 2004
2
Goals
  1. Realize an LHC physics computing infrastructure
    optimized for use by NIKHEF physicists
  2. Where possible, apply the expertise built up under
    goal 1 to other projects with NIKHEF
    participation
  3. Capitalize on expertise and available funds by
    participating in closely related EU and NL projects
  4. Use NIKHEF Grid-computing expertise and capacity
    as currency

3
NIKHEF-optimal LHC Computing Infrastructure
  • Operation of LCG core site
  • Build experience with site operation and discover
    external site issues traditionally ignored by
    CERN
  • Leverage front-runner position earned by our EDG
    effort
  • Strong participation in LHC/LCG/HEP Grid
    framework projects
  • "Meten is weten" (measuring is knowing):
    premature optimization is the root of all evil
    (Knuth)
  • Leverage front-runner position earned by EDG
    effort
  • Leading role in Architecture/Design arm of EGEE
  • AA (authentication/authorization) model is the
    fulcrum of balance between 'CERN-centric' and
    'really distributed' models
  • Make use of accumulated expertise in security
    to gain position in middleware design
  • Preparation for Tier-1
  • Avoids having others determine NIKHEF computing
    priorities

4
LHC/LCG/HEP Projects
  • Strong coupling to NIKHEF LHC experiment analysis
  • One grad student per experiment, working with the
    ARDA project: early influence, experience, and
    expertise with LHC analysis frameworks
  • Room for more participation in medium term
    (postdocs, staff)
  • Continuing work with D0 reprocessing
  • D0 metadata model is far more advanced than the
    LHC model
  • Influence via our (LHC) task-distribution
    expertise on US computing
  • Investigations on ATLAS distributed Level-3
    trigger
  • Precursor for LOFAR/Km3NeT activities

5
Preparation for Tier-1
  • Tier-1 for LHC
  • Archive 1/7 of raw data, all ESDs produced on
    site, all MC produced on site, full copies of AOD
    and tags
  • Contribute 1/7 of twice-yearly reprocessing
    power
  • End result: a major computing facility in the
    Watergraafsmeer
  • 1 petabyte each of disk cache and tape store per
    year, starting in 2008
  • 2000 CPUs in 2008
  • 1.5 Gbit/s network to CERN
  • These numbers are per experiment (see the sizing
    sketch below)
  • NIKHEF contributes research; SARA eventually
    takes the lion's share of operation
  • NCF must underwrite this effort (MoU with CERN)
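
The capacity figures above invite a quick sanity check. The following rough Python sketch (not part of the original deck) scales the per-experiment numbers to the whole facility and checks whether a 1.5 Gbit/s link could sustain the yearly data inflow; the count of three experiments (ATLAS, LHCb, ALICE, the LHC experiments with NIKHEF participation) is an assumption.

    # Back-of-envelope Tier-1 sizing from the per-experiment
    # figures on this slide; the experiment count is an assumption.
    N_EXPERIMENTS = 3          # assumption: ATLAS, LHCb, ALICE

    DISK_PB_PER_YEAR = 1.0     # disk cache per experiment, from 2008
    TAPE_PB_PER_YEAR = 1.0     # tape store per experiment, from 2008
    CPUS_PER_EXPERIMENT = 2000
    LINK_GBIT_PER_S = 1.5      # network to CERN, per experiment

    SECONDS_PER_YEAR = 365 * 24 * 3600

    # Aggregate facility growth per year of running.
    disk_pb = N_EXPERIMENTS * DISK_PB_PER_YEAR
    tape_pb = N_EXPERIMENTS * TAPE_PB_PER_YEAR
    cpus = N_EXPERIMENTS * CPUS_PER_EXPERIMENT

    # Yearly volume one link can carry at a 100% duty cycle:
    # Gbit/s -> GB/s -> PB/year (taking 1 PB = 1e6 GB).
    link_pb_per_year = LINK_GBIT_PER_S / 8 * SECONDS_PER_YEAR / 1e6

    print(f"Disk growth {disk_pb:.0f} PB/yr, "
          f"tape growth {tape_pb:.0f} PB/yr, {cpus} CPUs")
    print(f"Link capacity {link_pb_per_year:.1f} PB/yr vs "
          f"{DISK_PB_PER_YEAR + TAPE_PB_PER_YEAR:.0f} PB/yr "
          f"inflow per experiment")

At 1.5 Gbit/s the link could in principle move about 5.9 PB per year, comfortably above the roughly 2 PB/yr of new disk and tape data per experiment, so the quoted bandwidth is consistent with the storage growth.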

6
Overlap with other NIKHEF projects
  • Other HEP experiments
  • D0 work Q4 2003, continuing
  • BaBar project together with Miron Livny
    (Wisconsin)
  • Astroparticle physics
  • LOFAR SOC much like LHC Tier-1
  • Km3NeT on-demand repointing much like ATLAS
    Level-3 trigger

7
EU NL Projects
  • EGEE (EU FP6 project, 2×2 years, €30M)
  • Funding for site operation (together with SARA)
  • Funding for Grid Technology projects (together
    with UvA)
  • Funding for generic applications (read: non-LHC)
  • BSIK/VL-E
  • Funding for Data-Intensive Science (everything we
    do)
  • Funding for Scaling and Validation (large-scale
    site operation)
  • Cooperation with other disciplines
  • Leverage multi-disciplinary use of our
    infrastructure into large NCF-funded facility
    (Tier-1)

8
Currency
  • Advantages of Grid Computing for external funding
  • Grid computing (cycles and expertise) in exchange
    for membership fees

9
People
  • LHC applications
  • Templon, Bos, postdoc, 3 grad students
  • Non-LHC applications
  • Van Leeuwen (CT), Grijpink (CT), Bos, Templon,
    Groep
  • Grid Technology
  • Groep, Koeroo (CT), Venekamp (CT), Steenbakkers
    (UvA), Templon
  • Site Operations
  • Salomoni (CT), Groep, Templon, other CT support

10
People / Funding
  • EGEE
  • 1 FTE Generic Apps, 1 FTE Site Operations, 1 FTE
    AA
  • BSIK/VL-E
  • 1 FTE Scaling and Validation, 1 FTE
    Data-Intensive Sciences
  • Both projects require local 1:1 matching (50%
    cost model)
  • Matching can overlap by ~15% (see the sketch
    after this list)
  • Possible additional money from the BioRange
    project
  • Possible to replace some manpower with equivalent
    equipment
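
To make the matching arithmetic concrete, here is a small Python sketch (not from the original deck); the reading that the ~15% overlap lets local effort count toward both projects, and therefore reduces the net local commitment, is an assumption.

    # Rough matching-funds arithmetic for the FTEs listed above.
    egee_fte = 3.0   # Generic Apps + Site Operations + AA
    vle_fte = 2.0    # Scaling and Validation + Data-Intensive Sciences
    external_fte = egee_fte + vle_fte

    # 50% cost model: every externally funded FTE is matched 1:1
    # by a locally funded FTE.
    matching_fte = external_fte

    # Assumption: the ~15% overlap means some local effort counts
    # toward both projects at once.
    net_local_fte = matching_fte * (1 - 0.15)

    print(f"External funding: {external_fte:.1f} FTE")
    print(f"1:1 matching: {matching_fte:.1f} FTE")
    print(f"Net local commitment with overlap: {net_local_fte:.2f} FTE")

Under these assumptions the five externally funded FTEs call for about 4.25 locally funded FTEs, some of which, per the last bullet, could be delivered as equivalent equipment instead of manpower.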

11
Possible Funding Model