1
got Grid!
2
> HEP distributed computing > present days > 3 examples
3
The CDF Experiment (Fermilab)
4
(No Transcript)
5
  • USERS
  • 800 scientists
  • 62 institutions
  • 12 nations

CDF Experiment
6
  • RESOURCES
  • 2,000 CPUs
  • 300 TB disk space
  • 1 PB tape storage

CDF Experiment
7
CDF Experiment
8
The DZero Experiment (Fermilab)
9
DZero Experiment
  • USERS
  • 650 scientists
  • 76 institutions
  • 18 nations

10
DZero Experiment
  • RESOURCES
  • 2,000 CPUs
  • 300 TB disk space
  • 1 PB tape storage

11
DZero Experiment
12
The BaBar Experiment (SLAC)
13
(No Transcript)
14
  • USERS
  • 600 scientists
  • 77 institutions
  • 11 nations

BaBar Experiment
15
  • RESOURCES
  • 4,096 CPUs
  • 400 TB disk space
  • 2 PB tape storage
  • GByte network link

BaBar Experiment
16
  • DISTRIBUTION
  • 50% of resources are offsite
  • Event reconstruction fully distributed onto 2
    sites
  • Analysis performed at 5 computing centers and
    many universities
  • Monte Carlo production performed at 25 sites

BaBar Experiment
17
> LHC computing scale
  • CERN T0/T1
  • Disk space: 5 PB
  • Mass storage space: 20 PB
  • Processing power: 20 MSI2K
  • WAN: 5 x 10 Gb/s (?)
  • Tier-1s (sum of 10)
  • Disk space: 20 PB
  • Mass storage space: 20 PB
  • Processing power: 45 MSI2K
  • WAN: 1 x 10 Gb/s per Tier-1 (?)
  • Tier-2s (sum of 40)
  • Disk space: 12 PB
  • Mass storage space: 5 PB
  • Processing power: 40 MSI2K
  • WAN: 0.2 x 10 Gb/s per Tier-2 (?)
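The tier figures above can be totaled with a quick sketch. The numbers are taken from the slide (Tier-1 and Tier-2 figures are already sums over their sites); the dict layout and variable names are mine:

```python
# LHC computing scale, as listed on the slide (disk/tape in PB, CPU in MSI2K).
tiers = {
    "CERN T0/T1": {"disk_pb": 5,  "tape_pb": 20, "cpu_msi2k": 20},
    "Tier-1s":    {"disk_pb": 20, "tape_pb": 20, "cpu_msi2k": 45},  # sum over 10 sites
    "Tier-2s":    {"disk_pb": 12, "tape_pb": 5,  "cpu_msi2k": 40},  # sum over 40 sites
}

total_disk = sum(t["disk_pb"] for t in tiers.values())
total_tape = sum(t["tape_pb"] for t in tiers.values())
total_cpu = sum(t["cpu_msi2k"] for t in tiers.values())

print(f"Total disk: {total_disk} PB")    # 37 PB
print(f"Total tape: {total_tape} PB")    # 45 PB
print(f"Total CPU: {total_cpu} MSI2K")   # 105 MSI2K
```

So the planned LHC capacity across all tiers comes to roughly 37 PB of disk, 45 PB of mass storage, and 105 MSI2K of processing power.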

18
> HEP distributed computing > the future
19
Building a national permanent production Grid
for science in the U.S.
20
(No Transcript)
21
(No Transcript)
22
> building OSG: Grid3 and beyond
  • The new infrastructure will build on the Grid3
    environment,
  • now numbering approximately 30 sites and 3,500
    processors
  • Need to scale by a factor of 5-10 in the next 5 years
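The scaling target above works out to roughly the following. This is a back-of-the-envelope projection from the slide's numbers, assuming (my assumption) the 5-10x factor applies uniformly to both sites and processors:

```python
# Grid3 today, per the slide: ~30 sites, ~3,500 processors.
sites_now, cpus_now = 30, 3500

# Projected capacity at the low and high ends of the 5-10x target.
projections = {factor: (sites_now * factor, cpus_now * factor)
               for factor in (5, 10)}

for factor, (sites, cpus) in projections.items():
    print(f"x{factor}: ~{sites} sites, ~{cpus} processors")
# x5: ~150 sites, ~17500 processors
# x10: ~300 sites, ~35000 processors
```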

23
> architecture
  • The OSG architecture will follow the principles
    of symmetry and recursion wherever possible
  • The OSG architecture is VO based. Most services
    are instantiated within the context of a VO.
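As an illustration only (the class and method names are mine, not OSG's), "services instantiated within the context of a VO" can be sketched as a per-VO service registry, where each virtual organization gets its own instances rather than sharing global ones:

```python
class VOServices:
    """Holds per-VO service instances, so each VO sees its own context."""

    def __init__(self, vo_name):
        self.vo_name = vo_name
        self.services = {}

    def register(self, name, factory):
        # The factory is called with the VO name, so the service
        # is instantiated within this VO's context.
        self.services[name] = factory(self.vo_name)


# Hypothetical usage: each experiment's VO owns its own service instances.
cdf = VOServices("CDF")
cdf.register("job-queue", lambda vo: f"{vo} job queue")

print(cdf.services["job-queue"])  # CDF job queue
```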

24
> governance
  • OSG Council (all members above a certain
    threshold; Chair, officers)
  • Executive Board (8-15 representatives; Chair,
    officers)
  • Technical Groups 0..n (small)
  • Advisory Committee
  • Core OSG Staff (few FTE)
  • Stakeholders: Universities, Labs, Service
    Providers, Sites, Researchers, VOs, Research
    Grid Projects, Enterprise
25
The evolution begins Spring 2005
26
www.OpenScienceGrid.org