Transcript and Presenter's Notes

Title: GriPhyN/iVDGL Project Overview


1
  • GriPhyN/iVDGL Project Overview

Paul Avery
University of Florida
http://www.phys.ufl.edu/avery/
avery@phys.ufl.edu
GriPhyN/iVDGL External Advisory Committee Meeting
San Diego Supercomputer Center, San Diego, CA
Jan. 13, 2003
2
GriPhyN/iVDGL Basics
  • Both funded through the NSF ITR program
  • GriPhyN: $11.9M (NSF) + $1.6M (matching), 2000-2005
  • iVDGL: $13.7M (NSF) + $2M (matching), 2001-2006
  • Basic composition
  • GriPhyN: 12 funded universities, SDSC, 3 labs (80 people)
  • iVDGL: 16 funded institutions, SDSC, 3 labs (70 people)
  • Experiments: US-CMS, US-ATLAS, LIGO, SDSS/NVO
  • Large overlap of people, institutions, management
  • Grid research vs. Grid deployment
  • GriPhyN: 2/3 CS + 1/3 physics (~0% H/W)
  • iVDGL: 1/3 CS + 2/3 physics (~20% H/W)
  • iVDGL: $2.5M Tier2 hardware ($1.4M LHC)
  • Physics experiments provide frontier challenges
  • Virtual Data Toolkit (VDT) in common

3
GriPhyN Institutions
  • U Florida
  • U Chicago
  • Boston U
  • Caltech
  • U Wisconsin, Madison
  • USC/ISI
  • Harvard
  • Indiana
  • Johns Hopkins
  • Northwestern
  • Stanford
  • U Illinois at Chicago
  • U Penn
  • U Texas, Brownsville
  • U Wisconsin, Milwaukee
  • UC Berkeley
  • UC San Diego
  • San Diego Supercomputer Center
  • Lawrence Berkeley Lab
  • Argonne
  • Fermilab
  • Brookhaven

4
iVDGL Institutions
  • U Florida: CMS
  • Caltech: CMS, LIGO
  • UC San Diego: CMS, CS
  • Indiana U: ATLAS, iGOC
  • Boston U: ATLAS
  • U Wisconsin, Milwaukee: LIGO
  • Penn State: LIGO
  • Johns Hopkins: SDSS, NVO
  • U Chicago: CS
  • U Southern California: CS
  • U Wisconsin, Madison: CS
  • Salish Kootenai: Outreach, LIGO
  • Hampton U: Outreach, ATLAS
  • U Texas, Brownsville: Outreach, LIGO
  • Fermilab: CMS, SDSS, NVO
  • Brookhaven: ATLAS
  • Argonne Lab: ATLAS, CS

(Site categories: T2 / Software; CS support; T3 / Outreach; T1 / Labs, not funded)
5
Goals: PetaScale Virtual-Data Grids
[Architecture figure: production teams, workgroups, and individual investigators use interactive user tools to reach 1 Petaflop / 100 Petabytes of distributed resources (code, storage, CPUs, networks), raw data sources, and transforms; request planning and scheduling tools, request execution and management tools, and virtual data tools sit above resource management services, security and policy services, and other Grid services.]
6
Coordinating U.S. Projects: Trillium
  • Trillium = GriPhyN + iVDGL + PPDG
  • Large overlap in project leadership and participants
  • Large overlap in experiments, particularly LHC
  • Joint projects (monitoring, etc.)
  • Common packaging, use of VDT and other GriPhyN software
  • Organization from the bottom up
  • With encouragement from funding agencies (NSF, DOE)
  • DOE (OS) and NSF (MPS/CISE) working together
  • Complementarity: DOE (labs), NSF (universities)
  • Collaboration of computer science/physics/astronomy encouraged
  • Collaboration strengthens outreach efforts

See Ruth Pordes' talk
7
Progress: CS, VDT, Outreach
  • Lots of good CS research (Ian Foster)
  • Creation of Chimera Virtual Data System (see the sketch after this list)
  • Used in SDSS production, US-CMS production (150K events)
  • DAGMan workflow manager
  • Pegasus request planner
  • Installation revolution: VDT + Pacman (Miron Livny)
  • Several major releases this year: VDT 1.1.5
  • Pacman: highly configurable packaging tool
  • VDT + Pacman vastly simplify Grid software installation
  • Used by all experiments
  • Agreement to use VDT by LHC Computing Grid Project
  • Outreach expanded (Manuela Campanelli)
  • Coordination with NPACI-EOT, SkyServer, QuarkNet
  • Well-attended Outreach meeting, March 1, 2002

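The Chimera bullet above becomes clearer with a concrete picture of what a virtual data catalog does. Below is a minimal, hypothetical Python sketch (class and method names are invented for illustration; they are not the real Chimera/VDL or Pegasus interfaces): derived data products are registered together with the transformation and inputs that produce them, and a request for a product is expanded into an ordered list of derivations, much like the abstract workflows that Pegasus plans and DAGMan executes.

```python
# A minimal, hypothetical sketch of the "virtual data" idea behind Chimera:
# derived data are recorded as (transformation, inputs) recipes, so a request
# can be satisfied either from an existing replica or by re-deriving the data.
# Names are invented for illustration, not the actual Chimera/VDL API.
from dataclasses import dataclass, field


@dataclass
class Derivation:
    transformation: str   # name of the program/transform to run
    inputs: list          # logical file names this derivation consumes
    output: str           # logical file name it produces


@dataclass
class VirtualDataCatalog:
    derivations: dict = field(default_factory=dict)  # output LFN -> Derivation
    materialized: set = field(default_factory=set)   # LFNs that already exist

    def add(self, d: Derivation) -> None:
        self.derivations[d.output] = d

    def plan(self, lfn: str, steps=None) -> list:
        """Ordered list of derivations needed to produce `lfn` -- an abstract
        workflow, analogous to the DAG handed to Pegasus/DAGMan."""
        steps = [] if steps is None else steps
        if lfn in self.materialized or lfn not in self.derivations:
            return steps                  # already on disk, or a raw input
        d = self.derivations[lfn]
        for parent in d.inputs:           # derive prerequisites first
            self.plan(parent, steps)
        if d not in steps:
            steps.append(d)
        return steps


# Usage: a two-stage chain, raw events -> simulated hits -> analysis ntuple.
vdc = VirtualDataCatalog(materialized={"raw_events.dat"})
vdc.add(Derivation("simulate", ["raw_events.dat"], "hits.dat"))
vdc.add(Derivation("analyze", ["hits.dat"], "ntuple.dat"))
for step in vdc.plan("ntuple.dat"):
    print("run", step.transformation, "->", step.output)
# run simulate -> hits.dat
# run analyze -> ntuple.dat
```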
8
Progress: Testbeds, WorldGrid
  • Significant Grid testbeds deployed (Rick Cavanaugh)
  • Major productions carried out with Grid tools
  • Large productions helped fix many Globus and Condor problems
  • Creation of WorldGrid (Rob Gardner)
  • Major effort with EU partners: iVDGL, DataTAG, EDG
  • Demos at IST2002 and SC2002
  • Plans for major expansion and focus
  • A great outreach tool!

9
WorldGrid Sites
10
Progress: Collaboration, Networks
  • Major coordination efforts (Ruth Pordes)
  • Formation of Trillium
  • GLUE, HICB
  • LCG relationship: SC2, PEB, etc.
  • New members for iVDGL/WorldGrid (varying status)
  • Experiments: BTeV, D0, ALICE
  • Institutions: several new U.S. sites pending
  • Countries: Korea, Japan, Brazil, Romania, Australia, ...
  • Networks
  • Harvey Newman: ICFA/SCIC, I2/HENP
  • Working with IEEAF to extend networking to
    remote/poor regions
  • AMPATH network to South America

11
US-iVDGL Data Grid (Sep. 2001)
12
US-iVDGL Data Grid (Spring 2003)
  • Partners?
  • EU
  • CERN
  • Brazil
  • Australia
  • Korea
  • Japan

13
Example: Extend US-CMS Testbed
14
Progress: New Proposals
  • Dynamic workspaces for scientific analysis communities
  • Preproposal submitted Nov. 2002 to ITR large program ($15M)
  • US-LHC experiments + joint EU proposal
  • Collaborative tools
  • To be submitted to ITR medium program ($5M), Feb. 2003
  • Also: MRI, SciDAC, ...

15
An Inter-Regional Center for High Energy Physics
Research and Educational Outreach (CHEPREO) at
Florida International University
  • Status
  • Proposal submitted Dec. 2002
  • Presented to NSF review panel Jan. 7-8,
    2003
  • Looks very positive
  • E/O Center in Miami area
  • iVDGL Grid Activities
  • CMS Research
  • AMPATH network
  • Intl Activities (Brazil, etc.)

16
Meetings in 2002
  • GriPhyN/iVDGL/PPDG meetings
  • Jan. 7-9, 2002: EAC, Planning, iVDGL (Florida)
  • Mar. 2002: Outreach (Brownsville)
  • Apr. 24-26, 2002: All-hands (Argonne)
  • Jul. 10-11, 2002: Reliability Workshop (ISI)
  • Oct. 2002: Provenance Workshop (Argonne)
  • Dec. 2002: Troubleshooting Workshop (Chicago)
  • Dec. 16-20, 2002: Technical meeting (PPDG) (ISI, Caltech)
  • Other meetings
  • iVDGL facilities workshop (BNL)
  • Grid activities at CMS, ATLAS meetings
  • Several LHC computing reviews for US-CMS,
    US-ATLAS
  • Demos at IST2002, SC2002
  • MAGIC workshop (Aug. 2002, Chicago)
  • GGF4, GGF5, GGF6
  • Meetings with LCG (LHC Computing Grid) project
  • HEP coordination meetings (HICB)

17
Context: Data Grid Projects
  • US
  • GriPhyN (NSF)
  • iVDGL (NSF)
  • Particle Physics Data Grid (DOE)
  • TeraGrid (NSF)
  • DOE Science Grid (DOE)
  • EU and Asia
  • European Data Grid (EU, EC)
  • CrossGrid (EU, EC)
  • DataTAG (EU, EC)
  • LCG (CERN)
  • Japanese Project (APGrid?)
  • Korea project

18
Management
  • Challenges from a large, dispersed, diverse project
  • 100 people
  • 20 funded institutions + several unfunded ones
  • Multi-culturalism: CS + 4 experiments
  • Different priorities and risk equations
  • Co-Director concept really works
  • Share work, responsibilities
  • CS (Foster) ↔ Physics (Avery) ⇒ early reality check
  • Project coordinators have helped tremendously
  • Mike Wilde: GriPhyN Coordinator (UC/Argonne)
  • Rick Cavanaugh: GriPhyN Deputy Coordinator (Florida)
  • Rob Gardner: iVDGL Coordinator (UC)
  • Jorge Rodriguez: iVDGL Deputy Coordinator (Florida)
  • Ruth Pordes: iVDGL Interim Coordinator, PPDG Coordinator

19
Management (cont.)
  • Internal coordination
  • Many meetings, telecons, etc.
  • Experiments in different stages of software
    development
  • Joint milestones require negotiation
  • (MOUs being written with experiments to formalize
    relations)
  • External coordination
  • National: PPDG, iVDGL, TeraGrid, Globus, NSF, SciDAC, ...
  • International: EUDG, LCGP, GGF, HICB, GridPP, ...
  • Networks: Internet2, ESNET, STAR-TAP, STARLIGHT, SURFNet, DataTAG
  • Industry trends
  • Highly time dependent
  • Requires lots of travel, meetings, energy
  • New proposals to fill in missing pieces
  • New large ITR stressing analysis environments
  • Medium ITR stressing collaborative tools

20
Management (cont.)
  • New dynamic project web pages a success (Jorge Rodriguez)
  • Everything is database-driven (a minimal sketch of the idea follows this list)
  • Reassessing our work organization (Fig.)
  • 2 years of experience
  • Rethink breakdown of tasks, responsibilities in
    light of experience
  • Exploit iVDGL resources and close connection
  • Testbeds handled by iVDGL
  • Common assistant to iVDGL and GriPhyN
    Coordinators
  • Common web site development for iVDGL and GriPhyN
  • Common Outreach effort for iVDGL and GriPhyN
  • Additional support from iVDGL for software
    integration

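To illustrate the "database-driven" point above, here is a minimal, hypothetical Python sketch (the table name, schema, and function are invented for illustration; this is not the actual GriPhyN/iVDGL web-site code): project data lives in a database, and the page is regenerated from a query, so updating the database updates the site.

```python
# Minimal, hypothetical sketch of a database-driven project page: the member
# list lives in a database and the HTML is generated from a query. Schema and
# names are illustrative only, not the real GriPhyN/iVDGL site.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE members (name TEXT, institution TEXT, role TEXT)")
conn.executemany(
    "INSERT INTO members VALUES (?, ?, ?)",
    [("Paul Avery", "U Florida", "Project Director"),
     ("Ian Foster", "U Chicago", "Project Director")],
)

def render_members_page(db: sqlite3.Connection) -> str:
    """Build the members page from whatever is currently in the database."""
    rows = db.execute("SELECT name, institution, role FROM members ORDER BY name")
    items = "\n".join(f"  <li>{n} ({i}) - {r}</li>" for n, i, r in rows)
    return f"<html><body><h1>Project Members</h1>\n<ul>\n{items}\n</ul></body></html>"

print(render_members_page(conn))
```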
21
GriPhyN Management (organization chart)
  • Project Directors: Paul Avery, Ian Foster
  • Project Coordinators: M. Wilde, R. Cavanaugh
  • External Advisory Board
  • Collaboration Board
  • Project Coordination Group
  • Physics Experiments
  • Other Grid Projects
  • System Integration: Carl Kesselman
  • Industrial Connections: Alex Szalay
  • Outreach/Education: Manuela Campanelli
  • CS Research Coord.: I. Foster
    • Execution Management (Miron Livny)
    • Performance Analysis (Valerie Taylor)
    • Request Planning & Scheduling (Carl Kesselman)
    • Virtual Data (Reagan Moore)
  • VD Toolkit Development Coord.: M. Livny
    • Requirements Definition & Scheduling (Miron Livny)
    • Integration & Testing (Carl Kesselman, NMI GRIDS Center)
    • Documentation & Support (TBD)
  • Applications Coord.: H. Newman
    • ATLAS (Rob Gardner)
    • CMS (Harvey Newman)
    • LSC (LIGO) (Bruce Allen)
    • SDSS (Alexander Szalay)
  • Technical Coord. Committee (Chair: J. Bunn)
    • H. Newman, T. DeFanti (Networks)
    • A. Szalay, M. Franklin (Databases)
    • R. Moore (Digital Libraries)
    • C. Kesselman (Grids)
    • P. Galvez, R. Stevens (Collaborative Systems)
22
iVDGL Management and Coordination (organization chart: U.S. piece and international piece)
  • US Project Directors
  • US External Advisory Committee
  • US Project Steering Group
  • Project Coordination Group
  • Collaborating Grid Projects
  • Facilities Team
  • Core Software Team
  • Operations Team
  • Applications Team
  • GLUE Interoperability Team
  • Outreach Team
23
Questions for EAC
  • GriPhyN research directions
  • Are our research directions still appropriate?
  • Are there major changes of direction you would
    recommend?
  • Trends
  • Are we missing any major external research or
    technology trends?
  • Metrics
  • What should be our measures of success for iVDGL?
  • Relations with other projects
  • How should iVDGL relate to TeraGrid, DOE Science Grid, PPDG, and other national and international infrastructure projects?
  • How should GriPhyN and iVDGL relate efficiently
    to the EU DataGrid (EDG) and the LHC Computing
    Grid (LCG) projects?
  • Should we be doing more with partners to develop
    stronger networking capabilities?

24
Questions for EAC (2)
  • How should we prioritize activities pertaining to:
  • Growing the disciplines on iVDGL
  • Growing the number of sites on iVDGL
  • Increasing the international reach of iVDGL?
  • Should we be taking steps now to
    institutionalize iVDGL?

25
Questions About EAC Function
  • Continue to use common EAC for GriPhyN/iVDGL?
  • Some members in common
  • Some specific to GriPhyN
  • Some specific to iVDGL
  • More members needed?
  • Interactions with EAC
  • More frequent updates from GriPhyN / iVDGL?
  • Phone meetings with Directors and Coordinators?
  • Other ideas?
  • Industry relations
  • Several discussions (IBM, Sun, Dell)
  • We need advice here