1
LCG Applications Area
  • Torre Wenaus, BNL/CERN
  • LCG Applications Area Manager
  • http://cern.ch/lcg/peb/applications
  • DOE/NSF Review of US LHC Physics and Computing
    Projects
  • January 14, 2003

2
The LHC Computing Grid Project Structure
[Organization chart: Project Overview Board; Software and Computing Committee (SC2); Project Leader; Project Execution Board (PEB); Requirements, Work plan, Monitoring; Grid Projects; Project Work Packages]
3
LCG Areas of Work
  • Physics Applications Software
    - Application software infrastructure: libraries, tools
    - Object persistency, data management tools
    - Common frameworks: simulation, analysis, ...
    - Adaptation of physics applications to the Grid environment
    - Grid tools, portals
  • Grid Deployment
    - Data challenges
    - Grid operations
    - Network planning
    - Regional centre coordination
    - Security, access policy
  • Fabric (Computing System)
    - Physics data management
    - Fabric management
    - Physics data storage
    - LAN management
    - Wide-area networking
    - Security
    - Internet services
  • Grid Technology
    - Grid middleware
    - Standard application services layer
    - Inter-project coherence/compatibility

4
Applications Area Organization
[Organization chart: Apps Area Leader and Architects Forum (overall management, coordination, architecture) above the Project Leaders, each project with its Work Package (WP) Leaders and work packages]
Direct technical collaboration between experiment participants, IT, EP, ROOT, LCG personnel
5
Focus on Experiment Need
  • Project structured and managed to ensure a focus
    on real experiment needs
  • SC2/RTAG process to identify, define (need-driven
    requirements), initiate and monitor common
    project activities in a way guided by the
    experiments themselves
  • Architects Forum to involve experiment architects in day-to-day project management and execution
  • Openness of information flow and decision making
  • Direct participation of experiment developers in
    the projects
  • Tight, iterative feedback loop to gather user
    feedback from frequent releases
  • Early deployment and evaluation of LCG software
    in experiment contexts
  • Success defined by experiment adoption and
    production deployment

6
Applications Area Projects
  • Software Process and Infrastructure (SPI) (operating; A. Aimar)
    - Librarian, QA, testing, developer tools, documentation, training, ...
  • Persistency Framework (POOL) (operating; D. Duellmann)
    - POOL hybrid ROOT/relational data store
  • Mathematical libraries (operating; F. James)
    - Math and statistics libraries (GSL etc.) as NAG C replacement
    - Group in India will work on this (workplan in development)
  • Core Tools and Services (SEAL) (operating; P. Mato)
    - Foundation and utility libraries, basic framework services, system services, object dictionary and whiteboard, grid-enabled services
  • Physics Interfaces (PI) (launched; V. Innocente)
    - Interfaces and tools by which physicists directly use the software: interactive (distributed) analysis, visualization, grid portals
  • Simulation (launch planning in progress)
    - Geant4, FLUKA, simulation framework, geometry model, ...
  • Generator Services (launch as part of simulation)
    - Generator librarian, support, tool development

Bold = recent developments (last 3 months)
7
Project Relationships
8
Candidate RTAG timeline from March
Blue = RTAG/activity launched or (light blue) imminent
9
LCG Applications Area Timeline Highlights
[Timeline chart. Applications milestones: POOL V0.1 internal release; architectural blueprint complete; Hybrid Event Store available for general users; distributed production using grid services; distributed end-user interactive analysis; full Persistency Framework. LCG milestones: LCG launch week; First Global Grid Service (LCG-1) available; LCG-1 reliability and performance targets; 50% prototype (LCG-3); LCG TDR.]
10
Architecture Blueprint
  • Executive summary
  • Response of the RTAG to the mandate
  • Blueprint scope
  • Requirements
  • Use of ROOT
  • Blueprint architecture design precepts
  • High level architectural issues, approaches
  • Blueprint architectural elements
  • Specific architectural elements, suggested
    patterns, examples
  • Domain decomposition
  • Schedule and resources
  • Recommendations

RTAG established in June. After 14 meetings and much email... a 36-page final report, accepted by SC2 on October 11.
http://lcgapp.cern.ch/project/blueprint/
11
Component Model
  • Granularity driven by component replacement criteria, development team organization, and dependency minimization
  • Communication via public interfaces
  • Plug-ins
    - Logical module encapsulating a service that can be loaded, activated and unloaded at run time (a minimal sketch follows below)
  • APIs targeted not only at end-users but at embedding frameworks and internal plug-ins
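
As an illustration of the plug-in idea above, here is a minimal sketch of a run-time plug-in manager. This is hypothetical code, not the actual SEAL/LCG API: the names IComponent, PluginManager and the createComponent factory symbol are assumptions, and POSIX dlopen is used as one possible loading mechanism.

```cpp
// Sketch: a component is a logical module behind a public interface that can
// be loaded, activated and unloaded at run time.
#include <dlfcn.h>   // POSIX dynamic loading (link with -ldl)
#include <memory>
#include <stdexcept>
#include <string>

// Public interface through which all communication happens.
class IComponent {
public:
    virtual ~IComponent() = default;
    virtual void activate() = 0;
    virtual void deactivate() = 0;
};

// Each plug-in shared library is assumed to export this factory symbol.
using ComponentFactory = IComponent* (*)();

class PluginManager {
public:
    // Load a shared library and create the component it provides.
    std::unique_ptr<IComponent> load(const std::string& libPath) {
        void* handle = dlopen(libPath.c_str(), RTLD_NOW);
        if (!handle) throw std::runtime_error(dlerror());
        auto factory = reinterpret_cast<ComponentFactory>(dlsym(handle, "createComponent"));
        if (!factory) throw std::runtime_error("no createComponent symbol in " + libPath);
        // For brevity the library handle is not tracked; a real manager would
        // keep it and dlclose() it when the component is unloaded.
        return std::unique_ptr<IComponent>(factory());
    }
};
```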

12
Software Structure
[Layer diagram: implementation-neutral services together with implementation packages and foundation libraries — ROOT, Qt, grid middleware, STL, ROOT libs, CLHEP, Boost, ...]
13
Distributed Operation
  • Architecture should enable but not require the
    use of distributed resources via the Grid
  • Configuration and control of Grid-based operation
    via dedicated services
  • Making use of optional grid middleware services
    at the foundation level of the software structure
  • Insulating higher level software from the
    middleware
  • Supporting replaceability
  • Apart from these services, Grid-based operation
    should be largely transparent
  • Services should gracefully adapt to unplugged environments
    - Transition to local operation modes, or fail informatively (a fallback sketch follows below)
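
A minimal sketch (not LCG code; all class names are illustrative) of the graceful-fallback behaviour described above: higher-level software talks to an implementation-neutral catalogue interface, a grid-backed implementation is used when available, and otherwise the service transitions to a local mode or fails with an informative message.

```cpp
#include <iostream>
#include <memory>
#include <optional>
#include <string>

class IFileCatalog {                       // implementation-neutral interface
public:
    virtual ~IFileCatalog() = default;
    virtual std::optional<std::string> lookup(const std::string& lfn) = 0;
};

class GridCatalog : public IFileCatalog {  // would wrap grid middleware
public:
    bool connected() const { return false; }   // pretend the grid is unreachable
    std::optional<std::string> lookup(const std::string&) override { return std::nullopt; }
};

class LocalCatalog : public IFileCatalog { // local, "unplugged" mode
public:
    std::optional<std::string> lookup(const std::string& lfn) override {
        return "file:/local/replica/" + lfn;   // illustrative mapping
    }
};

std::unique_ptr<IFileCatalog> makeCatalog() {
    auto grid = std::make_unique<GridCatalog>();
    if (grid->connected()) return grid;
    std::cerr << "grid catalogue unavailable, switching to local mode\n";
    return std::make_unique<LocalCatalog>();   // graceful transition
}

int main() {
    auto catalog = makeCatalog();
    if (auto pfn = catalog->lookup("event_data_001.root"))
        std::cout << *pfn << '\n';
    else
        std::cerr << "lookup failed: no catalogue reachable\n";  // fail informatively
}
```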

14
Managing Objects
  • Object Dictionary
    - To query a class about its internal structure; essential for persistency, data browsing, etc. (a toy illustration follows below)
    - The ROOT team and LCG plan to develop and converge on a common dictionary (common interface and implementation) with an interface anticipating a C++ standard (XTI) (timescale ~1 yr?)
    - Will contact Stroustrup, who has started implementation
  • Object Whiteboard
    - Uniform access to application-defined transient objects, including in the ROOT environment
    - What this will be (how similar to Gaudi, StoreGate?) is not yet defined
  • Object definition based on C++ header files
    - Now that ATLAS as well as CMS will use this approach, it is being addressed in a common way via the LCG AA
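
A toy illustration (not the LCG/ROOT dictionary API) of what an object dictionary provides: the ability to query a class about its internal structure at run time, which is what persistency and data browsing need. The Track class and the hand-written dictionary entry are assumptions for the example; a real system would generate this information from the C++ header or an XTI-like facility.

```cpp
#include <cstddef>
#include <iostream>
#include <string>
#include <vector>

struct MemberInfo {
    std::string name;
    std::string type;
    std::size_t offset;   // byte offset inside the object
};

struct ClassInfo {
    std::string name;
    std::vector<MemberInfo> members;
};

// A persistable class and its hand-written dictionary entry.
struct Track { double px, py, pz; int charge; };

ClassInfo trackDictionary() {
    return { "Track",
             { { "px",     "double", offsetof(Track, px) },
               { "py",     "double", offsetof(Track, py) },
               { "pz",     "double", offsetof(Track, pz) },
               { "charge", "int",    offsetof(Track, charge) } } };
}

int main() {
    for (const auto& m : trackDictionary().members)   // "reflection" query
        std::cout << m.type << ' ' << m.name << " @" << m.offset << '\n';
}
```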

15
Dictionary Reflection / Population / Conversion
[Diagram of the dictionary reflection / population / conversion chain; annotations mark what is new in POOL 0.3 and what is in progress]
16
Other Architectural Elements
  • Python-based component bus
    - Plug-in integration of components providing a wide variety of functionality
    - Component interfaces to the bus derived from their C++ interfaces
  • Scripting languages
    - Python and CINT (ROOT) both to be available (a small embedding sketch follows below)
    - Access to objects via the object whiteboard in these environments
  • Interface to the Grid
    - Must support convenient, efficient configuration of computing elements with all needed components
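
A small sketch of the scripting element above: C++ application code embedding a Python interpreter so that components can also be driven from a script. It uses the standard CPython embedding API; how LCG components would actually be exposed on the component bus is not specified here and is left as a comment.

```cpp
#include <Python.h>

int main() {
    Py_Initialize();                                   // start the interpreter
    // In a real setup, C++ component interfaces would be exported as Python
    // modules; here we only run a trivial script to show the embedding.
    PyRun_SimpleString("print('hello from the scripting layer')");
    Py_Finalize();
    return 0;
}
```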

17
Domain Decomposition
Products mentioned are examples, not a comprehensive list
Grey = not in common project scope (also event processing framework, TDAQ)
18
Use of ROOT in LCG Software
  • Among the LHC experiments
  • ALICE has based its applications directly on ROOT
  • The 3 others base their applications on
    components with implementation-independent
    interfaces
  • Look for software that can be encapsulated into
    these components
  • All experiments agree that ROOT is an important
    element of LHC software
  • Leverage existing software effectively and do not
    unnecessarily reinvent wheels
  • Therefore the blueprint establishes a
    user/provider relationship between the LCG
    applications area and ROOT
  • LCG AA software will make use of ROOT as an
    external product
  • Draws on a great ROOT strength: users are listened to very carefully!
  • So far, so good: the ROOT team has been very responsive to needs for new and extended functionality coming from POOL

19
Blueprint RTAG Outcomes
  • SC2 decided in October:
    - The blueprint is accepted
    - RTAG recommendations accepted, to:
      - Start a common project on core tools and services
      - Start a common project on physics interfaces

20
Applications Area Personnel Status
  • 18 LCG apps hires in place and working; 2 more in Jan, Feb
  • Manpower ramp is on target (expected to reach
    20-23)
  • Contributions from UK, Spain, Switzerland,
    Germany, Sweden, Israel, Portugal, US
  • 10 FTEs from IT (DB and API groups) also
    participating
  • 8 FTEs from experiments (CERN EP and outside
    CERN) also participating in (mainly) POOL, SEAL,
    SPI
  • CERN established a new software group as the EP
    home of the LCG applications area (EP/SFT)
  • Led by John Harvey. Taking shape well. Localized
    in B.32
  • Fraction of the experiment contribution which is US-supported (CERN or US resident) is currently ~30%
  • US fraction of total effort is <10%

21
LHC Manpower needs for Core Software
From the LHC Computing ("Hoffmann") Review (FTEs); only computing professionals counted

          2000 have (missing)   2001    2002    2003    2004    2005
  ALICE   12 (5)                17.5    16.5    17      17.5    16.5
  ATLAS   23 (8)                36      35      30      28      29
  CMS     15 (10)               27      31      33      33      33
  LHCb    14 (5)                25      24      23      22      21
  Total   64 (28)               105.5   106.5   103     100.5   99.5
22
Personnel Resources Required and Available
[Chart: estimate of required effort vs. available effort (blue), in FTEs (scale 0-60), by quarter ending from Sep-02 through Mar-05, with a "Now" marker]
FTEs today: 18 LCG, 10 CERN IT, 8 CERN EP + experiments
Future estimate: 20-23 LCG, 13 IT, 28 EP + experiments
23
Current Personnel Distribution
24
25
U.S. Leadership
  • Direct leadership and financial contribution: T. Wenaus as AA manager
    - In addition to contributions via ATLAS and CMS
    - A 0.75 FTE job requiring CERN residence
    - Salary support from the BNL base program (is this fair?)
    - CERN residency and US travel costs borne by CERN
  • Together with the strong U.S. presence in CMS and ATLAS computing leadership, this role gives the U.S. a strong voice in the LCG applications area
    - Not a dominating influence, of course; e.g. at this point all the applications area project leaders are European
  • Presence at CERN is very important, like it or not
    - Its importance is increased by the utterly deplorable state of the CERN infrastructure for both audio and video conferencing
    - The U.S. should put up the money to fix this if no one else will; it is in our own vital interest

26
Schedule and Resource Tracking (example)
27
Apps area planning materials
  • Planning page linked from the applications area page
  • Applications area plan spreadsheet: overall project plan
    - http://lcgapp.cern.ch/project/mgmt/AppPlan.xls
    - High-level schedule, personnel resource requirements
  • Applications area plan document: overall project plan
    - http://lcgapp.cern.ch/project/mgmt/AppPlan.doc
    - Incomplete draft
  • Personnel spreadsheet
    - http://lcgapp.cern.ch/project/mgmt/AppManpower.xls
    - Currently active/foreseen apps area personnel and activities
  • WBS, milestones, assigned personnel resources
    - http://atlassw1.phy.bnl.gov/Planning/lcgPlanning.html
  • Follow the Applications Area planning link on the review web page

28
Core Libraries and Services (SEAL) Project
  • Launched in October, led by Pere Mato (CERN/LHCb)
  • 6-member (3 FTE) team initially, plus M. Marino from ATLAS
  • Scope:
    - Foundation, utility libraries
    - Basic framework services
    - Object dictionary
    - Grid-enabled services
  • Many areas of immediate relevance to POOL; these are given priority
  • Users of this project are software developers in the other projects and the experiments
  • Establishing initial plan, reviewing existing libraries and services
  • Process for adopting third-party code will be addressed in this project
  • Initial workplan will be presented to SC2 on Jan 10
  • Milestone 2003/3/31: SEAL V1, essentials in alpha

29
SEAL Work Packages
  • Foundation and utility libraries
    - Boost, CLHEP, ..., complemented by in-house development
  • Component model and plug-in manager
    - The core expression in code of the component architecture described in the blueprint; mainly in-house development
  • LCG object dictionary
    - Already an active project in POOL, being moved to SEAL (wider scope than persistency); will include filling the dictionary from C++ header files
  • Basic framework services
    - Object whiteboard (sketched below), message reporting, component configuration, event management
  • Scripting services
  • Grid services: common interface to middleware
  • Education and documentation
    - Assisting experiments with integration
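
An illustrative sketch of the object whiteboard mentioned above: a transient store giving uniform, type-safe access to application-defined objects registered under string keys. The names (Whiteboard, record, retrieve, TrackCollection) are assumptions for the example, not the SEAL API.

```cpp
#include <any>
#include <iostream>
#include <stdexcept>
#include <string>
#include <unordered_map>

class Whiteboard {
public:
    template <typename T>
    void record(const std::string& key, T obj) {
        store_[key] = std::move(obj);              // register a transient object
    }
    template <typename T>
    T& retrieve(const std::string& key) {
        auto it = store_.find(key);
        if (it == store_.end()) throw std::runtime_error("no object: " + key);
        return std::any_cast<T&>(it->second);      // throws on type mismatch
    }
private:
    std::unordered_map<std::string, std::any> store_;
};

struct TrackCollection { int nTracks = 0; };

int main() {
    Whiteboard wb;
    wb.record("Tracks", TrackCollection{42});
    std::cout << wb.retrieve<TrackCollection>("Tracks").nTracks << '\n';   // prints 42
}
```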

30
Physicist Interface (PI) Project
  • Led by Vincenzo Innocente (CERN/CMS)
  • Covers the interfaces and tools by which
    physicists will directly use the software
  • Planned scope:
    - Interactive environment: the physicist's desktop
    - Analysis tools
    - Visualization
    - Distributed analysis, grid portals: currently very poorly defined and understood
  • Currently surveying experiments on their needs
    and interests
  • In more of an RTAG mode than project mode
    initially, to flesh out plans and try to clarify
    the grid area
  • Will present initial plans (and possibly an
    analysis RTAG proposal) to SC2 on Jan 29

31
Software Process and Infrastructure (SPI)
  • Components available:
    - Code documentation and browsing: Doxygen, LXR, ViewCVS
    - Testing framework: CppUnit, Oval (see the example below)
    - Memory leak detection: Valgrind
    - Automatic builds: probably the ATLAS system
    - Coding and design guidelines: RuleChecker
    - CVS organization
    - Configuration/release management: SCRAM
    - Software documentation templates
  • http://spi.cern.ch
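
A minimal example of the kind of unit test the SPI testing framework supports, assuming standard CppUnit macros and the text test runner; this is generic CppUnit usage, not SPI-specific code.

```cpp
#include <cstddef>
#include <vector>
#include <cppunit/extensions/HelperMacros.h>
#include <cppunit/extensions/TestFactoryRegistry.h>
#include <cppunit/ui/text/TestRunner.h>

class VectorTest : public CppUnit::TestFixture {
    CPPUNIT_TEST_SUITE(VectorTest);
    CPPUNIT_TEST(testPushBack);
    CPPUNIT_TEST_SUITE_END();
public:
    void testPushBack() {
        std::vector<int> v;
        v.push_back(7);
        CPPUNIT_ASSERT_EQUAL(std::size_t(1), v.size());   // size grew by one
        CPPUNIT_ASSERT_EQUAL(7, v[0]);                     // value stored
    }
};
CPPUNIT_TEST_SUITE_REGISTRATION(VectorTest);

int main() {
    CppUnit::TextUi::TestRunner runner;
    runner.addTest(CppUnit::TestFactoryRegistry::getRegistry().makeTest());
    return runner.run() ? 0 : 1;   // run() returns true when all tests pass
}
```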

32
SPI Services
  • CVS repositories
  • One repository per project
  • Standard repository structure and include
    conventions being finalized this week
  • Will eventually move to IT CVS service when it is
    proven
  • AFS delivery area, Software Library
  • /afs/cern.ch/sw/lcg
  • Installations of LCG-developed and external
    software
  • Installation kits for offsite installation
  • LCG Software Library toolsmith started in
    December
  • Build servers
  • Machines with various Linux, Solaris
    configurations available for use
  • Project portal (similar to SourceForge): http://lcgappdev.cern.ch
    - Very nice new system using Savannah (savannah.gnu.org)
    - Used by CMS as well as LCG; ATLAS will probably be using it soon
    - Bug tracking, project news, FAQ, mailing lists, download area, CVS access, ...

33
34
POOL
  • Pool of persistent objects for LHC, currently in prototype
  • Targeted at event data, but not excluding other data
  • Hybrid technology approach (see the conceptual sketch below)
    - Object-level data storage using a file-based object store (ROOT)
    - RDBMS for metadata: file catalogs, object collections, etc. (MySQL)
  • Leverages existing ROOT I/O technology and adds value
    - Transparent cross-file and cross-technology object navigation
    - RDBMS integration
    - Integration with Grid technology (e.g. EDG/Globus replica catalog)
    - Network- and grid-decoupled working modes
  • Follows and exemplifies the LCG blueprint approach
    - Components with well-defined responsibilities
    - Communicating via public component interfaces
    - Implementation-technology neutral
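
A conceptual sketch of the hybrid approach described above; it is not POOL code. Object data go into a file-based object store via standard ROOT I/O, while a catalogue (MySQL-backed in POOL; a plain std::map stands in here) records where objects live so navigation can cross files and technologies. Requires the ROOT libraries to build.

```cpp
#include <map>
#include <string>
#include <TFile.h>
#include <TNamed.h>

int main() {
    // 1. Object-level storage in a ROOT file (the file-based object store).
    TFile file("events.root", "RECREATE");
    TNamed event("event_0001", "toy event payload");
    event.Write("event_0001");                 // stream the object with ROOT I/O
    file.Close();

    // 2. Metadata in a relational catalogue; here a std::map stands in for the
    //    MySQL-backed file catalogue / object collections.
    std::map<std::string, std::string> catalogue;
    catalogue["event_0001"] = "events.root#event_0001";   // logical -> physical

    // A client would look up "event_0001" in the catalogue and let the store
    // open the right file and technology transparently (cross-file navigation).
    return 0;
}
```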

35
Pool Release Schedule
  • End September - V0.1 (Released Oct 2)
  • All core components for navigation exist and
    interoperate
  • Assumes ROOT object (TObject) on read and write
  • End October - V0.2 (Released Nov 15)
  • First collection implementation
  • End November - V0.3 (Released Dec 18)
  • First public release
  • EDG/Globus FileCatalog integrated
  • Persistency for general C++ classes (not instrumented by ROOT), but very limited: elementary types only
  • Event metadata annotation and query
  • June 2003: production release

36
POOL Milestones
37
Simulation Project
  • Mandated by SC2 to initiate simulation project
    following the RTAG
  • Project being organized now
  • Expected to cover
  • generic simulation framework
  • Multiple simulation engine support, geometry
    model, generator interface, MC truth, user
    actions, user interfaces, average tracking,
    utilities
  • ALICE virtual MC as starting point if it meets
    requirements
  • Geant4 development and integration
  • FLUKA (development and) integration
  • physics validation
  • simulation test and benchmark suite
  • fast (shower) parameterisation
  • generator services

38
Comment on Grid Technology Area (GTA)
  • Quote from a slide of Les: "LCG expects to obtain Grid Technology from projects funded by national and regional e-science initiatives -- and from industry ... concentrating ourselves on deploying a global grid service"
  • All true, but there is a real role for the GTA,
    not just deployment, in LCG
  • Ensuring that the needed middleware is/will be
    there, tested, selected and of production grade
  • (Re)organization in progress to create an active
    GTA along these lines
  • Important for the Applications Area: AA distributed software will be robust and usable only if the grid middleware it uses is

39
Concluding Remarks
  • Essentially the full expected AA scope is covered
    by the anticipated activities of the projects now
    defined
  • Manpower is in quite good shape
  • Buy-in by the experiments, apart from ALICE, is
    good
  • Substantial direct participation in leadership,
    development, prompt testing and evaluation, RTAGs
  • U.S. CMS is well represented because of its strong presence in computing management and in CERN-based personnel
  • U.S. ATLAS representation will improve with D. Quarrie's relocation to CERN as Software Leader; further increases in CERN presence are being sought
  • Groups remote from CERN are contributing, but it isn't always easy
    - We have pushed to lower the barriers, but it still isn't easy
  • New CERN EP/SFT group is taking shape well as a
    CERN hub for applications area activities
  • POOL and SPI are delivering, and the other
    projects are ramping up
  • First persistency prototype released in 2002, as
    targeted in March 2002