LCG Applications Area Status

1
LCG Applications Area Status
  • Torre Wenaus, BNL/CERN
  • LCG Applications Area Manager
  • http://cern.ch/lcg/peb/applications
  • LHCC Meeting
  • January 27, 2003

2
Outline
  • Applications area scope and organization
  • Architecture
  • Personnel and planning
  • A little on planning, since I talked to (most of)
    you about it in Nov
  • Status of applications area projects
  • SPI
  • POOL
  • SEAL
  • PI
  • Math libraries
  • Simulation
  • Relationship to other LCG activity areas
  • Conclusion

3
The LHC Computing Grid Project Structure
Project Overview Board
Software and Computing Committee (SC2)
Project Leader
Project Execution Board (PEB)
Requirements, Work plan, Monitoring
Grid Projects
Project Work Packages
4
LCG Areas of Work
  • Physics Applications Software
  • Application software infrastructure: libraries,
    tools
  • Object persistency, data management tools
  • Common frameworks: simulation, analysis, ...
  • Adaptation of Physics Applications to Grid
    environment
  • Grid tools, Portals
  • Grid Deployment
  • Data Challenges
  • Grid Operations
  • Network Planning
  • Regional Centre Coordination
  • Security, access policy
  • Fabric (Computing System)
  • Physics Data Management
  • Fabric Management
  • Physics Data Storage
  • LAN Management
  • Wide-area Networking
  • Security
  • Internet Services
  • Grid Technology
  • Grid middleware
  • Standard application services layer
  • Inter-project coherence/compatibility

5
Applications Area Organization
Apps Area Leader
Architects Forum
Overall management, coordination, architecture
[Organization chart: projects, each with a Project Leader, decomposed into work packages (WPs), each with a Work Package Leader]
Direct technical collaboration between experiment
participants, IT, EP, ROOT, LCG personnel
6
Focus on Experiment Need
  • Project structured and managed to ensure a focus
    on real experiment needs
  • SC2/RTAG process to identify, define (need-driven
    requirements), initiate and monitor common
    project activities in a way guided by the
    experiments themselves
  • Architects Forum to involve experiment architects
    in day to day project management and execution
  • Open information flow and decision making
  • Direct participation of experiment developers in
    the projects
  • Tight iterative feedback loop to gather user
    feedback from frequent releases
  • Early deployment and evaluation of LCG software
    in experiment contexts
  • Success defined by experiment adoption and
    production deployment

7
Applications Area Projects
  • Software Process and Infrastructure (SPI)
    (operating; A. Aimar)
  • Librarian, QA, testing, developer tools,
    documentation, training, ...
  • Persistency Framework (POOL)
    (operating; D. Duellmann)
  • POOL: hybrid ROOT/relational data store
  • Mathematical libraries
    (operating; F. James)
  • Math and statistics libraries: GSL etc. as NAG-C
    replacement
  • Group in India will work on this (workplan in
    development)
  • Core Tools and Services (SEAL)
    (operating; P. Mato)
  • Foundation and utility libraries, basic framework
    services, system services, object dictionary and
    whiteboard, grid enabled services
  • Physics Interfaces (PI)
    (launched; V. Innocente)
  • Interfaces and tools by which physicists directly
    use the software. Interactive (distributed)
    analysis, visualization, grid portals
  • Simulation
    (launch planning in progress)
  • Geant4, FLUKA, simulation framework, geometry
    model, ...
  • Generator Services
    (to launch as part of the simulation project)
  • Generator librarian, support, tool development

Bold: recent developments (last 3 months)
8
Project Relationships
9
Candidate RTAG timeline from March
Blue: RTAG/activity launched; light blue: imminent
10
LCG Applications Area Timeline Highlights
POOL V0.1 internal release
Architectural blueprint complete
Applications
POOL first production release
Distributed production using grid services
Distributed end-user interactive analysis
Full Persistency Framework
LCG TDR
LCG
50% prototype (LCG-3)
LCG-1 reliability and performance targets
First Global Grid Service (LCG-1) available
LCG launch week
11
Architecture Blueprint
  • Executive summary
  • Response of the RTAG to the mandate
  • Blueprint scope
  • Requirements
  • Use of ROOT
  • Blueprint architecture design precepts
  • High level architectural issues, approaches
  • Blueprint architectural elements
  • Specific architectural elements, suggested
    patterns, examples
  • Domain decomposition
  • Schedule and resources
  • Recommendations

RTAG established in June: experiment architects, other
experts. After 14 meetings, much email... a 36-page
final report. Accepted by SC2 October 11.
http://lcgapp.cern.ch/project/blueprint/
12
Principal architecture requirements
  • Long lifetime: support technology evolution
  • C++ today; support language evolution
  • Seamless distributed operation
  • Usability off-network
  • Component modularity, public interfaces
  • Interchangeability of implementations
  • Integration into coherent framework and
    experiment software
  • Design for the end-user's convenience more than
    the developer's
  • Re-use existing implementations
  • Software quality at least as good as any LHC
    experiment
  • Meet performance, quality requirements of
    trigger/DAQ software
  • Platforms: Linux/gcc, Linux/icc, Solaris, Windows

13
Component Model
  • Communication via public interfaces
  • APIs targeted to end-users, embedding frameworks,
    internal plug-ins
  • Plug-ins
  • Logical module encapsulating a service that can
    be loaded, activated and unloaded at run time
  • Granularity driven by
  • component replacement criteria
  • dependency minimization
  • development team organization
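
The following is a minimal C++ sketch of the plug-in contract described above: a public abstract interface plus a factory entry point that a plug-in manager could load at run time. The names (IService, MessageService, create_service) are illustrative only, not the actual LCG interfaces.

// Illustrative sketch only: IService, MessageService and create_service are
// hypothetical names, not the actual LCG/SEAL interfaces.
#include <string>

// Public interface through which clients and frameworks talk to the component.
class IService {
public:
  virtual ~IService() {}
  virtual std::string name() const = 0;
  virtual void initialize() = 0;
  virtual void finalize() = 0;
};

// A concrete plug-in lives in its own shared library and is reached only
// through the interface above, so it can be replaced at run time.
class MessageService : public IService {
public:
  std::string name() const override { return "MessageService"; }
  void initialize() override { /* open log sinks, read configuration */ }
  void finalize() override { /* flush and close */ }
};

// Factory entry point a plug-in manager could look up (e.g. via dlsym)
// after loading the library.
extern "C" IService* create_service() { return new MessageService; }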

14
Software Structure
[Layer diagram: implementation-neutral services over foundation libraries (STL, ROOT libs, CLHEP, Boost, ...), optional grid middleware, and implementations such as ROOT, Qt, ...]
15
Distributed Operation
  • Architecture should enable but not require the
    use of distributed resources via the Grid
  • Configuration and control of Grid-based operation
    via dedicated services
  • Making use of optional grid middleware services
    at the foundation level of the software structure
  • Insulating higher level software from the
    middleware
  • Supporting replaceability
  • Apart from these services, Grid-based operation
    should be largely transparent
  • Services should gracefully adapt to unplugged
    environments
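
As an illustration of the insulation idea, here is a sketch in which higher-level code depends only on an abstract catalog interface, with one purely local implementation and one that would delegate to grid middleware. IFileCatalog and the class names are hypothetical, not the actual POOL/LCG types.

// Hypothetical illustration of middleware insulation; IFileCatalog and the
// implementations below are not the actual POOL/LCG classes.
#include <map>
#include <string>

class IFileCatalog {
public:
  virtual ~IFileCatalog() {}
  // Resolve a logical file name to a physical location.
  virtual std::string lookup(const std::string& lfn) = 0;
};

// Works entirely off-network, e.g. from a local map or XML file.
class LocalCatalog : public IFileCatalog {
public:
  std::string lookup(const std::string& lfn) override { return local_[lfn]; }
private:
  std::map<std::string, std::string> local_;
};

// Would delegate to grid middleware (e.g. a replica catalog client);
// higher-level code never sees that dependency and can swap this out.
class GridCatalog : public IFileCatalog {
public:
  std::string lookup(const std::string& lfn) override {
    return "/grid/replica/" + lfn;  // placeholder for a real middleware call
  }
};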

16
Managing Objects
  • Object Dictionary
  • To query a class about its internal structure
  • Essential for persistency, data browsing, etc.
  • The ROOT team and LCG plan to develop and
    converge on a common dictionary (common interface
    and implementation) with an interface
    anticipating a C++ standard (XTI) (timescale
    ~1 yr?)
  • Object Whiteboard
  • Uniform access to application-defined transient
    objects, including in the ROOT environment
  • Object definition based on C++ header files
  • Used by ATLAS and CMS
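
A hypothetical sketch of what dictionary-based introspection provides: generic code can enumerate a class's fields without compile-time knowledge of the class. ClassInfo and FieldInfo are made-up illustrative types, not the LCG dictionary API.

// Hypothetical sketch of dictionary introspection; ClassInfo and FieldInfo
// are illustrative types, not the actual LCG dictionary API.
#include <cstddef>
#include <iostream>
#include <string>
#include <vector>

struct FieldInfo {
  std::string name;
  std::string type;
  std::size_t offset;  // byte offset of the data member within the object
};

struct ClassInfo {
  std::string name;
  std::vector<FieldInfo> fields;
};

// A persistency or browsing service can walk the dictionary instead of
// hard-coding knowledge of each experiment class.
void describe(const ClassInfo& c) {
  std::cout << "class " << c.name << "\n";
  for (const FieldInfo& f : c.fields)
    std::cout << "  " << f.type << " " << f.name
              << " @ offset " << f.offset << "\n";
}

int main() {
  // In the real system this entry would be generated from C++ headers,
  // not written by hand.
  ClassInfo track{"Track", {{"pt", "double", 0}, {"charge", "int", 8}}};
  describe(track);
}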

17
Other Architectural Elements
  • Python-based Component Bus
  • Plug-in integration of components providing a
    wide variety of functionality
  • Component interfaces to bus derived from their
    C++ interfaces
  • Scripting Languages
  • Python and CINT (ROOT) to both be available
  • Access to objects via object whiteboard in these
    environments
  • Interface to the Grid
  • Must support convenient, efficient configuration
    of computing elements with all needed components
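
A minimal sketch of the scripting idea from the C++ side, using the standard CPython embedding API to run a user script. The 'lcg' module referenced in the script comments is hypothetical, standing in for bindings that would be generated from C++ interfaces.

// Minimal sketch of driving a Python scripting layer from embedded CPython;
// the 'lcg' module mentioned in the script is hypothetical.
// Build against the Python headers/library (e.g. -lpythonX.Y).
#include <Python.h>

int main() {
  Py_Initialize();
  // In the blueprint picture, components expose Python bindings derived from
  // their C++ interfaces; a user script then wires components together.
  PyRun_SimpleString(
      "print('hello from the component bus')\n"
      "# import lcg                  # hypothetical generated bindings\n"
      "# pool = lcg.service('POOL')  # hypothetical lookup\n");
  Py_Finalize();
  return 0;
}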

18
Domain Decomposition
Products mentioned are examples, not a
comprehensive list
Grey: not in common project scope (also event
processing framework, TDAQ)
19
Use of ROOT in LCG Software
  • Among the LHC experiments
  • ALICE has based its applications directly on ROOT
  • The 3 others base their applications on
    components with implementation-independent
    interfaces
  • Look for software that can be encapsulated into
    these components
  • All experiments agree that ROOT is an important
    element of LHC software
  • Leverage existing software effectively and do not
    unnecessarily reinvent wheels
  • Therefore the blueprint establishes a
    user/provider relationship between the LCG
    applications area and ROOT
  • LCG AA software will make use of ROOT as an
    external product
  • Draws on a great ROOT strength: users are
    listened to very carefully!
  • So far so good: the ROOT team has been very
    responsive to needs for new and extended
    functionality coming from POOL

20
Blueprint RTAG Outcomes
  • SC2 decided in October
  • Blueprint is accepted
  • RTAG recommendations accepted, to:
  • Start common project on core tools and services
  • Start common project on physics interfaces

21
Applications Area Personnel Status
  • 18 LCG apps hires in place and working; 1 arrived
    last week, 1 due in Feb
  • Manpower ramp is on target (expected to reach
    20-23)
  • Contributions from UK, Spain, Switzerland,
    Germany, Sweden, Israel, Portugal, US, India, and
    Russia
  • 12 FTEs from IT (DB and API groups)
  • 12 FTEs from CERN EP/SFT, experiments
  • CERN established a new software group as the EP
    home of the LCG applications area (EP/SFT)
  • Led by John Harvey
  • Taking shape well. Localized in B.32
  • Soon to be augmented by IT/API staff working in
    the applications area; they will move to EP/SFT
  • Will improve cohesion, sense of project
    participation, technical management effectiveness

22
Applications Area Personnel Summary
Details at http://lcgapp.cern.ch/project/mgmt/AppManpower.xls

                                                People   FTEs
Total LCG hires working                             19   19.0
  Working directly for apps area projects           15   15.0
  ROOT                                               2    2.0
  Grid integration work with experiments             2    2.0
Contributions from:
  IT/DB                                              3    2.1
  IT/API                                            11    9.7
  EP/SFT, experiments (total)                       22   11.9
    Working directly for apps area projects         19    9.9
Architecture, management                             5    2.0
Total directly working on apps area projects        48   36.7
Overall applications area total                     55   42.7
23
Current Personnel Distribution
24
FTE-years
25
Personnel Resources Required and Available
[Chart: Estimate of Required Effort: required FTEs per quarter (quarters ending Sep-02 through Mar-05), broken down by project (SPI, Math libraries, Physicist interface, Generator services, Simulation, SEAL, POOL), with a marker at 'Now']
Blue: available effort
Future estimate based on 20 LCG, 12 IT, 28 EP/
experiments
26
Schedule and Resource Tracking (example)
27
MS Project Integration POOL Milestones
28
Apps area planning materials
  • Planning page linked from applications area page
  • Applications area plan spreadsheet: overall
    project plan
  • http://lcgapp.cern.ch/project/mgmt/AppPlan.xls
  • High level schedule, personnel resource
    requirements
  • Applications area plan document: overall project
    plan
  • http://lcgapp.cern.ch/project/mgmt/AppPlan.doc
  • Incomplete draft
  • Personnel spreadsheet
  • http://lcgapp.cern.ch/project/mgmt/AppManpower.xls
  • Currently active/foreseen apps area personnel,
    activities
  • WBS, milestones, assigned personnel resources
  • http://atlassw1.phy.bnl.gov/Planning/lcgPlanning.html

29
L1 Milestones (1)
30
L1 Milestones (2)
31
Software Process and Infrastructure Project
  • Alberto Aimar, CERN IT/API

Software development support
  • General services for software projects
  • a. Provide general services needed by each
    project
  • CVS repository, Web Site, Software Library
  • Mailing Lists, Bug Reports, Collaborative
    Facilities
  • b. Provide components specific to the software
    phases
  • Tools, Templates, Training, Examples, etc.

32
SPI Services
  • CVS repositories
  • One repository per project
  • Standard repository structure and include
    conventions
  • Will eventually move to IT CVS service when it is
    proven
  • AFS delivery area, Software Library
  • /afs/cern.ch/sw/lcg
  • Installations of LCG-developed and external
    software
  • LCG Software Library toolsmith started in
    December
  • Build servers
  • Machines with needed Linux, Solaris
    configurations
  • Project portal (similar to SourceForge):
    http://lcgappdev.cern.ch
  • Very nice new system using Savannah
    (savannah.gnu.org)
  • Used by POOL, SEAL, SPI, CMS,
  • Bug tracking, project news, FAQ, mailing lists,
    download area, CVS access,

33
(No Transcript)
34
(No Transcript)
35
SPI Components
  • Code documentation, browsing: Doxygen, LXR,
    ViewCVS
  • Testing framework: CppUnit, Oval (example below)
  • Memory leaks: Valgrind
  • Automatic builds: NICOS (ATLAS)
  • Coding and design guidelines: RuleChecker
  • Standard CVS organization
  • Configuration management: SCRAM
  • Acceptance of SCRAM decision shows the system
    works
  • Project workbook
  • All components and services should be in place by
    mid-Feb
  • Missing at this point
  • Nightly build system (Being integrated with POOL
    for testing)
  • Software library (Prototype being set up now)
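
A minimal example of the kind of CppUnit fixture the testing framework supports; the add() function under test is just a stand-in.

// Minimal CppUnit fixture; add() is just a stand-in for real project code.
// Link against cppunit (e.g. -lcppunit).
#include <cppunit/extensions/HelperMacros.h>
#include <cppunit/ui/text/TestRunner.h>

static int add(int a, int b) { return a + b; }

class AddTest : public CppUnit::TestFixture {
  CPPUNIT_TEST_SUITE(AddTest);
  CPPUNIT_TEST(testSimpleSum);
  CPPUNIT_TEST_SUITE_END();
public:
  void testSimpleSum() { CPPUNIT_ASSERT_EQUAL(4, add(2, 2)); }
};

int main() {
  CppUnit::TextUi::TestRunner runner;
  runner.addTest(AddTest::suite());  // suite() is generated by the macros
  return runner.run() ? 0 : 1;       // run() returns true if all tests pass
}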

36
POOL Project
  • Dirk Duellmann, CERN IT/DB
  • Pool of persistent objects for LHC
  • Targeted at event data but not excluding other
    data
  • Hybrid technology approach
  • Object level data storage using file-based object
    store (ROOT)
  • RDBMS for metadata: file catalogs, object
    collections, etc. (MySQL)
  • Leverages existing ROOT I/O technology and adds
    value
  • Transparent cross-file and cross-technology
    object navigation
  • RDBMS integration
  • Integration with Grid technology (e.g. EDG/Globus
    replica catalog)
  • Network- and grid-decoupled working modes
  • Follows and exemplifies the LCG blueprint
    approach
  • Components with well defined responsibilities
  • Communicating via public component interfaces
  • Implementation technology neutral
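
A conceptual sketch of the cross-file navigation idea: a reference holds a technology-neutral token and resolves it lazily through the catalog and storage layer on first dereference. Token, Ref and the load step are illustrative only, not the actual POOL classes.

// Conceptual sketch of cross-file navigation; Token and Ref are illustrative
// only, not the actual POOL classes.
#include <memory>
#include <string>
#include <utility>

// A token identifies an object by logical file, container and entry number,
// independent of where the file lives or which technology (ROOT file, RDBMS)
// holds it.
struct Token {
  std::string logicalFile;
  std::string container;
  long entry = -1;
};

template <class T>
class Ref {
public:
  explicit Ref(Token t) : token_(std::move(t)) {}
  // First dereference resolves the logical file through the file catalog and
  // reads the object via the appropriate storage back-end; here it is faked.
  const T& operator*() const {
    if (!cached_) cached_ = load(token_);
    return *cached_;
  }
private:
  static std::unique_ptr<T> load(const Token&) {
    return std::unique_ptr<T>(new T());  // placeholder for catalog lookup + I/O
  }
  Token token_;
  mutable std::unique_ptr<T> cached_;
};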

37
POOL Release Schedule
  • End September - V0.1 (Released Oct 2)
  • All core components for navigation exist and
    interoperate
  • Assumes ROOT object (TObject) on read and write
  • End October - V0.2 (Released Nov 15)
  • First collection implementation
  • End November - V0.3 (Released Dec 18)
  • First public release
  • EDG/Globus FileCatalog integrated
  • Persistency for general C++ classes (not
    instrumented by ROOT), but very limited:
    elementary types only
  • Event metadata annotation and query
  • End February - V0.4
  • Persistency of more complex objects, e.g. with STL
    containers
  • Support object descriptions from C++ header files
    (gcc-xml)
  • June 2003 - Production release

Principal apps area milestone defined at the March
LCG launch: hybrid prototype by year end
38
Dictionary Reflection / Population / Conversion
[Diagram: dictionary reflection, population and conversion components; legend marks items new in POOL 0.3 and items in progress]
39
POOL Milestones
40
Core Libraries and Services (SEAL) Project
  • Pere Mato, CERN EP/SFT/LHCb
  • Launched in Oct
  • 6-member (3 FTE) team initially; build-up to 8
    FTEs by the summer
  • Growth mainly from experiment contributions
  • Scope
  • Foundation, utility libraries
  • Basic framework services
  • Object dictionary (taken over from POOL)
  • Grid enabled services
  • Purpose
  • Provide a coherent and complete set of core
    classes and services in conformance with overall
    architectural vision (Blueprint RTAG)
  • Facilitate the integration of LCG and non-LCG
    software to build coherent applications
  • Avoid duplication of software; use community
    standards
  • Areas of immediate relevance to POOL given
    priority
  • Users are software developers in other projects
    and the experiments

41
SEAL Work Packages
  • Foundation and utility libraries
  • Boost, CLHEP, experiment code, complementary
    in-house development
  • Participation in CLHEP workshop this week
  • Component model and plug-in manager
  • The core expression in code of the component
    architecture described in the blueprint. Mainly
    in-house development.
  • LCG object dictionary
  • C++ class reflection, dictionary population from
    C++ headers, ROOT gateways, Python binding
  • Basic framework services
  • Object whiteboard, message reporting, component
    configuration, event management
  • Scripting services
  • Python bindings for LCG services, ROOT
  • Grid services
  • Common interface to middleware
  • Education and documentation
  • Assisting experiments with integration
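
For illustration, here are the run-time loading mechanics such a plug-in manager typically builds on (POSIX dlopen/dlsym); the library path and factory symbol name are placeholders, not the SEAL conventions.

// Sketch of run-time plug-in loading with POSIX dlopen/dlsym; the library
// path and factory symbol are placeholders, not the SEAL conventions.
// Link with -ldl on Linux.
#include <dlfcn.h>
#include <iostream>

class IService {  // same style of public interface as in the blueprint
public:
  virtual ~IService() {}
  virtual void initialize() = 0;
};

int main() {
  void* handle = dlopen("libMessageService.so", RTLD_NOW);
  if (!handle) { std::cerr << dlerror() << "\n"; return 1; }

  using Factory = IService* (*)();
  Factory create = reinterpret_cast<Factory>(dlsym(handle, "create_service"));
  if (!create) { std::cerr << dlerror() << "\n"; return 1; }

  IService* svc = create();  // instantiate the component behind its interface
  svc->initialize();
  delete svc;
  dlclose(handle);
  return 0;
}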

42
SEAL Schedule
  • Jan 2003 - Initial work plan delivered to SC2 on
    Jan 10th and approved
  • Including contents of version v1 alpha
  • March 2003 - V1 alpha
  • Essential elements with sufficient functionality
    for the other existing LCG projects (POOL, ...)
  • Frequent internal releases (monthly?)
  • June 2003 - V1 beta
  • Complete list of the currently proposed elements
    implemented with sufficient functionality to be
    adopted by experiments
  • June 2003 - Grid enabled services defined
  • The SEAL services which must be grid-enabled are
    defined and their implementation prioritized.

43
Estimation of Needed Resources for SEAL
Current resources should be sufficient for v1
alpha (March)
44
Math Libraries Project
  • Fred James, CERN EP/SFT
  • Many different libraries in use
  • General purpose (NAG-C, GSL, ...)
  • HEP-specific (CLHEP, ZOOM, ROOT)
  • Modern libraries dealing with matrices and
    vectors (Blitz, Boost, ...)
  • Financial considerations: can we replace NAG with
    open source?
  • RTAG: yes
  • Do comparative evaluation of NAG-C and GSL
  • Collect information on what is used/needed
  • Evaluation of functionality and performance
  • HEP-specific libraries expected to evolve to meet
    future needs
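
As a small illustration of the call-level migration involved, this sketch evaluates special functions with the free GSL where a commercial library such as NAG-C might otherwise be used; the choice of functions is arbitrary.

// Small illustration of using the free GSL where a commercial library such
// as NAG-C might otherwise be used; compile with -lgsl -lgslcblas.
#include <gsl/gsl_sf_bessel.h>
#include <gsl/gsl_sf_gamma.h>
#include <iostream>

int main() {
  const double x = 5.0;
  std::cout << "J0(" << x << ")    = " << gsl_sf_bessel_J0(x) << "\n";
  std::cout << "Gamma(" << x << ") = " << gsl_sf_gamma(x) << "\n";
  return 0;
}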

45
Math library recommendations status
  • Establish support group to provide advice and
    info about existing libraries, and identify and
    develop new functionality
  • Group established in October
  • Which libraries and modules are in use by
    experiments?
  • Little response to requests to experiments for
    info; the group in India is scanning experiment
    code to determine usage
  • Detailed study should be undertaken to assess
    needed functionality and how to provide it,
    particularly via free libraries such as GSL
  • Group in India is undertaking this study
  • Initial plan of work developed with Fred James in
    December
  • Targets completion of first round of GSL
    enhancements for April
  • Based on priority needs assessment
  • Work plan needs to be presented to the SC2 soon
  • Scheduled tomorrow, but will be late

46
Physicist Interface (PI) Project
  • Vincenzo Innocente, CERN EP/SFT/CMS
  • Interfaces and tools by which physicists will
    directly use the software
  • Planned scope
  • Interactive environment: the physicist's desktop
  • Analysis tools
  • Visualization
  • Distributed analysis, grid portals
  • Currently developing plans and trying to clarify
    the grid area
  • Completed survey of experiments on their needs
    and interests
  • Talking also to grid projects, other apps area
    projects, ROOT,
  • Initial workplan proposal will be presented to
    PEB, SC2 this week
  • Will plan inception workshops for identified work
    areas
  • Identify contributors, partners, users;
    deliverables and schedules; personnel assignments

47
Proposed Near Term Work Items for PI
  • Abstract interface to analysis services
  • GEANT4 and some experiments do not wish to depend
    on a specific implementation
  • One implementation must be ROOT
  • Request for a coherent LCG analysis tool-set
  • Interactive analysis environment
  • Access to experiment objects (event-data,
    algorithms etc)
  • Access to high level POOL services (collections,
    metaData)
  • Transparent use of LCG and grid services
  • With possibility to expose them for debugging
    and detailed monitoring
  • GUI (point and click) and scripting interface
  • Interactive event and detector visualization
  • Integrated with the analysis environment
  • Offering a large palette of 2D and 3D rendering
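
A sketch of what an abstract analysis-service interface could look like, with one implementation delegating to ROOT. IHistogram1D and RootHistogram1D are invented for illustration and are not the PI or AIDA definitions.

// Illustrative only: IHistogram1D is a made-up interface, not the actual
// PI/AIDA definition; RootHistogram1D sketches a ROOT-backed implementation.
// Requires a ROOT installation for TH1D.
#include <TH1D.h>
#include <string>

class IHistogram1D {
public:
  virtual ~IHistogram1D() {}
  virtual void fill(double x, double weight) = 0;
  virtual double mean() const = 0;
};

// One possible implementation, delegating to ROOT's TH1D.
class RootHistogram1D : public IHistogram1D {
public:
  RootHistogram1D(const std::string& name, int bins, double lo, double hi)
      : h_(name.c_str(), name.c_str(), bins, lo, hi) {}
  void fill(double x, double weight) override { h_.Fill(x, weight); }
  double mean() const override { return h_.GetMean(); }
private:
  TH1D h_;
};

// Experiment or GEANT4 code sees only the abstract interface.
void analyse(IHistogram1D& h) {
  for (int i = 0; i < 100; ++i) h.fill(0.1 * i, 1.0);
}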

48
Simulation Project
  • Mandated by SC2 in Dec to initiate simulation
    project, following RTAG recommendations
  • Project being organized now
  • Discussions with experiments, G4, FLUKA, ROOT,
    on organization and roles in progress
  • Probable that I will lead the overall project
    during a startup period, working with a slate of
    strong subproject leaders
  • Scope (these are the tentative subprojects)
  • Generic simulation framework
  • Multiple simulation engine support, geometry
    model, generator interface, MC truth, user
    actions, user interfaces, average GEANE
    tracking, utilities
  • ALICE virtual MC as starting point if it meets
    requirements
  • Geant4 development and integration
  • FLUKA integration
  • Physics validation
  • simulation test and benchmark suite
  • Fast (shower) parameterisation
  • Generator services

49
Collaborations
  • LCG apps area needs to collaborate well with
    projects broader than the LHC
  • See that LHC requirements are met, provide help
    and support targeted at LHC priorities, while
    being good collaborators (e.g. not assimilators)
  • e.g. CLHEP discussions at workshop this week
  • e.g. Geant4 in context of simulation project
  • We also welcome collaborators on LCG projects
  • e.g. (renewed?) interest from BaBar in POOL
  • We also depend on the other LHC activity areas
  • Grid Technology
  • Ensuring that the needed middleware is/will be
    there, tested, selected and of production grade
  • AA distributed software will be robust and usable
    only if the grid middleware it uses is so
  • Grid Deployment: AA software deployment
  • Fabrics: CERN infrastructure, AA development and
    test facilities

50
A note on my own time
  • Nearing the one-year mark of doing the apps area
    leader job with 75% of my time, resident at CERN
  • The other 25% I am an ATLAS/BNL person in the US
    (1 week/month)
  • Working well, and sustainable (the LCG job I
    mean)
  • From my perspective at least!
  • I will be lightening my ATLAS/US load, which will
    be welcome
  • Expect to hand over the US ATLAS Software Manager
    job to a highly capable person in a few days or
    weeks
  • Expect to hand over the ATLAS Planning Officer
    job shortly to someone else
  • Remaining formal US responsibility is BNL Physics
    Applications Software Group Leader, for which I
    have a strong deputy (David Adams)

51
Concluding Remarks
  • Expected area scope essentially covered by
    projects now defined
  • Manpower is in quite good shape
  • Buy-in by the experiments is good
  • Direct participation in leadership, development,
    prompt testing and evaluation, RTAGs
  • EP/SFT group taking shape well as a CERN hub
  • Participants remote from CERN are contributing,
    but it isn't always easy
  • POOL and SPI are delivering, and the other
    projects are ramping up
  • Persistency prototype released in 2002, as
    targeted in March
  • Important benchmark to come: delivering
    production-capable POOL, scheduled for June