IN2P3 Computing Center

1
IN2P3 Computing Center
  • Outline
  • CC-IN2P3 in a nutshell
  • Computing
  • Storage
  • Network
  • Other services
  • Grids and openings

HEPCCC, 27 June 2003
  • Centre de Calcul de l'IN2P3
  • 27, Bd du 11 Novembre 1918
  • 69622 VILLEURBANNE
  • France
  • Phone: +33 4 78 93 08 80
  • Fax: +33 4 72 69 41 70
  • http://webcc.in2p3.fr
  • http://annuaire.in2p3.fr

http://www.cnrs.fr http://www.in2p3.fr
2

CC-IN2P3 in a nutshell
  • IT resource and service centre common to IN2P3-CNRS and DSM-CEA
  • 50 people (40 IT engineers)
  • National: 18 HEP labs, 40 experiments, 2500 users
  • 0.5 PBytes of databases, hierarchical storage
  • National opening to biology and IT
  • Grids: know-how and culture dissemination
  • International: Tier-1 / Tier-A status
  • 1000 CPUs (500k SI2K, or 1.5 THz), 60 TB permanent disk
  • Budget: 6-7 M€/year, plus 2 M€ of salaries
  • Network QoS; many custom services "à la carte"
3
Computing Service
Shared resources: you use 80% of peak power 100% of the time. You buy average resources, not peak. For LCG data challenges, you squeeze the other users for a while.
Platforms: Linux, Solaris, AIX (a few versions each)
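The average-vs-peak argument can be made concrete with a back-of-the-envelope sketch. Only the "80% of peak, 100% of the time" idea and the 45-group count come from the talk; the per-group demand numbers below are hypothetical illustrations.

```python
# Shared vs dedicated provisioning: why you buy average, not peak.
# The 45-group count is from the talk; per-group demands are hypothetical.

groups = 45                # user groups at the centre
peak_per_group = 100       # hypothetical peak CPU demand of one group
avg_per_group = 20         # hypothetical average demand (peaks are rare)

dedicated = groups * peak_per_group   # each group provisions for its own peak
shared = groups * avg_per_group       # one shared pool sized for average load

print(dedicated)  # 4500 CPUs if every group buys for its own peak
print(shared)     # 900 CPUs in a shared pool
```

In a shared pool, one group's LCG data challenge can temporarily "squeeze" the others' share instead of requiring dedicated capacity that sits idle the rest of the year.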
4
Computing service
45 groups: most large HEP and NP experiments are users; roughly 40% astroparticle and 10% biology. Users and groups are growing.
5
Computing service
6
Storage services: disk and tapes
  • HPSS
  • 200 → 600 TB this year; 250 in March, now 300
  • HPSS migrates between cache disk, 20 GB tapes and 200 GB tapes
  • Purchased for BaBar, but now used by most experiments
  • BaBar: Objectivity, 130 TB with 25 TB of cache disk; others: 120 TB with 4.4 TB
  • 35 STK 9840 drives (20 GB tapes, fast mount) and 12 → 25 STK 9940 drives (200 GB tapes, slower mount, higher I/O), mostly used by HPSS
  • Accessed via RFIO; installed tape capacity 700 TB
  • Up to 8-10 TB/day
  • Originally mainly rfcp; supports files larger than 2 GB
  • Direct HPSS access from the network through bbftp
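The 20 GB → 200 GB cartridge migration has a simple rationale in cartridge counts. A quick check against the slide's 600 TB target, assuming decimal TB/GB as tape vendors use:

```python
# Cartridge counts for the 600 TB tape target on the two STK media types
# mentioned in the talk (decimal TB/GB assumed).

target_gb = 600 * 1000          # 600 TB expressed in GB

tapes_9840 = target_gb // 20    # STK 9840: 20 GB cartridges, fast mount
tapes_9940 = target_gb // 200   # STK 9940: 200 GB cartridges, higher I/O

print(tapes_9840)  # 30000 cartridges on 9840 media
print(tapes_9940)  # 3000 cartridges on 9940 media
```

A tenfold reduction in cartridge count directly cuts library slots, mounts, and operator handling for the same capacity.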

7
Storage services: disk and tapes
  • Shared disk capacity 60 TB: semi-permanent storage, AFS, NFS, tape cache, Objectivity, Oracle, TSM, local, HPSS, etc.
  • Semi-permanent storage:
  • Suited for small files (which degrade HPSS performance)
  • Access with NFS or the RFIO API
  • Back-up possible for experiments for which CC-IN2P3 is the "base site" (Auger, Antares)
  • Working on RFIO transparent access
  • Back-up and archive: TSM (Tivoli Storage Manager)
  • For home directories, critical experimental data, HPSS metadata, Oracle data
  • TSM allows data archival (Elliot)
  • For back-up of external data (IN2P3 administrative data, biology DBs, etc.)
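The note that small files degrade HPSS performance comes down to per-recall overhead: mounting and positioning a tape costs roughly the same whether the file is 1 MB or 2 GB. A sketch with hypothetical drive and overhead numbers (not measured CC-IN2P3 values):

```python
# Why small files hurt a tape-backed HSM: the fixed per-recall overhead
# dominates transfer time. Both numbers below are hypothetical.

drive_mb_s = 10.0     # hypothetical sustained tape drive rate, MB/s
overhead_s = 60.0     # hypothetical mount + seek time per recall, seconds

def recall_seconds(file_mb):
    """Total time to recall one file from tape."""
    return overhead_s + file_mb / drive_mb_s

print(recall_seconds(1))      # 60.1 s: almost pure overhead for a 1 MB file
print(recall_seconds(2000))   # 260.0 s: overhead is negligible for 2 GB
```

This is why the semi-permanent disk area is the recommended home for small files, with HPSS reserved for large ones.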

8
NETWORK
Renater-3 (October 2002): grid-shaped topology, most links at 2.4 Gbps, still 2 main nodes (RENATER, CERN).
CC-IN2P3: Lyon-USA through GÉANT and USLIC.
Lyon-CERN now at 1 Gbps.
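The 1 Gbps Lyon-CERN figure can be put next to the 8-10 TB/day HPSS traffic quoted earlier: at full sustained utilization such a link carries about 10.8 TB per day (decimal units assumed), so it just covers that traffic with little headroom.

```python
# Daily capacity of a fully utilized 1 Gbps link, in decimal terabytes.
# Compare with the up-to-8-10 TB/day HPSS traffic quoted in the talk.

link_bps = 1e9              # 1 Gbps
seconds_per_day = 86400

tb_per_day = link_bps / 8 * seconds_per_day / 1e12

print(round(tb_per_day, 1))  # 10.8 TB/day at 100% sustained utilization
```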
9
Services of an IT division
  • Other services at CC-IN2P3
  • Databases (Objectivity, Oracle, xSQL, )
  • Software and OS (purchase, support, maintenance): RH 7.x, Solaris, AIX
  • Web sites (> 70) and web services (webcast, photo library, mail, ...)
  • Customized services (storage, specialized computing resources and more)
  • Development: software tests (esp. for grids)
  • Videoconferencing, MCU ("multipoint conference unit")
  • Document DB (in-house Democrite), banks of theses and scientific publications (http://ccsd.cnrs.fr) or of technical documents (CERN EDMS)
  • Directories
  • CAD
  • Security: we will host the CNRS system for CAs
  • Hosting of external services (network nodes, IXP, DB)
  • Teaching, IT schools
  • → Not last, nor least:
  • Hot-line and user support, a single round-the-clock (24h/24) system for most IT services
  • But, as a first approximation, we do not support detector-dependent applications

10
Openings and changes
  • GRIDS
  • Developments, tests, production, dissemination
  • A new way of working is emerging
  • Hosting the EDG resource broker and associated services, and a CVS repository; CNRS-IN2P3 in charge of WP6, WP7 and WP10 coordination
  • Involved in 8 grid projects; IBM cooperation agreement on grid technology
  • Being a "Tier-1": 2 key functions
  • Internationalization: provide a significant part of the IT services requested by any large HEP experiment (BaBar, D0, Auger, Virgo-EGO, the 4 LHC experiments, and more in the future)
  • Opening to other fields, beyond astroparticle physics
  • Biology: grid developments, users, partnerships, contracts
  • IT: grid developments, partnerships

11
Conclusion
CC-IN2P3 is a rare example of an IT centre where 40 HEP collaborations and several grids share the same resources and services. Other disciplines, such as biology, are starting to use these services. Two key experiences will be presented in the next talks:
- what does it mean to be a Tier-1 (the BaBar case)?
- what does it mean to pool resources across so many groups?