LHC@FNAL - PowerPoint PPT Presentation

Transcript and Presenter's Notes
1
LHC@FNAL: a new Remote Operations Center at Fermilab
J. Patrick, et al., Fermilab
2
Abstract
  • Commissioning the LHC accelerator and experiments will be a vital part of the worldwide high-energy physics program beginning in 2008. A remote operations center, LHC@FNAL, has been built at Fermilab to make it easier for accelerator scientists and experimentalists working in North America to help commission and participate in operations of the LHC and experiments. The evolution of this center from concept through construction and early use will be presented, as will details of its controls system, management, and expected future use.

3
Contents
  • Introduction
  • Concept
  • Design
  • Construction
  • Early Use
  • Details/Special Features
  • Future Plans
  • Summary & Acknowledgements

4
What is LHC@FNAL?
  • A Place
    • that provides access to information in a manner similar to what is available in control rooms at CERN
    • where members of the LHC community can participate remotely in CMS and LHC activities
  • A Communications Conduit
    • between CERN and members of the LHC community located in North America
  • An Outreach Tool
    • visitors will be able to see current LHC activities
    • visitors will be able to see how future international projects in particle physics can benefit from active participation in projects at remote locations

5
What is LHC@FNAL?
  • Allows experts located at Fermilab to participate in CMS and LHC commissioning and operations.
  • Provides the hardware and software necessary to participate effectively in CMS and LHC.
  • Facilitates communication and helps members of the LHC community in North America contribute their expertise.
  • CMS
    • One of several dedicated and interconnected operations and monitoring centers, including:
      • a traditional Control Room located at Point 5 in Cessy, France
      • a CMS Centre for up to fifty people located in Meyrin, Switzerland
      • remote centers such as LHC@FNAL at Fermilab
    • The centers support the following activities:
      • CMS data quality monitoring, prompt sub-detector calibrations, and time-critical data analysis of express-line and calibration streams
      • operation of CMS computing systems for processing, storage, and distribution of real and simulated CMS data, both at CERN and at offsite centers
  • LHC
    • An extension of the CERN Control Centre (CCC).
    • Provides remote monitoring capabilities for LHC accelerator components developed and built in the U.S.
    • Supports development of software for the LHC controls system.
    • A unique opportunity to have detector and accelerator experts in close proximity, solving problems together.

6
Remote operations for LHC and LARP
  • LHC remote operations
    • training prior to stays at CERN
    • remote participation in studies
    • "service after the sale" to support accelerator components built in the U.S.
    • access to monitoring information
    • software development for the LHC controls system (LAFS)

CCC at CERN
LARP: The US LHC Accelerator Research Program (LARP) consists of four US laboratories (BNL, FNAL, LBNL, and SLAC) that collaborate with CERN on the LHC. The LARP program enables U.S. accelerator specialists to take an active and important role in the LHC accelerator during its commissioning and operations, and to be a major collaborator in LHC performance upgrades.
7
How did the Concept Evolve?
  • Fermilab
  • has contributed to CMS detector construction,
  • hosts the LHC Physics Center (LPC) for US-CMS,
  • is a Tier-1 grid computing center for CMS,
  • has designed and fabricated LHC machine
    components,
  • is part of the LHC Accelerator Research Program
    (LARP), and
  • is involved in software development for the LHC
    controls system through a collaboration agreement
    with CERN called LHC@FNAL Software (LAFS).
  • The LPC had always planned for remote data
    quality monitoring of CMS during operations.
    Could we expand this role to include remote
    shifts?
  • LARP was interested in providing support for
    US-built components, training people before going
    to CERN, and remote participation in LHC studies.
  • We saw an opportunity for US accelerator
    scientists and engineers to work together with
    detector experts to contribute their combined
    expertise to LHC CMS commissioning.
  • The idea of a joint remote operations center at
    FNAL emerged: LHC@FNAL.

8
Concept
  • Some proof of principle work done by LHC/LARP
    personnel
  • Thanks to AB/OP colleagues at CERN
  • CMS Remote Operations Center
  • Unified Task Force formed at request of FNAL
    Director
  • First meeting 4 May 2005
  • Close-out 19 October 2006
  • Requirements document created and reviewed
    • CMS
    • LHC
    • CMS/LHC combined
    • Constraints
    • 63 total requirements
    • Review 21 July 2005
  • Proposal to Directorate and constituents
  • Construction Authorization and Engineering May
    2006
  • Construction initiated September 2006

9
Site Visits
  • Technology Research, Education, and Commercialization Center (TRECC), West Chicago, Illinois (Aug. 25, 2005)
  • Gemini Project remote control room, Hilo, Hawaii (Sept. 20, 2005)
    http://docdb.fnal.gov/CMS-public/DocDB/ShowDocument?docid=425
  • Jefferson Lab control room, Newport News, Virginia (Sept. 27, 2005)
    http://docdb.fnal.gov/CMS-public/DocDB/ShowDocument?docid=505
  • Hubble Space Telescope STScI, Baltimore, Maryland (Oct. 25, 2005)
  • National Ignition Facility, Livermore, California (Oct. 27, 2005)
    http://docdb.fnal.gov/CMS-public/DocDB/ShowDocument?docid=532
  • General Atomics, San Diego, California (Oct. 28, 2005)
  • Spallation Neutron Source, Oak Ridge, Tennessee (Nov. 15, 2005)
    http://docdb.fnal.gov/CMS-public/DocDB/ShowDocument?docid=570
  • Advanced Photon Source, Argonne, Illinois (Nov. 17, 2005)
  • European Space Operations Centre, Darmstadt, Germany (Dec. 7, 2005)

10
Design
  • Design work done in-house
  • Variation of the CCC design due to purpose
  • High-visibility location preferred
    • Laboratory Director
    • adjacent to meeting and office areas
  • Provide security
  • Maintain privacy
  • Special features
    • storefront/mullion-free glass
    • projection wall screens
    • privacy glass between the center and the adjacent conference room
    • programmable lighting
    • standalone HVAC system
    • window treatment for morning glare

11
Location
(Floor plan: near the Wilson Hall main entrance and the cafeteria)
12
Renderings
13
Consoles
  • Three bids submitted
  • Evaluated against cost and specifications
  • Same vendor as for the CCC selected

14
Construction Summary
  • Safety
  • No injuries
  • One incident
  • On-time
  • 12-week schedule
  • Under budget

15
Construction Slide Show
16
Computing
  • Separate computing systems/platforms for:
  • Consoles
  • Outreach
  • Videoconferencing
  • Projectors
  • Gateway
  • Server
  • Protected access as appropriate

17
Early Use
  • Current organization
    • Engineering Working Group
    • Operations Support Team
    • CMS Working Group
    • LHC Working Group
    • Outreach Working Group
  • Primary user to date is CMS
    • Tier-1 computing support
    • remote shifts by the Tracking group from March to May
    • test-beam activities by the HCAL group
  • Currently
    • global commissioning runs
    • CMS data operations (CSA07 computing challenge)
  • LARP
    • SPS beam study period
    • LHC hardware commissioning
  • LHC@FNAL Software
    • applications development

18
Noteworthy Features
  • Features that are currently available:
  • CERN-style consoles with 8 workstations shared by CMS and LHC
  • Videoconferencing installed for 2 consoles; can be expanded to 4 consoles
  • Webcams for remote viewing of LHC@FNAL
  • Secure keycard access to LHC@FNAL
  • Secure network for console PCs (dedicated subnet, physical security, dedicated router with Access Control Lists to restrict access, only available in LHC@FNAL)
  • 12-minute video essay displayed on the large Public Display, used by docents from the Education Department to explain CMS and LHC to tour groups
  • High-Definition (HD) videoconferencing system for the conference room
  • HD viewing of LHC@FNAL, and HD display capabilities in the centre
  • Secure group login capability for consoles, with persistent console sessions
  • Role Based Access Control (RBAC) for the LHC controls system (LAFS)
  • Screen Snapshot Service (SSS) for CMS and the LHC controls system

19
Details/Special Features
  • Role Based Access Control (RBAC): an approach to restricting system access to authorized users.
  • What is a ROLE?
    • A role is a job function within an organization.
    • Examples: LHC Operator, SPS Operator, RF Expert, PC Expert, Developer, ...
    • A role is a set of access permissions for a device class/property group.
    • Roles are defined by the security policy.
    • A user may assume several roles.
  • What is being ACCESSED?
    • Physical devices (power converters, collimators, quadrupoles, etc.)
    • Logical devices (emittance, state variables)
  • What type of ACCESS?
    • Read the value of a device once
    • Monitor the device continuously
    • Write/set the value of a device
  • Status
    • Deployed at the end of June 2007
    • A FNAL/CERN collaboration (LAFS) is working on RBAC for the LHC controls system.

The software infrastructure for RBAC is crucial for remote operations in that it provides a safeguard: permissions can be set up to allow experts outside the control room to read or monitor a device safely.
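The role/permission model described above can be sketched in a few lines. This is an illustrative sketch only, not the actual LAFS code; the role names, the `power_converter` device class, and the `is_allowed` helper are all hypothetical:

```python
from enum import Enum, auto

class Access(Enum):
    READ = auto()     # read the value of a device once
    MONITOR = auto()  # monitor the device continuously
    WRITE = auto()    # write/set the value of a device

# Hypothetical security policy: role -> device class -> permitted access types.
POLICY = {
    "LHC Operator":  {"power_converter": {Access.READ, Access.MONITOR, Access.WRITE}},
    "Remote Expert": {"power_converter": {Access.READ, Access.MONITOR}},
}

def is_allowed(roles, device_class, access):
    # A user may assume several roles; access is granted if any role permits it.
    return any(access in POLICY.get(role, {}).get(device_class, set())
               for role in roles)

# An expert outside the control room can monitor, but not set, a device:
print(is_allowed(["Remote Expert"], "power_converter", Access.MONITOR))  # True
print(is_allowed(["Remote Expert"], "power_converter", Access.WRITE))    # False
```

Because the policy lives in one table, tightening or loosening what a remote role may do never requires touching the enforcement code.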
20
Details/Special Features
  • Screen Snapshot Service (SSS): an approach to providing a snapshot of a graphical interface to remote users.
  • What is a snapshot?
    • An image copy of a graphical user interface at a particular instant in time.
    • Examples: DAQ system buffer display, operator control program, ...
    • A view-only image, so there is no danger of accidental user input.
    • Initially implemented for desktops, but could be targeted to application GUIs.
  • What is the role of the service?
    • Receives and tracks the snapshots from the monitored applications.
    • Caches the snapshots for short periods of time.
    • Serves the snapshots to requesting applications/users.
    • Prevents access by unauthorized applications/users.
    • Acts as a gateway to private-network applications for public-network users.
  • How does this work?
    • Applications capture and send snapshots to the service provider in the background.
    • Users access snapshots using a web browser.
  • Status
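The push/cache/serve flow above can be sketched as a small in-memory service. This is a minimal illustration under assumed names (`SnapshotService`, `push`, `fetch`), not the actual CMS/LHC implementation:

```python
import time

class SnapshotService:
    """Minimal sketch of a screen-snapshot cache (hypothetical API)."""

    def __init__(self, ttl_seconds=30.0, authorized_keys=()):
        self.ttl = ttl_seconds                  # snapshots are cached only briefly
        self.authorized = set(authorized_keys)  # viewers allowed to fetch
        self._cache = {}                        # source name -> (timestamp, image bytes)

    def push(self, source, image_bytes):
        # Called by monitored applications in the background.
        self._cache[source] = (time.time(), image_bytes)

    def fetch(self, source, api_key):
        # Called on behalf of a web-browser user. The image is view-only,
        # so no user input can ever reach the monitored application.
        if api_key not in self.authorized:
            raise PermissionError("unauthorized viewer")
        entry = self._cache.get(source)
        if entry is None or time.time() - entry[0] > self.ttl:
            return None  # no snapshot, or it has expired
        return entry[1]

# A monitored application pushes; a browser-side request fetches:
service = SnapshotService(authorized_keys={"viewer-key"})
service.push("daq-buffer-display", b"<png bytes>")
assert service.fetch("daq-buffer-display", "viewer-key") == b"<png bytes>"
```

Acting as the gateway, the service is the only host that needs to straddle the private and public networks; viewers never connect to the monitored machines directly.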

21
Future Plans
  • Ramp up CMS shifts
  • LHC hardware commissioning
    • keep LHC Project Associates engaged after their return
    • US/LARP deliverable monitoring
  • LAFS
    • continue applications development
  • LHC beam participation
    • SPS and other injector beam studies
    • LARP instrumentation
    • LHC commissioning and beam studies (especially for luminosity upgrades)

22
Summary
  • Remote operations is the next step in enabling collaborators to participate in operations from anywhere in the world. The goals are to have:
    • secure access to data, devices, logbooks, monitoring information, etc.
    • safeguards, so actions do not jeopardize or interfere with operations
    • collaborative tools for effective remote participation in shift activities
    • remote shifts to streamline operations.
  • Fermilab has built the LHC@FNAL Remote Operations Center, which is shared by scientists and engineers working on the LHC and CMS.
    • collaborative design
    • built rapidly
  • For the LHC it provides a means to participate remotely in LHC studies, access to monitoring information, and a training facility, and it supports the collaborative development of software for the LHC controls system.
  • For CMS it provides a location (in a U.S. time zone) for CMS remote commissioning and operations shifts, and Tier-1 grid monitoring shifts.
  • Already a popular stop for visitors and dignitaries.
  • http://cd-amr.fnal.gov/remop/remop.html

23
Acknowledgements
  • LHC@FNAL Task Force
    • especially Erik Gottschalk, from whom many slides were taken
  • Design, engineering, and construction team from Fermilab/FESS
    • Gary Van Zandbergen
    • Steve Dixon
    • Merle Olson
    • Tom Prosapio
  • CERN AB/OP
    • especially Django Manglunki
  • Staff of Sites Visited
  • Users of Facility