Title: Oxford University Particle Physics Unix Overview (Graduate Lectures, 14th October 2010)

Transcript and Presenter's Notes


1
Oxford University Particle Physics Unix Overview
  • Pete Gronbech
  • Senior Systems Manager and SouthGrid Technical Co-ordinator

2
  • Strategy
  • Local Cluster Overview
  • Connecting to it
  • Grid Cluster
  • Computer Rooms

3
Particle Physics Strategy: The Server / Desktop Divide
[Diagram: on the server side, virtual machine hosts, a general purpose Unix server, Linux worker nodes, group DAQ systems, a web server, Linux file servers, an NIS server and a torque server; on the desktop side, Windows XP, Windows 7 and Linux desktop PCs.]
Approx. 200 Windows XP desktop PCs, with Exceed, PuTTY or ssh used to access the central Linux systems.
4
Particle Physics Linux
  • Unix Team (Room 661)
  • Pete Gronbech - Senior Systems Manager and
    SouthGrid Technical Coordinator
  • Ewan MacMahon - Systems Administrator
  • Kashif Mohammad - Deputy Technical Coordinator
  • Aim to provide a general purpose Linux based system for code development, testing and other Linux based applications.
  • Interactive login servers and batch queues are provided.
  • Systems run Scientific Linux, a free distribution based on Red Hat Enterprise Linux.
  • Systems are currently running a mixture of SL4 and SL5.
  • The systems are currently being migrated to SL5, the same version as used on the Grid and at CERN. Students are encouraged to test pplxint5 and let us know of any problems.
  • Worker nodes form a PBS (aka torque) cluster accessed via batch queues, as sketched below.
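As a sketch of how a torque batch queue is typically used (the script name, resource requests and executable here are illustrative, not the cluster's actual configuration):

    #!/bin/bash
    # myjob.sh - a minimal torque/PBS job script
    #PBS -l nodes=1:ppn=1        # one core on one node
    #PBS -l walltime=01:00:00    # one hour run-time limit
    cd $PBS_O_WORKDIR            # start in the directory the job was submitted from
    ./my_analysis                # run your program (hypothetical executable)

    qsub myjob.sh                # submit the job to the batch system
    qstat -u $USER               # check the state of your jobs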

5
Current Clusters
  • Particle Physics Local Batch cluster
  • Oxford's Tier 2 Grid cluster

6
PP Linux Batch Farm
Scientific Linux 4, 88 active slots
[Diagram: worker nodes pplxwn01-pplxwn11, each with 8 Intel 5420 cores, and interactive login nodes pplxint1 and pplxint2 (pplxint3 is aliased to pplxint2). NFS servers provide the data areas: pplxfs2 (LHCb data), pplxfs3 (CDF data) and pplxfs4 (ATLAS data), several of them Lustre-backed, with volumes ranging from 4TB to 19TB; pplxfs6 holds the home areas.]
7
Particle Physics Computing
gronbech@pplxint2> df -h /data/atlas
Filesystem                                  Size  Used Avail Use% Mounted on
pplxlustrenfs.physics.ox.ac.uk:/data/atlas   76T   46T   27T  64% /data/atlas
gronbech@pplxint2> df -h /data/lhcb
Filesystem                                  Size  Used Avail Use% Mounted on
pplxlustrenfs2.physics.ox.ac.uk:/data/lhcb   18T  8.5T  8.6T  50% /data/lhcb
8
PP Linux Batch Farm
Scientific Linux 5 migration plan
[Diagram: SL5 worker nodes pplxwn12-pplxwn16, each with 8 Intel 5420 cores, plus pplxwn18 and pplxwn19, each with 8 Intel 5345 cores and currently acting as NFS-Lustre gateways for the SL4 nodes; interactive login nodes pplxint5 and pplxint6.]
9
http://pplxconfig.physics.ox.ac.uk/ganglia
10
Strong Passwords etc
  • Use a strong password, not one open to dictionary attack!
  • fred123: no good
  • Uaspnotda!09: much better
  • Better still, use ssh with a passphrased key stored on your desktop, as shown below.
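For example, a passphrased key can be generated on a Linux desktop with ssh-keygen (the key type and size here are only illustrative):

    ssh-keygen -t rsa -b 2048 -f ~/.ssh/id_rsa
    # choose a strong passphrase when prompted;
    # the public half (~/.ssh/id_rsa.pub) is what goes on the server,
    # the private half never leaves your desktop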

11
Connecting with PuTTY
  • Demo
  • Plain ssh terminal connection (see the example below)
  • With key and Pageant
  • ssh with X windows tunnelled to passive Exceed
  • ssh, X windows tunnel, passive Exceed, KDE session
  • http://www.physics.ox.ac.uk/it/unix/particle/XTunnel%20via%20ssh.htm
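For reference, the plain terminal connection with an X windows tunnel looks something like this from a command-line ssh client (the fully qualified hostname is an assumption based on the short names used elsewhere in this talk):

    ssh -X username@pplxint2.physics.ox.ac.uk    # -X tunnels X windows traffic over the ssh connection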

12
(No Transcript)
13
PuTTYgen to create an ssh key on Windows
Paste the public key into ~/.ssh/authorized_keys on pplxint. If you are likely to then hop to other nodes, add "ForwardAgent yes" to a file called config in the .ssh dir on pplxint. Save the public and private parts of the key to a subdirectory of your H: drive.
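The resulting config file might contain nothing more than this (applying the setting to all hosts is an assumption; it can be restricted with a narrower Host pattern):

    # ~/.ssh/config on pplxint
    Host *
        ForwardAgent yes    # pass the agent-held key on to onward ssh hops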
14
Pageant
  • Run Pageant once after login to load your Windows ssh key (one way to do this is shown below)
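A convenient way is a Windows shortcut that starts Pageant with your key so it is loaded at login; Pageant accepts key files on its command line (both paths here are hypothetical):

    "C:\Program Files\PuTTY\pageant.exe" H:\keys\mykey.ppk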

15
SouthGrid Member Institutions
  • Oxford
  • RAL PPD
  • Cambridge
  • Birmingham
  • Bristol
  • JET at Culham

16
Oxford Tier 2 Grid Upgrade 2008
  • 13 systems, 26 servers, 52 CPUs, 208 cores. Intel 5420 Clovertown CPUs provide 540 KSI2K.
  • 3 servers, each providing 20TB usable storage after RAID 6, 60TB in total.
  • One rack, 2 PDUs, 2 UPSs, 3 3COM 5500G switches.

17
2010 Upgrade Due in November
  • Compute Servers
  • Twin squared nodes
  • Dual 8 core AMD Opteron 6128 CPUs provide 64
    cores per unit.
  • Storage
  • 24 x 2TB disks per unit (44TB after RAID6)
  • Increase in LHCb capacity
  • Allow migration off older servers
  • 36 x 2TB disks per unit (68TB after RAID6)
  • Grid Cluster upgrade: 200TB (the RAID6 arithmetic is sketched below)
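The RAID6 figures follow from losing two disks' worth of capacity to parity per array: 24 x 2TB = 48TB raw, minus 2 x 2TB of parity = 44TB usable; 36 x 2TB = 72TB raw, minus 2 x 2TB = 68TB usable.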

18
Get a Grid Certificate
You must remember to use the same web browser to request and retrieve the Grid Certificate. Once you have it in your browser you can export it to the Linux Cluster to run grid jobs. Details of these steps, and of how to request membership of the SouthGrid VO (if you do not belong to an existing group such as ATLAS or LHCb), are here: http://www.gridpp.ac.uk/southgrid/VO/instructions.html
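Once exported from the browser (typically as a PKCS#12 .p12 file), the certificate is normally split into the separate certificate and key files that grid tools expect. A sketch, assuming the conventional ~/.globus layout and file names:

    openssl pkcs12 -in mycert.p12 -clcerts -nokeys -out ~/.globus/usercert.pem
    openssl pkcs12 -in mycert.p12 -nocerts -out ~/.globus/userkey.pem
    chmod 400 ~/.globus/userkey.pem    # the private key must not be readable by others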
19
Two New Computer Rooms provide excellent infrastructure for the future
The new computer room, built at Begbroke Science Park jointly for the Oxford Super Computer and the Physics department, provides space for 55 computer racks of 11kW each, 22 of which will be for Physics. Up to a third of these can be used for the Tier 2 centre. This £1.5M project was funded by SRIF with a contribution of £200K from Oxford Physics. The room was ready in December 2007, and the Oxford Tier 2 Grid cluster was moved there during spring 2008. All new Physics High Performance Clusters will be installed here.
20
Oxford Grid Cluster
21
Local Oxford DWB Physics Infrastructure Computer Room
Completely separate from the Begbroke Science Park, a computer room with 100kW cooling and >200kW power has been built using £150K of Oxford Physics money: the local Physics department infrastructure computer room, completed in September 2007. This allowed local computer rooms to be refurbished as offices again, and racks that were in unsuitable locations to be re-housed.
22
The end for now
  • Ewan will give more details on using the clusters next week
  • Help Pages
  • http://www.physics.ox.ac.uk/it/unix/default.htm
  • http://www.physics.ox.ac.uk/pp/computing/
  • Email
  • pp_unix_admin@physics.ox.ac.uk
  • Questions?
  • Network Topology

23
Network
  • Gigabit connection to campus operational since July 2005.
  • Second gigabit connection installed Sept 2007.
  • Dual 10 gigabit links installed August 2009.
  • Gigabit firewall installed for Physics. We purchased a commercial unit, a Juniper ISG 1000 running NetScreen, to minimise the manpower required for development and maintenance.
  • The firewall also supports NAT and VPN services, which are allowing us to consolidate and simplify the network services.
  • Moving to the firewall NAT has solved a number of problems we were having previously, including unreliable videoconferencing connections.
  • Physics-wide wireless network, installed in DWB public rooms, Martin Wood, AOPP and Theory. The new firewall provides routing and security for this network.

24
Network Access
[Diagram: network topology. The campus connects to SuperJanet 5 (upgraded from SuperJanet 4) over 2 x 10Gb/s links. The Campus Backbone Router feeds Backbone Edge Routers at 10Gb/s, with departments attached at 100Mb/s or 1Gb/s. Physics sits behind the OUCS Firewall (1Gb/s) and its own Physics Firewall, which fronts the Physics Backbone Router.]
25
Physics Backbone
[Diagram: Physics backbone. The Physics Backbone Router connects to the Physics Firewall and to a server switch at 1Gb/s; Linux and Win 2k servers hang off the server switch at 1Gb/s. Particle Physics, Clarendon Lab, Astro, Atmos and Theory connect to the backbone at 1Gb/s (some older links at 100Mb/s), with desktops attached at 100Mb/s.]