Transcript and Presenter's Notes

Title: White Rose Grid


1
Embedding e-Science within the White Rose
Universities and Our Region
Professor Jie Xu
2
Outline
  • WRG History and Background
  • WRG Organisational Structure
  • WRG System Deployment and Operations
  • The WRG e-Science Centre
  • Research Successes
  • a) Industrial applications
  • b) International collaboration
  • c) Support for e-Social Science
  • Challenges and Lessons Learnt
  • The Way Forward: urgent and important issues

3
WRG History
  • 2000: The three WR Universities decided to
    combine their computing resources using emerging
    Grid technology
  • 2001: A joint procurement of equipment with £2.8M
    of SRIF1 investment was undertaken
  • 2002: The White Rose Grid (WRG) was formally
    launched
  • (JX joined from Durham in 2003)
  • In the following years further funding from the
    Universities and other sources made it possible
    to update all facilities - in total so far over
    £9.1M, generating e-Science research projects to
    the value of £10.7M by 2007 and £2.5M in 2008
  • The White Rose Grid is one of the projects
    running under the auspices of the White Rose
    University Consortium (WRUC), a strategic
    partnership between Leeds, Sheffield and York,
    with their VCs and Pro-VCs as the key WRUC
    members

4
WRG Executive Board
  • The White Rose Grid was set up by the WRG
    Executive Board
  • The Board:
  • Profs P Jimack (PM Dew, KW Brodlie) and J Xu
    from Leeds, P Fleming from Sheffield, and J
    Austin from York
  • Drs Julian White (WRUC CEO), J Schmidt, T
    Jackson (2007)
  • Excellent partnership with Computing Services
  • Partners: Esteem Systems (in conjunction with Sun
    Microsystems) and Streamline Computing
  • Recently Craig Walker has joined WRUC as a
    Project Development Manager, responsible for
    developing and facilitating White Rose projects
    and supporting staff from all three institutions
    in new collaborations

5
WRG Organisation
  • The White Rose Grid Executive Board used to meet
    every 6-8 weeks; it now meets every 4-5 months
  • The White Rose Grid e-Science Centre meets every
    2 weeks
  • (Access Grid and face-to-face meetings)
  • Leeds: Jie Xu, Joanna Schmidt, Shiv Kaushal
  • Sheffield: Peter Fleming, Mike Griffiths
  • York: Jim Austin, Aaron Turner, Mark Hewitt
  • The White Rose Technical Team includes staff
    operating WRG systems - Information Systems
    Services (ISS) at Leeds, Corporate Information
    and Computing Services (CICS) at Sheffield, and
    CS at York. This team meets every 3 months
  • e-Science Research Projects: PIs and their staff
    (e.g. Virtual Vellum, Prof P Ainsworth)

6
WRG Operations
  • WRG activities are supported by a mixture of
    service and research staff, with a complementary
    combination of skills for the development,
    implementation and support of our Grid
  • The Grid technology research element is led by
    computer scientists (Leeds, York), whereas the
    operational skills required to support the
    day-to-day service on the WRG are drawn from the
    Computing Service pool of expertise (Leeds,
    Sheffield)
  • Technical directions are agreed at our WRG
    Technical Team meetings, which are chaired by the
    Leeds ISS manager (S Chidlow) and include members
    of the Computing Services, computer scientists
    and the current e-Science Centre manager
  • Operational issues are resolved by working
    jointly within the smaller relevant teams

7
WRG Resources
  • Our distinctive approach to building the White Rose
    Grid was to bring together the provision of HPC
    services and the emerging Grid technologies
  • In parallel with Grid technologies the WRG offers
    HPC services for our researchers
  • All WRG computational resources are divided into
    two pools: 20% are allocated by the WRG Executive
    to Grid-enabled applications, Grid middleware
    development, projects of a collaborative nature,
    or projects requiring access to a resource at a
    remote site
  • The remaining 80% are controlled by local sites
    and are used to support more traditional local
    high-performance computing (a simple sketch of
    this allocation policy follows this list)
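
The 20/80 split above is an allocation policy rather than something users configure directly, but a toy calculation makes it concrete. This is a minimal sketch only: the site names and core counts below are hypothetical placeholders, not actual WRG figures, and no real scheduler configuration is implied.

```python
# Toy illustration of the 20%/80% pool split described above.
# Site names and core counts are hypothetical, not actual WRG figures.
GRID_SHARE = 0.20  # fraction allocated by the WRG Executive to the shared Grid pool


def split_allocation(total_cores: int) -> dict:
    """Split a site's cores into the shared Grid pool and the local HPC pool."""
    grid_pool = int(total_cores * GRID_SHARE)
    return {"grid_pool": grid_pool, "local_pool": total_cores - grid_pool}


# Hypothetical per-site core counts
for site, cores in {"Leeds": 512, "Sheffield": 384, "York": 256}.items():
    print(site, split_allocation(cores))
```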

8
Usage of WRG
  • WRG facilities are used by a large number of
    users from a broad range of disciplines. (The
    graph shows the usage by subject area - cross-site
    use of resources is low, though new cross-site
    users are continuously being registered)

(Chart: WRG usage by subject area; labels include Physics and Grid middleware tools)
9
WRG e-Science Centre
  • User Support, Training and Education
  • Research Training Programme at Sheffield
    (M.Griffiths, D Savaz)
  • Workshops, e.g. Taverna Workflows and the
    e-Science Collaborative Workshop with the Digital
    Curation Centre
  • Seminars, e.g. a well-attended seminar by Prof
    Malcolm Atkinson that attracted a lot of
    interest, and the WR Research On-line Repository
    seminar
  • User group activities at York
  • Collaboration with Other e-Science Centres
  • Long-term collaboration with Newcastle (NEReSC)
  • Collaboration with other e-Science Centres, e.g.
    the Oxford e-Research Centre on our EC AssessGrid
    project, which might potentially benefit the NGS
  • Support for NGS
  • Leeds operates one of the NGS nodes; both
    Sheffield and York are NGS affiliates

10
Application Portals
  • The Centre is investigating the provision of an
    application portal that simplifies access to user
    software applications at Sheffield. We have
    evaluated the EngineFrame portal (an evaluation
    report is now available); a minimal sketch of the
    kind of wrapper such a portal provides follows
    this list
  • We have also evaluated the P-Grade portal and
    decided to install it on a system at York and to
    offer a prototype service for our users (this is
    now being implemented)
  • We are also looking at the EASA portal to broaden
    out this service (set of applications) to a
    larger number of users at Sheffield
  • (e.g. Chemical and Process Eng, Electrical and
    Electronic Eng, Mechanical Eng, Medicine and
    Biomedical Sciences)
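
To make the role of such a portal concrete, here is a minimal sketch, under stated assumptions, of the kind of wrapper an application portal puts around batch job submission. It is not the EngineFrame, P-Grade or EASA API: the application catalogue, commands and queue names are hypothetical, and a local qsub-style batch front end (as with Sun Grid Engine) is assumed.

```python
# Hypothetical portal-style wrapper: the user names an application and an input
# file; the wrapper builds the batch script and submits it to the local scheduler.
import subprocess

# Hypothetical catalogue mapping portal application names to commands and queues.
APPLICATIONS = {
    "cfd_solver": {"command": "cfd_solver -input {input}", "queue": "parallel"},
    "fe_analysis": {"command": "fe_analysis {input}", "queue": "serial"},
}


def submit(application: str, input_file: str) -> None:
    """Wrap the chosen application in a batch script and pass it to qsub."""
    app = APPLICATIONS[application]
    script = "#!/bin/bash\n" + app["command"].format(input=input_file) + "\n"
    subprocess.run(["qsub", "-q", app["queue"]], input=script, text=True, check=True)


# Example (hypothetical application and file): submit("cfd_solver", "wing.dat")
```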

11
Successes
  • Outcomes of our e-Science and Grid research
    projects e.g. DAME, BROADEN, g-Viz, e-Viz, CoLaB,
    e-Demand, CARMEN, Virtual Vellum, MoSeS, GENeSIS,
    NECTISE
  • Established a working multi-campus production
    Grid (WRG): users have access to a larger pool of
    resources and expertise
  • Developed the ability (trust, community,
    mechanisms) to collaborate effectively across the
    WR universities in support of e-Research
  • Linkage with industrial partners e.g.
    Rolls-Royce, BAE Systems
  • Productive engagement with international
    communities
  • Support to the NGS: outreach with NGS
    technologies to White Rose communities and
    beyond, e.g. Bradford University
  • Courses, seminars and workshops on Grids and HPC
  • White Rose Grid application portals (improved
    integration with Sheffield Grid node)

12
Industrial Applications
  • The EPSRC-funded DAME project (Distributed
    Aircraft Maintenance Environment) was the first
    major collaboration between the WR universities;
    it led to the DTI BROADEN project, which
    constructed a Rolls-Royce pilot Grid as a proving
    ground for utilising Grid services
  • CARMEN is developing a distributed computer
    system that will enable neuroscientists to
    analyse, store and share their data across the UK
  • NECTISE is a five-year £9.3M systems engineering
    project funded by EPSRC and BAE Systems (2005-09)
    that aims to advance Grid/Network-Enabled
    Capability

13
International Collaboration
  • The EPSRC CoLaB project (2006-09) provides
    support for collaboration between Leeds and
    Beihang (China). The key outcome is the
    production-quality Grid middleware CROWN-C, which
    features specific dependability enhancements for
    the development and assessment of high-assurance
    service-oriented systems
  • AssessGrid (EC, 2006-09) addresses obstacles to
    the wide adoption of Grids by bringing risk
    management and assessment to this field
  • We have also started a collaboration with Clemson
    University to exchange expertise and share
    resources through Globus; an illustrative sketch
    of that kind of access follows this list
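
As an illustration of what "sharing resources through Globus" typically involves, the sketch below wraps two classic Globus Toolkit command-line tools (pre-WS GRAM job submission and a GridFTP copy). The hostnames and paths are hypothetical placeholders, not real WRG or Clemson endpoints, and a valid grid proxy (e.g. from grid-proxy-init) is assumed to exist already.

```python
# Illustrative wrapper around Globus Toolkit command-line tools.
# Hostnames and paths are hypothetical; a valid grid proxy is assumed.
import subprocess

REMOTE_GATEKEEPER = "grid.example.edu/jobmanager-sge"  # hypothetical contact string


def run_remote(executable: str, *args: str) -> str:
    """Run a command on the remote resource via pre-WS GRAM (globus-job-run)."""
    result = subprocess.run(
        ["globus-job-run", REMOTE_GATEKEEPER, executable, *args],
        capture_output=True, text=True, check=True)
    return result.stdout


def fetch_output(remote_path: str, local_path: str) -> None:
    """Copy an output file back over GridFTP (globus-url-copy)."""
    subprocess.run(
        ["globus-url-copy",
         f"gsiftp://grid.example.edu{remote_path}", f"file://{local_path}"],
        check=True)


# Example: print(run_remote("/bin/hostname"))
```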

14
e-Social Science
  • IBHIS is a joint EPSRC project for healthcare
    information integration from distributed sources
    (2003-05)
  • The ESRC-funded MoSeS (Modelling and Simulation
    for e-Social Science) project was carried out in
    the National e-Social Science Centre's node at
    Leeds (2005-08)
  • GENeSIS (Generative e-Social Science, or MoSeS 2)
    is our latest ESRC e-Social Science project, a
    multidisciplinary collaboration with Geography
    (Dr Mark Birkin) and UCL (2008-11)

15
Key Challenges
  • Technological
  • Those associated with innovative technologies
    (e.g. immaturity of software/middleware, poor
    usability, lack of documentation, steep learning
    curve, individual products rather than an
    integrated environment)
  • Integration - embedding the use of new tools into
    the forthcoming virtual research environment
    (York)
  • VO management (e.g. user registration)
  • Organisational
  • Geographically distributed teams (e.g. technical
    team)
  • Crossing organisational boundaries
  • Overcoming local historical dependencies
  • User Community
  • Sustaining and growing user community
  • Building user trust in new technologies, e.g.
    digital certificates
  • Decreasing funding and thus decreasing interest
    in e-Science

16
Key Failures
  • Not enough success in making e-Science
    omnipresent
  • Grid Middleware and Tools
  • Lack of tools for use on the Grid, e.g. we still
    need to register users manually; no tools for
    user authorisation and accounting across the Grid
  • Market for computational resources?
  • Lack of readily deployable software that enables
    users to access all Grid resources through a
    simple API or Portal
  • Difficulties with using national X.509-based
    certification (an illustrative sketch of one of
    the routine certificate chores follows this list)
  • e-Science Applications
  • Lack of software licensing for Grids
  • Difficult to run applications on the Grid
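
One concrete flavour of the X.509 difficulty mentioned above is that every user has to obtain, install and renew a personal certificate. The sketch below, which assumes the third-party Python 'cryptography' package and a hypothetical certificate path, shows the sort of routine check (who issued the certificate and when it expires) that support staff end up scripting for users.

```python
# Minimal check of an e-Science user certificate: subject, issuer and expiry.
# Assumes the third-party 'cryptography' package; the path is a hypothetical example.
from datetime import datetime, timezone

from cryptography import x509


def describe_certificate(path: str) -> None:
    with open(path, "rb") as f:
        cert = x509.load_pem_x509_certificate(f.read())
    expiry = cert.not_valid_after.replace(tzinfo=timezone.utc)
    remaining = expiry - datetime.now(timezone.utc)
    print("Subject:", cert.subject.rfc4514_string())
    print("Issuer: ", cert.issuer.rfc4514_string())
    print("Expires:", expiry, f"({remaining.days} days remaining)")


# Example (hypothetical path): describe_certificate("usercert.pem")
```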

17
Lessons 1
  • Geographically Distributed Teams
  • Frequent meetings demand regular travelling
  • (using both Access Grid meetings and
    face-to-face meetings)
  • Large Team
  • The very large number of technical staff involved
    lengthened the decision-making process in all
    aspects of the project
  • Human Interaction
  • The project has crossed organisational
    boundaries, and any interaction problem has to be
    managed by clearly defining members'
    responsibilities
  • Trust and Ownership
  • Questions of ownership and trust are regularly
    posed (e.g. procurement process and equipment
    location)
  • Reaching Agreement
  • Real difficulty in getting agreement over issues
    and priorities between institutions in a VO (e.g.
    open goals vs deliverables)
  • Effective Communication
  • Crucial for the continuing success of the WRG
    (e.g. academic and technical staff governed by
    different management models)

18
Lessons 2
  • WRG equipment is continuously being upgraded
  • Beowulf Type Systems
  • Consideration should be given to the substantial
    amount of space, air conditioning and electrical
    power required, so that systems can be installed
    within the planned time-scale
  • Joint Procurement
  • Offered value for money and helped to form a
    close-working Grid support team, but significant
    additional resources were required to coordinate
    and agree the very large procurement
  • Separate procurements
  • Much easier for individual sites to handle,
    though they must ensure that the procured systems
    will couple using Grid technologies
  • fEC Sustainable Model for WRG Support
  • Models are being developed separately. At Leeds,
    based on a scientific case and business plan, the
    University approved funding of £1M every 2 years
    towards HPC (5 faculties each pay £200K every 2
    years towards HPC). Leeds is currently engaged in
    a procurement of HPC equipment

19
Lessons 3
  • Lessons learned from operating the WRG
  • Support of the Grid requires a larger support
    team than the combined number of staff supporting
    services at local sites, as additional issues
    related to the collaborative service need to be
    resolved; in our project there are additional
    staff at the White Rose e-Science Centre level
  • The Grid needs new operational procedures agreed
    between sites
  • Historical and local dependencies need to be
    considered when developing new procedures for the
    Grid
  • Currently the support team needs to include staff
    experienced in service provision as well as
    research staff. These two categories of staff,
    with different skills and approaches, are needed
    because Grid technologies have not yet matured
    and are not ready for full production service off
    the shelf
  • Grid technologies must be further developed to
    offer a comprehensive package of services which
    can be easily deployed and supported by service
    staff
  • User trust in new technologies needs to be built
    through both training and the development of use
    cases that show their functionality and benefits
  • Users' requirements are ever-changing, and Grid
    resources need to be constantly upgraded

20
Wish List
  • Mature, easy-to-use and readily deployable
    integrated environment for Grids (including tools
    for user authentication and authorisation)
  • Collaborative e-Infrastructure
  • Collaborative support for e-Science (e.g. list of
    experts)
  • Licensing for software applications on Grids
  • Metascheduler (easy to use, easy to integrate
    with the WRG e-Infrastructure, and available in
    the public domain); a toy sketch of the idea
    follows this list
  • Training and education: short self-training
    e-Science courses available on the Web and
    covering a range of topics, e.g. use of Grid
    tools, use of e-Infrastructure, and e-Science
    technologies and methods
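
To show what the wished-for metascheduler would do at its simplest, here is a toy sketch: pick the site with the most free cores that meets the job's requirements. Site names, numbers and attributes are hypothetical, and a real metascheduler would also have to handle authentication, data staging, accounting and failure recovery.

```python
# Toy metascheduler: choose the site with the most free cores that fits the job.
# Site data is hypothetical; requires Python 3.10+ for the type syntax used here.
from dataclasses import dataclass


@dataclass
class Site:
    name: str
    free_cores: int
    has_fast_interconnect: bool


def choose_site(sites: list[Site], cores_needed: int,
                needs_fast_interconnect: bool = False) -> Site | None:
    candidates = [s for s in sites
                  if s.free_cores >= cores_needed
                  and (s.has_fast_interconnect or not needs_fast_interconnect)]
    return max(candidates, key=lambda s: s.free_cores, default=None)


# Hypothetical snapshot of the three sites
sites = [Site("Leeds", 96, True), Site("Sheffield", 40, False), Site("York", 160, True)]
best = choose_site(sites, cores_needed=64, needs_fast_interconnect=True)
print(best.name if best else "no suitable site")  # -> York
```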

(Photo: WRG Team at AHM 2008)
21
WRG: The Way Forward
  • Research Themes
  • Distributed diagnostics and optimisation (e.g.
    building on DAME, BROADEN)
  • Distributed data mining in support of decision
    making (e.g. pattern matching: AURA-G, CARMEN,
    MoSeS)
  • Service-oriented Grid systems and visualization
    (e.g. NECTISE, CoLaB, ADVISE, gViz)
  • WRG Service Provision
  • Business plan for a pilot study on Software as a
    Service, and also Platform/Infrastructure as a
    Service

Long-Term Goals
  • Expansion of international collaborations
  • More multidisciplinary projects
  • Working with the WRG user community to embed
    e-Science into WR research
22
WRG: The Way Forward 2
  • Promoting and Nurturing e-Science Approaches
  • Further enhance collaboration across the three
    sites
  • WRG users: improve access to WRG resources (e.g.
    metascheduler, portal, user registration)
  • NGS: support to the NGS and its users
  • Outreach to new communities (e.g. NGS)
  • Training and education (e.g. iRODS)