GridBench: A Tool for Benchmarking Grids (Transcript and Presenter's Notes)

1
GridBench A Tool for Benchmarking Grids
  • George Tsouloupas, Marios Dikaiakos
  • High Performance Computing Lab
  • University of Cyprus
  • {georget, mdd}@ucy.ac.cy, http://grid.ucy.ac.cy

2
Overview
  • Benchmarking, Challenges and Users
  • Related Work
  • Our approach to performance evaluation
  • GridBench Architecture and Metadata
  • The Tool Interface
  • Results
  • Work in progress
  • Conclusion

3
Challenges
  • Heterogeneous system
  • Hardware, Software and Configuration
  • Non-exclusive use of resources
  • Dynamic environment
  • Distinct administrative domains
  • Find resources, execute benchmark, collect and
    interpret results
  • In short, too many variables.

4
Benchmark Users
  • End-users
  • Need to know the capabilities of resources when
    running similar codes.
  • Developers
  • Develop and tune applications
  • Compare job submission services, resource
    allocation policies, scheduling algorithms

5
Benchmark Users
  • Architects/Administrators
  • Improve system design
  • Detect faults and misconfigurations (indicated by
    unexpected results)
  • Compare implementations/systems
  • Researchers
  • Benchmarks can give insight into how Grids work
    and perform
  • Could provide a better understanding of the nature
    of Grids in general.

6
Related Work
  • Probes: Benchmark Probes for Grid Assessment,
    Chun et al., 2003
  • Grid Benchmarking Research Group (Global Grid
    Forum), CIGB etc.: Specification Version 1.0,
    Wijngaart and Frumkin
  • Benchmarks for Grid Computing, Snavely et al.,
    2003
  • GridBench (CrossGRID, WP2):
  • Prototype Documentation for GridBench version
    1.0, 2003
  • Software Requirements Specification, version 1.1,
    for GridBench, 2002
  • GridBench: A Tool for Benchmarking the Grid,
    Tsouloupas and Dikaiakos, 2003

7
Our requirements for a Grid benchmarking tool
  • Make it easy to conduct experiments
  • Allow the measurement of different aspects of the
    system's performance (micro-benchmarks, probes,
    application benchmarks)
  • Should maintain a history of measurements
  • Accommodate retrieval and comparison of results
  • Collect monitoring information to help with
    result interpretation.

8
Grid Infrastructure Architecture
[Diagram: a Wide Area Network connects Central Services (VO, Resource Broker, etc.) to multiple Sites; each Site contains Computing Elements and Storage Elements, and each Computing Element manages a number of Worker Nodes.]
9
A layered approach to benchmarking Grids: Infrastructure viewpoint
Individual Resources (cluster nodes, Storage Elements), Sites (clusters, SMPs), Grid Constellation (multiple sites, VOs)
  • Layers: Individual Resources, Sites,
    Constellations
  • Conjecture: a layered approach provides a more
    complete perspective on the system under study.

10
A layered approach to benchmarking Grids: Software viewpoint
  • Micro-benchmarks -- isolate basic performance
    characteristics
  • Micro-kernel Benchmarks -- synthetic codes
  • Application Benchmarks -- derived from real
    applications

11
GridBench: A Tool for Benchmarking Grids
  • Provides a simple scheme for specifying benchmark
    executions.
  • Provides a set of Benchmarks to characterize
    Grids at several levels.
  • Provides mechanisms for executing benchmarks and
    collecting the results.
  • Archives benchmark specifications and results for
    comparison and interpretation.
  • Provides simple result management tools.
  • Provides a user interface for the above

12
Software Architecture Perspective
13
Software Architecture
14
GridBench Meta-data
[Schema diagram: a benchmark comprises one or more components, metrics, and monitors, together with resources and locations; components carry parameters, valueVectors, constraints, and corequisite/prerequisite relations.]
  • GridBench Definition Language(XML-based)
  • Definition and results co-exist (in archive) in
    the same structure.
  • Intermediate form that allows for easy
    transformation to different job description
    formats

15
GridBench Definition Language example
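The GBDL listing from the original slide is not reproduced in this transcript. As an illustrative sketch only, using hypothetical element and attribute names derived from the metadata model above (not the actual GBDL schema), a definition might look roughly like this:

    <?xml version="1.0"?>
    <!-- Hypothetical GBDL sketch: element and attribute names are assumptions -->
    <benchmark name="EPStream" executable="/bin/myexec">
      <parameter name="arguments">-n 1000</parameter>
      <resource>
        <location contact="ce1.grid.ucy" count="2"/>
        <location contact="ce2.grid.ucy" count="2"/>
      </resource>
      <metric name="memory_bandwidth" unit="MB/s"/>
      <monitor type="worker-node"/>
    </benchmark>

The same hypothetical structure is reused in the archival and translation sketches later in this deck.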
16
Archival/Publication of Results
  • Earlier versions
  • Published to the local MDS: easy access by users
    and schedulers
  • Recent versions
  • Benchmark results are archived in a native XML
    database (Apache Xindice)
  • Flexibility
  • Allows for statistical analysis of results
  • Benchmark results are associated with:
  • GBDL definition -- results are meaningless
    without the specific parameters
  • Monitoring data -- comprehension/analysis of
    results is enhanced when combined with monitoring
    data (an illustrative sketch follows below).
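As a further illustration only (again using the hypothetical GBDL names from the previous slide, not the actual archive schema), an archived entry could keep the definition, the measured metric values, and the monitoring data side by side:

    <!-- Hypothetical archive entry: names are assumptions, values elided -->
    <benchmark name="EPStream" executable="/bin/myexec">
      <parameter name="arguments">-n 1000</parameter>
      <metric name="memory_bandwidth" unit="MB/s">
        <valueVector>...</valueVector>  <!-- one measurement per worker node -->
      </metric>
      <monitor type="worker-node">
        <valueVector>...</valueVector>  <!-- e.g. load samples during the run -->
      </monitor>
    </benchmark>

Keeping definition, results, and monitoring data in one document is what allows them to be retrieved and compared together from the archive.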

17
The Tool
The Definition Interface
1. Pick a benchmark
2. Configure it
18
The Generated GBDL
19
The Browsing Interface
[Screenshot: a list of benchmark executions, a query panel, and metrics from the selected executions]
20
Result management tools
[Screenshot: metrics from selected executions can be compared by dragging and dropping similar metrics]
21
EPStream submitted to three Computing Elements
Results from EPStream (screenshots)
  • Different Colors represent different Worker Nodes
  • Measures Memory Bandwidth in MB/s

Two EPStream submissions to cluster.ui.sav.sk
22
MPPTest (blocking) submitted to three Computing
Elements
Results from MPPTest
Three MPPTest submissions to apelatis.grid.ucy.ac.cy using 2 and 4 nodes.
23
Nine HPL executions on cluster.ui.sav.sk using
various parameters and numbers of nodes
Results from High Performance Linpack
24
Summary
  • Layered approach to Grid performance evaluation
  • GridBench Definition Language
  • Definition of how and where benchmarks will run.
  • Automatic generation of job descriptions.
  • Utility components
  • Ease execution and collection of results
  • Result management
  • GUI tool for running/browsing
  • Easy execution on grid resources
  • Initial set of results

25
Conclusion
  • The mechanism/meta-data for defining and
    executing the benchmarks makes it very easy to
    take measurements.
  • XML Storage of definitions and results proved
    rather complicated to query, but quite flexible.
  • The tool prototype is in place, being tested, has
    provided some initial results, and is ready for
    the next revision.
  • Porting benchmarks to the Grid is not as
    straightforward as anticipated (heterogeneity of
    resources, configuration, libraries)
  • Benchmarks are a great tool for detecting flaws
    in hardware, software and configuration.

26
Work-In-Progress
  • Complete GBDL specification, possibly building
    upon the work of the GGF Job Submission
    Definition Language work-group
  • Implementation of more benchmarks focusing on
    Application-based benchmarks (CrossGrid and other
    applications)

Future Work
  • Integration with monitoring tools
  • Result interpretation and tools to assist
    interpretation.

27
Acknowledgments
  • Funded by
  • Part of
  • In cooperation with
  • Many thanks to Dr. Pedro Trancoso, University of
    Cyprus.

28
  • Questions,
  • Comments,
  • Suggestions.

http://grid.ucy.ac.cy
Thank you.
29
Additional Slides
30
Translation to JDL/RSL
  • XML-based GBDL to Job Description
  • Support for simple jobs can be provided through
    simple templates (executable, parameters, and
    locations are transformed to plain RSL/JDL); a
    template sketch follows the pipeline below.
  • Most benchmarks need special command-line
    parameter formatting, or parameter files.

Pipeline: GBDL -> Param Handler -> Translator -> JDL / RSL / ...
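As an illustration of the template idea only (this is not the project's actual translator; it assumes the hypothetical GBDL elements sketched on slide 15), a minimal XSLT template could emit one RSL sub-request per location:

    <?xml version="1.0"?>
    <xsl:stylesheet version="1.0"
                    xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
      <xsl:output method="text"/>
      <!-- Emit one RSL sub-request per execution location of the benchmark -->
      <xsl:template match="/benchmark">
        <xsl:for-each select="resource/location">
          <xsl:text>( (resourceManagerContact="</xsl:text>
          <xsl:value-of select="@contact"/>
          <xsl:text>")&#10;  (count=</xsl:text>
          <xsl:value-of select="@count"/>
          <xsl:text>)&#10;  (arguments="</xsl:text>
          <xsl:value-of select="/benchmark/parameter[@name='arguments']"/>
          <xsl:text>")&#10;  (executable="</xsl:text>
          <xsl:value-of select="/benchmark/@executable"/>
          <xsl:text>") )&#10;</xsl:text>
        </xsl:for-each>
      </xsl:template>
    </xsl:stylesheet>

A real translator would also fill in labels, environment settings (e.g. GLOBUS_DUROC_SUBJOB_INDEX), and the benchmark-specific parameter formatting handled by the Param Handler; JDL output would use a different template.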
31
Translation Example: GBDL to RSL
  • RSL:
    ( (resourceManagerContact="ce1.grid.ucy")
      (label="subjob 0")
      (environment=(GLOBUS_DUROC_SUBJOB_INDEX 0))
      (count=2)
      (arguments="-n 1000")
      (executable="/bin/myexec") )
    ( (resourceManagerContact="ce2.grid.ucy")
      (label="subjob 1")
      (environment=(GLOBUS_DUROC_SUBJOB_INDEX 1))
      (count=2)
      (arguments="-n 1000")
      (executable="/bin/myexec") )

32
The Benchmark Suite
  • Micro-benchmarks at the Worker-node level
  • EPWhetstone: embarrassingly parallel adaptation
    of the serial Whetstone benchmark.
  • EPStream: adaptation of the Stream benchmark.
  • BlasBench: evaluates serial performance of the
    BLAS routines.
  • Micro-benchmarks at the Site level
  • Bonnie: storage I/O performance
  • MPPTest: MPI performance measurements
  • Micro-benchmarks at the VO level
  • MPPTest: MPI performance measurements (spanning
    sites)
  • gb_ftb: File Transfer Benchmark
  • Micro-kernels at the Site level
  • High-Performance Linpack
  • Selected kernels from the NAS Parallel Benchmarks
  • Micro-kernels at the VO level
  • Computationally Intensive Grid Benchmarks

33
The Benchmark Suite (cont'd)
  • Application-Kernel benchmarks at the site level
  • CrossGrid application-kernels
  • Application-Kernel benchmarks at the VO level
  • CrossGrid Applications

34
EPWhetstone submitted to two Computing Elements
Results from EPWhetstone
  • Different Colors represent different Worker Nodes
  • Measures Whetstone MIPS

Three EPWhetstone submissions to
apelatis.grid.ucy.ac.cy