Benchmarking Tools and Assessment Environment for Configurable Computing
1
Benchmarking Tools and Assessment Environment
for Configurable Computing
MAPLD 98, September 15-16, 1998
Sanjaya Kumar, Subburajan Ponnuswamy, Chirag Nanavati, John Golusky, Mark Vojta, Luiz Pires
E-mail: skumar@htc.honeywell.com
Honeywell Technology Center
3660 Technology Drive, Minneapolis, MN 55418
2
Program Overview
  • Provides a publicly available suite of benchmarks
    for evaluating configurable computing
    infrastructures, both tools and architectures
  • Addresses benchmark specification, procedures,
    metrics, and wide availability
  • Extends benchmarking technology to configurable
    computing
  • Benchmarks are being implemented on a
    configurable computing platform
  • Six benchmarks have been developed; four more
    are planned

3
Unique Aspects
  • Utilizes stressmarks to supplement existing
    functional benchmarks
  • First effort to specify and measure a set of
    characteristics relevant to configurable
    computing systems
  • Addresses a broad range of configurable
    architectures, beyond just single FPGAs
  • Approach is based on an unbiased,
    technology-independent benchmark specification
    methodology
  • Provides a better understanding of how
    configurable computing systems relate to others
    within the design space

[Diagram: stressmark categories - Versatility, Capacity, Timing
Sensitivity, Interfacing, Scalability, and others]
4
Versatility Stressmark
  • Measures an infrastructure's ability to perform a
    sequence of distinct computational steps
    efficiently
  • A space-time trade-off stressmark, possibly
    employing run-time reconfiguration
  • Based on a wavelet image compression algorithm
  • A minimum QoS must be maintained (PSNR and
    compressed bit rate); see the PSNR sketch after
    this list
  • Metrics include
  • Total elapsed time
  • Reconfiguration overhead
  • FPGA area utilized
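
A minimal host-side sketch in C of the PSNR part of the QoS check,
assuming 8-bit grayscale images in row-major order; the function name
and interface are illustrative and are not part of the benchmark
specification.

    #include <math.h>
    #include <stddef.h>
    #include <stdint.h>

    /* Illustrative QoS check: PSNR between the original and the
       reconstructed 8-bit image (peak value 255). */
    double psnr_8bit(const uint8_t *orig, const uint8_t *recon, size_t n)
    {
        double mse = 0.0;
        for (size_t i = 0; i < n; i++) {
            double d = (double)orig[i] - (double)recon[i];
            mse += d * d;
        }
        mse /= (double)n;
        if (mse == 0.0)
            return INFINITY;                 /* identical images */
        return 10.0 * log10((255.0 * 255.0) / mse);
    }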

5
Capacity Stressmark
  • Measures an architecture's usable capacity using
    a Huffman compression algorithm
  • Alphabet defined with K characters, each with a
    frequency f of occurrence
  • Huffman compression tree (top left) and look-up
    table (bottom left), based on the tree, are
    constructed in software; see the sketch after
    this list
  • Objective is to implement the largest look-up
    table possible
  • Metrics include largest table size and look-up
    speed; three different approaches:
  • Standard VHDL/automatic PR
  • Standard VHDL/manual PR
  • Custom
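
A minimal sketch in C of the software step that turns the Huffman tree
into a symbol-to-code look-up table; the node layout and function names
are assumptions for illustration. The resulting table is what the
stressmark then tries to fit onto the FPGA at the largest possible size.

    #include <stddef.h>
    #include <stdint.h>

    /* Assumed node layout for the software-built Huffman tree. */
    struct hnode {
        int symbol;                      /* valid at leaves only */
        struct hnode *left, *right;      /* both NULL at a leaf  */
    };

    /* Walk the tree, recording each symbol's code bits and length. */
    static void fill_table(const struct hnode *n, uint32_t code, int len,
                           uint32_t *codes, uint8_t *lengths)
    {
        if (n->left == NULL && n->right == NULL) {    /* leaf */
            codes[n->symbol]   = code;
            lengths[n->symbol] = (uint8_t)len;
            return;
        }
        fill_table(n->left,  (code << 1),     len + 1, codes, lengths);
        fill_table(n->right, (code << 1) | 1, len + 1, codes, lengths);
    }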

6
Timing Sensitivity Stressmark
  • Measures an infrastructure's ability to implement
    a time-critical computation
  • Based on the CORDIC (COordinate Rotation DIgital
    Computer) algorithm for vector rotation; see the
    sketch after the pipeline diagram
  • Pipelined stages stress both the architecture and
    the CAD tools (place and route is an important
    issue)
  • Metrics include latency, throughput, and area
    utilized; two different approaches:
  • Automatic PR
  • Manual PR

[Pipeline diagram: memory read, pre-rotate, CORDIC stages 1 through 12,
scale, memory write]
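
A minimal C model of one pass through the rotation pipeline, assuming a
12-stage fixed-point CORDIC in rotation mode; the data widths and table
name are illustrative. Each iteration uses only shifts and adds, which
is what makes the stages cheap to pipeline in an FPGA; the residual
CORDIC gain is removed by the Scale stage shown in the diagram.

    #include <stdint.h>

    #define CORDIC_STAGES 12        /* matches the 12 pipelined stages */

    /* Assumed precomputed table of atan(2^-i) values, in the same
       fixed-point angle format as the angle input. */
    extern const int32_t atan_table[CORDIC_STAGES];

    /* Rotation-mode CORDIC: drives the residual angle z toward zero,
       rotating (x, y) by the requested angle with shift-and-add only. */
    void cordic_rotate(int32_t *x, int32_t *y, int32_t angle)
    {
        int32_t xi = *x, yi = *y, z = angle;
        for (int i = 0; i < CORDIC_STAGES; i++) {
            int32_t xs = xi >> i, ys = yi >> i;
            if (z >= 0) { xi -= ys; yi += xs; z -= atan_table[i]; }
            else        { xi += ys; yi -= xs; z += atan_table[i]; }
        }
        *x = xi;   /* still scaled by the CORDIC gain (about 1.647); */
        *y = yi;   /* the pipeline's Scale stage removes it          */
    }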
7
Interfacing Stressmark
  • Measures an infrastructure's ability to implement
    an application on a platform consisting of GPPs,
    ASPs, and FPGAs
  • Based on the RT Parallel Benchmark Suite's Constant
    False Alarm Rate (CFAR) kernel; see the sketch
    after the diagram
  • CAD tools include those that perform
    hardware/software partitioning and mapping
  • Metrics include total elapsed time, communication
    time, and speedup due to configurable elements

[Diagram: hardware/software partitioning and mapping CAD tools targeting
GPP, ASP, and FPGA elements]
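
A minimal cell-averaging CFAR sketch in C over a 1-D range profile; the
window sizes, threshold factor, and function name are assumptions for
illustration and are not the parameters fixed by the CFAR kernel
specification.

    #include <stddef.h>

    /* Returns 1 if the cell under test exceeds alpha times the average
       of the surrounding reference cells (guard cells excluded). */
    int cfar_detect(const float *cells, size_t n, size_t idx,
                    size_t n_ref, size_t n_guard, float alpha)
    {
        float  noise = 0.0f;
        size_t count = 0;
        for (size_t k = n_guard + 1; k <= n_guard + n_ref; k++) {
            if (idx >= k)    { noise += cells[idx - k]; count++; }  /* leading  */
            if (idx + k < n) { noise += cells[idx + k]; count++; }  /* trailing */
        }
        if (count == 0)
            return 0;
        return cells[idx] > alpha * (noise / (float)count);
    }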
8
Scalability Stressmark
  • Measures an infrastructure's ability to implement
    an application on a multi-device platform
  • Based on the Fast Fourier Transform (complex,
    fixed-point); see the butterfly sketch after the
    diagram
  • CAD tools include those that perform partitioning
    and mapping (PM)
  • Metrics include total elapsed time, speedup,
    efficiency, and area utilized; possible
    approaches:
  • Automatic vs manual PM
  • Automatic vs manual PR

[Diagram: application partitioned and mapped onto a multi-device platform
by CAD tools]
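
A minimal sketch in C of the radix-2, fixed-point (assumed Q15)
butterfly that the FFT repeats at every stage; the type and scaling
choices are illustrative. It is this regular, repeated structure that
the partitioning/mapping tools spread across the devices of the
platform.

    #include <stdint.h>

    typedef struct { int16_t re, im; } cplx_q15;  /* assumed Q15 format */

    static int16_t sat16(int32_t v)
    {
        if (v >  32767) return  32767;
        if (v < -32768) return -32768;
        return (int16_t)v;
    }

    /* One radix-2 butterfly: b is rotated by the twiddle factor w, then
       a +/- b is formed; the >>1 keeps each stage from overflowing. */
    void butterfly(cplx_q15 *a, cplx_q15 *b, cplx_q15 w)
    {
        int32_t tr = ((int32_t)b->re * w.re - (int32_t)b->im * w.im) >> 15;
        int32_t ti = ((int32_t)b->re * w.im + (int32_t)b->im * w.re) >> 15;
        int32_t ar = a->re, ai = a->im;
        a->re = sat16((ar + tr) >> 1);  a->im = sat16((ai + ti) >> 1);
        b->re = sat16((ar - tr) >> 1);  b->im = sat16((ai - ti) >> 1);
    }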
9
CAD Benchmark
  • Measures the ability of an infrastructure to
    implement a time-consuming CAD application
  • Based on Boolean satisfiability (SAT)
  • Commonly used for automatic test pattern
    generation and logic synthesis/verification
  • Search-intensive problem; see the clause-check
    sketch after this list
  • Benchmark developed in conjunction with Princeton
    University
  • Metrics include total elapsed time and area
    utilized; three different approaches:
  • Standard automatic
  • Standard manual
  • Custom
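
A minimal C sketch of the per-clause check at the heart of SAT search,
assuming a simple CNF encoding (a positive integer means the variable
is true, a negative one means false); the data layout is illustrative.
It is this check, replicated and evaluated in parallel for every
clause, that makes SAT attractive for configurable hardware.

    #include <stdbool.h>
    #include <stdlib.h>

    /* A clause is satisfied as soon as one of its literals matches the
       current assignment (assign[v] holds the value of variable v). */
    bool clause_satisfied(const int *lits, int n_lits, const bool *assign)
    {
        for (int i = 0; i < n_lits; i++) {
            int  v    = abs(lits[i]);
            bool want = lits[i] > 0;
            if (assign[v] == want)
                return true;
        }
        return false;
    }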

10
Tools and Platform Information
  • Synopsys Design Compiler on a SUN Ultra-SPARC
    running Solaris 2.6
  • Xilinx XACT and M1 place and route tools on a PC
    containing a 166 MHz Pentium running Windows 95
  • Annapolis Micro Systems WILDFORCE Board (1 Xilinx
    4025, 4 Xilinx 4013 devices, 8 MBytes of memory)
  • Preliminary results for many of the benchmarks
    have been tabulated, are being reviewed by
    Dr. José Muñoz (DARPA PM), and are being further
    refined

11
Versatility Implementation - 2D Wavelet
              SUN UltraSPARC   SUN UltraSPARC   Atmel 6010
Exec Time     97 ms            226 ms           45 ms
Clock Freq    333 MHz          167 MHz          16 MHz
Utilization   NA               NA               62%
  • Atmel results obtained from Honeywell SASSO (Gary
    Gardner) as part of NASA's RHrFPGA program
  • Several others within the ACS community are
    implementing the versatility stressmark; results
    are not available at this time

12
Plans for Existing ACS Benchmark Suite
  • Update benchmarks as needed (benchmark
    specification documents, C code, VHDL code,
    miscellaneous fixes)
  • Work with AFRL to provide a mechanism for
    submitting results through the DARPA/AFRL
    Benchmarking web page
    (www.rl.af.mil/programs/hpcbench); Ralph Kohler
    is the POC
  • Address any suggestions that you may have to
    improve the benchmarks/web site

13
Additional Benchmarks
  • BM 1: Micro-kernel benchmark, based on the discrete
    cosine transform (DCT); working with Herman
    Schmit/Seth Goldstein (CMU)
  • BM 2: INFOSEC benchmark, based on SHA-1; in
    discussion with Alan Hunsberger (NSA), Burt
    Kalisky (RSA Labs), and Anant Agarwal (MIT)
  • BM 3: Data-dependent computations benchmark,
    based on an electronic counter-measures
    application; in discussion with Rick Pancoast
    (Lockheed Martin)
  • BM 4: Variable-precision arithmetic benchmark;
    in discussion with Rajeev Jain (UCLA) and Phillip
    Duncan (Angeles Design Systems)
  • There seems to be a continuing need to develop
    benchmarks corresponding to system-level
    applications
  • Will be implemented on an Annapolis Micro Systems
    STARFIRE, a PCI-based board utilizing Virtex FPGAs

14
Schedule
15
Summary
  • Preliminary implementation results tabulated
    (being reviewed)
  • Benchmarks can be downloaded from
    www.rl.af.mil/programs/hpcbench
  • Deliverables
  • Benchmarking methodology document
  • Benchmark specification documents
  • C and VHDL code
  • Four additional benchmarks being developed
  • For more information, visit
    www.htc.honeywell.com/projects/acsbench

[Diagram: ACS benchmarking technology supporting trade-off and selection
technology and a design/evaluation tool, relating ACS architectures to
others and demonstrating the advantages of ACS technology for DARPA/NASA,
application developers, procurement agencies, and developers of embedded
HPC technologies]