StagedDB/CMP: Designing Database Servers for Modern Hardware



1
StagedDB/CMP: Designing Database Servers for Modern Hardware
  • Anastassia Ailamaki
  • Joint work with
  • Kun Gao, Nikos Hardavellas, Ippokratis Pandis,
    Stavros Harizopoulos, and Babak Falsafi

2
Evolution of hardware design
  • Then (the 80s): hardware was balanced across CPU, memory, and disk
  • Today: a memory access costs far more CPU cycles (1 cycle then vs. 10-300 today)
  • DSS workloads stress the I/O subsystem
  • CPUs run faster than they can access data
3
CMP, HT, and memory hierarchies
DBMS core design contradicts the above goals
4
Traditional database system design
  • Requests (queries) handled by threads
  • Threads execute independently
  • No means to exploit common data/instructions

[Figure: a conventional DBMS vs. StagedDB, with data (D) and code (C) shared across threads]
New design to expose locality across threads
5
Staged Database Systems
[CIDR'03]
[Figure: queries flowing through a conventional DBMS vs. a staged design]
  • Organize system components into stages
  • No need to change algorithms / structures

High concurrency creates locality across requests
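
As a rough illustration of the staging idea, the sketch below (hypothetical code, not the authors' implementation) models a stage as a work queue plus a dedicated worker thread: requests queue up at each component, so the worker processes them back-to-back while the stage's code and data are still cache-resident.

    // Minimal sketch of a "stage": its own request queue and worker
    // thread. Draining the queue in batches lets consecutive requests
    // reuse the stage's instructions and data structures in cache.
    #include <condition_variable>
    #include <functional>
    #include <mutex>
    #include <queue>
    #include <thread>

    class Stage {
    public:
        using Task = std::function<void()>;

        Stage() : worker_([this] { run(); }) {}

        ~Stage() {
            { std::lock_guard<std::mutex> g(m_); done_ = true; }
            cv_.notify_one();
            worker_.join();
        }

        // Other stages enqueue work here instead of calling in directly.
        void enqueue(Task t) {
            { std::lock_guard<std::mutex> g(m_); q_.push(std::move(t)); }
            cv_.notify_one();
        }

    private:
        void run() {
            for (;;) {
                std::unique_lock<std::mutex> lk(m_);
                cv_.wait(lk, [this] { return done_ || !q_.empty(); });
                if (q_.empty()) return;  // shutting down, queue drained
                // Drain the current batch back-to-back.
                while (!q_.empty()) {
                    Task t = std::move(q_.front());
                    q_.pop();
                    lk.unlock();
                    t();
                    lk.lock();
                }
            }
        }

        std::mutex m_;
        std::condition_variable cv_;
        std::queue<Task> q_;
        bool done_ = false;
        std::thread worker_;
    };

Each component (parser, optimizer, individual operators) would be wrapped in such a stage, and a query would move from queue to queue rather than being driven end-to-end by one thread.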
6
Where is time spent in a DBMS?
[Figure: Query → PARSER → query tree → OPTIMIZER (catalogs and statistics) → query plan → EXECUTION (operators, data) → answer]
Query execution engine: 90% of response time
7
Conventional: One query, many ops
  • Queries are evaluated independently
  • Newer hardware allows higher concurrency
  • More opportunities to share across queries

8
QPipe: operation-level parallelism
[SIGMOD'05]
[Figure: queries sharing in-flight read and write operations]
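
To make the operation-level sharing concrete, here is a simplified sketch (hypothetical names; not QPipe's actual engine) of a shared scan: one scan thread reads each page once and hands it to every query attached to it, instead of each query re-scanning the table.

    // Simplified scan sharing: each page is read once and broadcast to
    // all attached queries. QPipe's real engine also lets queries that
    // arrive mid-scan pick up the pages they missed (omitted here).
    #include <functional>
    #include <iostream>
    #include <mutex>
    #include <vector>

    using PageConsumer = std::function<void(int page_id)>;

    class SharedScan {
    public:
        explicit SharedScan(int num_pages) : num_pages_(num_pages) {}

        // A newly arriving query attaches to the in-flight scan.
        void attach(PageConsumer c) {
            std::lock_guard<std::mutex> g(m_);
            consumers_.push_back(std::move(c));
        }

        // One pass over the table serves every attached query.
        void run() {
            for (int p = 0; p < num_pages_; ++p) {
                std::lock_guard<std::mutex> g(m_);
                for (auto& c : consumers_) c(p);
            }
        }

    private:
        int num_pages_;
        std::mutex m_;
        std::vector<PageConsumer> consumers_;
    };

    int main() {
        SharedScan scan(4);
        scan.attach([](int p) { std::cout << "Q1 got page " << p << "\n"; });
        scan.attach([](int p) { std::cout << "Q2 got page " << p << "\n"; });
        scan.run();  // 4 page reads serve both queries
    }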
9
TPC-H workload
[Chart: throughput (queries/hr) vs. number of clients]
  • Clients use a pool of 8 TPC-H queries
  • QPipe reuses large scans, runs up to 2x faster
  • ...while maintaining low response times

10
Online Transaction Processing
  • Query processing operator code >> L1-I cache
  • L1-I stalls are 20-40% of execution time
  • Instruction caches cannot grow

Goal: instruction cache-residency
11
Synchronized Transactions through Explicit Processor Scheduling
[VLDB'04]
[Figure: without STEPS, each thread runs select() through s1...s7 end-to-end, missing in the L1-I cache at every step; with STEPS, threads context-switch at the point where the code fills the I-cache (s1 s2 s3, then s4 s5 s6 s7), so the lead thread misses and the following threads hit]
  • Index probe: we eliminate 96% of L1-I misses
  • TPC-C: we eliminate 2/3 of misses, 1.4x speedup
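
The scheduling trick amounts to a loop interchange, sketched below in toy form (hypothetical code, not the Shore implementation; each step is assumed to fit in the L1-I cache): instead of running one transaction through every step before starting the next, run each step for the whole team of transactions, so the first one warms the instruction cache and the rest hit.

    #include <functional>
    #include <vector>

    struct Txn { int id; /* per-transaction state */ };
    using Step = std::function<void(Txn&)>;

    // Without STEPS: per-transaction order. Each transaction's later
    // steps evict the earlier code, so the next transaction misses.
    void run_unstaged(std::vector<Txn>& txns, const std::vector<Step>& steps) {
        for (auto& t : txns)
            for (const auto& s : steps) s(t);
    }

    // With STEPS: per-step order. Each step's code is loaded once and
    // reused by every transaction (one miss, then hits). The inner loop
    // boundary is the context-switch point.
    void run_steps(std::vector<Txn>& txns, const std::vector<Step>& steps) {
        for (const auto& s : steps)
            for (auto& t : txns) s(t);
    }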

12
Summary of Results
  • STEPS (full-system evaluation on Shore): 96% fewer I-cache misses
  • QPipe (full-system evaluation on BerkeleyDB): 2x throughput improvement

[Figure: memory hierarchy - L1 I/D caches, L2-L3, RAM, disks]
13
Current Prototype
[CIDR'03, VLDB'04, SIGMOD'05]
Availability, scalability, easy maintenance, and high performance at a low implementation cost
14
StagedCMP: HW/SW co-design
  • HW support for StagedDB software
  • Hardware
    • Multi-core processors
    • NUCA (Sea-of-Caches)
    • Streaming buffers
  • Software
    • Staged Database Systems (StagedDB)
    • Traditional DBMS (Oracle, DB2, PostgreSQL)

[Figure: NUCA cache layouts for scientific vs. OLTP workloads]
15
StagedCMP: StagedDB on Multicore
  • Each µEngine runs independently on its core
  • Dispatcher routes incoming tasks to cores (sketched below)
  • Decision factors in:
    • Amount of work sharing at each µEngine
    • Load of each µEngine

[Figure: Dispatcher routing tasks to µEngines on CPU 1 and CPU 2]
Potential: better work sharing and load balance on CMP
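
The sketch below shows one plausible form such a dispatcher could take (the scoring heuristic and all names are made up for illustration): each core is scored by how much shareable work its µEngine already holds, discounted by its current load.

    #include <cstddef>
    #include <string>
    #include <vector>

    struct Core {
        std::size_t queued_tasks = 0;  // current load
        // e.g., queued tasks running the same operator on the same
        // table; stubbed out in this sketch.
        std::size_t shareable_with(const std::string& op) const {
            (void)op;
            return 0;
        }
    };

    // Route a task to the core with the best sharing/load trade-off.
    std::size_t pick_core(const std::vector<Core>& cores,
                          const std::string& op) {
        double best_score = -1e300;
        std::size_t best = 0;
        for (std::size_t i = 0; i < cores.size(); ++i) {
            // Weights are arbitrary: favor sharing, penalize load.
            double score = 2.0 * cores[i].shareable_with(op)
                         - 1.0 * cores[i].queued_tasks;
            if (score > best_score) { best_score = score; best = i; }
        }
        return best;
    }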
16
Example: Accesses on shared L2
[Chart: memory-access breakdown, TPC-C on DB2 v8.2]
StagedDB can improve the breakdown of memory accesses
17
Conclusions
  • StagedDB
    • Service-based data management: suitable for deep memory hierarchies
    • Operation-level parallelism: suitable for massively parallel hardware
  • StagedCMP
    • Hardware support for staged architectures
    • Transfer software overheads to hardware

18
  • Thank you

www.cs.cmu.edu/StagedDB