DB-22: Zero to 30,154 in Twenty Days


DB-22 Zero to 30,154 in Twenty Days
Tom Harris
Director, OpenEdge RDBMS Products
Kent Lipschultz
Technical Alliance Manager, HP
  • Planning
  • Preparation
  • Lab Work
  • Analysis
  • Lessons Learned

Planning What's a Vendor Lab All About?
You Get What You Pay For Borrow...
HP Labs HP's Enterprise Solution Alliances
HP ISV Technical Services
Kent Lipschultz (HP, St. Paul, Minnesota), Technical Alliance Manager, 651-982-9794
Direct Access to all other HP Labs
Equipment Resources
Benchmarking Centers
HP Labs HP's Enterprise Solution Alliances
Developers Alliance Lab (DAL), HP Labs, Cupertino, California
  • Outbound / inbound benchmarking with OpenEdge (performance & sizing)
  • Performance optimization of OpenEdge on HP (all platforms)
  • Technology integration (i.e. MC/SG, OV, VSE, SOA)
  • Progress R&D support when all other channels fail
  • Progress client support when all other channels fail
Partner Technology Access Centers (PTAC): 3 in U.S. (NJ, MA & TX), 1 in UK, 1 in India
  • Technology-specific expertise (porting, on-site working sessions)
Global Solution Centers (GSC): nine centers throughout the world
  • Client-specific POC testing
HP Labs HP's Enterprise Solution Alliances
  • Position HP as the preferred technology provider
    for end-to-end solutions for Progress
  • Ensure tight integration of solutions with
    relevant HP technologies. (Servers, OS,
    Management, High Availability, Storage, etc.)
  • Ensure that solutions are optimized and
    benchmarked on HP hardware.
  • Drive emerging technologies into Progress market.

HP Labs Optimized and Benchmarked
Discussion Topics
  • Real-world needs are a mix of real workloads.
  • Is there such a thing as a common and standard OpenEdge implementation?
  • What do Progress ISV partners already know about their application characteristics?
  • Query-driven OpenEdge (i.e. batch) vs. transaction-driven OpenEdge (i.e. OLTP)
  • How important is real implementation performance?
HP Labs Reserving Equipment and Set Up
  • What happens on the vendor side of this?
  • CC - Scenarios, Goals, Hardware and Software Requirements, Loading, Data Collection
  • Timeline (target start date, run time)
  • Documentation and Collateral
  • How long does that usually take?
  • Is this part of the lab time interval? (yes)
  • When do we get at the system? What do we do?

Plan YOUR Business/Technical Goals
  • Vendor Lab Visits
  • Progress is the partner
  • It's the partner that sponsors the visit
  • A clear goal and a good plan really help!
  • Expect an initial conference about a visit
  • Equipment specs freeze a month before a visit
  • There's a LOT of work to do before then!

Plan YOUR Business/Technical Goals
  • Why should we do this?
  • Can we support more business?
  • How long would a hardware upgrade last us?
  • Can we really support a lot more users?
  • Where are our bottlenecks?
  • How do we go about testing on a big system?
  • What should the deliverables be?

Plan Configurations Considered
  • Main Test System
  • Model, CPU count
  • Memory
  • Storage and setup needs
  • Network bandwidth
  • Driver System(s)
  • CPU, memory, local storage, network
  • Common storage w/ Main?
  • Special test system requirements?

Plan What Do I Vary?
  • Change only one thing at a time
  • Ramp up client count or DB size? Or?
  • We chose clients, holding memory constant
  • Then we did another run, with bigger memory
  • What are the key criteria? tps, commits, time/work?
  • What's a good test run?
  • Clear file system cache
  • 10 runs per data point, discarding high & low
  • Immediately save logfiles and promon data!
  • Minimal time to re-set for the next test run ...
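The "10 runs per data point, discarding high & low" rule above is easy to automate. A minimal sketch (the function name `reduce_runs` is my own, not from the deck):

```python
def reduce_runs(times):
    """Reduce repeated benchmark runs to one number: discard the single
    highest and single lowest measurement, then average the rest.
    Needs at least 3 runs so something is left after trimming."""
    if len(times) < 3:
        raise ValueError("need at least 3 runs to discard high and low")
    trimmed = sorted(times)[1:-1]
    return sum(trimmed) / len(trimmed)

# 10 runs for one data point, as on the slide (values illustrative)
runs = [101.2, 99.8, 100.4, 130.0, 100.1, 99.9, 100.6, 100.2, 85.0, 100.3]
print(round(reduce_runs(runs), 2))  # the 85.0 and 130.0 outliers are dropped
```

Trimming before averaging keeps one cache-cold or interrupted run from skewing a whole data point.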

Script everything you can!
Example Test Scenarios
Plan Remote Access
  • What can't you do remotely?
  • Will you physically visit the lab?
  • How will you get programs & data there?
  • How long do you need to set up?
  • Will you need sources there? Dev environment?
  • How will you drive your application? Mercury?
  • Does your test database need secure deletion?
  • How will you get your test results home?

Pre-package everything you can and test it!
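Pre-packaging can be as simple as tarring up the kit and recording a checksum, so the transfer to the lab can be verified before any lab time is spent. A sketch (file and function names are illustrative):

```python
import hashlib, os, tarfile, tempfile

def package_and_checksum(paths, archive):
    """Bundle everything needed at the lab into a gzipped tar, then
    return the archive's SHA-256 so it can be re-verified after transfer."""
    with tarfile.open(archive, "w:gz") as tar:
        for p in paths:
            tar.add(p, arcname=os.path.basename(p))
    h = hashlib.sha256()
    with open(archive, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

# Usage: bundle a (toy) driver script and print the digest to carry along.
workdir = tempfile.mkdtemp()
script = os.path.join(workdir, "run_test.sh")
with open(script, "w") as f:
    f.write("echo test run\n")
digest = package_and_checksum([script], os.path.join(workdir, "kit.tar.gz"))
print(digest)
```

Recomputing the digest at the lab confirms nothing was corrupted or truncated in transit.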
Plan Initial Game Plan
  • Goal is clear, key criteria are clear
  • Equipment list agreed upon
  • Storage & network setup defined
  • OS release, patch level, and special tools
  • OpenEdge release & SP defined
  • Test system topology confirmed by the vendor
  • Method for driving the test system looks OK
  • Number of test data points and of runs agreed
  • Plan in place to get software & db lab ready
  • Plan in place to get results from the lab

Prep Documentation
  • Test system topology (HW & SW)
  • Test data points per run, and number of complete runs
  • Client range: 1K, 5K, 10K, 15K, 20K, 25K, 30K
  • Memory (shmsegsize): 1GB, 16GB
  • Key measurement: tps? Commits? Orders processed?
  • Naming conventions for results & result storage
  • Initial setup scripts (SW & DB at the lab)
  • Test run setup scripts
  • Test run execution & save-off scripts
  • Final data collection scripts
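A naming convention worth documenting up front encodes every varying parameter plus a timestamp into each result file name. A sketch (the scheme and function name are illustrative, not from the deck):

```python
import time

def result_name(test, clients, mem_gb, run, when=None):
    """Build an unambiguous result-file name from test id, client count,
    memory size, run number, and a UTC timestamp. Names like this make
    it much harder to mix up result files later."""
    stamp = time.strftime("%Y%m%d-%H%M%S",
                          when if when is not None else time.gmtime())
    return f"{test}_c{clients}_m{mem_gb}g_r{run:02d}_{stamp}.log"

print(result_name("oltp", 10000, 16, 3))
```

Sorting a directory of such names groups results by test, client count, and memory size automatically.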

Prep Scripting
  • Lab setup: backup tape vs. gzip to DVD(s) vs.
  • Lab setup: OpenEdge install, property files,
  • Lab setup: gather scripts, etc.
  • What starts a test run? Do all clients start, or ramp up?
  • Several OpenEdge installs? Which one do you run first?
  • How do you parameterize a test run? (and record it?)
  • Should the vendor gather OS-level data too?
  • When is a test run done? Transactions, time, or logout?
  • It is very common to mix up test result files.
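One way to both parameterize a run and record it, and to guard against mixed-up result files, is a JSON sidecar written next to each result file. A minimal sketch (names are illustrative):

```python
import json, os, tempfile

def record_params(result_file, params):
    """Write the exact parameters of a test run to a JSON sidecar next to
    the result file, so every result can be traced back to its settings."""
    sidecar = result_file + ".params.json"
    with open(sidecar, "w") as f:
        json.dump(params, f, indent=2, sort_keys=True)
    return sidecar

# Usage: record the settings alongside a (toy) result file.
run_params = {"clients": 10000, "memory_gb": 16, "spin": 50000, "ramp_up": True}
result = os.path.join(tempfile.mkdtemp(), "oltp_c10000_r01.log")
sidecar = record_params(result, run_params)
```

The sidecar travels with the result file, so the data can still be interpreted weeks after the lab visit.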

Prep Test Runs (at home)
  • Start with a machine that only has the OS
  • Try your scripts in order (you will find bugs)
  • Setup OK ?
  • OpenEdge setup OK?
  • Do 5 test runs: how much do results vary?
  • Result archiving OK?
  • Could someone else repeat this test?
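"How much do results vary?" can be reduced to one number: the spread of the repeated runs as a percentage of their mean. A sketch (the 5-8% threshold comes from the repeatability rule later in this deck):

```python
def variation_pct(results):
    """Spread of repeated runs as a percentage of their mean:
    (max - min) / mean * 100. The deck treats roughly 5-8% as the
    limit for calling results repeatable."""
    mean = sum(results) / len(results)
    return (max(results) - min(results)) / mean * 100

five_runs = [412, 405, 418, 409, 411]  # illustrative tps values
print(round(variation_pct(five_runs), 1))  # prints 3.2 -- repeatable
```

If this number exceeds the threshold at home, it will be worse in the lab, so fix it before the visit.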

Prep Analysis Check
  • Criteria are known
  • Results have been obtained
  • Now what? Try a brief analysis ...
  • Start with goals, config, test description
  • Reduce the data: drop hi/lo, average rest?
  • Plot the points
  • Do these make sense? What do they say?
  • Do you also need OS info (I/O, pfault, etc.)?
  • Adjust the plan, if you need more info

Prep Revised Game Plan
  • Goal is clear, key criteria are clear
  • Equipment list, storage & network setup OK
  • OS release, patch level, and special tools
  • OS monitoring requirements documented for lab use
  • OpenEdge release & SP defined
  • Method for driving the test system looks OK
  • Additional promon data collection scripted
  • Number of test data points and of runs agreed
  • Plan in place to get software & db lab ready
  • Plan in place to get results from the lab

Lab Lead Time
  • VPN access to external network (Internet)
  • Private network switch attaches to all servers
  • RP3440-46WR driver1: external network 156.153.117.161, private network
  • RP3440-4872 driver2: external network 156.153.117.162, private network
  • RP3440-46X4 driver3: external network 156.153.117.163, private network
  • RP3440-46X2 driver4: external network 156.153.117.164, private network
  • RX8640-31JD dbsrv: external network 156.153.117.160, private network
  • (2) 2GB fibers connected to EVA
  • EVA management PC
Lab Initial Shakedown Run
  • OpenEdge is installed
  • The application and database are all set
  • Here goes the first test run ...
  • What?! This makes no sense
  • Check OS file system cache: too big?
  • Check semaphore sets: enough?
  • Check swapping/paging/IO throughput
  • Check OE benchmarking tips (clients/server, ...)
  • The lab experts may have ideas
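For the "semaphore sets: enough?" check, one rough rule of thumb is that an OpenEdge broker allocates about (max users + max servers + 4) semaphores, split across -semsets sets, and each set must fit within the OS per-set limit (semmsl on HP-UX). Verify the exact formula against the docs for your OpenEdge release; this is only a sketch:

```python
import math

def semaphores_per_set(n_users, n_servers, semsets):
    """Rough estimate (verify against the OpenEdge docs for your release):
    the broker allocates about n_users + n_servers + 4 semaphores,
    split across -semsets semaphore sets. Compare the result with the
    OS per-set limit (semmsl on HP-UX)."""
    total = n_users + n_servers + 4
    return math.ceil(total / semsets)

# e.g. 30,000 clients, ~300 servers, -semsets 20 as in this deck's parameters
print(semaphores_per_set(30000, 300, 20))
```

If the per-set count exceeds the kernel limit, either raise the limit or increase -semsets.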

Lab Promon vs Vendor Tools
  • Promon or OpenEdge Management
  • Tells what the RDBMS sees
  • Great for RDBMS info
  • Useful to see app-level issues
  • Vendor/OS
  • Tells what the OS sees
  • Great for low-level issues like memory or
  • A good complement to what promon sees

Lab Getting Useful Data
  • Is the data repeatable within 5-8%?
  • Are ongoing results reasonable?
  • If not, talk to the lab team right away
  • There is still time to drill deeper
  • The whole team wants you to succeed
  • Is there an app bottleneck?
  • Is there a DB bottleneck?
  • Save all data: "unreasonable" data may be useful later on
  • Careful data labeling really helps sort this out

Lab Winding Down
Get the data back home, and confirm it right away.
THEN make sure data is deleted, media returned, etc.
Tests Done Organize/Protect Data
  • Lab work is done
  • Data is here
  • Burn it on a DVD
  • Make sure any data reduction has source file id
  • Do the reduction, set up some tables
  • Burn them on a DVD, also with source file id
  • Why? Because here is where the lab time can be
    lost with no hope of recovery

Analysis Getting Started
  • Start using the trial report you started in prep
  • Build the graphs if you are visual
  • Check the tables if you aren't
  • Is there a drop-off? Maybe a bottleneck ...
  • Subsequent data should agree
  • What is your hypothesis?
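Spotting the drop-off mentioned above can be mechanized: walk the throughput curve and report the first client count where throughput falls below its running maximum. A sketch (function name is my own):

```python
def find_dropoff(clients, throughput):
    """Return the client count at which measured throughput first falls
    below the best value seen so far (a possible bottleneck), or None
    if the curve never drops."""
    best = float("-inf")
    for c, t in zip(clients, throughput):
        if t < best:
            return c
        best = max(best, t)
    return None

# Usage with an illustrative curve that peaks at 10,000 clients:
print(find_dropoff([1000, 5000, 10000, 15000], [120, 510, 950, 900]))
```

A drop-off flagged here is the hypothesis to test next: app bottleneck, DB bottleneck, or a data mix-up.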

Analysis Huh? What Happened?
  • Uh oh! We have some inconsistent data in a run
  • Performance dropped and then went up again
  • Performance went up-down-up and stayed up
  • Either way, something is strange
  • Can you simulate this on a local system??
  • Check the data did it get mixed up?
  • Error in data collection? Test script? App? OE?
  • (Usually it's a data mixup)

Analysis The Report
  • Table of Contents
  • Goal
  • Key Criteria
  • Test Systems
  • Result Summary and Analysis
  • Lessons Learned

Here's What We Saw ...
RX8640-31JD dbsrv, external network 156.153.117.160, private network
Startup parameters: -B 3145728 -L 500000 -Mm 8192 -semsets 20 -minport 11000 -maxport 13000 -spin 50000 -bibufs 500 -Ma 100 -Mi 5

clients   -Mn
 1000      15
 5000      57
10000     108
15000     162
20000     222
25000     273
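The table pairs client counts with -Mn (the number of servers started). With -Ma 100 (max clients per remote server), a lower bound on -Mn is ceil(clients / 100); the measured values run consistently higher, leaving headroom. A sketch of that sanity check (assuming -Ma means max clients per server, per the OpenEdge startup parameters):

```python
import math

def min_servers(clients, ma=100):
    """Lower bound on -Mn given -Ma (max clients per remote server):
    at least ceil(clients / -Ma) servers are needed just to hold the
    connections; the measured runs used more, for headroom."""
    return math.ceil(clients / ma)

# Observed (clients, -Mn) pairs from the run above
observed = {1000: 15, 5000: 57, 10000: 108, 15000: 162, 20000: 222, 25000: 273}
for c, mn in observed.items():
    assert min_servers(c) <= mn  # every run exceeded the bare minimum
print(min_servers(25000))
```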
What's With the 30,154?
Analysis Lessons Learned
  • What changed from Initial Game Plan to Revised Plan?
  • What difficulties were there in lab setup?
  • What difficulties were detected at the lab vs. home office?
  • Were there data consistency issues? Why?
  • Were there difficulties in the analysis? What could help?
  • What was the primary value of the benchmark session?
  • Are the scripts, database, apps, and config files archived?
  • Should the next test set be run remotely or at the vendor lab?
Thank you for your time