System Performance Measurement and Analysis with Web-based Presentation of Results - PowerPoint PPT Presentation

Learn more at: http://regions.cmg.org
1
System Performance Measurement and Analysis with
Web-based Presentation of Results
  • Phil Cannata
  • Sun Microsystems, Inc.

2
Overview
  • iPlanet Directory Server (iDS)
  • iDS Performance Engineering Group
  • Methodology
  • Goal is presentation of credible results
  • Data collection and analysis tools
  • Web-based presentation of results
  • Slide Show Demo

3
iPlanet Directory Server (iDS)
  • Read-mostly hierarchical DBMS for identity
    services, network resources and other
    enterprise-wide information resources
  • Scalable
  • Highly available
  • High performance compared to RDBMSs
  • Bottom-line performance is critical

4
iPlanet Directory Server (iDS)
[Architecture diagram: search and update requests arrive over LDAP at the iDS application, which holds entries in an entry cache backed by a SleepyCat DB cache (entries and indices). A cache fault / page fault falls through to the system cache (reads, flushes, copies) and then to the disks: database entries and indices, replication logs, and transaction logs, each potentially on RAID behind a fast write cache. Open questions annotate the diagram: algorithms? OS? memory size? threading? multiple CPUs? locking? indexing? network? WAN?]
5
iPlanet Directory Server (iDS)
6
iDS Performance Engineering Mission
  • Develop performance evaluations and
    characterizations of iPlanet Directory Products
  • Work with developers, to help resolve iDS
    performance issues
  • Work with deployment engineers, to develop tuning
    guides and customer configuration planning aids

7
Successes
  • Helped to keep iDS performance in leadership
    position
  • Located and found solutions for some major
    performance problems (see next page)
  • Helped develop credible/realistic statements
    about product performance
  • Gained enough knowledge to start working on
    discrete event simulation models of performance
    behavior

8
Some Specific Performance Problems
  • Uncovered performance problems in new indexing
    scheme
  • This scheme was removed from release
  • Discovered quadratic growth rate problem with new
    hashing scheme for Entry cache (see here)
  • Located problems with virtual attribute feature
    (see here)
  • Discovered transaction checkpointing bugs
  • Found replication performance on NT to be much
    lower (<<) than on Solaris

9
Methodology
  • Produce baseline performance analysis and
    characterization of directory products
  • independent
  • non-anecdotal
  • repeatable

10
Methodology
  • Independent results

QA and Release Engineering
Deployment Engineering
Project Management
Development
Development Santa Clara
Development Grenoble
Development Austin
Performance Engineering
11
Methodology
  • Non-anecdotal results

12
Methodology
  • Reproducible results
  • Each experiment uses an isolated network
  • Each experiment starts with a known copy of OS
    Level, iDS configuration and TCP/IP tuning
  • on Solaris, use Flash Archive plus rebuild
    filesystems and unarchive the database
  • on NT, use Imagecast plus unarchive the database
  • Problems were found when this was not done
  • 20% variability in results
  • files left on disk affected performance
  • also see here

13
Methodology
  • Use a common shell script to collect detailed
    data on system under test
  • operational data
  • cpu(s)
  • disk drives
  • memory
  • network
  • environmental data
  • hardware, file system
  • operating system, iDS configuration
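A minimal sketch of such a collection wrapper, assuming portable commands only (the file layout here is illustrative; the original Solaris script also sampled the tools noted in the comments):

```shell
#!/bin/sh
# Illustrative sketch of a common data-collection wrapper.
# Writes environmental and operational snapshots to one log per run.
OUT="collect_$(date +%Y%m%d%H%M%S).log"
{
  echo "== environmental data =="
  uname -a          # hardware and operating system
  df -k             # file systems
  echo "== operational data =="
  uptime            # load averages (CPU pressure)
  # On Solaris the real script would also sample, for example:
  #   iostat -x 5   # disk drives
  #   vmstat 5      # memory
  #   netstat -i 5  # network
} > "$OUT"
echo "wrote $OUT"
```

Keeping one timestamped log per run is what makes the data archivable and comparable across experiments later.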

14
Data Collection Tools
  • Standard test drivers
  • Common scripts to submit streams of requests to
    server(s)
  • add
  • delete
  • modify
  • search
  • Each script collects summary data
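A sketch of how each script's summary step might reduce per-request records to per-operation summaries; the log file name and format are hypothetical, not taken from the deck:

```shell
#!/bin/sh
# Hypothetical per-request log: operation type and latency in milliseconds.
cat > requests.log <<'EOF'
add 4.1
add 3.9
search 1.2
search 1.0
modify 5.5
EOF
# Summarize: request count and mean latency per operation type.
awk '{ n[$1]++; sum[$1] += $2 }
     END { for (op in n) printf "%s count=%d avg_ms=%.2f\n", op, n[op], sum[op]/n[op] }' \
    requests.log | sort > summary.txt
cat summary.txt
```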

15
Data Collection Tools
  • System utilities
  • Unix (Solaris)
  • Wham Software and Engineering
  • DRM (Distributed Resource Monitor)
  • dstat
  • System utilities (iostat, vmstat, prtmem, etc.)
  • Windows NT/2000
  • perfmon

see here
16
Typical Lab Configuration
17
Presentation of Results
  • Problems
  • Collect lots of data need means for storing,
    summarizing and viewing data summaries
  • Need graphical presentations
  • Results need to be viewable in many locations
  • Austin, Santa Clara, Fresno, Grenoble
  • Need to make comparisons between different
    systems, releases, parameter settings, etc.

18
Presentation of Results
[Chart: reads from the database disk for Tom's and Herb's runs ("Herb?"); blue = adds with no replication, red = adds with replication.]
19
Web-Based Solution
  • Graphical
  • Interactive
  • Geographically distributed users
  • Off-the-shelf solutions were available
  • Central repository of data, with browser-based
    interface via web server

20
Additional Issues
  • Now REALLY need a uniform way of collecting and
    archiving data
  • Compare performance of new releases with older
    releases
  • Evaluate impact of new features
  • Quantify improvements (or lack thereof)
  • Data will now be reusable

21
Additional Issues
  • Want to build up a legacy of data and
    interpretations
  • Want to be a source of expertise for performance
    analysis in iPlanet Directory Products

22
Methodology
  • Produce baseline performance analysis and
    characterization of directory products
  • independent
  • repeatable
  • non-anecdotal
  • uniform
  • reusable
  • legacy

23
Web Presentation Project
  • Had testing and data collection procedures in
    place (sort of)
  • Decided to use iDS to archive results for each
    experiment
  • summary record (can be queried)
  • Environmental data
  • Server data (WHAM)
  • Client data (WHAM)
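A minimal sketch of what a queryable summary record in the iDS results repository might look like in LDIF; the attribute names and DIT layout are hypothetical, not taken from the deck:

```ldif
# Hypothetical summary record for one experiment run
dn: cn=run-2001-12-04-01,ou=results,dc=perf,dc=example
objectClass: top
objectClass: extensibleObject
cn: run-2001-12-04-01
testType: add
replication: none
osLevel: Solaris 8
opsPerSecond: 1450
runDate: 20011204
```

Storing the summary as an LDAP entry means a standard search, e.g. `ldapsearch -b "ou=results,dc=perf,dc=example" "(testType=add)" opsPerSecond`, can pull comparable numbers across runs.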

24
Web Presentation Project
  • Graphing and Analysis package
  • Mathematica
  • Web presentation
  • WebMathematica

http://www.wolfram.com/products/webmathematica/
25
Typical Lab Configuration
26
Web Pages and Reports
27
Slide Show Demo
28
A Pretty Story: Web Page Development
29
A Pretty Story: Web Page Development
New Repository Entries
New Data Description
30
A Pretty Story: Web Page Development
Invalid Entry
New Repository Entries
New Data Description
31
A Pretty Story: Web Page Development
Mathematica Server Page
Web Server
32
A Not So Pretty Story: Web Page Development
Mathematica Server Page
Web Server
Mature Audiences Only
33
A Not So Pretty Story: Web Page Development
Mathematica Server Page
Web Server
[Diagram of the manual build flow: an awk script takes a web page template and the hrefs on the current page, gets and formats entries from the Results Repository that are not yet on the web page into a file of new entries, confirms that entries already on the web page are still in the Results Repository, saves the links, and builds the web page.]
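The href-extraction step of that awk script can be sketched as follows; the page content is a stand-in, and the original script's exact logic is not in the deck:

```shell
#!/bin/sh
# Stand-in for the current results page.
cat > page.html <<'EOF'
<a href="run1.html">run 1</a>
<a href="run2.html">run 2</a>
EOF
# Extract the hrefs already on the page, one per line,
# so new repository entries can be diffed against them.
awk 'BEGIN { RS = "<" }
     /^a [^>]*href=/ {
       match($0, /href="[^"]*"/)
       print substr($0, RSTART + 6, RLENGTH - 7)
     }' page.html > hrefs.txt
cat hrefs.txt
```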
34
A Little Nicer Story: Server Side Processing
35
A Little Nicer Story: Server Side Processing
[Diagram of the request path on UNIX: the client's request goes to the Apache web server and then to the Tomcat servlet container, each of which checks the requested page. For a dynamic .msp query-form page, Tomcat allocates a new thread and runs the webMathematica servlet service, which takes a Mathematica kernel from the kernel pool, parses the data, reads all data / DRM data from the Results Repository, generates the report, and adds HTTP headers. A static page is served as HTML directly.]
36
Performance Engineering Web Presentation
[Chart comparing Tom's and Herb's results ("Herb?").]
Start: Nov./Dec. 2001. First results: March 1, 2002. There was a need to retrofit old data, but it was not too bad.
37
First Results
38
Performance Engineering Web Presentation: March 1, 2002
Notice the repeatability of Tom's data
[Charts: Tom (Sept.) vs. Herb (Dec.); Tom (Sept.) vs. Tom (Dec.); Tom (Dec.) vs. Herb (Dec.), for system time, user time, and I/O wait time.]
39
Performance Engineering Web Presentation: Tracking Down the Problem
  • Difference in methodology (reboot; ldif files
    were different)
  • Dual network card causing an ftp transfer rate
    difference of 20%
  • Don't trust information from cfgadm about disk
    manufacturers' model numbers; trust the FRU
    number (March 14).

40
Performance Engineering Web Presentation: March 1, 2002
[Charts: Tom (Sept.) vs. Herb (Dec.); Tom (Sept.) vs. Tom (Dec.); Tom (Dec.) vs. Herb (Dec.), for Kbytes to disk and Kbytes from disk.]
41
Performance Engineering Web Presentation: Tracking Down the Problem
Nothing yet.
42
Performance Engineering Web Presentation: March 1, 2002
Once again, notice the repeatability
[Charts: Tom (Sept.) vs. Tom (Dec.), for incoming packets, incoming kBytes, outgoing packets, and outgoing kBytes.]
43
Performance Engineering Web Presentation: Tracking Down the Problem
Nothing yet.
44
Performance Engineering System
  • Data Collection
  • Data Storage
  • Data Analysis
  • Presentation of results

45
Future Directions
  • Need to simplify user interface
  • graphs on demand
  • better summary presentations
  • more automatic operation
  • Return live Mathematica notebooks
  • Need to integrate results from Windows systems,
    Linux systems, etc.

46
Summary
  • Have put in place a system for analyzing and
    characterizing performance of iDS
  • data driven
  • reproducible experimental results
  • archived data
  • Web-based, interactive presentation of graphical
    results

47
Summary
  • Have used this system (and its predecessors) to
    analyze and fix performance problems
  • Have quantified performance improvements in
    successive product releases
  • Have contributed to development of tuning guides
    and configuration planning aids

48
Summary
  • Have demonstrated value of consistent, ongoing,
    and rigorous performance evaluation project