Department of Electrical and Computer Engineering

Transcript and Presenter's Notes
1
Multilevel Solver for Continuous Markov Chains
  • Daniel Chen
  • Xiaolan Zhang

2
Outline
  • Motivation
  • Introduction to multilevel solver
  • Gauss-Seidel algorithm (GS)
  • Horton-Leutenegger Multilevel algorithm (ML)
  • Our automatic aggregation algorithm (2AGG)
  • Implementation
  • Results
    • Lecture notes example
    • M/M/1/1000 queue example
    • Molloy's closed queuing network example
  • Conclusion

3
Motivation
  • Markov systems generated by modeling tools
    generally have a large number of states.
  • The steady-state distribution must be computed
    numerically (Power, Gauss-Seidel, SOR).
  • Many iterations are required to reach convergence:
    • when modeling a complex system,
    • when high precision is required,
    • when the transition rates have large variance.

4
The idea of iteration
The steady-state vector satisfies pi Q = 0, with the entries of pi
summing to 1. If we define a splitting Q = M - N (with M easy to
invert) and put pi(k+1) = pi(k) N M^-1, we find that a fixed point
satisfies pi (M - N) = pi Q = 0, so the iterates converge to the
steady-state distribution whenever the spectral radius of N M^-1 is
below 1.
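As a minimal numerical illustration of the splitting idea (not from the slides; the example chain and the choice of uniformization as the splitting are our own), the simplest iteration turns pi Q = 0 into a power iteration pi <- pi P:

```python
import numpy as np

# Toy 2-state continuous-time Markov chain (rates chosen for illustration).
Q = np.array([[-2.0,  2.0],
              [ 1.0, -1.0]])

# Uniformization: P = I + Q/Lambda is a stochastic matrix whenever
# Lambda >= max |q_ii|, and pi Q = 0 holds exactly when pi = pi P.
Lam = max(abs(np.diag(Q)))
P = np.eye(2) + Q / Lam

pi = np.array([0.5, 0.5])
for _ in range(200):       # fixed-point iteration pi <- pi P
    pi = pi @ P
pi /= pi.sum()             # renormalize to a probability distribution
print(pi)                  # ~ [1/3, 2/3]
```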
5
Gauss-Seidel algorithm
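The slide's figure is not reproduced in this transcript. As a hedged sketch of how Gauss-Seidel applies to pi Q = 0 (the function and example chain are our own), each sweep solves the j-th balance equation for pi_j, using updated entries as soon as they are available:

```python
import numpy as np

def gauss_seidel(Q, sweeps=100):
    """Toy GS sweeps for pi Q = 0 on a small generator matrix Q."""
    n = Q.shape[0]
    pi = np.full(n, 1.0 / n)
    for _ in range(sweeps):
        for j in range(n):
            # Balance equation: pi_j * (-q_jj) = sum_{i != j} pi_i q_ij.
            # Newly updated entries of pi are used immediately (the GS property).
            pi[j] = sum(pi[i] * Q[i, j] for i in range(n) if i != j) / -Q[j, j]
        pi /= pi.sum()     # keep pi a probability distribution
    return pi

Q = np.array([[-2.0,  2.0],
              [ 1.0, -1.0]])
print(gauss_seidel(Q))     # ~ [1/3, 2/3]
```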
6
GS Example
7
Multi-level algorithm
8
Multi-level Algorithm (MSS)
  • If the chain is small enough, solve it directly.
  • Otherwise, smooth by running GS.
  • Construct the upper-level steady-state distribution q.
  • Construct the upper-level rate transition matrix P.
  • Call MSS recursively with the upper-level q and P.
  • Correct the result in the lower level.
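The steps above can be sketched as a two-level special case. Everything below is a hypothetical reading of the Horton-Leutenegger scheme: the coarse generator is weighted by the smoothed distribution, and the correction on the way back down is multiplicative. The function names, group encoding, and sweep counts are our own assumptions, not from the slides.

```python
import numpy as np

def gs_sweeps(Q, pi, sweeps):
    """Gauss-Seidel smoothing for pi Q = 0 (see the GS slide)."""
    n = len(pi)
    for _ in range(sweeps):
        for j in range(n):
            pi[j] = sum(pi[i] * Q[i, j] for i in range(n) if i != j) / -Q[j, j]
    return pi / pi.sum()

def two_level(Q, groups, sweeps=5):
    """One coarse-grid cycle; groups[k] lists the fine states merged
    into coarse state k (a two-level sketch of the MSS recursion)."""
    n = Q.shape[0]
    pi = gs_sweeps(Q, np.full(n, 1.0 / n), sweeps)      # smoothing by GS
    m = len(groups)
    q = np.array([pi[g].sum() for g in groups])         # upper-level q
    P = np.zeros((m, m))                                # upper-level P
    for a, ga in enumerate(groups):
        for b, gb in enumerate(groups):
            if a != b:
                P[a, b] = sum(pi[i] * Q[i, j] for i in ga for j in gb) / q[a]
        P[a, a] = -P[a].sum()
    x = gs_sweeps(P, q.copy(), 200)                     # solve the upper level
    for a, ga in enumerate(groups):                     # lower-level correction
        pi[ga] *= x[a] / q[a]
    return gs_sweeps(Q, pi, sweeps)                     # post-smoothing

# M/M/1/3 birth-death chain with arrival rate 1 and service rate 2.
Q = np.array([[-1.0,  1.0,  0.0,  0.0],
              [ 2.0, -3.0,  1.0,  0.0],
              [ 0.0,  2.0, -3.0,  1.0],
              [ 0.0,  0.0,  2.0, -2.0]])
pi = two_level(Q, groups=[[0, 1], [2, 3]], sweeps=50)
print(pi)   # ~ [8/15, 4/15, 2/15, 1/15]
```

Note that when the fine-level pi has converged, the coarse distribution q is already the fixed point of the coarse chain, so the correction factors x[a]/q[a] approach 1; the cycle pays off when pi is only partially smoothed.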
9
ML Example
Create upper level matrix
10
States aggregation
  • Aggregation of states is essential to the success
    of the ML algorithm.
  • Optimal aggregation makes the most of GS
    smoothing at each level and reduces the number of
    levels (recursion depth in MSS).
  • Searching for the optimal strategy for grouping
    any number of states can be difficult.
  • We provide a heuristic aggregation criterion for
    grouping two states.

11
Two-state aggregation (2AGG)
  • An automatic aggregation tool that analyzes the
    rate transition matrix.
  • Groups states in units of two.
  • Starts with the states of maximum out-degree.
  • For each state, greedily searches for a mate
    among all available states. The mate should
    exhibit three properties relative to the given
    state:
    • strong connection,
    • similar magnitude of transition rate,
    • preferably a high transition rate.
  • Pairs are fixed one by one until the state space
    is exhausted.
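As a hedged sketch of this pairing procedure, a greedy matcher might score candidate mates on the three criteria. The particular score below (and the example chain) is our own illustrative assumption, not the weighting used in the actual 2AGG tool.

```python
import numpy as np

def two_agg(Q):
    """Greedy two-state pairing guided by the three 2AGG criteria:
    strong (mutual) connection, similar rate magnitude, high rate.
    The combined score is an illustrative assumption."""
    n = Q.shape[0]
    out_deg = [(row > 0).sum() for row in Q]
    order = sorted(range(n), key=lambda i: -out_deg[i])  # max out-degree first
    mate, free = {}, set(range(n))
    for i in order:
        if i not in free:
            continue
        best, best_score = None, 0.0
        for j in sorted(free - {i}):                 # deterministic tie-break
            hi, lo = max(Q[i, j], Q[j, i]), min(Q[i, j], Q[j, i])
            if hi <= 0:
                continue                             # no connection at all
            mutual = 1.0 if lo > 0 else 0.5          # strong two-way link?
            similar = lo / hi if lo > 0 else 0.5     # magnitude similarity
            score = mutual * similar * hi            # also prefer high rates
            if score > best_score:
                best, best_score = j, score
        if best is not None:
            mate[i], mate[best] = best, i
            free -= {i, best}
        else:
            free.discard(i)                          # leave i unpaired

    return mate

# Birth-death chain: the heuristic pairs neighbouring states.
Q = np.array([[-1.0,  1.0,  0.0,  0.0],
              [ 2.0, -3.0,  1.0,  0.0],
              [ 0.0,  2.0, -3.0,  1.0],
              [ 0.0,  0.0,  2.0, -2.0]])
print(two_agg(Q))   # pairs 0 with 1 and 2 with 3
```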

12
Mutual communication
Similar magnitude
Maximum rate
13
Implementation
  • A linked adjacency list saves space but penalizes
    ML speed; the matrix search function has been
    accelerated for ML.
  • The GS algorithm.
  • The ML algorithm with neighboring-state
    aggregation at all levels.
  • The 2AGG algorithm.
  • A pre-processing function that permutes each
    state next to its mate, providing better
    aggregation.
  • Pre-processing is applied only at the first
    level; further improvement is expected if it is
    applied at all levels.

14
Comparison metrics and results
  • A fair comparison between GS and ML should
    reflect the implementation potential of ML.
  • Three metrics:
    • Total number of GS iterations.
      For ML: number of MSS iterations x number of
      GS calls per iteration x GS iterations per
      call (v). This disadvantages ML, because the
      matrix being smoothed is smaller at upper
      levels.
    • Number of floating-point operations (nflop).
      Counts multiplications, divisions, and costlier
      operations. Rough, but reflects the
      computational complexity of ML without memory
      overheads.
    • CPU processing time.
      The fairest comparison for a particular
      implementation, but gives little insight into
      overhead improvements.

15
Lecture notes example
  • Shows the advantage of ML on a Markov chain with
    obvious bottleneck rate transitions.
  • Shows the importance of aggregation choices.

16
Lecture notes example
17
Lecture notes example
18
Lecture notes example
19
M/M/1/1000 queue
  • A birth-death chain with asymmetric rate flows.
  • The 2AGG tool is not applicable.
  • Shows how the death rate μ affects performance.
  • Helps understand the different performance
    metrics.
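For reference, the M/M/1/K queue has a well-known closed-form steady state against which any of the solvers can be checked (a standard result, not taken from the slides): with ρ = λ/μ ≠ 1, π_k = ρ^k (1 − ρ) / (1 − ρ^{K+1}) for k = 0, …, K.

```python
import numpy as np

def mm1k_steady_state(lam, mu, K):
    """Closed-form steady state of the M/M/1/K queue (requires rho != 1)."""
    rho = lam / mu
    k = np.arange(K + 1)
    return rho ** k * (1 - rho) / (1 - rho ** (K + 1))

pi = mm1k_steady_state(lam=1.0, mu=2.0, K=1000)
print(pi[:3])   # geometric decay: each probability is half the previous
```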

20
M/M/1/1000 queue
21
M/M/1/1000 queue
22
M/M/1/1000 queue
23
Molloy's closed queuing network
  • Shows how growth of the state space affects
    performance.
  • The number of states is varied via the number of
    tokens initially placed at place P.
  • ML-p is the version integrated with 2AGG at the
    first level.

24
Molloy's closed queuing network
25
Molloy's closed queuing network
26
Conclusion and future work
  • The aggregation strategy is very important.
  • Results are largely case-dependent.
  • The overhead cost of ML impairs overall
    performance.
  • The challenge lies in the demand for intelligent
    tools to improve ML and in controlling overhead
    costs.
  • Future work:
    • Determine the optimal number of GS smoothing
      sweeps at each level.
    • Extend the aggregation tool to arbitrary state
      grouping at all levels.
    • Improve the Q-matrix data structure to reduce
      memory-access overheads.

27
References
  • G. Horton and S. Leutenegger, "A Multi-Level
    Solution Algorithm for Steady-State Markov
    Chains."
  • W. H. Sanders, Lecture Notes for ECE/CS 541 and
    CSE 524, Computer System Analysis.
  • PERFORM group, Möbius User Manual.

28
Multilevel Solver for Continuous Markov Chains
  • Daniel Chen
  • Xiaolan Zhang

29
Lecture notes example
30
Lecture notes example
31
Molloy's closed queuing network