A more reliable reduction algorithm for behavioral model extraction
(transcript of a PowerPoint presentation)
1
A more reliable reduction algorithm for
behavioral model extraction
  • Dmitry Vasilyev, Jacob White
  • Massachusetts Institute of Technology

2
Outline
  • Background
  • Projection framework for model reduction
  • Balanced Truncation algorithm and approximations
  • AISIAD algorithm
  • Description of the proposed algorithm
  • Modified AISIAD and a low-rank square root
    algorithm
  • Efficiency and accuracy
  • Conclusions

3
Model reduction problem
[Diagram: a system with many (> 10^4) internal states is reduced to one with few (< 100) internal states, with the same inputs and outputs]
  • Reduction should be automatic
  • Must preserve input-output properties

4
Differential Equation Model

  E dx/dt = A x + B u,   y = C x + D u

x - state vector
A - stable, n x n (large); E - SPD, n x n
u - vector of inputs
y - vector of outputs
  • The model can represent:
  • Finite-difference spatial discretizations of PDEs
  • Circuits with linear elements

5
Model reduction problem
n is large (thousands!), q is small (tens).
The reduction must be automatic and must preserve
input-output properties (the transfer function)
6
Approximation error
  • For wide-band applications, the model should have a small
    worst-case error:

    ||H - Hr||_inf = max over all frequencies w of ||H(jw) - Hr(jw)||

    i.e. the maximal difference over all frequencies (the H-infinity norm)
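To make the worst-case criterion concrete, here is a small numpy sketch (function and variable names are my own, not from the talk) that lower-bounds the H-infinity error by sampling the transfer-function mismatch on a frequency grid:

```python
import numpy as np

def worst_case_error(A, B, C, Ar, Br, Cr, freqs):
    """Sampled estimate of ||H - Hr||_inf: the largest 2-norm of
    H(jw) - Hr(jw) over a grid of frequencies w (a lower bound on
    the true H-infinity norm; assumes equal feedthrough D = Dr,
    which cancels in the difference)."""
    n, q = A.shape[0], Ar.shape[0]
    err = 0.0
    for w in freqs:
        H = C @ np.linalg.solve(1j * w * np.eye(n) - A, B)
        Hr = Cr @ np.linalg.solve(1j * w * np.eye(q) - Ar, Br)
        err = max(err, np.linalg.norm(H - Hr, 2))
    return err
```

A dense sweep like this is only a diagnostic; the true H-infinity norm needs a bisection method, but a fine grid is usually an adequate estimate.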
7
Projection framework for model reduction
  • Pick biorthogonal projection matrices W and V (W^T V = I)
  • The projection bases are the columns of V and W

State approximation: x ~ V xr, where x has length n and xr has length q
Projected equations: Ax -> W^T A V xr

Most reduction methods are based on projection
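In code, the projection step is just a few matrix products; a minimal sketch (my own naming, assuming W^T V is invertible so the pair can be rescaled to biorthogonality):

```python
import numpy as np

def project(A, B, C, D, V, W):
    """Reduce (A, B, C, D) with projection bases V, W.
    W is rescaled so that W^T V = I (biorthogonality), then the
    reduced model is (W^T A V, W^T B, C V, D)."""
    W = W @ np.linalg.inv(W.T @ V).T   # enforce W^T V = I
    return W.T @ A @ V, W.T @ B, C @ V, D
```

All the algorithms below differ only in how they choose the columns of V and W.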
8
Projection should preserve important modes
[Diagram: input u(t) drives an LTI system with state x(t), producing output y(t)]
P (controllability): which modes are easier to reach?
Q (observability): which modes produce more output?
  • The reduced model retains the most controllable and most
    observable modes
  • A mode must be both very controllable and very observable

9
Balanced truncation reduction (TBR)
Compute the controllability and observability gramians P and Q (O(n^3)):
  A P + P A^T + B B^T = 0
  A^T Q + Q A + C^T C = 0
The reduced model keeps the dominant eigenspaces of the product PQ (O(n^3)):
  PQ v_i = lambda_i v_i,   w_i^T PQ = lambda_i w_i^T
Reduced system: (W^T A V, W^T B, C V, D)
Very expensive: P and Q are dense even for sparse models
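For reference, a dense O(n^3) sketch of balanced truncation in its square-root form (an equivalent way of extracting the dominant eigenspaces of PQ; assumes a stable, minimal system so that the gramians admit Cholesky factors — a sketch, not the paper's algorithm):

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov, cholesky, svd

def tbr(A, B, C, q):
    """Balanced truncation via the square-root method (dense,
    O(n^3) reference implementation)."""
    # Controllability gramian: A P + P A^T + B B^T = 0
    P = solve_continuous_lyapunov(A, -B @ B.T)
    # Observability gramian:   A^T Q + Q A + C^T C = 0
    Q = solve_continuous_lyapunov(A.T, -C.T @ C)
    R = cholesky(P, lower=True)        # P = R R^T
    L = cholesky(Q, lower=True)        # Q = L L^T
    U, s, Vt = svd(L.T @ R)            # s holds the Hankel singular values
    # Biorthogonal projection bases (W^T V = I by construction)
    V = R @ Vt[:q].T / np.sqrt(s[:q])
    W = L @ U[:, :q] / np.sqrt(s[:q])
    return W.T @ A @ V, W.T @ B, C @ V
```

The two Lyapunov solves and the SVD are exactly the O(n^3) bottleneck the slides set out to avoid.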
10
Most reduction algorithms effectively approximate the dominant
eigenspaces of P and Q separately
  • Arnoldi [Grimme 97]: V = colsp{A^-1 B, A^-2 B, ...},
    W = V; approximates Pdom only
  • Pade via Lanczos [Feldmann and Freund 95]:
    colsp(V) = {A^-1 B, A^-2 B, ...} - approximates Pdom
    colsp(W) = {A^-T C^T, (A^-T)^2 C^T, ...} - approximates Qdom
  • Frequency-domain POD [Willcox 02], Poor Man's TBR [Phillips 04]:
    colsp(V) = {(jw_1 I - A)^-1 B, (jw_2 I - A)^-1 B, ...} - approximates Pdom
    colsp(W) = {(jw_1 I - A)^-T C^T, (jw_2 I - A)^-T C^T, ...} - approximates Qdom
However, what matters is the product PQ
11
RC line (symmetric circuit)
V(t) - input, i(t) - output
  • Symmetric system: P = Q, so all controllable states are
    observable and vice versa

12
RLC line (nonsymmetric circuit)
Vector of states
  • P and Q are no longer equal!
  • By keeping only the most controllable and/or only the most
    observable states, we may not find the dominant eigenvectors
    of PQ

13
Lightly damped RLC circuit
R = 0.008, L = 10^-5, C = 10^-6, N = 100
  • Exact low-rank approximations of P and Q of order < 50
    lead to PQ ~ 0!!

14
Lightly damped RLC circuit
[Plots: top 5 eigenvectors of Q; top 5 eigenvectors of P]
The union of the eigenspaces of P and Q does not necessarily
approximate the dominant eigenspace of PQ.
15
AISIAD model reduction algorithm
Idea of the AISIAD approximation: approximate the dominant
eigenvectors of PQ using power iterations
  V_{i+1} = qr((PQ) V_i)
V_i converges to the dominant eigenvectors of PQ.
We need to find the product (PQ) V_i - but how?
16
AISIAD algorithm: approximation of the product V_{i+1} = qr(PQ V_i)
The iteration is split into two half-steps:
  W_i = qr(Q V_i)      - Q V_i approximated via the solution of a Sylvester equation
  V_{i+1} = qr(P W_i)  - P W_i approximated via the solution of a Sylvester equation
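The two half-steps can be sketched directly (here with exact dense gramians, to show what the iteration converges to; the actual AISIAD replaces the products Q V_i and P W_i by approximate Sylvester solves):

```python
import numpy as np

def aisiad_iterate(P, Q, V0, iters=50):
    """Power iteration V_{i+1} = qr(P Q V_i), split into the two
    half-steps W_i = qr(Q V_i), V_{i+1} = qr(P W_i).  Uses exact
    gramians P, Q only for illustration."""
    V = V0
    for _ in range(iters):
        W, _ = np.linalg.qr(Q @ V)   # observability half-step
        V, _ = np.linalg.qr(P @ W)   # controllability half-step
    return V, W
```

On a diagonal toy example the iteration locks onto the dominant eigenvector of PQ after a few sweeps, even when it dominates neither P nor Q alone.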
17
More detailed view of the AISIAD approximation
Right-multiply the controllability equation by W_i (original AISIAD):
  (A P + P A^T + B B^T) W_i = 0
  => A X + X H + M = 0, where X = P W_i,
     H = W_i^T A^T W_i (q x q), M = P (I - W_i W_i^T) A^T W_i + B B^T W_i (n x q)
18
Modified AISIAD approximation
Right-multiply the observability equation by V_i:
  (A^T Q + Q A + C^T C) V_i = 0
  => A^T Y + Y F + N = 0, where Y = Q V_i,
     F = V_i^T A V_i (q x q), N = Q (I - V_i V_i^T) A V_i + C^T C V_i (n x q)
The gramian inside N is approximated by a low-rank estimate!
We can take advantage of numerous methods, which
approximate P and Q!
20
Specialized Sylvester equation
  A X + X H = -M
with A: n x n, H: q x q, and X, M: n x q.
We need only the column span of X
21
Solving the Sylvester equation
Schur decomposition of the small matrix H: H = U R U^H (R upper triangular).
Substituting X~ = X U and M~ = M U gives A X~ + X~ R + M~ = 0,
which is solved for the columns of X~ one at a time:
  (A + r_jj I) x~_j = -( m~_j + sum_{k<j} r_kj x~_k )
22
Solving the Sylvester equation (via the Schur decomposition of H)
  • Applicable to any stable A
  • Requires solving q shifted linear systems
The solution can be accelerated via fast matrix-vector products.
Another method exists, based on IRA; it requires A > 0 [Zhou 02]
23
Solving the Sylvester equation (via the Schur decomposition of H)
For SISO systems and P = 0, the scheme is equivalent to moment
matching at the frequency points lambda(W^T A W)
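A dense sketch of this column-by-column scheme (using the complex Schur form of H; in the real algorithm each shifted solve (A + r_jj I) x = b would use a sparse factorization rather than a dense one):

```python
import numpy as np
from scipy.linalg import schur

def sylvester_columns(A, H, M):
    """Solve A X + X H + M = 0 for the n x q matrix X via a Schur
    decomposition of the small q x q matrix H followed by q
    shifted solves with A."""
    # Complex Schur form handles real H with complex eigenvalues
    R, U = schur(H, output='complex')       # H = U R U^H, R upper triangular
    Mt = M.astype(complex) @ U
    n, q = M.shape
    Xt = np.zeros((n, q), dtype=complex)
    for j in range(q):
        # (A + r_jj I) x~_j = -( m~_j + sum_{k<j} r_kj x~_k )
        rhs = -(Mt[:, j] + Xt[:, :j] @ R[:j, j])
        Xt[:, j] = np.linalg.solve(A + R[j, j] * np.eye(n), rhs)
    X = Xt @ U.conj().T                     # undo the change of basis
    return X.real if np.isrealobj(A) and np.isrealobj(H) else X
```

Only solves with shifted copies of A appear, which is what makes the per-iteration cost of the method comparable to moment-matching reduction.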
24
Modified AISIAD algorithm

  1. Obtain low-rank approximations of P and Q (the LR-sqrt step);
     P, Q below stand for these approximations
  2. Solve A X_i + X_i H + M = 0  =>  X_i ~ P W_i, where
     H = W_i^T A^T W_i, M = P (I - W_i W_i^T) A^T W_i + B B^T W_i
  3. Perform a QR decomposition X_i = V_i R
  4. Solve A^T Y_i + Y_i F + N = 0  =>  Y_i ~ Q V_i, where
     F = V_i^T A V_i, N = Q (I - V_i V_i^T) A V_i + C^T C V_i
  5. Perform a QR decomposition Y_i = W_{i+1} R to get the new iterate
  6. Go to step 2 and iterate
  7. Bi-orthogonalize W and V and construct the reduced model
     (W^T A V, W^T B, C V, D)
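One sweep of steps 2-5 can be sketched with dense solves (the names `Phat`/`Qhat` are my own for the low-rank gramian estimates from step 1; SciPy's generic `solve_sylvester` stands in for the Schur/shifted-solve scheme of the previous slides):

```python
import numpy as np
from scipy.linalg import solve_sylvester, qr

def maisiad_step(A, B, C, Phat, Qhat, W):
    """One sweep of the modified AISIAD iteration (steps 2-5),
    given gramian approximations Phat ~ P, Qhat ~ Q and the
    current orthonormal iterate W (n x q)."""
    n, q = W.shape
    # Step 2: A X + X H + M = 0 with H = W^T A^T W,
    #         M = Phat (I - W W^T) A^T W + B B^T W   =>  X ~ P W
    H = W.T @ A.T @ W
    M = Phat @ (np.eye(n) - W @ W.T) @ A.T @ W + B @ B.T @ W
    X = solve_sylvester(A, H, -M)
    # Step 3: orthonormalize
    V, _ = qr(X, mode='economic')
    # Step 4: A^T Y + Y F + N = 0 with F = V^T A V,
    #         N = Qhat (I - V V^T) A V + C^T C V     =>  Y ~ Q V
    F = V.T @ A @ V
    N = Qhat @ (np.eye(n) - V @ V.T) @ A @ V + C.T @ C @ V
    Y = solve_sylvester(A.T, F, -N)
    # Step 5: orthonormalize to get the next iterate
    Wnext, _ = qr(Y, mode='economic')
    return V, Wnext
```

A useful sanity check: when `Phat` is the exact gramian P, step 2 recovers X = P W exactly, since A(PW) + PW(W^T A^T W) + P(I - W W^T)A^T W + B B^T W = (AP + PA^T + BB^T)W = 0.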
25
For systems in the descriptor form (E dx/dt = A x + B u),
the generalized Lyapunov equations
  A P E^T + E P A^T + B B^T = 0
  A^T Q E + E^T Q A + C^T C = 0
lead to similar approximate power iterations
26
mAISIAD and low-rank square root
[Flowchart: low-rank gramian approximations (cost varies) feed
either the LR-square-root construction (inexpensive step) or the
mAISIAD iteration (more expensive)]
For the majority of non-symmetric cases, mAISIAD works better
than the low-rank square root
27
RLC line example results
H-infinity norm of the reduction error (worst-case discrepancy
over all frequencies); N = 1000, 1 input, 2 outputs
28
Steel rail cooling profile benchmark
Taken from the Oberwolfach benchmark collection:
N = 1357, 7 inputs, 6 outputs
29
mAISIAD is useless for symmetric models
For symmetric systems (A = A^T, B = C^T), P = Q; therefore
mAISIAD is equivalent to LR-sqrt for P, Q of order q

RC line example
30
Cost of the algorithm
  • The cost of the algorithm is directly proportional to the cost
    of solving a shifted linear system, where s_jj is a complex number:
    (A + s_jj I) x = b   (non-descriptor case)
    (A + s_jj E) x = b   (descriptor case)
  • The cost does not depend on the number of inputs and outputs
31
Conclusions
  • The algorithm has superior accuracy and extended applicability
    compared with the original AISIAD method
  • A very promising low-cost approximation to TBR
  • Applicable to any stable dynamical system; it will work
    (though usually worse) even without low-rank gramians
  • Passivity and stability preservation are possible via
    post-processing
  • Not beneficial if the model is symmetric