Engineering Computation - PowerPoint PPT Presentation

About This Presentation
Title:

Engineering Computation

Description:

Surprisingly, during the Gauss elimination procedure, this matrix L is obtained, ... Gauss-Jordan (1 r.h.s) n3/2 O (n2) LU decomposition n3/3 O (n2) ... – PowerPoint PPT presentation

Number of Views:56
Avg rating:3.0/5.0
Slides: 23
Provided by: dptomatema
Category:

less

Transcript and Presenter's Notes

Title: Engineering Computation


1
Engineering Computation
  • Lecture 6

2
Gauss elimination (operation counting)
Forward Elimination

DO k = 1 to n-1
  DO i = k+1 to n
    r = A(i,k)/A(k,k)
    DO j = k+1 to n
      A(i,j) = A(i,j) - r*A(k,j)
    ENDDO
    B(i) = B(i) - r*B(k)
  ENDDO
ENDDO
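
A minimal, runnable Python version of this forward-elimination loop may help (the names A, B, r mirror the pseudocode; Python arrays are 0-based, so the loop bounds are shifted; the 3x3 system is only an illustration):

def forward_eliminate(A, B):
    """Forward elimination without pivoting, following the DO-loop structure above."""
    n = len(A)
    for k in range(n - 1):               # DO k = 1 to n-1
        for i in range(k + 1, n):        # DO i = k+1 to n
            r = A[i][k] / A[k][k]        # elimination factor
            for j in range(k + 1, n):    # DO j = k+1 to n
                A[i][j] -= r * A[k][j]
            B[i] -= r * B[k]             # update the right-hand side
    return A, B

A = [[4.0, -2.0, 1.0],
     [-2.0, 4.0, -2.0],
     [1.0, -2.0, 4.0]]
B = [11.0, -16.0, 17.0]
U, Bp = forward_eliminate(A, B)          # U is upper triangular in the part back substitution uses
print(U, Bp)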
3
Gauss elimination (operation counting)
Operation Counting for Gaussian Elimination

Back Substitution

X(n) = B(n)/A(n,n)
DO i = n-1 to 1 by -1
  SUM = 0
  DO j = i+1 to n
    SUM = SUM + A(i,j)*X(j)
  ENDDO
  X(i) = (B(i) - SUM)/A(i,i)
ENDDO
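
A matching Python sketch of the back-substitution loop, applied here to an already upper-triangular system (names X, SUM, A, B follow the pseudocode; the example system is illustrative only):

def back_substitute(A, B):
    """Back substitution for an upper triangular system A x = B."""
    n = len(A)
    X = [0.0] * n
    X[n - 1] = B[n - 1] / A[n - 1][n - 1]     # X(n) = B(n)/A(n,n)
    for i in range(n - 2, -1, -1):            # DO i = n-1 to 1 by -1
        SUM = 0.0
        for j in range(i + 1, n):             # DO j = i+1 to n
            SUM += A[i][j] * X[j]
        X[i] = (B[i] - SUM) / A[i][i]
    return X

U = [[2.0, 1.0, 1.0],
     [0.0, 3.0, 2.0],
     [0.0, 0.0, 4.0]]
b = [9.0, 13.0, 8.0]
print(back_substitute(U, b))   # [2.0, 3.0, 2.0]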
4
Gauss elimination (operation counting)
Operation Counting for Gaussian Elimination

Forward Elimination
Inner loop
Second loop:  (n² + 2n) - 2(n + 1)k + k²
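
A quick check of this count in Python (assuming, as in the lecture's counting, that the expression tallies the multiplications and divisions performed for a fixed k by the loops over i and j; the script itself is only an illustration):

def multdiv_at_step(n, k):
    """Multiplications/divisions actually performed at elimination step k (1-based)."""
    count = 0
    for i in range(k + 1, n + 1):   # DO i = k+1 to n
        count += 1                  # r = A(i,k)/A(k,k)
        count += (n - k)            # r*A(k,j) for j = k+1 to n
        count += 1                  # r*B(k)
    return count

n = 10
for k in range(1, n):
    assert multdiv_at_step(n, k) == (n*n + 2*n) - 2*(n + 1)*k + k*k
print("count matches the formula for n =", n)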
5
Gauss elimination (operation counting)
Operation Counting for Gaussian Elimination

Forward Elimination (cont'd)
Outer loop
6
Gauss elimination (operation counting)
Operation Counting for Gaussian Elimination

Back Substitution
Inner Loop
Outer Loop
7
Gauss elimination (operation counting)
Total flops

Forward Elimination:  n³/3 + O(n²)
Back Substitution:    n²/2 + O(n)
Total:                n³/3 + O(n²)

To convert (A, b) to (U, b') requires n³/3, plus terms of order n² and smaller, flops. To back solve requires 1 + 2 + 3 + 4 + ... + n = n(n+1)/2 flops. Grand Total: the entire effort requires n³/3 + O(n²) flops altogether.
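
A small sanity check of these leading terms (again counting multiplications/divisions per step as on the previous slides; illustrative script only):

n = 200
forward = sum((n - k) * (n - k + 2) for k in range(1, n))   # forward elimination
back = n * (n + 1) // 2                                     # back substitution
print(forward, n**3 / 3)   # forward elimination grows like n^3/3
print(back, n**2 / 2)      # back substitution grows like n^2/2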
8
Gauss-Jordan Elimination
Diagonalization by both forward and backward elimination in each column.
Perform elimination both backwards and forwards until the coefficient matrix is reduced to diagonal (identity) form.
The operation count for Gauss-Jordan is n³/2 + O(n²) (slower than Gauss elimination).
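
A compact Python sketch of that idea (no pivoting, illustration only): in each column, normalize the pivot row and eliminate both above and below it, so A becomes the identity and b becomes the solution.

def gauss_jordan(A, b):
    """Gauss-Jordan elimination: reduce A to the identity and b to the solution x."""
    n = len(A)
    for k in range(n):
        piv = A[k][k]
        A[k] = [v / piv for v in A[k]]            # normalize the pivot row
        b[k] /= piv
        for i in range(n):
            if i != k:                            # eliminate above AND below the pivot
                r = A[i][k]
                A[i] = [aij - r * akj for aij, akj in zip(A[i], A[k])]
                b[i] -= r * b[k]
    return b

A = [[2.0, 1.0], [1.0, 3.0]]
b = [3.0, 5.0]
print(gauss_jordan(A, b))   # approximately [0.8, 1.4]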
9
Gauss-Jordan Elimination
Example (two-digit arithmetic)
x1 = 0.015  (vs. 0.016, εt = 6.3)
x2 = 0.041  (vs. 0.041, εt = 0)
x3 = 0.093  (vs. 0.091, εt = 2.2)
10
Gauss-Jordan Matrix Inversion
The solution of Ax = b is x = A⁻¹b, where A⁻¹ is the inverse matrix of A.

Consider A A⁻¹ = I:
1) Create the augmented matrix [A | I]
2) Apply Gauss-Jordan elimination  →  [I | A⁻¹]
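
Steps 1) and 2) as a Python sketch (no pivoting, illustration only): augment A with the identity, run the same forward-and-backward elimination, and read A⁻¹ off the right-hand block.

def invert_gauss_jordan(A):
    """Invert A by Gauss-Jordan elimination on the augmented matrix [A | I]."""
    n = len(A)
    M = [row[:] + [1.0 if i == j else 0.0 for j in range(n)]   # build [A | I]
         for i, row in enumerate(A)]
    for k in range(n):
        piv = M[k][k]
        M[k] = [v / piv for v in M[k]]
        for i in range(n):
            if i != k:
                r = M[i][k]
                M[i] = [mij - r * mkj for mij, mkj in zip(M[i], M[k])]
    return [row[n:] for row in M]   # the right block is now A^-1

A = [[4.0, 7.0], [2.0, 6.0]]
print(invert_gauss_jordan(A))   # approximately [[0.6, -0.7], [-0.2, 0.4]]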
11
Gauss-Jordan Matrix Inversion
Gauss-Jordan Matrix Inversion (with 2-digit arithmetic)

MATRIX INVERSE A⁻¹
12
Gauss-Jordan Matrix Inversion
CHECK: A A⁻¹ = I

A⁻¹ b = x
Gaussian Elimination
13
LU decomposition
  • LU decomposition - The LU decomposition is a
    method that uses elimination techniques to
    transform the matrix A into a product of triangular
    matrices. This is especially useful for solving
    systems with different vectors b, because the
    same decomposition of A can be reused to solve
    each case efficiently by forward and backward
    substitution.
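
One way to see the payoff in practice is a sketch with SciPy's lu_factor / lu_solve (the matrix and right-hand sides are purely illustrative): factor A once, then solve for as many b vectors as needed at only O(n²) apiece.

import numpy as np
from scipy.linalg import lu_factor, lu_solve

A = np.array([[4.0, -2.0, 1.0],
              [-2.0, 4.0, -2.0],
              [1.0, -2.0, 4.0]])
lu, piv = lu_factor(A)                      # ~n^3/3 work, done once

for b in ([11.0, -16.0, 17.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]):
    x = lu_solve((lu, piv), np.array(b))    # ~n^2 work per right-hand side
    print(x)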

14
LU decomposition
15
LU decomposition
  • LU decomposition is closely related to the Gauss
    elimination method, because the upper triangular
    matrix U sought in the LU decomposition is exactly
    the one produced by Gauss elimination. Thus, only
    the lower triangular matrix L remains to be found.
  • Surprisingly, during the Gauss elimination
    procedure, this matrix L is obtained, but one is
    not aware of this fact. The factors we use to get
    zeros below the main diagonal are the elements
    of this matrix L.

Subtract
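
A short Python sketch of that observation (illustration only): run forward elimination but store each factor r; those factors are exactly the below-diagonal entries of a unit-lower-triangular L, and what remains is U, so A = LU.

def lu_from_gauss(A):
    """Gauss elimination that keeps its multipliers: returns L (unit diagonal) and U with A = L U."""
    n = len(A)
    U = [row[:] for row in A]
    L = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    for k in range(n - 1):
        for i in range(k + 1, n):
            r = U[i][k] / U[k][k]    # the factor used to zero U[i][k] ...
            L[i][k] = r              # ... is an entry of L
            for j in range(k, n):
                U[i][j] -= r * U[k][j]
    return L, U

A = [[4.0, -2.0, 1.0],
     [-2.0, 4.0, -2.0],
     [1.0, -2.0, 4.0]]
L, U = lu_from_gauss(A)              # multiplying L by U reproduces A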
16
LU decomposition
17
LU decomposition (Complexity)
Basic Approach: Consider Ax = b

a) Gauss-type "decomposition" of A into LU            n³/3 flops
   Ax = b becomes LUx = b; let Ux ≡ d
b) First solve Ld = b for d by forward substitution   n²/2 flops
c) Then solve Ux = d for x by back substitution       n²/2 flops
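
Steps b) and c) as a Python sketch (L is assumed to have a unit diagonal, as produced by the Gauss-type factorization; the L, U, b values below are illustrative, taken from the 3x3 example used earlier):

def solve_lu(L, U, b):
    """Solve L U x = b with two triangular sweeps, each O(n^2)."""
    n = len(b)
    d = [0.0] * n
    for i in range(n):                                   # forward substitution: L d = b
        d[i] = b[i] - sum(L[i][j] * d[j] for j in range(i))
    x = [0.0] * n
    for i in range(n - 1, -1, -1):                       # back substitution: U x = d
        s = sum(U[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (d[i] - s) / U[i][i]
    return x

L = [[1.0, 0.0, 0.0], [-0.5, 1.0, 0.0], [0.25, -0.5, 1.0]]
U = [[4.0, -2.0, 1.0], [0.0, 3.0, -1.5], [0.0, 0.0, 3.0]]
b = [11.0, -16.0, 17.0]
print(solve_lu(L, U, b))   # [1.0, -2.0, 3.0]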
18
LU Decomposition notation
A = L + U₀
A = L₀ + U
A = L₀ + U₀ + D
A = L₁ U
A = L U₁
19
LU decomposition
LU Decomposition Variations

  Doolittle   A = L₁ U    General A
  Crout       A = L U₁    General A
  Cholesky    A = L Lᵀ    Positive definite, symmetric A

Cholesky works only for positive definite symmetric matrices.

Doolittle versus Crout:
Doolittle just stores the Gaussian elimination factors, whereas Crout uses a different series of calculations (see CC 10.1.4).
Both decompose A into L and U in n³/3 FLOPS.
Different location of the diagonal of 1's.
Crout uses each element of A only once, so the same array can be used for A and L\U, saving computer memory!
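
For the positive definite symmetric case, a minimal Cholesky sketch (A = L·Lᵀ with L lower triangular; no positive-definiteness checks, illustration only):

import math

def cholesky(A):
    """Cholesky factorization A = L L^T for a symmetric positive definite A."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(A[i][i] - s)     # diagonal entry
            else:
                L[i][j] = (A[i][j] - s) / L[j][j]    # below-diagonal entry
    return L

A = [[4.0, 2.0], [2.0, 5.0]]
print(cholesky(A))   # [[2.0, 0.0], [1.0, 2.0]]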
20
LU decomposition
Matrix Inversion

Definition of a matrix inverse:  A A⁻¹ = I  →  Ax = b, A⁻¹b = x

First Rule: Don't do it.
(numerically unstable calculation)
21
LU decomposition
Matrix Inversion

If you really must:
1) Gaussian elimination:  [A | I]  →  [U | B']  →  A⁻¹
2) Gauss-Jordan:          [A | I]  →  [I | A⁻¹]

Inversion will take n³ + O(n²) flops if one is careful about where the zeros are (taking advantage of the sparseness of the matrix). Naive applications (without optimization) take 4n³/3 + O(n²) flops. For example, LU decomposition requires n³/3 + O(n²) flops. Back solving twice with the n unit vectors eᵢ takes 2·n·(n²/2) = n³ flops. Altogether: n³/3 + n³ = 4n³/3 + O(n²) flops.
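
That 2·n·(n²/2) term corresponds to this procedure, sketched here with SciPy (illustrative code, not from the lecture): factor A once, then solve with each unit vector eᵢ; the solutions are the columns of A⁻¹.

import numpy as np
from scipy.linalg import lu_factor, lu_solve

def inverse_via_lu(A):
    """Invert A by factoring once, then back solving with the n unit vectors e_i."""
    n = A.shape[0]
    lu, piv = lu_factor(A)                                             # ~n^3/3, done once
    cols = [lu_solve((lu, piv), np.eye(n)[:, i]) for i in range(n)]    # ~n^2 per column
    return np.column_stack(cols)

A = np.array([[4.0, 7.0], [2.0, 6.0]])
print(inverse_via_lu(A))   # approximately [[0.6, -0.7], [-0.2, 0.4]]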
22
FLOP Counts for Linear Algebraic Equations
Summary: FLOP Counts for Linear Algebraic Equations, Ax = b

  Gaussian Elimination (1 r.h.s.)          n³/3 + O(n²)
  Gauss-Jordan (1 r.h.s.)                  n³/2 + O(n²)
  LU decomposition                         n³/3 + O(n²)
  Each extra LU right-hand side            n²
  Cholesky decomposition (symmetric A)     n³/6 + O(n²)
  Inversion (naive Gauss-Jordan)           4n³/3 + O(n²)
  Inversion (optimal Gauss-Jordan)         n³ + O(n²)
  Solution by Cramer's Rule                n!