1
Lecture 10 System of Equations
  • February 15, 2001
  • CVEN 302

2
Lectures Goals
  • Iterative Techniques
  • Jacobi method
  • Gauss-Seidel method
  • Relaxation technique

3
Iterative Techniques
  • Simultaneous linear algebraic equations can be
    solved directly by Gaussian elimination, but
    problems can arise from round-off errors and
    zeros on the diagonal.
  • One means of obtaining an approximate solution to
    the equations is to start from an educated guess
    and improve it iteratively.

4
Iterative Methods
  • We will look at three iterative methods
  • Jacobi Method
  • Gauss-Seidel Method
  • Successive over Relaxation (SOR)

5
Convergence Restrictions
  • There are two conditions for an iterative method
    to converge.
  • The necessary condition is that one coefficient
    in each equation is dominant.
  • The sufficient condition is that the matrix is
    diagonally dominant, as written out below.
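Written out, strict diagonal dominance means that in every row the magnitude of the diagonal coefficient exceeds the sum of the magnitudes of the remaining coefficients:

  |a_ii| > Σ_(j≠i) |a_ij|,   for every row i = 1, ..., n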

6
Jacobi Iteration
  • If the diagonal is dominant, the matrix can be
    rewritten in the following form
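With a dominant diagonal, each equation can be solved for the unknown on its own diagonal, giving the component form

  x_i = ( c_i - Σ_(j≠i) a_ij x_j ) / a_ii,   i = 1, ..., n

where c_i denotes the right-hand side of equation i.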

7
Jacobi Iteration
  • The technique can be rewritten in a shorthand
    fashion, where D is the diagonal, A is the
    matrix without the diagonal and c is the
    right-hand side of the equations.
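In that shorthand, the standard Jacobi update from iteration k to k+1 is

  x^(k+1) = D^(-1) ( c - A x^(k) )

where, as defined above, D is the diagonal and A holds only the off-diagonal coefficients.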

8
Jacobi Iteration
  • The technique solves for the entire set of x
    values at each iteration.
  • The values are not updated until the iteration is
    completed.

9
Example (Jacobi Iteration)
  • 4X1 + 2X2 = 2
  • 2X1 + 10X2 + 4X3 = 6
  • 4X2 + 5X3 = 5
  • Solution (X1, X2, X3) = (0.41379, 0.17241,
    0.86206)

10
Jacobi Example
  • Formulation of the matrix
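Written out from the equations on the previous slide, the matrix form of the system is

  [ 4   2   0 ] [X1]   [ 2 ]
  [ 2  10   4 ] [X2] = [ 6 ]
  [ 0   4   5 ] [X3]   [ 5 ]

and the Jacobi update solves row 1 for X1, row 2 for X2, and row 3 for X3 using only the values from the previous iteration.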

11
Jacobi Iteration
  • Iteration   1      2      3      4      5      6      7
  • X1          0.5    0.2    0.45   0.324  0.429  0.376  0.42
  • X2          0.6    0.1    0.352  0.142  0.248  0.16   0.204
  • X3          1      0.52   0.92   0.718  0.886  0.802  0.872

12
Jacobi Program
  • The computer program is set up to do the Jacobi
    method for any size square matrix
  • Jacobi(A,b)
  • The program has optional arguments for the maximum
    number of iterations, nmax, and the tolerance, tol.
  • Jacobi(A,b,nmax,tol)
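The program itself is not reproduced in this transcript; the following is a minimal Python sketch of such a routine, assuming a zero initial guess and an infinity-norm stopping test:

  import numpy as np

  def jacobi(A, b, nmax=100, tol=1e-6):
      # Jacobi iteration: solve each equation for its diagonal unknown,
      # using only the values from the previous iteration.
      A = np.asarray(A, dtype=float)
      b = np.asarray(b, dtype=float)
      x = np.zeros_like(b)             # initial guess (assumed zero)
      d = np.diag(A)                   # diagonal entries of A
      R = A - np.diagflat(d)           # A with its diagonal removed
      for _ in range(nmax):
          x_new = (b - R @ x) / d      # whole vector updated at once
          if np.linalg.norm(x_new - x, np.inf) < tol:
              return x_new
          x = x_new
      return x

  # Example system from slide 9:
  print(jacobi([[4, 2, 0], [2, 10, 4], [0, 4, 5]], [2, 6, 5]))
  # -> approximately (0.41379, 0.17241, 0.86206)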

13
Gauss-Seidel Iterations
  • The Gauss-Seidel (Seidel) technique is similar to
    the Jacobi iteration technique with one
    difference.
  • The method updates the results continuously,
    using each newly computed value as soon as it is
    available to accelerate convergence to a
    solution.

14
Gauss-Seidel Model
  • The Gauss-Seidel algorithm is given below.
  • The solution vector is updated after every term.
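In the same notation as the Jacobi update, the standard Gauss-Seidel sweep uses the components already updated in the current iteration:

  x_i^(k+1) = ( c_i - Σ_(j<i) a_ij x_j^(k+1) - Σ_(j>i) a_ij x_j^(k) ) / a_ii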

15
Example (Gauss-Seidel Iteration)
  • 4X1 + 2X2 = 2
  • 2X1 + 10X2 + 4X3 = 6
  • 4X2 + 5X3 = 5
  • Solution (X1, X2, X3) = (0.41379, 0.17241,
    0.86206)

16
Gauss-Seidel Example
  • Formulation of the matrix problem

17
Gauss-Seidel Iteration
  • Iteration   1      2      3      4      5      6      7
  • X1          0.5    0.25   0.345  0.384  0.401  0.408  0.411
  • X2          0.5    0.31   0.231  0.197  0.183  0.177  0.175
  • X3          0.6    0.75   0.815  0.842  0.854  0.858  0.858

18
Gauss-Seidel Model
  • The computer program is set up to do the
    Gauss-Seidel method for any size square matrix
  • Seidel(A,b)
  • The program has optional arguments for the maximum
    number of iterations, nmax, and the tolerance, tol.
  • Seidel(A,b,nmax,tol)
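As with the Jacobi program, the code itself is not shown in the transcript; a minimal Python sketch of such a routine (zero initial guess and infinity-norm test assumed) is:

  import numpy as np

  def seidel(A, b, nmax=100, tol=1e-6):
      # Gauss-Seidel iteration: each component is updated in place, so the
      # newest values are used immediately within the same sweep.
      A = np.asarray(A, dtype=float)
      b = np.asarray(b, dtype=float)
      n = len(b)
      x = np.zeros(n)                  # initial guess (assumed zero)
      for _ in range(nmax):
          x_old = x.copy()
          for i in range(n):
              s = A[i, :i] @ x[:i] + A[i, i+1:] @ x[i+1:]
              x[i] = (b[i] - s) / A[i, i]
          if np.linalg.norm(x - x_old, np.inf) < tol:
              break
      return x

  print(seidel([[4, 2, 0], [2, 10, 4], [0, 4, 5]], [2, 6, 5]))
  # -> approximately (0.41379, 0.17241, 0.86206)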

19
Example - Iteration with no diagonal dominance
  • 3X1 - 3X2 + 5X3 = 4
  • X1 + 2X2 - 6X3 = 3
  • 2X1 - X2 + 3X3 = 1
  • Solution (X1, X2, X3) = (1.00, -2.00, -1.00)

20
Using the GS algorithm
  • Use the Gauss-Seidel program with the following
    A matrix and b vector.
  • Solution (X1, X2, X3) = (1.00, -2.00, -1.00)
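Written out from the equations on the previous slide, the inputs are

  A = [ 3  -3   5 ]        b = [ 4 ]
      [ 1   2  -6 ]            [ 3 ]
      [ 2  -1   3 ]            [ 1 ]

No row of A is diagonally dominant, so convergence is not guaranteed.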

21
Using a Gauss-Seidel Iteration
  • Iteration   1      2       3       4       5       6       7
  • X1          1.33   2.283   3.425   2.057   2.765   1.474   2.023
  • X2          1.5    1.825   -0.243  -1.953  -5.622  -4.95   -7.915
  • X3          0.33   -0.16   -0.58   -2.031  -1.689  -3.384  -0.334

22
Successive over Relaxation
  • The technique is a modification of the
    Gauss-Seidel method with an additional parameter,
    w, that may accelerate the convergence of the
    iterations.
  • The weighting parameter, w, has two ranges: 0 < w
    < 1 and 1 < w < 2. If w = 1, the problem reduces
    to the Gauss-Seidel technique.

23
SOR Method
  • The SOR algorithm is defined as shown below.
  • The difference from Gauss-Seidel is the weighting
    parameter, w.
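In the same notation as before, the standard SOR update blends the old value of each component with its Gauss-Seidel value:

  x_i^(k+1) = (1 - w) x_i^(k)
              + w ( c_i - Σ_(j<i) a_ij x_j^(k+1) - Σ_(j>i) a_ij x_j^(k) ) / a_ii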

24
The Weighting Parameter
  • If the parameter w is less than 1, the residuals
    are under-relaxed.
  • If w = 1, the residuals are those of the
    Gauss-Seidel model.
  • If 1 < w < 2, the residuals are over-relaxed,
    which generally helps accelerate the convergence
    of the solution.

25
Example of SOR
  • 4X1 + 2X2 = 2
  • 2X1 + 10X2 + 4X3 = 6
  • 4X2 + 5X3 = 5
  • Solution (X1, X2, X3) = (0.41379, 0.17241,
    0.86206)

26
SOR Example
  • Formulation of the SOR Algorithm

27
SOR Example
28
Effects of w Parameter
  • Using the SOR program SOR(A,b,w,nmax,tol) with
    nmax = 50 and tol = 0.000001
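The SOR program itself is not reproduced in the transcript; a minimal Python sketch is given below, with a loop comparing a few values of w. The specific values 0.8, 1.0, and 1.2 are illustrative choices, not taken from the slide.

  import numpy as np

  def sor(A, b, w, nmax=50, tol=1e-6):
      # SOR iteration: blend the old value with the Gauss-Seidel update
      # using the weighting parameter w.
      A = np.asarray(A, dtype=float)
      b = np.asarray(b, dtype=float)
      n = len(b)
      x = np.zeros(n)                  # initial guess (assumed zero)
      for k in range(1, nmax + 1):
          x_old = x.copy()
          for i in range(n):
              s = A[i, :i] @ x[:i] + A[i, i+1:] @ x[i+1:]
              x[i] = (1 - w) * x[i] + w * (b[i] - s) / A[i, i]
          if np.linalg.norm(x - x_old, np.inf) < tol:
              return x, k
      return x, nmax

  A = [[4, 2, 0], [2, 10, 4], [0, 4, 5]]
  b = [2, 6, 5]
  for w in (0.8, 1.0, 1.2):            # under-relaxed, Gauss-Seidel, over-relaxed
      x, k = sor(A, b, w, nmax=50, tol=0.000001)
      print(f"w = {w}: converged in {k} iterations, x = {np.round(x, 5)}")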

29
Summary
  • Convergence conditions need to be met in order
    for iterative techniques to converge.
  • The Jacobi method updates the values only after
    each full iteration.
  • Gauss-Seidel updates the values continuously
    throughout each iteration.
  • SOR (Successive over Relaxation) uses the
    residuals to accelerate the convergence.