
Linear Equations - Iterative Solution

An iterative (or indirect) method starts from an initial approximation to the true solution and, if successful, obtains better and better approximations by repeating a computational cycle as often as necessary to achieve the required accuracy.

Iterative methods are preferred when the coefficient matrix has large main-diagonal entries (diagonal dominance) or when the system is large and sparse (has many zero coefficients).

Gauss–Seidel Iteration Method

Assume each equation has been divided by its diagonal coefficient, so that every main-diagonal entry of A equals 1. The coefficient matrix can then be split as

A = I + L + U

where

I - n x n unit matrix

L - strictly lower triangular matrix (zero main diagonal)

U - strictly upper triangular matrix (zero main diagonal)

Substituting this splitting into Ax = b gives

(I + L + U)x = b

x = b - Lx - Ux
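
As a concrete illustration, the splitting can be carried out with NumPy. This is only a sketch: the 3 x 3 matrix A and right-hand side b below are made up for the example, and each row is first divided by its diagonal entry so that the diagonal of the scaled matrix is 1, as assumed above.

```python
import numpy as np

# Hypothetical 3x3 system, used only to illustrate the splitting.
A = np.array([[ 4.0, -1.0, 1.0],
              [-2.0,  6.0, 1.0],
              [ 1.0,  1.0, 5.0]])
b = np.array([7.0, 9.0, -6.0])

# Divide each row by its diagonal entry so the main diagonal becomes 1.
d = np.diag(A)
A1 = A / d[:, None]
b1 = b / d

I = np.eye(3)
L = np.tril(A1, k=-1)   # strictly lower triangular part (zero main diagonal)
U = np.triu(A1, k=1)    # strictly upper triangular part (zero main diagonal)

# A1 = I + L + U, so A1 x = b1 is equivalent to the fixed-point form x = b1 - Lx - Ux.
assert np.allclose(A1, I + L + U)
```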

Each component of the approximation is replaced by the corresponding new approximation as soon as the latter has been computed:

x(m+1) = b - Lx(m+1) - Ux(m)
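
A minimal sketch of this update in Python follows (the function name gauss_seidel, the tolerance, and the iteration limit are illustrative choices, not from the post). It works on the original, unscaled A by dividing by the diagonal entry inside the loop, which is equivalent to the unit-diagonal form above; components already updated in the current sweep are used immediately.

```python
import numpy as np

def gauss_seidel(A, b, x0=None, tol=1e-10, max_iter=500):
    """Gauss-Seidel iteration for Ax = b; A must have nonzero diagonal entries."""
    n = len(b)
    x = np.zeros(n) if x0 is None else np.array(x0, dtype=float)
    for _ in range(max_iter):
        x_old = x.copy()
        for i in range(n):
            # x[:i] already holds the new values x(m+1); x_old[i+1:] holds x(m).
            s = A[i, :i] @ x[:i] + A[i, i+1:] @ x_old[i+1:]
            x[i] = (b[i] - s) / A[i, i]
        if np.linalg.norm(x - x_old, np.inf) < tol:
            break
    return x
```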

A sufficient condition for convergence is

||C|| < 1,   where C = -(I + L)^(-1) U is the iteration matrix (obtained by solving the iteration formula for x(m+1)) and ||C|| is any matrix norm.
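
This check can be sketched in a few lines of NumPy (assuming a matrix A1 already scaled to unit diagonal, as in the splitting example above; the function name is illustrative):

```python
import numpy as np

def gauss_seidel_iteration_norm(A1):
    """Infinity norm of C = -(I + L)^(-1) U for a matrix A1 with unit diagonal."""
    n = A1.shape[0]
    L = np.tril(A1, k=-1)
    U = np.triu(A1, k=1)
    C = -np.linalg.solve(np.eye(n) + L, U)   # C = -(I + L)^(-1) U
    return np.linalg.norm(C, np.inf)

# A value below 1 guarantees convergence; the condition is sufficient, not necessary.
```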

The Gauss–Seidel iteration belongs to a class of methods often called relaxation methods. The method is also referred to as the method of successive corrections.

Jacobi Iteration Method

This method is referred to as the method of simultaneous corrections because no component of an approximation is used until all the components have been computed. With the main-diagonal entries of A again scaled to 1, the iteration is

x(m+1) = b + (I - A)x(m)
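
A minimal Jacobi sketch under the same unit-diagonal assumption (names and defaults are illustrative): all components of the new approximation are computed from the old one before any of them is used.

```python
import numpy as np

def jacobi(A1, b1, x0=None, tol=1e-10, max_iter=500):
    """Jacobi iteration x(m+1) = b + (I - A)x(m); assumes A1 has unit diagonal."""
    n = len(b1)
    x = np.zeros(n) if x0 is None else np.array(x0, dtype=float)
    M = np.eye(n) - A1                  # iteration matrix I - A
    for _ in range(max_iter):
        x_new = b1 + M @ x              # no new component is used until all are computed
        if np.linalg.norm(x_new - x, np.inf) < tol:
            return x_new
        x = x_new
    return x
```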

This method converges for every choice of the initial approximation x(0) if and only if the spectral radius of I - A is less than 1.
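
The spectral radius can be estimated numerically, for example as sketched below (again for a unit-diagonal A1; an eigenvalue computation is fine for small illustrative systems, though it would defeat the purpose for the large sparse systems that iterative methods target).

```python
import numpy as np

def jacobi_converges(A1):
    """True if the spectral radius of I - A1 is below 1 (A1 with unit diagonal)."""
    rho = max(abs(np.linalg.eigvals(np.eye(A1.shape[0]) - A1)))
    return rho < 1
```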
