
Operation Count - Gauss Elimination

On early computers, multiplications and divisions (the "long" operations) took much longer than additions and subtractions, so only they were counted. In modern computers all floating-point operations are done in hardware and take about the same time, so the long operations are no more significant than the others; counting every operation gives an indication of the operational cost of Gaussian elimination.

Gaussian elimination is a method of solving a linear system Ax = b (here taken as n equations in n unknowns) by bringing the augmented matrix to upper triangular form. This elimination process is also called forward elimination. The n unknowns can then be found by back substitution.

In the first step, for each row Ri below the pivot row a multiplier mi1 = ai1 / a11 is computed, and then mi1 times row R1 is subtracted from row Ri. The zero that this creates below the pivot is not actually computed; it is simply set. So in step 1 the elimination requires, per row, 1 division (for the multiplier) and n multiplications and subtractions: one for each of the (n - 1) entries to the right of the pivot plus one for the right-hand side entry.
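The step described above can be sketched in Python (a minimal illustration assuming the augmented matrix [A | b] is stored as a list of lists, with no pivoting, so all pivots are assumed nonzero; the function name is illustrative):

```python
def forward_elimination(aug):
    """Reduce an n x (n+1) augmented matrix [A | b] to upper
    triangular form, in place.  No pivoting: pivots assumed nonzero."""
    n = len(aug)
    for k in range(n - 1):                 # elimination steps
        for i in range(k + 1, n):          # rows below the pivot row
            m = aug[i][k] / aug[k][k]      # 1 division: the multiplier
            aug[i][k] = 0.0                # the zero is set, not computed
            for j in range(k + 1, n + 1):  # entries right of pivot + RHS
                aug[i][j] -= m * aug[k][j]  # 1 multiplication + 1 subtraction
    return aug

# Example: a 3 x 3 system with solution x = (2, 3, -1)
aug = [[ 2.0,  1.0, -1.0,   8.0],
       [-3.0, -1.0,  2.0, -11.0],
       [-2.0,  1.0,  2.0,  -3.0]]
forward_elimination(aug)   # aug is now upper triangular
```

The inner loop makes clear where the counts in the next section come from: each row below the pivot costs one division plus one multiplication and one subtraction per remaining entry.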

The table below shows the count of each operation in step k, where (n - k) rows lie below the pivot and each of them has (n - k + 1) entries to update (including the right-hand side).

Operation        Count in step k
Division         (n - k)
Multiplication   (n - k)(n - k + 1)
Subtraction      (n - k)(n - k + 1)

Flop refers to a floating-point operation (a division, multiplication, subtraction, or addition).

Since (n - 1) steps are required, k goes from 1 to (n - 1), and thus the total number of operations in this forward elimination is

f(n) = Sum[k = 1 to n-1] (n - k) + 2 Sum[k = 1 to n-1] (n - k)(n - k + 1)

Substituting s = (n - k), the sums run over s = 1 to (n - 1):

f(n) = Sum[s = 1 to n-1] s + 2 Sum[s = 1 to n-1] s(s + 1) = n(n - 1)/2 + 2(n³ - n)/3 ≈ (2/3)n³

The approximation is obtained by dropping the lower powers of n. The function grows roughly in proportion to n cubed.
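The totals above can be checked numerically with a small sketch that simply tallies each division, multiplication, and subtraction of the forward elimination without doing any arithmetic on actual data (the function name is illustrative):

```python
def forward_elimination_flops(n):
    """Count the flops forward elimination performs on an n x n system."""
    divisions = multiplications = subtractions = 0
    for k in range(1, n):                      # steps k = 1, ..., n-1
        rows = n - k                           # rows below the pivot
        divisions += rows                      # one multiplier per row
        multiplications += rows * (n - k + 1)  # entries right of pivot + RHS
        subtractions += rows * (n - k + 1)
    return divisions + multiplications + subtractions

# For n = 10 this returns n(n-1)/2 + 2(n**3 - n)/3 = 45 + 660 = 705,
# already in the neighborhood of (2/3)n**3 ≈ 667.
```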
In the back substitution, finding the unknown xi takes (n - i) multiplications, (n - i) subtractions, and 1 division. Hence the number of operations in the back substitution is

b(n) = Sum[i = 1 to n] (2(n - i) + 1) = n(n - 1) + n = n²
This n² count grows much more slowly than the roughly (2/3)n³ operations of the forward elimination in the Gauss algorithm. Hence the cost of back substitution is negligible for large systems.
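The back substitution count can be verified the same way, with a sketch that solves an upper-triangular augmented system [U | c] and tallies its operations as it goes (assuming the same list-of-lists representation as above):

```python
def back_substitution(aug):
    """Solve an upper-triangular augmented system [U | c],
    returning the solution and the number of flops performed."""
    n = len(aug)
    x = [0.0] * n
    ops = 0
    for i in range(n - 1, -1, -1):   # last equation first, then backward
        s = aug[i][n]
        for j in range(i + 1, n):
            s -= aug[i][j] * x[j]    # 1 multiplication + 1 subtraction
            ops += 2
        x[i] = s / aug[i][i]         # 1 division
        ops += 1
    return x, ops

# The triangular system produced by the elimination example above:
U = [[2.0, 1.0, -1.0, 8.0],
     [0.0, 0.5,  0.5, 1.0],
     [0.0, 0.0, -1.0, 1.0]]
x, ops = back_substitution(U)   # x = [2.0, 3.0, -1.0], ops = 3**2 = 9
```

For every n the tally comes out to exactly n², matching b(n) above.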
