Math Question Message #1 Posted by Namir on 6 Nov 2012, 7:43 a.m.
I am looking at the Jacobi method, which is the simplest (and slowest) iterative method for solving large systems of linear equations. Such systems are often too large for traditional direct methods to handle. I am solving:
A x = b
The Jacobi method uses the following matrix form for its iterations:
D x(k+1) = b – (L + U) x(k)
Where D, L, and U are the diagonal, strictly lower triangular, and strictly upper triangular parts of A, respectively. The algorithm is guaranteed to converge, from any initial guess for x, if matrix A is strictly diagonally dominant.
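To make the iteration concrete, here is a quick numpy sketch of the plain Jacobi loop (the function name, tolerance, and iteration cap are arbitrary choices of mine, not part of the method):

    import numpy as np

    def jacobi(A, b, x0, tol=1e-10, max_iter=500):
        # Plain Jacobi: solve D x_new = b - (L + U) x_old for x_new.
        D = np.diag(A)                # diagonal of A, as a vector
        R = A - np.diagflat(D)        # off-diagonal part, i.e. L + U
        x = x0.copy()
        for _ in range(max_iter):
            x_new = (b - R @ x) / D   # componentwise division by D
            if np.linalg.norm(x_new - x, np.inf) < tol:
                return x_new
            x = x_new
        return x                      # hit max_iter without converging

For a diagonally dominant test case it behaves as expected:

    A = np.array([[4.0, 1.0, 1.0],
                  [1.0, 5.0, 2.0],
                  [1.0, 2.0, 6.0]])
    b = np.array([6.0, 8.0, 9.0])
    x = jacobi(A, b, np.zeros(3))     # agrees with np.linalg.solve(A, b)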
In case matrix A is not diagonally dominant, I am trying to find a trick: add a preconditioning matrix P to make the system diagonally dominant. P is a diagonal matrix (all off-diagonal elements are zero), and each diagonal value of P equals the sum of the absolute values in the corresponding row of matrix A. Thus I have:
(A + P) x = b + P x
The iteration formula becomes:
(D + P) x(k+1) = b + P x – (L + U) x(k)
If I use the running guess x(k) in the term P x, the iterations do not converge. However, if I first solve for x with a traditional method to get a good approximation, and use that fixed approximation when evaluating the term P x, then the above iteration converges to a more accurate solution. In fact, with this approach the expression b + P x can be regarded as a new constant vector b'. So basically I am changing the original equation to a more favorable form by updating matrix A (to A + P) and vector b (to b').
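In code, the trick looks roughly like this (again a sketch; jacobi is the loop above, the name jacobi_shifted is mine, and the direct solve stands in for whatever traditional method supplies the good approximation):

    def jacobi_shifted(A, b, x_approx, tol=1e-10, max_iter=500):
        # Build P = diag(row sums of |A|), then run plain Jacobi on the
        # shifted system (A + P) x = b + P x_approx, whose matrix has a
        # much heavier diagonal than A itself.
        P = np.abs(A).sum(axis=1)      # diagonal entries of P
        A2 = A + np.diagflat(P)        # A' = A + P
        b2 = b + P * x_approx          # b' = b + P x_approx, held fixed
        return jacobi(A2, b2, x_approx, tol, max_iter)

    # x_approx comes from a traditional method first, e.g.:
    # x_approx = np.linalg.solve(A, b)
    # x = jacobi_shifted(A, b, x_approx)

Note that the fixed point of this shifted iteration is the solution of (A + P) x = b + P x̃, which coincides with the solution of the original system only when the approximation x̃ is itself that solution; that is exactly what my question below is about.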
The question is this: am I cheating in the method I described above? Can you offer any better solutions?
Namir
PS: Whatever approach works with the Jacobi method can also be applied to the Gauss-Seidel method, which is a better method. There are algorithms (variants of the conjugate gradient method) that solve large linear systems of equations even when matrix A is not diagonally dominant or symmetric. I am trying to go back to basics and see if the Jacobi and Gauss-Seidel methods can get a face lift.
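For reference, the Gauss-Seidel version of the same loop differs only in that each updated component is used immediately within the sweep (again just a sketch, same arbitrary names):

    def gauss_seidel(A, b, x0, tol=1e-10, max_iter=500):
        # Gauss-Seidel: like Jacobi, but x[i] is updated in place, so
        # later rows in the same sweep already see the new values.
        n = len(b)
        x = x0.copy()
        for _ in range(max_iter):
            x_old = x.copy()
            for i in range(n):
                s = A[i, :i] @ x[:i] + A[i, i+1:] @ x[i+1:]
                x[i] = (b[i] - s) / A[i, i]
            if np.linalg.norm(x - x_old, np.inf) < tol:
                return x
        return x

Because it reuses fresh values right away, Gauss-Seidel typically converges in fewer sweeps than Jacobi on the same diagonally dominant system, and the shifted-system trick above plugs into it unchanged.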
Edited: 6 Nov 2012, 10:41 a.m.