Last time, we looked at the 2D Poisson's equation and discussed how to arrive at a matrix equation using the finite difference method. The matrix, which represents the discrete Laplace operator, is sparse, so we can use an iterative method to solve the equation efficiently. Today, we will look at Jacobi, Gauss-Seidel, Successive Over-Relaxation (SOR), and Symmetric SOR (SSOR), along with a couple of related techniques: red-black ordering and Chebyshev acceleration. In addition, we will analyze the convergence of each method for Poisson's equation, both analytically and numerically.
Recall that we discretized the square domain by placing $N$ interior points along each axis; the points were marked in red, and their indices in green. Since we did not know the values of $u$ at those points, we had to solve $N^2$ equations, which we wrote as the matrix equation $Au = b$.
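For concreteness, here is a minimal sketch of how such a sparse system can be assembled with `scipy.sparse`. It is not the code from the previous post; the helper name `poisson_matrix`, the unit-square domain, and the grid spacing $h = 1/(N+1)$ are assumptions made for illustration.

```python
import scipy.sparse as sp

def poisson_matrix(N):
    """Assemble the (N^2 x N^2) sparse matrix of the 5-point discrete
    Laplacian on the unit square with N interior points along each axis."""
    h = 1.0 / (N + 1)                                  # assumed grid spacing
    # 1D second-difference matrix: tridiag(-1, 2, -1) / h^2
    T = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(N, N))
    I = sp.identity(N)
    # 2D Laplacian as a Kronecker sum, acting on the N^2 unknowns
    return ((sp.kron(I, T) + sp.kron(T, I)) / h**2).tocsr()

A = poisson_matrix(50)
print(A.shape, A.nnz)   # (2500, 2500), roughly 5 nonzeros per row
```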
We also looked at the general equation $Au = b$, and discussed splitting the matrix $A$ into a sum of two matrices, $A = M + N$. This lets us define an iterative method based on the fixed-point method: given an initial guess $u^{(0)}$, we can solve the following equation to find the next iterate $u^{(1)}$, then $u^{(2)}$, and so on:

$$M u^{(k+1)} = b - N u^{(k)},$$

and we concluded that the iterative method converges from any initial guess if and only if the spectral radius $\rho(M^{-1}N)$ is less than 1. Before we move on, let us define the (asymptotic) rate of convergence:

$$R = -\ln \rho(M^{-1}N).$$
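To make the recap concrete, here is a small sketch of the generic splitting iteration together with the spectral-radius test. The function names `splitting_iteration` and `spectral_radius`, and the Jacobi-style splitting in the usage example, are my own illustrative choices rather than anything from the earlier post.

```python
import numpy as np

def splitting_iteration(M, N_mat, b, u0, num_iters=100):
    """Fixed-point iteration for A u = b with the splitting A = M + N:
    solve M u_{k+1} = b - N u_k at every step (dense solve, for illustration)."""
    u = u0.copy()
    for _ in range(num_iters):
        u = np.linalg.solve(M, b - N_mat @ u)
    return u

def spectral_radius(M, N_mat):
    """rho(M^{-1} N): the iteration converges from any initial guess
    iff this value is strictly less than 1."""
    G = np.linalg.solve(M, N_mat)          # iteration matrix, up to sign
    return np.abs(np.linalg.eigvals(G)).max()

# Usage: Jacobi-style splitting M = diag(A), N = A - M on a tiny system.
A = np.array([[4.0, -1.0,  0.0],
              [-1.0, 4.0, -1.0],
              [ 0.0, -1.0, 4.0]])
b = np.ones(3)
M = np.diag(np.diag(A))
N_mat = A - M
print(spectral_radius(M, N_mat))           # about 0.35, so the method converges
u = splitting_iteration(M, N_mat, b, np.zeros(3))
print(np.allclose(A @ u, b))               # True
```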
We will see that, for each method, the spectral radius takes the form $\rho = 1 - O(h^{\alpha})$ for some exponent $\alpha > 0$, where $h$ is the grid spacing. The smaller the spectral radius of the iteration matrix, the higher the rate of convergence.
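As a quick numerical sanity check (reusing the hypothetical `poisson_matrix` and `spectral_radius` helpers sketched above), one can observe that $1 - \rho$ for the Jacobi splitting of the Poisson matrix shrinks roughly like $h^2$ as the grid is refined:

```python
import numpy as np

# Jacobi splitting of the Poisson matrix: M = diag(A), N = A - M.
for N in (8, 16, 32):
    h = 1.0 / (N + 1)
    A = poisson_matrix(N).toarray()        # dense is fine at these sizes
    M = np.diag(np.diag(A))
    rho = spectral_radius(M, A - M)
    print(N, rho, (1 - rho) / h**2)        # last column stays roughly constant
```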