Krylov-Subspace Methods - II Lecture 7 Alessandra Nardi Thanks to Prof. Jacob White, Deepak Ramaswamy, Michal Rewienski, and Karen Veroy
Last lectures review
• Overview of iterative methods to solve Mx = b
  – Stationary
  – Non-stationary
• QR factorization
  – Modified Gram-Schmidt algorithm
  – Minimization view of QR
• General subspace minimization algorithm
• Generalized Conjugate Residual algorithm
  – Krylov subspace
  – Simplification in the symmetric case
  – Convergence properties
• Eigenvalue and eigenvector review
  – Norms and spectral radius
  – Spectral mapping theorem
Arbitrary Subspace Methods – Residual Minimization
Minimize ||r^k|| = ||b − Mx^k|| over x^k in span{w_1, …, w_k}.
Use Gram-Schmidt on the Mw_i's!
Krylov Subspace Methods – Krylov Subspace
K^k(M, b) = span{b, Mb, M^2 b, …, M^(k−1) b}
Any x^k in the Krylov subspace is a (k−1)-th order polynomial in M applied to b: x^k = Σ_j a_j M^j b.
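The polynomial connection above can be checked numerically. A small NumPy sketch (the matrix, vector, and coefficients below are arbitrary, chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 6, 3
M = rng.standard_normal((n, n))
b = rng.standard_normal(n)

# Krylov basis: columns b, Mb, M^2 b, ..., M^(k-1) b
K = np.column_stack([np.linalg.matrix_power(M, j) @ b for j in range(k)])

# Any x in the Krylov subspace is a polynomial in M applied to b:
coeffs = rng.standard_normal(k)   # polynomial coefficients a_0 .. a_{k-1}
x = K @ coeffs                    # x = sum_j a_j M^j b
p_of_M = sum(a * np.linalg.matrix_power(M, j) for j, a in enumerate(coeffs))
assert np.allclose(x, p_of_M @ b)
```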
Krylov Subspace Methods – Subspace Generation
The set of residuals can also be used as a representation of the Krylov subspace.
Generalized Conjugate Residual Algorithm – nice because the residuals generate the next search directions.
Krylov-Subspace Methods – Generalized Conjugate Residual Method (k-th step)
1) Determine the optimal step size α_k in the k-th search direction
2) Update the solution (minimizing the residual) and the residual: x^(k+1) = x^k + α_k p_k, r^(k+1) = r^k − α_k M p_k
3) Compute the new orthogonalized search direction, using the most recent residual
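The steps above can be sketched in Python. This is a minimal dense version for illustration (the function name, tolerances, and loop structure are assumptions, not from the slides); the M p_j vectors are kept orthonormal so the optimal step is a single inner product:

```python
import numpy as np

def gcr(M, b, tol=1e-10, max_iter=None):
    """Generalized Conjugate Residual: minimize ||b - Mx|| over the Krylov subspace.

    Minimal dense sketch, no restarts; assumes GCR is applicable to M
    (e.g. M symmetric positive definite).
    """
    n = len(b)
    max_iter = max_iter or n
    x = np.zeros(n)
    r = b.copy()
    P, MP = [], []                 # search directions p_j and their images M p_j
    for k in range(max_iter):
        p = r.copy()
        Mp = M @ p
        # Orthogonalize M p against previous M p_j (modified Gram-Schmidt)
        for pj, Mpj in zip(P, MP):
            beta = Mp @ Mpj
            p -= beta * pj
            Mp -= beta * Mpj
        nrm = np.linalg.norm(Mp)
        if nrm < tol:
            break                  # subspace exhausted
        p /= nrm
        Mp /= nrm
        alpha = r @ Mp             # optimal step: minimizes ||r - alpha * M p||
        x += alpha * p
        r -= alpha * Mp
        P.append(p)
        MP.append(Mp)
        if np.linalg.norm(r) < tol:
            break
    return x, r
```

For example, on a small symmetric positive definite system `gcr` drives the residual to roundoff in at most n steps.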
Krylov-Subspace Methods – Generalized Conjugate Residual Method (computational complexity of the k-th step)
Vector inner products: O(n); matrix-vector product: O(n) if M is sparse; vector adds: O(n)
Orthogonalization: O(k) inner products, so the k-th step costs O(nk)
Summed over k iterations the total cost is O(k^2 n); if M is sparse and k (# of iters) approaches n, this approaches O(n^3) – better converge fast!
Summary
• What an iterative non-stationary method is: x^(k+1) = x^k + α_k p_k
• How to calculate:
  – the search directions (p_k)
  – the step along the search directions (α_k)
• Krylov subspace GCR
• GCR is O(k^2 n) – better converge fast!
Now look at the convergence properties of GCR.
Krylov Methods Convergence Analysis Basic properties
Krylov Methods Convergence Analysis – Optimality of the GCR polynomial
GCR optimality property: r^k = p_k(M) r^0 for some k-th order polynomial with p_k(0) = 1, and GCR picks the polynomial that minimizes ||r^k||.
Therefore any polynomial which satisfies the constraints (k-th order, value 1 at zero) can be used to get an upper bound on ||r^k||.
Eigenvalues and eigenvectors review – Induced norms
Theorem: any induced norm is a bound on the spectral radius: ρ(M) ≤ ||M||.
Proof: if Mu = λu with u ≠ 0, then |λ| ||u|| = ||Mu|| ≤ ||M|| ||u||, so |λ| ≤ ||M||.
Useful Eigenproperties – Spectral Mapping Theorem
Given a polynomial f(x) = a_0 + a_1 x + … + a_k x^k, apply the polynomial to a matrix: f(M) = a_0 I + a_1 M + … + a_k M^k.
Then the eigenvalues of f(M) are f(λ_i), where the λ_i are the eigenvalues of M.
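The spectral mapping theorem can be verified numerically. A small sketch (the matrix and the polynomial below are arbitrary examples):

```python
import numpy as np

# Spectral mapping check: for f(x) = 2x^2 - 3x + 1, the eigenvalues
# of f(M) are f(lambda_i) for each eigenvalue lambda_i of M.
M = np.array([[2.0, 1.0],
              [1.0, 3.0]])
lam = np.linalg.eigvalsh(M)

f = lambda x: 2 * x**2 - 3 * x + 1
fM = 2 * (M @ M) - 3 * M + np.eye(2)

assert np.allclose(np.sort(np.linalg.eigvalsh(fM)), np.sort(f(lam)))
```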
Krylov Methods Convergence Analysis – Overview
Matrix norm property: ||p_(k+1)(M) r^0|| ≤ ||p_(k+1)(M)|| ||r^0||.
GCR optimality property: p_(k+1) is any (k+1)-th order polynomial subject to p_(k+1)(0) = 1, so any such polynomial may be used to get an upper bound on ||r^(k+1)|| / ||r^0||.
Krylov Methods Convergence Analysis – Overview
• Review of eigenvalues and eigenvectors
  – Induced norms: relate matrix eigenvalues to matrix norms
  – Spectral mapping theorem: relate matrix eigenvalues to matrix polynomials
• Now we are ready to relate the convergence properties of Krylov subspace methods to the eigenvalues of M
Krylov Methods Convergence Analysis – Norm of matrix polynomials
If M = VΛV^(−1) is diagonalizable, then p(M) = V p(Λ) V^(−1), so
||p(M)|| ≤ cond(V) max_i |p(λ_i)|, where cond(V) = ||V|| ||V^(−1)||.
Krylov Methods Convergence Analysis – Important observations
1) The GCR algorithm converges to the exact solution in at most n steps
2) If M has only q distinct eigenvalues, the GCR algorithm converges in at most q steps (a q-th order residual polynomial can place a zero on every eigenvalue)
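Observation 2 can be seen directly from the residual polynomial: with q distinct eigenvalues, the degree-q polynomial with p(0) = 1 and zeros on the eigenvalues annihilates M. A numeric sketch (the eigenvalues {1, 2, 3} and the random similarity transform are arbitrary choices):

```python
import numpy as np

# M has only q = 3 distinct eigenvalues {1, 2, 3}, though n = 6.
n = 6
D = np.diag([1.0, 1.0, 2.0, 2.0, 3.0, 3.0])
rng = np.random.default_rng(1)
V = rng.standard_normal((n, n))
M = V @ D @ np.linalg.inv(V)

# The degree-3 residual polynomial p(x) = (1 - x/1)(1 - x/2)(1 - x/3)
# satisfies p(0) = 1 and p(lambda_i) = 0 for every eigenvalue, so
# p(M) = 0 and the minimizing GCR residual vanishes in at most 3 steps.
I = np.eye(n)
pM = (I - M) @ (I - M / 2) @ (I - M / 3)
assert np.linalg.norm(pM) < 1e-8
```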
Krylov Methods Convergence Analysis – Convergence for M^T = M, residual polynomial
If M = M^T then: 1) M has orthonormal eigenvectors, 2) M has real eigenvalues.
Consequently cond(V) = 1 and ||p(M)|| = max_i |p(λ_i)|.
Krylov Methods Convergence Analysis – Residual polynomial picture (n = 10)
[Figure: * = eigenvalues of M; curves show a 5th-order and an 8th-order residual polynomial]
Strategically place the zeros of the polynomial on the eigenvalues.
Krylov Methods Convergence Analysis – Convergence for M^T = M, polynomial min-max problem
Bound the residual by solving: minimize, over k-th order polynomials p_k with p_k(0) = 1, the maximum of |p_k(λ)| over λ ∈ [λ_min, λ_max].
Krylov Methods Convergence Analysis – Convergence for M^T = M, Chebyshev solves min-max
The Chebyshev polynomial T_k(x) = cos(k arccos x) on [−1, 1]; a suitably shifted and scaled Chebyshev polynomial solves the min-max problem on [λ_min, λ_max].
Chebyshev Polynomials minimizing over [1, 10]
Krylov Methods Convergence Analysis – Convergence for M^T = M, Chebyshev bounds
Krylov Methods Convergence Analysis – Convergence for M^T = M, Chebyshev result
With κ = λ_max / λ_min, the shifted Chebyshev polynomial gives
||r^k|| / ||r^0|| ≤ 2 ((√κ − 1) / (√κ + 1))^k
so the convergence rate depends on the ratio of the extreme eigenvalues, not on n.
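The Chebyshev bound is easy to evaluate. A small sketch (the helper name and the example κ = 10, matching the [1, 10] interval from the earlier slide, are illustrative):

```python
import numpy as np

# Chebyshev convergence bound for symmetric positive definite M:
#   ||r_k|| / ||r_0||  <=  2 * ((sqrt(kappa) - 1) / (sqrt(kappa) + 1))**k,
# where kappa = lambda_max / lambda_min.
def chebyshev_bound(kappa, k):
    s = np.sqrt(kappa)
    return 2 * ((s - 1) / (s + 1)) ** k

# Eigenvalues in [1, 10] give kappa = 10; the bound decays geometrically in k.
for k in (1, 5, 10, 20):
    print(k, chebyshev_bound(10.0, k))
```

Note that for κ = 1 (all eigenvalues equal) the bound is zero after one step, and the decay factor degrades toward 1 as κ grows.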
Krylov Methods Convergence Analysis – Examples
For which problem will GCR converge faster?
Which convergence curve is GCR?
[Figure: convergence curves, residual norm vs. iteration]
Krylov Methods Convergence Analysis – Chebyshev is only a bound
The GCR algorithm can do better: it can eliminate outlying eigenvalues by placing polynomial zeros directly on them.
Iterative Methods – CG
Why? How? Now we know convergence is related to:
– the number of distinct eigenvalues
– the ratio between the max and min eigenvalues
Summary • Reminder about GCR – Residual minimizing solution – Krylov Subspace – Polynomial Connection • Review Eigenvalues – Induced Norms bound Spectral Radius – Spectral mapping theorem • Estimating Convergence Rate – Chebyshev Polynomials