
Publication


Featured research published by Ronald B. Morgan.


SIAM Journal on Matrix Analysis and Applications | 1995

A Restarted GMRES Method Augmented with Eigenvectors

Ronald B. Morgan

The GMRES method for solving nonsymmetric linear equations is generally used with restarting to reduce storage and orthogonalization costs. Restarting slows down the convergence. However, it is possible to save some important information at the time of the restart. It is proposed that approximate eigenvectors corresponding to a few of the smallest eigenvalues be formed and added to the subspace for GMRES. The convergence can be much faster, and the minimum residual property is retained.
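As a point of reference for the restarting trade-off described above, here is a minimal restarted-GMRES run using SciPy's built-in solver (plain restarting only; the matrix and parameters are illustrative, and the paper's eigenvector augmentation is not included):

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import gmres

# Nonsymmetric, diagonally dominant tridiagonal test matrix.
n = 200
A = diags([-0.5, 2.0, -0.3], [-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)

# Restarted GMRES: `restart` caps the Krylov subspace dimension, which
# bounds storage and orthogonalization cost at the price of slower
# convergence -- the loss that augmenting the subspace with approximate
# eigenvectors is designed to offset.
x, info = gmres(A, b, restart=20, maxiter=500)
residual = np.linalg.norm(b - A @ x)
```

Shrinking `restart` makes each cycle cheaper but typically increases the number of cycles; matrices with a few small eigenvalues are where the slowdown, and hence the proposed augmentation, matters most.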


SIAM Journal on Scientific Computing | 2002

GMRES with Deflated Restarting

Ronald B. Morgan

A modification is given of the GMRES iterative method for nonsymmetric systems of linear equations. The new method deflates eigenvalues using Wu and Simon's thick restarting approach [SIAM J. Matrix Anal. Appl., 22 (2000), pp. 602--616]. It has the efficiency of implicit restarting but is simpler and does not have the same numerical concerns. The deflation of small eigenvalues can greatly improve the convergence of restarted GMRES. Also, it is demonstrated that using harmonic Ritz vectors is important because then the whole subspace is a Krylov subspace that contains certain important smaller subspaces.
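The harmonic Ritz extraction at the heart of GMRES-DR can be sketched from a single Arnoldi sweep. The following NumPy sketch (not the paper's restarted algorithm; function names and test sizes are illustrative) computes the harmonic Ritz pairs of smallest modulus:

```python
import numpy as np

def arnoldi(A, v0, m):
    """m-step Arnoldi factorization: A @ V[:, :m] = V @ Hbar."""
    n = len(v0)
    V = np.zeros((n, m + 1))
    Hbar = np.zeros((m + 1, m))
    V[:, 0] = v0 / np.linalg.norm(v0)
    for j in range(m):
        w = A @ V[:, j]
        for i in range(j + 1):
            Hbar[i, j] = V[:, i] @ w
            w -= Hbar[i, j] * V[:, i]
        Hbar[j + 1, j] = np.linalg.norm(w)
        V[:, j + 1] = w / Hbar[j + 1, j]
    return V, Hbar

def harmonic_ritz(A, v0, m, k):
    """Return the k harmonic Ritz values of smallest modulus and the
    corresponding harmonic Ritz vectors, from one Arnoldi sweep."""
    V, Hbar = arnoldi(A, v0, m)
    Hm, h = Hbar[:m, :], Hbar[m, m - 1]
    e_m = np.zeros(m); e_m[-1] = 1.0
    # Harmonic Ritz pairs solve (Hm + h^2 * f e_m^T) g = theta g,
    # where f = Hm^{-H} e_m.
    f = np.linalg.solve(Hm.conj().T, e_m)
    theta, G = np.linalg.eig(Hm + h**2 * np.outer(f, e_m))
    idx = np.argsort(np.abs(theta))[:k]
    return theta[idx], V[:, :m] @ G[:, idx]
```

In GMRES-DR these vectors are carried into the next cycle, and because they are harmonic Ritz vectors the augmented subspace remains a Krylov subspace.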


Linear Algebra and its Applications | 1991

Computing Interior Eigenvalues of Large Matrices

Ronald B. Morgan

Computing eigenvalues from the interior of the spectrum of a large matrix is a difficult problem. The Rayleigh-Ritz procedure is a standard way of reducing it to a smaller problem, but it is not optimal for interior eigenvalues. Here a method is given that does a better job. In contrast with standard Rayleigh-Ritz, a priori bounds can be given for the accuracy of interior eigenvalue and eigenvector approximations. When applied to the Lanczos algorithm, this method yields better approximations at early stages. Applied to preconditioning methods, the convergence rate is improved.
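For contrast with the factorization-free approach above, the standard remedy for interior eigenvalues is shift-invert, which makes the eigenvalues nearest a target extremal at the cost of factoring a shifted matrix. A minimal SciPy/ARPACK example (matrix and target are illustrative):

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import eigs

# 1-D Laplacian with known spectrum 2 - 2*cos(k*pi/(n+1)), k = 1..n.
n = 400
A = diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csc")

# Shift-invert: Arnoldi runs with (A - sigma*I)^{-1}, so the eigenvalues
# nearest sigma become extremal, where Ritz approximation is reliable --
# but a sparse factorization of A - sigma*I is required.
sigma = 2.0
interior = np.sort(eigs(A, k=4, sigma=sigma, return_eigenvectors=False).real)
```

The method of this paper targets the same interior eigenvalues while avoiding the factorization, which is what makes it attractive for preconditioned iterations.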


Mathematics of Computation | 1996

On restarting the Arnoldi method for large nonsymmetric eigenvalue problems

Ronald B. Morgan

The Arnoldi method computes eigenvalues of large nonsymmetric matrices. Restarting is generally needed to reduce storage requirements and orthogonalization costs. However, restarting slows down the convergence and makes the choice of the new starting vector difficult if several eigenvalues are desired. We analyze several approaches to restarting and show why Sorensen's implicit QR approach is generally far superior to the others. Ritz vectors are combined in precisely the right way for an effective new starting vector. Also, a new method for restarting Arnoldi is presented. It is mathematically equivalent to the Sorensen approach but has additional uses.
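Sorensen's implicitly restarted Arnoldi method, which this analysis favors, is what ARPACK implements and what SciPy exposes as `eigs`. A small run (random test matrix and sizes are illustrative):

```python
import numpy as np
from scipy.sparse.linalg import eigs

# `eigs` wraps ARPACK, an implementation of Sorensen's implicitly
# restarted Arnoldi method: `ncv` caps the subspace size, and each
# implicit restart compresses the subspace onto the wanted Ritz
# information, combining Ritz vectors into an effective new start.
rng = np.random.default_rng(0)
n = 300
A = rng.standard_normal((n, n)) / np.sqrt(n) + 2.0 * np.eye(n)

arpack_vals = eigs(A, k=3, which="LM", ncv=30, return_eigenvectors=False)
dense_vals = np.linalg.eigvals(A)
```

Smaller `ncv` means cheaper cycles but more restarts, the same trade-off the paper analyzes for restarted Arnoldi generally.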


SIAM Journal on Scientific and Statistical Computing | 1986

Generalizations of Davidson's method for computing eigenvalues of sparse symmetric matrices

Ronald B. Morgan; David S. Scott

This paper analyzes Davidson’s method for computing a few eigenpairs of large sparse symmetric matrices. An explanation is given for why Davidson’s method often performs well but occasionally performs very badly. Davidson’s method is then generalized to a method which offers a powerful way of applying preconditioning techniques developed for solving systems of linear equations to solving eigenvalue problems.
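A minimal sketch of the basic Davidson iteration the paper analyzes, with the classic diagonal preconditioner, in NumPy; the guard value, step limit, and test setup are illustrative choices, and the paper's generalization would replace the diagonal solve with an arbitrary preconditioner:

```python
import numpy as np

def davidson_smallest(A, max_steps=40, tol=1e-9):
    """Minimal Davidson iteration for the smallest eigenpair of a
    symmetric matrix, using the diagonal of A as preconditioner.
    Works well when A is strongly diagonally dominant."""
    n = A.shape[0]
    d = np.diag(A)
    V = np.zeros((n, max_steps + 1))
    V[:, 0] = np.eye(n)[:, np.argmin(d)]   # start at the smallest diagonal
    for m in range(1, max_steps + 1):
        Vm = V[:, :m]
        # Rayleigh-Ritz on the current subspace.
        evals, S = np.linalg.eigh(Vm.T @ A @ Vm)
        theta, y = evals[0], Vm @ S[:, 0]
        r = A @ y - theta * y
        if np.linalg.norm(r) < tol:
            break
        # Davidson correction: approximately solve (A - theta*I) t = r
        # using only the diagonal, guarding against tiny denominators.
        denom = d - theta
        denom[np.abs(denom) < 1e-3] = 1e-3
        t = r / denom
        t -= Vm @ (Vm.T @ t)               # orthogonalize against V
        t -= Vm @ (Vm.T @ t)               # reorthogonalize
        V[:, m] = t / np.linalg.norm(t)
    return theta, y
```

The division by `d - theta` is where Davidson's method acts like an approximate inverse iteration; when the diagonal is a poor approximation to A this step degrades, which matches the abstract's observation that the method occasionally performs very badly.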


SIAM Journal on Matrix Analysis and Applications | 2000

Implicitly Restarted GMRES and Arnoldi Methods for Nonsymmetric Systems of Equations

Ronald B. Morgan

The generalized minimum residual method (GMRES) is well known for solving large nonsymmetric systems of linear equations. It generally uses restarting, which slows the convergence. However, some information can be retained at the time of the restart and used in the next cycle. We present algorithms that use implicit restarting in order to retain this information. Approximate eigenvectors determined from the previous subspace are included in the new subspace. This deflates the smallest eigenvalues and thus improves the convergence. The subspace that contains the approximate eigenvectors is itself a Krylov subspace, but not with the usual starting vector. The implicitly restarted FOM algorithm includes standard Ritz vectors in the subspace. The eigenvalue portion of its calculations is equivalent to Sorensen's IRA algorithm. The implicitly restarted GMRES algorithm uses harmonic Ritz vectors. This algorithm also gives a new approach to computing interior eigenvalues.


Numerical Linear Algebra With Applications | 1998

Harmonic projection methods for large non-symmetric eigenvalue problems

Ronald B. Morgan; Min Zeng

The problem of finding interior eigenvalues of a large nonsymmetric matrix is examined. A procedure for extracting approximate eigenpairs from a subspace is discussed. It is related to the Rayleigh–Ritz procedure, but is designed for finding interior eigenvalues. Harmonic Ritz values and other approximate eigenvalues are generated. This procedure can be applied to the Arnoldi method, to preconditioning methods, and to other methods for nonsymmetric eigenvalue problems that use the Rayleigh–Ritz procedure. The subject of estimating the boundary of the entire spectrum is briefly discussed, and the importance of preconditioning for interior eigenvalue problems is mentioned.


SIAM Journal on Scientific Computing | 1993

Preconditioning the Lanczos algorithm for sparse symmetric eigenvalue problems

Ronald B. Morgan; David S. Scott

A method for computing a few eigenpairs of sparse symmetric matrices is presented and analyzed that combines the power of preconditioning techniques with the efficiency of the Lanczos algorithm. The method is related to Davidson’s method and its generalizations, but can be less expensive for matrices that are fairly sparse. A double iteration is used. An effective termination criterion is given for the inner iteration. Quadratic convergence with respect to the outer loop is shown.


Linear Algebra and its Applications | 2008

Deflated GMRES for systems with multiple shifts and multiple right-hand sides

Dean Darnell; Ronald B. Morgan; Walter Wilcox

We consider solution of multiply shifted systems of nonsymmetric linear equations, possibly also with multiple right-hand sides. First, for a single right-hand side, the matrix is shifted by several multiples of the identity. Such problems arise in a number of applications, including lattice quantum chromodynamics where the matrices are complex and non-Hermitian. Some Krylov iterative methods such as GMRES and BiCGStab have been used to solve multiply shifted systems for about the cost of solving just one system. Restarted GMRES can be improved by deflating eigenvalues for matrices that have a few small eigenvalues. We show that a particular deflated method, GMRES-DR, can be applied to multiply shifted systems. In quantum chromodynamics, it is common to have multiple right-hand sides with multiple shifts for each right-hand side. We develop a method that efficiently solves the multiple right-hand sides by using a deflated version of GMRES and yet keeps costs for all of the multiply shifted systems close to those for one shift. An example is given showing this can be extremely effective with a quantum chromodynamics matrix.
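The shift-invariance that lets Krylov methods handle all shifts for roughly the price of one can be demonstrated directly: K_m(A, b) = K_m(A + sigma*I, b), so one Arnoldi sweep serves every shift. A NumPy sketch (a plain GMRES projection per shift, without the deflation or the collinear-residual restarting that the paper's method adds; sizes and shifts are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 120, 50
A = rng.standard_normal((n, n)) / np.sqrt(n) + 2.0 * np.eye(n)
b = rng.standard_normal(n)

# One Arnoldi sweep with A serves every shift, because
# K_m(A, b) = K_m(A + sigma*I, b) and
# (A + sigma*I) V_m = V_{m+1} (Hbar + sigma * Ibar).
V = np.zeros((n, m + 1))
H = np.zeros((m + 1, m))
beta = np.linalg.norm(b)
V[:, 0] = b / beta
for j in range(m):
    w = A @ V[:, j]
    for i in range(j + 1):
        H[i, j] = V[:, i] @ w
        w -= H[i, j] * V[:, i]
    H[j + 1, j] = np.linalg.norm(w)
    V[:, j + 1] = w / H[j + 1, j]

# The GMRES minimization for each shift reuses the same basis: only the
# small (m+1) x m least-squares problem changes per shift.
Ibar = np.vstack([np.eye(m), np.zeros((1, m))])
e1 = np.zeros(m + 1); e1[0] = beta
residuals = {}
for sigma in (0.0, 0.5, 1.0):
    y, *_ = np.linalg.lstsq(H + sigma * Ibar, e1, rcond=None)
    x = V[:, :m] @ y
    residuals[sigma] = np.linalg.norm(b - (A + sigma * np.eye(n)) @ x)
```

The n-dimensional matrix-vector products and orthogonalization are shared across shifts. Restarting complicates this, since the shifted residuals must be kept collinear, which is part of what the paper works out for GMRES-DR.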


Journal of Computational Physics | 1992

Generalizations of Davidson's method for computing eigenvalues of large nonsymmetric matrices

Ronald B. Morgan

Davidson's method for nonsymmetric eigenvalue problems is examined. Some analysis is given for why Davidson's method is effective. An implementation is given that avoids use of complex arithmetic. This reduces the expense if complex eigenvalues are computed. Also discussed is a generalization of Davidson's method that applies the preconditioning techniques developed for systems of linear equations to nonsymmetric eigenvalue problems. Convergence can be rapid if there is an approximation to the matrix that is both factorable and fairly accurate.

Collaboration


Dive into Ronald B. Morgan's collaboration.

Top Co-Authors

Min Zeng

University of Missouri
