Access Restriction Subscribed

Author Traub, J. F. ♦ Woźniakowski, H.
Source ACM Digital Library
Content type Text
Publisher Association for Computing Machinery (ACM)
File Format PDF
Copyright Year ©1984
Language English
Subject Domain (in DDC) Computer science, information & general works ♦ Data processing & computer science
Abstract The information-based study of the optimal solution of large linear systems is initiated by studying the case of Krylov information. Among the algorithms that use Krylov information are minimal residual, conjugate gradient, Chebyshev, and successive approximation algorithms. A "sharp" lower bound on the number of matrix-vector multiplications required to compute an ε-approximation is obtained for any orthogonally invariant class of matrices. Examples of such classes include many of practical interest, such as symmetric matrices, symmetric positive definite matrices, and matrices with bounded condition number. It is shown that the minimal residual algorithm is within at most one matrix-vector multiplication of the lower bound. A similar result is obtained for the generalized minimal residual algorithm. The lower bound is computed for certain classes of orthogonally invariant matrices. How the lack of certain properties (symmetry, positive definiteness) increases the lower bound is shown. A conjecture and a number of open problems are stated.
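The abstract's notion of Krylov information — the vectors b, Ab, A²b, … gathered at a cost of one matrix-vector multiplication each — and the minimal residual algorithm built on it can be illustrated with a short NumPy sketch. This is not the paper's formulation; the function names are illustrative, and the least-squares solve stands in for the usual orthogonalized (GMRES-style) implementation:

```python
import numpy as np

def krylov_information(A, b, k):
    """Krylov information: the columns b, Ab, ..., A^{k-1} b,
    obtained with one matrix-vector multiplication per new column."""
    vs = [b]
    for _ in range(k - 1):
        vs.append(A @ vs[-1])           # one mat-vec per step
    return np.column_stack(vs)

def minimal_residual(A, b, k):
    """Minimal residual approximation from k pieces of Krylov
    information: the x in span{b, Ab, ...} minimizing ||b - A x||_2,
    found here by a dense least-squares solve for illustration."""
    K = krylov_information(A, b, k)
    c, *_ = np.linalg.lstsq(A @ K, b, rcond=None)  # min ||b - A K c||
    return K @ c
```

For a symmetric positive definite n×n matrix, taking k = n recovers the exact solution, and the residual norm is nonincreasing in k — consistent with the abstract's claim that this algorithm sits within one matrix-vector multiplication of the information-based lower bound.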
ISSN 0004-5411
Age Range 18 to 22 years ♦ above 22 years
Educational Use Research
Education Level UG and PG
Learning Resource Type Article
Publisher Date 1984-06-01
Publisher Place New York
e-ISSN 1557-735X
Journal Journal of the ACM (JACM)
Volume Number 31
Issue Number 3
Page Count 15
Starting Page 545
Ending Page 559