On the Convergence of Conjugate Gradient Variants in Finite Precision Arithmetic

Tyler Chen

This is a companion piece to the publication:

    @misc{greenbaum2019variants,
        title = {On the Convergence Rate of Variants of the Conjugate Gradient Algorithm in Finite Precision Arithmetic},
        author = {Anne Greenbaum and Hexuan Liu and Tyler Chen},
        year = {2019},
        eprint = {1905.05874},
        archivePrefix = {arXiv},
        primaryClass = {cs.NA},
    }

A preprint is available on arXiv: https://arxiv.org/pdf/1905.05874.pdf.

Why should I care?

The behaviour of the conjugate gradient algorithm in finite precision arithmetic is very different from what exact arithmetic theory predicts: rounding errors destroy the orthogonality of the residuals, so the method no longer terminates in at most n steps and convergence can be substantially delayed. In this sense, the algorithm could be considered unstable. Even so, the conjugate gradient algorithm is widely used in practice, so it is important to understand its behaviour in finite precision.


If you are not familiar with the Conjugate Gradient method, it may be worth reading this page first.
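To see the gap between theory and practice concretely, here is a minimal NumPy sketch (my own illustration, not an experiment from the paper): standard conjugate gradient run on a diagonal matrix whose eigenvalues cluster toward the smallest one, in the spirit of the classic Strakos test problem. The specific parameter values are illustrative assumptions.

```python
import numpy as np

def cg(A, b, maxiter):
    """Textbook (Hestenes-Stiefel) conjugate gradient; returns the
    relative residual norm history."""
    x = np.zeros_like(b)
    r = b.copy()                      # residual b - A @ x with x = 0
    p = r.copy()
    rs = r @ r
    hist = []
    for _ in range(maxiter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs, rs_old = r @ r, rs
        p = r + (rs / rs_old) * p
        hist.append(np.sqrt(rs))
    return np.array(hist) / np.linalg.norm(b)

# Diagonal matrix with eigenvalues accumulating toward the smallest one
# (Strakos-style construction; these parameter values are illustrative
# choices, not taken from the paper).
n, lam1, lamn, rho = 48, 0.1, 100.0, 0.9
i = np.arange(1, n + 1)
eigs = lam1 + (i - 1) / (n - 1) * (lamn - lam1) * rho ** (n - i)
A = np.diag(eigs)
b = np.ones(n) / np.sqrt(n)

hist = cg(A, b, maxiter=500)
# Exact arithmetic guarantees termination in at most n = 48 steps; in
# double precision the residual is still large at step 48 and convergence
# is noticeably delayed.
print("relative residual at step n:", hist[n - 1])
print("first step below 1e-8:     ", int(np.argmax(hist < 1e-8)))
```

On spectra like this one, tight eigenvalue clusters behave in finite precision like multiple nearby eigenvalues, which is exactly the kind of convergence delay exact arithmetic theory cannot explain.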

Contributions of this paper

In this paper, we show numerically why certain variants of the conjugate gradient algorithm converge more slowly than others on some problems.
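To make the idea of comparing variants concrete, here is a minimal, self-contained NumPy sketch (my own illustration, not the paper's actual experiments) contrasting the standard Hestenes-Stiefel recurrences with a Chronopoulos-Gear style variant on a matrix with clustered eigenvalues. The two formulations are mathematically equivalent, but their different recurrences accumulate rounding errors differently; all parameter choices below are illustrative assumptions.

```python
import numpy as np

def hs_cg(A, b, maxiter):
    """Hestenes-Stiefel CG: two coupled two-term recurrences."""
    x = np.zeros_like(b); r = b.copy(); p = r.copy()
    rs = r @ r
    hist = []
    for _ in range(maxiter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs, rs_old = r @ r, rs
        p = r + (rs / rs_old) * p
        hist.append(np.sqrt(rs))
    return np.array(hist) / np.linalg.norm(b)

def cg_cg(A, b, maxiter):
    """Chronopoulos-Gear style variant: s = A p is maintained by an
    extra vector recurrence and alpha by a scalar recurrence, so each
    iteration needs only one product A r (attractive for parallelism),
    at the cost of additional rounding-error sources."""
    x = np.zeros_like(b); r = b.copy()
    w = A @ r
    p, s = r.copy(), w.copy()
    rs = r @ r
    alpha = rs / (p @ s)
    hist = []
    for _ in range(maxiter):
        x += alpha * p
        r -= alpha * s
        w = A @ r
        rs, rs_old = r @ r, rs
        beta = rs / rs_old
        alpha = rs / (w @ r - (beta / alpha) * rs)
        p = r + beta * p
        s = w + beta * s          # recurrence replaces an explicit A @ p
        hist.append(np.sqrt(rs))
    return np.array(hist) / np.linalg.norm(b)

# Diagonal matrix with eigenvalues clustered toward the smallest one
# (Strakos-style construction with illustrative parameter values).
n, lam1, lamn, rho = 48, 0.1, 100.0, 0.9
i = np.arange(1, n + 1)
A = np.diag(lam1 + (i - 1) / (n - 1) * (lamn - lam1) * rho ** (n - i))
b = np.ones(n) / np.sqrt(n)

h_hs, h_cg = hs_cg(A, b, 500), cg_cg(A, b, 500)
steps = lambda h: int(np.argmax(h < 1e-6))
print("HS-CG steps to 1e-6:", steps(h_hs))
print("CG-CG steps to 1e-6:", steps(h_cg))
```

Rearranged variants like this one exist because they reduce synchronization costs on parallel machines; the point of the comparison is that the residual histories of mathematically equivalent variants can diverge visibly once rounding errors enter.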