Last modified: September 11, 2023

QR method

The QR method is a widely used algorithm in numerical linear algebra for determining the eigenvalues of a given square matrix. Unlike direct methods such as solving the characteristic polynomial, which can be complicated and unstable numerically for large matrices, the QR method leverages iterative transformations that preserve eigenvalues. Over repeated iterations, these transformations lead the matrix closer to a quasi-upper-triangular or upper-triangular form, from which the eigenvalues can be read directly.

Mathematical Formulation

Consider an $n \times n$ matrix $A$. The QR decomposition (factorization) of $A$ is given by:

$$A = QR,$$

where $Q$ is an orthogonal matrix ($Q^T Q = I$) and $R$ is an upper-triangular matrix.

The QR method applies the following iterative scheme:

I. Initialization: Set $A_0 = A$.

II. Iteration: For each iteration $k \geq 1$:

Compute the QR factorization of $A_{k-1}$:

$$A_{k-1} = Q_k R_k.$$

Form a new matrix:

$$A_k = R_k Q_k.$$

Notice that:

$$A_k = R_k Q_k = Q_k^{-1} A_{k-1} Q_k = Q_k^T A_{k-1} Q_k,$$

which means $A_k$ is similar to $A_{k-1}$. Since similarity transformations preserve eigenvalues, every $A_k$ has the same eigenvalues as the original matrix $A$.

As $k$ increases, under suitable conditions (e.g., a well-chosen shift strategy), the matrix $A_k$ converges to an upper-triangular matrix (or a quasi-upper-triangular real Schur form in the real case), whose diagonal entries are the eigenvalues of $A$.
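The iterative scheme above can be sketched in a few lines of NumPy. This is a minimal, unshifted version for illustration only; `np.linalg.qr` computes each factorization:

```python
import numpy as np

def qr_iteration(A, num_iters=100):
    """Unshifted QR iteration: repeatedly factor A_{k-1} = Q_k R_k
    and recombine as A_k = R_k Q_k (a similarity transformation,
    so eigenvalues are preserved at every step)."""
    Ak = np.array(A, dtype=float)
    for _ in range(num_iters):
        Q, R = np.linalg.qr(Ak)   # A_{k-1} = Q_k R_k
        Ak = R @ Q                # A_k = R_k Q_k = Q_k^T A_{k-1} Q_k
    return Ak

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
Ak = qr_iteration(A)
print(np.diag(Ak))  # the diagonal approximates the eigenvalues
```

For matrices with distinct real eigenvalues the subdiagonal entries of the unshifted iteration shrink linearly, at a rate governed by the ratios of adjacent eigenvalue magnitudes, which is why practical implementations add shifts.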

Derivation

The idea behind the QR method arises from the following observations:

I. Similarity and Eigenvalues:

Two matrices $A$ and $B$ are similar if there exists an invertible matrix $C$ such that $A = C^{-1} B C$. Similar matrices share the same eigenvalues.

II. QR Decomposition:

Every real matrix $A$ admits a QR decomposition $A = QR$ with $Q$ orthogonal and $R$ upper triangular; when $A$ has full column rank, the factorization is unique up to the signs of the diagonal entries of $R$.

III. Iterative Process:

By repeatedly factoring $A_{k-1} = Q_k R_k$ and then forming $A_k = R_k Q_k$, we create a sequence of similar matrices $A_0, A_1, A_2, \dots$. If $A_k$ converges to an upper-triangular matrix, the eigenvalues are the diagonal elements of that limit.

In practice, shifts are employed to ensure rapid convergence and numerical stability (the so-called "QR algorithm with shifts"). This modification chooses shifts from elements of $A_k$ (for example, the Wilkinson shift, computed from the trailing $2 \times 2$ block) to accelerate convergence to the eigenvalues.
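A minimal sketch of a single-shift variant, using the bottom-right entry of the current iterate as the shift (a simplified stand-in for the Wilkinson shift used in production code; real nonsymmetric matrices with complex eigenvalues need the double-shift variant):

```python
import numpy as np

def qr_with_shifts(A, tol=1e-12, max_iters=500):
    """QR iteration with a simple single shift taken from the
    bottom-right entry of the current iterate.  Shifting and then
    shifting back keeps each step a similarity transformation."""
    Ak = np.array(A, dtype=float)
    n = Ak.shape[0]
    I = np.eye(n)
    for _ in range(max_iters):
        mu = Ak[-1, -1]                   # shift from the current iterate
        Q, R = np.linalg.qr(Ak - mu * I)  # factor the shifted matrix
        Ak = R @ Q + mu * I               # shift back: still similar to A
        if np.abs(np.tril(Ak, -1)).max() < tol:
            break                         # below-diagonal part is negligible
    return Ak

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
Ashift = qr_with_shifts(A)
print(np.diag(Ashift))
```

The shift concentrates convergence on the bottom-right eigenvalue, typically reducing the iteration count dramatically compared with the unshifted scheme.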

Algorithm Steps

I. Initialization: Set $A_0 = A$.

II. QR Factorization: Compute the QR factorization of $A_{k-1}$:

$$A_{k-1} = Q_k R_k.$$

The QR factorization can be computed using Gram–Schmidt orthogonalization, Householder reflections, or Givens rotations; Householder reflections are usually preferred in practice for their numerical stability.

III. Form New Matrix:

$$A_k = R_k Q_k.$$

Note that:

$$A_k = Q_k^{-1} A_{k-1} Q_k,$$

so $A_k$ and $A_{k-1}$ are similar.

IV. Check for Convergence:

If $A_k$ is sufficiently close to an upper-triangular matrix, or the off-diagonal elements are below a given tolerance, stop.

V. Output:

The diagonal elements of the nearly upper-triangular matrix $A_k$ at convergence approximate the eigenvalues of the original matrix $A$.
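The factorization in step II can be computed with Householder reflections; a minimal sketch, verified against the identity $A = QR$ (the helper `householder_qr` is illustrative, not a library routine):

```python
import numpy as np

def householder_qr(A):
    """QR factorization via Householder reflections.
    Returns orthogonal Q and upper-triangular R with A = QR."""
    A = np.array(A, dtype=float)
    m, n = A.shape
    Q = np.eye(m)
    R = A.copy()
    for j in range(min(m - 1, n)):
        x = R[j:, j]
        v = x.copy()
        # Reflect x onto a multiple of e_1 (sign chosen to avoid cancellation).
        v[0] += np.copysign(np.linalg.norm(x), x[0])
        norm_v = np.linalg.norm(v)
        if norm_v == 0:
            continue
        v /= norm_v
        H = np.eye(m - j) - 2.0 * np.outer(v, v)
        R[j:, :] = H @ R[j:, :]   # zero out the subdiagonal of column j
        Q[:, j:] = Q[:, j:] @ H   # accumulate the orthogonal factor
    return Q, R

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
Q, R = householder_qr(A)
```

Each reflection zeroes the entries below the diagonal in one column, and the product of the reflections, accumulated in `Q`, stays exactly orthogonal, which is why this route is more stable than classical Gram–Schmidt.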

Example

Given System:

$$A = \begin{bmatrix} 4 & 1 \\ 2 & 3 \end{bmatrix}.$$

I. First QR Factorization:

Perform the QR factorization on $A_0 = A$.

To four decimal places, we find:

$$A = Q_1 R_1, \quad \text{with}$$

$$Q_1 = \begin{bmatrix} 0.8944 & -0.4472 \\ 0.4472 & 0.8944 \end{bmatrix}, \qquad R_1 = \begin{bmatrix} 4.4721 & 2.2361 \\ 0 & 2.2361 \end{bmatrix}.$$

II. Form A1:

$$A_1 = R_1 Q_1 = \begin{bmatrix} 5 & 0 \\ 1 & 2 \end{bmatrix}.$$

For this matrix a single step already places the eigenvalues $5$ and $2$ on the diagonal; repeating the process drives the remaining off-diagonal entry toward zero. In general, several iterations (depending on the matrix) are needed before the off-diagonal elements become negligible.

III. Convergence:

After sufficient iterations, the matrix $A_k$ will approximate an upper-triangular matrix, and its diagonal entries give the eigenvalues of $A$.

For this simple $2 \times 2$ matrix the method converges quickly. The computed eigenvalues match the exact values $\lambda_1 = 5$ and $\lambda_2 = 2$ obtained from the characteristic equation $\lambda^2 - 7\lambda + 10 = 0$.
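The hand computation above can be checked numerically with NumPy's built-in factorization. Note that the sign conventions of `np.linalg.qr` may flip rows or columns of $Q_1$ and $R_1$ relative to the values shown above, but the resulting $A_1$ has the same diagonal:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# One explicit QR step, mirroring the hand computation in the example.
Q1, R1 = np.linalg.qr(A)
A1 = R1 @ Q1

print(np.diag(A1))            # diagonal is already close to [5, 2]
print(np.linalg.eigvals(A))   # exact eigenvalues for comparison
```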

Advantages

I. All Eigenvalues Simultaneously:

The QR method retrieves all eigenvalues of a matrix at once, rather than computing them individually.

II. Numerical Stability:

With proper implementation (especially using orthogonal transformations and shifts), the QR method is stable and widely considered the "gold standard" for eigenvalue computations in numerical libraries.

III. Broad Applicability:

The QR method works for both real and complex matrices and can be adapted to handle a variety of matrix types efficiently.

Limitations

I. No Direct Eigenvectors:

The basic QR algorithm finds eigenvalues but does not directly produce eigenvectors. Additional steps or modifications are required to recover eigenvectors.
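One common extension, shown here as an illustrative sketch for the symmetric case: accumulating the product of the $Q_k$ factors yields the eigenvectors alongside the eigenvalues (nonsymmetric matrices require additional machinery, such as inverse iteration):

```python
import numpy as np

def qr_eig_sym(A, num_iters=200):
    """QR iteration for a *symmetric* matrix that also accumulates
    V = Q_1 Q_2 ... Q_k; since A_k = V^T A V converges to a diagonal
    matrix, the columns of V converge to the eigenvectors."""
    Ak = np.array(A, dtype=float)
    V = np.eye(Ak.shape[0])
    for _ in range(num_iters):
        Q, R = np.linalg.qr(Ak)
        Ak = R @ Q
        V = V @ Q              # accumulate the orthogonal factors
    return np.diag(Ak), V      # eigenvalues, eigenvectors (columns of V)

S = np.array([[2.0, 1.0],
              [1.0, 2.0]])     # symmetric test matrix, eigenvalues 3 and 1
vals, V = qr_eig_sym(S)
```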

II. Computational Effort:

Although efficient algorithms exist, the QR method can be computationally intensive for very large matrices. Advanced techniques like the Hessenberg form and modern parallel algorithms are used to improve performance.
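The Hessenberg reduction mentioned above can be sketched with Householder similarity transforms (an illustrative implementation; libraries such as LAPACK and SciPy provide optimized versions). Zeroing everything below the first subdiagonal before iterating cuts the cost of each subsequent QR sweep:

```python
import numpy as np

def to_hessenberg(A):
    """Reduce A to upper Hessenberg form H = Q^T A Q via Householder
    reflections applied from both sides, so eigenvalues are preserved."""
    H = np.array(A, dtype=float)
    n = H.shape[0]
    for j in range(n - 2):
        x = H[j + 1:, j]
        v = x.copy()
        v[0] += np.copysign(np.linalg.norm(x), x[0])
        norm_v = np.linalg.norm(v)
        if norm_v == 0:
            continue
        v /= norm_v
        P = np.eye(n - j - 1) - 2.0 * np.outer(v, v)
        H[j + 1:, :] = P @ H[j + 1:, :]   # apply from the left...
        H[:, j + 1:] = H[:, j + 1:] @ P   # ...and from the right (similarity)
    return H

A = np.random.default_rng(0).standard_normal((5, 5))
H = to_hessenberg(A)
```

Because a Hessenberg matrix stays Hessenberg under QR steps, this one-time $O(n^3)$ reduction lets each later sweep run in $O(n^2)$ work instead of $O(n^3)$.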

III. Convergence Speed for Certain Matrices:

While generally fast, convergence can be slow if the matrix has certain structures or if shifts are not chosen wisely, making it less practical without proper optimization.
