Applying The Power Method to Find a Dominating Eigenvalue

Recall from The Power Method that if $A$ is an $n \times n$ matrix with $n$ distinct eigenvalues $\lambda_1, \lambda_2, ..., \lambda_n$, one of which is dominating, say $\lambda_1$, so that $| \lambda_1 | > | \lambda_2 | \geq ... \geq | \lambda_n |$, then we can approximate the eigenvalue $\lambda_1$ iteratively.

The scratch work on the page linked above was rather messy, so we will develop the method more generally here and then look at an example. Let $x^{(0)}$ be a nonzero vector such that $\| x^{(0)} \|_{\infty} = \max_{1 \leq i \leq n} | x_i | = 1$, so that each component of the vector $x^{(0)}$ lies between $-1$ and $1$. Then define successive approximations by:

(1)
\begin{align} \quad x^{(1)} = Ax^{(0)} \\ \quad x^{(2)} = Ax^{(1)} = A(Ax^{(0)}) = A^2x^{(0)} \\ \quad \quad \vdots \quad \quad \\ \quad x^{(k)} = Ax^{(k-1)} = A^2x^{(k-2)} = ... = A^kx^{(0)} \end{align}

When appropriately scaled, the vectors $x^{(k)}$ are approximations to an eigenvector corresponding to the dominating eigenvalue. However, we're looking for the dominating eigenvalue itself. The following theorem will give us a way to recover this eigenvalue from the eigenvector approximations.
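In practice, each iterate is rescaled (here by the $\infty$-norm, matching the normalization $\| x^{(0)} \|_{\infty} = 1$ above) so that the components stay between $-1$ and $1$ and do not overflow. A minimal sketch of the iteration in Python; the helper names `mat_vec` and `power_iterates` are my own, not standard:

```python
def mat_vec(A, x):
    # Multiply an n x n matrix (given as a list of rows) by a vector.
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def power_iterates(A, x0, steps):
    # Apply x <- Ax repeatedly, rescaling by the infinity norm at each
    # step so that every component stays between -1 and 1.
    x = x0
    for _ in range(steps):
        x = mat_vec(A, x)
        scale = max(abs(c) for c in x)  # infinity norm of the iterate
        x = [c / scale for c in x]
    return x
```

For the matrix of Example 1 below, fifty iterations from $x^{(0)} = (0.5, 0.5)$ settle on the direction $(1, 0.6)$, which is the eigenvector $(5, 3)$ scaled to unit $\infty$-norm.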

Theorem 1: Let $x$ be an eigenvector of $A$. Then the eigenvalue $\lambda$ corresponding to this eigenvector is given by $\lambda = \frac{Ax \cdot x}{x \cdot x}$.

The formula above is called the Rayleigh Quotient for the eigenvalue $\lambda$.

  • Proof: Suppose that $x$ is an eigenvector of $A$ with corresponding eigenvalue $\lambda$, so that $Ax = \lambda x$. Substituting this into the Rayleigh quotient, we have that:
(2)
\begin{align} \quad \frac{Ax \cdot x}{x \cdot x} = \frac{\lambda x \cdot x}{x \cdot x} = \lambda \frac{x \cdot x}{x \cdot x} = \lambda \quad \blacksquare \end{align}
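The Rayleigh quotient is immediate to compute from two dot products. A short Python sketch (the function name `rayleigh_quotient` is my own):

```python
def rayleigh_quotient(A, x):
    # (Ax . x) / (x . x); by Theorem 1, this equals the eigenvalue
    # exactly when x is an eigenvector of A.
    Ax = [sum(a * xi for a, xi in zip(row, x)) for row in A]
    return sum(a * b for a, b in zip(Ax, x)) / sum(b * b for b in x)
```

For instance, $(5, 3)$ is an eigenvector of the matrix in Example 1 below, and the quotient for it evaluates to exactly $6$.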

Let's now look at an example of finding an eigenvalue using this method.

Example 1

Find the dominating eigenvalue of the matrix $A = \begin{bmatrix} 3 & 5\\ 3 & 1 \end{bmatrix}$.

You should verify using the characteristic polynomial that the eigenvalues of $A$ are $-2$ and $6$. In particular, $6$ is the dominating eigenvalue.

We will now use the power method to show again that $6$ is an eigenvalue (though for a small matrix like this one, the characteristic polynomial approach is much easier). Let $x^{(0)} = \begin{bmatrix} 0.5\\ 0.5 \end{bmatrix}$ be our initial approximation. Then:

(3)
\begin{align} \quad x^{(1)} = Ax^{(0)} = \begin{bmatrix} 3 & 5\\ 3 & 1 \end{bmatrix} \begin{bmatrix} 0.5\\ 0.5 \end{bmatrix} = \begin{bmatrix} 4 \\ 2 \end{bmatrix} \\ \quad x^{(2)} = Ax^{(1)} = \begin{bmatrix} 3 & 5\\ 3 & 1 \end{bmatrix} \begin{bmatrix} 4 \\ 2 \end{bmatrix} = \begin{bmatrix} 22 \\ 14 \end{bmatrix} \\ \quad x^{(3)} = Ax^{(2)} = \begin{bmatrix} 3 & 5\\ 3 & 1 \end{bmatrix} \begin{bmatrix} 22 \\ 14 \end{bmatrix} = \begin{bmatrix} 136 \\ 80 \end{bmatrix} \\ \quad x^{(4)} = Ax^{(3)} = \begin{bmatrix} 3 & 5\\ 3 & 1 \end{bmatrix} \begin{bmatrix} 136 \\ 80 \end{bmatrix} = \begin{bmatrix} 808 \\ 488 \end{bmatrix}\\ \quad x^{(5)} = Ax^{(4)} = \begin{bmatrix} 3 & 5\\ 3 & 1 \end{bmatrix} \begin{bmatrix} 808 \\ 488 \end{bmatrix} = \begin{bmatrix} 4864 \\ 2912 \end{bmatrix} \end{align}

We will now scale our eigenvector approximations by factoring out the second component of each:

(4)
\begin{align} \quad x^{(1)} = 2 \begin{bmatrix} 2 \\ 1 \end{bmatrix} \\ \quad x^{(2)} = 14 \begin{bmatrix} 1.571 \\ 1 \end{bmatrix} \\ \quad x^{(3)} = 80 \begin{bmatrix} 1.700 \\ 1 \end{bmatrix} \\ \quad x^{(4)} = 488 \begin{bmatrix} 1.656 \\ 1 \end{bmatrix} \\ \quad x^{(5)} = 2912 \begin{bmatrix} 1.670 \\ 1 \end{bmatrix} \end{align}

Let's now compute $A\tilde{x}^{(5)}$, where $\tilde{x}^{(5)} = \begin{bmatrix} 1.670 \\ 1 \end{bmatrix}$ is our scaled eigenvector approximation of $x^{(5)}$. We have that:

(5)
\begin{align} \quad A\tilde{x}^{(5)} = \begin{bmatrix} 3 & 5\\ 3 & 1 \end{bmatrix} \begin{bmatrix} 1.670 \\ 1 \end{bmatrix} = \begin{bmatrix} 10.010 \\ 6.010 \end{bmatrix} \end{align}

We now apply the formula given in Theorem 1 with $x = \tilde{x}^{(5)}$:

(6)
\begin{align} \quad \frac{Ax \cdot x}{x \cdot x} = \frac{\begin{bmatrix} 10.010 \\ 6.010 \end{bmatrix} \cdot \begin{bmatrix} 1.670 \\ 1 \end{bmatrix}}{\begin{bmatrix} 1.670 \\ 1 \end{bmatrix} \cdot \begin{bmatrix} 1.670 \\ 1 \end{bmatrix}} = \frac{22.7267}{3.7889} \approx 6 \end{align}
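The whole example can be checked numerically by combining the power iteration with the Rayleigh quotient. A sketch in Python (the function name `dominant_eigenvalue` is my own); note that the Rayleigh quotient is scale-invariant, so rescaling the iterates does not change the estimate:

```python
def dominant_eigenvalue(A, x0, steps):
    # Run `steps` power iterations with infinity-norm rescaling,
    # then estimate the eigenvalue via the Rayleigh quotient.
    x = x0
    for _ in range(steps):
        x = [sum(a * xi for a, xi in zip(row, x)) for row in A]
        scale = max(abs(c) for c in x)
        x = [c / scale for c in x]
    Ax = [sum(a * xi for a, xi in zip(row, x)) for row in A]
    return sum(a * b for a, b in zip(Ax, x)) / sum(b * b for b in x)

A = [[3, 5], [3, 1]]
estimate = dominant_eigenvalue(A, [0.5, 0.5], 5)  # about 5.998
```

Five iterations already give an estimate within about $0.002$ of the true dominating eigenvalue $6$; running more iterations tightens it further, since the error shrinks like $| \lambda_2 / \lambda_1 |^k = (1/3)^k$.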