The Power Method
We will now look at a numerical method for approximating one eigenvalue of an $n \times n$ matrix - in particular, the dominant eigenvalue, that is, the eigenvalue of largest absolute value. Assume that $A$ is an $n \times n$ matrix that has $n$ distinct eigenvalues $\lambda_1$, $\lambda_2$, …, $\lambda_n$. Further assume that among these eigenvalues there is one whose absolute value is strictly largest - without loss of generality, assume that this eigenvalue is $\lambda_1$, so that:
(1)
\begin{align} \quad \mid \lambda_1 \mid \: > \: \mid \lambda_2 \mid \: \geq \: ... \: \geq \: \mid \lambda_n \mid \end{align}
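For a concrete illustration (a running example of our own choosing, not part of the general argument), take:
\begin{align} \quad A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix} \end{align}
This matrix has eigenvalues $\lambda_1 = 3$ and $\lambda_2 = 1$, and since $\mid 3 \mid > \mid 1 \mid$, the dominance assumption above is satisfied.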
Now we will let $v^{(1)}$, $v^{(2)}$, …, $v^{(n)}$ be nonzero eigenvectors corresponding to the eigenvalues $\lambda_1$, $\lambda_2$, …, $\lambda_n$. Since the eigenvalues are distinct, the set of eigenvectors $\{ v^{(1)}, v^{(2)}, ..., v^{(n)} \}$ is linearly independent. Let $x^{(0)}$ be a vector whose infinity norm is equal to $1$, that is $\| x^{(0)} \|_{\infty} = 1 = x_{p_0}^{(0)}$ for some component $x_{p_0}^{(0)}$ of $x^{(0)}$ (recall that $\| x \|_{\infty} = \max_{1 \leq i \leq n} \mid x_i \mid$). Thus all of the components of $x^{(0)}$ are between $-1$ and $1$ in value. Now since $\{ v^{(1)}, v^{(2)}, ..., v^{(n)} \}$ is a linearly independent set of vectors, and hence a basis of $\mathbb{R}^n$, for $x^{(0)} \in \mathbb{R}^n$ there exist constants $\beta_1$, $\beta_2$, …, $\beta_n$ such that
(2)
\begin{align} \quad x^{(0)} = \beta_1v^{(1)} + \beta_2v^{(2)} + ... + \beta_nv^{(n)} = \sum_{j=1}^{n} \beta_jv^{(j)} \end{align}
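In the running example, the eigenvectors $v^{(1)} = (1, 1)^T$ and $v^{(2)} = (1, -1)^T$ correspond to $\lambda_1 = 3$ and $\lambda_2 = 1$, and choosing $x^{(0)} = (1, 0)^T$ (so that $\| x^{(0)} \|_{\infty} = 1$ with $p_0 = 1$) gives:
\begin{align} \quad x^{(0)} = \frac{1}{2} v^{(1)} + \frac{1}{2} v^{(2)} \end{align}
That is, $\beta_1 = \beta_2 = \frac{1}{2}$.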
Now we multiply this vector by the matrix $A$ to get:
(3)
\begin{align} \quad Ax^{(0)} = A\sum_{j=1}^{n} \beta_j v^{(j)} = \sum_{j=1}^{n} \beta_jAv^{(j)} = \sum_{j=1}^{n} \beta_j \lambda_j v^{(j)} = \beta_1 \lambda_1 v^{(1)} + \beta_2 \lambda_2 v^{(2)} + ... + \beta_n \lambda_n v^{(n)} \end{align}
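In the running example this reads:
\begin{align} \quad Ax^{(0)} = \frac{1}{2}(3) v^{(1)} + \frac{1}{2}(1) v^{(2)} = \begin{pmatrix} 2 \\ 1 \end{pmatrix} \end{align}
This agrees with computing $Ax^{(0)}$ directly from $x^{(0)} = (1, 0)^T$.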
We then multiply the original equation $x^{(0)} = \sum_{j=1}^{n} \beta_jv^{(j)}$ by $A^2$ to get:
(4)
\begin{align} \quad A^2x^{(0)} = A^2\sum_{j=1}^{n} \beta_j v^{(j)} = \sum_{j=1}^{n} \beta_jA^2v^{(j)} = \sum_{j=1}^{n} \beta_j \lambda_j^2 v^{(j)} = \beta_1 \lambda_1^2 v^{(1)} + \beta_2 \lambda_2^2 v^{(2)} + ... + \beta_n \lambda_n^2 v^{(n)} \end{align}
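Likewise in the example, $A^2 x^{(0)} = \frac{1}{2}(9) v^{(1)} + \frac{1}{2}(1) v^{(2)} = (5, 4)^T$, which matches $A(2, 1)^T$ computed directly.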
We continue to multiply the original equation $x^{(0)} = \sum_{j=1}^{n} \beta_jv^{(j)}$ by successively larger powers of $A$; that is, we multiply it by $A$, $A^2$, …, $A^k$, … Multiplying this equation by the matrix $A^k$ gives:
(5)
\begin{align} \quad A^kx^{(0)} = A^k\sum_{j=1}^{n} \beta_j v^{(j)} = \sum_{j=1}^{n} \beta_jA^kv^{(j)} = \sum_{j=1}^{n} \beta_j \lambda_j^k v^{(j)} = \beta_1 \lambda_1^k v^{(1)} + \beta_2 \lambda_2^k v^{(2)} + ... + \beta_n \lambda_n^k v^{(n)} \end{align}
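As a quick numerical sanity check of equation (5), the following minimal NumPy sketch expands $x^{(0)}$ in the eigenbasis and compares $A^k x^{(0)}$ against $\sum_j \beta_j \lambda_j^k v^{(j)}$. The matrix and starting vector are the illustrative running example, not anything fixed by the general derivation.

```python
import numpy as np

# Check equation (5): A^k x0 equals sum_j beta_j * lambda_j^k * v_j.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
x0 = np.array([1.0, 0.0])            # ||x0||_inf = 1

lam, V = np.linalg.eig(A)            # columns of V are eigenvectors v_j
beta = np.linalg.solve(V, x0)        # coefficients beta_j in x0 = V @ beta

k = 6
lhs = np.linalg.matrix_power(A, k) @ x0
rhs = V @ (beta * lam**k)            # sum_j beta_j lambda_j^k v_j
print(np.allclose(lhs, rhs))         # True
```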
We will now factor out $\lambda_1^k$ to get:
(6)
\begin{align} \quad A^kx^{(0)} = \sum_{j=1}^{n} \beta_j \left ( \frac{\lambda_j}{\lambda_1} \lambda_1 \right )^kv^{(j)} = \lambda_1^k \sum_{j=1}^{n} \beta_j \left ( \frac{\lambda_j}{\lambda_1} \right )^k v^{(j)} = \lambda_1^k \left ( \beta_1 \left ( \frac{\lambda_1}{\lambda_1}\right )^k v^{(1)} + \beta_2 \left ( \frac{\lambda_2}{\lambda_1}\right )^k v^{(2)} + ... + \beta_n \left ( \frac{\lambda_n}{\lambda_1}\right )^k v^{(n)} \right ) \end{align}
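In the running example, equation (6) becomes:
\begin{align} \quad A^k x^{(0)} = 3^k \left ( \frac{1}{2} v^{(1)} + \left ( \frac{1}{3} \right )^k \frac{1}{2} v^{(2)} \right ) \end{align}
The second term is damped by the factor $\left ( \frac{1}{3} \right )^k$ as $k$ grows, leaving only the direction of $v^{(1)}$.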
If we take the limit of both sides as $k \to \infty$, then since $\biggr \rvert \frac{\lambda_j}{\lambda_1} \biggr \rvert < 1$ for $j = 2, 3, ..., n$, all terms but the first vanish and we see that:
(7)
\begin{align} \quad \lim_{k \to \infty} A^k x^{(0)} = \lim_{k \to \infty} \lambda_1^k \left ( \beta_1 \left ( \frac{\lambda_1}{\lambda_1}\right )^k v^{(1)} + \beta_2 \left ( \frac{\lambda_2}{\lambda_1}\right )^k v^{(2)} + ... + \beta_n \left ( \frac{\lambda_n}{\lambda_1}\right )^k v^{(n)} \right ) \\ \quad \lim_{k \to \infty} A^k x^{(0)} = \lim_{k \to \infty} \lambda_1^k \beta_1 v^{(1)} \end{align}
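To see the limit in equation (7) numerically, the following sketch (same illustrative data as before, where $\beta_1 = \frac{1}{2}$ and $v^{(1)} = (1,1)^T$) divides $A^k x^{(0)}$ by $\lambda_1^k$ and watches the result settle to $\beta_1 v^{(1)}$:

```python
import numpy as np

# Sketch of the limit (7): lambda_1^{-k} A^k x0 approaches beta_1 v^(1).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
x0 = np.array([1.0, 0.0])
lam1 = 3.0                           # dominant eigenvalue of this example A

for k in (1, 2, 5, 10, 20):
    print(k, np.linalg.matrix_power(A, k) @ x0 / lam1**k)
# the printed vectors tend to (0.5, 0.5) = beta_1 * v^(1)
```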
Note that this limit converges to $0$ if $\mid \lambda_1 \mid < 1$ and diverges if $\mid \lambda_1 \mid > 1$ (provided that $\beta_1 \neq 0$), which is why the iterates will be rescaled at each step to keep their entries a manageable size. Now let $y^{(1)} = Ax^{(0)}$ and $\mu^{(1)} = y_{p_0}^{(1)}$. Noting that $x_{p_0}^{(0)} = 1$, we have:
(8)
\begin{align} \quad \mu^{(1)} = y_{p_0}^{(1)} = \frac{y_{p_0}^{(1)}}{x_{p_0}^{(0)}} = \frac{\sum_{j=1}^{n} \beta_j \lambda_j v_{p_0}^{(j)}}{\sum_{j=1}^{n} \beta_j v_{p_0}^{(j)}} = \frac{\beta_1 \lambda_1 v_{p_0}^{(1)} + \lambda_1 \sum_{j=2}^{n} \beta_j \left ( \frac{\lambda_j}{\lambda_1} \right ) v_{p_0}^{(j)}}{\beta_1 v_{p_0}^{(1)} + \sum_{j=2}^{n} \beta_j v_{p_0}^{(j)}} = \lambda_1 \left ( \frac{\beta_1 v_{p_0}^{(1)} + \sum_{j=2}^{n} \beta_j \left ( \frac{\lambda_j}{\lambda_1} \right ) v_{p_0}^{(j)}}{\beta_1 v_{p_0}^{(1)} + \sum_{j=2}^{n} \beta_j v_{p_0}^{(j)}} \right ) \end{align}
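In the running example, $y^{(1)} = Ax^{(0)} = (2, 1)^T$ and $p_0 = 1$, so $\mu^{(1)} = y_1^{(1)} = 2$, a first (still rough) estimate of $\lambda_1 = 3$.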
Now for step $k$, let $y^{(k)} = Ax^{(k-1)}$, and let $p_k$ be the smallest integer for which $\mid y_{p_k}^{(k)} \mid = \| y^{(k)} \|_{\infty}$. Define:
(9)
\begin{align} \quad \mu^{(k)} = y_{p_{k-1}}^{(k)} = \lambda_1 \left ( \frac{\beta_1 v_{p_{k-1}}^{(1)} + \sum_{j=2}^n \beta_j \left ( \frac{\lambda_j}{\lambda_1} \right )^k v_{p_{k-1}}^{(j)}}{\beta_1 v_{p_{k-1}}^{(1)} + \sum_{j=2}^{n} \beta_j \left ( \frac{\lambda_j}{\lambda_1} \right ) ^{k-1} v_{p_{k-1}}^{(j)}} \right ) \end{align}
And define:
(10)
\begin{align} \quad x^{(k)} = \frac{y^{(k)}}{y_{p_k}^{(k)}} = \frac{A^k x^{(0)}}{\prod_{m=1}^{k} y_{p_m}^{(m)}} \end{align}
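One more step of the running example: $\| y^{(1)} \|_{\infty} = 2$ at $p_1 = 1$, so $x^{(1)} = \frac{y^{(1)}}{2} = (1, \frac{1}{2})^T$; then $y^{(2)} = Ax^{(1)} = (\frac{5}{2}, 2)^T$ and $\mu^{(2)} = y_{p_1}^{(2)} = \frac{5}{2}$, already closer to $\lambda_1 = 3$.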
We thus obtain three sequences $\{ x^{(k)} \}$, $\{ y^{(k)} \}$, and $\{ \mu^{(k)} \}$, the last being a sequence of scalars. Notice that, provided $\beta_1 \neq 0$, we have $\lim_{k \to \infty} \mu^{(k)} = \lambda_1$ since $\biggr \rvert \frac{\lambda_j}{\lambda_1} \biggr \rvert < 1$ for $j = 2, 3, ..., n$, and $\{ x^{(k)} \}$ converges to an eigenvector corresponding to the eigenvalue $\lambda_1$.
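The iteration defined by equations (8) through (10) translates directly into code. Below is a minimal NumPy sketch, assuming a real matrix with a single dominant eigenvalue and a starting vector with $\beta_1 \neq 0$; the function name and tolerance parameter are our own choices for illustration.

```python
import numpy as np

def power_method(A, x0, tol=1e-10, max_iter=500):
    """Power method with infinity-norm scaling, following equations (8)-(10).

    Returns (mu, x): the dominant-eigenvalue estimate and a scaled
    eigenvector estimate.
    """
    x = x0 / np.max(np.abs(x0))            # enforce ||x||_inf = 1
    p = int(np.argmax(np.abs(x)))          # smallest index attaining the norm
    mu = 0.0
    for _ in range(max_iter):
        y = A @ x                          # y^(k) = A x^(k-1)
        mu_new = y[p]                      # mu^(k) = y^(k)_{p_{k-1}}
        p = int(np.argmax(np.abs(y)))      # p_k
        x = y / y[p]                       # x^(k) = y^(k) / y^(k)_{p_k}
        if abs(mu_new - mu) < tol:
            return mu_new, x
        mu = mu_new
    return mu, x

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
mu, x = power_method(A, np.array([1.0, 0.0]))
print(mu, x)                               # ~3.0 and ~(1, 1)
print(np.max(np.linalg.eigvals(A)))        # 3.0, for comparison
```

On the running example the estimates $\mu^{(k)}$ pass through $2$, $\frac{5}{2}$, $\frac{14}{5}$, … and converge to $\lambda_1 = 3$, while the vectors $x^{(k)}$ approach the eigenvector $(1, 1)^T$.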