Taylor's Theorem and The Lagrange Remainder
We are about to look at a crucially important theorem known as Taylor's Theorem. Before we do so, though, we must first look at the following extension of the Mean Value Theorem, which will be needed in our proof.
Theorem 1 (Cauchy's Mean Value Theorem): Suppose that $f$ and $g$ are continuous on the closed interval $[a, b]$ and differentiable on $(a, b)$. If $g(a) \neq g(b)$ and $g'(t) \neq 0$ for all $t \in (a, b)$, then there exists a $\mu \in (a, b)$ such that $\frac{f(b) - f(a)}{g(b) - g(a)} = \frac{f'(\mu)}{g'(\mu)}$.
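As a quick sanity check (with our own illustrative choice of functions, not part of the theorem), take $f(t) = t^2$ and $g(t) = t^3$ on $[1, 2]$, noting that $g(1) \neq g(2)$ and $g'(t) = 3t^2 \neq 0$ on $(1, 2)$:
\begin{align} \quad \frac{f(2) - f(1)}{g(2) - g(1)} = \frac{4 - 1}{8 - 1} = \frac{3}{7} \quad \mathrm{and} \quad \frac{f'(\mu)}{g'(\mu)} = \frac{2\mu}{3\mu^2} = \frac{2}{3\mu} \end{align}
Setting $\frac{2}{3\mu} = \frac{3}{7}$ gives $\mu = \frac{14}{9} \approx 1.56$, which indeed lies in $(1, 2)$.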
We will now state and prove Taylor's Theorem, which provides us with a formula for determining the error $E_n$ between $P_n$ and $f$.
Theorem 2 (Taylor's Theorem): Suppose that $f$ is $n + 1$ times differentiable on some interval containing the center of convergence $c$ and $x$, and let $P_n(x) = f(c) + \frac{f^{(1)}(c)}{1!}(x - c) + \frac{f^{(2)}(c)}{2!}(x - c)^2 + ... + \frac{f^{(n)}(c)}{n!}(x - c)^n$ be the $n^{\mathrm{th}}$ order Taylor polynomial of $f$ at $x = c$. Then $f(x) = P_n(x) + E_n(x)$ where $E_n(x) = f(x) - P_n(x)$ is the error of approximating $f(x)$ by $P_n(x)$, and for some $\xi$ between $c$ and $x$, the Lagrange Remainder form of the error $E_n$ is given by the formula $E_n(x) = \frac{f^{(n+1)}(\xi)}{(n + 1)!}(x - c)^{n+1}$.
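For example (a standard illustration with our own choice of function, not part of the theorem itself), take $f(x) = e^x$ with $c = 0$. Every derivative of $f$ is $e^x$, so for $x = 1$ and any $n$ the Lagrange form gives, for some $\xi \in (0, 1)$:
\begin{align} \quad E_n(1) = \frac{e^{\xi}}{(n+1)!} \cdot 1^{n+1} \leq \frac{e}{(n+1)!} \end{align}
With $n = 4$ this bound is $\frac{e}{120} \approx 0.0227$, while the actual error is $e - P_4(1) = e - \left ( 1 + 1 + \frac{1}{2} + \frac{1}{6} + \frac{1}{24} \right ) \approx 2.71828 - 2.70833 \approx 0.00995$, well within the bound.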
Another form of the error $E_n(x)$, known as The Integral Remainder, is given by the formula $E_n(x) = \frac{1}{n!} \int_c^x (x - t)^n f^{(n+1)}(t) \: dt$. We will look into this form of the remainder soon.
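As a quick check of the integral form (again with our own illustrative choice $f(x) = e^x$, $c = 0$, $n = 1$), integration by parts gives:
\begin{align} \quad E_1(x) = \frac{1}{1!} \int_0^x (x - t) e^t \: dt = \left [ (x - t)e^t \right ]_{t=0}^{t=x} + \int_0^x e^t \: dt = -x + (e^x - 1) = e^x - (1 + x) \end{align}
This is exactly $f(x) - P_1(x)$, since $P_1(x) = 1 + x$.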
- Proof: We will prove Taylor's Theorem using mathematical induction. First, consider the case where $n = 0$. Then $P_0(x) = f(c)$, and the claim is that $E_0(x) = \frac{f'(\xi)}{1!}(x - c) = f'(\xi)(x - c)$ for some $\xi$ between $c$ and $x$, that is:
\begin{align} f(x) = P_0(x) + E_0(x) \\ f(x) = f(c) + f'(\xi) (x - c) \\ f(x) - f(c) = f'(\xi)(x - c) \\ f'(\xi) = \frac{f(x) - f(c)}{x - c} \end{align}
- We see that the last equation is exactly the conclusion of the Mean Value Theorem, which guarantees that such a $\xi$ between $c$ and $x$ exists. Hence the case where $n = 0$ is true, that is, $f(x) = P_0(x) + E_0(x)$.
- Now consider the case where $n = 1$. The error $E_1(x) = \frac{f''(\xi)}{2!}(x - c)^2$ for some $\xi$ between $c$ and $x$ is simply the familiar error formula for the linearization of $f$ centered at $x = c$, so the case where $n = 1$ is true, that is, $f(x) = P_1(x) + E_1(x)$.
- Now suppose that Theorem 2 is true for the case when $n = k - 1$ where $k \geq 2$; that is, for any function $f$ that is $k$ times differentiable on some interval containing the center of convergence $c$ and $x$, the Lagrange remainder for the error is given by $E_{k-1}(x) = \frac{f^{(k)}(\xi)}{k!} (x - c)^k$ for some $\xi$ between $c$ and $x$.
- Now let $f$ be $k + 1$ times differentiable on an interval containing $c$ and $x$, and write $E_k(t) = f(t) - P_k(t)$, so that $E_k(c) = 0$. Assume that $x > c$. If $g(t) = (t - c)^{k+1}$ (so that $g(c) = 0$ and $g'(t) \neq 0$ on $(c, x)$), then applying Cauchy's Mean Value Theorem to $E_k(t)$ and $g(t)$ on the interval $[c, x]$, we get that for some $\mu \in (c, x)$:
\begin{align} \quad \frac{E_k(x) - E_k(c)}{g(x) - g(c)} = \frac{E_k'(\mu)}{g'(\mu)} \\ \quad \frac{E_k(x) - E_k(c)}{(x - c)^{k+1} - (c - c)^{k+1}} = \frac{E_k'(\mu)}{(k + 1)(\mu - c)^k} \\ \quad \frac{E_k(x)}{(x - c)^{k+1}} = \frac{E_k'(\mu)}{(k + 1)(\mu - c)^k} \end{align}
- Now notice that:
\begin{align} \quad E_k'(\mu) = \frac{d}{dt} \left ( f(t) - \left [ f(c) + f^{(1)}(c)(t - c) + \frac{f^{(2)}(c)}{2!}(t - c)^2 + ... + \frac{f^{(k)}(c)}{k!}(t - c)^k \right ] \right ) \biggr \rvert_{t = \mu} \\ \quad E_k'(\mu) = f'(\mu) - \left [ f^{(1)}(c) + f^{(2)}(c)(\mu - c) + \frac{f^{(3)}(c)}{2!}(\mu - c)^2 + ... + \frac{f^{(k)}(c)}{(k-1)!}(\mu - c)^{k-1} \right ] \end{align}
- We can see that $E_k'(\mu)$ is exactly the error at $t = \mu$ between $f'$ and its $(k - 1)^{\mathrm{th}}$ order Taylor polynomial centered at $x = c$. Since $f'$ is $k$ times differentiable, the induction hypothesis applies to $f'$, and so for some $\xi$ between $c$ and $\mu$:
\begin{align} \quad E_k'(\mu) = \frac{(f')^{(k)}(\xi)}{k!}(\mu - c)^k = \frac{f^{(k+1)}(\xi)}{k!}(\mu - c)^k \end{align}
- Substituting this into the Cauchy's Mean Value Theorem equation from earlier, we get that for some number $\xi \in (c, \mu)$ (noting that if $\xi \in (c, \mu)$ then $\xi \in (c, x)$ since $\mu \in (c, x)$):
\begin{align} \quad \frac{E_k(x)}{(x - c)^{k+1}} = \frac{\frac{f^{(k+1)}(\xi)}{k!}(\mu - c)^k}{(k + 1)(\mu - c)^k} \\ \quad \frac{E_k(x)}{(x - c)^{k+1}} = \frac{f^{(k+1)}(\xi)}{(k+1)!} \\ \quad E_k(x) = \frac{f^{(k+1)}(\xi)}{(k+1)!} (x - c)^{k+1} \end{align}
- A similar argument can be constructed for when $x < c$. Thus we have shown by induction that Theorem 2 is true. $\blacksquare$
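To see the formula in action on a case where everything can be computed exactly (our own illustrative choice, not part of the proof), take $f(x) = x^3$, $c = 0$, and $n = 2$. Since $f(0) = f'(0) = f''(0) = 0$ we have $P_2(x) = 0$, and $f'''(t) = 6$ for every $t$, so for any $\xi$:
\begin{align} \quad E_2(x) = \frac{f'''(\xi)}{3!} (x - 0)^3 = \frac{6}{6} x^3 = x^3 = f(x) - P_2(x) \end{align}
Here the Lagrange remainder formula holds for every choice of $\xi$, since the third derivative is constant.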
Corollary 1: Suppose that the derivatives of $f$ of all orders exist on an interval containing $c$ and $x$. Then Taylor's Theorem holds for every $n$, and if for a given $x$ we have that $\lim_{n \to \infty} E_n(x) = 0$, then $\lim_{n \to \infty} P_n(x) = f(x)$.
- Proof: Taking the limit as $n \to \infty$ on both sides of Taylor's formula $f(x) - P_n(x) = E_n(x)$, we get that:
\begin{align} \quad \lim_{n \to \infty} \left [f(x) - P_n(x) \right ] = \lim_{n \to \infty} E_n(x) \\ \quad \lim_{n \to \infty} \left [f(x) - P_n(x) \right ] = 0 \end{align}
- Therefore as $n \to \infty$, $P_n(x) \to f(x)$ and so $\lim_{n \to \infty} P_n(x) = f(x)$. $\blacksquare$
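For example (a standard application of Corollary 1, with our own choice of function), take $f(x) = e^x$ centered at $c = 0$. For any fixed $x$ and any $\xi$ between $0$ and $x$ we have $|f^{(n+1)}(\xi)| = e^{\xi} \leq e^{|x|}$, so:
\begin{align} \quad |E_n(x)| \leq \frac{e^{|x|} |x|^{n+1}}{(n+1)!} \to 0 \quad \mathrm{as} \: n \to \infty \end{align}
since the factorial eventually dominates the power. Hence $\lim_{n \to \infty} P_n(x) = e^x$ for every $x$, that is, the Maclaurin series of $e^x$ converges to $e^x$ on all of $\mathbb{R}$.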