# Taylor's Theorem for the Calculus of Finite Differences

Recall that in Calculus, Taylor's theorem states that if $f$ is a real-valued function that is $(n+1)$-times differentiable on some interval containing $c$ and $x$, then for the $n^{\mathrm{th}}$ degree Taylor polynomial $P_n$

we have that $f(x) = P_n(x) + E_n(x)$, where $E_n$ is the error term between $f(x)$ and $P_n(x)$, and for some $\xi$ between $c$ and $x$ we have that $\displaystyle{E_n(x) = \frac{f^{(n+1)}(\xi)}{(n+1)!}(x - c)^{n+1}}$.
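As a concrete sketch (my own example, not from the text): for $f(x) = e^x$ with $c = 0$, every derivative of $f$ is $e^x$, so on $[0, x]$ the remainder is bounded by $e^x \cdot x^{n+1}/(n+1)!$. A quick numerical check:

```python
from math import exp, factorial

def taylor_poly_exp(x, n):
    """Degree-n Taylor polynomial of e^x centered at c = 0."""
    return sum(x**k / factorial(k) for k in range(n + 1))

x, n = 0.5, 4
error = exp(x) - taylor_poly_exp(x, n)

# For f = exp, f^(n+1)(xi) = e^xi <= e^x for xi in [0, x], giving this bound.
bound = exp(x) * x**(n + 1) / factorial(n + 1)

assert 0 < error <= bound
```

The actual error lies within the Lagrange bound, as the theorem guarantees.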

Consider Taylor's theorem in the case of a polynomial $f(x) = a_0 + a_1x + a_2x^2 + ... + a_nx^n$. Then we have that $c = 0$ and that the coefficients of $f$ are given for each $k \in \{0, 1, 2, ..., n \}$ by:

$\displaystyle{a_k = \frac{f^{(k)}(x)}{k!} \biggr \lvert_{x=0}}$

We will now look at an analogue of Taylor's theorem for polynomials in terms of the calculus of finite differences.
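Before turning to the analogue, here is a quick numerical check of the classical coefficient formula (the code and names below are my own sketch): for a polynomial stored as a coefficient list, $f^{(k)}(0)$ is the constant term of the $k$-th formal derivative, and dividing by $k!$ recovers $a_k$.

```python
from math import factorial

def derivative(coeffs):
    """Formal derivative of a0 + a1 x + ... + an x^n, stored as [a0, ..., an]."""
    return [k * c for k, c in enumerate(coeffs)][1:]

def maclaurin_coefficients(coeffs):
    """Recover a_k = f^(k)(0) / k! by repeated formal differentiation."""
    recovered, current = [], list(coeffs)
    for k in range(len(coeffs)):
        recovered.append(current[0] / factorial(k))  # current[0] is f^(k)(0)
        current = derivative(current)
    return recovered

# The original coefficients come back out, as Taylor's theorem predicts.
print(maclaurin_coefficients([2, -1, 5, 4]))
```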

Theorem 1 (Taylor's Theorem for the Calculus of Finite Differences): If $\displaystyle{f(x) = a_0x^{\underline{0}} + a_1x^{\underline{1}} + ... + a_nx^{\underline{n}} = \sum_{k=0}^{n} a_kx^{\underline{k}}}$ then the coefficients $a_k$ of this polynomial for each $k \in \{0, 1, 2, ..., n \}$ are $a_k = \frac{\Delta^k f(x)}{k!} \biggr \lvert_{x=0}$ and so, $\displaystyle{f(x) = f(x)\biggr \lvert_{x=0} x^{\underline{0}} + \frac{\Delta f(x)}{1!} \biggr \lvert_{x=0} x^{\underline{1}} + \frac{\Delta^2 f(x)}{2!} \biggr \lvert_{x=0} x^{\underline{2}} + ... + \frac{\Delta^n f(x)}{n!} \biggr \lvert_{x=0} x^{\underline{n}} = \sum_{k=0}^{n} \frac{\Delta^k f(x)}{k!} \biggr \lvert_{x=0} x^{\underline{k}} = \sum_{k=0}^{n} \Delta^k f(x) \biggr \lvert_{x=0} \cdot \binom{x}{k}}$.
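The theorem can be verified numerically. In the sketch below (the function names are my own), `delta` applies the forward difference $\Delta f(x) = f(x+1) - f(x)$, the coefficients $\Delta^k f(x)/k! \bigr\lvert_{x=0}$ are computed by iterating it, and $f$ is rebuilt from falling factorials $x^{\underline{k}}$. For instance, $x^3 = x^{\underline{1}} + 3x^{\underline{2}} + x^{\underline{3}}$:

```python
from math import factorial

def falling(x, k):
    """Falling factorial x^(k) = x (x - 1) ... (x - k + 1)."""
    result = 1
    for i in range(k):
        result *= x - i
    return result

def delta(f):
    """Forward difference operator: (Delta f)(x) = f(x + 1) - f(x)."""
    return lambda x: f(x + 1) - f(x)

def newton_coefficients(f, n):
    """a_k = Delta^k f(x) / k! evaluated at x = 0, for k = 0, ..., n."""
    coeffs, g = [], f
    for k in range(n + 1):
        coeffs.append(g(0) / factorial(k))
        g = delta(g)
    return coeffs

f = lambda x: x**3
a = newton_coefficients(f, 3)
print(a)  # [0.0, 1.0, 3.0, 1.0], i.e. x^3 = x^(1) + 3 x^(2) + x^(3)

# Rebuilding f from the falling-factorial expansion reproduces it exactly.
for x in range(-5, 6):
    assert sum(a[k] * falling(x, k) for k in range(4)) == f(x)
```

Note the last equality of the theorem: since $x^{\underline{k}}/k! = \binom{x}{k}$, the same expansion can also be written as $\sum_k \Delta^k f(0) \binom{x}{k}$, which is Newton's forward difference formula.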