Jordan's and Dini's Tests for Convergence of a Fourier Series at a Point
Recall from the Jordan's Theorem for Dirichlet Integrals page that if $g$ is of bounded variation on $[0, b]$ for some $b > 0$ then:

(1) $\displaystyle{\lim_{\alpha \to \infty} \frac{2}{\pi} \int_0^b g(t) \frac{\sin (\alpha t)}{t} \: dt = g(0+)}$

On the Dini's Theorem for Dirichlet Integrals page we saw that if $g(0+)$ exists and if there exists a $b > 0$ such that $\displaystyle{\int_0^b \frac{g(t) - g(0+)}{t} \: dt}$ exists as a Lebesgue integral then once again:

(2) $\displaystyle{\lim_{\alpha \to \infty} \frac{2}{\pi} \int_0^b g(t) \frac{\sin (\alpha t)}{t} \: dt = g(0+)}$

Also recall from the Riemann Localization Theorem page that if $f \in L([0, 2\pi])$ is a $2\pi$-periodic function then the Fourier series generated by $f$ converges at $x$ if and only if there exists a $b$ with $0 < b < \pi$ such that $\displaystyle{\lim_{n \to \infty} \frac{2}{\pi} \int_0^b \frac{f(x + t) + f(x - t)}{2} \frac{\sin \left ( \left ( n + \frac{1}{2} \right ) t \right )}{t} \: dt}$ exists, in which case the Fourier series generated by $f$ converges at $x$ to this limit.
We will now combine these results to obtain sufficient conditions for the Fourier series generated by a function $f$ to converge at a point $x$.
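To see how these results fit together, set $\displaystyle{g(t) = \frac{f(x + t) + f(x - t)}{2}}$. The limit appearing in the Riemann Localization Theorem is then exactly the Dirichlet integral limit of (1) and (2) taken along $\alpha = n + \frac{1}{2}$, so whenever the hypotheses of Jordan's or Dini's theorem hold for this $g$ we obtain:

$\displaystyle{\lim_{n \to \infty} \frac{2}{\pi} \int_0^b g(t) \frac{\sin \left ( \left ( n + \frac{1}{2} \right ) t \right )}{t} \: dt = g(0+) = \lim_{t \to 0+} \frac{f(x + t) + f(x - t)}{2}}$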
We first state the extremely important Jordan's Test:
Theorem 1 (Jordan's Test): Let $f \in L([0, 2\pi])$ be a $2\pi$-periodic function such that $f$ is of bounded variation on $[x - \delta, x + \delta]$ for some $\delta$ with $0 < \delta < \pi$. Then the Fourier series generated by $f$ converges at $x$ to $\displaystyle{\lim_{t \to 0+} \frac{f(x + t) + f(x - t)}{2}}$.
Note that the only required conditions for convergence at $x$ of the Fourier series generated by $f$ are that $f \in L([0, 2\pi])$ is a $2\pi$-periodic function and that $f$ is of bounded variation on some closed interval $[x - \delta, x + \delta]$ centered at $x$.
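For example (a standard illustration, not taken from the pages referenced above): let $f$ be the $2\pi$-periodic sawtooth function with $f(x) = x$ for $-\pi < x \leq \pi$. Then $f$ is of bounded variation on every closed interval, so Jordan's Test applies at every point. At the jump point $x = \pi$ we have $f(\pi - t) = \pi - t$ and $f(\pi + t) = t - \pi$ for small $t > 0$, so the Fourier series generated by $f$ converges at $\pi$ to:

$\displaystyle{\lim_{t \to 0+} \frac{f(\pi + t) + f(\pi - t)}{2} = \frac{(-\pi) + \pi}{2} = 0}$

This agrees with the Fourier series $\displaystyle{\sum_{n=1}^{\infty} \frac{2(-1)^{n+1}}{n} \sin (nx)}$ generated by $f$, every term of which vanishes at $x = \pi$.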
We now state the equally important Dini's Test:
Theorem 2 (Dini's Test): Let $f \in L([0, 2\pi])$ be a $2\pi$-periodic function and let $\displaystyle{g(t) = \frac{f(x + t) + f(x - t)}{2}}$. If $g(0+)$ exists and if there exists a $\delta$ with $0 < \delta < \pi$ such that $\displaystyle{\int_0^{\delta} \frac{g(t) - g(0+)}{t} \: dt}$ exists then the Fourier series generated by $f$ converges at $x$ to $\displaystyle{g(0+) = \lim_{t \to 0+} \frac{f(x + t) + f(x - t)}{2}}$.
- Proof: Let $f \in L([0, 2\pi])$ be a $2\pi$-periodic function and let $\displaystyle{g(t) = \frac{f(x + t) + f(x - t)}{2}}$. Assume that $g(0+)$ exists and that there exists a $\delta$ with $0 < \delta < \pi$ such that $\displaystyle{\int_0^{\delta} \frac{g(t) - g(0+)}{t} \: dt}$ exists. Then by Dini's Theorem for Dirichlet Integrals we have that:

(3) $\displaystyle{\lim_{\alpha \to \infty} \frac{2}{\pi} \int_0^{\delta} g(t) \frac{\sin (\alpha t)}{t} \: dt = g(0+)}$
- In particular, (3) holds along the sequence $\alpha = n + \frac{1}{2}$. So if $(s_n(x))$ denotes the sequence of partial sums for the Fourier series generated by $f$ at $x$, then by the Riemann Localization Theorem the Fourier series generated by $f$ converges at $x$ to:

(4) $\displaystyle{\lim_{n \to \infty} s_n(x) = \lim_{n \to \infty} \frac{2}{\pi} \int_0^{\delta} g(t) \frac{\sin \left ( \left ( n + \frac{1}{2} \right ) t \right )}{t} \: dt = g(0+) = \lim_{t \to 0+} \frac{f(x + t) + f(x - t)}{2}}$ $\blacksquare$
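As an illustration of how Dini's Test is typically applied (a standard consequence, not stated explicitly above): suppose there exist $M > 0$ and $0 < \alpha \leq 1$ such that $|f(x + t) - f(x)| \leq M |t|^{\alpha}$ for all sufficiently small $t$. Then $g(0+) = f(x)$, and for small $t > 0$:

$\displaystyle{\left | \frac{g(t) - g(0+)}{t} \right | \leq \frac{|f(x + t) - f(x)| + |f(x - t) - f(x)|}{2t} \leq M t^{\alpha - 1}}$

Since $\displaystyle{\int_0^{\delta} t^{\alpha - 1} \: dt = \frac{\delta^{\alpha}}{\alpha} < \infty}$, the Dini integral exists as a Lebesgue integral, and so the Fourier series generated by $f$ converges at $x$ to $f(x)$.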