Introduction To Limits

# Defining Limits

We will now look at one of the fundamental building blocks of calculus: limits! We will start by defining the notion of a limit informally.

Definition (Informal): If $f$ is a function, then we say that the Limit as $x$ Approaches $a$ is $L$, written $\lim_{x \to a} f(x) = L$, if $f(x)$ gets arbitrarily close to $L$ as $x$ gets sufficiently close to $a$ from both the left and right sides of $a$.

The definition above is usually sufficient for most introductory calculus classes; however, the formal definition below will be necessary for more advanced courses. If you are taking an introductory calculus class, you may skip the formal definition of a limit below and carry on.

Definition (Formal): Suppose that $f$ is a function defined on an open interval $I$ containing the value $a$, except possibly at $a$ itself. We write $\lim_{x \to a} f(x) = L$ if for every $\epsilon > 0$ there is a $\delta > 0$ such that if $0 < |x - a| < \delta$, then $|f(x) - L| < \epsilon$. In other words, $f(x)$ can be made arbitrarily close to $L$ by making the distance from $x$ to $a$ sufficiently small.

We will now look at some rather elementary examples of limits. Consider the function $f(x) = 4x + 3$, and suppose we want to calculate $\lim_{x \to 1} f(x)$. From the graph of $f$, we see clearly that $\lim_{x \to 1} f(x) = 7$. However, suppose we want to find a condition such that $f(x)$ differs from our limit $L = 7$ by less than $0.01$. Reworded, we want to find a condition such that $|f(x) - 7| < 0.01$.

We therefore need to find a value $\delta$ such that if $0 < |x - 1| < \delta$, then $|f(x) - 7| < 0.01$. We note that $f(1.0025) = 7.01$ and that $f(0.9975) = 6.99$. Therefore, if $0 < |x - 1| < 0.0025$, then $|f(x) - 7| < 0.01$.

By our definition of a limit, for any error tolerance $\epsilon$ we can guarantee that $|f(x) - L| < \epsilon$ by making the distance between $x$ and $a$ sufficiently small, that is, $0 < |x - a| < \delta$.
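This $\epsilon$–$\delta$ relationship can be checked numerically. The sketch below is our own illustration (the sample points and variable names are not from the text): it samples many $x$ with $0 < |x - 1| < 0.0025$ and confirms that $|f(x) - 7| < 0.01$ holds at each one.

```python
def f(x):
    # the example function from the text
    return 4 * x + 3

a = 1            # point being approached
L = 7            # claimed limit as x -> a
epsilon = 0.01   # error tolerance
delta = 0.0025   # candidate delta from the worked example

# sample many points with 0 < |x - a| < delta on both sides of a
# and check that |f(x) - L| < epsilon at every one of them
ok = all(
    abs(f(a + side * t * delta) - L) < epsilon
    for side in (-1, 1)
    for t in [k / 1000 for k in range(1, 1000)]  # t in (0, 1)
)
print(ok)  # True: every sampled x satisfies the epsilon condition
```

Note that no finite sample can prove the condition for all $x$; here it simply illustrates the claim, which for this linear $f$ follows from $|f(x) - 7| = 4|x - 1|$.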

# Evaluating Basic Limits

Consider the function $f(x) = x^2 + x$ illustrated below. Suppose we want to evaluate $\lim_{x \to 0} f(x)$. As $x \to 0$, our function $f$ appears to approach $0$ as well, that is, $f(x) \to 0$. We thus say that $\lim_{x \to 0} f(x) = 0$. In fact, we can verify this by direct substitution:

(1)
\begin{align} f(x) &= x^2 + x \\ f(0) &= 0^2 + 0 = 0 \end{align}
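As a quick numerical sanity check (our own sketch, not part of the text), we can tabulate $f(x) = x^2 + x$ as $x$ approaches $0$ from both sides and watch the values shrink toward $0$:

```python
def f(x):
    # the example function f(x) = x^2 + x
    return x**2 + x

# approach 0 from the left and from the right
for x in (-0.1, -0.01, -0.001, 0.001, 0.01, 0.1):
    print(f"x = {x:>7}, f(x) = {f(x):.6f}")
```

The printed values get closer and closer to $0$ as $x$ does, matching the substitution above.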

There are some cases where we cannot verify a limit this way, though. Consider the function $g(x) = \frac{1}{x^2}$. If we wanted to evaluate $\lim_{x \to 0} g(x)$, we couldn't simply compute $g(0)$ to find our answer, since $g(0) = \frac{1}{0^2}$ is undefined. Recall from our definition of limits that the function doesn't have to be defined at the value $x$ approaches, and in this case it isn't. We can still find the limit, though.

From the graph, it appears that on both sides of $x = 0$ the function begins to "explode" and get very large. Using a table, we can see this rather clearly as we take negative values of $x$ very close to $0$ (but not $0$) from the left side:

| $x$ | $g(x)$ |
| --- | --- |
| $-0.1$ | $100$ |
| $-0.01$ | $10000$ |
| $-0.001$ | $1000000$ |

We see the same thing happening on the right side as we take positive values of $x$ that are very close to $0$ but not $0$:

| $x$ | $g(x)$ |
| --- | --- |
| $0.1$ | $100$ |
| $0.01$ | $10000$ |
| $0.001$ | $1000000$ |
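The two tables above can be reproduced with a few lines of code (a minimal sketch using the same sample points as the text):

```python
def g(x):
    # g(x) = 1/x^2 from the example; undefined at x = 0
    return 1 / x**2

# left side: x -> 0 through negative values
for x in (-0.1, -0.01, -0.001):
    print(f"x = {x}, g(x) = {g(x):.0f}")

# right side: x -> 0 through positive values
for x in (0.1, 0.01, 0.001):
    print(f"x = {x}, g(x) = {g(x):.0f}")
```

Because $x$ is squared, both sides produce the same values: $100$, $10000$, $1000000$, growing without bound as $x$ nears $0$.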

As we can imagine, if we take $x$ closer and closer to $0$, then $g(x)$ grows without bound, and we therefore write $\lim_{x \to 0} g(x) = \infty$.