The Implicit Function Theorem

It is important to review the Systems of Multivariable Equations and Jacobian Determinants pages before reading forward.

We recently saw some interesting formulas for computing partial derivatives of implicitly defined functions of several variables on The Implicit Differentiation Formulas page. We will now delve deeper into the intuition behind these formulas.

Suppose that $y = f(x)$ is a single-variable real-valued function defined implicitly by $F(x, y) = F(x, y(x)) = 0$, and suppose that the point $(a, b)$ lies on this curve (so that $F(a, b) = 0$). Also suppose that $F$ has continuous first partial derivatives in some neighbourhood of $(a, b)$. We want to know whether or not we can solve for $y$ explicitly as a function of $x$ at all points near $(a, b)$.


For a typical closed curve such as an ellipse, there are many points $(a, b)$ for which there exists a neighbourhood around $(a, b)$ in which $y$ can be solved for in terms of $x$, as a portion of either the top or the bottom semi-ellipse. However, not all points on such a curve have this property. A neighbourhood centered at a point with a vertical tangent line will not contain a unique value of $y$ for every value of $x$, since any neighbourhood centered around one of these points will contain a portion of both the top semi-ellipse and the bottom semi-ellipse.
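As a concrete illustration (this particular ellipse is our own choice of example), take:

\begin{align} \quad F(x, y) = \frac{x^2}{4} + y^2 - 1 = 0 \quad \Longrightarrow \quad y = \pm \sqrt{1 - \frac{x^2}{4}} \end{align}

Near any point $(a, b)$ with $b > 0$ the top branch $y = \sqrt{1 - x^2/4}$ expresses $y$ as a function of $x$, and near any point with $b < 0$ the bottom branch does. But at the points $(\pm 2, 0)$, where the tangent lines are vertical, every neighbourhood meets both branches, so no single function of $x$ describes the curve there.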

Now since $F(x, y(x)) = 0$ for all $x$ near $a$, we can differentiate both sides with respect to $x$ using the chain rule to get:

\begin{align} \quad F_1 (x, y) \cdot \frac{dx}{dx} + F_2 (x, y) \cdot \frac{dy}{dx} = 0 \end{align}

If we rearrange the terms in the equation above (noting that $\frac{dx}{dx} = 1$) and solve for $\frac{dy}{dx}$, we obtain:

\begin{align} \quad \frac{dy}{dx} = - \frac{F_1 (x, y)}{F_2 (x, y)} \\ \quad \frac{dy}{dx} \biggr \rvert_{x=a} = - \frac{F_1 (a, b)}{F_2 (a, b)} \end{align}

We first saw this formula on The Implicit Differentiation Formulas page. Provided that $F_2 (a, b) \neq 0$, such a solution $y = y(x)$ will exist near $(a, b)$. Note that $F_2 (a, b) = 0$ when the tangent line at $(a, b)$ is vertical, which is exactly the problematic case described above.
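This formula is easy to check numerically. The sketch below (the specific ellipse and the helper names `F`, `F1`, `F2` are our own illustrative choices, not from the text) compares the slope given by the implicit formula with the slope of the explicit upper branch, and confirms that $F_2 = 0$ at a vertical-tangent point:

```python
import math

# Check dy/dx = -F_1/F_2 for F(x, y) = x^2/4 + y^2 - 1 = 0 (an ellipse)
# at a point (a, b) on the upper branch.

def F(x, y):  return x**2 / 4 + y**2 - 1
def F1(x, y): return x / 2        # partial derivative of F with respect to x
def F2(x, y): return 2 * y        # partial derivative of F with respect to y

a = 1.0
b = math.sqrt(1 - a**2 / 4)       # point on the upper semi-ellipse, b > 0

# Slope from the implicit formula dy/dx = -F_1/F_2.
slope_implicit = -F1(a, b) / F2(a, b)

# Slope of the explicit branch y(x) = sqrt(1 - x^2/4), by direct calculus.
slope_explicit = -(a / 4) / math.sqrt(1 - a**2 / 4)

assert abs(slope_implicit - slope_explicit) < 1e-12

# At (2, 0) the tangent line is vertical: F_2(2, 0) = 0, and the
# formula (and the explicit solution for y) breaks down there.
assert F2(2.0, 0.0) == 0.0
```

The two slopes agree wherever $F_2 \neq 0$, while the final assertion shows the formula's denominator vanishing precisely at a vertical-tangent point.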

We can then extend this idea to functions of three or more variables. The result given above is the simplest case of what is known as the Implicit Function Theorem, which we outline below.

Theorem 1 (The Implicit Function Theorem): Let $\left\{\begin{matrix} F_{(1)} (x_1, x_2, ..., x_m, y_1, y_2, ..., y_n) = 0\\ F_{(2)} (x_1, x_2, ..., x_m, y_1, y_2, ..., y_n) = 0\\ \vdots \\F_{(n)} (x_1, x_2, ..., x_m, y_1, y_2, ..., y_n) = 0 \end{matrix}\right.$ be a system of $n$ multivariable equations in $m + n$ variables, and let $P_0 (a_1, a_2, ..., a_m, b_1, b_2, ..., b_n)$ be a solution to this system. Suppose that each $F_{(i)}$, $i = 1, 2, ..., n$, has continuous first partial derivatives with respect to each of the variables $x_1, x_2, ..., x_m, y_1, y_2, ..., y_n$ near the point $P_0$, and suppose that the Jacobian determinant $\frac{\partial (F_{(1)}, F_{(2)}, ..., F_{(n)})}{\partial (y_1, y_2, ..., y_n)} \biggr \rvert_{P_0}\neq 0$. Then the system can be solved for $y_1, y_2, ..., y_n$ as functions of the variables $x_1, x_2, ..., x_m$ near $P_0$; that is, there exist functions $f_1 (x_1, x_2, ..., x_m)$, $f_2 (x_1, x_2, ..., x_m)$, …, $f_n (x_1, x_2, ..., x_m)$ such that $f_j (a_1, a_2, ..., a_m) = b_j$ for $j = 1, 2, ..., n$ and such that for all $(x_1, x_2, ..., x_m)$ close to $(a_1, a_2, ..., a_m)$ we have that $\left\{\begin{matrix} F_{(1)} (x_1, ..., x_m, f_1(x_1, ..., x_m), ..., f_n(x_1, ..., x_m)) = 0\\ F_{(2)} (x_1, ..., x_m, f_1(x_1, ..., x_m), ..., f_n(x_1, ..., x_m)) = 0\\ \vdots \\ F_{(n)} (x_1, ..., x_m, f_1(x_1, ..., x_m), ..., f_n(x_1, ..., x_m)) = 0 \end{matrix}\right.$.
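To make the hypotheses concrete, here is a small numerical sketch with $m = 1$ and $n = 2$ (the particular system, the point $P_0$, and the helper names `F`, `jacobian_y`, `det2`, `solve_y` are our own hypothetical choices, not part of the theorem). It checks that the Jacobian determinant at $P_0$ is nonzero and then recovers the implicit functions at a nearby $x$ with Newton's method:

```python
# Hypothetical system with m = 1, n = 2:
#   F_(1)(x, y1, y2) = x^2 + y1^2 + y2^2 - 3 = 0
#   F_(2)(x, y1, y2) = x + y1 - y2 - 1       = 0
# The point P_0 = (1, 1, 1) solves both equations.

def F(x, y1, y2):
    return (x**2 + y1**2 + y2**2 - 3, x + y1 - y2 - 1)

def jacobian_y(x, y1, y2):
    # Matrix of partials of (F_(1), F_(2)) with respect to (y1, y2).
    return [[2*y1, 2*y2], [1.0, -1.0]]

def det2(M):
    return M[0][0]*M[1][1] - M[0][1]*M[1][0]

# Hypothesis check: P_0 solves the system, and the Jacobian determinant
# with respect to (y1, y2) at P_0 is nonzero (here it equals -4).
assert F(1, 1, 1) == (0, 0)
assert det2(jacobian_y(1, 1, 1)) != 0

# So y1, y2 can be solved for as functions of x near x = 1.  Newton's
# method recovers (f_1(x), f_2(x)) numerically at a nearby x:
def solve_y(x, y1, y2, steps=20):
    for _ in range(steps):
        f1, f2 = F(x, y1, y2)
        M = jacobian_y(x, y1, y2)
        d = det2(M)
        # Solve M @ (dy1, dy2) = (f1, f2) by Cramer's rule, then step.
        dy1 = (f1*M[1][1] - f2*M[0][1]) / d
        dy2 = (M[0][0]*f2 - M[1][0]*f1) / d
        y1, y2 = y1 - dy1, y2 - dy2
    return y1, y2

y1, y2 = solve_y(1.1, 1.0, 1.0)
f1, f2 = F(1.1, y1, y2)
assert abs(f1) < 1e-10 and abs(f2) < 1e-10   # both equations hold at x = 1.1
```

The nonzero determinant is exactly what guarantees that the Newton step (and, in the proof, the local solution) is well defined near $P_0$.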

As a consequence of the Implicit Function Theorem and Cramer's rule, the partial derivatives of these implicitly defined functions at the point $P_0$ can be computed with the following formula involving Jacobian determinants:

\begin{align} \quad \frac{\partial f_i}{\partial x_j} = \left ( \frac{\partial y_i}{\partial x_j} \right )_{x_1, x_2, ..., x_{j-1}, x_{j+1}, ..., x_m} = - \frac{\frac{\partial (F_{(1)}, F_{(2)}, ..., F_{(n)})}{\partial (y_1, ..., y_{i-1}, x_j, y_{i+1}, ..., y_n)}}{\frac{\partial (F_{(1)}, F_{(2)}, ..., F_{(n)})}{\partial (y_1, y_2, ..., y_n)}} \end{align}

Here the denominator is the full Jacobian determinant with respect to $y_1, y_2, ..., y_n$, while the numerator is the same determinant with the column of partials with respect to $y_i$ replaced by the column of partials with respect to $x_j$.
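This determinant formula can also be sanity-checked numerically. The sketch below (the specific system and the helper names `partials`, `det2`, `y1_of_x` are hypothetical illustrations) applies the formula with $n = 2$, $m = 1$, $i = 1$, $j = 1$ and compares the result against a finite difference of the explicitly solved branch:

```python
import math

# Hypothetical system:  F_(1) = x^2 + y1^2 + y2^2 - 3 = 0,
#                       F_(2) = x + y1 - y2 - 1       = 0,
# at the solution P_0 = (1, 1, 1); we compute df_1/dx there.

def partials(x, y1, y2):
    # Rows: F_(1), F_(2); columns: partials with respect to (x, y1, y2).
    return [[2*x, 2*y1, 2*y2],
            [1.0, 1.0, -1.0]]

def det2(a, b, c, d):
    return a*d - b*c

J = partials(1.0, 1.0, 1.0)
# Denominator: Jacobian of (F_(1), F_(2)) with respect to (y1, y2).
den = det2(J[0][1], J[0][2], J[1][1], J[1][2])
# Numerator: the same determinant with the y1 column replaced by the x column.
num = det2(J[0][0], J[0][2], J[1][0], J[1][2])
dy1_dx = -num / den

# Cross-check: eliminate y2 (from F_(2), y2 = x + y1 - 1), solve the
# resulting quadratic for the branch with y1(1) = 1, and take a central
# finite difference of that explicit branch.
def y1_of_x(x):
    return (-(x - 1) + math.sqrt(6 - 2*x**2 - (x - 1)**2)) / 2

h = 1e-6
fd = (y1_of_x(1 + h) - y1_of_x(1 - h)) / (2 * h)
assert abs(dy1_dx - fd) < 1e-8    # formula and finite difference agree
```

For this system both routes give $\partial f_1 / \partial x = -1$ at $P_0$, which one can also confirm by differentiating both equations by hand as in the single-variable case.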