# Product rule

In calculus, the product rule is a formula used to find the derivatives of products of two or more functions. It may be stated as

$(f\cdot g)'=f'\cdot g+f\cdot g'$ or in Leibniz's notation

${\dfrac {d}{dx}}(u\cdot v)={\dfrac {du}{dx}}\cdot v+u\cdot {\dfrac {dv}{dx}}.$ In the notation of differentials, this can be written as

$d(uv)=u\,dv+v\,du.$ In Leibniz's notation, the derivative of the product of three functions (not to be confused with Euler's triple product rule) is

${\dfrac {d}{dx}}(u\cdot v\cdot w)={\dfrac {du}{dx}}\cdot v\cdot w+u\cdot {\dfrac {dv}{dx}}\cdot w+u\cdot v\cdot {\dfrac {dw}{dx}}.$

## Discovery

Discovery of this rule is credited to Gottfried Leibniz, who demonstrated it using differentials. (However, J. M. Child, a translator of Leibniz's papers, argues that it is due to Isaac Barrow.) Here is Leibniz's argument: Let u(x) and v(x) be two differentiable functions of x. Then the differential of uv is

${\begin{aligned}d(u\cdot v)&=(u+du)\cdot (v+dv)-u\cdot v\\&=u\cdot dv+v\cdot du+du\cdot dv.\end{aligned}}$ Since the term du·dv is "negligible" (compared to du and dv), Leibniz concluded that

$d(u\cdot v)=v\cdot du+u\cdot dv$ and this is indeed the differential form of the product rule. If we divide through by the differential dx, we obtain

${\frac {d}{dx}}(u\cdot v)=v\cdot {\frac {du}{dx}}+u\cdot {\frac {dv}{dx}}$ which can also be written in Lagrange's notation as

$(u\cdot v)'=v\cdot u'+u\cdot v'.$

## Examples

• Suppose we want to differentiate f(x) = x² sin(x). By using the product rule, one gets the derivative f′(x) = 2x sin(x) + x² cos(x) (since the derivative of x² is 2x and the derivative of the sine function is the cosine function).
• One special case of the product rule is the constant multiple rule, which states: if c is a number and f(x) is a differentiable function, then cf(x) is also differentiable, and its derivative is (cf)′(x) = cf′(x). This follows from the product rule since the derivative of any constant is zero. This, combined with the sum rule for derivatives, shows that differentiation is linear.
• The rule for integration by parts is derived from the product rule, as is (a weak version of) the quotient rule. (It is a "weak" version in that it does not prove that the quotient is differentiable, but only says what its derivative is if it is differentiable.)
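The first example above can be checked numerically. The following sketch (the helper names `f_prime` and `central_difference` are ours, not from the text) compares the product-rule derivative of x² sin(x) against a symmetric difference quotient:

```python
import math

def f(x):
    return x**2 * math.sin(x)

def f_prime(x):
    # Product rule: (x^2 sin x)' = 2x sin(x) + x^2 cos(x)
    return 2*x*math.sin(x) + x**2 * math.cos(x)

def central_difference(g, x, h=1e-6):
    # Symmetric difference quotient; approximates g'(x) to O(h^2)
    return (g(x + h) - g(x - h)) / (2 * h)

# The product-rule derivative agrees with the numerical estimate
assert abs(central_difference(f, 1.3) - f_prime(1.3)) < 1e-6
```

The test point 1.3 is arbitrary; any point where both factors are smooth works equally well.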

## Proofs

### Proof by factoring (from first principles)

Let h(x) = f(x)g(x) and suppose that f and g are each differentiable at x. We want to prove that h is differentiable at x and that its derivative, h′(x), is given by f′(x)g(x) + f(x)g′(x). To do this, $f(x)g(x+\Delta x)-f(x)g(x+\Delta x)$ (which is zero, and thus does not change the value) is added to the numerator to permit its factoring, and then properties of limits are used.

${\begin{aligned}h'(x)&=\lim _{\Delta x\to 0}{\frac {h(x+\Delta x)-h(x)}{\Delta x}}\\[5pt]&=\lim _{\Delta x\to 0}{\frac {f(x+\Delta x)g(x+\Delta x)-f(x)g(x)}{\Delta x}}\\[5pt]&=\lim _{\Delta x\to 0}{\frac {f(x+\Delta x)g(x+\Delta x)-f(x)g(x+\Delta x)+f(x)g(x+\Delta x)-f(x)g(x)}{\Delta x}}\\[5pt]&=\lim _{\Delta x\to 0}{\frac {{\big [}f(x+\Delta x)-f(x){\big ]}\cdot g(x+\Delta x)+f(x)\cdot {\big [}g(x+\Delta x)-g(x){\big ]}}{\Delta x}}\\[5pt]&=\lim _{\Delta x\to 0}{\frac {f(x+\Delta x)-f(x)}{\Delta x}}\cdot \underbrace {\lim _{\Delta x\to 0}g(x+\Delta x)} _{=\,g(x)}+\lim _{\Delta x\to 0}f(x)\cdot \lim _{\Delta x\to 0}{\frac {g(x+\Delta x)-g(x)}{\Delta x}}\\[5pt]&=f'(x)g(x)+f(x)g'(x).\end{aligned}}$ The fact that

$\lim _{\Delta x\to 0}g(x+\Delta x)=g(x)$ is deduced from a theorem that states that differentiable functions are continuous.
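The factoring step can also be seen numerically: after the zero-sum insertion, the difference quotient of h = f·g splits exactly into two terms that tend to f′(x)g(x) and f(x)g′(x). A small sketch, with f = exp and g = sin chosen purely for illustration:

```python
import math

# f = exp, g = sin are arbitrary differentiable choices for illustration
f, g = math.exp, math.sin
x, dx = 0.5, 1e-6

# The proof's split of the difference quotient of h = f*g:
term1 = (f(x + dx) - f(x)) / dx * g(x + dx)   # -> f'(x) g(x)
term2 = f(x) * (g(x + dx) - g(x)) / dx        # -> f(x) g'(x)

# f'(x)g(x) + f(x)g'(x), using f' = exp and g' = cos
expected = math.exp(x) * math.sin(x) + math.exp(x) * math.cos(x)
assert abs(term1 + term2 - expected) < 1e-5
```

By construction, `term1 + term2` equals the ordinary difference quotient of h, so the split changes nothing; it only exposes the two limits.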

### Brief proof

By definition, if $f,g:\mathbb {R} \rightarrow \mathbb {R}$ are differentiable at $x$ then we can write

$f(x+h)=f(x)+f'(x)h+\psi _{1}(h)\qquad \qquad g(x+h)=g(x)+g'(x)h+\psi _{2}(h)$ such that $\lim _{h\to 0}{\frac {\psi _{1}(h)}{h}}=\lim _{h\to 0}{\frac {\psi _{2}(h)}{h}}=0,$ also written $\psi _{1},\psi _{2}\sim o(h)$ . Then:

${\begin{aligned}fg(x+h)-fg(x)&=(f(x)+f'(x)h+\psi _{1}(h))(g(x)+g'(x)h+\psi _{2}(h))-fg(x)\\[12pt]&=f'(x)g(x)h+f(x)g'(x)h+o(h)\end{aligned}}$ Dividing by $h$ and taking the limit as $h\to 0$ gives the result.
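The o(h) claim can be illustrated numerically: the remainder after subtracting the linear part shrinks faster than h itself. In this sketch the function choices (f = sin, g = cos) and sample sizes are ours:

```python
import math

f, g = math.sin, math.cos   # illustrative choices
x = 0.4

def remainder(h):
    # (fg)(x+h) - (fg)(x) minus the linear part (fg)'(x) h,
    # where (fg)'(x) = cos^2(x) - sin^2(x) for f = sin, g = cos
    linear = (math.cos(x)**2 - math.sin(x)**2) * h
    return f(x + h) * g(x + h) - f(x) * g(x) - linear

# o(h) means remainder(h)/h -> 0: the ratios shrink with h
ratios = [abs(remainder(h) / h) for h in (1e-2, 1e-3, 1e-4)]
assert ratios[0] > ratios[1] > ratios[2]
```

Each tenfold shrink of h shrinks the ratio by roughly tenfold as well, consistent with a remainder of order h².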

### Quarter squares

There is a proof using quarter square multiplication which relies on the chain rule and on the properties of the quarter square function (shown here as q, i.e., with $q(x)={\tfrac {x^{2}}{4}}$ ):

$f=q(u+v)-q(u-v),$ Differentiating both sides:

${\begin{aligned}f'&=q'(u+v)(u'+v')-q'(u-v)(u'-v')\\[4pt]&={\tfrac {1}{2}}\left((u+v)(u'+v')\right)-{\tfrac {1}{2}}\left((u-v)(u'-v')\right)\\[4pt]&={\tfrac {1}{2}}(uu'+vu'+uv'+vv')-{\tfrac {1}{2}}(uu'-vu'-uv'+vv')\\[4pt]&=vu'+uv'\\[4pt]&=uv'+u'v\end{aligned}}$

### Chain rule

The product rule can be considered a special case of the chain rule for several variables.

${\frac {d(ab)}{dx}}={\frac {\partial (ab)}{\partial a}}{\frac {da}{dx}}+{\frac {\partial (ab)}{\partial b}}{\frac {db}{dx}}=b{\frac {da}{dx}}+a{\frac {db}{dx}}.$

### Non-standard analysis

Let u and v be continuous functions in x, and let dx, du and dv be infinitesimals within the framework of non-standard analysis, specifically the hyperreal numbers. Using st to denote the standard part function that associates to a finite hyperreal number the real infinitely close to it, this gives

${\begin{aligned}{\frac {d(uv)}{dx}}&=\operatorname {st} \left({\frac {(u+du)(v+dv)-uv}{dx}}\right)\\[4pt]&=\operatorname {st} \left({\frac {u\,dv+v\,du+du\,dv}{dx}}\right)\\[4pt]&=\operatorname {st} \left(u{\frac {dv}{dx}}+v{\frac {du}{dx}}+{\frac {du}{dx}}\,dv\right)\\[4pt]&=u{\frac {dv}{dx}}+v{\frac {du}{dx}}.\end{aligned}}$ This was essentially Leibniz's proof exploiting the transcendental law of homogeneity (in place of the standard part above).
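Why the du·dv term drops out can be illustrated with ordinary small floats standing in for infinitesimals (our own sketch; the functions u = x² and v = x³ and the point x = 2 are arbitrary choices):

```python
# Finite dx values standing in for infinitesimals (illustration only)
x = 2.0
cross_terms = []
for dx in (1e-3, 1e-4, 1e-5):
    du = 2 * x * dx       # u'(x) dx for u = x^2
    dv = 3 * x**2 * dx    # v'(x) dx for v = x^3
    cross_terms.append(du * dv / dx)   # equals u'(x) v'(x) dx

# The contribution of du*dv to the difference quotient scales with dx,
# so its standard part (its limit) is zero
assert cross_terms[0] > cross_terms[1] > cross_terms[2] > 0
```

The surviving terms u·dv/dx and v·du/dx, by contrast, stay near u·v′ and v·u′ no matter how small dx gets.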

### Smooth infinitesimal analysis

In the context of Lawvere's approach to infinitesimals, let dx be a nilsquare infinitesimal. Then du = u′ dx and dv = v′ dx, so that

${\begin{aligned}d(uv)&=(u+du)(v+dv)-uv\\&=uv+u\cdot dv+v\cdot du+du\cdot dv-uv\\&=u\cdot dv+v\cdot du+du\cdot dv\\&=u\cdot dv+v\cdot du\end{aligned}}$ since

$du\,dv=u'v'(dx)^{2}=0.$

## Generalizations

### A product of more than two factors

The product rule can be generalized to products of more than two factors. For example, for three factors we have

${\frac {d(uvw)}{dx}}={\frac {du}{dx}}vw+u{\frac {dv}{dx}}w+uv{\frac {dw}{dx}}.$ For a collection of functions $f_{1},\dots ,f_{k}$ , we have

${\frac {d}{dx}}\left[\prod _{i=1}^{k}f_{i}(x)\right]=\sum _{i=1}^{k}\left(\left({\frac {d}{dx}}f_{i}(x)\right)\prod _{j\neq i}f_{j}(x)\right)=\left(\prod _{i=1}^{k}f_{i}(x)\right)\left(\sum _{i=1}^{k}{\frac {f_{i}'(x)}{f_{i}(x)}}\right).$

### Higher derivatives

It can also be generalized to the general Leibniz rule for the nth derivative of a product of two factors, by symbolically expanding according to the binomial theorem:

$d^{n}(uv)=\sum _{k=0}^{n}{\binom {n}{k}}\cdot d^{(n-k)}(u)\cdot d^{(k)}(v).$ Applied at a specific point x, the above formula gives:

$(uv)^{(n)}(x)=\sum _{k=0}^{n}{\binom {n}{k}}\cdot u^{(n-k)}(x)\cdot v^{(k)}(x).$ Furthermore, for the nth derivative of an arbitrary number of factors:

$\left(\prod _{i=1}^{k}f_{i}\right)^{(n)}=\sum _{j_{1}+j_{2}+\cdots +j_{k}=n}{\binom {n}{j_{1},j_{2},\ldots ,j_{k}}}\prod _{i=1}^{k}f_{i}^{(j_{i})}.$

### Higher partial derivatives

For partial derivatives, we have

${\partial ^{n} \over \partial x_{1}\,\cdots \,\partial x_{n}}(uv)=\sum _{S\subseteq \{1,\ldots ,n\}}{\partial ^{|S|}u \over \prod _{i\in S}\partial x_{i}}\cdot {\partial ^{n-|S|}v \over \prod _{i\notin S}\partial x_{i}}$ where the index S runs through all $2^{n}$ subsets of {1, ..., n}, and |S| is the cardinality of S. For example, when n = 3,

${\begin{aligned}&{\partial ^{3} \over \partial x_{1}\,\partial x_{2}\,\partial x_{3}}(uv)\\[6pt]={}&u\cdot {\partial ^{3}v \over \partial x_{1}\,\partial x_{2}\,\partial x_{3}}+{\partial u \over \partial x_{1}}\cdot {\partial ^{2}v \over \partial x_{2}\,\partial x_{3}}+{\partial u \over \partial x_{2}}\cdot {\partial ^{2}v \over \partial x_{1}\,\partial x_{3}}+{\partial u \over \partial x_{3}}\cdot {\partial ^{2}v \over \partial x_{1}\,\partial x_{2}}\\[6pt]&+{\partial ^{2}u \over \partial x_{1}\,\partial x_{2}}\cdot {\partial v \over \partial x_{3}}+{\partial ^{2}u \over \partial x_{1}\,\partial x_{3}}\cdot {\partial v \over \partial x_{2}}+{\partial ^{2}u \over \partial x_{2}\,\partial x_{3}}\cdot {\partial v \over \partial x_{1}}+{\partial ^{3}u \over \partial x_{1}\,\partial x_{2}\,\partial x_{3}}\cdot v.\end{aligned}}$

### Banach space

Suppose X, Y, and Z are Banach spaces (which includes Euclidean space) and B : X × Y → Z is a continuous bilinear operator. Then B is differentiable, and its derivative at the point (x,y) in X × Y is the linear map D(x,y)B : X × Y → Z given by

$(D_{\left(x,y\right)}\,B)\left(u,v\right)=B\left(u,y\right)+B\left(x,v\right)\qquad \forall (u,v)\in X\times Y.$

### Derivations in abstract algebra

In abstract algebra, the product rule is used to define what is called a derivation, not vice versa.
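As an illustration of that definition (our own sketch, not from the text): formal differentiation of polynomials, represented here as coefficient lists, satisfies the Leibniz law D(pq) = D(p)q + pD(q) and so is a derivation on the polynomial algebra.

```python
# Polynomials as coefficient lists: index i holds the coefficient of x^i

def poly_mul(p, q):
    r = [0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            r[i + j] += a * b
    return r

def poly_add(p, q):
    n = max(len(p), len(q))
    p, q = p + [0] * (n - len(p)), q + [0] * (n - len(q))
    return [a + b for a, b in zip(p, q)]

def D(p):
    # Formal derivative: d/dx of sum a_i x^i is sum i*a_i x^(i-1)
    return [i * a for i, a in enumerate(p)][1:] or [0]

p, q = [1, 2, 3], [0, 5]          # 1 + 2x + 3x^2 and 5x
lhs = D(poly_mul(p, q))           # D(pq)
rhs = poly_add(poly_mul(D(p), q), poly_mul(p, D(q)))  # D(p)q + p D(q)
assert lhs == rhs
```

Because `D` is also linear, this map is a derivation in the abstract-algebra sense, with no limits involved at all.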

### Vector functions

The product rule extends to scalar multiplication, dot products, and cross products of vector functions.

For scalar multiplication: $(f\cdot \mathbf {g} )'=f'\cdot \mathbf {g} +f\cdot \mathbf {g} '$

For dot products: $(\mathbf {f} \cdot \mathbf {g} )'=\mathbf {f} '\cdot \mathbf {g} +\mathbf {f} \cdot \mathbf {g} '$

For cross products: $(\mathbf {f} \times \mathbf {g} )'=\mathbf {f} '\times \mathbf {g} +\mathbf {f} \times \mathbf {g} '$

Note: cross products are not commutative, i.e. $(\mathbf {f} \times \mathbf {g} )'\neq \mathbf {f} '\times \mathbf {g} +\mathbf {g} '\times \mathbf {f}$ ; instead, cross products are anticommutative, so it can be written as $(\mathbf {f} \times \mathbf {g} )'=\mathbf {f} '\times \mathbf {g} -\mathbf {g} '\times \mathbf {f} .$

### Scalar fields

For scalar fields the concept of gradient is the analog of the derivative:

$\nabla (f\cdot g)=\nabla f\cdot g+f\cdot \nabla g$

## Applications

Among the applications of the product rule is a proof that

${\frac {d}{dx}}x^{n}=nx^{n-1}$ when n is a positive integer (this rule is true even if n is not positive or is not an integer, but the proof of that must rely on other methods). The proof is by mathematical induction on the exponent n. If n = 0 then $x^{n}$ is constant and $nx^{n-1}=0$. The rule holds in that case because the derivative of a constant function is 0. If the rule holds for any particular exponent n, then for the next value, n + 1, we have

${\begin{aligned}{\frac {d}{dx}}x^{n+1}&={\frac {d}{dx}}\left(x^{n}\cdot x\right)\\[12pt]&=x{\frac {d}{dx}}x^{n}+x^{n}{\frac {d}{dx}}x\qquad {\mbox{(the product rule is used here)}}\\[12pt]&=x\left(nx^{n-1}\right)+x^{n}\cdot 1\qquad {\mbox{(the induction hypothesis is used here)}}\\[12pt]&=(n+1)x^{n}.\end{aligned}}$ Therefore, if the proposition is true for n, it is true also for n + 1, and therefore for all natural n.
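The power rule just proved can be sanity-checked numerically; in this sketch (helper names and the test point are ours) a symmetric difference quotient is compared with n·x^(n−1) for several small exponents:

```python
def power(n):
    # x -> x^n for a positive integer n
    return lambda x: x**n

def derivative(g, x, h=1e-6):
    # Symmetric difference quotient; approximates g'(x) to O(h^2)
    return (g(x + h) - g(x - h)) / (2 * h)

# d/dx x^n = n x^(n-1), checked at x = 1.5 for n = 1..5
for n in range(1, 6):
    assert abs(derivative(power(n), 1.5) - n * 1.5**(n - 1)) < 1e-5
```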