The Derivative

We begin with the definition of the derivative of a function.
Let \(I\subset\real\) be an interval and let \(c\in I\). We say that \(f:I\rightarrow\real\) is differentiable at \(c\) or has a derivative at \(c\) if \[ \lim_{x\rightarrow c} \frac{f(x)-f(c)}{x-c} \] exists. We say that \(f\) is differentiable on \(I\) if \(f\) is differentiable at every point in \(I\).
By definition, \(f\) has a derivative at \(c\) if there exists a number \(L\in\real\) such that for every \(\eps \gt 0\) there exists \(\delta \gt 0\) such that if \(x\in I\) and \(0 \lt |x-c| \lt \delta\) then \[ \left|\frac{f(x)-f(c)}{x-c} - L \right| \lt \eps. \] If \(f\) is differentiable at \(c\), we will denote \(\displaystyle \lim_{x\rightarrow c} \frac{f(x)-f(c)}{x-c}\) by \(f'(c)\), that is, \[ f'(c) = \lim_{x\rightarrow c} \frac{f(x)-f(c)}{x-c}. \] The rule that sends \(c\) to the number \(f'(c)\) defines a function on a possibly smaller subset \(J\subset I\). The function \(f':J\rightarrow\real\) is called the derivative of \(f\).
Let \(f(x) = 1/x\) for \(x\in(0,\infty)\). Prove that \(f'(x) = -\frac{1}{x^2}\).
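Before writing out a proof, it can be helpful to check the claim numerically. The following short Python snippet is an illustration only, not a proof; the helper name diff_quotient is ours. It evaluates the difference quotient of \(f(x)=1/x\) at \(c=2\) for shrinking values of \(|x-c|\) and compares it with \(-1/c^2\).

```python
# Numerical sanity check (not a proof): the difference quotient of f(x) = 1/x
# at c should approach -1/c^2 as x -> c.
def f(x):
    return 1.0 / x

def diff_quotient(f, c, h):
    # difference quotient (f(x) - f(c)) / (x - c) with x = c + h
    return (f(c + h) - f(c)) / h

c = 2.0
for h in [1e-1, 1e-3, 1e-5, 1e-7]:
    print(h, diff_quotient(f, c, h), -1.0 / c**2)
# the printed quotients approach -0.25 = -1/2^2 as h shrinks
```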
Let \(f(x) = \sin(x)\) for \(x\in\real\). Prove that \(f'(x) = \cos(x)\).
Recall that \[ \sin(x) - \sin(c) = 2\sin\left(\tfrac{x-c}{2}\right)\cos\left(\tfrac{x+c}{2}\right) \] and that \(\lim_{t\rightarrow 0} \frac{\sin(t)}{t} = 1\). Therefore, \begin{align*} \lim_{x\rightarrow c} \frac{\sin(x)-\sin(c)}{x-c} &= \lim_{x\rightarrow c} \frac{2\sin\left(\frac{x-c}{2}\right)\cos\left(\frac{x+c}{2}\right)}{x-c} \\[2ex] &= \lim_{x\rightarrow c} \left(\frac{\sin\left(\frac{x-c}{2}\right)}{\frac{x-c}{2}} \right)\cos\left(\tfrac{x+c}{2}\right)\\[2ex] &= 1\cdot \cos(c) = \cos(c), \end{align*} where in the last step we used the continuity of \(\cos\) at \(c\). Hence \(f'(c) = \cos(c)\) for all \(c\in\real\) and thus \(f'(x) = \cos(x)\).
Prove by definition that \(f(x)=\frac{x}{1+x^2}\) is differentiable on \(\real\).
We have that \begin{align*} \frac{f(x)-f(c)}{x-c} &= \frac{\frac{x}{1+x^2}-\frac{c}{1+c^2}}{x-c} \\[2ex] &= \frac{x(1+c^2)-c(1+x^2)}{(1+x^2)(1+c^2)(x-c)}\\[2ex] &= \frac{(x-c)(1-cx)}{(1+x^2)(1+c^2)(x-c)}\\[2ex] &= \frac{1-cx}{(1+c^2)(1+x^2)}. \end{align*} Now \[ \lim_{x\rightarrow c} \frac{f(x)-f(c)}{x-c} = \frac{1-c^2}{(1+c^2)^2}. \] Hence, \(f'(c)\) exists for all \(c\in\real\) and the derivative function of \(f\) is \[ f'(x) = \frac{1-x^2}{(1+x^2)^2}. \]
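As a quick numerical cross-check of the formula just obtained (again, not part of the proof; the names below are ours), one can compare the difference quotient of \(f(x) = x/(1+x^2)\) with \((1-x^2)/(1+x^2)^2\) at a few points:

```python
# Numerical cross-check of f'(x) = (1 - x^2) / (1 + x^2)^2 for f(x) = x / (1 + x^2).
def f(x):
    return x / (1.0 + x**2)

def fprime(x):
    return (1.0 - x**2) / (1.0 + x**2) ** 2

h = 1e-6
for c in [-2.0, -0.5, 0.0, 1.0, 3.0]:
    quotient = (f(c + h) - f(c)) / h   # difference quotient with x = c + h
    print(c, quotient, fprime(c))      # the two values agree closely
```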
Prove that \(f'(x)=\alpha\) if \(f(x)=\alpha x + b\).
We have that \(f(x)-f(c) = \alpha x - \alpha c = \alpha(x-c)\). Therefore, \(\lim_{x\rightarrow c}\frac{f(x)-f(c)}{x-c}= \alpha\). This proves that \(f'(x) = \alpha\) for all \(x\in \real\).
Compute the derivative function of \(f(x)=|x|\) for \(x\in \real\).
If \(x \gt 0\) then \(f(x)=x\) and thus \(f'(x)=1\) for \(x \gt 0\). If \(x \lt 0\) then \(f(x)=-x\) and therefore \(f'(x)=-1\) for \(x \lt 0\). Now consider \(c=0\). We have that \[ \frac{f(x)-f(c)}{x-c} = \frac{|x|}{x}. \] We claim that the limit \(\lim_{x\rightarrow 0}\frac{|x|}{x}\) does not exist and thus \(f'(0)\) does not exist. To see this, consider \(x_n=1/n\). Then \((x_n)\rightarrow 0\) and \(\frac{|x_n|}{x_n}=1\) for all \(n\). On the other hand, consider \(y_n=-1/n\). Then \((y_n)\rightarrow 0\) and \(\frac{|y_n|}{y_n}=-1\) for all \(n\). Hence, \(\lim_{n\rightarrow\infty} \frac{|x_n|}{x_n} \neq \lim_{n\rightarrow \infty} \frac{|y_n|}{y_n}\), and thus the claim holds by the Sequential Criterion for limits. The derivative function \(f'\) of \(f\) is therefore defined on \(A=\real\backslash\hspace{-0.3em}\{0\}\) and is given by \[ f'(x) = \begin{cases} 1, & x \gt 0\\[2ex] -1, & x \lt 0.\end{cases} \] Hence, even though \(f\) is continuous at every point in its domain \(\real\), it is not differentiable at every point in its domain. In other words, continuity is not a sufficient condition for differentiability.
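The sequential argument above can also be observed numerically. The sketch below is an illustration only; it evaluates the difference quotient \(|x|/x\) along the two sequences \(x_n=1/n\) and \(y_n=-1/n\).

```python
# The difference quotient of f(x) = |x| at c = 0 is |x| / x.
# Along x_n = 1/n it is identically 1; along y_n = -1/n it is identically -1,
# so the two-sided limit (and hence f'(0)) cannot exist.
def quotient(x):
    return abs(x) / x

for n in range(1, 6):
    print(n, quotient(1.0 / n), quotient(-1.0 / n))   # prints 1.0 and -1.0 in every row
```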
Suppose that \(f:I\rightarrow\real\) is differentiable at \(c\). Then \(f\) is continuous at \(c\).
To prove that \(f\) is continuous at \(c\) we must show that \(\lim_{x\rightarrow c} f(x) = f(c)\). By assumption \(\lim_{x\rightarrow c} \frac{f(x)-f(c)}{x-c}=f'(c)\) exists, and clearly \(\lim_{x\rightarrow c}(x-c)=0\). Hence we can apply the Limit Laws and compute \begin{align*} \lim_{x\rightarrow c} f(x) &= \lim_{x\rightarrow c} (f(x)-f(c) + f(c))\\ &= \lim_{x\rightarrow c}\left(\frac{f(x)-f(c)}{x-c}(x-c) + f(c)\right)\\ &= f'(c) \cdot 0 + f(c)\\ &= f(c) \end{align*} and the proof is complete.
Let \(f:I\rightarrow\real\) and \(g:I\rightarrow\real\) be differentiable at \(c\in I\). The following hold:
  1. If \(\alpha\in\real\) then \((\alpha f)\) is differentiable and \((\alpha f)'(c) = \alpha f'(c)\).
  2. \((f+g)\) is differentiable at \(c\) and \((f+g)'(c) = f'(c) + g'(c)\).
  3. \(fg\) is differentiable at \(c\) and \((fg)'(c) = f'(c)g(c) + f(c)g'(c)\).
  4. If \(g(c)\neq 0\) then \((f/g)\) is differentiable at \(c\) and \[\left(\frac{f}{g}\right)'(c) = \frac{f'(c)g(c) - f(c)g'(c)}{g(c)^2}\]
Parts (i) and (ii) are straightforward. We will prove only (iii) and (iv). For (iii), we have that \begin{align*} \frac{f(x)g(x)-f(c)g(c)}{x-c} &= \frac{f(x)g(x) - f(c)g(x) + f(c)g(x) - f(c)g(c)}{x-c}\\[2ex] &=\frac{f(x)-f(c)}{x-c} g(x) + f(c) \frac{g(x)-g(c)}{x-c}. \end{align*} Now \(\lim_{x\rightarrow c}g(x) = g(c)\) because \(g\) is differentiable, and hence continuous, at \(c\). Therefore, \begin{align*} \lim_{x\rightarrow c} \frac{f(x)g(x)-f(c)g(c)}{x-c} &= \lim_{x\rightarrow c} \frac{f(x)-f(c)}{x-c} g(x) + \lim_{x\rightarrow c} f(c) \frac{g(x)-g(c)}{x-c}\\[2ex] &= f'(c) g(c) + f(c) g'(c). \end{align*} To prove part (iv), since \(g(c)\neq 0\) and \(g\) is continuous at \(c\), there exists a \(\delta\)-neighborhood \(J=(c-\delta,c+\delta)\) such that \(g(x)\neq 0\) for all \(x\in J\cap I\). If \(x\in J\cap I\) and \(x\neq c\) then \begin{align*} \frac{\frac{f(x)}{g(x)} - \frac{f(c)}{g(c)}}{x-c} &= \frac{f(x)g(c)-g(x)f(c)}{g(x)g(c) (x-c)}\\[2ex] &=\frac{f(x)g(c)-f(c)g(c) + f(c)g(c) - g(x)f(c)}{g(x)g(c) (x-c)}\\[2ex] &= \frac{\frac{f(x)g(c)-f(c)g(c)}{x-c} - \frac{f(c)g(x)-f(c)g(c)}{x-c}}{g(x)g(c)}. \end{align*} Since \(\lim_{x\rightarrow c} g(x) = g(c)\neq 0\), it follows that \[ \lim_{x\rightarrow c} \frac{\frac{f(x)}{g(x)} - \frac{f(c)}{g(c)}}{x-c} = \frac{f'(c) g(c) - f(c)g'(c)}{g(c)^2} \] and the proof is complete.
We now prove the Chain Rule.
Let \(f:I\rightarrow\real\) and \(g:J\rightarrow\real\) be functions such that \(f(I)\subset J\) and let \(c\in I\). If \(f'(c)\) exists and \(g'(f(c))\) exists then \((g\circ f)'(c)\) exists and \((g\circ f)'(c) = g'(f(c))f'(c)\).
Consider the function \(h:J\rightarrow\real\) defined by \[ h(y) = \begin{cases} \frac{g(y)-g(f(c))}{y-f(c)}, & y\neq f(c) \\[2ex] g'(f(c)), & y=f(c).\end{cases} \] Since \(g\) is differentiable at \(f(c)\), \begin{align*} \lim_{y\rightarrow f(c)} h(y) &= \lim_{y\rightarrow f(c)} \frac{g(y)-g(f(c))}{y-f(c)} \\[2ex] &= g'(f(c))\\[2ex] &= h(f(c)). \end{align*} Hence, \(h\) is continuous at \(f(c)\). Since \(f\) is differentiable at \(c\), it is continuous at \(c\), and therefore \(\lim_{x\rightarrow c} h(f(x)) = h(f(c))\). Now, for every \(x\in I\) with \(x\neq c\), \[ \frac{g(f(x))-g(f(c))}{x-c} = h(f(x)) \frac{f(x)-f(c)}{x-c}; \] indeed, if \(f(x)\neq f(c)\) this is just the definition of \(h\), and if \(f(x)=f(c)\) both sides are equal to zero. Therefore, \begin{align*} \lim_{x\rightarrow c} \frac{g(f(x))-g(f(c))}{x-c} &= \lim_{x\rightarrow c} h(f(x)) \frac{f(x)-f(c)}{x-c}\\ & = h(f(c)) f'(c)\\ & = g'(f(c))f'(c). \end{align*} Therefore, \((g\circ f)'(c) = g'(f(c))f'(c)\) as claimed.
Compute \(f'(x)\) if \[ f(x) = \begin{cases} x^2\sin(\tfrac{1}{x}), & x\neq 0\\[2ex] 0, & x=0.\end{cases} \] Where is \(f'(x)\) continuous?
When \(x\neq 0\), \(f(x)\) is the composition and product of functions that are differentiable at \(x\), and therefore \(f\) is differentiable at \(x\neq 0\). For instance, on \(A=\real\backslash\hspace{-0.3em}\{0\}\), the functions \(1/x\), \(\sin(x)\) and \(x^2\) are differentiable at every \(x\in A\). Hence, if \(x\neq 0\) we have that \[ f'(x) = 2x\sin(\tfrac{1}{x}) - \cos(\tfrac{1}{x}). \] Consider now \(c=0\). If \(f'(0)\) exists it is equal to \begin{align*} \lim_{x\rightarrow 0} \frac{f(x)-f(c)}{x-c} &= \lim_{x\rightarrow 0} \frac{x^2\sin(\tfrac{1}{x})}{x}\\ & = \lim_{x\rightarrow 0} x\sin(\tfrac{1}{x}). \end{align*} Using the Squeeze Theorem, we deduce that \(f'(0)=0\). Therefore, \[ f'(x) = \begin{cases} 2x\sin(\tfrac{1}{x}) - \cos(\tfrac{1}{x}), & x\neq 0\\[2ex] 0, & x=0.\end{cases} \] From the above formula obtained for \(f'(x)\), we observe that \(f'\) is continuous at every \(x\neq 0\), since there it is the product/difference/composition of continuous functions. To determine continuity of \(f'\) at \(x=0\), consider \(\lim_{x\rightarrow 0} f'(x)\). Consider the sequence \(x_n=\frac{1}{n\pi}\), which clearly converges to \(c=0\). Now, \(f'(x_n) = \frac{2}{n\pi}\sin(n\pi) -\cos(n\pi)\). Since \(\sin(n\pi)=0\) for all \(n\), we have \(f'(x_n) = -\cos(n\pi)=(-1)^{n+1}\). The sequence \(f'(x_n)\) does not converge and therefore, by the Sequential Criterion, \(\lim_{x\rightarrow 0} f'(x)\) does not exist. Thus, \(f'\) is not continuous at \(x=0\).
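The failure of continuity of \(f'\) at \(0\) can be observed numerically. The following sketch is an illustration only (the name fprime is ours); it evaluates \(f'(x_n)\) along \(x_n = 1/(n\pi)\).

```python
import math

# f'(x) = 2x sin(1/x) - cos(1/x) for x != 0; along x_n = 1/(n*pi) the sequence
# x_n converges to 0 while f'(x_n) oscillates between values near +1 and -1.
def fprime(x):
    return 2 * x * math.sin(1.0 / x) - math.cos(1.0 / x)

for n in range(1, 9):
    x_n = 1.0 / (n * math.pi)
    print(n, fprime(x_n))   # alternates in sign, approximately (-1)**(n + 1)
```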
Compute \(f'(x)\) if \[ f(x) = \begin{cases} x^3\sin(\tfrac{1}{x}), & x\neq 0\\[2ex] 0, & x=0\end{cases} \] Where is \(f'(x)\) continuous?
When \(x\neq 0\), \(f(x)\) is the composition and product of differentiable functions, and therefore \(f\) is differentiable at \(x\neq 0\). For instance, on \(A=\real\backslash\hspace{-0.3em}\{0\}\), the functions \(1/x\), \(\sin(x)\) and \(x^3\) are differentiable on \(A\). Hence, if \(x\neq 0\) we have that \[ f'(x) = 3x^2\sin(\tfrac{1}{x}) - x\cos(\tfrac{1}{x}). \] Consider now \(c=0\). If \(f'(0)\) exists it is equal to \begin{align*} \lim_{x\rightarrow 0} \frac{f(x)-f(c)}{x-c} &= \lim_{x\rightarrow 0} \frac{x^3\sin(\tfrac{1}{x})}{x}\\ & = \lim_{x\rightarrow 0} x^2\sin(\tfrac{1}{x}) \end{align*} and using the Squeeze Theorem we deduce that \(f'(0)=0\). Therefore, \[ f'(x) = \begin{cases} 3x^2\sin(\tfrac{1}{x}) - x\cos(\tfrac{1}{x}), & x\neq 0\\[2ex] 0, & x=0.\end{cases} \] When \(x\neq 0\), \(f'\) is continuous since it is the product/difference/composition of continuous functions. To determine continuity of \(f'\) at \(c=0\) we consider the limit \(\lim_{x\rightarrow 0} f'(x)\). Now \(\lim_{x\rightarrow 0} 3x^2\sin(\tfrac{1}{x})=0\) using the Squeeze Theorem, and similarly \(\lim_{x\rightarrow 0} x\cos(\tfrac{1}{x})=0\) using the Squeeze Theorem. Therefore, \(\lim_{x\rightarrow 0} f'(x)\) exists and is equal to \(0\), which is equal to \(f'(0)\). Hence, \(f'\) is continuous at \(x=0\), and thus continuous everywhere.
Consider the function \[ f(x) = \begin{cases} x^2\sin(\tfrac{1}{x}), & x\in\mathbb{Q}\backslash\hspace{-0.3em}\{0\}\\[2ex] x^2\cos(\tfrac{1}{x}), & x\notin\mathbb{Q}\\[2ex] 0, & x=0.\end{cases} \] Show that \(f'(0) = 0\).

Exercises

Use the definition of the derivative of a function to find \(f'(x)\) if \(\displaystyle f(x)=\frac{3x+4}{2x-1}\). Clearly state the domain of \(f'(x)\).
Use the definition of the derivative of a function to find \(f'(x)\) if \(\displaystyle f(x)=x|x|\). Clearly state the domain of \(f'(x)\).
Let \(f:\real\rightarrow\real\) be defined by \[ f(x) = \begin{cases} x^2, & x\in\mathbb{Q}\\ 0, & x\in\mathbb{R}\backslash\hspace{-0.3em}\mathbb{Q}\end{cases} \]
  1. Show that \(f\) is differentiable at \(c=0\) and find \(f'(0)\).
  2. Prove that if \(c\neq 0\) then \(f\) is not differentiable at \(c\).
Let \(g(x) = |x^3|\) for \(x\in\real\). Determine whether \(g'(0)\), \(g^{(2)}(0)\), and \(g^{(3)}(0)\) exist and, if they do, find them. Hint: Consider writing \(g\) as a piecewise function and use the definition of the derivative.
If \(f:\real\rightarrow\real\) is differentiable at \(c\in\real\), explain why \[ f'(c) = \lim_{n\rightarrow\infty} \left[n(f(c+1/n)-f(c))\right] \] Give an example of a function \(f\) and a number \(c\) such that \[ \lim_{n\rightarrow\infty} \left[n(f(c+1/n)-f(c))\right] \] exists but \(f'(c)\) does not exist.

The Mean Value Theorem

Let \(f:I\rightarrow\real\) be a function and let \(c\in I\).
  1. We say that \(f\) has a relative maximum at \(c\) if there exists \(\delta \gt 0\) such that \(f(x)\leq f(c)\) for all \(x \in (c-\delta, c+\delta)\).
  2. We say that \(f\) has a relative minimum at \(c\) if there exists \(\delta \gt 0\) such that \(f(c)\leq f(x)\) for all \(x\in (c-\delta, c+\delta)\).
A point \(c\in I\) is called a critical point of \(f:I\rightarrow\real\) if \(f'(c) = 0\). The next theorem says that a relative maximum/minimum of a differentiable function at an interior point can only occur at a critical point.
Let \(f:I\rightarrow\real\) be a function and let \(c\) be an interior point of \(I\). Suppose that \(f\) has a relative maximum (or minimum) at \(c\). If \(f\) is differentiable at \(c\) then \(c\) is a critical point of \(f\), that is, \(f'(c)=0\).
Suppose that \(f\) has a relative maximum at \(c\); the relative minimum case is similar. Then there exists \(\delta \gt 0\) such that \(f(x)-f(c)\leq 0\) for all \(x\in (c-\delta, c+\delta)\), and since \(c\) is an interior point of \(I\) we may take \(\delta\) small enough that \((c-\delta,c+\delta)\subset I\). Consider the function \(h:(c-\delta,c+\delta)\rightarrow \real\) defined by \(h(x)=\frac{f(x)-f(c)}{x-c}\) for \(x\neq c\) and \(h(c)= f'(c)\). Then the function \(h\) is continuous at \(c\) because \(\lim_{x\rightarrow c} h(x) = f'(c) = h(c)\). Now for \(x\in A=(c,c+\delta)\) it holds that \(h(x)\leq 0\) and therefore \(f'(c)=\lim_{x\rightarrow c} h(x) \leq 0\). Similarly, for \(x\in B=(c-\delta,c)\) it holds that \(h(x)\geq 0\) and therefore \(0\leq f'(c)\). Thus \(f'(c)=0\).
If \(f:I\rightarrow\real\) has a relative maximum (or minimum) at \(c\) then either \(f'(c)=0\) or \(f'(c)\) does not exist.
The function \(f(x)=|x|\) has a relative minimum at \(x=0\), however, \(f\) is not differentiable at \(x=0\).
Let \(f:[a,b]\rightarrow\real\) be continuous on \([a,b]\) and differentiable on \((a,b)\). If \(f(a)=f(b)\) then there exists \(c\in (a,b)\) such that \(f'(c)=0\).
Since \(f\) is continuous on \([a,b]\) it achieves its maximum and minimum at some points \(x^*\) and \(x_*\), respectively, that is, \(f(x_*)\leq f(x) \leq f(x^*)\) for all \(x\in [a,b]\). If \(f\) is constant then \(f'(x)=0\) for all \(x\in (a,b)\). If \(f\) is not constant then \(f(x_*) \lt f(x^*)\). Since \(f(a)=f(b)\), at least one of \(x_*\) and \(x^*\) is not contained in \(\{a,b\}\), and hence lies in \((a,b)\); by Theorem 6.2.2 there exists \(c\in \{x_*,x^*\}\) such that \(f'(c)=0\).
We now state and prove the main result of this section.
Let \(f:[a,b]\rightarrow\real\) be continuous on \([a,b]\) and differentiable on \((a,b)\). Then there exists \(c\in (a,b)\) such that \(f'(c) = \frac{f(b)-f(a)}{b-a}\).
If \(f(a)=f(b)\) then the result follows from Rolle's Theorem (\(f'(c)=0\) for some \(c\in (a,b)\)). Let \(h:[a,b]\rightarrow\real\) be the line from \((a,f(a))\) to \((b,f(b))\), that is, \[ h(x) = f(a) + \frac{f(b)-f(a)}{b-a} (x-a) \] and define the function \[ g(x) = f(x) - h(x) \] for \(x\in [a,b]\). Then \(g(a) = f(a) - f(a) =0\) and \(g(b) = f(b) - f(b)=0\), and thus \(g(a)=g(b)\). Clearly, \(g\) is continuous on \([a,b]\) and differentiable on \((a,b)\), and it is straightforward to verify that \(g'(x) = f'(x) - \frac{f(b)-f(a)}{b-a}\). By Rolle's Theorem, there exists \(c \in (a,b)\) such that \(g'(c)=0\), and therefore \(f'(c) = \frac{f(b)-f(a)}{b-a}\).
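As a concrete illustration of the theorem (not required by the proof; the setup below, with \(f(x)=x^3\) on \([0,2]\), is our own choice), the following sketch locates a point \(c\) with \(f'(c)=\frac{f(b)-f(a)}{b-a}\) by bisecting on the sign change of \(f'(x)-\frac{f(b)-f(a)}{b-a}\).

```python
# Illustration of the Mean Value Theorem for f(x) = x^3 on [a, b] = [0, 2]:
# locate c in (a, b) with f'(c) = (f(b) - f(a)) / (b - a) by bisection.
def f(x):
    return x**3

def fprime(x):
    return 3 * x**2

a, b = 0.0, 2.0
slope = (f(b) - f(a)) / (b - a)     # = 4
lo, hi = a, b                       # fprime(lo) - slope < 0 < fprime(hi) - slope
for _ in range(60):                 # plain bisection
    mid = (lo + hi) / 2
    if fprime(mid) - slope < 0:
        lo = mid
    else:
        hi = mid
print((lo + hi) / 2)                # approximately 2 / sqrt(3) = 1.1547...
```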
Let \(f:[a,b]\rightarrow\real\) be continuous on \([a,b]\) and differentiable on \((a,b)\). If \(f'(x) = 0\) for all \(x\in (a,b)\) then \(f\) is constant on \([a,b]\).
Let \(y\in (a,b]\). Now \(f\) restricted to \([a,y]\) satisfies all the assumptions needed in the Mean Value Theorem. Therefore, there exists \(c\in (a,y)\) such that \(f'(c) = \frac{f(y)-f(a)}{y-a}\). But \(f'(c)=0\) and thus \(f(y)=f(a)\). This holds for all \(y\in (a,b]\) and thus \(f\) is constant on \([a,b]\).
Show by example that Theorem 6.2.7 is not true for a function \(f:A\rightarrow\real\) if \(A\) is not a closed and bounded interval.
If \(f,g:[a,b]\rightarrow\real\) are continuous on \([a,b]\) and differentiable on \((a,b)\), and \(f'(x)=g'(x)\) for all \(x\in (a,b)\), then \(f(x) = g(x) + C\) for some constant \(C\).
Use the Mean Value Theorem to show that \(|\sin(x)|\leq |x|\) for all \(x\in\real\); in particular, \(-x\leq \sin(x) \leq x\) for all \(x\geq 0\).
Suppose that \(x \gt 0\) and let \(g(t)=\sin(t)\) so that \(g'(t)=\cos(t)\). By the MVT applied to \(g\) on \([0,x]\), there exists \(c\in (0,x)\) such that \(\cos(c) = \frac{\sin(x)-\sin(0)}{x-0} = \frac{\sin(x)}{x}\), that is, \(\sin(x) = x\cos(c)\). Now \(|\cos(c)|\leq 1\) and therefore \(|\sin(x)|\leq |x|\); in particular, \(-x\leq \sin(x)\leq x\). The case \(x \lt 0\) can be treated similarly (or deduced from the fact that sine is odd), and the case \(x=0\) is trivial.
The function \(f:I\rightarrow \real\) is increasing if \(f(x_1)\leq f(x_2)\) whenever \(x_1,x_2\in I\) and \(x_1 \lt x_2\). Similarly, \(f\) is decreasing if \(f(x_2)\leq f(x_1)\) whenever \(x_1,x_2\in I\) and \(x_1 \lt x_2\). In either case, we say that \(f\) is monotone.
The sign of the derivative \(f'\) determines where \(f\) is increasing/decreasing.
Suppose that \(f:I\rightarrow\real\) is differentiable.
  1. Then \(f\) is increasing if and only if \(f'(x)\geq 0\) for all \(x\in I\).
  2. Then \(f\) is decreasing if and only if \(f'(x)\leq 0\) for all \(x\in I\).
Suppose that \(f\) is increasing. Then for all \(x,c\in I\) with \(x\neq c\) it holds that \(\frac{f(x)-f(c)}{x-c}\geq 0\), and therefore \(f'(c)=\lim_{x\rightarrow c} \frac{f(x)-f(c)}{x-c} \geq 0\). This proves that \(f'(x)\geq 0\) for all \(x\in I\). Now suppose that \(f'(x)\geq 0\) for all \(x\in I\), and let \(x,y\in I\) with \(x \lt y\). By the Mean Value Theorem, there exists \(c\in (x,y)\) such that \(f'(c) = \frac{f(y)-f(x)}{y-x}\). Since \(f'(c)\geq 0\), it follows that \(f(y)-f(x)\geq 0\), that is, \(f(x)\leq f(y)\). Hence \(f\) is increasing. Part (ii) is proved similarly.

Exercises

Use the Mean Value Theorem to show that \[ |\cos(x)-\cos(y)|\leq |x-y|. \] In general, suppose that \(f:[a,b]\rightarrow\real\) is such that \(f'\) exists on \([a,b]\) and \(f'\) is continuous on \([a,b]\). Prove that \(f\) is Lipschitz on \([a,b]\).
Give an example of a uniformly continuous function on \([0,1]\) that is differentiable on \((0,1)\) but whose derivative is not bounded on \((0,1)\). Justify your answer.
Let \(I\) be an interval and let \(f:I\rightarrow\real\) be differentiable on \(I\). Prove that if \(f'(x) \gt 0\) for \(x\in I\) then \(f\) is strictly increasing on \(I\).
Let \(f:[a,b]\rightarrow \real\) be continuous on \([a,b]\) and differentiable on \((a,b)\). Show that if \(\lim_{x\rightarrow a} f'(x)=A\) then \(f'(a)\) exists and equals \(A\). Hint: Use the definition of \(f'(a)\), the Mean Value Theorem, and the Sequential Criterion for limits.
Let \(f:[a,b]\rightarrow\real\) be continuous and suppose that \(f'\) exists on \((a,b)\). Prove that if \(f'(x) \gt 0\) for \(x\in (a,b)\) then \(f\) is strictly increasing on \([a,b]\).
Suppose that \(f:[a,b]\rightarrow\real\) is continuous on \([a,b]\) and differentiable on \((a,b)\). We proved that if \(f'(x)=0\) for all \(x\in (a,b)\) then \(f\) is constant on \([a,b]\). Give an example of a function \(f:A\rightarrow\real\) such that \(f'(x)=0\) for all \(x\in A\) but \(f\) is not constant on \(A\).
Let \(f:[a,b]\rightarrow\real\) be differentiable. Prove that if \(f'(x)\neq 0\) on \([a,b]\) then \(f\) is injective.

Taylor's Theorem

Taylor's theorem is a higher-order version of the Mean Value Theorem, and it has abundant applications in numerical analysis. Taylor's theorem involves Taylor polynomials, which you are familiar with from calculus.
Let \(x_0\in[a,b]\) and suppose that \(f:[a,b]\rightarrow\real\) is such that the derivatives \(f'(x_0)\), \(f^{(2)}(x_0)\), \(f^{(3)}(x_0)\),\(\ldots\),\(f^{(n)}(x_0)\) exist for some positive integer \(n\). Then the polynomial \begin{align*} P_n(x) &= f(x_0) + f'(x_0)(x-x_0) + \frac{1}{2!}f^{(2)}(x_0) (x-x_0)^2 +\\ &  \cdots + \frac{1}{n!}f^{(n)}(x_0)(x-x_0)^n \end{align*} is called the \(n\)th order Taylor polynomial of \(f\) based at \(x_0\). In summation notation, \(P_n(x)\) can be written as \[ P_n(x) = \sum_{k=0}^n \frac{f^{(k)}(x_0)}{k!} (x-x_0)^k. \]
By construction, the derivatives of \(f\) and \(P_n\) up to order \(n\) are identical at \(x_0\) (verify this!): \begin{align*} P_n(x_0) &= f(x_0)\\ P^{(1)}_n(x_0) &= f^{(1)}(x_0)\\ \vdots\;\;\;\; &= \;\;\;\;\vdots\\ P^{(n)}_n(x_0) &= f^{(n)}(x_0). \end{align*} It is reasonable then to suspect that \(P_n(x)\) is a good approximation to \(f(x)\) for points \(x\) near \(x_0\). If \(x\in [a,b]\) then the difference between \(f(x)\) and \(P_n(x)\) is \[ R_n(x) = f(x) - P_n(x) \] and we call \(R_n(x)\) the \(n\)th order remainder based at \(x_0\). Hence, for each \(x^*\in[a,b]\), the remainder \(R_n(x^*)\) is the error in approximating \(f(x^*)\) with \(P_n(x^*)\). You may be asking yourself why we would need to approximate \(f(x)\) if the function \(f\) is known and given. For example, if \(f(x)=\sin(x)\), why would we need to approximate \(f(1) = \sin(1)\) when any basic calculator can easily compute \(\sin(1)\)? Well, what your calculator actually computes is an approximation to \(\sin(1)\) by a rational number such as \(P_n(1)\), using a large value of \(n\) for accuracy (modern numerical algorithms for computing trigonometric functions have superseded Taylor approximations, but Taylor approximations are a good starting point). Taylor's theorem provides an expression for the remainder term \(R_n(x)\) in terms of the derivative \(f^{(n+1)}\).
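To make this discussion concrete, the short sketch below (an illustration only; the helper taylor_sin is ours) evaluates \(P_n(1)\) for \(f(x)=\sin(x)\) based at \(x_0=0\) for increasing \(n\) and compares it with the value reported by the math library.

```python
import math

# P_n(x) for f(x) = sin(x) based at x_0 = 0: only the odd-order terms survive,
# with alternating signs.
def taylor_sin(x, n):
    total = 0.0
    for k in range(1, n + 1, 2):   # k = 1, 3, 5, ...
        total += (-1) ** ((k - 1) // 2) * x**k / math.factorial(k)
    return total

for n in [1, 3, 5, 7, 9]:
    print(n, taylor_sin(1.0, n), math.sin(1.0))
# P_n(1) rapidly approaches sin(1) = 0.8414709848...
```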
Let \(f:[a,b]\rightarrow\real\) be a function such that for some \(n\in\mathbb{N}\) the functions \(f, f^{(1)}, f^{(2)},\ldots, f^{(n)}\) are continuous on \([a,b]\) and \(f^{(n+1)}\) exists on \((a,b)\). Fix \(x_0 \in [a,b]\). Then for any \(x \in [a,b]\) there exists \(c\) between \(x_0\) and \(x\) such that \[ f(x) = P_n(x) + R_n(x) \] where \[ R_n(x) = \frac{f^{(n+1)}(c)}{(n+1)!} (x-x_0)^{n+1}. \]
If \(x=x_0\) then \(P_n(x_0) = f(x_0)\) and \(c\) can be chosen arbitrarily. Thus, suppose that \(x\neq x_0\), let \(m = \frac{f(x)-P_n(x)}{(x-x_0)^{n+1}}\), and define the function \(g:[a,b]\rightarrow \real\) by \[ g(t) = f(t) - P_n(t) - m (t-x_0)^{n+1}. \] Since \(f^{(n+1)}\) exists on \((a,b)\), \(g^{(n+1)}\) exists on \((a,b)\). Moreover, since \(P_n^{(k)}(x_0) = f^{(k)}(x_0)\) for \(k=0,1,\ldots, n\), we have \(g^{(k)}(x_0) = 0\) for \(k=0,1,\ldots, n\). Now \(g(x)=0\) by the choice of \(m\), and since \(g(x_0)=0\), by Rolle's theorem there exists \(c_1\) between \(x\) and \(x_0\) such that \(g'(c_1) = 0\). We can now apply Rolle's theorem to \(g'\) since \(g'(c_1) = 0\) and \(g'(x_0)=0\), and therefore there exists \(c_2\) between \(c_1\) and \(x_0\) such that \(g''(c_2)=0\). Applying the same argument repeatedly, we obtain \(c_n\) with \(g^{(n)}(c_n)=0\), and then \(c\) between \(x_0\) and \(c_n\) such that \(g^{(n+1)}(c) = 0\); note that \(c\) lies between \(x_0\) and \(x\). Now, \[ g^{(n+1)}(t) = f^{(n+1)}(t) - m (n+1)! \] and since \(g^{(n+1)}(c)=0\) then \[ 0 = f^{(n+1)}(c) - m (n+1)! \] from which we conclude that \[ f(x) - P_n(x) = \frac{f^{(n+1)}(c)}{(n+1)!} (x-x_0)^{n+1} \] and the proof is complete.
Consider the function \(f:[0,2]\rightarrow\real\) given by \(f(x) = \ln(1+x)\). Use \(P_4\) based at \(x_0=0\) to estimate \(\ln(2)\) and give a bound on the error with your estimation.
Note that \(f(1) = \ln(2)\) and so the estimate of \(\ln(2)\) using \(P_4\) is \(\ln(2)\approx P_4(1)\). To determine \(P_4\) we need \(f(0), f^{(1)}(0), \ldots,f^{(4)}(0)\). We compute \begin{align*} f^{(1)}(x) &= \frac{1}{1+x} & f^{(1)}(0) &= 1\\ f^{(2)}(x) &= \frac{-1}{(1+x)^2} & f^{(2)}(0) &= -1 \\ f^{(3)}(x) &= \frac{2}{(1+x)^3} & f^{(3)}(0) &= 2\\ f^{(4)}(x) &= \frac{-6}{(1+x)^4} & f^{(4)}(0) &= -6.\\ \end{align*} Therefore, since \(f(0)=\ln(1)=0\), \[ P_4(x) = x - \tfrac{1}{2} x^2 + \tfrac{1}{3}x^3 - \tfrac{1}{4}x^4. \] Now \(P_4(1) = 1-\frac{1}{2}+\frac{1}{3} - \frac{1}{4} = \frac{7}{12}\) and therefore \[ \ln(2) \approx P_4(1) = \frac{7}{12}. \] The error is \(R_4(1) = f(1) - P_4(1)\), which is unknown, but we can bound it using Taylor's theorem. To that end, since \(f^{(5)}(x) = \frac{24}{(1+x)^5}\), by Taylor's theorem, for any \(x\in [0,2]\) there exists \(c\) between \(x_0=0\) and \(x\) such that \begin{align*} R_4(x) &= \frac{f^{(5)}(c)}{5!} x^5\\ & = \frac{1}{5!} \frac{24}{(1+c)^5} x^5\\ & = \frac{x^5}{5(1+c)^5}. \end{align*} Therefore, for \(x=1\), there exists \(0 \lt c \lt 1\) such that \[ R_4(1) = \frac{1}{5(1+c)^5}. \] Therefore, a bound for the error is \[ |R_4(1)| = \left|\frac{1}{5(1+c)^5}\right| \leq \frac{1}{5} \] since \(1+c \gt 1\).
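The estimate and the error bound can be checked numerically. The following sketch is an illustration only and not part of the solution; the helper P4 is ours.

```python
import math

# P_4(x) = x - x^2/2 + x^3/3 - x^4/4 for f(x) = ln(1 + x) based at x_0 = 0.
def P4(x):
    return x - x**2 / 2 + x**3 / 3 - x**4 / 4

estimate = P4(1.0)            # = 7/12 = 0.58333...
actual = math.log(2.0)        # = 0.69314...
print(estimate, actual, abs(actual - estimate))   # the error is about 0.1098
print(abs(actual - estimate) <= 1 / 5)            # True, consistent with the bound 1/5
```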
Let \(f:\real\rightarrow\real\) be the sine function, that is, \(f(x) = \sin(x)\).
  1. Approximate \(f(3) = \sin(3)\) using \(P_8\) centered at \(x_0=0\) and give a bound on the error.
  2. Restrict \(f\) to a closed and bounded interval of the form \([-a,a]\), where \(a \gt 0\). Show that for any \(\veps \gt 0\) there exists \(K\in\N\) such that if \(n\geq K\) then \(|f(x)-P_n(x)| \lt \veps\) for all \(x\in [-a,a]\).
(a) It is straightforward to compute that \[ P_8(x) = x - \frac{1}{3!}x^3 + \frac{1}{5!}x^5 - \frac{1}{7!}x^7 \] and \(f^{(9)}(x) = \cos(x)\). Thus, by Taylor's theorem, for any \(x\) there exists \(c\) between \(x_0=0\) and \(x\) such that \[ f(x) - P_8(x) = R_8(x) = \frac{\cos(c)}{9!} x^9. \] The estimate for \(f(3)=\sin(3)\) is \begin{align*} \sin(3) &\approx P_8(3)\\ & = 3 - \frac{1}{3!}3^3 + \frac{1}{5!}3^5 - \frac{1}{7!}3^7\\ & = \frac{51}{560}\\ & \approx 0.0910714286. \end{align*} By Taylor's theorem, there exists \(c\) such that \(0 \lt c \lt 3\) and \[ \sin(3) - P_8(3) = R_8(3) = \frac{\cos(c)}{9!} 3^9. \] Now since \(|\cos(c)|\leq 1\) for all \(c\in\real\), we have \begin{align*} |R_8(3)| &= \frac{|\cos(c)|}{9!} 3^9\\ & \leq \frac{3^9}{9!}\\ & = 0.054241\ldots \end{align*} (b) Since \(f(x)=\sin(x)\) has derivatives of all orders, for any \(n\in\N\) we have by Taylor's theorem that \begin{align*} |f(x) - P_n(x)| &= |R_n(x)|\\ & = \left| \frac{f^{(n+1)}(c)}{(n+1)!} x^{n+1}\right|\\ & = \frac{|f^{(n+1)}(c)|}{(n+1)!} |x|^{n+1} \end{align*} where \(c\) is between \(x_0=0\) and \(x\). Now, the derivative of \(f(x)=\sin(x)\) of any order is one of \(\pm \cos(x)\) or \(\pm \sin(x)\), and therefore \(|f^{(n+1)}(c)| \leq 1\). Since \(x \in [-a,a]\) then \(|x|\leq a\) and therefore \(|x|^{n+1} \leq a^{n+1}\). Therefore, for all \(x\in [-a,a]\) we have \begin{align*} |R_n(x)| &\leq \frac{1}{(n+1)!} a^{n+1}\\ & = \frac{a^{n+1}}{(n+1)!}. \end{align*} Consider the sequence \(x_n = \frac{a^n}{n!}\). Applying the Ratio test we obtain \begin{align*} \lim_{n\rightarrow\infty} \frac{x_{n+1}}{x_n} &= \lim_{n\rightarrow\infty} \frac{a^{n+1} n!}{(n+1)! a^n}\\ & = \limi \frac{a}{n+1} = 0. \end{align*} Therefore, by the Ratio test, \(\limi x_n = 0\). Hence, for any \(\veps \gt 0\) there exists \(K\in \N\) such that \(|x_n-0| = x_n \lt \veps\) for all \(n\geq K\). Therefore, for all \(n\geq K\) we have that \[ |R_n(x)| \leq \frac{a^{n+1}}{(n+1)!} = x_{n+1} \lt \veps \] for all \(x\in [-a,a]\).
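Once more, a quick numerical check (an illustration only; the helper P8 is ours) confirms the estimate \(P_8(3)=\frac{51}{560}\) and that the actual error is within the bound \(\frac{3^9}{9!}\).

```python
import math

# P_8(x) = x - x^3/3! + x^5/5! - x^7/7! for f(x) = sin(x) based at x_0 = 0.
def P8(x):
    return (x - x**3 / math.factorial(3) + x**5 / math.factorial(5)
            - x**7 / math.factorial(7))

estimate = P8(3.0)                 # = 51/560 = 0.0910714...
actual = math.sin(3.0)             # = 0.1411200...
bound = 3**9 / math.factorial(9)   # = 0.0542410...
print(estimate, actual, abs(actual - estimate), bound)
print(abs(actual - estimate) <= bound)   # True: the error is about 0.0500 <= 0.0542
```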
Taylor's theorem can be used to derive useful inequalities.
Prove that for all \(x\in\real\) it holds that \[ 1-\frac{1}{2}x^2 \leq \cos(x). \]
Let \(f(x) = \cos(x)\). Applying Taylor's theorem to \(f\) at \(x_0=0\) we obtain \[ \cos(x) = 1-\frac{1}{2}x^2 + R_2(x) \] where \[ R_2(x) = \frac{f^{(3)}(c)}{3!} x^3 = \frac{\sin(c)}{6} x^3 \] and \(c\) is between \(x_0=0\) and \(x\). Now, if \(0 \lt x\leq \pi\) then \(0 \lt c \lt \pi\) and thus \(\sin(c) \gt 0\), from which it follows that \(R_2(x) \geq 0\). If on the other hand \(-\pi\leq x \lt 0\) then \(-\pi \lt c \lt 0\) and thus \(\sin(c) \lt 0\), from which it follows that \(R_2(x)\geq 0\); trivially \(R_2(0)=0\). Hence, the inequality holds for \(|x|\leq \pi\). Now if \(|x|\geq \pi \gt 3\) then \[ 1- \frac{1}{2}x^2 \lt -3 \lt \cos(x). \] Hence the inequality holds for all \(x\in\real\).
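A quick numerical spot check of the inequality on a grid of points (an illustration only, of course not a proof):

```python
import math

# Spot check of 1 - x^2/2 <= cos(x) on the grid -10, -9.99, ..., 10.
ok = all(1 - x**2 / 2 <= math.cos(x)
         for x in [k / 100.0 for k in range(-1000, 1001)])
print(ok)   # True
```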

Exercises

Use Taylor's theorem to prove that if \(x \gt 0\) then \[ 1 + \frac{1}{2}x - \frac{1}{8}x^2 \leq \sqrt{1+x} \leq 1+\frac{1}{2}x \] Then use these inequalities to approximate \(\sqrt{1.2}\) and \(\sqrt{2}\), and for each case determine a bound on the error of your approximation.
Let \(f:\real\rightarrow\real\) be such that \(f^{(k)}(x)\) exists for all \(x\in\real\) and for all \(k\in\mathbb{N}\) (such a function is called infinitely differentiable on \(\real\)). Suppose further that there exists \(M \gt 0\) such that \(|f^{(k)}(x)|\leq M\) for all \(x\in\real\) and all \(k\in\mathbb{N}\). Let \(P_n(x)\) be the \(n\)th order Taylor polynomial of \(f\) centered at \(x_0=0\). Let \(I=[-R,R]\), where \(R \gt 0\). Prove that for any fixed \(\varepsilon \gt 0\) there exists \(K\in\mathbb{N}\) such that for \(n\geq K\) it holds that \[ |f(x) - P_n(x)| \lt \varepsilon \] for all \(x\in [-R,R]\). Hint: \(f^{(n)}\) is continuous on \([-R,R]\) for every \(n\in\mathbb{N}\).
Euler's number is approximately \(e\approx 2.718281828\ldots\). Use Taylor's theorem at \(x_0=0\) on \(f(x) = e^x\) and the estimate \(e \lt 3\) to show that, for all \(n\in\mathbb{N}\), \[ 0 \lt e - \left(1+1+\frac{1}{2!}+\frac{1}{3!}+\cdots+\frac{1}{n!}\right) \lt \frac{3}{(n+1)!} \]
Let \(f:\real\rightarrow\real\) be the cosine function \(f(x) = \cos(x)\). Approximate \(f(2) = \cos(2)\) using \(P_8\) centered at \(x_0=0\) and give a bound on the error of your estimation.