Let’s say we want to approximate a function $f$. For $x$ in a neighborhood of $a$, the value $f(a)$ is probably a decent approximation of $f(x)$.
Using the Fundamental Theorem of Calculus we can calculate the error
$$ f(x) - f(a) = \int_a^x f'(y)dy $$
If $f'$ is bounded by $\|f'\|_\infty$ then our error is at most $\|f'\|_\infty |x-a|$. We can now improve this bound by approximating $f'$ itself in the same way, replacing $f'(y)$ under the integral by $f'(a)$:
$$ \int_a^x f'(y)dy \approx \int_a^x f'(a)dy = f'(a)(x-a) $$
This results in a more precise approximation of $f$:
$$ f(x) \approx f(a) + f'(a)(x-a) $$
If $f''$ is bounded then our error is now at most $\|f''\|_\infty(x-a)^2$, a significant improvement over the previous bound when $x$ is close to $a$.
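As a quick numerical sanity check (a sketch with $f = \sin$, chosen so that $\|f''\|_\infty = 1$; the function name is mine), we can compare the actual error of the linear approximation against this bound:

```python
import math

# Sketch: check the quadratic error bound for f = sin around a = 0.5.
# Since f'' = -sin, we have ||f''||_inf = 1, so the bound is (x - a)**2.
def linear_approx(a, x):
    # f(a) + f'(a)(x - a) with f = sin, f' = cos
    return math.sin(a) + math.cos(a) * (x - a)

a = 0.5
for x in [0.6, 0.8, 1.0]:
    error = abs(math.sin(x) - linear_approx(a, x))
    bound = (x - a) ** 2  # ||f''||_inf * (x - a)**2
    assert error <= bound
```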
Iterating on this idea yields the Taylor approximation. If all the derivatives are bounded by the same constant, the error converges to zero and the function is equal to its Taylor series.
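Written out, iterating the construction $n$ times gives the $n$-th order Taylor approximation around $a$:
$$ f(x) \approx \sum_{k=0}^{n} \frac{f^{(k)}(a)}{k!}(x-a)^k $$
and in the limit $n\to\infty$ the sum becomes the Taylor series.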
Application: Exponential Function
If the rate of change of a quantity is equal to its current size (e.g. the number of bacteria increasing via cell division), then $f = f'$. If we also assume $f(0)=1$ as a normalization, we immediately obtain the series representation of the unique function with this property: the exponential function.
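A small numerical sketch (function name is mine) of the resulting partial sums $\sum_{n} x^n/n!$, which converge to $e^x$:

```python
import math

# Partial sums of the exponential series sum_n x**n / n!,
# obtained from f = f' and f(0) = 1: every derivative at 0 equals 1.
def exp_series(x, terms=20):
    total, term = 0.0, 1.0  # term starts as x**0 / 0! = 1
    for n in range(terms):
        total += term
        term *= x / (n + 1)  # advance to x**(n+1) / (n+1)!
    return total

assert abs(exp_series(1.0) - math.e) < 1e-12
```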
Taylor Approximation in Multiple Dimensions
To obtain Taylor’s Theorem for functions $f:\mathbb{R}^d\to\mathbb{R}$ of several variables, one can reduce it to the one-dimensional version with the following trick. For $a,x\in\mathbb{R}^d$ we can define
$$ g(t) = f(a + t(x-a)) $$
which means we can apply the one-dimensional Taylor theorem from above to $g$ to get