You're right, they're used to approximate functions. Here's how they work, with a simple example. Let's suppose we have the polynomial
$$f(x) = a_3x^3 + a_2x^2 + a_1x + a_0.$$
Now then, $f(0) = a_0$. Furthermore,
$$f'(x) = 3a_3x^2 + 2a_2x + a_1 \implies f'(0) = a_1.$$
Similarly, you'll get $f''(0) = 2a_2$ and $f'''(0) = 6a_3$ (where the $2$ and the $6$ appear as a consequence of the differentiation).
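The derivative values above can be checked numerically with a toy cubic. This is just an illustrative sketch: the coefficient-list representation and the function names are my own, not anything standard.

```python
# Represent a polynomial as a coefficient list [a0, a1, a2, a3].

def derivative(coeffs):
    """Differentiate a polynomial given as [a0, a1, a2, ...]."""
    # d/dx of a_k x^k is k*a_k x^(k-1), so multiply by the index and shift down.
    return [k * c for k, c in enumerate(coeffs)][1:]

def at_zero(coeffs):
    """Evaluate the polynomial at x = 0 (just the constant term)."""
    return coeffs[0] if coeffs else 0

# f(x) = 7x^3 + 5x^2 + 3x + 2, i.e. a0 = 2, a1 = 3, a2 = 5, a3 = 7
f = [2, 3, 5, 7]

print(at_zero(f))                                      # f(0)    = a0   = 2
print(at_zero(derivative(f)))                          # f'(0)   = a1   = 3
print(at_zero(derivative(derivative(f))))              # f''(0)  = 2*a2 = 10
print(at_zero(derivative(derivative(derivative(f)))))  # f'''(0) = 6*a3 = 42
```

The factors $2$ and $6$ show up exactly as claimed: $f''(0) = 2a_2 = 10$ and $f'''(0) = 6a_3 = 42$.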
But we can use these values, and the predictable way in which differentiation changes the coefficients, to recover the polynomial. Repeated differentiation multiplies each coefficient by a factorial: the $a_3$ is in front of the $x^3$, so it gets multiplied by $3$, then $2$, then $1$ as its term becomes the constant term. So we 'undo' the differentiation on each coefficient by dividing by the factorial it would've been multiplied by:
$$f(x) = \frac{6a_3}{3!}x^3 + \frac{2a_2}{2!}x^2 + \frac{a_1}{1!}x + \frac{a_0}{0!},$$
where $0! = 1$ by convention. This is the polynomial we started with! If this seems a bit abstract, try an example with fixed coefficients, and you should see it working out easily.
Looking back at what the values in the numerators are, we could write this as
$$f(x) = \sum_{k=0}^{3} \frac{f^{(k)}(0)}{k!}\,x^k.$$
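The "undoing" step can be sketched in code too: compute $f^{(k)}(0)/k!$ for $k = 0, \dots, 3$ and watch the original coefficients reappear. Again, the representation and names here are my own choices for illustration.

```python
from math import factorial

def derivative(coeffs):
    """Differentiate a polynomial given as [a0, a1, a2, ...]."""
    return [k * c for k, c in enumerate(coeffs)][1:]

def evaluate(coeffs, x):
    """Evaluate the polynomial at x."""
    return sum(c * x**k for k, c in enumerate(coeffs))

f = [2, 3, 5, 7]  # f(x) = 7x^3 + 5x^2 + 3x + 2

# Build the Taylor coefficients f^(k)(0)/k! for k = 0..3.
taylor = []
d = f
for k in range(4):
    taylor.append(d[0] / factorial(k))  # d[0] is the k-th derivative at 0
    d = derivative(d)

print(taylor)  # [2.0, 3.0, 5.0, 7.0], i.e. we recover a0..a3 exactly
```

Dividing each derivative at $0$ by $k!$ hands back the original coefficient list, so the sum $\sum_k \frac{f^{(k)}(0)}{k!}x^k$ really is the same polynomial.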
Now it's beginning to look like a Taylor series! This is the general idea, but the upper limit could be any finite $n$ in practice. Be warned, though: truncated Taylor series are not exactly great approximations in some cases. Try the expansion of $f(x) = e^{-x^2}$ and compare it with the function away from $x = 0$ if you want to see how bad they can be!
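To see this concretely: the series of $e^{-x^2}$ about $0$ is $\sum_{k\ge 0} (-1)^k x^{2k}/k!$, and any truncation is a polynomial, which blows up as $|x|$ grows while the true function decays to $0$. A minimal sketch (the function name and the choice of 10 terms are mine):

```python
from math import exp, factorial

def taylor_exp_neg_x2(x, terms):
    """Partial sum of the Taylor series of e^(-x^2) about 0."""
    return sum((-1) ** k * x ** (2 * k) / factorial(k) for k in range(terms))

x = 3.0
true_value = exp(-x * x)            # about 0.000123
approx = taylor_exp_neg_x2(x, 10)   # 10 terms of the alternating series
print(true_value, approx)           # the partial sum is wildly off at x = 3
```

Near $0$ the partial sums are excellent, but at $x = 3$ the alternating terms $(-9)^k/k!$ grow into the hundreds before they start shrinking, so a 10-term truncation misses the true value by a factor of millions.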
I hope I've answered sufficiently to give you the idea, but if anything is unclear, please ask away!