This is sort of true, but it's quite a lot more subtle than you seem to think. I say "sort of true" because you need the derivative to converge as well. You certainly can have "trouble" where it doesn't. For example consider
$$\sum_{n=0}^{\infty} \frac{(-1)^n x^{n+1}}{n+1} = \log(1+x) \quad \text{for } -1 < x \le 1.$$
The sum converges when $x = 1$ (it's the alternating harmonic series), but differentiating the power series term by term gives $\sum_{n=0}^{\infty} (-1)^n x^n$, which diverges at $x = 1$, even though $\log(1+x)$ is differentiable at that point, with derivative $1/2$.
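A quick numerical illustration of this failure mode (a sketch in plain Python; the function name is mine):

```python
# f(x) = sum_{n>=0} (-1)^n x^(n+1) / (n+1) = log(1+x) for -1 < x <= 1.
# Differentiating the series term by term gives sum_{n>=0} (-1)^n x^n.

def diff_partial_sum(x, N):
    """Partial sum of the term-by-term derivative: sum_{n<N} (-1)^n x^n."""
    return sum((-1) ** n * x ** n for n in range(N))

# Inside the radius of convergence (|x| < 1) the partial sums settle
# down to f'(x) = 1/(1+x):
print(diff_partial_sum(0.5, 50))   # close to 1/1.5 = 0.666...

# At x = 1 they oscillate forever between 0 and 1, even though
# f'(1) = 1/2 exists:
print(diff_partial_sum(1.0, 50), diff_partial_sum(1.0, 51))  # 0.0 then 1.0
```

So the differentiated series simply fails to converge at the endpoint, with no contradiction about the derivative of $\log(1+x)$ itself.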
If the power series for f converges everywhere, this
is true. More generally:
If you have a power series $f(x) = \sum_{n=0}^{\infty} a_n x^n$ and it converges for some particular value $x = x_0$, then the series converges whenever $|x| < |x_0|$, and for such values of $x$ differentiating term by term gives a convergent series that converges to $f'(x)$. You can repeat the argument to show all the derivatives exist inside the radius of convergence.
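To see the theorem in action, here is a hedged numerical sketch in plain Python (the helper name is mine). It differentiates the $\log(1+x)$ series term by term, once and then twice, at a point strictly inside the radius of convergence, and compares with the known derivatives $f'(x) = 1/(1+x)$ and $f''(x) = -1/(1+x)^2$:

```python
# f(x) = log(1+x) = sum_{n>=0} (-1)^n x^(n+1) / (n+1), radius 1.

def series_derivative(x, order, N=2000):
    """Partial sum of the series for f differentiated `order` times."""
    total = 0.0
    for n in range(N):
        a = (-1) ** n / (n + 1)   # coefficient of x^k
        k = n + 1                 # exponent
        if k < order:
            continue              # this term differentiates to zero
        # d^order/dx^order of x^k is k (k-1) ... (k-order+1) x^(k-order)
        c = a
        for j in range(order):
            c *= k - j
        total += c * x ** (k - order)
    return total

x = 0.4
print(series_derivative(x, 1), 1 / (1 + x))        # both ~ 0.714285...
print(series_derivative(x, 2), -1 / (1 + x) ** 2)  # both ~ -0.510204...
```

Inside $|x| < 1$ the differentiated series still converges geometrically, which is why a modest number of terms already matches the closed forms; repeating the differentiation any number of times behaves the same way.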
This is all standard bookwork - albeit bookwork that you're often not expected to be able to prove.
This is true, but you can't express $f$ as a power series, so it isn't a counterexample to what was claimed.
Power series are
special.