The Student Room Group

Derivative of a Power Series

Can anyone help me out on these questions? Not too sure what to do...

Define a function f by,

f(x) := \displaystyle\sum_{n=0}^{\infty} \frac{3^n x^n}{(n+1)!}

for those x \in \mathbb{R} for which the series converges, find f'.

Thanks for any help!
Reply 1
I'm not quite sure, but this is how I believe it will work out:
The derivative will be n*(3^n)*(x^[n-1])/(n+1)!
This further reduces to {(3^n)*(x^[n-1])/(n)!} - {(3^n)*(x^[n-1])/(n+1)!}, which converges.
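The rewrite above rests on a telescoping identity for the factorial coefficient, which can be spelt out as:

```latex
\frac{n}{(n+1)!} \;=\; \frac{(n+1)-1}{(n+1)!} \;=\; \frac{1}{n!} - \frac{1}{(n+1)!},
\qquad\text{so}\qquad
\frac{n\,3^n x^{n-1}}{(n+1)!} \;=\; \frac{3^n x^{n-1}}{n!} - \frac{3^n x^{n-1}}{(n+1)!}.
```

Each of the two pieces is (up to shifting the index) a multiple of the exponential series, so both converge for every real x.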
Reply 2
Original post by Blue7195
Can anyone help me out on these questions? Not too sure what to do...

Define a function f by,

f(x) := \displaystyle\sum_{n=0}^{\infty} \frac{3^n x^n}{(n+1)!}

for those x \in \mathbb{R} for which the series converges, find f'.

Thanks for any help!


Well, assuming you've identified the values of x for which the series converges, you should know a general result that tells you how to differentiate a power series within its radius of convergence :smile:
Reply 3
Original post by davros
Well, assuming you've identified the values of x for which the series converges, you should know a general result that tells you how to differentiate a power series within its radius of convergence :smile:


So is it simply just,

f'(x) = \displaystyle\sum_{n=0}^{\infty} \frac{n 3^n x^{n-1}}{(n+1)!}

?
Reply 4
That is correct, however it's the justification of the derivative that is important.

You don't know (from the question) that the derivative f' converges, thus use Spandy's simple method to rewrite the summand and show convergence.

The question is only difficult because you have to prove the convergence of the derivative as well as calculating it. The derivative you have calculated above will be valid for those x in the reals where the original series f converges.

Hope that's clear!
Reply 5
Original post by nathanturnerspc
That is correct, however it's the justification of the derivative that is important.

You don't know (from the question) that the derivative f' converges, thus use Spandy's simple method to rewrite the summand and show convergence.

The question is only difficult because you have to prove the convergence of the derivative as well as calculating it. The derivative you have calculated above will be valid for those x in the reals where the original series f converges.

Hope that's clear!


The question before was working out the radius of convergence, which was infinity, so f is differentiable on all of R. So isn't the derivative differentiable on all of R too? Sorry, I maybe should have put that in first.
Reply 6
Original post by nathanturnerspc
That is correct, however it's the justification of the derivative that is important.

You don't know (from the question) that the derivative f' converges, thus use Spandy's simple method to rewrite the summand and show convergence.

The question is only difficult because you have to prove the convergence of the derivative as well as calculating it. The derivative you have calculated above will be valid for those x in the reals where the original series f converges.

Hope that's clear!


Not if you've already been given the standard result that a power series is differentiable within its radius of convergence!

I'm not sure, but I suspect the point of this question is that you can in fact express the derivative as a well-known function of x by doing a bit of rearrangement and taking out a couple of factors, but I haven't tried it myself!
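For what it's worth, davros's hunch checks out: multiplying the sum by 3x and reindexing gives \sum (3x)^{n+1}/(n+1)! = e^{3x} - 1, so f(x) = (e^{3x} - 1)/(3x) for x ≠ 0 (and f(0) = 1). A quick numerical sanity check in Python (function names here are purely illustrative, not from the question):

```python
import math

def f_series(x, terms=60):
    """Partial sum of f(x) = sum_{n>=0} 3^n x^n / (n+1)!."""
    return sum(3**n * x**n / math.factorial(n + 1) for n in range(terms))

def fprime_series(x, terms=60):
    """Term-by-term derivative: sum_{n>=1} n 3^n x^(n-1) / (n+1)!."""
    return sum(n * 3**n * x**(n - 1) / math.factorial(n + 1)
               for n in range(1, terms))

def f_closed(x):
    """Conjectured closed form (e^{3x} - 1)/(3x), extended by f(0) = 1."""
    return (math.exp(3 * x) - 1) / (3 * x) if x != 0 else 1.0

# Compare the term-by-term derivative against a central difference
# of the closed form at a sample point.
x, h = 0.7, 1e-6
central_diff = (f_closed(x + h) - f_closed(x - h)) / (2 * h)
```

Both checks agree to many decimal places at x = 0.7, consistent with the series and its term-by-term derivative converging everywhere.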
Reply 7
Ok in that case you're safe with f being differentiable everywhere in R.

However, you need to be careful with derivatives. Its derivative will CONVERGE on all R, otherwise f would not be differentiable on R. So you're right there (incidentally, the power series is differentiable because it is just a sum of positive powers of x; convergence means you can 'switch the derivative and sum' signs without any trouble).

You claimed that f' would be differentiable everywhere. I don't think this is true; take, for example, f' = |x| and f = sgn(x) x^2 / 2.

Then f is differentiable on R but f' is not (not differentiable at zero).

If you have any questions on the subtlety let me know :smile:
Reply 8
Original post by davros
Not if you've already been given the standard result that a power series is differentiable within its radius of convergence!

I'm not sure, but I suspect the point of this question is that you can in fact express the derivative as a well-known function of x by doing a bit of rearrangement and taking out a couple of factors, but I haven't tried it myself!


Ah okay, it's only 2 marks so I didn't think it would be anything big but at the same time I was just really unsure. Thanks a lot for your help!
Reply 9
Original post by nathanturnerspc
Ok in that case you're safe with f being differentiable everywhere in R.

However, you need to be careful with derivatives. Its derivative will CONVERGE on all R, otherwise f would not be differentiable on R. So you're right there (incidentally, the power series is differentiable because it is just a sum of positive powers of x; convergence means you can 'switch the derivative and sum' signs without any trouble).

You claimed that f' would be differentiable everywhere. I don't think this is true, take, for example f' = |x| and f = sgn(x) x^2 / 2

Then f is differentiable on R but f' is not (not differentiable at zero).

If you have any questions on the subtlety let me know :smile:


Thankyou! Analysis really isn't my strong point haha
Reply 10
Original post by nathanturnerspc
Ok in that case you're safe with f being differentiable everywhere in R.

However, you need to be careful with derivatives. Its derivative will CONVERGE on all R, otherwise f would not be differentiable on R. So you're right there (incidentally, the power series is differentiable because it is just a sum of positive powers of x; convergence means you can 'switch the derivative and sum' signs without any trouble).

This is sort of true, but it's quite a lot more subtle than you seem to think. I say "sort of true" because you need the derivative to converge as well. You certainly can have "trouble" where it doesn't. For example consider

\displaystyle \sum_{n=0}^\infty \dfrac{(-1)^n x^n}{2n+1} = \log(1+x) for (-1 < x <= 1).

The sum converges when x = 1, but differentiating the power series term by term gives a divergent series when x = 1, even though you can differentiate log(1+x) at that point.

You claimed that f' would be differentiable everywhere. I don't think this is true,
If the power series for f converges everywhere, this is true. More generally:

If you have a power series f(x) = \sum_0^\infty a_n x^n and it converges for any particular value x = x_0, then the series converges whenever |x| < |x_0|, and for such values of x differentiating term by term gives a convergent series that converges to f'(x). You can repeat the argument to show all the derivatives exist inside the radius of convergence.

This is all standard bookwork - albeit bookwork that you're often not expected to be able to prove.
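As a concrete illustration of this standard result, here is a minimal numerical sketch using the geometric series, where both the series and its term-by-term derivative have known closed forms (names are illustrative):

```python
# Geometric series: sum x^n = 1/(1-x) for |x| < 1.
# Term-by-term derivative: sum n x^(n-1) = 1/(1-x)^2, also for |x| < 1.
def geom(x, terms=200):
    return sum(x**n for n in range(terms))

def geom_deriv(x, terms=200):
    return sum(n * x**(n - 1) for n in range(1, terms))

x = 0.5  # strictly inside the radius of convergence
```

At x = 0.5 the partial sums of both series rapidly approach 1/(1-x) = 2 and 1/(1-x)^2 = 4 respectively, exactly as the theorem predicts.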

take, for example f' = |x| and f = sgn(x) x^2 / 2

Then f is differentiable on R but f' is not (not differentiable at zero).
This is true, but you can't express f as a power series so it isn't a counterexample to what was claimed.

Power series are special.
Reply 11
There is a bit of subtlety in this question because we're talking about power series. In summary, for a power series:

- It is differentiable (term-by-term) within its radius of convergence; but the derivative may not be convergent at all points
- The derivative of the series will be differentiable at all points by virtue of it being a series of powers of x

Just FYI the series you have for log(1+x) isn't quite right, although I agree the derivative series does not converge at x=1. But no matter for the question we're looking at here.
Reply 12
Original post by nathanturnerspc
There is a bit of subtlety in this question because we're talking about power series. In summary, for a power series:

- It is differentiable (term-by-term) within its radius of convergence; but the derivative may not be convergent at all points
No. It always converges inside the radius of convergence. If you think otherwise, provide a counterexample.

See also https://gowers.wordpress.com/2014/02/22/differentiating-power-series/


- The derivative of the series will be differentiable at all points by virtue of it being a series of powers of x

I'm not even sure what you mean by this. It sure ain't going to converge at points outside the radius of convergence.

Just FYI the series you have for log(1+x) isn't quite right, although I agree the derivative series does not converge at x=1. But no matter for the question we're looking at here.

It still shows the convergence isn't as straightforward as you claim.
Reply 13
Original post by DFranklin
This is sort of true, but it's quite a lot more subtle than you seem to think. I say "sort of true" because you need the derivative to converge as well. You certainly can have "trouble" where it doesn't. For example consider

\displaystyle \sum_{n=0}^\infty \dfrac{(-1)^n x^n}{2n+1} = \log(1+x) for (-1 < x <= 1).

The sum converges when x = 1, but differentiating the power series term by term gives a divergent series when x = 1, even though you can differentiate log(1+x) at that point.

If the power series for f converges everywhere, this is true. More generally:

If you have a power series f(x) = \sum_0^\infty a_n x^n and it converges for any particular value x = x_0, then the series converges whenever |x| < |x_0|, and for such values of x differentiating term by term gives a convergent series that converges to f'(x). You can repeat the argument to show all the derivatives exist inside the radius of convergence.

This is all standard bookwork - albeit bookwork that you're often not expected to be able to prove.

This is true, but you can't express f as a power series so it isn't a counterexample to what was claimed.

Power series are special.

I'm pretty sure that's not the power series expansion of log(1+x). The coefficient of x^n should be [(-1)^{n-1}]/n.
Reply 14
Original post by newblood
I'm pretty sure that's not the power series expansion of log(1+x). The coefficient of x^n should be [(-1)^{n-1}]/n.
Yes, it was already pointed out (I conflated the series for arctan x with the one for log x. Should have just derived it from scratch).

Edit: I should perhaps have acknowledged the error in the previous post but was posting from my phone and having enough trouble cut/pasting a link! Having lost the post twice I just wanted to get it out there.

Point still holds though: the power series for log(1+x) converges for -1<x<=1 but the derivative diverges at 1, despite the fact that log(1+x) is differentiable there.
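Using the corrected coefficients [(-1)^{n-1}]/n for log(1+x), this behaviour is easy to see numerically; a minimal sketch (names are illustrative):

```python
import math

def log_series(x, terms):
    """Partial sum of log(1+x) = sum_{n>=1} (-1)^(n-1) x^n / n."""
    return sum((-1)**(n - 1) * x**n / n for n in range(1, terms + 1))

def log_deriv_series(x, terms):
    """Term-by-term derivative: sum_{n>=1} (-1)^(n-1) x^(n-1)."""
    return sum((-1)**(n - 1) * x**(n - 1) for n in range(1, terms + 1))

# At x = 1 the original series converges (slowly) to log 2 ...
approx = log_series(1.0, 10000)
# ... but the derivative's partial sums just alternate between 1 and 0,
# even though d/dx log(1+x) = 1/2 at x = 1.
odd_partial = log_deriv_series(1.0, 9)    # odd number of terms
even_partial = log_deriv_series(1.0, 10)  # even number of terms
```

So the series converges at the endpoint x = 1 while its term-by-term derivative does not, which is exactly the subtlety at the boundary of the radius of convergence.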

The arctan series is actually a more interesting example in some ways (hopefully got it right this time!):

\arctan(x) = \displaystyle\sum_{n=0}^\infty \dfrac{(-1)^n x^{2n+1}}{2n+1}

The RHS only converges for -1 <= x <= 1 and the (term-by-term) derivative diverges at x=1.

But the LHS is infinitely differentiable for all x.

The reason being that arctan z (z complex) has singularities at z=+/- i. Of course, I've done the complex analysis work to understand why this all happens, but I still find it kind of amazing how a function's behaviour well away from the real line can still have very observable effects on its behaviour when restricted to the real line.
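That long-range effect is easy to observe numerically: arctan is perfectly smooth at x = 1.5, yet the Maclaurin partial sums blow up there, because |1.5| exceeds the radius of convergence set by the singularities at z = ±i. A quick sketch (illustrative names):

```python
import math

def arctan_partial(x, terms):
    """Partial sum of arctan(x) = sum_{n>=0} (-1)^n x^(2n+1) / (2n+1)."""
    return sum((-1)**n * x**(2 * n + 1) / (2 * n + 1) for n in range(terms))

# The function itself is fine at x = 1.5 ...
smooth_value = math.atan(1.5)
# ... but the partial sums of the series grow without bound there,
# since the terms 1.5^(2n+1)/(2n+1) themselves diverge.
p20 = abs(arctan_partial(1.5, 20))
p40 = abs(arctan_partial(1.5, 40))
```

The partial sums oscillate with rapidly growing magnitude, even though nothing looks wrong about arctan anywhere on the real line.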
Reply 15
Original post by DFranklin
Power series are special.


I REALLY agree. Life is better after complex numbers. :smile:
