# Derivative of a Power Series

#1
Can anyone help me out on these questions? Not too sure what to do...

Define a function f by f(x) = Σ (3^n)(x^n)/(n+1)! (summing over n ≥ 0) and, for those x for which the series converges, find f'.

Thanks for any help!
0
5 years ago
#2
I'm not quite sure, but this is how I believe it will work out:
The derivative will be n*(3^n)*(x^[n-1])/(n+1)! (term by term).
Since n/(n+1)! = 1/(n!) - 1/(n+1)!, this further reduces to {(3^n)*(x^[n-1])/(n)!} - {(3^n)*(x^[n-1])/(n+1)!}, which converges.
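As a quick sanity check of this (a sketch, assuming f(x) = Σ 3^n x^n/(n+1)!, as the derivative above implies; partial sums stand in for the infinite series), the term-by-term derivative agrees with a numerical derivative:

```python
import math

# Assumes f(x) = sum_{n>=0} 3^n x^n / (n+1)!, as implied by the
# derivative given above; 60 terms is plenty for moderate x.

def f(x, terms=60):
    return sum(3**n * x**n / math.factorial(n + 1) for n in range(terms))

def f_prime(x, terms=60):
    # term-by-term derivative: sum_{n>=1} n * 3^n * x^(n-1) / (n+1)!
    return sum(n * 3**n * x**(n - 1) / math.factorial(n + 1)
               for n in range(1, terms))

x, h = 0.7, 1e-6
numerical = (f(x + h) - f(x - h)) / (2 * h)   # central difference
print(abs(numerical - f_prime(x)) < 1e-5)     # True
```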
0
5 years ago
#3
(Original post by Blue7195)
Can anyone help me out on these questions? Not too sure what to do...

Define a function f by f(x) = Σ (3^n)(x^n)/(n+1)! (summing over n ≥ 0) and, for those x for which the series converges, find f'.

Thanks for any help!
Well, assuming you've identified the values of x for which the series converges, you should know a general result that tells you how to differentiate a power series within its radius of convergence.
1
#4
(Original post by davros)
Well, assuming you've identified the values of x for which the series converges, you should know a general result that tells you how to differentiate a power series within its radius of convergence.
So is it simply just n*(3^n)*(x^[n-1])/(n+1)! (summed over n)?
0
5 years ago
#5
That is correct; however, it's the justification of the derivative that is important.

You don't know (from the question) that the series for f' converges, so use Spandy's simple method to rewrite the summand and show convergence.

The question is only difficult because you have to prove the convergence of the derivative as well as calculating it. The derivative you have calculated above will be valid for those x in the reals where the original series f converges.

Hope that's clear!
0
#6
(Original post by nathanturnerspc)
That is correct; however, it's the justification of the derivative that is important.

You don't know (from the question) that the series for f' converges, so use Spandy's simple method to rewrite the summand and show convergence.

The question is only difficult because you have to prove the convergence of the derivative as well as calculating it. The derivative you have calculated above will be valid for those x in the reals where the original series f converges.

Hope that's clear!
The question before was working out the radius of convergence, which was infinity, so f is differentiable on all of R. So isn't the derivative differentiable on all of R too? Sorry, I maybe should have put that in first.
0
5 years ago
#7
(Original post by nathanturnerspc)
That is correct; however, it's the justification of the derivative that is important.

You don't know (from the question) that the series for f' converges, so use Spandy's simple method to rewrite the summand and show convergence.

The question is only difficult because you have to prove the convergence of the derivative as well as calculating it. The derivative you have calculated above will be valid for those x in the reals where the original series f converges.

Hope that's clear!
Not if you've already been given the standard result that a power series is differentiable within its radius of convergence!

I'm not sure, but I suspect the point of this question is that you can in fact express the derivative as a well-known function of x by doing a bit of rearrangement and taking out a couple of factors, but I haven't tried it myself!
0
5 years ago
#8
OK, in that case you're safe with f being differentiable everywhere on R.

However, you need to be careful with derivatives. Its derivative will CONVERGE on all R; otherwise f would not be differentiable on R. So you're right there (incidentally, the power series is differentiable because it is just a sum of positive powers of x; convergence means you can 'switch the derivative and sum' signs without any trouble).

You claimed that f' would be differentiable everywhere. I don't think this is true; take, for example, f'(x) = |x| and f(x) = sgn(x) x^2 / 2.

Then f is differentiable on R but f' is not (it is not differentiable at zero).

If you have any questions on the subtlety, let me know!
0
#9
(Original post by davros)
Not if you've already been given the standard result that a power series is differentiable within its radius of convergence!

I'm not sure, but I suspect the point of this question is that you can in fact express the derivative as a well-known function of x by doing a bit of rearrangement and taking out a couple of factors, but I haven't tried it myself!
Ah okay, it's only 2 marks so I didn't think it would be anything big, but at the same time I was just really unsure. Thanks a lot for your help!
0
#10
(Original post by nathanturnerspc)
OK, in that case you're safe with f being differentiable everywhere on R.

However, you need to be careful with derivatives. Its derivative will CONVERGE on all R; otherwise f would not be differentiable on R. So you're right there (incidentally, the power series is differentiable because it is just a sum of positive powers of x; convergence means you can 'switch the derivative and sum' signs without any trouble).

You claimed that f' would be differentiable everywhere. I don't think this is true; take, for example, f'(x) = |x| and f(x) = sgn(x) x^2 / 2.

Then f is differentiable on R but f' is not (it is not differentiable at zero).

If you have any questions on the subtlety, let me know!
Thank you! Analysis really isn't my strong point haha
0
5 years ago
#11
(Original post by nathanturnerspc)
OK, in that case you're safe with f being differentiable everywhere on R.

However, you need to be careful with derivatives. Its derivative will CONVERGE on all R; otherwise f would not be differentiable on R. So you're right there (incidentally, the power series is differentiable because it is just a sum of positive powers of x; convergence means you can 'switch the derivative and sum' signs without any trouble).
This is sort of true, but it's quite a lot more subtle than you seem to think. I say "sort of true" because you need the derivative to converge as well. You certainly can have "trouble" where it doesn't. For example, consider log(1+x) = Σ (-1)^(n-1) * x^(2n-1)/(2n-1) (summing over n ≥ 1) for -1 < x <= 1.

The sum converges when x = 1, but differentiating the power series term by term gives a divergent series when x = 1, even though you can differentiate log(1+x) at that point.

You claimed that f' would be differentiable everywhere. I don't think this is true,
If the power series for f converges everywhere, this is true. More generally:

If you have a power series f(x) = Σ a_n x^n and it converges for some particular value x = x_0, then the series converges whenever |x| < |x_0|, and for such values of x differentiating term by term gives a convergent series that converges to f'(x). You can repeat the argument to show all the derivatives exist inside the radius of convergence.

This is all standard bookwork - albeit bookwork that you're often not expected to be able to prove.
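That bookwork is easy to illustrate numerically. A sketch using the geometric series (my example, not the series from the question): Σ x^n = 1/(1-x) for |x| < 1, and the term-by-term derivative Σ n x^(n-1) converges to 1/(1-x)^2 inside the same radius of convergence:

```python
# Geometric series check (an assumed example): for |x| < 1,
# sum x^n = 1/(1-x), and the term-by-term derivative sum n x^(n-1)
# converges to 1/(1-x)^2, as the general result predicts.
x = 0.5
series = sum(x**n for n in range(200))
deriv_series = sum(n * x**(n - 1) for n in range(1, 200))
print(abs(series - 1 / (1 - x)) < 1e-12)           # True
print(abs(deriv_series - 1 / (1 - x)**2) < 1e-12)  # True
```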

take, for example, f'(x) = |x| and f(x) = sgn(x) x^2 / 2

Then f is differentiable on R but f' is not (it is not differentiable at zero).
This is true, but you can't express f as a power series, so it isn't a counterexample to what was claimed.

Power series are special.
1
5 years ago
#12
There is a bit of subtlety in this question because we're talking about power series. In summary, for a power series:

- It is differentiable (term-by-term) within its radius of convergence; but the derivative may not be convergent at all points
- The derivative of the series will be differentiable at all points by virtue of it being a series of powers of x

Just FYI, the series you have for log(1+x) isn't quite right, although I agree the derivative series does not converge at x=1. But that doesn't matter for the question we're looking at here.
0
5 years ago
#13
(Original post by nathanturnerspc)
There is a bit of subtlety in this question because we're talking about power series. In summary, for a power series:

- It is differentiable (term-by-term) within its radius of convergence; but the derivative may not be convergent at all points
No. It always converges inside the radius of convergence. If you think otherwise, provide a counterexample.

- The derivative of the series will be differentiable at all points by virtue of it being a series of powers of x
I'm not even sure what you mean by this. It sure ain't going to converge at points outside the radius of convergence.

Just FYI, the series you have for log(1+x) isn't quite right, although I agree the derivative series does not converge at x=1. But that doesn't matter for the question we're looking at here.
It still shows the convergence isn't as straightforward as you claim.
0
5 years ago
#14
(Original post by DFranklin)
This is sort of true, but it's quite a lot more subtle than you seem to think. I say "sort of true" because you need the derivative to converge as well. You certainly can have "trouble" where it doesn't. For example, consider log(1+x) = Σ (-1)^(n-1) * x^(2n-1)/(2n-1) (summing over n ≥ 1) for -1 < x <= 1.

The sum converges when x = 1, but differentiating the power series term by term gives a divergent series when x = 1, even though you can differentiate log(1+x) at that point.

If the power series for f converges everywhere, this is true. More generally:

If you have a power series f(x) = Σ a_n x^n and it converges for some particular value x = x_0, then the series converges whenever |x| < |x_0|, and for such values of x differentiating term by term gives a convergent series that converges to f'(x). You can repeat the argument to show all the derivatives exist inside the radius of convergence.

This is all standard bookwork - albeit bookwork that you're often not expected to be able to prove.

This is true, but you can't express f as a power series so it isn't a counterexample to what was claimed.

Power series are special.
I'm pretty sure that's not the power series expansion of log(1+x). The coefficient of x^n should be [(-1)^(n-1)]/n.
0
5 years ago
#15
(Original post by newblood)
I'm pretty sure that's not the power series expansion of log(1+x). The coefficient of x^n should be [(-1)^(n-1)]/n.
Yes, it was already pointed out (I conflated the series for arctan x with the one for log x. Should have just derived it from scratch).

Edit: I should perhaps have acknowledged the error in the previous post but was posting from my phone and having enough trouble cut/pasting a link! Having lost the post twice I just wanted to get it out there.

Point still holds though: the power series for log(1+x) converges for -1 < x <= 1, but the derivative diverges at x=1, despite the fact that log(1+x) is differentiable there.
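A quick numerical sketch of that point (using the corrected coefficients (-1)^(n-1)/n for log(1+x)):

```python
import math

# log(1+x) = sum_{n>=1} (-1)^(n-1) x^n / n converges at x = 1 (to log 2),
# but its term-by-term derivative 1 - x + x^2 - ... has partial sums that
# oscillate between 1 and 0 at x = 1, so it diverges there.

def log_partial(x, N):
    return sum((-1)**(n - 1) * x**n / n for n in range(1, N + 1))

def deriv_partial(x, N):
    return sum((-1)**(n - 1) * x**(n - 1) for n in range(1, N + 1))

print(abs(log_partial(1, 100000) - math.log(2)) < 1e-4)  # True
print(deriv_partial(1, 100), deriv_partial(1, 101))      # 0 1
```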

The arctan series is actually a more interesting example in some ways (hopefully got it right this time!): arctan x = Σ (-1)^(n-1) * x^(2n-1)/(2n-1), summing over n ≥ 1. The RHS only converges for -1 < x <= 1 and the (term-by-term) derivative diverges at x=1.

But the LHS is infinitely differentiable for all x.

The reason being that arctan z (z complex) has singularities at z=+/- i. Of course, I've done the complex analysis work to understand why this all happens, but I still find it kind of amazing how a function's behaviour well away from the real line can still have very observable effects on its behaviour when restricted to the real line.
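The arctan behaviour can be seen numerically too (a sketch; at x = 1 the series is the Leibniz series for pi/4):

```python
import math

# arctan x = sum_{n>=1} (-1)^(n-1) x^(2n-1) / (2n-1): at x = 1 this is the
# Leibniz series for pi/4, so it converges, yet the term-by-term derivative
# 1 - x^2 + x^4 - ... oscillates between 1 and 0 at x = 1,
# even though arctan itself is smooth on all of R.

def arctan_partial(x, N):
    return sum((-1)**(n - 1) * x**(2 * n - 1) / (2 * n - 1)
               for n in range(1, N + 1))

def arctan_deriv_partial(x, N):
    return sum((-1)**(n - 1) * x**(2 * n - 2) for n in range(1, N + 1))

print(abs(arctan_partial(1, 100000) - math.pi / 4) < 1e-4)         # True
print(arctan_deriv_partial(1, 100), arctan_deriv_partial(1, 101))  # 0 1
```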
0
5 years ago
#16
(Original post by DFranklin)
Power series are special.
I REALLY agree. Life is better after complex numbers.
0