
Continuity question help

Suppose g(-x) = -g(x) for any x. Show that if g is continuous at 0 then g(0)=0
Original post by SWISH99
Suppose g(-x) = -g(x) for any x. Show that if g is continuous at 0 then g(0)=0


Set x=0.
Original post by SWISH99
Suppose g(-x) = -g(x) for any x. Show that if g is continuous at 0 then g(0)=0


In this case g(x) is an odd function, so g(-0) = -g(0). Then it's an easy implication from there.
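
To spell out that implication: setting x=0 in g(-x) = -g(x) gives g(0) = -g(0), so 2g(0) = 0 and hence g(0) = 0.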
Original post by NotNotBatman
In this case g(x) is an odd function, so g(-0) = -g(0). Then it's an easy implication from there.


You haven't mentioned continuity though.
Original post by atsruser
You haven't mentioned continuity though.
Since the result holds without needing continuity, this is perhaps unsurprising.

(Possibly the original question only specified f(x)=-f(-x) for x not equal to 0; the conclusion then requires continuity).
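
(To illustrate that version with a made-up example: take f(x) = x for x ≠ 0 and f(0) = 1. Then f(x) = -f(-x) holds for every x ≠ 0, but f(0) ≠ 0 - and, as you'd expect, f is not continuous at 0. So on that reading of the question, continuity genuinely does some work.)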
Original post by DFranklin
Since the result holds without needing continuity, this is perhaps unsurprising.

(Possibly the original question only specified f(x)=-f(-x) for x not equal to 0; the conclusion then requires continuity).


y=1/x is a counterexample, no? Seems to me that continuity guarantees definedness at x=0.
Original post by atsruser
y=1/x is a counterexample, no? Seems to me that continuity guarantees definedness at x=0.


I agree. The question is probably looking for you to justify letting x=0 using the fact that g is continuous at 0.
Original post by atsruser
You haven't mentioned continuity though.


But the question assumes continuity, so g(x) is defined at x=0.
Original post by atsruser
y=1/x is a counterexample, no?
I wouldn't say it's a counterexample to the question as quoted. Unless told otherwise, there's no reason to assume that 0 is a "special" point as far as being able to apply f(x) = -f(-x), or in terms of f being defined (in fact, there are strong reasons to say it makes no sense at all to suggest f might not be defined at 0).

As I said in my reply, I suspect there's some missing preamble to the question.

Seems to me that continuity guarantees definedness at x=0.
Well yes, but note that the definition of continuity guarantees both that

f(0) is defined

and

lim_{x→0} f(x) = f(0)

In other words, there's no "analysis" required to deduce f(0) is defined - it's directly one of the conditions of continuity.

The only way I see any "meat" here is if as I said before, the question did not mean to state that f(x) = -f(-x) holds at x = 0.

Then if you assume f(0) = ε ≠ 0, by using the definition of continuity and considering points slightly above and slightly below 0, you can deduce a contradiction.
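
Roughly, a sketch of that contradiction (still assuming f(x) = -f(-x) only for x ≠ 0): suppose f(0) = ε > 0 (the case ε < 0 is symmetric). Continuity at 0 gives a δ > 0 with |f(x) - ε| < ε/2 whenever |x| < δ, so f(x) > ε/2 for all such x. Now take any x with 0 < x < δ: then |-x| < δ as well, but f(-x) = -f(x) < -ε/2, which contradicts f(-x) > ε/2.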
Original post by DFranklin
I wouldn't say it's a counterexample to the question as quoted. Unless told otherwise, there's no reason to assume that 0 is a "special" point as far as being able to apply f(x) = -f(-x), or in terms of f being defined (in fact, there are strong reasons to say it makes no sense at all to suggest f might not be defined at 0).

Sorry - short of time. But it seems that we have to interpret the question wording rather widely, else the mention of continuity seems to be irrelevant and the question pretty vacuous - if we assume g(x) is defined at 0, then it's trivial.
Original post by NotNotBatman
But the question assumes continuity, so g(x) is defined at x=0.


I think the question tries to refer, rather awkwardly, to two different g's - the first lot are odd, and the second lot are continuous - and the result only holds on the intersection of the two. Anyway, gotta go - I'll leave you to further learned discussions.
Original post by atsruser
Sorry - short of time. But it seems that we have to interpret the question wording rather widely, else the mention of continuity seems to be irrelevant and the question pretty vacuous - if we assume g(x) is defined at 0, then it's trivial.
But deducing g(x) is defined at 0 from g cts at 0 is also trivial. It's literally part of the definition of continuity.

The OP posted the full question elsewhere; there's not actually any more detail, but from the extra context, I'm fairly sure the lecturer was *aiming* at establishing something along the lines of "if f is defined for x not 0, and f (where defined) is odd, then the only way to define f(0) such that f is cts is f(0)=0".

Unfortunately the actual question asked doesn't say anything about "x not 0", which makes the whole thing meaningless.
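
(For what it's worth, that intended statement does go through in one line with limits: if f(-x) = -f(x) for x ≠ 0 and f is continuous at 0, then f(0) = lim_{x→0-} f(x) = lim_{t→0+} f(-t) = -lim_{t→0+} f(t) = -f(0), so f(0) = 0.)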
