The Student Room Group

Proof question

Let X be a proper subset of R and let f: X to R. Prove (from the definition) that if f(x) tends to L as x tends to infinity then -f(x) tends to -L as x tends to infinity.

My attempt:

(For all epsilon > 0) (There exists k > 0) (For all x in X) x > k implies |f(x) - L| < epsilon

So -epsilon < f(x) - L < epsilon

Similarly, if -f(x) tends to -L as x tends to infinity then

(For all epsilon > 0) (There exists k > 0) (For all x in X) x > k implies |-f(x) + L| < epsilon

So -epsilon < -f(x) + L < epsilon, which is the same as -epsilon < f(x) - L < epsilon

So if f(x) tends to L, -f(x) tends to -L

Is this correct?
Reply 1
Poor troll 2/10
:troll::troll::troll::troll::troll:
Reply 2
Original post by PaigeyPoo
Poor troll 2/10
:troll::troll::troll::troll::troll:


???
Reply 3
Original post by Cggh90
???


Quit wasting bandwidth trolling
Reply 4
Original post by PaigeyPoo
Quit wasting bandwidth trolling


Oops, I didn't post the full question!

EDITED!

Could you reply now? Sorry.
Reply 5
Original post by Cggh90
Oops, I didn't post the full question!

EDITED!

Could you reply now? Sorry.


No I'm too angry
Reply 6
-e < f(x) - L < e
e > -f(x) - (-L) > -e
-f(x) -> -L
Reply 7
Original post by ziedj
-e < f(x) - L < e
e > -f(x) - (-L) > -e
-f(x) -> -L


Cheers. But are you saying mine isn't correct then?
Original post by Cggh90
Cheers. But are you saying mine isn't correct then?

It isn't really, but you started along the right lines. You seem to be trying to prove 'if' and 'only if' at the same time, and the two incomplete halves have met in the middle, but not properly.

Here's the proof:

Let epsilon > 0 be given.

Since f(x) tends to L as x tends to infinity, there exists k > 0 such that, if x > k, then | f(x) - L | < epsilon.

Since the modulus of the negative of a number equals the modulus of the number itself (this is just a shorter way of expressing the line of thinking you seemed to be following), this means that

if x > k then | - ( f(x) - L ) | < epsilon.

Or, to write it another way: if x > k then | (-f)(x) - (-L) | < epsilon.

So, we have shown that, given epsilon > 0, there exists k > 0 such that if x > k then | (-f)(x) - (-L) | < epsilon. This means that (-f)(x) -> -L as x -> infinity.
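If it helps to see the key step concretely, here is a quick numerical sanity check in Python. The function f(x) = 2 + 1/x is a made-up example (not from the question): it tends to L = 2 as x tends to infinity, and the check confirms that the same k that works for f and epsilon also works for -f and -L, because |-(f(x) - L)| = |f(x) - L|.

```python
# Hypothetical example: f(x) = 2 + 1/x tends to L = 2 as x -> infinity.
def f(x):
    return 2 + 1 / x

L = 2.0
eps = 1e-3
k = 1 / eps  # for x > k we have |f(x) - L| = 1/x < eps

for x in [k + 1, 10 * k, 1000 * k]:
    # the hypothesis: f(x) -> L
    assert abs(f(x) - L) < eps
    # the conclusion: (-f)(x) -> -L, with the SAME k
    assert abs(-f(x) - (-L)) < eps
    # the reason it works: modulus of a negative equals modulus of the number
    assert abs(-(f(x) - L)) == abs(f(x) - L)
```

This doesn't prove anything, of course, but it illustrates why no new k needs to be found: the bound for f transfers to -f unchanged.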

(Sorry for the lousy formatting - does anyone know how to use LaTeX on the forum?)
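Since LaTeX came up: the proof above could be typeset in standard LaTeX roughly as follows (the forum's own tag syntax may differ, so treat this as a sketch):

```latex
Let $\varepsilon > 0$ be given. Since $f(x) \to L$ as $x \to \infty$,
there exists $k > 0$ such that
\[
  x > k \implies |f(x) - L| < \varepsilon .
\]
Since $|-a| = |a|$ for every real $a$,
\[
  x > k \implies \left| -\bigl(f(x) - L\bigr) \right|
               = \left| (-f)(x) - (-L) \right| < \varepsilon .
\]
So for every $\varepsilon > 0$ there exists $k > 0$ such that
$x > k \implies |(-f)(x) - (-L)| < \varepsilon$; that is,
$(-f)(x) \to -L$ as $x \to \infty$.
```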
