
Why does dividing a number by zero not result in a value of zero?


Original post by 1 8 13 20 42
You just can't divide a number by zero.

In fields, that is, systems of mathematical objects defined by certain rules, we can perform addition and multiplication. We have things called "inverses". Every field has an identity element, 1, and a zero element, 0. Basically, multiplying by 1 does nothing, and adding 0 does nothing. Every non-zero element in the field has an inverse, i.e. for every non-zero "a" in the field there is a "b" such that ab = ba = 1. But 0 does not have an inverse.

An example of a field is the real numbers. When we "divide" one real number by another, we are really finding an inverse and then applying multiplication. Division is not its own operation; it is shorthand for this process. x/y means: find the inverse of y, then multiply x by that inverse. With 0, we fail at the first hurdle, as 0 does not have an inverse. There is no number x such that 0 * x = 1. Division by 0 cannot be done. It is undefined.

Of course, we don't have to look at things through the field lens. But it's how this stuff is rigorously supported, so I think it's pertinent. Others have already stated more number-orientated intuitive ways of looking at this.
That's something I could put down, right? :smile:
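To make the field explanation above concrete, here is a minimal Python sketch, assuming arithmetic mod 5 (a small finite field) purely for illustration: it searches for a multiplicative inverse of each element and finds exactly one for everything except 0.

```python
# Arithmetic mod 5 forms a small field. For each element a,
# search for a b with a*b = 1 (mod 5), i.e. a multiplicative
# inverse. Every non-zero element has exactly one; 0 has none.
p = 5
for a in range(p):
    inverses = [b for b in range(p) if (a * b) % p == 1]
    print(f"{a}: inverses mod {p} -> {inverses}")

# Output:
# 0: inverses mod 5 -> []    (no b with 0*b = 1, so x/0 is undefined)
# 1: inverses mod 5 -> [1]
# 2: inverses mod 5 -> [3]
# 3: inverses mod 5 -> [2]
# 4: inverses mod 5 -> [4]
```

Division mod 5 would then be defined as x/y = x times the inverse of y, which is exactly why y = 0 has to be excluded.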
Original post by hamza772000
Yeah, I know, it's just the fact that as we keep getting closer to zero the numbers start shooting up and get too high before the denominator actually reaches zero, right?

:smile:


You're kinda right. "Too high" would imply a limit of some sort, but there is no limit. We just observe that as the denominator gets closer to 0, the numbers just go off to infinity. The closer you are to 0, the bigger the number is.
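To see those numbers going off to infinity, here is a minimal Python sketch (illustrative only, with an arbitrary choice of 8 steps) that shrinks the denominator step by step:

```python
# Shrink the denominator toward 0 from above and watch 1/x grow
# without bound; it never settles on any finite value.
x = 1.0
for _ in range(8):
    print(f"1 / {x:g} = {1/x:g}")
    x /= 10

# 1 / 1 = 1
# 1 / 0.1 = 10
# 1 / 0.01 = 100
# ...
# 1 / 1e-07 = 1e+07
```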
Original post by RDKGames
You're kinda right. "Too high" would imply a limit of some sort, but there is no limit. We just observe that as the denominator gets closer to 0, the numbers just go off to infinity. The closer you are to 0, the bigger the number is.
Right, so it just keeps going and never reaches zero as the decimal places just keep increasing?
Reply 43
Original post by hamza772000
Is it because zero has no value? Or because it isn't divisible? Or because anything multiplied by it equals zero? (I know that doesn't sound like the right answer.) Or because it is not consistent with division by other numbers?

Thanks in advance :h:


Yh it's cause Zero has no value and basically represents nothing. If you think about it in terms of the real world you can't divide something by nothing, since nothing technically doesn't exist.

To add to that if you take say 5 chocolates and decided it by 0, you can't get Zero because what you're technically doing is deciding 5 sweets by a value that represents nothing. So the answer can't be nothing (0) since you can't even carry out the calculation.
Original post by Dinah98
Yh it's cause Zero has no value and basically represents nothing. If you think about it in terms of the real world you can't divide something by nothing, since nothing technically doesn't exist.

To add to that if you take say 5 chocolates and decided it by 0, you can't get Zero because what you're technically doing is deciding 5 sweets by a value that represents nothing. So the answer can't be nothing (0) since you can't even carry out the calculation.
I get it, but what do you mean by "decide"?
The reason is this.

$\dfrac{a}{b}$ is defined to be the unique number $c$ such that $cb=a$.
The key thing to note in this definition is that there is an assumption. It assumes that this 'unique number $c$' exists.

Let's try to find $\dfrac{a}{0}$ where $a$ is nonzero. It's the unique number $c$ (if it exists) such that $c \times 0 = a$. But it is axiomatic that $c \times 0 = 0$, so we need the unique number $c$ such that $0 = a$. But $a$ is nonzero, so there does not exist such a number $c$, and so $\dfrac{a}{0}$ doesn't actually have a definition. It doesn't exist. If we try to assign a value to $\dfrac{a}{0}$ by our original definition of division then we are making a false assumption (namely that $c$ exists).

What's interesting to note is the case $a=0$. Then we require the unique number $c$ (if it exists) such that $c \times 0 = 0$. But this is true for any $c$, so $c$ is not unique, and so $\dfrac{0}{0}$ is undefined as well (this time for contradicting the uniqueness part of 'there exists a unique number $c$').
In fact we call the expression $\dfrac{0}{0}$ indeterminate, since any $c$ satisfies the required equation $c \times 0 = 0$.
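This definition lends itself to a tiny brute-force check. Below is a minimal Python sketch, assuming we only search a small range of integers (the divide helper and its domain are made up for illustration, not how real division works), which reports all three cases from the post:

```python
# Toy "division" following the definition above: a/b is the
# unique c with c*b == a. We brute-force c over a small range,
# which is enough to see all three cases.
def divide(a, b, domain=range(-10, 11)):
    candidates = [c for c in domain if c * b == a]
    if len(candidates) == 1:
        return candidates[0]                      # unique c exists: defined
    if not candidates:
        return f"undefined: no c with c*{b} == {a}"
    return f"indeterminate: every c satisfies c*{b} == {a}"

print(divide(6, 3))   # 2             (unique c with c*3 == 6)
print(divide(5, 0))   # undefined     (c*0 is always 0, never 5)
print(divide(0, 0))   # indeterminate (c*0 == 0 for every c)
```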
Original post by hamza772000
Right, so it just keeps going and never reaches zero as the decimal places just keep increasing?


That depends on what you are trying to do. Sure, it can reach 0. We simply use the decreasing decimals to observe what happens as we approach 0; it's a mathematical technique that avoids simply putting 0 in as the denominator. Sure enough, we can see that as the denominator goes to 0, the result goes up. So yeah, you're kinda right again, but it can reach zero, at which point our answer is undefined.
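And if the denominator actually does reach 0, a language like Python (just as one concrete example) simply refuses to produce a number, which matches "undefined":

```python
# Once the denominator is exactly 0, there is no value to return:
# Python raises an error instead of answering.
try:
    print(1 / 0)
except ZeroDivisionError as err:
    print("undefined:", err)   # undefined: division by zero
```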
Reply 47
Original post by WhoDaresWins
An infinite number of zeros go into that number.


Makes good sense
Original post by RDKGames
That depends on what you are trying to do. Sure, it can reach 0. We simply use the decreasing decimals to observe what happens as we approach 0; it's a mathematical technique that avoids simply putting 0 in as the denominator. Sure enough, we can see that as the denominator goes to 0, the result goes up. So yeah, you're kinda right again, but it can reach zero, at which point our answer is undefined.
Yeah, but you could just keep going on with increasing the decimal places, no? There isn't a set amount of decimal places to go before you reach 0, right? But yh again, that's only if you're going down that route; you could just put in a zero as the denominator...
Reply 49
Original post by IrrationalRoot
The reason is this.

$\dfrac{a}{b}$ is defined to be the unique number $c$ such that $cb=a$.
The key thing to note in this definition is that there is an assumption. It assumes that this 'unique number $c$' exists.

Let's try to find $\dfrac{a}{0}$ where $a$ is nonzero. It's the unique number $c$ (if it exists) such that $c \times 0 = a$. But it is axiomatic that $c \times 0 = 0$, so we need the unique number $c$ such that $0 = a$. But $a$ is nonzero, so there does not exist such a number $c$, and so $\dfrac{a}{0}$ doesn't actually have a definition. It doesn't exist. If we try to assign a value to $\dfrac{a}{0}$ by our original definition of division then we are making a false assumption (namely that $c$ exists).

What's interesting to note is the case $a=0$. Then we require the unique number $c$ (if it exists) such that $c \times 0 = 0$. But this is true for any $c$, so $c$ is not unique, and so $\dfrac{0}{0}$ is undefined as well (this time for contradicting the uniqueness part of 'there exists a unique number $c$').
In fact we call the expression $\dfrac{0}{0}$ indeterminate, since any $c$ satisfies the required equation $c \times 0 = 0$.


I am sure the OP is enlightened. I wish I was.
Ha ha ha ha ha :colone:
Original post by hamza772000
Yeah, but you could just keep going on with increasing the decimal places, no? There isn't a set amount of decimal places to go before you reach 0, right? But yh again, that's only if you're going down that route; you could just put in a zero as the denominator...


Yeah you just keep going forever with the decimal places and you'll never reach 0, but you'll get extremely close to it.
Original post by IYGB
I am sure the OP is enlightened. I wish I was.
Ha ha ha ha ha :colone:


Well, tbh it is the explanation, and although possibly a bit intimidating at first glance, it should be quite understandable if one tries.
And yeah, there really is no other rigorous explanation that isn't equivalent to this.
Reply 52
Original post by hamza772000
I get it, but what do you mean by "decide"?


Sorry I meant divide but autocorrect changed it to 'decide' for some reason -_-


Posted from TSR Mobile
Original post by RDKGames
Yeah you just keep going forever with the decimal places and you'll never reach 0, but you'll get extremely close to it.
I get it! :smile:

Original post by IrrationalRoot
The reason is this.

$\dfrac{a}{b}$ is defined to be the unique number $c$ such that $cb=a$.
The key thing to note in this definition is that there is an assumption. It assumes that this 'unique number $c$' exists.

Let's try to find $\dfrac{a}{0}$ where $a$ is nonzero. It's the unique number $c$ (if it exists) such that $c \times 0 = a$. But it is axiomatic that $c \times 0 = 0$, so we need the unique number $c$ such that $0 = a$. But $a$ is nonzero, so there does not exist such a number $c$, and so $\dfrac{a}{0}$ doesn't actually have a definition. It doesn't exist. If we try to assign a value to $\dfrac{a}{0}$ by our original definition of division then we are making a false assumption (namely that $c$ exists).

What's interesting to note is the case $a=0$. Then we require the unique number $c$ (if it exists) such that $c \times 0 = 0$. But this is true for any $c$, so $c$ is not unique, and so $\dfrac{0}{0}$ is undefined as well (this time for contradicting the uniqueness part of 'there exists a unique number $c$').
In fact we call the expression $\dfrac{0}{0}$ indeterminate, since any $c$ satisfies the required equation $c \times 0 = 0$.
I'm sorry, I don't get that :/ I don't think I need to know it to that depth anyways, I feel bad now, Sorry! :s-smilie:
Original post by Dinah98
Sorry I meant divide but autocorrect changed it to 'decide' for some reason -_-


Posted from TSR Mobile
Oh, no worries, makes sense now :smile:
Original post by hamza772000
I get it! :smile:

I'm sorry, I don't get that :/ I don't think I need to know it to that depth anyways, I feel bad now, Sorry! :s-smilie:


It's ok, but without being too harsh, if you don't understand it (or something equivalent to this argument) then you don't quite understand why division by 0 isn't possible. But having a general intuition for why is reasonable, and I can see why it might not be important to fully understand.

Don't feel bad btw, I didn't just post it for you but for anyone looking at the thread who wanted to know the exact reason :smile:.
Original post by IrrationalRoot
It's ok, but without being too harsh, if you don't understand it (or something equivalent to this argument) then you don't quite understand why division by 0 isn't possible. But having a general intuition for why is reasonable, and I can see why it might not be important to fully understand.

Don't feel bad btw, I didn't just post it for you but for anyone looking at the thread who wanted to know the exact reason :smile:.
Yeah :smile:

Thanks :h:
$\displaystyle \frac{0}{0}=1$
Original post by EricPiphany
$\displaystyle \frac{0}{0}=1$
:tongue:
If you had one piece of cake, and someone said "divide that by two" then we'd have two pieces - however, if you have one piece, and someone says "divide that by zero" you don't then have zero pieces of cake, you still have one piece. That's the practical answer, anyway, but as we all know, mathematical equations don't always have to have a practical application :lol:
