
Mathematics

All throughout our educational careers, we are taught not to divide by zero. Death upon him who divides by zero. If you punch it into a calculator, you get an error or "undefined." But what I want to ask is whether we can display this error. In reality we can divide by certain amounts. If I have four apples and I want to "divide by two," I must split the apples into two even groups. I can do this for any real number. But is there a realistic model in which we can divide by zero? If I get the error on a calculator, can I get that error in real life, so that this apple will simply vanish, or, God forbid, time and space unravel? I think there has to be some realistic model to divide my four apples into zero baskets.
Accepted:
August 14, 2007

Comments

Allen Stairs
August 16, 2007

A couple of thoughts. The first is that even though arithmetic may have been inspired by things that we do when we arrange objects like apples and baskets, arithmetic isn't "about" those concrete operations. On the contrary: suppose we "add" one rabbit to another and get 10 rabbits. Then we simply don't count what we did with the rabbits (or what the rabbits did) as the arithmetic addition operation. Likewise, if I "add" one drop of water to another, I'll get one drop. But that doesn't give us an exception to "1+1=2". Rather, we say that the sort of "adding" we do when we plop one drop on top of another isn't arithmetic adding. We could give some more or less arbitrary "operational definition" of some kind of real-world "dividing" of four apples into zero baskets, but it wouldn't tell us anything about arithmetic.

There's another point. If dividing 4 by zero is going to make any sense, then the result can't be a real number (i.e., member of the set of reals). Why not? Because no matter how big a real number we pick, it won't be big enough. As we divide 4 by smaller and smaller positive numbers, we get a bigger and bigger result, without limit. So if there's any sense to be made of division by zero, 4/0 will have to be an infinite number. That's not a problem by itself. There's lots of math about infinities. And in fact, we can distinguish different infinite numbers. But which infinite number should 4/0 be? And should it be the same one as 5/0?
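A minimal numeric sketch of the point above (an illustration, not from the answer itself): dividing 4 by ever-smaller positive numbers yields ever-larger quotients with no finite bound, and at zero itself a computer, like a calculator, simply refuses.

```python
# Divide 4 by shrinking positive divisors (exact powers of two,
# so the floating-point results are exact as well).
divisors = [2 ** -k for k in range(5)]   # 1, 0.5, 0.25, 0.125, 0.0625
quotients = [4 / d for d in divisors]
print(quotients)  # [4.0, 8.0, 16.0, 32.0, 64.0] -- growing without limit

# At zero itself there is no real-number answer, so Python raises an error,
# just as a calculator displays one.
try:
    4 / 0
except ZeroDivisionError as e:
    print("error:", e)
```

The quotients double at every step; no real number is large enough to be the limit, which is why any candidate value for 4/0 would have to be infinite.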

Well, if a/z = b, then a = z*b. So presumably, we would have to say that if 4/0 = b, then 4 = 0*b = 0. By the same reasoning, 5 = 0. And so 4 and 5 are equal, and both equal to 0. Not a very appetizing conclusion! If division by zero is to be more than just talk, then we need some sort of consistent account of how it could work. And it's hard to see how there could be one.
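The algebra in the paragraph above can be written out in one chain:

```latex
\frac{a}{z} = b \;\Longleftrightarrow\; a = z \cdot b
\quad\Longrightarrow\quad
\frac{4}{0} = b \;\Rightarrow\; 4 = 0 \cdot b = 0,
\qquad
\frac{5}{0} = b' \;\Rightarrow\; 5 = 0 \cdot b' = 0,
\quad\Longrightarrow\quad 4 = 5.
```

Since 0 times any number is 0, any assignment of values to 4/0 and 5/0 collapses all numbers into equality with zero, which is the inconsistency the answer points to.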

There are lots of exotic possibilities for thinking about numbers that go beyond our usual notions. Let's use "#" as a symbol for the infinite number Aleph-null -- the number of integers. The mathematician John Conway devised a number system that allows us to perform operations such as 6*#, 47/#, 17+# and so on. But even in Conway's system of "surreal numbers," there's no sense to be made of division by zero, for the very good reason that there just doesn't seem to be any sense to be made of it.

So if you can come up with a sensible system of arithmetic that allows division by zero, you may have a bright future before you as a mathematician. (Maybe you'll even win the Fields Medal -- the "Nobel Prize" of mathematics.) But the prospects for pulling it off seem rather dim.

Source URL: https://askphilosophers.org/question/1759
© 2005-2025 AskPhilosophers.org