
Mathematics

I'm sure the mathematical anomaly that .999 repeating equals 1 has been brought up, but I was wondering what you think of it. Why is this possible?

x = .999... (repeating)
therefore 10x = 9.999... (repeating)
Subtract one x from the 10x:
10x = 9.999...
− x = 0.999...
and you get 9x = 9; divide both sides by 9 and x = 1.

I was wondering if you could explain why this happens. Does it show a flaw in our math system? Or is it just a strange occurrence that should be overlooked? Or is it true?
Accepted:
October 13, 2005

Comments

Daniel J. Velleman
October 14, 2005

Yes, it is true that .9999... = 1, and there's nothing paradoxical about it. But to see why that is, you need to think about the meaning of decimal notation.

Consider a decimal number of the form:

0.d1 d2 d3 d4 ...

where each of d1, d2, d3, ... is one of the digits from 0 to 9. Of course, d1 is in the tenths place, d2 is in the hundredths place, and so on. What this means is that the number represented by this decimal notation is:

d1/10 + d2/100 + d3/1000 + ...
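This finite version of the rule can be sketched in Python; the helper name decimal_value is my own, not something from the answer, and exact fractions are used so no rounding intrudes:

```python
from fractions import Fraction

def decimal_value(digits):
    """Value of 0.d1 d2 d3 ... for a FINITE digit list, as an exact fraction.

    Implements d1/10 + d2/100 + d3/1000 + ... term by term.
    """
    return sum(Fraction(d, 10 ** (i + 1)) for i, d in enumerate(digits))

# 0.125 is 1/10 + 2/100 + 5/1000, which simplifies to 1/8.
print(decimal_value([1, 2, 5]))   # -> 1/8
```

With finitely many digits there is nothing subtle here; the question of what the notation means only arises when the digit list never ends, which is the case taken up next.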

But if the list of digits goes on forever, then this summation goes on forever, so now we have to ask what an infinite summation means. You can never finish adding up infinitely many numbers, so we can't just say that this is what you get when you finish adding up all of the infinitely many numbers. Here's how mathematicians define this infinite sum: Start with d1/10, then add on d2/100, then add on d3/1000, and so on. The process never ends, so you will never actually get to the answer. However, as you add more and more terms, you should get closer and closer to the answer. So mathematicians define the infinite sum to be the number that you get closer and closer to as you add on more and more terms of the infinite sum. (Those who have studied infinite series in calculus should recognize this.)

In the case of the number 0.9999..., this means that we start with 9/10 = 0.9, then we add on 9/100, which gives us 0.99, then we add on 9/1000, which gives us 0.999, and so on. The definition of the notation "0.999..." is that it is the number that the numbers 0.9, 0.99, 0.999, ... are getting closer and closer to, and that number is 1.
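One can watch this happen numerically. A minimal Python sketch (the loop bound 7 is an arbitrary choice of mine) builds the partial sums 0.9, 0.99, 0.999, ... with exact fractions and shows that the gap to 1 after n digits is exactly 1/10^n, which shrinks toward 0:

```python
from fractions import Fraction

x = Fraction(0)
for n in range(1, 8):
    x += Fraction(9, 10 ** n)        # append the next 9: 0.9, 0.99, 0.999, ...
    gap = 1 - x                      # the gap to 1 is exactly 1/10**n
    print(n, x, gap)
```

No partial sum ever equals 1, but the gap can be made smaller than any positive number you name, and that is precisely what the definition of the infinite decimal requires.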

I think the reason many people are bothered by this is that when you keep adding on more and more 9s, you never get to 1. That's true, but if you're bothered by that then you're misunderstanding the meaning of decimal notation. You're not supposed to get to 1. To say that 0.999... is equal to 1 just means that as you add on more and more 9s, you get closer and closer to 1, and that's true--and not very surprising.

By the way, there's nothing special about the number 0.999... here. To say that pi = 3.1415926... means that if you start with 3, and then add 1/10, and then add 4/100, and so on, then you will get closer and closer to pi. You will never get to pi, because the digits go on forever. But we still say that 3.1415926... is equal to pi, because what "3.1415926..." means is "the number you get closer and closer to as you add on more and more of the digits of this infinite decimal expansion", and that number is (exactly) pi.
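The same convergence can be sketched for pi. Assuming the familiar leading digits 3.1415926 (only these eight are used; nothing beyond them is filled in), each added digit brings the partial sum closer to pi, here compared against Python's math.pi:

```python
import math

digits = [3, 1, 4, 1, 5, 9, 2, 6]    # leading digits of pi = 3.1415926...
s = 0.0
for i, d in enumerate(digits):
    s += d / 10 ** i                 # partial sums: 3, 3.1, 3.14, 3.141, ...
    print(s, abs(math.pi - s))       # the error shrinks as digits are added
```

After all eight digits the error is already below one part in ten million, and adding further digits would shrink it without bound; pi itself is the number these partial sums close in on.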

Source URL: https://askphilosophers.org/question/181?page=0
© 2005-2025 AskPhilosophers.org