
Mathematics

When we write "one and a half" in decimal notation as 1.5, do we really mean 1.5000... (with an infinity of zeros)? If so, how do we know there's no "3" lurking at the 10 millionth decimal place? Is this a problem of converting an analogue world into digital mathematics?
Accepted:
November 3, 2005

Comments

Daniel J. Velleman
November 3, 2005

Yes, "one and a half" means 1.5000..., with infinitely many zeros. How do we know there's no "3" lurking at the 10 millionth decimal place? Well, it depends on how the number is being specified. If you are the one specifying the number, and you say that the number you have in mind is one and a half (exactly), or you say that it is 1.5000... and you add (as you did) that the decimal expansion you have in mind is the one in which the zeros go on forever, then you've already answered your own question--you have specified that there's no 3 lurking at the 10 millionth decimal place. If, on the other hand, the number was the result of a physical measurement, then there will surely be some amount of error in the measurement. So if the readout on the measuring instrument says "1.5000", then most likely the quantity being measured is not exactly one and a half, and there is some nonzero digit lurking somewhere--perhaps in the very next digit. If someone else tells you something about a number, and he tells you that the number is 1.5000, then there might be some ambiguity. Most likely he was telling you the number only to the accuracy of the number of digits he specified, so there could be a nonzero digit lurking somewhere in a later decimal place. This is certainly the way scientists communicate numbers to each other. But if there's any doubt, you could ask: "Did you mean 1.5 exactly, or did you mean 1.5000 only to the accuracy specified?"

Source URL: https://askphilosophers.org/question/396