One of the obvious ways computers are limited is in their representation of numbers. Since computers represent numbers as bit strings of finite length, they can only represent finitely many of them, and only to a finite degree of precision. Is it a mistake to think that humans, unlike computers, can represent infinitely many numbers with arbitrary precision? We obviously talk about things like the set of all real numbers, and we make use of symbols, like the letter pi, which purport to represent certain irrational numbers exactly. But I'm not sure whether things like this really do show that we can represent numbers in a way that is fundamentally beyond computers.
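(As a minimal illustration of the finite-precision point, not part of the original question: standard 64-bit floating-point numbers, as used by Python and most other languages, can only store approximations of most real numbers. The example below is hypothetical and only demonstrates that claim.)

```python
# Sketch: standard Python floats are 64-bit IEEE 754 doubles, so they hold
# approximations of most reals, not the reals themselves.
import math
from decimal import Decimal

# math.pi is not the irrational number pi, just the nearest representable double.
print(math.pi)            # 3.141592653589793
print(Decimal(math.pi))   # the exact binary value actually stored, which is not pi

# 0.1 has no exact binary representation, so familiar arithmetic identities fail.
print(0.1 + 0.2 == 0.3)   # False
print(Decimal(0.1))       # 0.1000000000000000055511151231257827021181583404541015625
```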
