"Why is 0.999 the same thing as 1?"
Okay, so this is actually a pretty interesting (and bloody complex) question, so thanks for kicking me off with a difficult one. I hope the wall of text doesn't scare everyone away - not all questions are as fundamental to the core of a subject as this one!
So, firstly, what you wrote there is definitely not equal to 1, but 0.999..., where the "..." implies that the "9" is infinitely recurring, is.
There are different ways to prove this; the following, for reasons of elegant simplicity, is my favourite:
x = 0.999...
10x = 9.999...
10x - x = 9.999... - 0.999...
9x = 9
x = 1

And since we set x = 0.999... at the start, that means 0.999... = 1.
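If you'd like to see the trick in action without just trusting the algebra, here's a quick sanity check on finite truncations, using Python's exact Fraction type so no floating-point rounding sneaks in (x_n is just my shorthand for 0.9...9 with n nines):

```python
# Check the "multiply by 10 and subtract" trick on finite truncations.
# Exact rational arithmetic, so no floating-point rounding.
from fractions import Fraction

for n in (1, 5, 10, 20):
    x_n = 1 - Fraction(1, 10**n)   # 0.9, 0.99999, ... (n nines)
    print(n, 10 * x_n - x_n)       # 9*x_n = 9 - 9/10^n, creeping up to 9
```

The leftover 9/10^n is exactly the piece the infinitely recurring nines wipe out: with finitely many nines there is always a remainder; in the limit there is none.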
My other favourite proof involves limits and infinite sequences, but that's slightly more complicated for me to explain (particularly considering the notation doesn't work in this sort of text format) and, I feel, doesn't actually address your question. These proofs don't answer your question of why, which is the important part; they simply demonstrate that it is.
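For those who can stomach the notation anyway, here is one standard way that limit argument goes (a sketch of the idea in LaTeX, treating 0.999... as an infinite geometric series):

```latex
0.999\ldots \;=\; \sum_{k=1}^{\infty} \frac{9}{10^{k}}
         \;=\; \frac{9/10}{1 - 1/10} \;=\; 1
```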
The answer to your question lies in the nature of infinity itself. We tend to view infinity as a number - a number further away than we can possibly ever reach, beyond any other number, but still at the theoretical "end" of the number line. Infinity, however, is a concept. It is entirely separate from numbers and the number line. Phrases like "nearly infinite" often annoy mathematicians - something cannot be nearly infinite; either it is infinite or it is not. In this case, it is.
If the "9" digit in "0.999..." is
infinitely recurring, then considering 1 - 0.999... = x, there is naught x can be but 0 (implying that 1 = 0.999...). There is no real number that exists between 0.999... and 1, no space between them, and thus they must be one and the same. There is no "0.000...0001" left over when you make the subtraction, because those recurring nines continue for eternity. If the difference between two numbers is zero, they are the same. A more mathematical way of representing this is
here.
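Chopping the nines off after n digits makes the subtraction precise:

```latex
1 - \underbrace{0.99\ldots9}_{n\ \text{nines}} \;=\; \frac{1}{10^{n}},
\qquad\text{so}\qquad
1 - 0.999\ldots \;=\; \lim_{n \to \infty} \frac{1}{10^{n}} \;=\; 0
```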
Of course, we can go far, far deeper. In the end, whether two numbers are equal or not is entirely dependent on how we define those numbers. The reason 0.999... = 1 seems so counter-intuitive is simply because of the standard way we are taught to define and interpret numbers. In reality, numbers do not simply exist; we created and defined every number in our number system. But in a standard mathematics education, we never really receive a rigorous definition of what a real number actually is - what rules and axioms are in place to define them. In other words: the question is not really "why is 0.999... equal to 1?", but rather "what is our definition of the numbers involved, or of the real numbers as a whole?".

The argument above, about there being no space between the two numbers 0.999... and 1, depends on the fact that we do not allow infinitesimally small numbers - "if a non-negative number is smaller than every positive number, it is zero", a consequence of the Archimedean property. We say that if the distance between two numbers is infinitesimally small, the distance between them is zero, and thus they are the same. But, well, who decided that infinitesimally small numbers don't exist? Who made the rules, and what are they?
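For reference, the Archimedean property is usually stated along these lines:

```latex
\forall x \in \mathbb{R},\; x > 0 \;\Rightarrow\; \exists n \in \mathbb{N} : \tfrac{1}{n} < x
```

One consequence: if 0 ≤ x < 1/n for every natural number n, then x = 0 - which is precisely the "no space between them" argument above.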
How do we define numbers? More specifically: "Assuming 0.999... and 1 are the same by our definitions of those numbers, what are those definitions; how are those numbers defined such that they are the same?"
The question to start with is: who says that two numbers having zero difference makes them equal? That's something we pretty much just decided arbitrarily, because in almost all cases any two numbers with zero difference are functionally identical. Two-thirds and ten-fifteenths are different numbers, but they are functionally identical in terms of their value and they have a difference of zero, so we declare that they are equal. We are doing the same in the case of 0.999... and 1. They are different numbers but have zero difference between them, so we say they are equal. All these "we say"s are basically one way of defining the real numbers - the "we say"s are the rules in place that decide what is or is not a real number and how they all interact. So, in essence, 0.999... and 1 are equal just because we say they are, and the reason we say that they are is because they function identically, have a difference of zero, and we can prove it with methods that fit our rules (such as the ones mentioned near the start of this post).
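You can even watch that happen in exact arithmetic - a quick Python sketch (the built-in Fraction type reduces both representations to the same canonical form, much as we declare them equal):

```python
# Different representations of the same value: zero difference, so equal.
from fractions import Fraction

a = Fraction(2, 3)
b = Fraction(10, 15)   # reduced internally to 2/3
print(a == b)          # True
print(a - b)           # 0
```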
However... while this is distinctly different from "because we say so" for multiple reasons, it still feels a little unsatisfactory. There are obviously other ways of constructing what we identify as the real numbers, such as Dedekind cuts and Cauchy sequences, but those are way above my pay grade, and as it is currently almost half past midnight, I will do my research into them another day and return here to explain in depth how 0.999... = 1 under those definitions of what real numbers are.
My current limited understanding (beware, I've not simplified this much - I'll do that when I return to these, so you may want to skip the next couple of paragraphs) is that using Dedekind cuts, a real number is defined as the set (a group/list) of all rational numbers (numbers expressible as a fraction a/b) lower than itself, and that when you use this method to define 0.999... and to define 1, you find the sets are identical and so the numbers themselves are identical.

Defining real numbers using Cauchy sequences, however... a Cauchy sequence is one in which the numbers in the sequence eventually become arbitrarily close to one another as the sequence goes on, like (0.1, 0.11, 0.111, 0.1111, ...) - the terms bunch ever closer together as the sequence continues. You can define every real number as the limit of a Cauchy sequence of rational numbers. Therefore, if the limit of the difference between two sequences is 0, then the two numbers defined by those sequences are the same. You can express 1 as the Cauchy sequence (1, 1, 1, 1, ...), and 0.999... as the Cauchy sequence (0, 0.9, 0.99, 0.999, ...). Subtracting these sequences term by term, you end up with (1, 0.1, 0.01, 0.001, ...), and the limit of this sequence is obviously 0. As the limit of the difference between the two sequences is 0, the two numbers represented by those sequences must be the same - namely, 0.999... and 1.
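Here's a small Python sketch of that Cauchy-sequence subtraction in exact arithmetic (ones and nines are just my names for the two sequences above):

```python
# 1 as the sequence (1, 1, 1, ...), 0.999... as (0, 0.9, 0.99, ...),
# and their term-by-term difference (1, 0.1, 0.01, ...) heading to 0.
from fractions import Fraction

def ones(n):
    return Fraction(1)              # every term is exactly 1

def nines(n):
    return 1 - Fraction(1, 10**n)   # n nines after the decimal point

for n in range(6):
    print(n, ones(n) - nines(n))    # 1/10^n, shrinking towards 0
```

No individual difference is ever 0, but the limit of the differences is 0, and that is all the Cauchy-sequence definition of equality asks for.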
And, as a final note... here is where it gets sort of trippy: depending on how you define real numbers (or, rather, which number system you use), 0.999... might not equal 1. They are equal in the number system we use in our daily lives and that most mathematicians use most of the time, but there are ways of defining numbers under which they are not equal. There are entire number systems, like the hyperreal numbers, that assume infinitesimal numbers exist, and because of that, 0.999... (suitably interpreted) does not equal 1 in them. As ever, it all depends on your definition. But that is a topic for another day!