One of the nice proofs reproduced in the book was the one showing that the square root of 2 is not a rational number; that is, that it can't be represented as a fraction (a/b). The proof is short and easy to follow (find it on Wikipedia) and the result is satisfying because of its sheer scope. Despite there being literally an infinite quantity of fractions, you will never find one that is the exact square root of 2. And you can establish that with absolute certainty without even breaking into a mental jog, the proof is that simple.
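For anyone who doesn't feel like chasing the link, here is the classic argument in outline (my paraphrase of the standard proof, not the book's exact wording):

```latex
% Suppose, for contradiction, that \sqrt{2} is rational:
% \sqrt{2} = a/b with integers a, b sharing no common factor.
\sqrt{2} = \frac{a}{b}
\;\Rightarrow\; a^2 = 2b^2
\;\Rightarrow\; a^2 \text{ is even}
\;\Rightarrow\; a = 2k
\;\Rightarrow\; 4k^2 = 2b^2
\;\Rightarrow\; b^2 = 2k^2
\;\Rightarrow\; b \text{ is even too.}
```

But a and b were assumed to share no common factor, so they can't both be even: contradiction, and the fraction a/b can't exist.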
Anyway, when I was reacquainted with this proof I remembered that many years ago I was really not that impressed by it. There seemed to me to be an obvious intuitive argument that it had to be so. It's all a question of there not being enough space on the number line to fit in all those square-roots.
Think about it: from 1 through 9, say, there are an infinite quantity of intermediate numbers; and if we map each of that infinite multitude to its square root (or higher root, for that matter), the number line of the roots is squashed up relative to the number line from which it originates (it ranges only from 1 through 3 in this example). It seemed obvious to me that there simply couldn't be enough room on the number line for every number between 1 and 9 to map to its own unique root in the range 1 to 3. After all, there may be an infinite quantity of numbers between 1 and 3, but the infinite quantity of numbers between 1 and 9 must be greater, mustn't it?
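A few lines of Python make the squashing visible (a toy sketch of my own, not anything from the book): sample some numbers between 1 and 9 and see where their square roots land.

```python
import math

# Toy sketch: map sample numbers in [1, 9] to their square roots.
# The whole range gets "squashed" into [1, 3].
for x in [1, 2, 4, 6.25, 9]:
    print(f"{x:>5} -> sqrt = {math.sqrt(x):.4f}")
```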
The standard proof that root-2 isn't rational is certainly more definitive than my (admittedly informal) argument; but mine seems to rule out exact cube roots, fifth roots, and so on all the way up, not just square roots. Or at least, so it seems at first glance.
Another way of coming at this question is to consider what the decimal expansion would look like for the square root of an integer that is not a square number (1, 4, 9 etc.). If that root could be written as an exact (terminating) decimal, it would have to end with a non-zero digit after the decimal point. And when squared, that final digit would need to produce a zero for the square to be an integer. But when you square the digits 1 through 9, none of them ends in zero; the square of a terminating decimal in fact has exactly twice as many digits after the decimal point as the putative root itself, with a non-zero final digit. Hence, no integer other than a square number has a square root that can be expressed as an exact decimal; and in fact, no number with a single digit after the decimal point has such a root either.
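The last-digit claim is easy to check mechanically. Here is a small Python sketch (my own illustration, again not from the book) that squares each digit and then squares a few terminating decimals exactly:

```python
from decimal import Decimal

# No digit 1-9 squares to something ending in 0, so a terminating
# decimal can never shed its final digit when squared.
for d in range(1, 10):
    print(d, "squared ends in", (d * d) % 10)

# Squaring a terminating decimal doubles the number of digits
# after the decimal point (Decimal keeps the arithmetic exact):
for s in ["1.4", "1.41", "1.414"]:
    x = Decimal(s)
    print(s, "squared is", x * x)
```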
Anyway, glad to have got this off my chest after all these years. In writing it down, I suddenly thought about how we express numbers that aren't whole. The whole numbers each have their own unique symbol(s), whereas the non-whole numbers are expressed only as the output of a computation on whole numbers (a/b); they don't have their own symbolic representation, and I'm not sure it would ever be useful for them to. Which makes me wonder how real they really are; or, as the 19th-century mathematician Leopold Kronecker put it: "God made the integers, all the rest is the work of man."
AH