Post by drystyx on May 8, 2020 21:49:20 GMT
It was way back in high school math where it was explained why .999... = 1.
A. 2/3 + 1/3 = 1. Call this A.
B. 2/3 = .666.... Call this B.
C. 1/3 = .333.... Call this C.
D. .333... + .666... = .999.... Call this D.
All of these are defined.
Because of B and C, 2/3 and 1/3 are interchangeable with .666... and .333....
Therefore, A can be interchanged with D.
Therefore, A can be replaced with .666... + .333... = 1.
Therefore, .999... = 1.
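For what it's worth, the steps above can be checked with exact rational arithmetic. Here's a Python sketch using the standard fractions module; the n-digit truncations are my own illustration, not part of the original argument:

```python
from fractions import Fraction

# Step A needs no decimals at all: exact rational arithmetic gives 1.
assert Fraction(2, 3) + Fraction(1, 3) == 1

# Steps B-D: truncating .333..., .666..., and their sum .999... at n
# digits leaves a gap below 1 of exactly 1/10^n, which shrinks toward
# zero as n grows -- the precise sense in which .999... equals 1.
for n in (1, 5, 10):
    third = Fraction(int("3" * n), 10**n)       # .333...3, n digits
    two_thirds = Fraction(int("6" * n), 10**n)  # .666...6, n digits
    gap = 1 - (third + two_thirds)
    print(n, gap)  # gap is 1/10**n each time
```

The point of the truncations: no finite number of 9s equals 1, but the shortfall can be made as small as you like, which is exactly what the "..." notation means.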
Now, this is how I was taught. It does seem a bit oxymoronic, and I'm not a fanatic about it, but it's mathematical.
Now, let's get to the apparent fallacy in this principle. The fallacy would be in B and C: in the base 10 that we use, 1/3 and 2/3 have no terminating expansion, so .333... and .666... need infinitely many digits, and it feels like they aren't really definable.
To get a terminating answer, we'd need a base of 9 instead of 10, in which case we may see the true answer: in base 9, 1/3 is just .3 and 2/3 is just .6, with no repeating digits. However, we'd be best to use a base of something like 18, since in base 9 the digit 9 doesn't exist and .999... would not even be usable.
Then we might expect to see a fallacy in .999... = 1. But there isn't one, IMO, because changing the base just gives the same fractions different digits. In base 18, it would be different digits again. If one wants to work this out, let me know how it turns out. I'm too lazy.
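If anyone does feel like working it out, here's a sketch that expands a fraction digit by digit in any base (the to_base helper is my own hypothetical name, not standard library). It shows that 1/3 and 2/3 terminate in base 9 but repeat forever in base 10:

```python
from fractions import Fraction

def to_base(frac, base, max_digits=12):
    """Digits after the point of a fraction in [0, 1) in the given base;
    stops early if the expansion terminates."""
    digits = []
    for _ in range(max_digits):
        frac *= base
        d = int(frac)       # next digit
        digits.append(d)
        frac -= d           # keep the remaining fractional part
        if frac == 0:
            break           # expansion terminates
    return digits

print(to_base(Fraction(1, 3), 9))   # [3] -> 1/3 is exactly .3 in base 9
print(to_base(Fraction(2, 3), 9))   # [6] -> 2/3 is exactly .6 in base 9
print(to_base(Fraction(1, 3), 10))  # [3, 3, 3, ...] never terminates
```

And in base 9, .3 + .6 carries straight to 1.0, so the repeating-digit subtlety disappears entirely; the base changes the digits, not the fractions.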