1 is NOT equal to 0.
That's like saying apples are bananas; it's false by definition.
Informally, 1 divided by 0 "is infinity", which would make 0 times infinity equal to 1, a far more interesting relationship. More interesting still is that there are infinitely many different infinities! This relates to the fact that there are infinitely many different paths to the origin from any point other than the origin.
I hope I confused you enough.
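A minimal Python sketch (the sample x values are arbitrary) of why "0 times infinity" has no single value: shrink x toward 0 while 1/x and 2/x blow up, and the product depends entirely on the path taken.

# As x -> 0, x * (1/x) hugs 1 while x * (2/x) hugs 2:
# the product "0 * infinity" depends on how you approach it.
for x in [0.1, 0.01, 0.001, 1e-9]:
    print(x, x * (1 / x), x * (2 / x))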
2006-09-06 23:20:15 · answer #1 · answered by Absent Glare 3 · 1⤊ 0⤋
Absolutely wrong! It was decided a long time ago that 0 is a landmark among numbers, a basis for representing something that does not exist. It is the only number that, whether you add, subtract, multiply, or divide with it, will not give more than itself or the initial number.
Example:
1 x 0 = 0
0/1 = 0
1+0 = 1
1-0 = 1
How could such a number be equal to 1? No way, pal.
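A quick Python check of the four identities above (the sample values for n are arbitrary):

# Verify 'n x 0 = 0', '0/n = 0', 'n + 0 = n', 'n - 0 = n'.
for n in [1, 7, -3]:
    assert n * 0 == 0
    assert 0 / n == 0
    assert n + 0 == n
    assert n - 0 == n
print("all four identities hold")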
2006-09-06 23:25:47 · answer #2 · answered by KCD 4 · 0⤊ 0⤋
Perhaps you are getting confused. In modulo-2 binary arithmetic (addition with the carry discarded),
1 + 1 = 0
Maybe this is what you are referring to. A binary number system is one which contains only the digits 1 and 0. Your computer uses a binary system for every computation that you ask of it.
Further addition equations in binary mathematics:
0+0=0
1+0=1
0+1=1
If you reduced the number scale to just the number 0, then the number 1 would not exist, and so 1 = 0. In other words, as the number 1 does not exist on the nothing-only number scale, it equals nothing.
Most mathematics you are aware of is conducted in base 10. This is the decimal system.
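A minimal Python sketch of the carry-discarding addition above; modulo-2 addition is exactly the XOR operation computers apply to individual bits, and it is what makes 1 + 1 = 0 here (ordinary binary addition instead carries, giving 1 + 1 = 10):

# Modulo-2 addition: add, then discard the carry (equivalently, a ^ b).
def add_mod2(a, b):
    return (a + b) % 2

for a in (0, 1):
    for b in (0, 1):
        print(f"{a} + {b} = {add_mod2(a, b)}")  # reproduces the table above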
2006-09-06 23:24:11 · answer #3 · answered by James 6 · 0⤊ 0⤋
Let a = b + 1
Then:
a = b + 1
(a-b)a = (a-b)(b+1)
a^2 - ab = ab + a - b^2 - b
a^2 - ab -a = ab + a -a - b^2 - b
a(a - b - 1) = b(a - b - 1)
a = b
b + 1 = b
Therefore, 1 = 0
The problem with this (and many other similar 'proofs') is that there is a divide-by-zero step somewhere in the logic, which makes the proof invalid. Here it is the step from a(a - b - 1) = b(a - b - 1) to a = b: since a = b + 1, the cancelled factor (a - b - 1) equals 0 exactly, and dividing by zero is not allowed, so a is not necessarily equal to b.
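A minimal Python check (b = 5 is an arbitrary sample value) that the cancelled factor really is zero, so the equation a(a - b - 1) = b(a - b - 1) is just 0 = 0 and says nothing about a and b:

# The 'proof' starts from a = b + 1, so the factor (a - b - 1) is always 0.
b = 5
a = b + 1
factor = a - b - 1
print(factor)                  # 0: cancelling it means dividing by zero
print(a * factor, b * factor)  # 0 0: both sides vanish even though a != b
print(a == b)                  # False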
2006-09-06 23:15:09 · answer #4 · answered by robcraine 4 · 3⤊ 1⤋
If 1 = 0 and 0 = 1, then 1 must equal 0.
0 divided by 1 = 0
2006-09-06 23:14:13 · answer #5 · answered by Dfirefox 6 · 3⤊ 2⤋
Working within the standard axiomatic framework of mathematics as most people know it, this is impossible without introducing a logical inconsistency. One that is often used is:
0 = 0 + 0 + 0 + 0 + ...
0 = (1-1) + (1-1) + ...
0 = 1 + (-1 + 1) + (-1 + 1) + ...
0 = 1 + 0 + 0 + 0 + ...
0 = 1
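The flaw: the series 1 - 1 + 1 - 1 + ... does not converge, and regrouping terms is only valid for convergent series. A minimal Python sketch of its partial sums, which oscillate between 1 and 0 forever instead of settling on a limit:

# Partial sums of 1 - 1 + 1 - 1 + ... (Grandi's series).
partial = 0
for n in range(8):
    partial += (-1) ** n
    print(n + 1, partial)  # prints 1, 0, 1, 0, ...; no limit exists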
2006-09-06 23:14:26 · answer #6 · answered by Anonymous · 1⤊ 2⤋
1-1=0?
2006-09-06 23:11:44 · answer #7 · answered by Anonymous · 1⤊ 1⤋
The square root of 0 = 1, but you can't really add, multiply, or subtract anything by 0, so 0 = 1
2006-09-06 23:26:25 · answer #8 · answered by genghis41f 6 · 0⤊ 0⤋
The trick to the following is that you'd need to understand the theory of infinity. The reason it's a puzzle for most people is that they assume .999... ends somewhere with a 9. The reality is that the number doesn't end at all! As we go further and further out the decimal places, the number gets closer and closer to 1. So if we go out an infinite number of places (which we really can't do anyway), .999... = 1
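A minimal Python illustration with exact fractions: each extra 9 cuts the gap to 1 by a factor of 10, so with infinitely many 9s the gap is 0.

from fractions import Fraction

# 0.9, 0.99, 0.999, ... as exact fractions, and their distance from 1.
x = Fraction(0)
for n in range(1, 8):
    x += Fraction(9, 10 ** n)
    print(x, "gap to 1:", 1 - x)  # the gap is 1/10**n, shrinking toward 0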
2016-11-25 02:07:42 · answer #9 · answered by ? 3 · 0⤊ 0⤋
NO.
1/1 = 1
0/1 = 0
1/0 = infinity (informally; in ordinary arithmetic 1/0 is undefined)
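Python, for one, refuses 1/0 outright rather than returning infinity; a minimal sketch:

# Integer and float division by zero both raise an error in Python.
for a, b in [(1, 1), (0, 1), (1, 0)]:
    try:
        print(f"{a}/{b} = {a / b}")
    except ZeroDivisionError as err:
        print(f"{a}/{b} -> {err}")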
2006-09-07 02:51:56 · answer #10 · answered by xenon 6 · 0⤊ 0⤋