Given a = b + 1
By the properties of multiplication: (a-b)a = (a-b)(b+1)
Therefore: a^2 - ab = ab + a - b^2 - b
So: a^2 - ab - a = ab + a - a - b^2 - b
By distribution: a(a - b - 1) = b(a - b - 1)
And so: a = b
Giving: b + 1 = b
Therefore: 1 = 0
The problem with this "proof" is a division-by-zero error, so the proof is flawed (or so I've been told; I can't see it myself).
Additional:
For the critics below, I did state that the proof is flawed, but it was a professor at university who showed us the proof, saying it was a commonly used argument among philosophers against the "self-referential" logic of mathematics. I just couldn't remember where the flaw was. Thanks for the explanation, appledaydreamer.
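A quick numeric check makes it concrete (a sketch; b = 5 is an arbitrary choice, any value behaves the same):

# Sketch: plug in a concrete b and test each line of the "proof".
b = 5
a = b + 1  # the given

# Every rearrangement up to the factored form holds as an equality:
assert (a - b) * a == (a - b) * (b + 1)
assert a**2 - a*b == a*b + a - b**2 - b
assert a * (a - b - 1) == b * (a - b - 1)

# But the factor that gets cancelled to reach "a = b" is:
print(a - b - 1)  # 0 -- so the cancellation divides by zero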
answer #1 · answered by Grimread · 2006-07-16 23:56:36
Grimread, you are wrong.
If we are already given a = b + 1,
then "And so: a = b" must be wrong, since we were already given a = b + 1.
Note: if you are trying to prove a = b (which is the 1 = 0 thing, whatever),
then a cannot be equal to b + 1
(i.e., b ≠ a - 1), since
if the end result is a = b, then substituting
a = b + 1 for a gives:
a = b
(b + 1) = b
b - b = -1
0 = -1, which is wrong. And also:
a = b is true only when two consistent equations are involved, not just one equation.
And also:
if a = b is the end result, then from the beginning a ≠ b + 1, correct?
The one way to prove that 1 = 0 is to use a scenario:
1 = 0 is true when 1 = 0 + a, when a = 1.
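A small sketch of the substitution this answer describes, in plain Python (the value of b is arbitrary):

# Sketch: substitute a = b + 1 into the claimed result a = b.
b = 5                 # arbitrary value; any b behaves the same
a = b + 1             # the given
print(a == b)         # False: (b + 1) == b never holds
print((b - b) == -1)  # False: the substitution reduces to 0 = -1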
answer #2 · answered by mercury of love · 2006-07-17 07:10:29
>Given a = b + 1
>By the properties of multiplication: (a-b)a = (a-b)(b+1)
>Therefore: a^2 - ab = ab + a - b^2 - b
>So: a^2 - ab - a = ab + a - a - b^2 - b
>By distribution: a(a - b - 1) = b(a - b - 1)
>And so: a = b
>Giving: b + 1 = b
>Therefore: 1 = 0
By the rules of algebra this looks true, but by the rules of math it cannot be.
Here is why.
>By distribution: a(a - b - 1) = b(a - b - 1)
>And so: a = b
Reason:
Both sides are divided by (a - b - 1) to get a = b.
BUT given that a = b + 1,
if we substitute b + 1 for a, the divisor becomes (b + 1 - b - 1),
which is equal to 0.
RULE OF MATH: WE CANNOT DIVIDE BY 0.
Therefore, that algebraic step is mathematically invalid.
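To see the zero divisor directly, here is a minimal sketch (the range of test values is arbitrary):

# Sketch: under the given a = b + 1, the cancelled factor is always zero.
for b in range(-3, 4):     # arbitrary sample values
    a = b + 1
    assert a - b - 1 == 0  # the divisor is identically 0
# a * 0 == b * 0 holds for ANY a and b, so cancelling the 0 proves nothing.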
answer #3 · answered by appledaydreamer · 2006-07-17 07:09:42
If the number or calculation in question is of great enough value and length, a number as small as one can be so insignificant that it would in fact be virtually equivalent to 0.
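For what it's worth, this is literally how floating-point arithmetic behaves: in IEEE 754 double precision, adding 1 to a large enough number rounds straight back to that number (a sketch; 2**53 is where doubles stop representing every integer):

# Sketch: in double precision, 1 is absorbed by a large enough number.
big = 2.0 ** 53
print(big + 1 == big)    # True -- the + 1 rounds away
print(1e16 + 1 == 1e16)  # True here too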
answer #4 · answered by JayClutch · 2006-07-17 06:54:40
Um, guy two answers above mine:
I'll admit it's been a while since I've done this kind of math, but I'm not sure I agree with his breakdown. And how can b + 1 = b, regardless?
answer #5 · answered by madison018 · 2006-07-17 07:07:13
I've seen that answer somewhere before. Kinda lame.
answer #6 · answered by supeyrio · 2006-07-17 06:53:26
one vacuum = nothing
answer #7 · answered by Richard C · 2006-07-17 06:59:13