
See this:
Let a = b
multiply both sides by a
=> a*a = a*b
subtract b*b from both sides
=> a*a-b*b = a*b-b*b
=> (a+b)(a-b) = b(a-b)
=> a+b = b
but we assumed a=b
=> 2b = b
=> 2 = 1
Hence Proved.
Is it right or wrong? Guess :)

2006-08-27 05:52:21 · 11 answers · asked by saurabh b 2 in Science & Mathematics Mathematics

11 answers

In proving theorems, it is dangerous when one of the steps relies on an invalid operation. In your example, the wrong step is
(a+b)(a-b) = b(a-b) ==> a+b = b
This is because a = b, hence a-b = 0. You cannot divide both sides by zero.

2006-08-27 05:57:11 · answer #1 · answered by lim_jz 2 · 2 0
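The answer above can be checked numerically. A quick Python sketch (a = b = 3 is an arbitrary choice for illustration): every line of the "proof" holds as 0 = 0 until the cancellation, which is literally a division by zero.

```python
# With a = b, the factor (a - b) is 0, so "cancelling" it
# amounts to dividing by zero.
a = b = 3
assert a * a - b * b == a * b - b * b      # 0 == 0: the line holds
assert (a + b) * (a - b) == b * (a - b)    # still 0 == 0
try:
    (a + b) * (a - b) // (a - b)           # the illegal cancellation, made explicit
except ZeroDivisionError:
    print("cannot divide by (a - b): it is zero")
```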

To prove 2 = 1:

Raise both of them to the power of 0,
i.e. 1^0 = 1
2^0 = 1 (any nonzero number raised to the power of 0 is 1)
therefore 2 = 1

2006-08-27 19:46:19 · answer #2 · answered by insignia 2 · 0 0
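The joke above hides the same kind of error: x^0 = 1 for every nonzero x, so the map x -> x^0 is not one-to-one, and equal zeroth powers say nothing about the bases. A minimal check in Python:

```python
# Equal images do not imply equal preimages: x**0 == 1 for all
# nonzero x, so 1**0 == 2**0 proves nothing about 1 vs 2.
assert 1**0 == 1
assert 2**0 == 1
assert 1**0 == 2**0 and 1 != 2   # equal powers, unequal bases
```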

If a=b then you can't divide by a-b in step 3 because that would be dividing by 0 which is a no-no.

Is it just me? Or do these same kinds of boring, dumbshit questions about dividing by 0 and what is 0/0 start to get a bit tedious after seeing them 2 or 3 times per day?


Doug

2006-08-27 05:58:26 · answer #3 · answered by doug_donaghue 7 · 1 0

Not only 2 = 1; consider:

i. 9 x 0 = 0
ii. 71 x 0 = 0

Since the right-hand sides of i and ii are equal, we get

9 x 0 = 71 x 0

9 = 71 x 0/0

and if, as in your question, 0/0 were equal to 1, then we would get

9 = 71

2006-08-30 16:38:54 · answer #4 · answered by Douglaskoo 1 · 0 0
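The same point, sketched in Python: 9*0 and 71*0 really are equal, but recovering "9 = 71 * 0/0" requires evaluating 0/0, which is undefined (Python raises an error rather than returning 1).

```python
# The products agree, but 0/0 is not 1; it is undefined.
assert 9 * 0 == 71 * 0
try:
    0 / 0
except ZeroDivisionError:
    print("0/0 is undefined")
```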

Since a = b, the mathematical conventions of normal people do not allow you to cancel a-b in step 4 of your 'proof', as it is 0. But if you are living in some hyperboloidal world in some other galaxy in our universe, then it might just be true. Ask a bounty hunter or P. K. ****, they might be of immense help.

2006-08-27 13:40:04 · answer #5 · answered by A 4 · 0 0

Your problem is after this step

(a+b)(a-b) = b(a-b)

If a=b, then a-b=0.

You cannot divide by 0. The reason (a+b)(a-b) = b(a-b) is true is that 0(a+b) = 0(b) => 0 = 0, not that a+b = b, which is clearly nonsensical.

2006-08-27 06:01:17 · answer #6 · answered by just♪wondering 7 · 2 0
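This answer's point can be verified directly: whenever a = b, both sides of (a+b)(a-b) = b(a-b) vanish, so the equality holds trivially, never because a+b equals b. A small Python check over a few sample values:

```python
# Both sides are 0 for every a = b; a+b == b only in the
# degenerate case a == 0, confirming the cancellation is bogus.
for a in range(-5, 6):
    b = a
    assert (a + b) * (a - b) == 0 == b * (a - b)
    assert (a + b == b) == (a == 0)
```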

You have divided by a-b on both sides in going from the 7th step to the 8th. Since a = b, this is the same as dividing by zero, which is not allowed. The proof is therefore incorrect.

2006-08-27 23:56:32 · answer #7 · answered by curiosity_unbounded 2 · 0 0

A very good approach :-) However, (a-b) = 0 if a = b, so you have
(a+b) x 0 = b x 0
hence
0 = 0
not 2 = 1

Thanks,

2006-08-27 23:35:53 · answer #8 · answered by PKG 1 · 0 0

OK, at (a+b)(a-b) = b(a-b): since a = b, we have a-b = 0. So what happens is you come out with
0(a+b) = 0(b)
and it ends up as 0 = 0. Your equation is now solved, since you cannot go any further. Whether a = 0, 1, 2, 3, 4, 5, etc. (with b the same), you always end up with the solution 0 = 0, which is true. Hence your hypothesis is wrong, totally.

If a = b, then no matter what you do, you will have infinitely many valid answers, because you always end up with 0 = 0.

2006-08-27 06:31:47 · answer #9 · answered by Anonymous · 2 0

2b cannot be equal to b (unless b = 0); this is absurd, so somewhere the derivation from the assumption a = b must have gone wrong.

2006-08-27 06:30:16 · answer #10 · answered by ursikap u 1 · 0 1
