
Question: how can one prove that (a/b) + (b/a) >= 2?

2006-07-16 19:57:13 · 3 answers · asked by amit 1 in Science & Mathematics Mathematics

3 answers

How can you prove something that's blatantly false? Consider a=1 and b=-1. Then a/b + b/a = -2, which is most certainly NOT greater than or equal to 2. Further, it is not even the case that |a/b + b/a| ≥ 2 if one permits complex numbers: for instance, a=1, b=i gives a/b + b/a = 0 exactly. The other proofs make invalid use of the properties of square roots. Specifically, they assume that √(a/b)·√(b/a) = √((a/b)(b/a)) = √1 = 1, which is NOT always true when negative or complex numbers are involved; assuming it always holds can be used to "prove" that 1 = -1. The inequality is only valid when a and b are both positive (or both negative, so that a/b > 0).

2006-07-16 21:24:40 · answer #1 · answered by Pascal 7 · 0 0
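The counterexamples in the answer above can be checked directly; a minimal sketch in Python:

```python
# Check the counterexamples to (a/b) + (b/a) >= 2.
def f(a, b):
    return a / b + b / a

# a = 1, b = -1: the sum is exactly -2, not >= 2.
assert f(1, -1) == -2

# a = 1, b = i (written 1j in Python): the sum is exactly 0,
# so even |a/b + b/a| >= 2 fails over the complex numbers.
assert f(1, 1j) == 0
```

Both assertions pass, confirming that the inequality needs the restriction a, b > 0 (or more generally a/b > 0).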

Assume a, b > 0. Since (sqrt(a/b) - sqrt(b/a))^2 >= 0 (any real square is greater than or equal to zero), expanding gives

(a/b) - 2*sqrt(a/b)*sqrt(b/a) + (b/a) >= 0.

Because a, b > 0, the square roots are real and sqrt(a/b)*sqrt(b/a) = sqrt(1) = 1, so

(a/b) - 2 + (b/a) >= 0,
or a/b + b/a >= 2.

2006-07-17 03:11:31 · answer #2 · answered by JN_Answer 1 · 0 0

For a/b > 0:

{sqrt(a/b) - sqrt(b/a)}^2 >= 0

Expanding, and using sqrt(a/b)*sqrt(b/a) = 1 (valid since a/b > 0):

(a/b) + (b/a) - 2 >= 0

therefore
(a/b) + (b/a) >= 2

2006-07-17 03:09:27 · answer #3 · answered by qwert 5 · 0 0
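The square-root proofs above (valid for positive a and b) can be sanity-checked numerically; a quick sketch in Python:

```python
import random

# Spot-check a/b + b/a >= 2 for many random positive a, b.
# The proof: (sqrt(a/b) - sqrt(b/a))^2 >= 0 expands to a/b - 2 + b/a >= 0.
random.seed(0)
for _ in range(1000):
    a = random.uniform(0.001, 100.0)
    b = random.uniform(0.001, 100.0)
    s = a / b + b / a
    assert s >= 2 - 1e-9  # small slack for floating-point rounding

# Equality holds exactly when a == b, e.g. a = b = 5 gives 1 + 1 = 2.
assert 5 / 5 + 5 / 5 == 2
```

The check passes for all positive pairs, while (as the first answer notes) it fails as soon as a and b have opposite signs.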
