
Let a, b be nonnegative integers, not both zero. Prove that a^k + b^k is not prime when k >= 3 is odd.
I am having trouble getting this proof started, and I feel like I am missing something simple about primes that would help me get going. Thank you.

2007-09-11 06:27:35 · 3 answers · asked by sh 1 in Science & Mathematics Mathematics

3 answers

If k is odd, a+b is always a factor of a^k + b^k.
In fact, a^k + b^k = (a + b)(a^(k-1) - a^(k-2)b + a^(k-3)b² - ... - ab^(k-2) + b^(k-1)).
Just multiply the factors to get the result.
For a numerical example,
3^k + 2^k is always divisible by 3 + 2 = 5 when k is odd.

2007-09-11 06:37:22 · answer #1 · answered by steiner1745 7 · 1 0
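For anyone who wants to sanity-check this identity numerically, here is a minimal Python sketch (the helper name cofactor is just for illustration, not from the answer) verifying a^k + b^k = (a + b)(a^(k-1) - a^(k-2)b + ... + b^(k-1)) for small nonnegative a, b and odd k:

```python
# Minimal numeric check of the factorization claimed above.
# Plain Python, no external libraries needed.

def cofactor(a, b, k):
    """Alternating sum a^(k-1) - a^(k-2)*b + ... + b^(k-1)."""
    return sum((-1) ** i * a ** (k - 1 - i) * b ** i for i in range(k))

# For odd k, a^k + b^k should equal (a + b) * cofactor(a, b, k).
for a in range(0, 6):
    for b in range(0, 6):
        for k in (3, 5, 7):
            assert a ** k + b ** k == (a + b) * cofactor(a, b, k)

print("3^5 + 2^5 =", 3 ** 5 + 2 ** 5, "= 5 *", cofactor(3, 2, 5))
```

Running it prints the worked example 3^5 + 2^5 = 275 = 5 * 55, matching the 3^k + 2^k remark above.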

No, you aren't really missing anything about primes, except perhaps the fact that a prime cannot be written as a product of two factors that are both greater than 1. It is a matter of proving that a + b is a factor of a^k + b^k where k >= 3 and k is odd.

However, you also need the condition that a and b are not both 1, because then a^k + b^k = 2, which is indeed prime. You also need to handle the special case in which one of the variables is 1 and the other is zero: then a^k + b^k is 1, which is not prime but also has no nontrivial factorization.

The proof consists of multiplying the factor a + b by the factor sum_{i=0}^{k-1} (-1)^i * a^(k-1-i) * b^i, pulling out the end terms, which will be a^k and b^k, and showing that the remaining terms can be paired so that they cancel. In doing this, you will have shown that a + b and sum_{i=0}^{k-1} (-1)^i * a^(k-1-i) * b^i are both factors of a^k + b^k. It is then a matter of showing that 1 < a + b < a^k + b^k.

2007-09-11 07:40:05 · answer #2 · answered by devilsadvocate1728 6 · 1 0
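To watch the cancellation this answer describes, a short symbolic check works well. This sketch assumes sympy is available; nothing in the argument itself depends on it:

```python
# Expand (a + b) times the alternating sum and confirm the middle
# terms cancel, leaving a^k + b^k. Requires sympy.

import sympy as sp

a, b = sp.symbols('a b', nonnegative=True)

for k in (3, 5, 7):
    alt_sum = sum((-1) ** i * a ** (k - 1 - i) * b ** i for i in range(k))
    product = sp.expand((a + b) * alt_sum)
    # Everything between the end terms cancels in pairs.
    assert sp.simplify(product - (a ** k + b ** k)) == 0
    print(f"k={k}: (a + b) * ({alt_sum}) = {product}")
```

For k = 3, for example, the expansion of (a + b)(a² - ab + b²) collapses to a³ + b³.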

Hello

You can prove this by factoring a^k + b^k, where k = 2n + 1 for some integer n >= 1.


Consider the geometric series a^(2n) - a^(2n-1)b + ... - ab^(2n-1) + b^(2n),
which has 2n+1 terms, with first term a^(2n) and common ratio -b/a (assuming a > 0).

The sum of such a series is
[a^(2n) - a^(2n)(-b/a)^(2n+1)] / (1 + b/a)
Simplifying gives:

[a^(2n+1) + b^(2n+1)] / (a + b)

Therefore,

a^(2n+1) + b^(2n+1) = [a^(2n) - a^(2n-1)b + ... - ab^(2n-1) + b^(2n)] * [a + b]

which shows it is not prime whenever both factors are greater than 1 (the trivial cases a = b = 1, and one of a, b equal to 1 with the other 0, are excluded).

2007-09-11 06:45:03 · answer #3 · answered by Derek C 3 · 1 0
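The geometric-series sum above can also be checked exactly with rational arithmetic. This is a sketch, not part of the original answer; it assumes a > 0 so that the ratio -b/a is defined, and uses Python's Fraction to avoid rounding:

```python
# Sum 2n+1 terms of the series with first term a^(2n) and ratio -b/a,
# then compare against (a^(2n+1) + b^(2n+1)) / (a + b).

from fractions import Fraction

def geometric_sum(a, b, n):
    first, ratio = Fraction(a) ** (2 * n), Fraction(-b, a)
    return sum(first * ratio ** i for i in range(2 * n + 1))

for a in range(1, 6):          # a > 0 so the ratio is defined
    for b in range(0, 6):
        for n in (1, 2, 3):
            k = 2 * n + 1
            assert geometric_sum(a, b, n) == Fraction(a ** k + b ** k, a + b)
```

For a = 3, b = 2, n = 1 the three terms are 9 - 6 + 4 = 7, which equals (27 + 8) / 5.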
