1. (a) Consider a function f such that f'(x) exists for all real x. Suppose that
(i) f'(r) = 0 and f'(s) = 0,
(ii) f'(x) ≠ 0 for all x in the interval r < x < s.
Prove that there is at most one root of f(x) = 0 in r < x < s.

(b) Given a function f such that f'(x) = (4 - x^2)(4 + x^2)e^(-2x), what is the maximum number of real roots of f(x) = 0 in -2 < x < 2? Justify your answer.

2007-12-07 09:46:02 · 4 answers · asked by leonardo 1 in Science & Mathematics Mathematics

4 answers

1a: Suppose ∃x, y∈(r, s) s.t. f(x)=0 and f(y)=0. WLOG assume x < y. Then by Rolle's theorem there is some c∈(x, y)⊆(r, s) with f'(c)=0, which contradicts (ii). So f(x)=0 has at most one root in (r, s).
1b: Suppose x∈(-2, 2); then (4-x²)>0. Since (4+x²)>0 and e^(-2x)>0 regardless of the value of x, it follows that for all x∈(-2, 2), f'(x) = (4 - x²)(4 + x²)e^(-2x) > 0, and in particular f'(x)≠0 on this interval. So by part 1a, there is at most one root of f(x)=0 in this interval. Q.E.D.

2007-12-07 10:04:02 · answer #1 · answered by Pascal 7 · 0 0
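
One way to write the sign check in 1b as a single displayed calculation (LaTeX notation, using only the factorization given in the question):

\begin{align*}
x \in (-2, 2) \implies{}& 4 - x^2 > 0,\quad 4 + x^2 > 0,\quad e^{-2x} > 0 \\
\implies{}& f'(x) = (4 - x^2)(4 + x^2)\,e^{-2x} > 0,
\end{align*}

so f'(x) is never zero on (-2, 2), which is exactly hypothesis (ii) of part (a) with r = -2 and s = 2.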

(a) can be proved by contradiction using Rolle's theorem.

Suppose there were two roots of f(x) = 0 in r < x < s. Call those two roots u and v. Then by Rolle's theorem, there must be a point t such that u < t < v and f'(t) = 0. But note that
r < u < v < s, which means r < t < s, and this contradicts condition (ii) you described above.

For part (b), examine how many roots f'(x) = 0 has in the interval -2 < x < 2, and work from there.

2007-12-07 18:02:21 · answer #2 · answered by Chris W 4 · 0 0
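
Carrying out that hint explicitly (a short LaTeX sketch, again relying only on the factorization from the question):

\begin{align*}
f'(x) = 0 &\iff (4 - x^2)(4 + x^2)\,e^{-2x} = 0 \\
&\iff 4 - x^2 = 0 \qquad (\text{since } 4 + x^2 > 0 \text{ and } e^{-2x} > 0 \text{ for every real } x) \\
&\iff x = \pm 2,
\end{align*}

so f'(x) = 0 has no solutions in -2 < x < 2, and part (a) then gives at most one root of f(x) = 0 in that interval.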

You know, for once I would love to see a question posted that suggests it has actually been attempted, rather than copied and pasted so much that it still says "justify your answer".

2007-12-07 17:57:18 · answer #3 · answered by Scott Evil 6 · 4 0

oops!!!!!!!!!!!!!!
-; )

2007-12-07 18:18:45 · answer #4 · answered by $$$$$$$$$$$$$ 2 · 0 0
