
Suppose that a twice differentiable function f satisfies f''(x)+f'(x)g(x)−f(x) = 0 for some function g. Prove that if f is zero at two points, then f is zero on the interval between them.

Thank you!

2006-11-19 13:31:52 · 3 answers · asked by Eric 1 in Science & Mathematics Mathematics

Thank you, kaksi_guy, for your answer, but I found a small flaw in your proof: f'(x)=0 does not always mean f has a min or max there. (For example, consider the function f(x)=x^3: f'(0)=0, yet x=0 is neither a minimum nor a maximum.) I also don't quite understand why f(x3)>0 when f(x3) is a max (or f(x3)<0 when f(x3) is a min).
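
For concreteness, here is a quick symbolic check of the x^3 example (just a sketch, assuming SymPy is available):

```python
import sympy as sp

x = sp.symbols('x')
f = x**3

# The derivative vanishes at 0 ...
print(sp.diff(f, x).subs(x, 0))        # 0
# ... yet f takes values below and above f(0)=0 arbitrarily close to 0,
# so x = 0 is neither a minimum nor a maximum.
print(f.subs(x, -sp.Rational(1, 10)))  # -1/1000
print(f.subs(x, sp.Rational(1, 10)))   # 1/1000
```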

2006-11-21 07:40:18 · update #1

3 answers

Actually, kaksi_guy seems to be onto something. Unfortunately, not only does his proof have the flaw that you noticed, but even if that were rectified, he would still only have proved that f is zero at infinitely many points of the interval, not at every point of the interval. Still, his post is the inspiration for what follows:

First, we need to assume that g(x) is defined everywhere on the interval between the two points. We don't need to assume that g(x) is continuous, but if we allow a function that is undefined at even one point, such as g(x)=(x-3/x)/2 (which is undefined at zero), we can easily generate exceptions to the theorem: in this case f(x)=x²-1, which satisfies the equation wherever g is defined, vanishes at x=-1 and x=1, and yet is certainly not zero on the interval (-1, 1).
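
As a sanity check, the exception can be verified symbolically; the following sketch assumes SymPy and uses exactly the f and g above:

```python
import sympy as sp

x = sp.symbols('x')
f = x**2 - 1
g = (x - 3/x) / 2          # undefined at x = 0

# f'' + f'*g - f simplifies to 0 wherever g is defined:
print(sp.simplify(sp.diff(f, x, 2) + sp.diff(f, x) * g - f))   # 0

# Yet f vanishes at -1 and 1 without vanishing in between:
print(f.subs(x, -1), f.subs(x, 1))   # 0 0
print(f.subs(x, 0))                  # -1, so f is not zero on (-1, 1)
```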

With that out of the way, we can begin our proof. Suppose f(x) = 0 at two points a and b. Since f is twice differentiable, it is continuous, so by the extreme value theorem f(x) attains both a maximum and a minimum on the closed interval [a, b]. Let us denote the minimum value f(x) attains on [a, b] as min(f(x)) (and analogously for max(f(x))). Now, since f(a)=f(b)=0, we have min(f(x))≤0 and max(f(x))≥0.

Suppose that min(f(x))<0. Then the minimum does not occur at the endpoints, so it must occur at a point c in the open interval (a, b). It is readily shown that this implies f'(c)=0. Therefore, from the original differential equation, f''(c) − f(c) = 0, and thus f''(c) = f(c) < 0. But f'(c)=0 together with f''(c)<0 would make c a strict local maximum of f, contradicting the fact that c is a minimum (at an interior minimum the second derivative, if it exists, must be ≥ 0). Thus min(f(x))=0. Similarly, if max(f(x))>0, there would be a point c in (a, b) which is an interior maximum with f'(c)=0 and f''(c) = f(c) > 0, which is again a contradiction. Thus min(f(x))=max(f(x))=0, and a function whose minimum and maximum on [a, b] are both 0 is identically 0 there; that is, f(x)=0 on [a, b]. Q.E.D.
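
As a numerical illustration (not a substitute for the proof), take any g that is defined everywhere (g ≡ 1 is just a hypothetical choice for this demo) and start a solution of f'' + f'·g − f = 0 at a zero of f with nonzero slope; consistent with the theorem, it never returns to zero. A rough sketch assuming SciPy:

```python
import numpy as np
from scipy.integrate import solve_ivp

def g(x):
    return 1.0  # hypothetical choice of g, defined and finite everywhere

# Rewrite f'' + f'*g - f = 0 as a first-order system in y = (f, f').
def rhs(x, y):
    f, fp = y
    return [fp, f - fp * g(x)]

# Start at a zero of f with nonzero slope.  If f vanished again at some b > 0,
# the theorem would force f = 0 on the whole interval, contradicting f'(0) = 1.
sol = solve_ivp(rhs, (0.0, 10.0), [0.0, 1.0], dense_output=True, rtol=1e-9)
xs = np.linspace(0.01, 10.0, 1000)   # start just past the initial zero
print(sol.sol(xs)[0].min() > 0.0)    # True: f stays strictly positive
```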

2006-11-21 09:28:02 · answer #1 · answered by Pascal 7 · 0 0

In addition to what 'guy2323' told you about the mean value theorem, notice that there is a certain x3 between x1 and x2 where f'(x3)=0, thus f''(x3)+0-f(x3)=0, or f''(x3)=f(x3), at this particular point! Mind: f'(x3)=0 means a max or a min.
(a) If f(x3) is a max, then f(x3)>0, while f''(x3) must be < 0 (concave down).
(b) If f(x3) is a min, then f(x3)<0, while f''(x3) must be > 0 (concave up).
To justify both cases (a) and (b) with f''(x3)=f(x3), we must admit that f(x3)=0 and only 0!
The same applies to a certain x4 inside [x1,x3] or [x3,x2].
Thus for any x in [x1,x2], f(x)=0. Is it persuasive, Eric?
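
One way to see why the step "f''(x3)=f(x3)" needs g to be defined at x3 is to try it on the exception from answer #1; in that example the point where f' vanishes is exactly where g blows up, and the identity fails there. A small SymPy sketch:

```python
import sympy as sp

x = sp.symbols('x')
f = x**2 - 1    # the exception from answer #1; zeros at x1 = -1, x2 = 1

# The point x3 between the zeros where f'(x3) = 0:
x3 = sp.solve(sp.diff(f, x), x)[0]
print(x3)                             # 0, exactly where g = (x - 3/x)/2 is undefined

# So f''(x3) = f(x3) cannot be concluded there, and indeed it fails:
print(sp.diff(f, x, 2).subs(x, x3))   # 2
print(f.subs(x, x3))                  # -1
```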

2006-11-19 23:16:34 · answer #2 · answered by Anonymous · 0 0

Don't forget that g has to be continuous. I would suggest a theorem related to the mean value theorem. It's generally known (at least in Bartle) as Cauchy's mean value theorem. I'll state it in a post edit.

post edit: I'm sorry, Cauchy's mean value theorem was a miss; it doesn't look helpful. I also think this may be true for g not continuous, as you stated, as long as the equation holds for all x in whatever domain g is defined on. I'll keep thinking about it, though.

2006-11-19 16:34:39 · answer #3 · answered by guy232323232 2 · 0 0
