Let x and y be two points in D.
Let z(t) = x + t*(y - x), for t in [0, 1], be the line segment from x to y (it lies in D because D is convex).
Since the partial derivatives of f exist in D, the directional derivative of f along the curve, df/dz, exists at every point of the curve. Then:
f(y) = f(x) + Int(t=0,1) [d/dt f(z(t))] dt
= f(x) + Int(t=0,1) [grad f(z(t)) . (y - x)] dt, so
f(y) - f(x) = Int(t=0,1) [grad f(z(t)) . (y - x)] dt.
Since the partial derivatives are bounded within D, the gradient of f is bounded within D; say its magnitude is less than K.
Then:
|f(y) - f(x)| <= K ||y - x||
That's the Lipschitz condition.
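For anyone who wants to sanity-check this numerically, here is a short Python sketch. The concrete f is my own choice, not from the question: f(x1, x2) = sin(x1) + cos(x2), whose gradient has norm at most sqrt(2), so K = sqrt(2) works.

```python
# Numerical check of f(y) - f(x) = Int(t=0,1) grad f(z(t)) . (y - x) dt
# and of the Lipschitz bound, for a sample f with bounded gradient.
import numpy as np

def f(p):
    return np.sin(p[0]) + np.cos(p[1])

def grad_f(p):
    return np.array([np.cos(p[0]), -np.sin(p[1])])

rng = np.random.default_rng(0)
x = rng.uniform(-5, 5, 2)
y = rng.uniform(-5, 5, 2)

ts = np.linspace(0.0, 1.0, 10001)
vals = np.array([grad_f(x + t * (y - x)) @ (y - x) for t in ts])
dt = ts[1] - ts[0]
integral = np.sum((vals[:-1] + vals[1:]) * dt / 2.0)  # trapezoid rule

print(f(y) - f(x), integral)  # the two values agree closely
K = np.sqrt(2.0)  # |grad f| <= sqrt(2) everywhere for this f
print(abs(f(y) - f(x)) <= K * np.linalg.norm(y - x))  # True
```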
answer #1 · answered by ? · 2007-09-14 09:40:14
There's a subtle detail in nealjking's proof. The existence of the partial derivatives of f in D does not imply the existence of all the directional derivatives. That would be true if f were differentiable on D, but this wasn't stated. Recall that we say f : D --> R is differentiable on D if, for every x in D, there exists a linear function L such that, for every u in D, we have
f(u) = f(x) + L(u - x) + v(u - x), where v is a function such that lim (u --> x) v(u - x)/||u - x|| = 0.
If the partial derivatives were continuous, f would be differentiable, but this is not given. All we know is that the partial derivatives exist and are bounded in D. The result is true anyway, if we assume D is open and convex (a region in R^n).
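To see that the gap is real, here is a classic counterexample (my addition, not from the thread): f(x, y) = xy/(x^2 + y^2) with f(0, 0) = 0 has both partial derivatives at every point of R^2, yet no directional derivative at the origin along (1, 1). A quick numerical look in Python:

```python
# Both partials of f exist everywhere, but along (1, 1) the difference
# quotient at the origin is 1/(2h), which diverges as h -> 0.
def f(x, y):
    return 0.0 if x == 0 and y == 0 else x * y / (x ** 2 + y ** 2)

for h in (1e-1, 1e-3, 1e-6):
    dx = (f(h, 0) - f(0, 0)) / h   # quotient for f1 at the origin: always 0
    dy = (f(0, h) - f(0, 0)) / h   # quotient for f2 at the origin: always 0
    dd = (f(h, h) - f(0, 0)) / h   # along (1, 1): equals 1/(2h), diverges
    print(h, dx, dy, dd)
```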
To make things simpler, it's enough to consider the case where D is a subset of R^2. This will be clear from the proof.
Let f1 be the partial derivative with respect to x1 and f2 the partial derivative with respect to x2. Since they are bounded, there exist
S1 = sup { |f1(x)| : x in D } (1)
S2 = sup { |f2(x)| : x in D } (2).
Let u = (u1, u2) and v = (v1, v2) be two points of D. Then, if we "zig-zag" in D in steps parallel to the axes, we get
f(v) - f(u) = [f(v1, v2) - f(v1, u2)] + [f(v1, u2) - f(u1, u2)] (here we need the intermediate point (v1, u2), and the axis-parallel segments joining it to u and v, to lie in D; this is where the hypothesis on D comes in), which, by the triangle inequality, implies that
|f(v) - f(u)| <= |f(v1, v2) - f(v1, u2)| + |f(v1, u2) - f(u1, u2)| (3)
Since the partial derivatives exist in D, the one-dimensional Mean Value Theorem shows the existence of some real number r between u2 and v2 such that
f(v1, v2) - f(v1, u2) = (v2 - u2) f2(v1, r). By the definition of S2 in (2), it follows that
|f(v1, v2) - f(v1, u2)| <= S2 |v2 - u2| . By the same reasoning, we get
|f(v1, u2) - f(u1, u2)| <= S1 |v1 - u1|.
Combining with (3), it follows that
| f(v) - f(u)| <= S1 |v1 - u1| + S2 |v2 - u2|. (4)
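If you want to see (4) in action numerically, here is a sketch in Python; the function f(x1, x2) = sin(2 x1) + x2^2 on D = (0,1) x (0,1) is my own sample, for which we can take S1 = S2 = 2:

```python
# Check |f(v) - f(u)| <= S1 |v1 - u1| + S2 |v2 - u2| on random pairs in D.
# Here f1 = 2 cos(2 x1) and f2 = 2 x2, so S1 = S2 = 2 on D = (0,1) x (0,1).
import numpy as np

def f(p):
    return np.sin(2.0 * p[0]) + p[1] ** 2

S1, S2 = 2.0, 2.0

rng = np.random.default_rng(1)
for _ in range(10000):
    u = rng.uniform(0.0, 1.0, 2)
    v = rng.uniform(0.0, 1.0, 2)
    lhs = abs(f(v) - f(u))
    rhs = S1 * abs(v[0] - u[0]) + S2 * abs(v[1] - u[1])
    assert lhs <= rhs + 1e-12
print("inequality (4) held on 10000 random pairs in D")
```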
According to the Cauchy-Schwarz inequality,
S1 |v1 - u1| + S2 |v2 - u2| <= sqrt(S1^2 + S2^2) * sqrt((v1 - u1)^2 + (v2 - u2)^2) = sqrt(S1^2 + S2^2) * ||v - u||. Combining with (4), we finally get
| f(v) - f(u)| <= sqrt(S1^2 + S2^2) * || v - u||, which shows f is Lipschitz with constant sqrt(S1^2 + S2^2).
This same reasoning can be applied if D is a subset of R^n: we follow the same strategy of n steps parallel to the axes and get the Lipschitz constant sqrt(S1^2 + ... + Sn^2).
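As a last numerical sanity check of the final bound, reusing the same sample f as above, for which sqrt(S1^2 + S2^2) = sqrt(8):

```python
# Check |f(v) - f(u)| <= sqrt(S1^2 + S2^2) ||v - u|| on many random pairs
# in D = (0,1) x (0,1), vectorized over all pairs at once.
import numpy as np

def f(P):
    return np.sin(2.0 * P[..., 0]) + P[..., 1] ** 2

L = np.sqrt(2.0 ** 2 + 2.0 ** 2)  # sqrt(S1^2 + S2^2) for this sample f

rng = np.random.default_rng(2)
U = rng.uniform(0.0, 1.0, (100000, 2))
V = rng.uniform(0.0, 1.0, (100000, 2))
lhs = np.abs(f(V) - f(U))
rhs = L * np.linalg.norm(V - U, axis=1)
print(bool(np.all(lhs <= rhs + 1e-12)))  # True
```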
answer #2 · answered by Steiner · 2007-09-17 11:21:34