I have updated my answer on part c). The overall conclusion: Everything works!
a) Convergence for all x > 0:
As lobosito already stated, this is a sequence of Riemann sums for the integral of the function g(y) = y^x over the interval [0,1]. So of course it converges.
It even converges for x = 0. Heck, it even converges for any x > -1.
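(If you want to see this numerically, here is a minimal sketch in plain NumPy; the helper name f_n is just my own label:)

import numpy as np

def f_n(n, x):
    # the Riemann sum (1/n) * Sum(k=1,n) (k/n)^x
    k = np.arange(1, n + 1)
    return np.mean((k / n) ** x)

for x in [2.0, 0.5, 0.0, -0.5]:            # includes x = 0 and an x in (-1, 0)
    for n in [10, 100, 1000, 10000]:
        print(x, n, f_n(n, x), 1.0 / (1.0 + x))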
b) Uniform convergence:
Since g(y) = y^x is non-decreasing as a function of y, the proof of existence of the limit of the Riemann sums also gives an error bound: at the n-th stage the error cannot be more than (1/n)*1^x = 1/n. (You can go back and review the proof, but in brief: the limit must lie between the upper and the lower Riemann sum, and for a monotone function the difference between those two sums telescopes to (1/n)*[g(1) - g(0)], which is at most the maximum value of g over [0,1] multiplied by (1/n). For any x > 0 that maximum is g(1) = 1, so the bound is 1/n, independent of x.) So the convergence of f_n to 1/(1+x) is uniform for x > 0. (Not on the range -1 < x < 0, however: the closer x is to -1, the longer the convergence takes, so no bound independent of x is possible there.)
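(Again a rough numerical check rather than a proof, reusing the same f_n helper as in the sketch above and an arbitrary grid of x values of my own choosing: for x > 0 the worst error over the grid stays below 1/n, while at x = -0.9 the error shrinks very slowly.)

import numpy as np

def f_n(n, x):
    k = np.arange(1, n + 1)
    return np.mean((k / n) ** x)

xs = np.linspace(0.01, 50.0, 500)          # a sample of x > 0
for n in [10, 100, 1000]:
    worst = max(abs(f_n(n, x) - 1.0 / (1.0 + x)) for x in xs)
    print(n, worst, "<= 1/n =", 1.0 / n, "| error at x = -0.9:", abs(f_n(n, -0.9) - 10.0))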
c) Do the f'_n converge to f' ?
This is a much harder question to answer. As I recall it, if the f'_n converge uniformly (and the f_n themselves converge at least at one point, which we already know from part a), then we can be sure that the limit to which the f'_n converge is indeed the derivative of f. When we look at f'_n(x), we get:
f'_n(x) = (1/n)*Sum(k=1,n) [ln(k/n)*(k/n)^x] .
But this is a sequence of Riemann sums for the integral:
h(x) = Integral (0,1) ln(y)*y^x dy. By the same argument as for part a), this indeed converges, as long as x ≥ 0. (For x = 0 the integrand is unbounded near y = 0, but the improper integral still converges, because Integral (0,1) ln(y) dy = -1 is finite.)
So: Yes, the f'_n(x) converge to a function h(x); and therefore this h(x) is in fact = df/dx.
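For completeness: integration by parts (the boundary term vanishes because x > -1) gives Integral (0,1) ln(y)*y^x dy = -1/(1+x)^2, which is exactly d/dx [1/(1+x)]. A quick throwaway sketch to watch the derivative sums approach that value:

import numpy as np

def fprime_n(n, x):
    # (1/n) * Sum(k=1,n) ln(k/n)*(k/n)^x
    k = np.arange(1, n + 1)
    return np.mean(np.log(k / n) * (k / n) ** x)

for x in [0.0, 1.0, 3.0]:
    print(x, fprime_n(10000, x), -1.0 / (1.0 + x) ** 2)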
answer #1 · answered by ? 6 · 2007-09-11 08:33:18
Hi Sonia
Lobosito and, especially, Nealjking, whose answer is thorough, gave you great answers. I don't have much to add; I'd just like to say a few more words.
This problem is interesting and can be seen in a general context. Suppose f: R^2 --> R is continuous on D = {(x,y) | x >= 0, 0 <= y <= 1}. For each n, let f_n(x) = (1/n) [f(x, 1/n) + f(x, 2/n) + ... + f(x, n/n)]. Then, according to what was already said, each f_n(x) is a Riemann sum. Since f is continuous, f_n converges to g(x) = Integral (0,1) f(x, y) dy, an integral in y depending on the parameter x. This integral exists for every x >= 0.
Now, let's add the assumption that the partial derivative of f with respect to x exists and is continuous on D. This implies each f_n is a differentiable function of x and that f'_n(x) = (1/n) [d/dx f(x, 1/n) + d/dx f(x, 2/n) + ... + d/dx f(x, n/n)], again a sequence of Riemann sums on [0,1], now corresponding to the function y -> d/dx f(x, y) for each fixed x. Since this partial derivative exists and is continuous on D, the sequence converges to h(x) = Integral (0,1) d/dx f(x, y) dy. But the continuity of the partial derivative also lets us interchange differentiation and integration, so h(x) = d/dx Integral (0,1) f(x, y) dy = d/dx g(x) = g'(x). This is a generalization of what Nealjking showed for your particular case.
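A small numerical illustration of this general setup; the choice f(x,y) = exp(-x*y) is purely mine, picked only because it satisfies the hypotheses and is easy to check:

import numpy as np
from scipy.integrate import quad

f = lambda x, y: np.exp(-x * y)             # continuous on D, with continuous d/dx
dfdx = lambda x, y: -y * np.exp(-x * y)     # partial derivative with respect to x

def fprime_n(n, x):
    # the Riemann sums (1/n) * Sum(k=1,n) d/dx f(x, k/n)
    k = np.arange(1, n + 1)
    return np.mean(dfdx(x, k / n))

g = lambda x: quad(lambda y: f(x, y), 0.0, 1.0)[0]   # g(x) = Integral (0,1) f(x,y) dy
x, eps, n = 2.0, 1e-5, 5000
print(fprime_n(n, x))                          # limit of the Riemann sums of d/dx f
print((g(x + eps) - g(x - eps)) / (2 * eps))   # central-difference estimate of g'(x)

Both printed values agree to several decimal places, as the argument above predicts.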
Observe that, under the assumptions we made, the results are true even if we don't have uniform convergence. In the general case of our function f, I don't think we can ensure uniform convergence.
Finally, I'd like to recall some points about sequences of differentiable functions. Let f_n be a sequence of real-valued differentiable functions defined on an interval I. Even if f_n converges uniformly to a function f, this doesn't mean f'_n will converge to f': f'_n may not converge, or may converge to a function other than f'. There are several examples in books on Analysis (one is sketched below). But there's an interesting theorem that ensures the desired conclusion:
Let f_n be a sequence of real-valued differentiable functions defined on a finite interval I. Suppose that for some x_0 in I the sequence f_n(x_0) converges, and that f'_n converges uniformly on I to some function g. Then f_n converges uniformly on I to a function f such that f'(x) = g(x) for every x in I.
This theorem gives a sufficient, though not necessary, condition for the limit of the derivatives to be the derivative of the limit. The important thing is that the sequence of derivatives converges uniformly; uniform convergence of f_n by itself implies nothing.
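A standard counterexample of the kind mentioned above (it shows up in most Analysis texts) is f_n(x) = sin(n*x)/n: since |sin(n*x)/n| <= 1/n for every x, the f_n converge uniformly to 0, yet f'_n(x) = cos(n*x) has no limit at, say, x = 1. A two-line check:

import numpy as np

for n in [10, 100, 1000, 10000]:
    # f_n(1) -> 0, but f'_n(1) = cos(n) keeps oscillating
    print(n, np.sin(n * 1.0) / n, np.cos(n * 1.0))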
answer #2 · answered by Steiner 7 · 2007-09-12 10:06:26
f_n(x) = (1/n) * Sum(i=1,n) (i/n)^x.
You notice that these are exactly the Riemann sums for an integral, so the limit is
Integral (0,1) a^x da = [a^(x+1)/(x+1)] from 0 to 1 = 1/(x+1),
so lim f_n(x) = 1/(x+1) for any positive x.
answer #3 · answered by Theta40 7 · 2007-09-11 14:25:13