Actually, this relationship is FALSE in general. Here is a counterexample: let f(x) = 1 if x = 0 and 0 otherwise, and let g(x) = 0 for all x. Then:
lim_{x→c} f(g(x)) = lim_{x→c} f(0) = lim_{x→c} 1 = 1, whereas
lim_{u→ lim_{x→c} g(x)} f(u) = lim_{u→0} f(u) = 0.
As you can see, these limits are not the same. However, if one of the following two conditions holds, then it is true:
1: f is continuous at lim_{x→c} g(x), or
2: g(x) ≠ lim_{x→c} g(x) in a punctured neighborhood of c.
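For concreteness, here is a minimal numerical sketch of the counterexample above (the sampling points are arbitrary):

```python
# Numerical sketch of the counterexample: f is the indicator of {0}, g is identically 0.
def f(u):
    return 1.0 if u == 0 else 0.0

def g(x):
    return 0.0

c = 2.0  # the choice of c is irrelevant here, since g is constant

# lim_{x->c} f(g(x)): sample f(g(x)) at points approaching c (excluding c itself)
print([f(g(c + h)) for h in (0.1, 0.01, 0.001, 1e-6)])   # [1.0, 1.0, 1.0, 1.0]

# lim_{u->0} f(u): sample f(u) at points approaching 0 (excluding 0 itself)
print([f(u) for u in (0.1, 0.01, 0.001, 1e-6)])          # [0.0, 0.0, 0.0, 0.0]
```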
Proof (case 1):
Let L = lim_{x→c} g(x) and suppose f is continuous at L. Let ε>0. Then since f is continuous at L, ∃r>0 such that |u-L| < r ⇒ |f(u) - f(L)| < ε. And since lim_{x→c} g(x) = L, ∃δ>0 such that 0<|x-c|<δ ⇒ |g(x) - L| < r ⇒ |f(g(x)) - f(L)| < ε. Since we can find such a δ for every ε>0, it follows that lim_{x→c} f(g(x)) = f(L). Since f is continuous at L, we also have lim_{u→L} f(u) = f(L). Thus by transitivity, it follows that lim_{u→L} f(u) = lim_{x→c} f(g(x)).
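For instance, taking f(u) = e^u (continuous at L = 0) and g(x) = x sin(1/x) with c = 0 gives a quick numerical illustration of case 1; these particular functions are just one possible choice:

```python
import math

# Case 1 illustration: f(u) = exp(u) is continuous at L = 0, g(x) = x*sin(1/x), c = 0.
def f(u):
    return math.exp(u)

def g(x):
    return x * math.sin(1.0 / x)   # defined for x != 0; g(x) -> 0 as x -> 0

# Even though g(x) hits 0 infinitely often near 0, continuity of f at L gives
# lim_{x->0} f(g(x)) = f(0) = 1.
for h in (0.1, 0.01, 0.001, 1e-6):
    print(h, f(g(h)))   # values approach 1.0
```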
Case 2:
Let L = lim_{x→c} g(x) and M = lim_{u→L} f(u), and suppose g(x) ≠ L in a punctured neighborhood of c. Let ε>0. Since lim_{u→L} f(u) = M, ∃r>0 such that 0 < |u-L| < r ⇒ |f(u) - M| < ε. Now, since lim_{x→c} g(x) = L, ∃δ₁>0 s.t. 0 < |x-c| < δ₁ ⇒ |g(x) - L| < r. And since we have stipulated that g(x) ≠ L in a punctured neighborhood of c, ∃δ₂>0 s.t. 0 < |x-c| < δ₂ ⇒ g(x) ≠ L. So let δ = min(δ₁, δ₂). Then 0 < |x-c| < δ ⇒ (0 < |x-c| < δ₁ ∧ 0 < |x-c| < δ₂) ⇒ (|g(x) - L| < r ∧ g(x) ≠ L) ⇒ 0 < |g(x) - L| < r ⇒ |f(g(x)) - M| < ε. And since we can find such a δ for every ε>0, it follows that lim_{x→c} f(g(x)) = M = lim_{u→L} f(u), as required.
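To see how condition 2 repairs the counterexample, one possible variation is to keep the same f but replace g by g(x) = x, so that g(x) ≠ L = 0 throughout a punctured neighborhood of c = 0; a quick numerical sketch:

```python
# Case 2 illustration: same indicator f as in the counterexample, but g(x) = x,
# so g(x) != L = 0 throughout a punctured neighborhood of c = 0.
def f(u):
    return 1.0 if u == 0 else 0.0

def g(x):
    return x

print([f(g(h)) for h in (0.1, 0.01, 0.001, 1e-6)])   # [0.0, 0.0, 0.0, 0.0]
# So lim_{x->0} f(g(x)) = 0, which now matches lim_{u->0} f(u) = 0.
```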
2007-12-25 17:46:43 · answer #1 · answered by Pascal 7
This relationship is logical and intuitive, but you must take some care, because there are conditions for it to be true. There are two well-known theorems related to it, whose proofs you can find in every book on analysis:
1) Let g be defined on a subset Dg of R^p and have values in R^q. Let a be a limit point of Dg such that lim (x → a) g(x) = Lg. Let f be defined on a subset Df of R^q and have values in R^r. Suppose that a is a limit point of the domain of f ∘ g, that Lg is a limit point of Df, and that lim (y → Lg) f(y) = Lf. If there exists a neighborhood V of a such that g(x) ≠ Lg for every x ≠ a in Dg ∩ V, then lim (x → a) f(g(x)) = Lf.
So, the existence of the neighborhood V satisfying the given conditions is essential. For example, take g(x) = x sin(1/x), so that lim (x → 0) g(x) = 0, and f(y) = ln(1+y)/y, so that lim (y → 0) f(y) = 1. But g vanishes for x ≠ 0 (infinitely many times) in every neighborhood of 0, and f is not defined at 0. So, in this case, it's not true that lim (x → 0) f(g(x)) = 1: the limit doesn't exist. However, if you modify the definition of f, extending its domain and setting f(0) = 1, then the desired conclusion becomes true.
Now, suppose g(x) = sin(x) and, again, f(y) = ln(1 + y)/y, which is not defined at 0. In this case, if we choose d sufficiently small, then sin(x) doesn't vanish in (-d, d) for x ≠ 0. So, although f is not defined at 0, it's true that lim (x → 0) f(g(x)) = 1.
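Numerically, the difference between the two situations can be seen with a small sketch (the sampling points are arbitrary):

```python
import math

def f(y):
    return math.log(1.0 + y) / y   # ln(1+y)/y, not defined at y = 0

# First situation: g(x) = x*sin(1/x) vanishes at x = 1/(k*pi) for every integer k,
# so f(g(x)) is undefined at points arbitrarily close to 0 and no limit can exist.
for k in (1, 10, 100, 1000):
    x = 1.0 / (k * math.pi)
    print(x, x * math.sin(1.0 / x))   # ~0 (exactly 0 in exact arithmetic): f(g(x)) undefined here

# Second situation: g(x) = sin(x) is nonzero for 0 < |x| < pi,
# so f(sin(x)) is defined near 0 and tends to 1.
for h in (0.1, 0.01, 0.001, 1e-6):
    print(h, f(math.sin(h)))          # values approach 1.0
```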
When f is continuous at Lg, then we have a simpler situation:
2) Let g be defined on a subset Dg of R^p and have values in R^q. Let a be a limit point of Dg such that lim (x → a) g(x) = Lg. Let f be defined on a subset Df of R^q and have values in R^r. If a is a limit point of the domain of f ∘ g and f is continuous at Lg (which automatically implies Lg is in Df), then lim (x → a) f(g(x)) = f(Lg).
The requirement that a be a limit point of the domain of f ∘ g is just part of the definition of the limit of a function at a point. Note that, when we modified the definition and domain of f in the first example given for (1), we just made it continuous at y = 0, so that Theorem (2) could apply.
When the domain of the functions is in R, these theorems can be adapted to cover the cases when x, y, or both go to ∞ or -∞.
If you want the proofs of those theorems and don't have a book, email me.
2007-12-26 07:22:24 · answer #2 · answered by Steiner 7
It does, assuming all the limits on the right-hand side exist.
Let lim_{n→c} g(n) = d, and let lim_{u→d} f(u) = L.
Let ε > 0. Then there exists a δ_1 > 0 such that if |u - d| < δ_1, then |f(u) - L| < ε. Then there exists a δ > 0 such that if 0 < |n - c| < δ, then |g(n) - d| < δ_1, and then, taking u = g(n), |f(g(n)) - L| < ε. Thus lim_{n→c} f(g(n)) = L.
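For instance, with g(n) = 2n + 1, c = 1 (so d = 3) and f(u) = u², the two tolerances can be tracked explicitly; the bounds below are just one workable choice:

```python
# A concrete run of the delta-chaining above, with g(n) = 2n + 1, c = 1 (so d = 3)
# and f(u) = u**2 (so L = 9).
def g(n):
    return 2.0 * n + 1.0

def f(u):
    return u ** 2

c, d, L = 1.0, 3.0, 9.0
eps = 1e-3

# For f near d: |f(u) - 9| = |u - 3|*|u + 3| <= 7*|u - 3| when |u - 3| < 1,
# so delta_1 = min(1, eps/7) works.
delta_1 = min(1.0, eps / 7.0)

# For g near c: |g(n) - 3| = 2*|n - 1|, so delta = delta_1 / 2 works.
delta = delta_1 / 2.0

# Spot-check a few points within delta of c:
for n in (c - 0.5 * delta, c + 0.3 * delta, c + 0.9 * delta):
    assert abs(f(g(n)) - L) < eps
print("delta_1 =", delta_1, ", delta =", delta, ": |f(g(n)) - L| < eps holds")
```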
2007-12-26 01:22:44 · answer #3 · answered by a²+b²=c² 4
Interesting. It does look like it's valid... logically. I like it and hadn't ever considered such a relationship before.
I'm going to test it.
But... let me add... that just by visual inspection... you have to assume that the limits exist.
f(g(n)) must exist.
g(n) must exist... and
f(u) must exist
===
I tested it with c equal to 0, and again with c equal to infinity... so that n approaches 0 and infinity, respectively. And I tested it with g(n) = 1/n. That way u approaches infinity and zero, respectively. I tested it with f(n) = (1 + 1/n)^n and with f(n) = sin n / n and with f(n) = n sin n. It works for all of those, as long as the limits do exist. But I don't have a formal proof for the relationship as a whole.
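A test of that kind might look roughly like the sketch below, using c = 0 approached from the right, g(n) = 1/n, and f(u) = (1 + 1/u)^u, so that both sides approach e:

```python
import math

# A sketch of that kind of numerical test: c = 0 approached from the right,
# g(n) = 1/n, f(u) = (1 + 1/u)**u, so u = g(n) -> infinity as n -> 0+.
def g(n):
    return 1.0 / n

def f(u):
    return (1.0 + 1.0 / u) ** u

# Left-hand side: f(g(n)) as n -> 0+.
for n in (0.1, 0.01, 0.001, 1e-6):
    print("f(g(%g)) = %.6f" % (n, f(g(n))))

# Right-hand side: f(u) as u -> +infinity.
for u in (10.0, 100.0, 1000.0, 1e6):
    print("f(%g) = %.6f" % (u, f(u)))

print("both approach e =", math.e)
```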
2007-12-26 00:43:06 · answer #4 · answered by Anonymous
Say lim_{n→c} g(n) = k and lim_{u→k} f(u) = m.
Let e > 0.
By definition of limit, there exists d > 0 such that 0 < |u - k| < d implies
|f(u) - m| < e.
By definition of limit, there exists b > 0 such that 0 < |n - c| < b implies
|g(n) - k| < d, and hence |f(g(n)) - m| < e.
Therefore f(g(n)) has a limit at c, and that limit is m.
2007-12-26 01:20:30 · answer #5 · answered by holdm 7