It is unnecessary to suppose that the sequence converges, or even that it is a sequence in R. ANY sequence in a totally ordered set has a monotone subsequence.
Proof: Let (x₁, x₂, x₃... ) be a sequence in a totally ordered set. Call x_i a peak of this sequence if for every j>i, x_j≤x_i. Then there are two possibilities:
Case 1: There are infinitely many peaks. In this case, let (x_k₁, x_k₂, x_k₃... ) be the subsequence of all the peaks occurring in (x₁, x₂, x₃... ), with k₁ < k₂ < k₃ < ... Since each x_(k_n) is a peak and k_(n+1) > k_n, we have x_(k_(n+1)) ≤ x_(k_n) for every n, so this subsequence is monotone decreasing (non-increasing).
Case 2: There are only finitely many peaks -- then the set of the indices of all the peaks is finite and thus has a maximum. So let k₁ be the first natural number larger than the indices of all the peaks, so that x_k₁ is the first element occurring after all the peaks. Then define a subsequence recursively by letting x_(k_(n+1)) be the first element occurring after x_(k_n) such that x_(k_(n+1)) > x_(k_n). Note that for every n, k_n ≥ k₁, so x_(k_n) occurs after every peak and is therefore not itself a peak; this means there is at least one j > k_n such that x_j > x_(k_n), so it is always possible to continue the recursion. The subsequence (x_k₁, x_k₂, x_k₃...) so defined is monotone increasing (and even strictly so).
Since in either case we may define a monotone subsequence, it follows that every sequence has a monotone subsequence, whether it converges or not (in fact, this lemma is a useful step in proving the Bolzano-Weierstrass theorem).
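To make the peak idea concrete, here is a minimal Python sketch (the function name and sample list are my own, purely illustrative). Note that for a finite list the last element is always vacuously a peak, so the Case 1 construction always succeeds; Case 2 is only needed for genuinely infinite sequences with finitely many peaks.

```python
def peaks(xs):
    """Return the indices i with xs[i] >= xs[j] for every j > i.

    These are the 'peaks' of the proof; for a finite list the last
    index is always (vacuously) a peak, so the result is never empty.
    """
    result = []
    best = None  # running maximum of the tail, scanned right to left
    for i in range(len(xs) - 1, -1, -1):
        if best is None or xs[i] >= best:
            result.append(i)
            best = xs[i]
    return result[::-1]

xs = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3]
idx = peaks(xs)                    # indices of the peaks: 5, 7, 8, 9
decreasing = [xs[i] for i in idx]  # the Case 1 monotone subsequence
print(decreasing)                  # [9, 6, 5, 3]
```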
2007-11-22 13:51:43 · answer #1 · answered by Pascal 7
As far as I remember, you need to take the derivative at that point (L in your case) and study its sign. If the sign is > 0 you should have a converging sequence... I hope I'm right... the idea is that all the elements between u and p are going toward L, and that's why the sequence should be somehow monotone.
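The original question isn't quoted here, but this answer seems to assume a recursion of the form u_(n+1) = f(u_n) with fixed point L. Strictly speaking, convergence requires |f'(L)| < 1; the positive sign of f' is what keeps the iterates on one side of L, i.e. what makes the sequence monotone. A minimal numerical sketch, with a hypothetical f of my own choosing:

```python
import math

# Hypothetical example: the actual f and L from the question are not
# shown here. f(x) = sqrt(x + 2) has fixed point L = 2, and
# f'(x) = 1 / (2 * sqrt(x + 2)) is positive with f'(2) = 1/4 < 1,
# so iterates starting below L increase monotonically toward it.
def f(x):
    return math.sqrt(x + 2)

u = 0.5  # starting value below the fixed point L = 2
for n in range(8):
    print(n, u)
    u = f(u)
# The printed values increase monotonically toward 2: since f is
# increasing and u < f(u) < L holds at the start, it holds at every step.
```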
2007-11-22 17:32:04 · answer #2 · answered by nobody100 4