The previous post is right about the difference between sequences and series. Convergence of a sequence means that the successive terms get closer and closer to a given value as you go up in the sequence. For example, the sequence 1/2^n converges to zero as n increases without bound. Convergence of a series means that the sum of the first n terms gets closer and closer to a given value as the number of terms n in the sum increases without bound. For example, the series 1 + 1/2 + 1/4 + 1/8 + ... + 1/2^n converges to the value 2.
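A quick numeric sketch of the distinction (mine, not from the answer): the sequence 1/2^n heads to 0, while the partial sums of the series 1 + 1/2 + 1/4 + ... head to 2.

```python
def partial_sum(n):
    """Sum of the first n+1 terms 1/2**k, for k = 0..n."""
    return sum(1 / 2**k for k in range(n + 1))

# The sequence: individual terms shrink toward 0.
terms = [1 / 2**n for n in range(10)]

# The series: partial sums climb toward 2.
sums = [partial_sum(n) for n in range(10)]

print(terms[-1])  # tiny
print(sums[-1])   # close to 2
```

Two different limits from the same list of numbers: the terms converge to 0, the running totals converge to 2.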
A sequence or series can fail to converge in a couple of ways. One is that the high-n values can bounce around. The classic examples of this are the sequence (-1)^n and the series 1 - 1 + 1 - 1 + ... + (-1)^n. Another way they can fail to converge is if they actually "diverge" to an infinite absolute value. The series 1 + 2 + 4 + 8 + ... + 2^n does this, as does the sequence 2^n.
For a series to converge, the partial sums that you can imagine making for n = 1, 2, 3, ... have to form a convergent sequence, and the sequence formed by the successive terms of the series has to converge to zero. That second part is actually a theorem (the nth-term test), not just a rule without known counterexamples: if the terms don't go to 0, the partial sums can never settle down. That's why the examples above come in sequence/series pairs.
ABSOLUTE convergence of a series occurs when the series formed by the absolute values of the successive terms also converges. For example, the series 1 - 1/2 + 1/4 - 1/8 + 1/16 - ... is convergent, and it is also absolutely convergent (because the series 1 + 1/2 + 1/4 + 1/8 + ... also converges). If a series converges absolutely, then it certainly converges, but the converse is not always true: the alternating harmonic series 1 - 1/2 + 1/3 - 1/4 + ... converges, but not absolutely.
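A sketch of the absolute-convergence check above (my illustration): the alternating series 1 - 1/2 + 1/4 - 1/8 + ... converges (its sum is 2/3), and replacing every term by its absolute value gives 1 + 1/2 + 1/4 + ..., which also converges (to 2), so the original series is absolutely convergent.

```python
def partial_sums(terms):
    """Running totals of a list of terms."""
    total, out = 0.0, []
    for t in terms:
        total += t
        out.append(total)
    return out

terms = [(-1) ** n / 2**n for n in range(40)]  # 1, -1/2, 1/4, -1/8, ...

alt = partial_sums(terms)                            # approaches 2/3
abs_version = partial_sums([abs(t) for t in terms])  # approaches 2

print(alt[-1], abs_version[-1])
```

Both running totals settle down, which is exactly what "absolutely convergent" demands.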
answer #1 · answered by Benjamin N · 2006-08-08 13:22:33
The difference between a series and a sequence is the "+" sign. A sequence is an ordered list of terms; a series is what you get when you add them up.
Convergence - it approaches a certain number the further along you go (much like a horizontal asymptote).
Divergence - it never settles down to a finite number; it may grow to infinity as the sequence/series continues (much like a vertical asymptote), or it may just keep bouncing around forever.
Absolute convergence - the series still converges even after you replace every term with its absolute value.
1,2,3,4, .... (divergent)
1 + 1/2 + 1/3 + 1/4 + .... (divergent; this is the harmonic series: its terms shrink to 0, but the partial sums still grow without bound)
1 + 1/2 + 2/3 + 3/4 + .... (divergent; the terms approach 1, not 0, so the sum can never settle)
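The harmonic series in the list above is the classic trap: its terms shrink to 0, yet the sum diverges. A small sketch (mine, not from the answer) showing the partial sums growing roughly like log(n) without ever leveling off:

```python
def harmonic(n):
    """Partial sum 1 + 1/2 + 1/3 + ... + 1/n of the harmonic series."""
    return sum(1 / k for k in range(1, n + 1))

print(harmonic(10))      # ~2.93
print(harmonic(10_000))  # ~9.79 -- still climbing, just very slowly
```

Doubling n always adds at least about 0.69 (roughly log 2) to the total, so the partial sums pass any bound eventually.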
answer #2 · answered by Poncho Rio · 2006-08-08 13:33:26
A sequence is a list of numbers in a particular order. Some examples of sequences are
a_k = {0, 1/2, 1, 3/2, ...}
b_k = {-1, 1, -1, 1, ...}
c_k = {1, 1/2, 1/4, 1/8, ...}
In a_k, each term is found by adding 1/2 to the previous term. For b_k, we multiply the previous term by -1. For c_k, we multiply the previous term by 1/2.
Any sequence a_k is said to converge if the limit lim{k-->infinity} (a_k) = L exists and is finite. Otherwise, we say that the sequence diverges.
Series are simply the sums of sequences. Series are very useful in real life because (as you'll learn) we can use them (especially power series) to approximate something like sin(1) to any accuracy. When a calculator evaluates sin(1), it approximates it using the series programmed in for sin(x) and returns a decimal approximation. So, without series many calculations would not be possible.
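The sin(1) point is easy to see in action. A sketch (my illustration) using the standard Taylor series sin(x) = x - x^3/3! + x^5/5! - ..., which is essentially what the answer describes a calculator doing:

```python
import math

def sin_series(x, n_terms=10):
    """Approximate sin(x) by summing the first n_terms of its Taylor series."""
    return sum((-1) ** k * x ** (2 * k + 1) / math.factorial(2 * k + 1)
               for k in range(n_terms))

print(sin_series(1.0))  # matches math.sin(1.0) to many decimal places
print(math.sin(1.0))
```

Because the series converges, adding more terms pins sin(1) down to any accuracy you like; ten terms already agree with `math.sin` beyond double precision.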
answer #3 · answered by tepper · 2016-11-23 16:42:38
A sequence is a pattern of terms like x^2, x^3, x^4, x^5, ...
A series is what you get when you add them together: x^2 + x^3 + x^4 + x^5 + ...
answer #4 · answered by MollyMAM · 2006-08-08 13:05:47
I don't take Calc yet, sorry.
answer #5 · answered by Anonymous · 2006-08-08 13:04:58