u+v, v+w, and u+w will be linearly independent vectors. Suppose, for contradiction, that they were linearly dependent: p(u+v) + q(v+w) + r(u+w) = 0 for some scalars p, q, and r, with at least one of p, q, and r not equal to zero. Collecting terms gives (p+r)u + (p+q)v + (q+r)w = 0. Since u, v, and w are linearly independent, the coefficient of each of u, v, and w must be 0. This implies that:
p+r=0
p+q=0
q+r=0
This system of equations has a unique solution, with p, q, and r all equal to zero. In particular, it has no solution with any of p, q, and r not equal to zero. Thus, there are no nontrivial relations among (u+v), (v+w), and (u+w), and these vectors are linearly independent. Q.E.D.
For u-v, v-w, and u-w, these vectors will never be linearly independent, because a nontrivial relation among them always exists. Specifically: -(u-v) - (v-w) + (u-w) = 0.
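As a quick sanity check of both claims, here is a small numpy sketch. The choice of u, v, w as the standard basis of R^3 is purely illustrative; any linearly independent triple gives the same ranks.

```python
import numpy as np

# Illustrative choice: u, v, w = standard basis of R^3.
u = np.array([1.0, 0.0, 0.0])
v = np.array([0.0, 1.0, 0.0])
w = np.array([0.0, 0.0, 1.0])

# Stack each triple as the columns of a 3x3 matrix; the triple is
# linearly independent exactly when the matrix has full rank (3).
sums = np.column_stack([u + v, v + w, u + w])
diffs = np.column_stack([u - v, v - w, u - w])

print(np.linalg.matrix_rank(sums))   # 3: the sums are independent
print(np.linalg.matrix_rank(diffs))  # 2: the differences are dependent
```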
2006-09-16 09:51:21 · answer #1 · answered by Pascal 7
I assume the questions read:
1: u+v, v+w, u+w
2: u-v, v-w, u-w
The first question boils down to: is the determinant of
1 1 0
0 1 1
1 0 1
non-zero? (This is the matrix that transforms u, v, w into u+v, v+w, u+w.) Its determinant is 2, so the answer is yes.
For the second question, the matrix is
1 -1 0
0 1 -1
1 0 -1
This time the determinant is 0, so the matrix is singular and the vectors u-v, v-w, u-w are linearly dependent.
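The two determinants can be checked directly with numpy (note that the second one comes out 0, confirming that the differences are dependent):

```python
import numpy as np

# Coefficient matrices from the answer above: the rows map u, v, w
# to u+v, v+w, u+w and to u-v, v-w, u-w respectively.
sums = np.array([[1, 1, 0],
                 [0, 1, 1],
                 [1, 0, 1]])
diffs = np.array([[1, -1, 0],
                  [0, 1, -1],
                  [1, 0, -1]])

print(int(round(np.linalg.det(sums))))   # 2: invertible, sums independent
print(int(round(np.linalg.det(diffs))))  # 0: singular, differences dependent
```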
2006-09-16 16:39:06 · answer #2 · answered by helene_thygesen 4
linear independence of three vectors means that none of them can be written as a linear combination of the other two...
so the question is:
can u+v be expressed as a linear sum of v+w & u+w ?
if so, let u+v = a(v+w) + b (u+w) for some scalars, a & b
u + v = av + (a+b) w + b u
or, (1-b)u = (a-1)v + (a+b)w... since u, v & w are linearly independent, this forces 1-b = 0, a-1 = 0 and a+b = 0, i.e. a = 1 and b = 1, but then a+b = 2, not 0... contradiction...
so u+v is not a linear sum of the other two... by symmetry, the same argument works for v+w and u+w, so the first set is linearly independent...
apply similar logic to the second set; there the coefficient equations do have a solution (a = -1, b = 1 in u-v = a(v-w) + b(u-w)), so u-v, v-w & u-w are linearly dependent....
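The coefficient-matching step for the second set can be sketched numerically with numpy's least-squares solver (the variable names a and b follow the answer above):

```python
import numpy as np

# Try u - v = a*(v - w) + b*(u - w).  Matching coefficients of the
# independent vectors u, v, w gives three equations in a and b:
#   u:  b      = 1
#   v:  a      = -1
#   w: -a - b  = 0
M = np.array([[0.0, 1.0],     # coefficient of u:  b
              [1.0, 0.0],     # coefficient of v:  a
              [-1.0, -1.0]])  # coefficient of w: -a - b
rhs = np.array([1.0, -1.0, 0.0])

(a, b), residual, *_ = np.linalg.lstsq(M, rhs, rcond=None)
# a = -1, b = 1 solves all three equations (zero residual), i.e.
# u - v = -(v - w) + (u - w), so the second set is dependent.
print(np.round(a), np.round(b))
```

For the first set the analogous system is inconsistent (nonzero residual), which is the contradiction derived above.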
2006-09-16 16:39:20 · answer #3 · answered by m s 3