
U1 and U2 are subspaces of V. Assume B is a basis of U1 intersect U2. Extend B to B1, a basis of U1 and extend B to B2, a basis of U2.

Prove or disprove:

Is B1 union B2 a basis of U1 + U2?

2006-12-20 10:37:47 · 1 answers · asked by modulo_function 7 in Science & Mathematics Mathematics

1 answer

Let x be in U1 + U2. Then x = x1 + x2 where x1 is in U1 and x2 is in U2. So x1 is a linear combination of the elements of B1 and x2 is a linear combination of the elements of B2. Hence x is a linear combination of the elements of B1 union B2. It is also clear that any linear combination of B1 union B2 can be written as a linear combination of elements of B1, plus a linear combination of elements of B2. So the span of B1 union B2 is U1 + U2.
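As a quick numerical sanity check of the spanning argument (not part of the original answer; the concrete subspaces below are my own illustration, using numpy), take U1 and U2 to be coordinate planes in R^3 and verify that B1 union B2 spans U1 + U2:

```python
import numpy as np

# Hypothetical example in R^3: U1 = span{e1, e2}, U2 = span{e2, e3},
# so U1 intersect U2 = span{e2}.
e1, e2, e3 = np.eye(3)

B  = [e2]        # basis of U1 intersect U2
B1 = [e2, e1]    # B extended to a basis of U1
B2 = [e2, e3]    # B extended to a basis of U2

# B1 union B2 as a set (the shared vector e2 appears only once)
union = [e2, e1, e3]

# The span of the union is U1 + U2, which here is all of R^3,
# so the rank of the matrix with these vectors as columns is 3.
rank_union = np.linalg.matrix_rank(np.column_stack(union))
print(rank_union)  # 3
```

Here rank 3 = dim(U1 + U2), confirming the union spans U1 + U2 in this example.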

Now, suppose that B1 union B2 is linearly dependent. Write B = {b1, b2, ..., bk}, B1\B = {b'1, ..., b'l}, B2\B = {b"1, ..., b"m}. Then there is some linear combination of these which is zero, with not all coefficients zero. Let this combination be
r1b1 + ... + rkbk + s1b'1 + ... + slb'l + t1b"1 + ... + tmb"m = 0.
If every ti were zero, this would be a vanishing linear combination of elements of B1 (which is linearly independent), forcing every ri and sj to be zero as well, contrary to our choice; so at least one ti is non-zero. Similarly, since B2 is linearly independent, at least one sj is non-zero. So we can write
r1b1 + ... + rkbk + s1b'1 + ... + slb'l = x
t1b"1 + ... + tmb"m = -x
where x is a non-zero vector (if x were zero, each equation would be a vanishing linear combination of a linearly independent set, forcing all coefficients to be zero, contrary to our choice above). The first equation shows x is in U1, and the second shows -x, hence x, is in U2, so x is in U1 intersect U2.

But if x is in U1 intersect U2, x can be written as a linear combination of elements of B: x = x1b1 + ... + xkbk.
Then, since B1 is a basis of U1 and x = r1b1 + ... + rkbk + s1b'1 + ... + slb'l, comparing coefficients gives ri = xi for all i and sj = 0 for all j, contradicting the fact that at least one sj is non-zero. (Similarly, writing -x in terms of the basis B2 forces every tj to be zero, contradicting the fact that at least one ti is non-zero.)

Hence the vectors in B1 union B2 cannot be linearly dependent. Since they are linearly independent and have span U1 + U2, they are a basis for U1 + U2.
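The result also agrees with the dimension formula dim(U1 + U2) = dim U1 + dim U2 - dim(U1 intersect U2). A small numerical illustration (my own example, assuming numpy; the specific vectors are hypothetical): the union should be linearly independent, and its size should match the formula:

```python
import numpy as np

# Hypothetical example in R^5, constructed so that U1 intersect U2 = span(B).
B  = [np.array([1., 0, 0, 0, 0])]                     # basis of U1 intersect U2
B1 = B + [np.array([0., 1, 0, 0, 0])]                 # basis of U1
B2 = B + [np.array([0., 0, 1, 0, 0]),
          np.array([0., 0, 0, 1, 0])]                 # basis of U2

union = B + B1[1:] + B2[1:]                           # B1 union B2, each vector once
M = np.column_stack(union)

# Linear independence <=> rank equals the number of vectors
assert np.linalg.matrix_rank(M) == len(union)

# |B1 union B2| = |B1| + |B2| - |B|, matching dim U1 + dim U2 - dim(U1 intersect U2)
assert len(union) == len(B1) + len(B2) - len(B)
```

Both assertions pass for this example, matching the proof: the union is independent and has the expected size.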

2006-12-20 11:18:25 · answer #1 · answered by Scarlet Manuka 7 · 1 0
