
How do I prove that if the rows of matrix A are linearly dependent, then so are the rows of matrix AB?

2006-11-14 00:47:31 · 3 answers · asked by chica1012 2 in Science & Mathematics Mathematics

Assuming A and B are square matrices

2006-11-14 04:09:07 · update #1

3 answers

A row of AB is of the form RB, where R is a row of A. Now, if the rows
R1, R2, ..., Rk of A are linearly dependent, there are constants c1, ..., ck,
not all zero, with
c1*R1 + c2*R2 + ... + ck*Rk = 0.
Now apply B on the right: c1*(R1 B) + c2*(R2 B) + ... + ck*(Rk B) = (c1*R1 + ... + ck*Rk)*B = 0*B = 0, so the same constants give a dependency among the rows of AB.

2006-11-14 04:19:03 · answer #1 · answered by mathematician 7 · 1 0
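
A quick numerical illustration of answer #1's argument (a sketch, not part of the original answer; the specific matrices and the use of numpy are my own choices): the coefficients that witness the dependency among the rows of A also annihilate the rows of AB.

```python
import numpy as np

rng = np.random.default_rng(0)

# Rows R1, R2 chosen freely; R3 = 2*R1 - R2, so c = (2, -1, -1) gives
# c1*R1 + c2*R2 + c3*R3 = 0 with not all coefficients zero.
R1 = rng.standard_normal(4)
R2 = rng.standard_normal(4)
R3 = 2 * R1 - R2
A = np.vstack([R1, R2, R3])
c = np.array([2.0, -1.0, -1.0])

B = rng.standard_normal((4, 4))   # any B of compatible size
AB = A @ B

# The rows of AB are R1*B, R2*B, R3*B, and the same coefficients kill them:
# c1*(R1 B) + c2*(R2 B) + c3*(R3 B) = (c1*R1 + c2*R2 + c3*R3)*B = 0*B = 0.
print(np.allclose(c @ A, 0))    # dependency among the rows of A
print(np.allclose(c @ AB, 0))   # the same dependency among the rows of AB
```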

The problem, as originally stated, does not assume a square matrix, so the determinant argument is not applicable in the general case.

In any case, row reduce your matrix to get a zero row. You can do this exactly because the rows are dependent: write the rows as row vectors, get a linear dependency relation, and use that relation to produce a zero row by row reduction. Row reduction is just left multiplication by certain full-rank matrices, so it does not change the rank of the original matrix. Say the row-reduction matrices have product R, i.e. A = RA' for some matrix A' with a zero row, where R has full rank. Multiplying any matrix B on the left by a matrix with a zero row gives a product that also has a zero row, and hence linearly dependent rows. Therefore AB = RA'B has less than full row rank, i.e. the rows of the matrix AB are dependent.

2006-11-14 03:30:33 · answer #2 · answered by just another math guy 2 · 1 1
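
A small rank check in the spirit of answer #2 (a sketch using numpy; the example matrices are mine, not from the answer): since rank(AB) ≤ rank(A) and the dependent rows force rank(A) below the number of rows, AB cannot have full row rank either.

```python
import numpy as np

rng = np.random.default_rng(1)

# A is 3x5 with dependent rows (row 3 = row 1 + row 2), so rank(A) < 3.
A = rng.standard_normal((3, 5))
A[2] = A[0] + A[1]

B = rng.standard_normal((5, 7))
AB = A @ B

# Left multiplication by full-rank matrices (the row-reduction steps) does not
# change rank, and a zero row forces rank below the number of rows, so
# rank(AB) <= rank(A) < 3, i.e. the rows of AB are dependent.
print(np.linalg.matrix_rank(A))    # 2, less than the 3 rows
print(np.linalg.matrix_rank(AB))   # at most 2
```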

Since A is square (per the asker's update), its rows being linearly dependent means its determinant is 0. The determinant of AB is det(A)*det(B), which is therefore also 0, so the rows of AB are linearly dependent.

2006-11-14 00:55:10 · answer #3 · answered by Anonymous · 1 1
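
A numerical illustration of answer #3's determinant argument, assuming square matrices as in the asker's update (the example matrices are my own, chosen so that one row of A is the sum of two others):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [5.0, 7.0, 9.0]])   # row 3 = row 1 + row 2, so det(A) = 0
B = np.array([[2.0, 0.0, 1.0],
              [1.0, 3.0, 0.0],
              [0.0, 1.0, 4.0]])

print(np.isclose(np.linalg.det(A), 0.0))                # True: rows of A are dependent
print(np.isclose(np.linalg.det(A @ B),
                 np.linalg.det(A) * np.linalg.det(B)))  # True: det is multiplicative
print(np.isclose(np.linalg.det(A @ B), 0.0))            # True: so rows of AB are dependent
```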
