
3 answers

Suppose A is nxm. Let A(i), 1<=i<=n, be the rows of A. The rows are linearly dependent, so there is a nontrivial relation among them; solving that relation for a row whose coefficient is nonzero, there is some k such that
A(k) = c(1)*A(1) + c(2)*A(2) +...+ c(k-1)*A(k-1) + c(k+1)*A(k+1) +...+ c(n)*A(n),
where c(1), c(2), ... are constants. So A(k) can be written as a linear combination of the other rows of A.
The rows of C=AB are C(i)=A(i)B (where again A(i) are the row vectors of A).
By linearity, we get
C(k) = A(k)B = (c(1)A(1) +...+ c(k-1)A(k-1) + c(k+1)A(k+1) +...+ c(n)A(n))B
= c(1)A(1)B +...+ c(k-1)A(k-1)B + c(k+1)A(k+1)B +...+ c(n)A(n)B
= c(1)C(1) +...+ c(k-1)C(k-1) + c(k+1)C(k+1) +...+ c(n)C(n),
so C(k) is a linear combination of the other rows of C.
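This argument can be checked numerically. A minimal sketch with NumPy, using a hypothetical 3x2 matrix A whose third row is 2*A(1) + A(2):

```python
import numpy as np

# Hypothetical example: A[2] = 2*A[0] + A[1], so A's rows are dependent.
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 8.0]])  # 2*[1,2] + [3,4] = [5,8]
B = np.array([[5.0, 6.0, 7.0],
              [8.0, 9.0, 1.0]])
C = A @ B
# The same linear combination holds for the rows of C = AB:
print(np.allclose(C[2], 2*C[0] + C[1]))  # True
```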

2006-09-24 16:09:04 · answer #1 · answered by vinzklorthos 2 · 0 0

The problem, as stated, does not assume a square matrix, so the determinant is not applicable in the general case. Instead, row reduce your matrix to get a zero row. You can do this precisely because the rows are dependent (write the rows as row vectors, get a linear dependency relation, and then use that relation to carry out the row reduction). Row reduction is just left multiplication by certain matrices of full rank, so it does not change the rank of the original matrix. Say your row reduction matrices have product R, i.e. RA' = A for some matrix A' with a zero row and R of full rank. Multiplying any matrix B on the left by a matrix that has a row of zeros gives a matrix that itself has a zero row, and hence linearly dependent rows. Therefore AB = RA'B has the same rank as A'B, which is less than full, i.e. the rows of the matrix AB are dependent.
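The rank inequality behind this answer can be illustrated numerically. A hypothetical example in NumPy (the matrices and the random B are made up for illustration, not part of the answer):

```python
import numpy as np

# A's rows are dependent (row 1 = 2 * row 0), so rank(A) < 3.
A = np.array([[1., 2., 3.],
              [2., 4., 6.],
              [0., 1., 1.]])
B = np.random.default_rng(0).standard_normal((3, 4))
# rank(AB) <= rank(A) < 3, so the 3 rows of AB are dependent.
print(np.linalg.matrix_rank(A))      # 2
print(np.linalg.matrix_rank(A @ B))  # at most 2
```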

2016-12-15 13:42:34 · answer #2 · answered by starich 4 · 0 0

Let v1,...,vK be the rows of the matrix A. Then the rows of the matrix AB are (v1 . B),...,(vK . B). Now because v1,...,vK are known to be linearly dependent, there are scalars a1,...,aK, not all zero, such that
a1.v1 +...+ aK.vK = 0.
Multiplying both sides of the equation above by B from the right, we get
a1(v1 . B) +...+ aK(vK . B) = 0 . B = 0,
which means that (v1 . B),...,(vK . B) are linearly dependent.
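This is the cleanest argument, and it transfers the dependency vector directly. A quick sketch with NumPy, using hypothetical numbers where the relation v1 + v2 - v3 = 0 holds among the rows of A:

```python
import numpy as np

# Hypothetical example: row 2 = row 0 + row 1, so a = (1, 1, -1)
# is a nontrivial dependency relation among A's rows.
A = np.array([[1., 0., 2.],
              [0., 1., 1.],
              [1., 1., 3.]])
a = np.array([1., 1., -1.])
B = np.array([[1., 4.],
              [2., 5.],
              [3., 6.]])
print(np.allclose(a @ A, 0))        # True: relation among rows of A
print(np.allclose(a @ (A @ B), 0))  # True: same relation among rows of AB
```

The second check is exactly the answer's step: (a @ A) @ B = 0 @ B = 0.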

2006-09-24 16:11:44 · answer #3 · answered by firat c 4 · 0 0
