
Prove that two vectors are linearly dependent if and only if one is a scalar multiple of the other. (Hint: Separately consider the case where one of the vectors is 0).

2007-04-01 15:15:21 · 1 answer · asked by Anonymous in Science & Mathematics > Mathematics

1 answer

Suppose one vector is a scalar multiple of the other, say u = kv for some scalar k (the case v = ku is symmetric). Then 1·u − kv = 0 is a dependence relation whose coefficient on u is 1 ≠ 0, so u and v are linearly dependent. Conversely, suppose u and v are linearly dependent, i.e. k₁u + k₂v = 0 for some scalars k₁ and k₂, not both zero. If k₁ = 0, then k₂ ≠ 0 (since the scalars are not both zero), so k₂v = 0 forces v = 0; thus v = 0·u and we are done. If k₁ ≠ 0, the equation rearranges to u = (−k₂/k₁)v, and again one vector is a scalar multiple of the other. Thus two vectors are linearly dependent iff one is a scalar multiple of the other. Q.E.D.
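To see the converse direction in action, here is a small worked example in ℝ² (the vectors are chosen purely for illustration): take u = (2, 4) and v = (1, 2). These satisfy 1·u + (−2)·v = 0, a dependence relation with k₁ = 1 and k₂ = −2. Since k₁ ≠ 0, the rearrangement in the proof gives u = (−k₂/k₁)v = 2v, so u is indeed a scalar multiple of v.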

2007-04-01 15:39:18 · answer #1 · answered by Pascal 7
