Think of x1 as the variable x, and x2 as the variable y. That means we really have the two equations
x + 2y = 3
-2x + y = 4
I'm going to use x1 and x2 from now on, though.
The corresponding matrix would be
[ 1 2] [x1] = [3]
[-2 1] [x2] = [4]
If we let
A = [ 1 2]
    [-2 1]
x = [x1]
    [x2]
b = [3]
    [4]
Then, we have
Ax = b. Multiplying both sides on the left by A^(-1), we have
x = A^(-1)b
So our problem is solving for A^(-1).
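(Side note, not part of the hand calculation: if you want to verify each step numerically, a quick Python/NumPy sketch like the following works as a calculator.)

import numpy as np

# Coefficient matrix and right-hand side of the system
#   x1 + 2*x2 = 3
#  -2*x1 +  x2 = 4
A = np.array([[1.0, 2.0],
              [-2.0, 1.0]])
b = np.array([3.0, 4.0])
# Plan: compute A^(-1), then x = A^(-1) b.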
To find the inverse of a matrix, we can use the following identity:
A^(-1) = (1 / det(A)) * adj(A)
where det(A) is the determinant of A, and
adj(A) is the adjoint (also called the adjugate) of A.
First, let's solve for the determinant of A.
det [ 1 2]
    [-2 1]
This equals (1)(1) - (-2)(2) = 1 + 4 = 5.
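(Continuing the optional Python sketch from above, the 2x2 determinant formula gives the same value NumPy reports:)

# det of a 2x2 matrix [[a, b], [c, d]] is a*d - b*c
det_A = A[0, 0] * A[1, 1] - A[0, 1] * A[1, 0]
print(det_A)             # 5.0
print(np.linalg.det(A))  # approximately 5 (floating point)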
adj [ 1 2]
    [-2 1]
What you're going to get is another square matrix. Let's do the entries one at a time, shall we?
To get the entry (1,1), first note that its sign depends on its row and column: (-1)^(1 + 1) = (-1)^2 = 1.
Next, delete the 1st row and 1st column of the matrix, and take the determinant of what's left. Since we're left with just the bottom-right number, our answer is 1. Therefore, our adjoint so far is
[1 ?]^T
[? ?]
(We have to transpose the matrix in the end.)
We do the same for the first row, second column. The sign of this one is going to be negative, since (-1)^(1 + 2) = (-1)^3 = -1.
Upon deletion of the first row and second column, we're left with the bottom-left value, which is -2. We negate -2 (due to the sign) to get 2. Our matrix so far is now:
[1 2]^T
[? ?]
Now, let's get the entry in the 2nd row, 1st column. The sign is going to be negative. Deleting the 2nd row and 1st column leaves us with 2, so the entry is -2 (due to the sign):
[1 2]^T
[-2 ?]
Lastly, the (2,2) entry: the sign is positive, and deleting the 2nd row and 2nd column leaves 1, so we have a 1.
[1 2]^T
[-2 1]
We take the transpose of this matrix, and that's what the adjoint of A is going to be.
[1 -2]
[2 1]
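(In the optional Python sketch, the cofactor/transpose steps look like this, reusing the A defined earlier:)

# Cofactor matrix of a 2x2 [[a, b], [c, d]] is [[d, -c], [-b, a]];
# the adjoint (adjugate) is its transpose, [[d, -b], [-c, a]].
cof = np.array([[ A[1, 1], -A[1, 0]],
                [-A[0, 1],  A[0, 0]]])
adj_A = cof.T
print(adj_A)  # [[ 1. -2.]
              #  [ 2.  1.]]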
Recall that
A^(-1) = (1/det(A)) adj(A)
We know what det(A) is; it's 5, so we have
A^(-1) = (1/5) [1 -2]
               [2  1]
Applying scalar multiplication, we get
A^(-1) = [1/5 -2/5]
         [2/5  1/5]
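(Python sketch again: dividing the adjoint by the determinant gives the same matrix as NumPy's built-in inverse, up to floating point:)

A_inv = adj_A / det_A
print(A_inv)             # [[ 0.2 -0.4]
                         #  [ 0.4  0.2]]
print(np.linalg.inv(A))  # same values, up to floating point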
Now that we have the inverse, we can compute
x = A^(-1)b. Therefore
[x1] = [1/5 -2/5] [3]
[x2] = [2/5 1/5] [4]
Doing the matrix multiplication,
[x1] = [3/5 - 8/5]
[x2] = [6/5 + 4/5]
[x1] = [-5/5]
[x2] = [10/5]
[x1] = [-1]
[x2] = [2]
Therefore
x1 = -1
x2 = 2
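(Closing out the optional Python sketch: multiplying A^(-1) by b, or calling NumPy's solver directly, gives the same x1 = -1, x2 = 2, and substituting back recovers b:)

x = A_inv @ b
print(x)                      # [-1.  2.]
print(np.linalg.solve(A, b))  # [-1.  2.]
print(A @ x)                  # [3. 4.], so the solution checks out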
answer #1 · answered by Puggy 7 · 2007-01-27 18:28:40
To save space, I will write matrices as arrays of row vectors. So for instance:
[a, b]
[c, d]
will be written as [[a, b], [c, d]].
You have:
x_1 + 2x_2 = 3
-2x_1 + x_2 = 4
Which is, in matrix notation:
Ax = b, where A = [[1, 2], [-2, 1]], b = [[3], [4]], and x = [[x_1], [x_2]]. You want to find x, and you can do it by left-multiplying both sides of the equation by A^(-1). So you need to find A^(-1). You know that:
A^(-1) = 1/det(A) * adj(A)
where adj(A) denotes the adjugate matrix (also known as the adjoint matrix, although sometimes adjoint is used to denote the conjugate transpose instead). The matrix of cofactors of A is [[1, 2], [-2, 1]], which in this case happens to be A itself. This will not happen in general. Adj(A) is the transpose of this matrix, which is [[1, -2], [2, 1]]. The determinant of A is 5, so dividing this matrix by 5 yields A^(-1), which is [[1/5, -2/5], [2/5, 1/5]].
Now left-multiply both sides by A^(-1). This gives you x = A^(-1)b. You then calculate A^(-1)b with normal matrix multiplication. You get [[-1], [2]], so x_1 = -1 and x_2 = 2. Substituting these values back in reveals that this is the correct solution to the system of equations.
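(Not part of the original answer, but the whole computation can be checked in a few lines of Python/NumPy, using the [[row], [row]] notation from above:)

import numpy as np

A = np.array([[1, 2], [-2, 1]], dtype=float)
b = np.array([[3], [4]], dtype=float)  # column vector [[3], [4]]

adj_A = np.array([[1, -2], [2, 1]], dtype=float)  # transpose of the cofactor matrix
A_inv = adj_A / np.linalg.det(A)
x = A_inv @ b
print(x)                      # approximately [[-1.], [2.]]
print(np.allclose(A @ x, b))  # True: x satisfies Ax = b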
answer #2 · answered by Pascal 7 · 2007-01-27 18:15:50
Ah, that was a while back... I do remember that I learned a lot of algebra stuff on YouTube recently. You could try that out; it worked wonders for me.
answer #3 · answered by Beth 4 · 2016-05-24 07:54:39