discover how simple notation can embody clever ideas that ease our efforts; in particular, matrix notation enables us to solve systems of equations as if they were simple, invertible equations.
By the end of this section you will have seen some generalizations of concepts and methods learned in previous algebra classes. You should also
know the form of the matrix identity,
be able to calculate the matrix inverse for some matrices, and
be able to write the transpose of a matrix.
Subsection 1.3.1 From Old to New
This problem provides one motivation for a matrix inverse. We will develop the method for calculating one later.
Checkpoint 1.3.1.
(a)
What arithmetic would you perform to solve \(5x=7\text{?}\)
(b)
After this arithmetic what number is technically still in front of \(x\text{?}\)
(c)
For all numbers \(x\text{,}\) \(1x=\text{?}\)
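For comparison with the matrix version developed below, the scalar computation can be written out as
\begin{align*}
\tfrac{1}{5}(5x) = & \tfrac{1}{5}(7)\\
1x = & \tfrac{7}{5}\\
x = & \tfrac{7}{5}.
\end{align*}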
We can determine whether there is a matrix version of 1 (that is, a matrix with the properties above) by setting up a product and solving the resulting system. This provides a connection to Section 1.1.
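For concreteness, the product might be set up as follows (a sketch; arranging the unknowns into rows \(a_i\text{,}\) \(b_i\text{,}\) \(c_i\) is an assumption made here for illustration):
\begin{equation*}
\begin{bmatrix} a_1 & a_2 & a_3\\ b_1 & b_2 & b_3\\ c_1 & c_2 & c_3 \end{bmatrix}
\begin{bmatrix} x_1\\ x_2\\ x_3 \end{bmatrix} =
\begin{bmatrix} x_1\\ x_2\\ x_3 \end{bmatrix}.
\end{equation*}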
If this matrix is the matrix form of 1, for what values of \(x_i\text{,}\) \(i=1,2,3\text{,}\) should the previous equation be true?
(c)
Look at the three equations and set each \(x_i\) to either 0 or 1, so that the solution for \(a_1\) is easy to find.
For example select \([0,0,0]\) or \([1,1,0]\) or something similar.
(d)
Use this vector to solve for \(a_1\text{.}\)
(e)
Repeat (make similar choices) until you have solved for each \(a_i\text{.}\)
(f)
Repeat the previous two steps to solve for the \(b_i\) and \(c_i\text{.}\)
(g)
Use these nine values to write the matrix which is the multiplicative identity.
For \(5x=7\) we use \(1/5\text{,}\) because \(1/5(5)=1\text{.}\) For \(A\vec{x}=\vec{b}\) we want \(B\) such that \(BA\) is the identity matrix we just found. Find the matrix equivalent of \(1/A\) by doing the following.
Note that this is an equation showing that the two matrices are equal. This means each of the nine entries on the left side is equal to the corresponding entry on the right side. Write down all the equations from the entries of the matrices involving the variables \(a_i\) for \(i=1,2,3\text{.}\)
(c)
Solve this system of equations.
(d)
Repeat these two steps to solve for the \(b_i\) and \(c_i\text{.}\)
(e)
How would this change for another matrix? Pick a random matrix and try it.
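As an illustration of this process, consider a small \(2\times 2\) example chosen just for this sketch. To find the inverse of \(\begin{bmatrix} 2 & 1\\ 1 & 1 \end{bmatrix}\) we require
\begin{equation*}
\begin{bmatrix} a_1 & a_2\\ b_1 & b_2 \end{bmatrix}
\begin{bmatrix} 2 & 1\\ 1 & 1 \end{bmatrix} =
\begin{bmatrix} 1 & 0\\ 0 & 1 \end{bmatrix},
\end{equation*}
which gives the equations \(2a_1+a_2=1\text{,}\) \(a_1+a_2=0\text{,}\) \(2b_1+b_2=0\text{,}\) and \(b_1+b_2=1\text{.}\) Solving yields \(a_1=1\text{,}\) \(a_2=-1\text{,}\) \(b_1=-1\text{,}\) \(b_2=2\text{.}\)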
Definition 1.3.4. Matrix Identity.
A matrix \(I\) such that \(IA=AI=A\) for all compatible matrices \(A\) is called the identity matrix.
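For example, in the \(3\times 3\) case the identity matrix is
\begin{equation*}
I = \begin{bmatrix} 1 & 0 & 0\\ 0 & 1 & 0\\ 0 & 0 & 1 \end{bmatrix},
\end{equation*}
with ones on the diagonal and zeros elsewhere.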
Definition 1.3.5. Matrix Inverse.
A matrix \(B\) such that \(AB=BA=I\) for some matrix \(A\) is called the inverse of \(A\text{,}\) denoted \(A^{-1}\text{.}\)
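Continuing the \(2\times 2\) sketch above, a direct check shows
\begin{equation*}
\begin{bmatrix} 2 & 1\\ 1 & 1 \end{bmatrix}
\begin{bmatrix} 1 & -1\\ -1 & 2 \end{bmatrix} =
\begin{bmatrix} 1 & -1\\ -1 & 2 \end{bmatrix}
\begin{bmatrix} 2 & 1\\ 1 & 1 \end{bmatrix} =
\begin{bmatrix} 1 & 0\\ 0 & 1 \end{bmatrix},
\end{equation*}
so each of these matrices is the inverse of the other.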
Subsection 1.3.2 Properties of Matrix Inverses
Certain properties just seem right. The next one can be phrased as “The inverse of an inverse is the original matrix.”
Checkpoint 1.3.6.
The identity matrix is denoted \(I,\) and the inverse of a matrix \(A\) is denoted \(A^{-1}\text{.}\) Suppose \(A^{-1}\) exists. Thus \(A^{-1}A = I, \) and \(AA^{-1} = I. \) Now use these two statements to show (using algebra) that \((A^{-1})^{-1}=A\text{.}\)
The next property is only true for square matrices. It can be read as “the left inverse is the same matrix as the right inverse.”
Theorem 1.3.7.
If \(AM=I=MB\) for square matrices then \(A=B\text{.}\)
Proof.
\begin{align*}
AM = & MB. & \text{We start here, because it is all we have.}\\
AMB = & MBB. & \text{Multiply on the right by } B\text{; this gives us } MB \text{ on both sides.}\\
A(MB) = & (MB)B.\\
AI = & IB. & \text{This we were given, and it leaves us with}\\
A = & B.
\end{align*}
Checkpoint 1.3.8.
Where was the requirement that the matrices be square used in the proof of Theorem 1.3.7?
Note that this proof is why we call this subject linear algebra: it looks like a version of the algebra we did in high school, just applied to matrices.
The next property concerns the inverse of a product of matrices.
Checkpoint 1.3.9.
Remembering that \(B\) is the inverse of \(A\) iff \(BA=AB=I,\) prove \((CD)^{-1}=D^{-1}C^{-1}\text{.}\)
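A sketch of one half of the required check (the product in the other order is handled the same way):
\begin{align*}
(CD)(D^{-1}C^{-1}) = & C(DD^{-1})C^{-1}\\
 = & CIC^{-1}\\
 = & CC^{-1}\\
 = & I.
\end{align*}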