Section 1.3 Matrix Inverse

Goals.

We will
  • discover how well-chosen notation can embody clever ideas that ease our work; in particular,
  • see how matrix notation lets us solve systems of equations the way we solve simple, invertible equations.
By the end of this section you will have seen some generalization of concepts and methods learned in previous algebra classes. You should also
  • know the form of the matrix identity,
  • be able to calculate the matrix inverse for some matrices, and
  • be able to write the transpose of a matrix.

Subsection 1.3.1 From Old to New

This problem provides one motivation for the matrix inverse. We will develop a method for calculating it later.

Checkpoint 1.3.1.

(a)
What arithmetic would you perform to solve \(5x=7\text{?}\)
(b)
After this arithmetic what number is technically still in front of \(x\text{?}\)
(c)
For all numbers \(x,\) \(1x=\text{?}\)
We can determine whether there is a matrix version of 1 (that is, a matrix with the properties above) by setting up a product and solving what results. This provides a connection to Section 1.1.

Checkpoint 1.3.2.

(a)
Set up the equations for \(\left[ \begin{array}{rrr} a_1 & b_1 & c_1 \\ a_2 & b_2 & c_2 \\ a_3 & b_3 & c_3 \end{array} \right]\left[\begin{array}{r} x_1 \\ x_2 \\ x_3 \end{array}\right] = \left[\begin{array}{r} x_1 \\ x_2 \\ x_3 \end{array}\right]. \)
(b)
If this matrix is the matrix form of 1, for what values of \(x_i\text{,}\) \(i=1,2,3\text{,}\) should the previous equation be true?
(c)
Look at the three equations and set each \(x_i\) to either 0 or 1, so that the solution for \(a_1\) is easy to find.
For example select \([1,0,0]\) or \([1,1,0]\) or something similar.
(d)
Use this vector to solve for \(a_1\text{.}\)
(e)
Repeat (make similar choices) until you have solved for each \(a_i\text{.}\)
(f)
Repeat the previous two steps to solve for the \(b_i\) and \(c_i\text{.}\)
(g)
Use these nine values to write the matrix which is the multiplicative identity.
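The identity matrix found above can also be checked numerically. The text itself does no programming, so the following NumPy sketch is purely illustrative: it confirms that this matrix leaves any vector unchanged, just as multiplying a number by 1 does.

```python
import numpy as np

# The multiplicative identity from the checkpoint: ones on the
# diagonal, zeros everywhere else.
I = np.array([[1, 0, 0],
              [0, 1, 0],
              [0, 0, 1]])

x = np.array([5, -2, 7])  # an arbitrary vector

# I @ x reproduces x, just as 1*x = x for numbers.
print(I @ x)
print((I @ x == x).all())  # True
```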
For \(5x=7\) we use \(1/5\text{,}\) because \(1/5(5)=1\text{.}\) For \(A\vec{x}=\vec{b}\) we want \(B\) such that \(BA\) is the identity matrix we just found. Find the matrix equivalent of \(1/A\) by doing the following.

Checkpoint 1.3.3.

(a)
Multiply (left side) \(\left[ \begin{array}{rrr} a_1 & b_1 & c_1 \\ a_2 & b_2 & c_2 \\ a_3 & b_3 & c_3 \end{array} \right] \left[ \begin{array}{rrr} 2 & 1 & 4 \\ 1 & 2 & 2 \\ 3 & -3 & -2 \end{array} \right]=\left[ \begin{array}{rrr} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{array} \right]\text{.}\)
(b)
Note that this is an equation showing that the two matrices are equal. This means each of the nine entries on the left side equals the corresponding entry on the right side. Write down all the equations from the entries of the matrices involving the variables \(a_i\) for \(i=1,2,3\text{.}\)
(c)
Solve this system of equations.
(d)
Repeat these two steps to solve for the \(b_i\) and \(c_i\text{.}\)
(e)
How would this change for another matrix? Pick a random matrix and try.
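A numerical package can confirm the hand computation. As an illustrative sketch (NumPy is our choice here, not part of the text), the following computes the inverse of the matrix from Checkpoint 1.3.3 and verifies that its product with the original matrix is the identity.

```python
import numpy as np

# The matrix from Checkpoint 1.3.3.
A = np.array([[2, 1, 4],
              [1, 2, 2],
              [3, -3, -2]], dtype=float)

# numpy finds the matrix B with B @ A = A @ B = I.
B = np.linalg.inv(A)

# Both products should be the 3x3 identity, up to round-off.
print(np.allclose(B @ A, np.eye(3)))  # True
print(np.allclose(A @ B, np.eye(3)))  # True
```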

Definition 1.3.4. Matrix Identity.

A matrix \(I\) such that \(IA=AI=A\) for all compatible matrices \(A\) is called the identity matrix.

Definition 1.3.5. Matrix Inverse.

A matrix \(B\) such that \(AB=BA=I\) for some matrix \(A\) is called the inverse of \(A\) denoted \(A^{-1}\text{.}\)
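Just as \(x=(1/5)7\) solves \(5x=7\text{,}\) multiplying by \(A^{-1}\) solves \(A\vec{x}=\vec{b}\text{.}\) A short NumPy sketch (the library choice and the right-hand side are ours, for illustration only):

```python
import numpy as np

A = np.array([[2, 1, 4],
              [1, 2, 2],
              [3, -3, -2]], dtype=float)
b = np.array([1.0, 2.0, 3.0])  # an arbitrary right-hand side

# x = A^{-1} b, mirroring x = (1/5)*7 for the scalar equation.
x = np.linalg.inv(A) @ b

# Check: A x reproduces b.
print(np.allclose(A @ x, b))  # True
```

In numerical practice one would call `np.linalg.solve(A, b)` rather than form the inverse explicitly, but the algebra is the same.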

Subsection 1.3.2 Properties of Matrix Inverses

Certain properties just seem right. The next one can be phrased as “The inverse of an inverse is the original matrix.”

Checkpoint 1.3.6.

The identity matrix is denoted \(I,\) and the inverse of a matrix \(A\) is denoted \(A^{-1}\text{.}\) Suppose \(A^{-1}\) exists. Thus \(A^{-1}A = I, \) and \(AA^{-1} = I. \) Now use these two statements to show (using algebra) that \((A^{-1})^{-1}=A\text{.}\)
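The same fact can be observed numerically. A small NumPy check (our illustration, not part of the text), reusing the matrix from Checkpoint 1.3.3:

```python
import numpy as np

A = np.array([[2, 1, 4],
              [1, 2, 2],
              [3, -3, -2]], dtype=float)

A_inv = np.linalg.inv(A)

# Inverting the inverse recovers the original matrix, up to round-off.
print(np.allclose(np.linalg.inv(A_inv), A))  # True
```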
The next property is only true for square matrices. It can be read as “the left inverse is the same matrix as the right inverse.”

Theorem 1.3.7.

If \(M\) is a square matrix with \(AM=I\) and \(MB=I\text{,}\) then \(A=B\text{.}\)

Proof.

\begin{align*}
AM = & MB. & \text{We start here, because it is all we have.}\\
AMB = & MBB. & \text{This gives us } MB \text{ on both sides.}\\
A(MB) = & (MB)B.\\
AI = & IB. & \text{This we were given, and it leaves us with}\\
A = & B.
\end{align*}

Checkpoint 1.3.8.

Where was the requirement that the matrices be square used in the proof of Theorem 1.3.7?
Note that this proof illustrates why we call the subject linear algebra: it looks like a version of the algebra we did in previous classes, just applied to matrices.
The next property is calculation of the inverse of a product of matrices.

Checkpoint 1.3.9.

Remembering that \(B\) is the inverse of \(A\) iff \(BA=AB=I,\) prove \((CD)^{-1}=D^{-1}C^{-1}\text{.}\)
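After the proof, a quick numerical spot check of the reversed order can be reassuring. The matrices below are our own illustrative choices:

```python
import numpy as np

C = np.array([[2, 1],
              [5, 3]], dtype=float)  # det = 1, so invertible
D = np.array([[1, 4],
              [1, 5]], dtype=float)  # det = 1, so invertible

lhs = np.linalg.inv(C @ D)
rhs = np.linalg.inv(D) @ np.linalg.inv(C)  # note the reversed order

print(np.allclose(lhs, rhs))  # True
```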

Subsection 1.3.3 Transposition

Definition 1.3.10. Matrix Transposition.

The following operation, which interchanges the rows and columns of a matrix, is called transposition. The entry in row \(i\text{,}\) column \(j\) of \(A^T\) is the entry in row \(j\text{,}\) column \(i\) of \(A\text{.}\)
\begin{equation*} \left[ \begin{array}{cccc} a_{11} & a_{12} & \ldots & a_{1n} \\ a_{21} & a_{22} & \ldots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{m1} & a_{m2} & \ldots & a_{mn} \end{array} \right]^T = \left[ \begin{array}{cccc} a_{11} & a_{21} & \ldots & a_{m1} \\ a_{12} & a_{22} & \ldots & a_{m2} \\ \vdots & \vdots & \ddots & \vdots \\ a_{1n} & a_{2n} & \ldots & a_{mn} \end{array} \right]. \end{equation*}

Example 1.3.11.

\begin{equation*} \left[ \begin{array}{rr} 1 & -7 \\ 2 & -4 \end{array} \right]^T = \left[ \begin{array}{rr} 1 & 2 \\ -7 & -4 \end{array} \right]. \end{equation*}
Each of the following can be proved by simply doing the operations on both sides and comparing the results. Do so.

Checkpoint 1.3.12.

(a)
\((A^T)^T=A\text{.}\)
(b)
\((A+B)^T=A^T+B^T\text{.}\)
(c)
\((AB)^T=B^TA^T\text{.}\)
(d)
\((A^T)^{-1}=(A^{-1})^T\text{.}\)
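After proving the four properties by hand, they can also be spot-checked numerically. The matrices below are illustrative (the first is the matrix of Example 1.3.11); NumPy is our choice of tool, not the text's.

```python
import numpy as np

A = np.array([[1, -7],
              [2, -4]], dtype=float)  # the matrix of Example 1.3.11
B = np.array([[3, 0],
              [1, 2]], dtype=float)

assert (A.T.T == A).all()                                   # (a)
assert ((A + B).T == A.T + B.T).all()                       # (b)
assert ((A @ B).T == B.T @ A.T).all()                       # (c)
assert np.allclose(np.linalg.inv(A.T), np.linalg.inv(A).T)  # (d)
print("all four properties verified")
```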