
Section 1.4 Dependence

Goals.

We will
  • use solutions to linear systems (\(A\vec{x}=\vec{b}\)) to show a relationship between the vectors from \(A\) and \(\vec{b}\text{,}\) and
  • from this experiment, discover the definition of one of the fundamental concepts of linear algebra.
By the end of this section
  • you will know the definition of linear independence,
  • you will know how linear independence affects the solution set of a linear system, and
  • you will be able to test whether a set of vectors, which may be the rows or columns of a matrix, is linearly independent.

Subsection 1.4.1 Discovering a Relationship

By comparing two cases we will discover a property that is fundamental to linear algebra. This is part one of the experiment.

Activity 1.4.1.

\begin{equation*} A=\left[ \begin{array}{rrr} 1 & 0 & 1 \\ 0 & 1 & 7 \\ 1 & 7 & 1 \end{array} \right], \;\; \vec{x}=\left[\begin{array}{r} x_1 \\ x_2 \\ x_3 \end{array}\right]. \end{equation*}
\begin{equation*} \vec{b}_1=\left[\begin{array}{r} 2 \\ 9 \\ 16 \end{array}\right], \vec{b}_2=\left[\begin{array}{r} 4 \\ 9 \\ 18 \end{array}\right], \vec{b}_3=\left[\begin{array}{r} -1 \\ -16 \\ -15 \end{array}\right] \end{equation*}
(a)
Solve \(A\vec{x}=\vec{b}_i\) for each vector \(\vec{b}_i\text{.}\)
(b)
Find coefficients \(a,b,c\) such that \(a\vec{c}_1+b\vec{c}_2+c\vec{c}_3=\vec{b}_i\) where \(\vec{c}_i\) are the columns of \(A\text{.}\)
Hint.
When we see a question for which we may not have a memorized method, we can start by simply writing it down. Insert the specific vectors, leave the constants as variables, and see what you have. You will discover you do know how to solve this.
(c)
Compare the results of these two steps.
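After working parts (a) and (b) by hand, the arithmetic can be checked numerically. The following is a minimal sketch, assuming NumPy is available; it solves each system and confirms that the same solution vector also serves as the coefficients of the columns of \(A\text{.}\)

```python
import numpy as np

# Matrix and right-hand sides from Activity 1.4.1.
A = np.array([[1, 0, 1],
              [0, 1, 7],
              [1, 7, 1]], dtype=float)
b1 = np.array([2, 9, 16], dtype=float)
b2 = np.array([4, 9, 18], dtype=float)
b3 = np.array([-1, -16, -15], dtype=float)

for b in (b1, b2, b3):
    # Part (a): solve A x = b.
    x = np.linalg.solve(A, b)
    # Part (b): the same x gives the coefficients of the columns of A,
    # i.e. x[0]*c1 + x[1]*c2 + x[2]*c3 = b.
    combo = x[0] * A[:, 0] + x[1] * A[:, 1] + x[2] * A[:, 2]
    assert np.allclose(combo, b)
    print(x)
```

That the two parts produce the same numbers is exactly the comparison part (c) asks for.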
This is part two of the experiment.

Activity 1.4.2.

\begin{equation*} A=\left[ \begin{array}{rrr} 2 & 1 & 4 \\ 1 & 2 & 2 \\ 4 & 5 & 8 \end{array} \right], \;\; \vec{x}=\left[\begin{array}{r} x_1 \\ x_2 \\ x_3 \end{array}\right]. \end{equation*}
\begin{equation*} \vec{b}_1=\left[\begin{array}{r} 7 \\ 5 \\ 17 \end{array}\right], \vec{b}_2=\left[\begin{array}{r} 4 \\ 2 \\ 8 \end{array}\right], \vec{b}_3=\left[\begin{array}{r} 9 \\ 9 \\ 18 \end{array}\right] \end{equation*}
(a)
Solve \(A\vec{x}=\vec{b}_i\) for each vector \(\vec{b}_i\text{.}\)
(b)
Find coefficients \(a,b,c\) such that \(a\vec{c}_1+b\vec{c}_2+c\vec{c}_3=\vec{b}_i\) where \(\vec{c}_i\) are the columns of \(A\text{.}\)
(c)
Compare the results of these two steps.
(d)
Explain why there is no solution for some vectors.
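A numerical check can confirm what part (d) asks you to explain. In this activity the rows of \(A\) satisfy a relation (row 3 equals row 1 plus twice row 2), so \(A\) is singular, and a system \(A\vec{x}=\vec{b}\) is consistent exactly when augmenting \(A\) with \(\vec{b}\) does not raise the rank. A sketch, assuming NumPy:

```python
import numpy as np

# Matrix and right-hand sides from Activity 1.4.2.
A = np.array([[2, 1, 4],
              [1, 2, 2],
              [4, 5, 8]], dtype=float)
b1 = np.array([7, 5, 17], dtype=float)
b2 = np.array([4, 2, 8], dtype=float)
b3 = np.array([9, 9, 18], dtype=float)

# Rows satisfy r3 = r1 + 2*r2, so A is singular (determinant
# effectively 0) and np.linalg.solve would fail here.
print(np.linalg.det(A))

# A x = b is consistent exactly when appending b as an extra column
# does not increase the rank of the matrix.
for b in (b1, b2, b3):
    aug = np.column_stack([A, b])
    print(np.linalg.matrix_rank(aug) == np.linalg.matrix_rank(A))
```

The loop reports that the systems for \(\vec{b}_1\) and \(\vec{b}_2\) are consistent while the one for \(\vec{b}_3\) is not.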
The answer to this question is the fundamental concept of this section.
Having compared the two cases, we next review matrix multiplication, which will connect this concept to a previous one.

Activity 1.4.3.

We will consider the equation \(A\vec{x}=\vec{b}\text{.}\)
(a)
In the product \(A\vec{x}\text{,}\) to what is \(x_1\) (the first element of \(\vec{x}\)) multiplied?
(b)
To what are the rest of the \(x_i\) (other elements of \(\vec{x}\)) multiplied?
(c)
In light of these responses, what is \(\vec{b}\) in terms of \(A\text{?}\)
(d)
What is true of the rows/columns of \(A\) when solutions to this equation are not unique?
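The observation behind this activity is that \(A\vec{x}\) multiplies each \(x_i\) against column \(i\) of \(A\) and adds the results, so \(\vec{b}\) is a linear combination of the columns of \(A\text{.}\) A quick check in NumPy, using the matrix from Activity 1.4.1 and an arbitrary illustrative vector:

```python
import numpy as np

A = np.array([[1, 0, 1],
              [0, 1, 7],
              [1, 7, 1]])
x = np.array([2, -1, 3])   # arbitrary illustrative vector

# A x equals x[0]*(column 1) + x[1]*(column 2) + x[2]*(column 3).
by_columns = x[0] * A[:, 0] + x[1] * A[:, 1] + x[2] * A[:, 2]
assert np.array_equal(A @ x, by_columns)
print(A @ x)
```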

Subsection 1.4.2 Properties

In this subsection we connect this concept to matrices, specifically to non-square matrices.

Activity 1.4.4.

(a)
If you row reduce a \(5 \times 3\) matrix, what will always happen?
(b)
What does that mean about the rows of such a matrix (in context of this lesson)?
(c)
If you column reduce a \(3 \times 5\) matrix, what will always happen?
(d)
What does that mean about the columns of such a matrix (in context of this lesson)?
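The size constraint in this activity can be seen numerically: the rank of a matrix is at most the smaller of its dimensions, so row reducing a \(5 \times 3\) matrix always produces at least two zero rows. A sketch, assuming NumPy, with a randomly generated example:

```python
import numpy as np

# A 5x3 matrix has rank at most 3, so row reduction must produce
# at least 5 - 3 = 2 zero rows: its five rows cannot all be
# independent. (Illustrative random matrix.)
rng = np.random.default_rng(0)
M = rng.integers(-5, 5, size=(5, 3))
assert np.linalg.matrix_rank(M) <= 3

# Likewise a 3x5 matrix has rank at most 3, so its five columns
# cannot all be independent.
assert np.linalg.matrix_rank(M.T) <= 3
```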

Definition 1.4.1. Dependence.

If \(\vec{y}=a_1\vec{x}_1+a_2\vec{x}_2+ \ldots + a_k\vec{x}_k\) for some set of scalars \(a_i\) not all zero, then \(\vec{y}\) is dependent on the set of \(\vec{x}_i\text{.}\)
Under this condition the set \(\{\vec{y},\vec{x}_1,\vec{x}_2,\ldots,\vec{x}_k\}\) is called a dependent set of vectors.

Definition 1.4.2. Independent.

A set of vectors \(\{\vec{x}_1,\vec{x}_2,\ldots,\vec{x}_k\}\) is called independent if and only if the only solution to \(a_1\vec{x}_1+a_2\vec{x}_2+\ldots+a_k\vec{x}_k=\vec{0}\) is \(a_1=a_2=\ldots=a_k=0\text{.}\)
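In practice this definition is often checked with row reduction: the vectors are independent exactly when the matrix having them as columns has rank equal to the number of vectors. A minimal sketch, assuming NumPy; the helper name `independent` is ours, not standard:

```python
import numpy as np

def independent(vectors):
    """True iff the vectors are linearly independent, i.e. the only
    solution of a_1 x_1 + ... + a_k x_k = 0 is all a_i = 0, which
    happens exactly when the rank equals the number of vectors."""
    M = np.column_stack(vectors)
    return np.linalg.matrix_rank(M) == len(vectors)

# The columns of the matrix from Activity 1.4.1 are independent...
assert independent([[1, 0, 1], [0, 1, 7], [1, 7, 1]])
# ...while those from Activity 1.4.2 are not (column 3 = 2 * column 1).
assert not independent([[2, 1, 4], [1, 2, 5], [4, 2, 8]])
```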