
Section 6.2 Orthogonal Sets

Goals.

We will
  • consider special orthogonal sets,
  • test the independence/dependence of orthogonal sets of vectors,
  • define projection, and
  • note the benefit of orthogonal bases for projection.

Subsection 6.2.1 Orthogonal Sets

We have defined orthogonality with respect to a pair of vectors. In this section we expand that to a vector being orthogonal to a set of vectors.

Activity 6.2.1.

The goal of this activity is to illustrate a set of vectors to which a vector is orthogonal.
We use the dot product as the inner product. For \(\vec{v}_1=[x_1,y_1,z_1]^T\) and \(\vec{v}_2=[x_2,y_2,z_2]^T\) \(\langle \vec{v}_1,\vec{v}_2 \rangle = x_1 x_2+y_1 y_2+z_1 z_2 \text{.}\)
Use the vectors \(\vec{v}_1=[6,6,0]^T\text{,}\) \(\vec{v}_2=[3,-3,6]^T\text{,}\) and \(\vec{v}_3=[-4,4,4]^T\text{.}\)
(a)
Show that \(\vec{v}_1 \perp \vec{v}_2\) and \(\vec{v}_1 \perp \vec{v}_3\text{.}\)
(b)
Calculate the linear combination \(\vec{u}=5\vec{v}_2-7\vec{v}_3\text{.}\)
(c)
Test if \(\vec{v}_1 \perp \vec{u}\text{.}\)
(d)
Will a linear combination of \(\vec{v}_2\) and \(\vec{v}_3\) always be orthogonal to \(\vec{v}_1\text{?}\)
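The computations in this activity can be checked with a short script. This is a sketch using NumPy's dot product as the inner product; the variable names are ours.

```python
import numpy as np

v1 = np.array([6, 6, 0])
v2 = np.array([3, -3, 6])
v3 = np.array([-4, 4, 4])

# (a) v1 is orthogonal to v2 and to v3: both dot products are zero
print(v1 @ v2, v1 @ v3)  # 0 0

# (b) a linear combination of v2 and v3
u = 5 * v2 - 7 * v3

# (c) v1 is also orthogonal to u
print(v1 @ u)  # 0
```

Part (c) is no accident: by linearity of the inner product, \(\langle \vec{v}_1, 5\vec{v}_2-7\vec{v}_3 \rangle = 5\langle \vec{v}_1,\vec{v}_2 \rangle - 7\langle \vec{v}_1,\vec{v}_3 \rangle = 0\text{.}\)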

Definition 6.2.1.

The orthogonal complement of a set of vectors \(W\) is
\begin{equation*} W^{\perp} = \{ \vec{x} | \vec{x} \perp \vec{v} \mbox{ for all } \vec{v} \in \mbox{span}(W) \}. \end{equation*}
Next we consider some of the subspaces we know and discover a relationship between them.

Activity 6.2.2.

Using a specific matrix, we will check the relationship between vectors in the row space and vectors in the null space.
Use the matrix
\begin{equation*} A = \left[ \begin{array}{rrr} 1 & 3 & -5 \\ 2 & 7 & -12 \\ 7 & 23 & -39 \end{array} \right]. \end{equation*}
(a)
Find the row space of \(A\text{.}\)
(b)
Find the null space of \(A\text{.}\)
(c)
Select a vector from the row space of \(A\) and a vector from the null space of \(A\) and determine if those vectors are orthogonal.
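The check in part (c) can be sketched as follows. The null-space vector \(\vec{n}\) below comes from row reducing \(A\) by hand (our reduction gives \(x_1=-x_3\text{,}\) \(x_2=2x_3\)); the code verifies that choice and the orthogonality.

```python
import numpy as np

A = np.array([[1, 3, -5],
              [2, 7, -12],
              [7, 23, -39]])

# A basis vector for the null space, from row reduction by hand
n = np.array([-1, 2, 1])
print(A @ n)  # [0 0 0], so n is in null(A)

# Each row of A (hence every vector in the row space) is orthogonal to n
print(A[0] @ n, A[1] @ n, A[2] @ n)  # 0 0 0
```

Since the rows of \(A\) span the row space, checking the rows against \(\vec{n}\) checks every row-space vector at once.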

Subsection 6.2.2 Orthogonality and Independence

Next we check whether a set of orthogonal vectors is necessarily independent, necessarily dependent, or possibly either.

Activity 6.2.3.

Check if this set of orthogonal vectors is independent.
Use the vectors \(\vec{v}_1=[6,6,0]^T\text{,}\) \(\vec{v}_2=[3,-3,6]^T\text{,}\) and \(\vec{v}_3=[-4,4,4]^T\text{.}\)
(a)
Test if these three vectors are linearly dependent or independent.
(b)
Write \(c_1\vec{v}_1+c_2\vec{v}_2+c_3\vec{v}_3=\vec{0}\text{.}\)
(c)
Using properties of the inner product expand \(\langle \vec{v}_1, c_1\vec{v}_1+c_2\vec{v}_2+c_3\vec{v}_3 \rangle\text{.}\)
(d)
Reduce the expression above using the fact that these vectors are orthogonal. The result should be an expression involving only \(c_1\text{.}\)
(e)
Calculate \(\langle \vec{v}_1, \vec{0} \rangle \text{.}\)
(f)
Note that we calculated the inner product of both sides of the independence equation (the linear combination equals the zero vector). Based on the last two results, what is \(c_1\text{?}\)
(g)
Is a set of orthogonal vectors independent?
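The argument in this activity shows that taking the inner product of \(c_1\vec{v}_1+c_2\vec{v}_2+c_3\vec{v}_3=\vec{0}\) with \(\vec{v}_1\) kills the \(\vec{v}_2\) and \(\vec{v}_3\) terms, forcing \(c_1=0\) (and likewise \(c_2=c_3=0\)). A quick numerical confirmation, sketched with NumPy:

```python
import numpy as np

v1 = np.array([6, 6, 0])
v2 = np.array([3, -3, 6])
v3 = np.array([-4, 4, 4])

# Independence: the only solution of M c = 0 is c = 0,
# which for a square M is equivalent to full rank.
M = np.column_stack([v1, v2, v3])
print(np.linalg.matrix_rank(M))  # 3, so the set is independent
```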

Subsection 6.2.3 Orthogonal Bases

Activity 6.2.4.

Here we consider orthogonal sets (bases) and calculating coordinates with respect to such a basis. We will calculate the coordinates directly, then perform some calculations using inner products and compare the results.
Use the vectors \(\vec{u}=[16,20,-16]^T\text{,}\) \(\vec{v}_1=[6,6,0]^T\text{,}\) \(\vec{v}_2=[3,-3,6]^T\text{,}\) and \(\vec{v}_3=[-4,4,4]^T\text{.}\) Note the \(\vec{v}_i\) are pairwise orthogonal (every pair is orthogonal).
(a)
Find \([\vec{u}]_V\text{.}\)
(b)
Consider \(c_1\vec{v}_1+c_2\vec{v}_2+c_3\vec{v}_3=\vec{u}\) where \([\vec{u}]_V=[c_1,c_2,c_3]\text{.}\) Based on the calculations in Activity 6.2.3 what is the following?
\begin{equation*} \langle c_1\vec{v}_1+c_2\vec{v}_2+c_3\vec{v}_3, \vec{v}_1 \rangle \end{equation*}
(c)
What would you get for \(\langle \vec{u}, \vec{v}_i \rangle\) for the other \(\vec{v}_i\text{?}\)
(d)
What does this imply about coordinates and orthogonal bases?
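The implication is that for an orthogonal basis, each coordinate comes from a single inner product, \(c_i = \langle \vec{u},\vec{v}_i \rangle / \langle \vec{v}_i,\vec{v}_i \rangle\text{,}\) with no linear system to solve. A sketch of this shortcut for the vectors above:

```python
import numpy as np

u  = np.array([16, 20, -16])
v1 = np.array([6, 6, 0])
v2 = np.array([3, -3, 6])
v3 = np.array([-4, 4, 4])

# Each coordinate is a single quotient of inner products
coords = np.array([u @ v / (v @ v) for v in (v1, v2, v3)])
print(coords)  # the coordinates are 3, -2, -1

# Check: the combination reproduces u
print(coords[0]*v1 + coords[1]*v2 + coords[2]*v3)
```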

Subsection 6.2.4 Orthonormal

We saw that calculating coordinates with respect to an orthogonal basis is easier than for a general basis. Here we add one more desirable feature.

Activity 6.2.5.

Recall that the dot product is an inner product on \(\R^n\text{,}\) and that it can be computed as a matrix multiplication. We therefore use matrix multiplication to illustrate orthogonality and beyond.
For this activity use \(\{ \vec{u}_1=[1,1,1], \vec{u}_2=[-2,1,1], \vec{u}_3=[0,-3,3] \} \) and \(\{ \vec{v}_1=\frac{1}{\sqrt{3}}[1,1,1], \vec{v}_2=\frac{1}{\sqrt{6}}[-2,1,1], \vec{v}_3=\frac{1}{\sqrt{18}}[0,-3,3] \} \text{.}\)
(a)
Show that the set of \(\vec{u}_i\) is orthogonal.
(b)
Let \(U=[\vec{u}_1^T,\vec{u}_2^T,\vec{u}_3^T]\text{.}\) Calculate \(U^TU\text{.}\)
(c)
How does this product connect with the orthogonality of the vectors?
(d)
Let \(V=[\vec{v}_1^T,\vec{v}_2^T,\vec{v}_3^T]\text{.}\) Calculate \(V^TV\text{.}\)
(e)
Are the \(\vec{v}_i\) orthogonal? What else do you notice about them?
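The matrix products in this activity can be sketched as follows (an illustration with NumPy; the \(\vec{v}_i\) are obtained here by normalizing the columns of \(U\)).

```python
import numpy as np

u1 = np.array([1, 1, 1])
u2 = np.array([-2, 1, 1])
u3 = np.array([0, -3, 3])

# Place the vectors as columns of U
U = np.column_stack([u1, u2, u3])

# U^T U is diagonal: off-diagonal entries are the pairwise inner
# products (zero by orthogonality); the diagonal holds squared norms.
print(U.T @ U)

# Dividing each column by its norm gives the v_i
V = U / np.linalg.norm(U, axis=0)

# V^T V is the identity matrix (up to rounding): the v_i are
# orthogonal AND each has norm one.
print(V.T @ V)
```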

Definition 6.2.2. Orthonormal Vectors.

A set of vectors is orthonormal if and only if the vectors are pairwise orthogonal and have norm one.