2.5 Linear Independence
In a vector space, vectors can be added to each other and scaled by scalars, and the closure property ensures the result stays within the same vector space. Certain sets of vectors can represent every vector in the space through linear combinations; a spanning set that contains no redundant vectors forms a basis for the space. Before defining a basis, we first make precise the notions of linear combination and linear independence.
Definition 2.24 Given vectors \(x_1, \ldots, x_k \in V\) and scalars \(\lambda_1, \ldots, \lambda_k \in \mathbb{R}\), a vector of the form
\[
\mathbf{v} = \lambda_1 x_1 + \lambda_2 x_2 + \cdots + \lambda_k x_k
\]
is called a linear combination of the vectors \(x_1, \ldots, x_k\).
The zero vector can always be written as a linear combination by taking all \(\lambda_i = 0\). A non-trivial linear combination occurs when at least one \(\lambda_i \ne 0\).
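For a concrete numerical illustration, the following sketch forms a linear combination of two vectors in \(\mathbb{R}^3\); the vectors, the coefficients, and the use of numpy are illustrative choices, not part of the definition.

```python
import numpy as np

# Two vectors in R^3 and two scalar coefficients (illustrative values)
x1 = np.array([1.0, 0.0, 2.0])
x2 = np.array([0.0, 1.0, -1.0])
lam1, lam2 = 3.0, -2.0

# The linear combination v = lam1 * x1 + lam2 * x2
v = lam1 * x1 + lam2 * x2
print(v)  # [ 3. -2.  8.]
```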
Definition 2.25 A set of vectors \(x_1, \ldots, x_k \in V\) is linearly dependent if there exists a non-trivial combination
\[
\lambda_1 x_1 + \cdots + \lambda_k x_k = 0,
\]
with at least one \(\lambda_i \ne 0\). A set of vectors is linearly independent if the only solution is the trivial one
\[
\lambda_1 = \cdots = \lambda_k = 0.
\]
Intuitively, a set of linearly independent vectors contains no redundancy: removing any one of them shrinks the span of the set.
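The definition translates directly into a computational test: stack the vectors as the columns of a matrix \(\mathbf{A}\); the system \(\lambda_1 x_1 + \cdots + \lambda_k x_k = 0\) has only the trivial solution exactly when the null space of \(\mathbf{A}\) is \(\{0\}\). A minimal sketch with illustrative vectors, assuming sympy is available:

```python
from sympy import Matrix

# Columns of A are the vectors x_1, ..., x_k (illustrative values)
A = Matrix([[1, 0],
            [0, 1],
            [2, -1]])

# An empty nullspace basis means the only solution of A*lambda = 0 is lambda = 0,
# i.e. the columns are linearly independent.
print(A.nullspace() == [])  # True
```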
Example 2.45 In the two-dimensional plane, the directions northwest and southwest are linearly independent, since neither is a scalar multiple of the other. Adding the direction west produces a linearly dependent set, because west can be written as a (scaled) sum of northwest and southwest.
- A set of vectors is either linearly dependent or linearly independent; there is no third case.
- If any vector is zero or if two vectors are identical, the set is dependent.
- A set is dependent iff one vector is a linear combination of the others.
- Scalar multiples cause dependence: if \(x_i = \lambda x_j\) for some \(i \ne j\), the set is dependent.
Example 2.46 Consider the vectors in \(\mathbb{R}^3\): \[ \mathbf{v}_1 = \begin{pmatrix}1 \\ 2 \\ 3\end{pmatrix}, \quad \mathbf{v}_2 = \begin{pmatrix}2 \\ 4 \\ 6\end{pmatrix}, \quad \mathbf{v}_3 = \begin{pmatrix}0 \\ 1 \\ -1\end{pmatrix} \]
We check if there exist scalars \(c_1, c_2, c_3\), not all zero, such that \[ c_1 \mathbf{v}_1 + c_2 \mathbf{v}_2 + c_3 \mathbf{v}_3 = \mathbf{0}. \] Notice that \(\mathbf{v}_2 = 2 \mathbf{v}_1\), so we can take \[ c_1 = 2, \quad c_2 = -1, \quad c_3 = 0 \]
Then: \[ 2\mathbf{v}_1 - 1\mathbf{v}_2 + 0\mathbf{v}_3 = 2\mathbf{v}_1 - 2\mathbf{v}_1 + \mathbf{0} = \mathbf{0}. \] Hence, \(\{\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3\}\) is linearly dependent.
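The dependence in Example 2.46 can also be cross-checked numerically: the rank of the matrix whose columns are \(\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3\) is smaller than the number of columns. A short sketch, assuming numpy:

```python
import numpy as np

v1 = np.array([1, 2, 3])
v2 = np.array([2, 4, 6])
v3 = np.array([0, 1, -1])

# Stack the vectors as columns; a rank smaller than the number of columns
# means the columns are linearly dependent.
A = np.column_stack([v1, v2, v3])
print(np.linalg.matrix_rank(A), A.shape[1])   # 2 3  -> dependent
print(np.allclose(2*v1 - v2 + 0*v3, 0))       # True: the non-trivial combination above
```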
2.5.1 Gaussian Elimination Method
Theorem 2.2 A set of vectors \(\{ \mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_k \}\) in \(\mathbb{R}^n\) is linearly independent if and only if, when the vectors are placed as the columns of a matrix \[ \mathbf{A} = [\mathbf{v}_1 \ \mathbf{v}_2 \ \cdots \ \mathbf{v}_k] \] and \(\mathbf{A}\) is reduced to row echelon form by elementary row operations, every column of \(\mathbf{A}\) contains a pivot (the leading nonzero entry of some row).
Equivalently, the set is linearly independent if and only if the reduced matrix has a pivot in every column.
To test independence:
- Write the vectors as columns of a matrix \(\mathbf{A}\).
- Perform row reduction to row echelon form.
- Pivot columns correspond to vectors that are independent of the vectors to their left.
- Non-pivot columns can be expressed as linear combinations of the pivot columns to their left.
- The vectors are linearly independent iff every column is a pivot column; see the sketch after this list.
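A minimal sketch of this test, assuming sympy is available (the helper name `is_linearly_independent` is ours):

```python
from sympy import Matrix

def is_linearly_independent(vectors):
    """Return True if the given vectors (interpreted as columns) are linearly independent."""
    A = Matrix.hstack(*[Matrix(v) for v in vectors])
    _, pivot_cols = A.rref()               # rref() returns (reduced matrix, pivot column indices)
    return len(pivot_cols) == A.shape[1]   # independent iff every column is a pivot column

# The dependent set from Example 2.46 and an independent pair for contrast
print(is_linearly_independent([[1, 2, 3], [2, 4, 6], [0, 1, -1]]))  # False
print(is_linearly_independent([[1, 2, 3], [0, 1, -1]]))             # True
```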
Example 2.47 Given
\[
x_1 =
\begin{bmatrix}
1 \\ 2 \\ -3 \\ 4
\end{bmatrix},
\quad
x_2 =
\begin{bmatrix}
1 \\ 1 \\ 0 \\ 2
\end{bmatrix},
\quad
x_3 =
\begin{bmatrix}
-1 \\ -2 \\ 1 \\ 1
\end{bmatrix},
\]
we can put them into a matrix
\[\begin{bmatrix}
1 & 1 & -1\\ 2 & 1 & -2 \\ -3 & 0 & 1 \\ 4 & 2 & 1
\end{bmatrix}.
\]
Row reduction to row echelon form shows that every column contains a pivot:
\[\begin{bmatrix}
1 & 1 & -1\\ 2 & 1 & -2 \\ -3 & 0 & 1 \\ 4 & 2 & 1
\end{bmatrix}
\longrightarrow
\begin{bmatrix}
1 & 1 & -1\\ 0 & 1 & 0 \\ 0 & 0 & 1 \\ 0 & 0 & 0
\end{bmatrix}.
\]
Since all the columns are pivot columns, the set is linearly independent.
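As a cross-check of this row reduction, sympy's `rref` reports the pivot columns directly (a sketch under the assumption that sympy is available):

```python
from sympy import Matrix

# Columns are x_1, x_2, x_3 from Example 2.47
A = Matrix([[ 1, 1, -1],
            [ 2, 1, -2],
            [-3, 0,  1],
            [ 4, 2,  1]])

R, pivots = A.rref()
print(pivots)   # (0, 1, 2) -> every column is a pivot column, so the set is independent
```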
Let \(\mathbf{b}_1, \ldots, \mathbf{b}_k\) be linearly independent vectors, set \(\mathbf{B} = [\mathbf{b}_1 \ \cdots \ \mathbf{b}_k]\), and define \(\mathbf{x}_j = \mathbf{B} \mathbf{\lambda}_j\) for coefficient vectors \(\mathbf{\lambda}_1, \ldots, \mathbf{\lambda}_m \in \mathbb{R}^k\). Then the vectors \(\{\mathbf{x}_1, \ldots, \mathbf{x}_m\}\) are linearly independent if and only if the coefficient vectors \(\{\mathbf{\lambda}_1, \ldots, \mathbf{\lambda}_m\}\) are linearly independent.
Example 2.48 Let \[ \mathbf{b}_1 = \begin{pmatrix}1 \\ 0 \\ 0\end{pmatrix}, \quad \mathbf{b}_2 = \begin{pmatrix}0 \\ 1 \\ 0\end{pmatrix} \] be linearly independent vectors in \(\mathbb{R}^3\), and define \(\mathbf{B} = [\mathbf{b}_1 \ \mathbf{b}_2]\).
Let the coefficient vectors be: \[ \mathbf{\lambda}_1 = \begin{pmatrix}1 \\ 2\end{pmatrix}, \quad \mathbf{\lambda}_2 = \begin{pmatrix}3 \\ 4\end{pmatrix}. \]
Then define \[ \mathbf{x}_j = \mathbf{B} \mathbf{\lambda}_j. \]
Next, we compute \(\mathbf{x}_1\) and \(\mathbf{x}_2\) \[ \mathbf{x}_1 = \mathbf{B}\mathbf{\lambda}_1 = \begin{pmatrix}1 & 0 \\ 0 & 1 \\ 0 & 0\end{pmatrix} \begin{pmatrix}1 \\ 2\end{pmatrix} = \begin{pmatrix}1 \\ 2 \\ 0\end{pmatrix}, \quad \mathbf{x}_2 = \mathbf{B}\mathbf{\lambda}_2 = \begin{pmatrix}3 \\ 4 \\ 0\end{pmatrix}. \]
The claim is that \(\mathbf{x}_1\) and \(\mathbf{x}_2\) are linearly independent. To verify this, check whether \(c_1 \mathbf{x}_1 + c_2 \mathbf{x}_2 = \mathbf{0}\) has only the trivial solution. \[ c_1 \begin{pmatrix}1 \\ 2 \\ 0\end{pmatrix} + c_2 \begin{pmatrix}3 \\ 4 \\ 0\end{pmatrix} = \begin{pmatrix}c_1 + 3c_2 \\ 2c_1 + 4c_2 \\ 0\end{pmatrix} = \mathbf{0}. \] This gives the system \[ c_1 + 3c_2 = 0 \quad \text{and} \quad 2c_1 + 4c_2 = 0. \] The first equation gives \(c_1 = -3c_2\); substituting into the second yields \(-6c_2 + 4c_2 = -2c_2 = 0\), so \(c_2 = 0\) and therefore \(c_1 = 0\). Hence, \(\{\mathbf{x}_1, \mathbf{x}_2\}\) is linearly independent.
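A short numerical check of this example, assuming numpy; it builds \(\mathbf{x}_j = \mathbf{B}\mathbf{\lambda}_j\) and compares column ranks:

```python
import numpy as np

B = np.array([[1, 0],
              [0, 1],
              [0, 0]])
Lam = np.array([[1, 3],
                [2, 4]])     # columns are lambda_1 and lambda_2

X = B @ Lam                  # columns are x_1 and x_2
print(X)                     # [[1 3] [2 4] [0 0]]
print(np.linalg.matrix_rank(Lam), np.linalg.matrix_rank(X))  # 2 2 -> both sets independent
```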
More generally, if \(m\) vectors are formed as linear combinations of \(k\) vectors and \(m > k\), the \(m\) vectors are linearly dependent. In particular, any set of more than \(n\) vectors in \(\mathbb{R}^n\) is linearly dependent.
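For instance (a sketch with illustrative values, assuming numpy), three vectors built from two linearly independent vectors are necessarily dependent:

```python
import numpy as np

B = np.array([[1, 0],
              [0, 1],
              [0, 0]])                       # k = 2 linearly independent columns
Lam = np.array([[1, 0, 2],
                [0, 1, 3]])                  # m = 3 coefficient vectors in R^2

X = B @ Lam                                  # three vectors built from two
print(np.linalg.matrix_rank(X), X.shape[1])  # 2 3 -> x_1, x_2, x_3 are dependent
```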
Exercises
Exercise 2.57 Let \(b_1, b_2, b_3, b_4\) be linearly independent vectors. Determine whether the vectors \[ \begin{aligned} x_1 &= b_1 - 2b_2 + b_3 - b_4, \\ x_2 &= -4b_1 - 2b_2 + 4b_4, \\ x_3 &= 2b_1 + 3b_2 - b_3 - 3b_4, \\ x_4 &= 17b_1 - 10b_2 + 11b_3 + b_4 \end{aligned} \] are linearly independent.
Exercise 2.58 Prove that if any vector is zero or if two vectors are identical, the set is dependent.
Exercise 2.59 Prove that a set of vectors is dependent iff one vector is a linear combination of the others.
Exercise 2.60 Prove that scalar multiples cause dependence: if \(x_i = \lambda x_j\) for some \(i \ne j\) and \(\lambda \in \mathbb{R}\), the set is dependent.
Exercise 2.61 Are the vectors \(\left\{\begin{bmatrix}1\\2\\3\end{bmatrix}, \begin{bmatrix}3\\5\\7\end{bmatrix}, \begin{bmatrix}0\\1\\2\end{bmatrix}\right\}\) linearly independent?
Exercise 2.62 Are the vectors \(\left\{\begin{bmatrix}1\\2\\3\end{bmatrix}, \begin{bmatrix}3\\2\\9\end{bmatrix}, \begin{bmatrix}5\\2\\-1\end{bmatrix}\right\}\) linearly independent?
Exercise 2.63 Are \(p(x) = 1 + 3x + 2x^2\), \(q(x) = 3 + x + 2x^2\) and \(r(x) = 2x + x^2\) linearly independent?