4.1 Determinant and Trace

The determinant of a square matrix \(\mathbf{A} \in \mathbb{R}^{n \times n}\), denoted as \(\det(\mathbf{A})\) or \(|\mathbf{A}|\), is a scalar that characterizes several key properties of \(\mathbf{A}\).

For small matrices:

\[ \det \begin{pmatrix} a_{11} \end{pmatrix} = a_{11}, \quad \det \begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix} = a_{11}a_{22} - a_{12}a_{21}. \]

Example 4.1 Let
\[ \mathbf{A} = \begin{bmatrix} 3 & -1 \\ 5 & 2 \end{bmatrix}. \]

The determinant of \(\mathbf{A}\) is \[ \det(\mathbf{A}) = (3)(2) - (-1)(5) = 6 + 5 = 11. \]
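As a quick numerical check (a minimal sketch using NumPy, not part of the text's development; the variable names are ours), the hand formula agrees with `np.linalg.det`:

```python
import numpy as np

# Matrix from Example 4.1
A = np.array([[3.0, -1.0],
              [5.0,  2.0]])

# 2x2 formula: a11*a22 - a12*a21
det_by_formula = A[0, 0] * A[1, 1] - A[0, 1] * A[1, 0]

print(det_by_formula)      # 11.0
print(np.linalg.det(A))    # ~11.0 (up to floating-point error)
```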

For larger matrices, we can compute determinants recursively using the Laplace expansion. Expanding along a fixed column \(j\),

\[ \det(\mathbf{A}) = \sum_{k=1}^n (-1)^{k+j} a_{kj} \det(\mathbf{A}_{k,j}), \] where \(\mathbf{A}_{k,j}\) is the \((n-1) \times (n-1)\) submatrix obtained by removing row \(k\) and column \(j\). An analogous formula holds for expansion along a fixed row.

Example 4.2 Let
\[ \mathbf{A} = \begin{bmatrix} 2 & -1 & 3 \\ 1 & 4 & 0 \\ -2 & 5 & 1 \end{bmatrix}. \]

We compute \(\det( \mathbf{A})\) using Laplace expansion along the first row. \[ \det( \mathbf{A}) = 2 \begin{vmatrix} 4 & 0 \\ 5 & 1 \end{vmatrix} - (-1) \begin{vmatrix} 1 & 0 \\ -2 & 1 \end{vmatrix} + 3 \begin{vmatrix} 1 & 4 \\ -2 & 5 \end{vmatrix}. \] Next, we compute each minor \[\begin{align*} \begin{vmatrix} 4 & 0 \\ 5 & 1 \end{vmatrix} &= (4)(1) - (0)(5) = 4\\ \begin{vmatrix} 1 & 0 \\ -2 & 1 \end{vmatrix} &= (1)(1) - (0)(-2) = 1\\ \begin{vmatrix} 1 & 4 \\ -2 & 5 \end{vmatrix} &= (1)(5) - (4)(-2) = 5 + 8 = 13 \end{align*}\]

Using these determinants in the Laplace formula gives us \[ \det( \mathbf{A}) = 2(4) - (-1)(1) + 3(13) = 8 + 1 + 39 = 48. \]
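The recursion above translates directly into code. The following is a minimal sketch (plain Python plus NumPy, expanding along the first row; the function name `laplace_det` is our own) and is intended for illustration only, since the recursion has factorial cost:

```python
import numpy as np

def laplace_det(A):
    """Determinant by Laplace expansion along the first row (O(n!), for illustration only)."""
    n = A.shape[0]
    if n == 1:
        return A[0, 0]
    total = 0.0
    for j in range(n):
        # Submatrix with row 0 and column j removed
        minor = np.delete(np.delete(A, 0, axis=0), j, axis=1)
        total += (-1) ** j * A[0, j] * laplace_det(minor)
    return total

A = np.array([[ 2.0, -1.0, 3.0],
              [ 1.0,  4.0, 0.0],
              [-2.0,  5.0, 1.0]])

print(laplace_det(A))      # 48.0, matching Example 4.2
print(np.linalg.det(A))    # ~48.0
```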

A matrix \(\mathbf{A}\) is invertible if and only if \(\det(\mathbf{A}) \neq 0\).

Example 4.3 Verify that a matrix is invertible if and only if its determinant is non-zero. For the \(2 \times 2\) case, if \(\det(\mathbf{A}) = a_{11}a_{22} - a_{12}a_{21} \neq 0\), direct multiplication shows that \[ \mathbf{A}^{-1} = \frac{1}{a_{11}a_{22} - a_{12}a_{21}} \begin{bmatrix} a_{22} & -a_{12} \\ -a_{21} & a_{11} \end{bmatrix}, \] while if the determinant is zero, the rows of \(\mathbf{A}\) are linearly dependent and no inverse exists.

For triangular matrices, the determinant equals the product of the diagonal elements.

Example 4.4 Let
\[ \mathbf{A} = \begin{bmatrix} 3 & 2 & -1 \\ 0 & 5 & 4 \\ 0 & 0 & 7 \end{bmatrix}. \]

Since \(\mathbf{A}\) is upper triangular, we already know that
\[ \det( \mathbf{A}) = 3 \cdot 5 \cdot 7 = 105. \] Let us verify this using Laplace expansion.

Expand along column 1: \[\begin{align*} \det(\mathbf{A}) &= 3 \begin{vmatrix} 5 & 4 \\ 0 & 7 \end{vmatrix} - 0 \begin{vmatrix} 2 & -1 \\ 0 & 7 \end{vmatrix} + 0 \begin{vmatrix} 2 & -1 \\ 5 & 4 \end{vmatrix}\\ &= 3 \begin{vmatrix} 5 & 4 \\ 0 & 7 \end{vmatrix}\\ &= 3\left[(5)(7) - (4)(0)\right]\\ &= 3(35)\\ &= 105. \end{align*}\] This matches the product of the diagonal entries, as expected for triangular matrices.
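A quick numerical illustration of this fact (a sketch using NumPy, not part of the original text): for the triangular matrix of Example 4.4, the product of the diagonal entries agrees with `np.linalg.det`.

```python
import numpy as np

A = np.array([[3.0, 2.0, -1.0],
              [0.0, 5.0,  4.0],
              [0.0, 0.0,  7.0]])

print(np.prod(np.diag(A)))   # 105.0, product of the diagonal entries
print(np.linalg.det(A))      # ~105.0
```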

The determinant changes sign when two rows (or columns) are swapped, and multiplying a single row (or column) by a scalar \(\lambda\) multiplies the determinant by \(\lambda\).

Example 4.5 Let \[ \mathbf{A}=\begin{bmatrix}1 & 2\\[4pt] 3 & 4\end{bmatrix}. \] Compute \(\det( \mathbf{A})\): \[ \det( \mathbf{A})=1\cdot 4 - 2\cdot 3 = 4 - 6 = -2. \]

Swap row 1 and row 2 to get \[ \mathbf{B}=\begin{bmatrix}3 & 4\\[4pt] 1 & 2\end{bmatrix}. \] Compute \(\det( \mathbf{B})\): \[ \det( \mathbf{B})=3\cdot 2 - 4\cdot 1 = 6 - 4 = 2. \]

Observation: \(\det( \mathbf{B})=2 = -(-2)= -\det( \mathbf{A})\).
Swapping two rows changed the sign of the determinant.

Example 4.6 Let \[ \mathbf{C}=\begin{bmatrix}2 & 1\\[4pt] 0 & 3\end{bmatrix}. \] Compute \(\det( \mathbf{C})\): \[ \det( \mathbf{C})=2\cdot 3 - 1\cdot 0 = 6 - 0 = 6. \]

Multiply the first row of \(\mathbf{C}\) by \(2\) to get \[ \mathbf{D}=\begin{bmatrix}4 & 2\\[4pt] 0 & 3\end{bmatrix}. \] Compute \(\det( \mathbf{D})\): \[ \det( \mathbf{D})=4\cdot 3 - 2\cdot 0 = 12 - 0 = 12. \]

Observation: \(\det( \mathbf{D})=12 = 2\cdot 6 = 2\det( \mathbf{C})\).
Scaling a single row by a factor \(2\) scales the determinant by \(2\).
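Both observations are easy to reproduce numerically. The sketch below (NumPy; the variable names are ours) swaps the rows of \(\mathbf{A}\) from Example 4.5 and scales a row of \(\mathbf{C}\) from Example 4.6:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = A[[1, 0], :]                 # swap rows 1 and 2
print(np.linalg.det(A))          # -2.0
print(np.linalg.det(B))          #  2.0 (sign flipped)

C = np.array([[2.0, 1.0],
              [0.0, 3.0]])
D = C.copy()
D[0, :] *= 2                     # multiply the first row by 2
print(np.linalg.det(C))          # 6.0
print(np.linalg.det(D))          # 12.0 (scaled by 2)
```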

4.1.1 Geometric Interpretation

The determinant measures the signed volume of the parallelepiped spanned by the columns of \(\mathbf{A}\):

  • In \(\mathbb{R}^2\): \(|\det(\mathbf{A})|\) gives the area of a parallelogram.
  • In \(\mathbb{R}^3\): \(|\det(\mathbf{A})|\) gives the volume of a parallelepiped.

If the determinant is zero, the columns are linearly dependent and the volume collapses to zero.
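For instance (a small NumPy sketch, not from the text), the parallelogram spanned by the columns of \(\mathbf{A}\) from Example 4.5 has area \(|\det(\mathbf{A})| = 2\), and making the columns linearly dependent collapses it:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
print(abs(np.linalg.det(A)))   # 2.0 -> area of the parallelogram spanned by the columns

# Linearly dependent columns: the second column is twice the first
S = np.array([[1.0, 2.0],
              [3.0, 6.0]])
print(np.linalg.det(S))        # ~0.0 -> the parallelogram collapses
```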

4.1.2 Properties of the Determinant

Theorem 4.1 For square matrices \(\mathbf{A}, \mathbf{B} \in \mathbb{R}^{n \times n}\) and \(\lambda \in \mathbb{R}\), the following properties hold (the third whenever \(\mathbf{A}\) is invertible): \[ \begin{aligned} \det(\mathbf{A}\mathbf{B}) &= \det(\mathbf{A})\det(\mathbf{B}), \\ \det(\mathbf{A}^\top) &= \det(\mathbf{A}), \\ \det(\mathbf{A}^{-1}) &= \frac{1}{\det(\mathbf{A})}, \\ \det(\lambda \mathbf{A}) &= \lambda^n \det(\mathbf{A}). \end{aligned} \]

Example 4.7 For \(2 \times 2\) matrices, \[ \mathbf{A} = \begin{bmatrix} a & b \\ c & d \end{bmatrix}, \quad \mathbf{B} = \begin{bmatrix} e & f \\ g & h \end{bmatrix}, \] we have \[ \mathbf{A}\mathbf{B} = \begin{bmatrix} ae + bg & af + bh \\ ce + dg & cf + dh \end{bmatrix}. \] This has determinant \[\begin{align*} \det( \mathbf{A}\mathbf{B}) &= \begin{vmatrix} ae + bg & af + bh \\ ce + dg & cf + dh \end{vmatrix} \\ & = (ae + bg)(cf + dh) - (af + bh)(ce + dg)\\ & = aecf + aedh + bgcf + bgdh - \left( afce + afdg + bhce + bhdg \right)\\ & = ac(ef - fe) + ad(eh - fg) + bc(gf - he) + bd(gh - hg)\\ & = ad(eh - fg) - bc(eh - fg)\\ & = (ad - bc)(eh - fg) \\ &= \det( \mathbf{A})\det(\mathbf{B}). \end{align*}\]
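These identities can also be spot-checked numerically on random matrices (a minimal NumPy sketch; the seed, dimension, and tolerance are our choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
lam = 2.5

print(np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B)))   # True
print(np.isclose(np.linalg.det(A.T), np.linalg.det(A)))                        # True
print(np.isclose(np.linalg.det(np.linalg.inv(A)), 1 / np.linalg.det(A)))       # True
print(np.isclose(np.linalg.det(lam * A), lam**n * np.linalg.det(A)))           # True
```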

Theorem 4.2 A square matrix \(\mathbf{A} \in \mathbb{R}^{n \times n}\) is invertible if and only if it has full rank, i.e. \(\text{rank}(\mathbf{A}) = n\).


4.1.3 Trace

Definition 4.1 The trace of a square matrix \(\mathbf{A} \in \mathbb{R}^{n \times n}\) is the sum of its diagonal elements: \[ \text{tr}(\mathbf{A}) = \sum_{i=1}^n a_{ii}. \]

Theorem 4.3 For square matrices \(\mathbf{A}\) and \(\mathbf{B}\) and \(\alpha \in \mathbb{R}\), the following properties hold: \[ \begin{aligned} \text{tr}(\mathbf{A} + \mathbf{B}) &= \text{tr}(\mathbf{A}) + \text{tr}(\mathbf{B}), \\ \text{tr}(\alpha \mathbf{A}) &= \alpha \, \text{tr}(\mathbf{A}), \\ \text{tr}(\mathbf{A}\mathbf{B}) &= \text{tr}(\mathbf{B}\mathbf{A}), \\ \text{tr}(\mathbf{I}_n) &= n. \end{aligned} \]

The trace is invariant under cyclic permutations, meaning \(\text{tr}(\mathbf{A}\mathbf{K}\mathbf{L}) = \text{tr}(\mathbf{K}\mathbf{L}\mathbf{A})\). It is also independent of basis, so the trace of a linear map \(\Phi\) is the same in all matrix representations.

Example 4.8 Let
\[ \mathbf{A} = \begin{bmatrix} 3 & -1 & 4 \\ 0 & 2 & 5 \\ 7 & 1 & -6 \end{bmatrix}. \] The trace is \[ \text{tr}(\mathbf{A}) = 3 + 2 + (-6) = -1. \]
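As a check (a NumPy sketch; the factors `K` and `L` are arbitrary matrices of our own choosing), `np.trace` reproduces this value, and the cyclic-permutation property stated above holds on random factors:

```python
import numpy as np

A = np.array([[3.0, -1.0,  4.0],
              [0.0,  2.0,  5.0],
              [7.0,  1.0, -6.0]])
print(np.trace(A))   # -1.0, as in Example 4.8

rng = np.random.default_rng(1)
K = rng.standard_normal((3, 3))
L = rng.standard_normal((3, 3))
# tr(AKL) = tr(KLA) = tr(LAK): invariance under cyclic permutations
print(np.isclose(np.trace(A @ K @ L), np.trace(K @ L @ A)))   # True
print(np.isclose(np.trace(A @ K @ L), np.trace(L @ A @ K)))   # True
```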


4.1.4 Characteristic Polynomial

Definition 4.2 The characteristic polynomial of a square matrix \(\mathbf{A} \in \mathbb{R}^{n \times n}\) is defined as: \[ p_{\mathbf{A}}(\lambda) = \det(\mathbf{A} - \lambda \mathbf{I}) = c_0 + c_1 \lambda + \cdots + c_{n-1} \lambda^{n-1} + (-1)^n \lambda^n. \]

The characteristic polynomial for \(\mathbf{A}\) encodes key properties of \(\mathbf{A}\): \[ c_0 = \det(\mathbf{A}), \quad c_{n-1} = (-1)^{n-1} \text{tr}(\mathbf{A}). \] The roots of this polynomial are the eigenvalues of \(\mathbf{A}\), which will be explored in the next section.


Example 4.9 Let
\[ \mathbf{A} = \begin{bmatrix} 3 & -1 & 4 \\ 0 & 2 & 5 \\ 7 & 1 & -6 \end{bmatrix}. \] Expanding \(\det(\mathbf{A} - \lambda \mathbf{I})\) along the first column, the characteristic polynomial is \[\begin{align*} p_{\mathbf{A}}(\lambda) &= \det(\mathbf{A} - \lambda \mathbf{I}) \\ &= \det\left( \begin{bmatrix} 3 - \lambda & -1 & 4 \\ 0 & 2 - \lambda & 5 \\ 7 & 1 & -6 - \lambda \end{bmatrix} \right) \\ &= (3 - \lambda) \begin{vmatrix} 2 - \lambda & 5 \\ 1 & -6 - \lambda \end{vmatrix} + 7 \begin{vmatrix} -1 & 4 \\ 2 - \lambda & 5 \end{vmatrix}\\ &= (3 - \lambda)(\lambda^2 + 4\lambda - 17) + 7(4\lambda - 13) \\ &= -\lambda^3 - \lambda^2 + 57\lambda - 142. \end{align*}\]
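A numerical cross-check (a sketch using NumPy; note that `np.poly` returns the coefficients of the monic polynomial \(\det(\lambda \mathbf{I} - \mathbf{A})\), which differs from \(p_{\mathbf{A}}(\lambda) = \det(\mathbf{A} - \lambda \mathbf{I})\) by the factor \((-1)^n\)):

```python
import numpy as np

A = np.array([[3.0, -1.0,  4.0],
              [0.0,  2.0,  5.0],
              [7.0,  1.0, -6.0]])
n = A.shape[0]

# Coefficients of det(lambda*I - A), highest degree first: approximately [1, 1, -57, 142]
monic = np.poly(A)

# p_A(lambda) = det(A - lambda*I) = (-1)^n det(lambda*I - A)
p = (-1) ** n * monic
print(np.round(p, 6))   # [-1, -1, 57, -142] -> -lambda^3 - lambda^2 + 57*lambda - 142

# Relations from the text: c_0 = det(A), c_{n-1} = (-1)^(n-1) tr(A)
print(np.isclose(p[-1], np.linalg.det(A)))               # True: c_0 = det(A) = -142
print(np.isclose(p[1], (-1) ** (n - 1) * np.trace(A)))   # True: c_{n-1} = -1
```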

Exercises

Exercise 4.1 Find \(\det\left(\begin{bmatrix} 2 & 3 \\ 4 & 5 \end{bmatrix} \right)\) using the formula.

Exercise 4.2 Find \(\det\left(\begin{bmatrix} 2 & 3 & 4\\ 5 & 6 & 7\\ 8 & 9 & 1 \end{bmatrix} \right)\) using the formula.

Exercise 4.3 Find \(\det\left(\begin{bmatrix} 2 & 3 & 4\\ 5 & 6 & 7\\ 8 & 9 & 1 \end{bmatrix} \right)\) using the Laplace method.

Exercise 4.4 Prove that if \(\mathbf{A}\) is a square matrix with a row or column of zeros, then \(\det(\mathbf{A}) = 0\).

Exercise 4.5 Find \(\det\left(\begin{bmatrix} 0 & 2 & 3 & 4\\ 5 & 6 & 7 & 0 \\ 1 & 8 & 9 & 1\\ 0 & 2 & 3 & 0 \end{bmatrix} \right)\) using the Laplace method.

Exercise 4.6 Prove that if \(\mathbf{A}\) is a square matrix with two identical rows or columns, then \(\det(\mathbf{A}) = 0\).

Exercise 4.7 Find \(\det\left(\begin{bmatrix} 2 & 0 & 0\\ 5 & 6 & 0\\ 8 & 9 & 1 \end{bmatrix} \right)\).

Exercise 4.8 Find \(\text{tr}\left(\begin{bmatrix} 2 & 0 & 0\\ 5 & 6 & 0\\ 8 & 9 & 1 \end{bmatrix} \right)\).

Exercise 4.9 Find the characteristic polynomial of \(\begin{bmatrix} 2 & 0 & 0\\ 5 & 6 & 0\\ 8 & 9 & 1 \end{bmatrix}\).

Exercise 4.10 Prove the following properties of the trace:

  • \(\text{tr}(A+B) = \text{tr}(A) + \text{tr}(B)\)
  • \(\text{tr}(\alpha A) = \alpha \text{tr}(A)\)
  • \(\text{tr}(I_n) = n\)
