4.5 Singular Value Decomposition (SVD)

The Singular Value Decomposition (SVD) is a fundamental matrix decomposition method in linear algebra, applicable to all matrices (square or rectangular). It expresses a matrix \(\mathbf{A} \in \mathbb{R}^{m \times n}\) as:

\[ \mathbf{A} = \mathbf{U} \mathbf{\Sigma} \mathbf{V}^\top \]

where:
- \(\mathbf{U} \in \mathbb{R}^{m \times m}\) is an orthogonal matrix of left-singular vectors \(u_i\),
- \(\mathbf{V} \in \mathbb{R}^{n \times n}\) is an orthogonal matrix of right-singular vectors \(v_j\),
- \(\mathbf{\Sigma} \in \mathbb{R}^{m \times n}\) is rectangular diagonal with non-negative singular values \(\sigma_i\) on its diagonal (ordered \(\sigma_1 \ge \sigma_2 \ge \dots \ge 0\)).
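These properties can be checked numerically with NumPy's `np.linalg.svd`, which returns \(\mathbf{U}\), the singular values, and \(\mathbf{V}^\top\). A minimal sketch, using an arbitrary \(3 \times 2\) matrix of our own choosing:

```python
import numpy as np

# Illustrative matrix (our own choice); any real m x n matrix works.
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

# full_matrices=True gives U (m x m) and Vt (n x n);
# s holds the singular values in descending order.
U, s, Vt = np.linalg.svd(A, full_matrices=True)

# Embed s into the rectangular diagonal matrix Sigma (m x n).
Sigma = np.zeros(A.shape)
np.fill_diagonal(Sigma, s)

assert np.allclose(U @ Sigma @ Vt, A)               # A = U Sigma V^T
assert np.allclose(U.T @ U, np.eye(3))              # U orthogonal
assert np.allclose(Vt @ Vt.T, np.eye(2))            # V orthogonal
assert np.all(s[:-1] >= s[1:]) and np.all(s >= 0)   # ordered, non-negative
```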

4.5.1 Geometric Intuition

The SVD can be interpreted as three sequential linear transformations:

  1. Basis change in the domain via \(\mathbf{V}^\top\)
  2. Scaling by the singular values via \(\mathbf{\Sigma}\), possibly with a change of dimension
  3. Basis change in the codomain via \(\mathbf{U}\)

Unlike the eigendecomposition, the SVD allows the domain and codomain to have different dimensions, and \(\mathbf{U}\) and \(\mathbf{V}\) are orthogonal matrices that are generally not inverses of each other.

Example 4.19 Consider the matrix \[ \mathbf{A} = \frac{1}{\sqrt{2}} \begin{bmatrix} 3 & 3 \\ 2 & -2 \end{bmatrix}. \]

It has a Singular Value Decomposition given by \[ \mathbf{A} = \underbrace{\mathbf{U}}_{\text{basis change}} \;\; \underbrace{\boldsymbol{\Sigma}}_{\text{scaling}} \;\; \underbrace{\mathbf{V}^\top}_{\text{basis change}} = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} \begin{bmatrix} 3 & 0 \\ 0 & 2 \end{bmatrix} \frac{1}{\sqrt{2}} \begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix}. \]

Geometric Interpretation:

  1. \(\mathbf{V}^\top\) rotates the input vector into a new orthonormal basis.

  2. \(\boldsymbol{\Sigma}\) scales the coordinates by factors of 3 and 2 along perpendicular axes.

  3. \(\mathbf{U}\) rotates the result into the output space.
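This three-step pipeline can be traced numerically: applying \(\mathbf{V}^\top\), then \(\boldsymbol{\Sigma}\), then \(\mathbf{U}\) to a vector reproduces \(\mathbf{A}\mathbf{x}\). A minimal sketch, using an arbitrary matrix of our own choosing rather than the example above:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])      # illustrative matrix (our own choice)
x = np.array([1.0, -2.0])

U, s, Vt = np.linalg.svd(A)
Sigma = np.diag(s)

# Step 1: rotate/reflect x into the right-singular basis.
x1 = Vt @ x
# Step 2: scale each coordinate by its singular value.
x2 = Sigma @ x1
# Step 3: rotate/reflect the result into the codomain basis.
x3 = U @ x2

# The composition equals applying A directly.
assert np.allclose(x3, A @ x)
```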

4.5.2 Construction of the SVD

The SVD of a matrix can be computed in the following steps:

  1. Compute \(\mathbf{A}^\top \mathbf{A}\) (symmetric, positive semidefinite)
  2. Diagonalize \(\mathbf{A}^\top \mathbf{A} = \mathbf{P} D \mathbf{P}^\top\) to obtain right-singular vectors \(\mathbf{V} = \mathbf{P}\)
  3. Singular values \(\sigma_i = \sqrt{\lambda_i}\), where \(\lambda_i\) are eigenvalues of \(\mathbf{A}^\top \mathbf{A}\)
  4. Compute left-singular vectors \(u_i = \frac{1}{\sigma_i} \mathbf{A} v_i\)
  5. Assemble \(\mathbf{U}, \mathbf{\Sigma}, \mathbf{V}\) to form \(\mathbf{A} = \mathbf{U} \mathbf{\Sigma} \mathbf{V}^\top\)
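The five steps above can be sketched in NumPy for a full-column-rank matrix. This is an illustrative construction via \(\mathbf{A}^\top \mathbf{A}\), not how library SVD routines actually work (they avoid forming the Gram matrix for numerical accuracy, so prefer `np.linalg.svd` in practice):

```python
import numpy as np

def svd_via_gram(A):
    """Construct an SVD of a full-column-rank matrix A by
    diagonalizing A^T A (steps 1-5 above). Illustrative only."""
    # Steps 1-2: A^T A is symmetric PSD, so eigh applies.
    lam, P = np.linalg.eigh(A.T @ A)
    order = np.argsort(lam)[::-1]        # sort eigenvalues descending
    lam, V = lam[order], P[:, order]
    # Step 3: singular values are square roots of the eigenvalues.
    s = np.sqrt(lam)
    # Step 4: left-singular vectors u_i = (1/sigma_i) A v_i.
    U = A @ V / s
    return U, s, V

# Illustrative matrix (our own choice).
A = np.array([[1.0, 2.0],
              [0.0, 3.0]])
U, s, V = svd_via_gram(A)

# Step 5: the assembled factors reconstruct A.
assert np.allclose(U @ np.diag(s) @ V.T, A)
assert np.allclose(U.T @ U, np.eye(2))
```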

Example 4.20 Consider the matrix \[ \mathbf{A} = \frac{1}{\sqrt{2}} \begin{bmatrix} 3 & 3 \\ 2 & -2 \end{bmatrix}. \] We will compute its Singular Value Decomposition \[ \mathbf{A} = \mathbf{U}\boldsymbol{\Sigma}\mathbf{V}^\top. \]

Step 1: Compute \(\mathbf{A}^\top \mathbf{A}\)

\[ \mathbf{A}^\top \mathbf{A} = \frac{1}{2} \begin{bmatrix} 3 & 2 \\ 3 & -2 \end{bmatrix} \begin{bmatrix} 3 & 3 \\ 2 & -2 \end{bmatrix} = \frac{1}{2} \begin{bmatrix} 13 & 5 \\ 5 & 13 \end{bmatrix}. \]

Step 2: Singular Values

The eigenvalues of \(\mathbf{A}^\top \mathbf{A}\) satisfy \[ \det(\mathbf{A}^\top \mathbf{A} - \lambda \mathbf{I}) = \begin{vmatrix} \tfrac{13}{2} - \lambda & \tfrac{5}{2} \\ \tfrac{5}{2} & \tfrac{13}{2} - \lambda \end{vmatrix} = \lambda^2 - 13\lambda + 36 = 0. \] So, \[ \lambda_1 = 9, \quad \lambda_2 = 4. \] The singular values are \[ \sigma_1 = 3, \quad \sigma_2 = 2. \] Thus, \[ \boldsymbol{\Sigma} = \begin{bmatrix} 3 & 0 \\ 0 & 2 \end{bmatrix}. \]

Step 3: Right Singular Vectors (\(\mathbf{V}\))

Eigenvectors of \(\mathbf{A}^\top \mathbf{A}\):

  • For \(\lambda = 9\): eigenvector \(\begin{bmatrix} 1 \\ 1 \end{bmatrix}\)
  • For \(\lambda = 4\): eigenvector \(\begin{bmatrix} 1 \\ -1 \end{bmatrix}\)

After normalization, \[ \mathbf{V} = \frac{1}{\sqrt{2}} \begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix}. \] This represents the first basis change in the domain.

Step 4: Left Singular Vectors (\(\mathbf{U}\))

Using \(\mathbf{U} = \mathbf{A}\mathbf{V}\boldsymbol{\Sigma}^{-1}\), we obtain \[ \mathbf{U} = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}. \] (This matrix happens to be the identity here, meaning no rotation in the codomain.)

Step 5: Final SVD

\[ \mathbf{A} = \underbrace{\mathbf{U}}_{\text{basis change}} \;\; \underbrace{\boldsymbol{\Sigma}}_{\text{scaling}} \;\; \underbrace{\mathbf{V}^\top}_{\text{basis change}} = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} \begin{bmatrix} 3 & 0 \\ 0 & 2 \end{bmatrix} \frac{1}{\sqrt{2}} \begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix}. \]

The SVD exists for every matrix. For symmetric positive definite matrices, it coincides with the eigendecomposition.
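This coincidence can be checked directly: for a symmetric positive definite matrix, the singular values equal the eigenvalues. A small sketch, with an SPD matrix of our own choosing (note `eigh` reports eigenvalues in ascending order while `svd` reports singular values in descending order):

```python
import numpy as np

# A symmetric positive definite matrix (illustrative choice;
# its eigenvalues 4 +/- sqrt(5) are both positive).
A = np.array([[5.0, 2.0],
              [2.0, 3.0]])

lam, Q = np.linalg.eigh(A)       # eigendecomposition, lam ascending
U, s, Vt = np.linalg.svd(A)      # SVD, s descending

# Same spectrum, opposite ordering conventions.
assert np.allclose(s, lam[::-1])
assert np.allclose(U @ np.diag(s) @ Vt, A)
```

(The factors \(\mathbf{U}\) and \(\mathbf{Q}\) agree only up to column order and sign, so we compare the spectra rather than the matrices.)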

Example 4.21 Consider the \(3 \times 2\) matrix \[ \mathbf{A} = \begin{bmatrix} 1 & 0 \\ 0 & 2 \\ 0 & 0 \end{bmatrix}. \] We compute the Singular Value Decomposition of \(\mathbf{A}\).

Step 1: Compute \(\mathbf{A}^\top \mathbf{A}\)

\[ \mathbf{A}^\top \mathbf{A} = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 2 & 0 \end{bmatrix} \begin{bmatrix} 1 & 0 \\ 0 & 2 \\ 0 & 0 \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 0 & 4 \end{bmatrix}. \]

Step 2: Singular Values

The eigenvalues of \(\mathbf{A}^\top \mathbf{A}\) are \[ \lambda_1 = 4, \quad \lambda_2 = 1. \] Thus the singular values are \[ \sigma_1 = 2, \quad \sigma_2 = 1 \;\;\; \Longrightarrow \;\;\; \boldsymbol{\Sigma} = \begin{bmatrix} 2 & 0 \\ 0 & 1 \\ 0 & 0 \end{bmatrix}. \]

Step 3: Right Singular Vectors (\(\mathbf{V}\))

Since \(\mathbf{A}^\top \mathbf{A}\) is diagonal, its eigenvectors are the standard basis vectors; ordering the columns to match the descending singular values \(\sigma_1 = 2, \sigma_2 = 1\) gives \[ \mathbf{V} = \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}. \] This matrix represents a basis change in the domain \(\mathbb{R}^2\).

Step 4: Left Singular Vectors (\(\mathbf{U}\))

Since \(\boldsymbol{\Sigma}\) is not square here, we compute the first two columns via \[ u_i = \frac{1}{\sigma_i} \mathbf{A} v_i, \quad i = 1, 2, \] and extend them with a third orthonormal vector to obtain \[ \mathbf{U} = \begin{bmatrix} 0 & 1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 1 \end{bmatrix}. \] This is an orthonormal basis of \(\mathbb{R}^3\) whose first two columns span the image of \(\mathbf{A}\).

Step 5: Final SVD

\[ \mathbf{A} = \underbrace{\mathbf{U}}_{\text{basis change in } \mathbb{R}^3} \underbrace{\boldsymbol{\Sigma}}_{\text{scaling}} \underbrace{\mathbf{V}^\top}_{\text{basis change in } \mathbb{R}^2}. \] Explicitly, \[ \mathbf{A} = \begin{bmatrix} 0 & 1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} 2 & 0 \\ 0 & 1 \\ 0 & 0 \end{bmatrix} \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}. \]
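This worked example can be verified numerically. NumPy's factors may differ from the hand-derived ones by signs of paired singular vectors, so we check the singular values and the reconstruction rather than the individual factors:

```python
import numpy as np

# The matrix from Example 4.21.
A = np.array([[1.0, 0.0],
              [0.0, 2.0],
              [0.0, 0.0]])

U, s, Vt = np.linalg.svd(A, full_matrices=True)

# Singular values match Step 2: sigma_1 = 2, sigma_2 = 1.
assert np.allclose(s, [2.0, 1.0])

# Rebuild the rectangular 3x2 Sigma and check the factorization.
Sigma = np.zeros((3, 2))
np.fill_diagonal(Sigma, s)
assert np.allclose(U @ Sigma @ Vt, A)
```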

4.5.3 Comparison: Eigenvalue Decomposition vs SVD

  • Matrix type — Eigendecomposition: square matrices only; SVD: any \(m \times n\) matrix.
  • Basis vectors — Eigendecomposition: not necessarily orthonormal; SVD: orthonormal columns of \(\mathbf{U}\) and \(\mathbf{V}\).
  • Diagonal entries — Eigendecomposition: eigenvalues (can be negative or complex); SVD: non-negative singular values.
  • Basis change — Eigendecomposition: within the same vector space; SVD: domain and codomain can differ.
  • Relation to eigenvectors — Left-singular vectors are eigenvectors of \(\mathbf{A}\mathbf{A}^\top\); right-singular vectors are eigenvectors of \(\mathbf{A}^\top \mathbf{A}\).
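The contrast in the diagonal entries can be seen on a small symmetric but indefinite matrix of our own choosing: its eigenvalues include a negative value, while its singular values are the absolute values of the eigenvalues.

```python
import numpy as np

# Symmetric but indefinite: eigenvalues are 2 and -3.
A = np.array([[2.0, 0.0],
              [0.0, -3.0]])

eigvals = np.linalg.eigvalsh(A)          # ascending: [-3, 2]
s = np.linalg.svd(A, compute_uv=False)   # descending, non-negative: [3, 2]

assert np.allclose(eigvals, [-3.0, 2.0])
assert np.allclose(s, [3.0, 2.0])

# Right-singular vectors are eigenvectors of A^T A:
# (A^T A) v_i = sigma_i^2 v_i.
U, s, Vt = np.linalg.svd(A)
assert np.allclose((A.T @ A) @ Vt.T, Vt.T @ np.diag(s**2))
```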

Exercises

Exercise 4.32 Find an SVD of \[\mathbf{A} = \begin{bmatrix}3&0\\4&5 \end{bmatrix}.\]

Exercise 4.33 Find an SVD of \[\mathbf{A} = \begin{bmatrix}3&2&2\\2&3&-2 \end{bmatrix}.\]

Exercise 4.34 Find an SVD of \[\mathbf{A} = \begin{bmatrix}4&0\\3&-5 \end{bmatrix}.\]

Exercise 4.35 Find an SVD of \[\mathbf{A} = \begin{bmatrix}1&0&1\\-1&1&0 \end{bmatrix}.\]

Exercise 4.36 Find an SVD of \[\mathbf{A} = \begin{bmatrix}1&-1\\0&1\\1&0 \end{bmatrix}.\]

Exercise 4.37 Find an SVD of \[\mathbf{A} = \begin{bmatrix}1&1&1\\ -1&0&-2\\ 1&2&0 \end{bmatrix}.\]