3.3 Lengths and Distances

An inner product naturally induces a norm that measures the length of a vector: \[ \|\mathbf{x}\| = \sqrt{\langle \mathbf{x}, \mathbf{x} \rangle}. \] This means we can compute vector lengths directly from the inner product. However, not all norms come from inner products (for example, the Manhattan norm does not).
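The last claim can be checked computationally: a norm is induced by some inner product exactly when it satisfies the parallelogram law \(\|\mathbf{x}+\mathbf{y}\|^2 + \|\mathbf{x}-\mathbf{y}\|^2 = 2\|\mathbf{x}\|^2 + 2\|\mathbf{y}\|^2\). Below is a small Python sketch (the function names are ours, chosen for illustration) showing that the Euclidean norm passes this test while the Manhattan norm fails it:

```python
import math

def euclidean_norm(x):
    """Norm induced by the standard dot product."""
    return math.sqrt(sum(xi * xi for xi in x))

def manhattan_norm(x):
    """Manhattan (L1) norm: not induced by any inner product."""
    return sum(abs(xi) for xi in x)

def parallelogram_law_holds(norm, x, y, tol=1e-9):
    # A norm comes from an inner product iff, for all x and y,
    # ||x+y||^2 + ||x-y||^2 = 2||x||^2 + 2||y||^2.
    xpy = [a + b for a, b in zip(x, y)]
    xmy = [a - b for a, b in zip(x, y)]
    lhs = norm(xpy) ** 2 + norm(xmy) ** 2
    rhs = 2 * norm(x) ** 2 + 2 * norm(y) ** 2
    return abs(lhs - rhs) < tol

x, y = [1.0, 0.0], [0.0, 1.0]
print(parallelogram_law_holds(euclidean_norm, x, y))  # True
print(parallelogram_law_holds(manhattan_norm, x, y))  # False
```

A single failing pair \((\mathbf{x}, \mathbf{y})\) is enough to show the Manhattan norm cannot come from an inner product.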

Theorem 3.2 (Cauchy–Schwarz Inequality) For any vectors \(\mathbf{x}, \mathbf{y}\) in an inner product space: \[ |\langle \mathbf{x}, \mathbf{y} \rangle| \leq \|\mathbf{x}\| \|\mathbf{y}\|. \]

This fundamental inequality relates the inner product to the lengths of vectors.

Example 3.9 Let
\[ \mathbf{x} = \begin{bmatrix}2 \\ -1\end{bmatrix}, \qquad \mathbf{y} = \begin{bmatrix}1 \\ 3\end{bmatrix}. \]

Compute the inner product: \[ \langle \mathbf{x}, \mathbf{y} \rangle = \mathbf{x} \cdot \mathbf{y} = 2(1) + (-1)(3) = -1, \] so the left-hand side is \[ |\langle \mathbf{x}, \mathbf{y} \rangle| = |-1| = 1. \]

Compute the norms: \[ \|\mathbf{x}\| = \sqrt{2^2 + (-1)^2} = \sqrt{5}, \qquad \|\mathbf{y}\| = \sqrt{1^2 + 3^2} = \sqrt{10}. \]

Thus, \[ \|\mathbf{x}\|\,\|\mathbf{y}\| = \sqrt{50} \approx 7.07. \]

We see that \[ 1 \le 7.07, \] so the Cauchy–Schwarz inequality holds.
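The computation in Example 3.9 can be reproduced in a few lines of Python (a sketch using the standard dot product, with variable names chosen for illustration):

```python
import math

x = [2, -1]
y = [1, 3]

# <x, y> = 2(1) + (-1)(3) = -1
inner = sum(a * b for a, b in zip(x, y))

norm_x = math.sqrt(sum(a * a for a in x))  # sqrt(5)
norm_y = math.sqrt(sum(b * b for b in y))  # sqrt(10)

print(abs(inner))          # 1
print(norm_x * norm_y)     # ≈ 7.0711, i.e. sqrt(50)
assert abs(inner) <= norm_x * norm_y  # Cauchy–Schwarz holds
```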


3.3.1 Distance and Metrics

Definition 3.4 In an inner product space \((V, \langle \cdot, \cdot \rangle)\), the distance between two vectors is defined as: \[ d(\mathbf{x}, \mathbf{y}) = \|\mathbf{x} - \mathbf{y}\| = \sqrt{\langle \mathbf{x} - \mathbf{y}, \mathbf{x} - \mathbf{y} \rangle}. \]

Of course, this means that the distance between two vectors depends on how you measure it, that is, on your choice of inner product.

Example 3.10 Let
\[ \mathbf{x} = \begin{bmatrix}1 \\ 2\end{bmatrix}, \qquad \mathbf{y} = \begin{bmatrix}4 \\ -1\end{bmatrix}. \]

We compute the distance \(d(\mathbf{x}, \mathbf{y})\) using two different inner products. In both cases we need the difference vector
\[ \mathbf{x} - \mathbf{y} = \begin{bmatrix}-3 \\ 3\end{bmatrix}. \]

1. Standard Euclidean Inner Product

The standard inner product is
\[ \langle \mathbf{u}, \mathbf{v} \rangle = u_1 v_1 + u_2 v_2. \]

Distance: \[ d_1(\mathbf{x}, \mathbf{y}) = \|\mathbf{x} - \mathbf{y}\| = \sqrt{(-3)^2 + 3^2} = \sqrt{18} = 3\sqrt{2}. \]


2. Weighted Inner Product with Matrix
\[ \langle \mathbf{u}, \mathbf{v} \rangle_A = \mathbf{u}^\top A \mathbf{v}, \qquad A = \begin{bmatrix} 2 & 0 \\ 0 & 5 \end{bmatrix}. \] Distance: \[ d_2(\mathbf{x}, \mathbf{y}) = \sqrt{(\mathbf{x}-\mathbf{y})^\top A (\mathbf{x}-\mathbf{y})} = \sqrt{2(-3)^2 + 5(3)^2} = \sqrt{63}. \]

Thus, \[ d_2(\mathbf{x}, \mathbf{y}) = \sqrt{63} = 3\sqrt{7}. \]
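Both distances from Example 3.10 can be verified numerically. The following Python sketch (plain lists rather than a linear algebra library, with illustrative variable names) computes the Euclidean distance and the distance induced by \(\langle \mathbf{u}, \mathbf{v} \rangle_A\):

```python
import math

x = [1, 2]
y = [4, -1]
d = [a - b for a, b in zip(x, y)]  # difference vector [-3, 3]

# 1. Standard Euclidean distance: sqrt(d . d)
d1 = math.sqrt(sum(c * c for c in d))

# 2. Distance induced by <u, v>_A = u^T A v with A = diag(2, 5)
A = [[2, 0], [0, 5]]
Ad = [sum(A[i][j] * d[j] for j in range(2)) for i in range(2)]
d2 = math.sqrt(sum(d[i] * Ad[i] for i in range(2)))

print(d1)  # ≈ 4.2426 = 3*sqrt(2)
print(d2)  # ≈ 7.9373 = 3*sqrt(7)
```

The same pair of vectors is a different "distance" apart depending on the inner product, exactly as the example shows.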

If the inner product is the standard dot product, \(d(\mathbf{x}, \mathbf{y})\) is the Euclidean distance, as in the first part of the example above.

Definition 3.5 A metric \(d: V \times V \to \mathbb{R}\) satisfies:

  1. Positive definiteness: \(d(\mathbf{x}, \mathbf{y}) \ge 0\) and \(d(\mathbf{x}, \mathbf{y}) = 0 \iff \mathbf{x} = \mathbf{y}\)
  2. Symmetry: \(d(\mathbf{x}, \mathbf{y}) = d(\mathbf{y}, \mathbf{x})\)
  3. Triangle inequality: \(d(\mathbf{x}, \mathbf{z}) \le d(\mathbf{x}, \mathbf{y}) + d(\mathbf{y}, \mathbf{z})\)
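These axioms can be spot-checked numerically for any candidate distance function. The sketch below (our own helper, not a library routine; it samples a finite set of points, so it can only find counterexamples, not prove the axioms) checks the Euclidean distance on random points in \(\mathbb{R}^2\):

```python
import math
import random

def euclidean(x, y):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)))

def check_metric_axioms(d, points, tol=1e-9):
    """Spot-check the three metric axioms on a finite sample of points."""
    for x in points:
        for y in points:
            assert d(x, y) >= -tol                   # non-negativity
            assert (d(x, y) < tol) == (x == y)       # definiteness
            assert abs(d(x, y) - d(y, x)) < tol      # symmetry
            for z in points:
                # triangle inequality (with a small numerical tolerance)
                assert d(x, z) <= d(x, y) + d(y, z) + tol
    return True

random.seed(0)
pts = [tuple(random.uniform(-5, 5) for _ in range(2)) for _ in range(6)]
print(check_metric_axioms(euclidean, pts))  # True
```

A failed assertion would pinpoint which axiom breaks and for which points, which makes this a handy debugging tool when experimenting with nonstandard distance functions.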

Example 3.11 Let \(X\) be any set. Define the discrete metric as \(d:X\times X\to\{0,1\}\) by \[ d(\mathbf{x},\mathbf{y})=\begin{cases} 0 & \text{if } \mathbf{x}=\mathbf{y},\\[4pt] 1 & \text{if } \mathbf{x}\ne \mathbf{y}. \end{cases} \] We show \(d\) satisfies the three metric axioms.

1. Positive Definiteness For all \(\mathbf{x},\mathbf{y}\in X\), \(d(\mathbf{x},\mathbf{y})\in\{0,1\}\), so \(d(\mathbf{x},\mathbf{y})\ge 0\). Moreover \(d(\mathbf{x},\mathbf{y})=0\) exactly when \(\mathbf{x}=\mathbf{y}\) by definition. Thus this function is positive definite.

2. Symmetry For any \(\mathbf{x},\mathbf{y}\in X\), \[ d(\mathbf{x},\mathbf{y})= \begin{cases} 0 & \mathbf{x}=\mathbf{y}\\ 1 & \mathbf{x}\ne \mathbf{y} \end{cases} = \begin{cases} 0 & \mathbf{y}=\mathbf{x}\\ 1 & \mathbf{y}\ne \mathbf{x} \end{cases} = d(\mathbf{y},\mathbf{x}). \] So \(d(\mathbf{x},\mathbf{y})=d(\mathbf{y},\mathbf{x})\) for all \(\mathbf{x},\mathbf{y}\).

3. Triangle inequality We must show for all \(\mathbf{x},\mathbf{y},\mathbf{z}\in X\), \[ d(\mathbf{x},\mathbf{z})\le d(\mathbf{x},\mathbf{y})+d(\mathbf{y},\mathbf{z}). \] There are only a few cases to check (each value is 0 or 1).

  • If \(\mathbf{x}=\mathbf{z}\) then \(d(\mathbf{x},\mathbf{z})=0\). The right-hand side \(d(\mathbf{x},\mathbf{y})+d(\mathbf{y},\mathbf{z})\) is \(\ge 0\), so the inequality holds.
  • If \(\mathbf{x}\ne \mathbf{z}\) then \(d(\mathbf{x},\mathbf{z})=1\). The only way the triangle inequality could fail is if \(d(\mathbf{x},\mathbf{y})+d(\mathbf{y},\mathbf{z})=0\). But \(d(\mathbf{x},\mathbf{y})+d(\mathbf{y},\mathbf{z})=0\) implies both \(d(\mathbf{x},\mathbf{y})=0\) and \(d(\mathbf{y},\mathbf{z})=0\), hence \(\mathbf{x}=\mathbf{y}\) and \(\mathbf{y}=\mathbf{z}\), so \(\mathbf{x}=\mathbf{z}\), contradicting \(\mathbf{x}\ne \mathbf{z}\). Therefore \(d(\mathbf{x},\mathbf{y})+d(\mathbf{y},\mathbf{z})\ge 1 = d(\mathbf{x},\mathbf{z})\), and the inequality holds.

Thus the triangle inequality is satisfied in all cases.

Since all metric axioms hold, \(d\) is a metric on \(X\).
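Because the discrete metric takes only the values 0 and 1, the triangle inequality can also be confirmed by brute force over a small set. A minimal Python sketch (set and names are illustrative):

```python
from itertools import product

def discrete(x, y):
    """The discrete metric: 0 if the points are equal, 1 otherwise."""
    return 0 if x == y else 1

# Brute-force the triangle inequality over every triple from a small set.
X = ["a", "b", "c"]
ok = all(discrete(x, z) <= discrete(x, y) + discrete(y, z)
         for x, y, z in product(X, repeat=3))
print(ok)  # True
```

This mirrors the case analysis above: with values restricted to \(\{0,1\}\), exhaustively checking triples is feasible and every case succeeds.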

Note that while inner products measure similarity, distances measure difference — similar vectors have a large inner product but a small distance.


Exercises

Exercise 3.17 Let \(\mathbf{x} = \begin{bmatrix}2\\4 \end{bmatrix}\). Compute \(\|\mathbf{x}\|\) using the dot product as the inner product that induces the norm.

Exercise 3.18 Let \(\mathbf{x} = \begin{bmatrix}2\\4 \end{bmatrix}\). Compute \(\|\mathbf{x}\|\) using \(\langle \mathbf{x} , \mathbf{y} \rangle = \mathbf{x}^T \begin{bmatrix} 2 & -1\\-1 &2 \end{bmatrix} \mathbf{y}\) as the inner product that induces the norm.

Exercise 3.19 Compute the distance between \(\begin{bmatrix} 1 \\ 2 \end{bmatrix}\) and \(\begin{bmatrix}3\\-4 \end{bmatrix}\) using the dot product as the inner product that induces the norm.

Exercise 3.20

Compute the distance between \(\begin{bmatrix} 1 \\ 2 \end{bmatrix}\) and \(\begin{bmatrix}3\\-4 \end{bmatrix}\) using \(\langle \mathbf{x} , \mathbf{y} \rangle = \mathbf{x}^T \begin{bmatrix} 2 & -1\\-1 &2 \end{bmatrix} \mathbf{y}\) as the inner product that induces the norm.

Exercise 3.21 Consider two numbers \(x,y \in \mathbb{R}\). Show that \(d(x,y) = |x-y|\) is a metric. Then extend this to \(\mathbf{x},\mathbf{y} \in \mathbb{R}^n\), with the absolute value of a vector taken componentwise.

Exercise 3.22 Show that \(\langle u+v, u-v \rangle = \|u\|^2 - \|v\|^2\) for all \(u,v \in V\).

Exercise 3.23

The metric in this example (the discrete metric) is fairly strange. Although it is not very useful in applications, it is handy to know about as it is totally different from the metrics we’ve seen so far. Let \(X \not = \phi\). Define: \[d(x,y) = \begin{cases}0 & x = y \\ 1 &x \not = y \end{cases}.\] Prove that this is a metric.

Exercise 3.24