Norms

In order to define how close two vectors are, and to define the convergence of sequences of vectors, mathematicians use a special device called a metric (which is a distance function). However, since we need the metric to be compatible with vector addition, a metric on a vector space is usually generated by a norm.

Let V be a vector space over a field 𝔽, where 𝔽 is either the field ℝ of real numbers, the field ℚ of rational numbers, or the field ℂ of complex numbers. A norm on V is a function ∥·∥ : V → ℝ+ = { r∈ℝ : r ≥ 0 }, assigning a nonnegative real number ∥u∥ to every vector u ∈ V and satisfying the following conditions for all x, y ∈ V and all scalars k ∈ 𝔽:

  1. Positivity:    ∥x∥ ≥ 0 and ∥x∥ = 0 if and only if x = 0.
  2. Homogeneity:     ∥kx∥ = |k|·∥x∥.
  3. Triangle inequality:    ∥x+y∥ ≤ ∥x∥ + ∥y∥.
A vector space V together with a norm ∥·∥ is called a normed vector space.

Out of the many possible norms, we mention four of the most important:

  • For every x = [x1, x2, … , xn] ∈ V, we have the 1-norm:
    \[ \| {\bf x}\|_1 = \sum_{k=1}^n | x_k | = |x_1 | + |x_2 | + \cdots + |x_n |. \]
    It is also called the Taxicab norm or Manhattan norm.
  • The Euclidean norm or ℓ²-norm is
    \[ \| {\bf x}\|_2 = \left( \sum_{k=1}^n |x_k|^2 \right)^{1/2} = \left( |x_1|^2 + |x_2|^2 + \cdots + |x_n|^2 \right)^{1/2} . \]
  • The Chebyshev norm (also called the sup-norm or ℓ∞-norm), ∥x∥∞, is defined by
    \[ \| {\bf x}\|_{\infty} = \max_{1 \le k \le n} \left\{ | x_k | \right\} . \]
  • The ℓp-norm (for p ≥ 1):
    \[ \| {\bf x}\|_p = \left( \sum_{k=1}^n |x_k|^p \right)^{1/p} = \left( |x_1|^p + |x_2|^p + \cdots + |x_n|^p \right)^{1/p} . \]
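The four norms above are easy to evaluate directly. The following sketch (using NumPy, which is not part of the text) computes each one for a sample vector and checks the results against `numpy.linalg.norm`:

```python
import numpy as np

x = np.array([3.0, -4.0, 12.0])

norm1 = np.sum(np.abs(x))             # 1-norm: 3 + 4 + 12 = 19
norm2 = np.sqrt(np.sum(np.abs(x)**2))  # 2-norm: sqrt(9 + 16 + 144) = 13
norm_inf = np.max(np.abs(x))          # sup-norm: largest |x_k| = 12
norm3 = np.sum(np.abs(x)**3)**(1/3)   # l3-norm, an instance of the lp-norm

# np.linalg.norm implements the same formulas
assert np.isclose(norm1, np.linalg.norm(x, 1))
assert np.isclose(norm2, np.linalg.norm(x, 2))
assert np.isclose(norm_inf, np.linalg.norm(x, np.inf))
assert np.isclose(norm3, np.linalg.norm(x, 3))
print(norm1, norm2, norm_inf)  # 19.0 13.0 12.0
```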
Theorem 3: The following inequalities hold for all x ∈ ℂn or x ∈ ℝn:
  • \( \displaystyle \| {\bf x} \|_{\infty} \le \| {\bf x} \|_{1} \le n\,\| {\bf x} \|_{\infty} , \)
  • \( \displaystyle \| {\bf x} \|_{\infty} \le \| {\bf x} \|_{2} \le \sqrt{n}\,\| {\bf x} \|_{\infty} , \)
  • \( \displaystyle \| {\bf x} \|_{2} \le \| {\bf x} \|_{1} \le \sqrt{n}\,\| {\bf x} \|_{2} .\)
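The inequalities of Theorem 3 can be spot-checked numerically. A minimal sketch, using NumPy and random real vectors (the vectors and tolerances are illustrative, not from the text):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
for _ in range(1000):
    x = rng.normal(size=n)
    n1 = np.linalg.norm(x, 1)
    n2 = np.linalg.norm(x, 2)
    ninf = np.linalg.norm(x, np.inf)
    # ||x||_inf <= ||x||_1 <= n * ||x||_inf
    assert ninf <= n1 + 1e-12 and n1 <= n * ninf + 1e-12
    # ||x||_inf <= ||x||_2 <= sqrt(n) * ||x||_inf
    assert ninf <= n2 + 1e-12 and n2 <= np.sqrt(n) * ninf + 1e-12
    # ||x||_2 <= ||x||_1 <= sqrt(n) * ||x||_2
    assert n2 <= n1 + 1e-12 and n1 <= np.sqrt(n) * n2 + 1e-12
```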
Given any (real or complex) vector space V, two norms ‖ ‖a and ‖ ‖b are said to be equivalent if there exist positive constants c1 and c2 such that
\[ \| {\bf x}\|_a \le c_1 \| {\bf x}\|_b \qquad \mbox{and} \qquad \| {\bf x}\|_b \le c_2 \| {\bf x}\|_a \qquad \mbox{for all } \quad {\bf x} \in V. \]

Theorem 4: If V is any real or complex vector space of finite dimension, then any two norms on V are equivalent.

With the dot product, we can assign a length to a vector, which is also called the Euclidean norm or 2-norm:

\[ \| {\bf x} \|_2 = \sqrt{ {\bf x}\cdot {\bf x}} = \sqrt{x_1^2 + x_2^2 + \cdots + x_n^2} . \]
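As a quick numerical illustration (a NumPy sketch, not part of the text), the Euclidean norm is exactly the square root of the dot product of a vector with itself:

```python
import numpy as np

x = np.array([1.0, 2.0, 2.0])
# ||x||_2 = sqrt(x . x) = sqrt(1 + 4 + 4) = 3
assert np.isclose(np.sqrt(np.dot(x, x)), np.linalg.norm(x, 2))
print(np.sqrt(np.dot(x, x)))  # 3.0
```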
Since this definition involves a square root, we always use the positive branch of the function \( \sqrt{x} . \) Namely, we choose the branch that assigns \( \sqrt{x} > 0 \) for x > 0.

The dot product is a particular case of a more general form known as an inner product (bilinear in the real case, sesquilinear in the complex case). An inner product space is a vector space equipped with this additional structure. Every inner product space therefore inherits the induced norm and becomes a metric space. In linear algebra, functional analysis, and related areas of mathematics, a norm is a function that assigns a strictly positive length or size to each vector in a vector space, except for the zero vector, which is assigned a length of zero.

The definition of the norm in ℂn requires an accommodation of complex conjugation, which is denoted either by an overline (mathematics) or an asterisk (physics and engineering):

\[ \left( a + b\,{\bf j} \right)^{\ast} = \overline{a + b\,{\bf j}} = a - b\,{\bf j} , \]
where j is the imaginary unit on the complex plane ℂ, so j² = −1. Then the Euclidean norm on an n-dimensional complex space is defined by
\[ \| {\bf z} \| = \sqrt{ {\bf z}\cdot {\bf z}} = \sqrt{\overline{z_1} \,z_1 + \overline{z_2}\,z_2 + \cdots + \overline{z_n}\,z_n} = \sqrt{|z_1|^2 + |z_2 |^2 + \cdots + |z_n |^2} . \]
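The complex Euclidean norm can be sketched in NumPy (an illustration, not part of the text); note that `np.vdot` conjugates its first argument, matching the complex dot product above:

```python
import numpy as np

z = np.array([3 + 4j, 1 - 2j])
# conj(z_k) * z_k = |z_k|^2, so the sum is real and nonnegative:
# |3+4j|^2 + |1-2j|^2 = 25 + 5 = 30
norm_sq = np.sum(np.conj(z) * z).real
assert np.isclose(np.sqrt(norm_sq), np.linalg.norm(z))
# np.vdot conjugates its first argument, exactly as in the formula
assert np.isclose(np.vdot(z, z).real, norm_sq)
```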
A unit vector u is a vector whose length equals one: \( {\bf u} \cdot {\bf u} =1 . \) We say that two vectors x and y are perpendicular if their inner product is zero.

For the Euclidean norm (and, more generally, for any norm induced by an inner product), the Cauchy--Bunyakovsky--Schwarz (or simply CBS) inequality holds:

\begin{equation} \label{EqVector.1} | {\bf x} \cdot {\bf y} | \le \| {\bf x} \| \, \| {\bf y} \| . \end{equation}
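The CBS inequality is easy to verify numerically. A minimal sketch with NumPy (random vectors chosen for illustration); equality holds exactly when the two vectors are parallel:

```python
import numpy as np

rng = np.random.default_rng(1)
x, y = rng.normal(size=4), rng.normal(size=4)
lhs = abs(np.dot(x, y))
rhs = np.linalg.norm(x) * np.linalg.norm(y)
assert lhs <= rhs + 1e-12  # |x . y| <= ||x|| ||y||

# Equality case: y = 2x is parallel to x
assert np.isclose(abs(np.dot(x, 2 * x)),
                  np.linalg.norm(x) * np.linalg.norm(2 * x))
```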
For p > 1, we define the conjugate exponent q by
\[ \frac{1}{p} + \frac{1}{q} = 1 . \]
Then the CBS inequality can be generalized as
\begin{equation} \label{EqVector.2} | {\bf x} \cdot {\bf y} | \le \| {\bf x} \|_p \, \| {\bf y} \|_q . \end{equation}
Eq.\eqref{EqVector.2} is known as Hölder's inequality.
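Hölder's inequality can also be checked numerically. A sketch with NumPy (the exponent p = 3 and the random vectors are illustrative choices, not from the text):

```python
import numpy as np

p = 3
q = p / (p - 1)          # conjugate exponent: 1/p + 1/q = 1
rng = np.random.default_rng(2)
x, y = rng.normal(size=6), rng.normal(size=6)
lhs = abs(np.dot(x, y))
rhs = np.linalg.norm(x, p) * np.linalg.norm(y, q)
assert lhs <= rhs + 1e-12  # Hölder's inequality
```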
         
[Portraits: Augustin-Louis Cauchy · Viktor Yakovlevich Bunyakovsky · Hermann Amandus Schwarz]

The inequality \eqref{EqVector.1} for sums was published by Augustin-Louis Cauchy (1789--1857) in 1821, while the corresponding inequality for integrals was first proved by Viktor Yakovlevich Bunyakovsky (1804--1889) in 1859. The modern proof of the integral inequality (essentially a repetition of Bunyakovsky's) was given by Hermann Amandus Schwarz (1843--1921) in 1888.   ■