Inner Products

We can analyze minimum-distance classifiers easily by using a little linear algebra. We have already used the symbol x to stand for a column vector of d features, x1, x2, ... , xd. By using the transpose operator ' we can convert the column vector x to the row vector x':

x' = ( x1 , x2 , ... , xd ) .
The inner product of two column vectors x and y is defined by

x' y = x1 y1 + x2 y2 + ... + xd yd .
Thus the norm of x (using the Euclidean metric) is given by

|| x || = sqrt( x' x ) .
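As a quick check, here is a minimal Python sketch of these two definitions (the function names `inner` and `norm` are ours, introduced for illustration):

```python
import math

def inner(x, y):
    # Inner product x' y = x1*y1 + x2*y2 + ... + xd*yd
    return sum(xi * yi for xi, yi in zip(x, y))

def norm(x):
    # Euclidean norm || x || = sqrt( x' x )
    return math.sqrt(inner(x, x))

x = [3.0, 4.0]
print(norm(x))  # 5.0
```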

Here are some important additional properties of inner products:
x' y = y' x = || x || || y || cos( angle between x and y )

x' ( y + z ) = x' y + x' z .
Thus, for fixed || x || and || y ||, the inner product of x and y is largest when the angle between them is zero, i.e., when one is a positive multiple of the other. Sometimes we say (a little loosely) that x' y is the correlation between x and y, and that the correlation is maximum when x and y point in the same direction. If x' y = 0, the vectors x and y are said to be orthogonal or uncorrelated.
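These properties are easy to verify numerically. The sketch below (again with hypothetical helper names `inner`, `norm`, and `cos_angle`) checks the cosine relation, the maximal correlation for a positive multiple, and orthogonality:

```python
import math

def inner(x, y):
    # Inner product x' y
    return sum(xi * yi for xi, yi in zip(x, y))

def norm(x):
    # Euclidean norm || x || = sqrt( x' x )
    return math.sqrt(inner(x, x))

def cos_angle(x, y):
    # Cosine of the angle between x and y, from x' y = ||x|| ||y|| cos(theta)
    return inner(x, y) / (norm(x) * norm(y))

x = [1.0, 0.0]
y = [2.0, 0.0]   # a positive multiple of x: cosine is 1 (maximum correlation)
z = [0.0, 3.0]   # orthogonal to x: inner product is 0
print(cos_angle(x, y))  # 1.0
print(inner(x, z))      # 0.0
```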
