# Outer product

In linear algebra, the **outer product** of two coordinate vectors is a matrix. If the two vectors have dimensions *n* and *m*, then their outer product is an *n* × *m* matrix. If the first vector is taken as a column vector, then the outer product is the matrix whose columns are all proportional to this vector, the coefficient of proportionality for each column being the corresponding component of the second vector.

The outer product of two vectors **u** and **v** is their tensor product **u** ⊗ **v**, which is the matrix **uv**^{T}; the outer product of tensors can be used to build up the tensor algebra. More generally, the outer product is an instance of the Kronecker product.

The outer product contrasts with the dot product, which takes as input a pair of coordinate vectors and produces a scalar.

A related function, also called the outer product, is provided in some computer programming languages.

## Definition (matrix multiplication)

The outer product **u** ⊗ **v** is equivalent to a matrix multiplication **uv**^{T}, provided that **u** is represented as an *m* × 1 column vector and **v** as an *n* × 1 column vector (which makes **v**^{T} a row vector).^{[1]} For instance, if *m* = 4 and *n* = 3, then

$$\mathbf{u} \otimes \mathbf{v} = \mathbf{u}\mathbf{v}^\mathsf{T} = \begin{bmatrix} u_1 \\ u_2 \\ u_3 \\ u_4 \end{bmatrix} \begin{bmatrix} v_1 & v_2 & v_3 \end{bmatrix} = \begin{bmatrix} u_1 v_1 & u_1 v_2 & u_1 v_3 \\ u_2 v_1 & u_2 v_2 & u_2 v_3 \\ u_3 v_1 & u_3 v_2 & u_3 v_3 \\ u_4 v_1 & u_4 v_2 & u_4 v_3 \end{bmatrix}.$$^{[2]}

Or, in index notation:

$$(\mathbf{u} \otimes \mathbf{v})_{ij} = u_i v_j$$

For complex vectors, it is customary to use the conjugate transpose of **v**, denoted $\mathbf{v}^\dagger$:

$$\mathbf{u} \otimes \mathbf{v} = \mathbf{u}\mathbf{v}^\dagger.$$
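A minimal NumPy sketch of these definitions (the vector values are illustrative); note that `np.outer` itself does not conjugate its second argument, so the conjugate transpose must be applied explicitly:

```python
import numpy as np

# Illustrative real vectors with m = 4 and n = 3.
u = np.array([1.0, 2.0, 3.0, 4.0])
v = np.array([5.0, 6.0, 7.0])

# Outer product as matrix multiplication: (m x 1) column times (1 x n) row.
outer_mm = u.reshape(-1, 1) @ v.reshape(1, -1)

# For complex vectors, use the conjugate transpose of the second vector.
a = np.array([1 + 1j, 2 - 1j])
b = np.array([3j, 1 - 2j])
outer_conj = a.reshape(-1, 1) @ b.conj().reshape(1, -1)
```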

### Contrast with Euclidean inner product

If *m* = *n*, then one can take the matrix product the other way, yielding a scalar (or 1 × 1 matrix):

$$\mathbf{u}^\mathsf{T} \mathbf{v} = \left\langle \mathbf{u}, \mathbf{v} \right\rangle$$

which is the standard inner product for Euclidean vector spaces, better known as the dot product. The inner product is the trace of the outer product.^{[3]} Unlike the inner product, the outer product is not commutative.
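A quick NumPy check of the trace identity, with illustrative vectors:

```python
import numpy as np

# Illustrative vectors with m = n = 3.
u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])

outer = np.outer(u, v)   # 3 x 3 matrix u v^T
inner = u @ v            # scalar u^T v (the dot product)
```

The trace of `outer` equals `inner`, while `np.outer(u, v)` and `np.outer(v, u)` differ (they are transposes of each other), illustrating non-commutativity.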

### Rank of an outer product

If **u** and **v** are both nonzero then the outer product matrix **uv**^{T} always has matrix rank 1. Indeed, the columns of the outer product are all proportional to the first column, thus they are all linearly dependent on that one column, hence the matrix is of rank one.

("Matrix rank" should not be confused with "tensor order", or "tensor degree", which is sometimes referred to as "rank".)
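A minimal numerical illustration (vectors chosen arbitrarily):

```python
import numpy as np

# The outer product of any two nonzero vectors has matrix rank 1.
u = np.array([1.0, -2.0, 0.5, 3.0])
v = np.array([2.0, 7.0, -1.0])
A = np.outer(u, v)
rank = np.linalg.matrix_rank(A)
```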

## Definition (vectors and tensors)

### Vector multiplication

Given the vectors

$$\mathbf{u} = \begin{bmatrix} u_1 \\ u_2 \\ \vdots \\ u_m \end{bmatrix}, \quad \mathbf{v} = \begin{bmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{bmatrix},$$

their outer product **u** ⊗ **v** is defined as the *m* × *n* matrix **A** obtained by multiplying each element of **u** by each element of **v**:^{[4]}^{[5]}

$$\mathbf{u} \otimes \mathbf{v} = \mathbf{A} = \begin{bmatrix} u_1 v_1 & u_1 v_2 & \cdots & u_1 v_n \\ u_2 v_1 & u_2 v_2 & \cdots & u_2 v_n \\ \vdots & \vdots & \ddots & \vdots \\ u_m v_1 & u_m v_2 & \cdots & u_m v_n \end{bmatrix}.$$

For complex vectors, the outer product can be defined as above, or with the conjugate transpose of **v** (denoted $\mathbf{v}^\dagger$ or $\overline{\mathbf{v}}^\mathsf{T}$). Namely, matrix **A** is obtained by multiplying each element of **u** by the complex conjugate of each element of **v**.

### Tensor multiplication

The outer product on tensors is typically referred to as the tensor product. Given a tensor **a** of order *q* with dimensions (*i*_{1}, ..., *i*_{q}), and a tensor **b** of order *r* with dimensions (*j*_{1}, ..., *j*_{r}), their outer product **c** is of order *q* + *r* with dimensions (*k*_{1}, ..., *k*_{q+r}), which are the *i* dimensions followed by the *j* dimensions. It is denoted **c** = **a** ⊗ **b** in coordinate-free notation, and its components are defined in index notation by:^{[6]}

$$c_{k_1 \ldots k_{q+r}} = a_{k_1 \ldots k_q}\, b_{k_{q+1} \ldots k_{q+r}};$$

similarly, for two second-order tensors:

$$(\mathbf{A} \otimes \mathbf{B})_{ijkl} = A_{ij} B_{kl}.$$

For example, if **A** is of order 3 with dimensions (3, 5, 7) and **B** is of order 2 with dimensions (10, 100), their outer product **c** is of order 5 with dimensions (3, 5, 7, 10, 100). If **A** has a component *A*_{[2, 2, 4]} = 11 and **B** has a component *B*_{[8, 88]} = 13, then the component of **c** formed by the outer product is *c*_{[2, 2, 4, 8, 88]} = 11 × 13 = 143.
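This worked example can be reproduced with NumPy's `np.multiply.outer` (equivalently `np.tensordot` with `axes=0`); note that NumPy indices are 0-based, so the positions below are shifted by one:

```python
import numpy as np

# Tensors with the orders and dimensions from the worked example;
# the nonzero entries mirror A_[2,2,4] = 11 and B_[8,88] = 13.
A = np.zeros((3, 5, 7))
A[1, 1, 3] = 11
B = np.zeros((10, 100))
B[7, 87] = 13

# Outer (tensor) product: order 3 + 2 = 5, dimensions concatenated.
C = np.multiply.outer(A, B)   # same result as np.tensordot(A, B, axes=0)
```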

To understand the matrix definition of outer product in terms of the definition of tensor product:

- The vector **v** can be interpreted as an order-1 tensor with dimension *M*, and the vector **u** as an order-1 tensor with dimension *N*. The result is an order-2 tensor with dimension (*M*, *N*).
- The order of the result of an inner product between two tensors of order *q* and *r* is the greater of *q* + *r* − 2 and 0. Thus, the inner product of two matrices has the same order as the outer product (or tensor product) of two vectors.
- It is possible to add arbitrarily many leading or trailing 1 dimensions to a tensor without fundamentally altering its structure. These 1 dimensions would alter the character of operations on these tensors, so any resulting equivalences should be expressed explicitly.
- The inner product of two matrices **V** with dimensions (*d*, *e*) and **U** with dimensions (*e*, *f*) is $\sum_{j=1}^{e} V_{ij} U_{jk}$, where *i* = 1, 2, ..., *d* and *k* = 1, 2, ..., *f*. For the case where *e* = 1, the summation is trivial (involving only a single term).
- The outer product of two matrices **V** with dimensions (*m*, *n*) and **U** with dimensions (*p*, *q*) is the Kronecker product $\mathbf{V} \otimes \mathbf{U}$, with dimensions (*mp*, *nq*) and entries indexed by *s* = 1, 2, ..., *mp* and *t* = 1, 2, ..., *nq*.
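A brief NumPy sketch of the last point, using `np.kron` for the matrix outer (Kronecker) product with illustrative shapes:

```python
import numpy as np

# Matrix outer (Kronecker) product: dimensions multiply.
V = np.arange(6).reshape(2, 3)   # dimensions (m, n) = (2, 3)
U = np.ones((4, 5))              # dimensions (p, q) = (4, 5)
W = np.kron(V, U)                # dimensions (mp, nq) = (8, 15)
```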

## Definition (abstract)

Let *V* and *W* be two vector spaces, and let *W*^{∗} be the dual space of *W*.
Given vectors *x* ∈ *V* and *y* ∈ *W*^{∗}, then the tensor product *y* ⊗ *x* corresponds to the map *A* : *W* → *V* given by

$$w \mapsto y(w)\,x.$$

Here *y*(*w*) denotes the value of the linear functional *y* (which is an element of the dual space of *W*) when evaluated at the element *w* ∈ *W*. This scalar in turn is multiplied by *x* to give as the final result an element of the space *V*.

If *V* and *W* are finite-dimensional, then the space of all linear transformations from *W* to *V*, denoted Hom(*W*, *V*), is generated by such outer products; in fact, the rank of a matrix is the minimal number of such outer products needed to express it as a sum (this is the **tensor rank** of a matrix). In this case Hom(*W*, *V*) is isomorphic to *W*^{∗} ⊗ *V*.
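As an illustrative numerical check of this fact (not from the source), the singular value decomposition expresses a matrix as a sum of rank-1 outer products, and the number of nonzero singular values gives the minimal count:

```python
import numpy as np

# Build a rank-2 matrix as a sum of two outer products, then recover
# that 2 is the minimal number of rank-1 terms via the SVD.
M = np.outer([1.0, 2.0, 3.0], [4.0, 5.0]) + np.outer([0.0, 1.0, 1.0], [1.0, 0.0])

U, s, Vt = np.linalg.svd(M)
r = int(np.sum(s > 1e-10))   # number of nonzero singular values = rank

# Reconstruct M as a sum of r rank-1 outer products.
approx = sum(s[i] * np.outer(U[:, i], Vt[i]) for i in range(r))
```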

### Contrast with duality pairing

If *W* = *V*, then one can pair the covector *w* ∈ *V*^{∗} with the vector *v* ∈ *V* via the map (*w*, *v*) ↦ *w*(*v*), which is the duality pairing between *V* and its dual.

## In programming languages

In some programming languages, given a two-argument function *f* (or a binary operator), the outer product of *f* and two one-dimensional arrays A and B is a two-dimensional array C such that C[i,j] = f(A[i], B[j]). This is syntactically represented in various ways: in APL, as the infix binary operator `∘.f`; in R, as the function outer(A, B, *f*);^{[7]} in Mathematica, as Outer[*f*, A, B]. In MATLAB, the function kron(A, B) is used for this product. These notations often generalize to multi-dimensional arguments and to more than two arguments.

### Python with NumPy

In the Python library NumPy, the outer product can be computed with the function `np.outer()`.^{[8]}

```
>>> import numpy as np
>>> a = np.array([1, 2, 3])
>>> b = np.array([2, 4, 8])
>>> np.outer(a, b)
array([[ 2,  4,  8],
       [ 4,  8, 16],
       [ 6, 12, 24]])
>>> # in contrast, np.kron, a generalization of the outer product,
>>> # returns a flat array for one-dimensional inputs
>>> np.kron(a, b)
array([ 2,  4,  8,  4,  8, 16,  6, 12, 24])
```

## Properties

The outer product satisfies the following properties:

$$(\mathbf{u} \otimes \mathbf{v})^\mathsf{T} = \mathbf{v} \otimes \mathbf{u}$$

$$(\mathbf{v} + \mathbf{w}) \otimes \mathbf{u} = \mathbf{v} \otimes \mathbf{u} + \mathbf{w} \otimes \mathbf{u}$$

$$\mathbf{u} \otimes (\mathbf{v} + \mathbf{w}) = \mathbf{u} \otimes \mathbf{v} + \mathbf{u} \otimes \mathbf{w}$$

$$c\,(\mathbf{v} \otimes \mathbf{u}) = (c\mathbf{v}) \otimes \mathbf{u} = \mathbf{v} \otimes (c\mathbf{u})$$
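The transpose, distributivity, and scalar-compatibility identities of the outer product can be spot-checked numerically (the vectors and the scalar below are illustrative):

```python
import numpy as np

# Illustrative vectors and scalar for spot-checking the identities.
u = np.array([1.0, 2.0])
v = np.array([3.0, 4.0, 5.0])
w = np.array([6.0, 7.0, 8.0])
c = 2.5

transpose_ok = np.array_equal(np.outer(v, u).T, np.outer(u, v))
distrib_ok = np.allclose(np.outer(v + w, u), np.outer(v, u) + np.outer(w, u))
scalar_ok = np.allclose(c * np.outer(v, u), np.outer(c * v, u))
```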

## Applications

As the outer product is a special case of the Kronecker product, some of the applications of the Kronecker product use outer products; some of these applications to quantum theory, signal processing, and image compression are found in chapter 3, "Applications", in a book by Willi-Hans Steeb and Yorick Hardy.^{[9]}

### Spinors

Suppose *s*, *t*, *w*, *z* ∈ ℂ so that (*s*, *t*) and (*w*, *z*) are in ℂ^{2}. Then the outer product of these complex 2-vectors is an element of M(2, ℂ), the 2 × 2 complex matrices:

$$\begin{pmatrix} sw & sz \\ tw & tz \end{pmatrix}.$$

The determinant of this matrix is *swtz* − *sztw* = 0 because of the commutative property of ℂ.
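A quick numerical confirmation of this null-determinant property (the complex values are chosen arbitrarily):

```python
import numpy as np

# Arbitrary complex values s, t, w, z; the outer product of (s, t) and
# (w, z) is [[sw, sz], [tw, tz]], whose determinant swtz - sztw vanishes.
s, t, w, z = 1 + 2j, 3 - 1j, 2j, 4 + 1j
M = np.outer([s, t], [w, z])
det = np.linalg.det(M)
```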

In the theory of spinors in three dimensions, these matrices are associated with isotropic vectors due to this null property. Élie Cartan described this construction in 1937,^{[10]} but it was introduced by Wolfgang Pauli in 1927,^{[11]} so that M(2, ℂ) has come to be called the Pauli algebra.

### Concepts

The block form of outer products is useful in classification. Concept analysis is a study that depends on certain outer products:

When a vector has only zeros and ones as entries, it is called a *logical vector*, a special case of a logical matrix; the logical operation ∧ (and) takes the place of multiplication. The outer product of two logical vectors (*u*_{i}) and (*v*_{j}) is given by the logical matrix $(a_{ij}) = (u_i \land v_j)$. This type of matrix is used in the study of binary relations and is called a rectangular relation or a **cross-vector**.^{[12]}
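In NumPy, this boolean outer product can be formed with `np.logical_and.outer` (a sketch with illustrative logical vectors):

```python
import numpy as np

# Logical vectors; logical AND replaces multiplication.
u = np.array([True, False, True])
v = np.array([False, True, True])
R = np.logical_and.outer(u, v)   # R[i, j] = u[i] AND v[j]
```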


## References

1. **^** Lipschutz, S.; Lipson, M. (2009). *Linear Algebra*. Schaum's Outlines (4th ed.). McGraw-Hill. ISBN 978-0-07-154352-1.
2. **^** Ortega, James M. (1987). *Matrix Theory: A Second Course*. Plenum Press. p. 7. ISBN 0-306-42433-9.
3. **^** Stengel, Robert F. (1994). *Optimal Control and Estimation*. New York: Dover Publications. p. 26. ISBN 0-486-68200-5.
4. **^** "Kronecker Product". *Wolfram MathWorld*.
5. **^** Lerner, R. G.; Trigg, G. L. (1991). *Encyclopaedia of Physics* (2nd ed.). VHC. ISBN 0-89573-752-3.
6. **^** Riley, K. F.; Hobson, M. P.; Bence, S. J. (2010). *Mathematical Methods for Physics and Engineering*. Cambridge University Press. ISBN 978-0-521-86153-3.
7. **^** "outer function". *RDocumentation*.
8. **^** "numpy.outer". *NumPy Reference*. https://docs.scipy.org/doc/numpy/reference/generated/numpy.outer.html
9. **^** Steeb, Willi-Hans; Hardy, Yorick (2011). *Matrix Calculus and Kronecker Product: A Practical Approach to Linear and Multilinear Algebra* (2nd ed.). World Scientific. ISBN 981-4335-31-2.
10. **^** Cartan, Élie (1937). *Leçons sur la théorie des spineurs*. Translated 1966 as *The Theory of Spinors*. Paris: Hermann.
11. **^** Lounesto, Pertti (1997). *Clifford Algebras and Spinors*. Cambridge University Press. p. 51. ISBN 0-521-59916-4.
12. **^** Kim, Ki Hang (1982). *Boolean Matrix Theory and Applications*. Marcel Dekker. p. 37. ISBN 0-8247-1788-0.

## Further reading

- Carlen, Eric; Conceição Carvalho, Maria (2006). "Outer Products and Orthogonal Projections". *Linear Algebra: From the Beginning*. Macmillan. pp. 217–218.