
Projection (linear algebra)

In linear algebra and functional analysis, a projection is a linear transformation $P$ from a vector space to itself (an endomorphism) such that $P \circ P = P$. That is, whenever $P$ is applied twice to any vector, it gives the same result as if it were applied once (i.e. $P$ is idempotent). It leaves its image unchanged.[1] This definition of "projection" formalizes and generalizes the idea of graphical projection. One can also consider the effect of a projection on a geometrical object by examining the effect of the projection on points in the object.

"Orthogonal projection" redirects here. For the technical drawing concept, see Orthographic projection. For a concrete discussion of orthogonal projections in finite-dimensional linear spaces, see Vector projection.

A square matrix $P$ is called a projection matrix if it is equal to its square, i.e. if $P^2 = P$.[2]: p. 38

A square matrix $P$ is called an orthogonal projection matrix if $P^2 = P = P^{\mathrm{T}}$ for a real matrix, and respectively $P^2 = P = P^{*}$ for a complex matrix, where $P^{\mathrm{T}}$ denotes the transpose of $P$ and $P^{*}$ denotes the adjoint or Hermitian transpose of $P$.[2]: p. 223

A projection matrix that is not an orthogonal projection matrix is called an oblique projection matrix.
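
As a minimal numerical illustration of these three definitions (a sketch using NumPy; the particular matrices below are hypothetical examples, not taken from the article), one can verify idempotence and symmetry directly:

```python
import numpy as np

# An orthogonal projection matrix: idempotent and symmetric.
P_orth = np.array([[1.0, 0.0],
                   [0.0, 0.0]])

# An oblique projection matrix: idempotent but not symmetric.
P_obl = np.array([[1.0, 1.0],
                  [0.0, 0.0]])

for name, P in [("orthogonal", P_orth), ("oblique", P_obl)]:
    print(name,
          "P^2 = P:", np.allclose(P @ P, P),   # projection matrix test
          "P = P^T:", np.allclose(P, P.T))     # orthogonal projection test (real case)
```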

Examples

Orthogonal projection

For example, the function which maps the point $(x, y, z)$ in three-dimensional space to the point $(x, y, 0)$ is an orthogonal projection onto the $xy$-plane. This function is represented by the matrix
$$P = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 0 \end{bmatrix},$$
which is an example of a projection matrix.
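
A quick numerical check of this example (a sketch using NumPy; the sample point is an arbitrary choice):

```python
import numpy as np

# Orthogonal projection onto the xy-plane.
P = np.array([[1, 0, 0],
              [0, 1, 0],
              [0, 0, 0]])

v = np.array([2, 5, 7])           # an arbitrary point (x, y, z)
print(P @ v)                      # [2 5 0]: the z-coordinate is dropped
print(np.allclose(P @ P, P))      # True: P is idempotent
print(np.allclose(P, P.T))        # True: P is an orthogonal projection matrix
```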

Projections on normed vector spaces

When the underlying vector space $X$ is a (not necessarily finite-dimensional) normed vector space, analytic questions, irrelevant in the finite-dimensional case, need to be considered. Assume now $X$ is a Banach space.


Many of the algebraic results discussed above survive the passage to this context. A given direct sum decomposition of $X$ into complementary subspaces still specifies a projection, and vice versa. If $X$ is the direct sum $X = U \oplus V$, then the operator defined by $P(u + v) = u$ is still a projection with range $U$ and kernel $V$. It is also clear that $P^2 = P$. Conversely, if $P$ is a projection on $X$, i.e. $P^2 = P$, then it is easily verified that $(1 - P)^2 = 1 - P$. In other words, $1 - P$ is also a projection. The relation $P^2 = P$ implies $1 = P + (1 - P)$, and $X$ is the direct sum $\operatorname{ran}(P) \oplus \operatorname{ran}(1 - P)$.
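
In finite dimensions this correspondence can be made concrete. The following sketch (assuming a hypothetical decomposition of $\mathbb{R}^2$ into complementary lines $U$ and $V$) builds the projection with range $U$ and kernel $V$ and checks that both $P$ and $1 - P$ are projections:

```python
import numpy as np

u = np.array([1.0, 0.0])          # spans U (the desired range)
v = np.array([1.0, 1.0])          # spans V (the desired kernel), complementary to U

B = np.column_stack([u, v])       # change of basis for R^2 = U ⊕ V
# In the (u, v) basis the projection onto U along V is diag(1, 0).
P = B @ np.diag([1.0, 0.0]) @ np.linalg.inv(B)

I = np.eye(2)
print(np.allclose(P @ P, P))                  # True: P is a projection
print(np.allclose((I - P) @ (I - P), I - P))  # True: 1 - P is also a projection
print(np.allclose(P @ u, u), np.allclose(P @ v, 0.0))  # range U, kernel V
```

Since $V$ is not orthogonal to $U$ in this choice, the resulting $P$ is an oblique projection.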


However, in contrast to the finite-dimensional case, projections need not be continuous in general. If a subspace $U$ of $X$ is not closed in the norm topology, then the projection onto $U$ is not continuous. In other words, the range of a continuous projection $P$ must be a closed subspace. Furthermore, the kernel of a continuous projection (in fact, a continuous linear operator in general) is closed. Thus a continuous projection $P$ gives a decomposition of $X$ into two complementary closed subspaces: $X = \operatorname{ran}(P) \oplus \ker(P) = \ker(1 - P) \oplus \ker(P)$.


The converse holds also, with an additional assumption. Suppose $U$ is a closed subspace of $X$. If there exists a closed subspace $V$ such that $X = U \oplus V$, then the projection $P$ with range $U$ and kernel $V$ is continuous. This follows from the closed graph theorem. Suppose $x_n \to x$ and $Px_n \to y$. One needs to show that $Px = y$. Since $U$ is closed and $\{Px_n\} \subset U$, $y$ lies in $U$, i.e. $Py = y$. Also, $x_n - Px_n = (I - P)x_n \to x - y$. Because $V$ is closed and $\{(I - P)x_n\} \subset V$, we have $x - y \in V$, i.e. $P(x - y) = Px - Py = Px - y = 0$, which proves the claim.


The above argument makes use of the assumption that both $U$ and $V$ are closed. In general, given a closed subspace $U$, there need not exist a complementary closed subspace $V$, although for Hilbert spaces this can always be done by taking the orthogonal complement. For Banach spaces, a one-dimensional subspace always has a closed complementary subspace. This is an immediate consequence of the Hahn–Banach theorem. Let $U$ be the linear span of $u$. By Hahn–Banach, there exists a bounded linear functional $\varphi$ such that $\varphi(u) = 1$. The operator $P(x) = \varphi(x)\, u$ satisfies $P^2 = P$, i.e. it is a projection. Boundedness of $\varphi$ implies continuity of $P$, and therefore $\ker(P) = \operatorname{ran}(I - P)$ is a closed complementary subspace of $U$.
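
A finite-dimensional analogue of this rank-one construction can be written as an outer product, $Px = \varphi(x)\, u$ (a sketch; the vector $u$ and functional $\varphi$ below are arbitrary choices with $\varphi(u) = 1$):

```python
import numpy as np

u = np.array([2.0, 1.0, 0.0])       # spans the one-dimensional subspace U
phi = np.array([0.5, 0.0, 0.0])     # a bounded linear functional with phi(u) = 1
assert np.isclose(phi @ u, 1.0)

P = np.outer(u, phi)                # P(x) = phi(x) * u
print(np.allclose(P @ P, P))        # True: phi(u) = 1 makes P idempotent
print(np.allclose(P @ u, u))        # True: P fixes u, so its range is span(u)
```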

Applications and further considerations

Projections (orthogonal and otherwise) play a major role in algorithms for certain linear algebra problems:

Singular value decomposition
Reduction to Hessenberg form (the first step in many eigenvalue algorithms)
Linear regression (a brief sketch appears below)

Projective elements of matrix algebras are used in the construction of certain K-groups in Operator K-theory.
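
For instance, in linear regression the fitted values are the orthogonal projection of the observation vector onto the column space of the design matrix. A brief sketch (the design matrix $A$ and response $b$ are made-up data; $A(A^{\mathsf T}A)^{-1}A^{\mathsf T}$ is the usual least-squares "hat matrix"):

```python
import numpy as np

# Made-up design matrix (intercept plus one predictor) and response vector.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 2.0, 2.0])

# Orthogonal projection onto the column space of A.
P = A @ np.linalg.inv(A.T @ A) @ A.T

print(np.allclose(P @ P, P))    # True: idempotent
print(np.allclose(P, P.T))      # True: symmetric, hence an orthogonal projection
print(P @ b)                                        # fitted values
print(A @ np.linalg.lstsq(A, b, rcond=None)[0])     # same values from least squares
```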


As stated above, projections are a special case of idempotents. Analytically, orthogonal projections are non-commutative generalizations of characteristic functions. Idempotents are used in classifying, for instance, semisimple algebras, while measure theory begins with considering characteristic functions of measurable sets. Therefore, as one can imagine, projections are very often encountered in the context of operator algebras. In particular, a von Neumann algebra is generated by its complete lattice of projections.
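
In finite dimensions the analogy with characteristic functions can be seen directly: multiplication by the indicator of a set of coordinates is a diagonal orthogonal projection. A small sketch (the chosen coordinate set is arbitrary):

```python
import numpy as np

# Indicator of the coordinate set {0, 2} in R^4, realized as a diagonal
# 0/1 matrix: an orthogonal projection, just as (1_A)^2 = 1_A for sets.
indicator = np.array([1.0, 0.0, 1.0, 0.0])
P = np.diag(indicator)

x = np.array([3.0, 1.0, 4.0, 1.0])
print(P @ x)                    # [3. 0. 4. 0.]: keeps only the selected coordinates
print(np.allclose(P @ P, P))    # True: idempotent
print(np.allclose(P, P.T))      # True: orthogonal projection
```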

Generalizations

More generally, given a map between normed vector spaces $T\colon V \to W$, one can analogously ask for this map to be an isometry on the orthogonal complement of the kernel: that $(\ker T)^{\perp} \to W$ be an isometry (compare Partial isometry); in particular it must be onto. The case of an orthogonal projection is when $W$ is a subspace of $V$. In Riemannian geometry, this is used in the definition of a Riemannian submersion.
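
A finite-dimensional illustration of a map that is an isometry on the orthogonal complement of its kernel (a sketch; the particular matrix is a hypothetical example of a partial isometry):

```python
import numpy as np

# T maps R^3 -> R^2, annihilates the z-axis, and acts isometrically on the xy-plane.
T = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])

v = np.array([3.0, 4.0, 0.0])     # a vector in the orthogonal complement of ker(T)
print(np.linalg.norm(T @ v), np.linalg.norm(v))     # 5.0 and 5.0: lengths preserved

# T T^* is the identity on the target space, while T^* T is the orthogonal
# projection of R^3 onto the orthogonal complement of ker(T).
print(np.allclose(T @ T.T, np.eye(2)))                  # True
print(np.allclose(T.T @ T, np.diag([1.0, 1.0, 0.0])))   # True
```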

See also

Centering matrix, which is an example of a projection matrix.
Dykstra's projection algorithm, to compute the projection onto an intersection of sets
Invariant subspace
Least-squares spectral analysis
Orthogonalization
Properties of trace
References

Banerjee, Sudipto; Roy, Anindya (2014), Linear Algebra and Matrix Analysis for Statistics, Texts in Statistical Science (1st ed.), Chapman and Hall/CRC, ISBN 978-1420095388.
Dunford, N.; Schwartz, J. T. (1958). Linear Operators, Part I: General Theory. Interscience.
Meyer, Carl D. (2000). Matrix Analysis and Applied Linear Algebra. Society for Industrial and Applied Mathematics. ISBN 978-0-89871-454-8.

External links

MIT Linear Algebra Lecture on Projection Matrices on YouTube, from MIT OpenCourseWare
Linear Algebra 15d: The Projection Transformation on YouTube, by Pavel Grinfeld
Planar Geometric Projections Tutorial – a simple-to-follow tutorial explaining the different types of planar geometric projections