
Invertible matrix

In linear algebra, an n-by-n square matrix A is called invertible (also nonsingular, nondegenerate or, rarely, regular) if there exists an n-by-n square matrix B such that
$$AB = BA = I_n,$$
where $I_n$ denotes the n-by-n identity matrix and the multiplication used is ordinary matrix multiplication. If this is the case, then the matrix B is uniquely determined by A and is called the (multiplicative) inverse of A, denoted by $A^{-1}$.

Over a field, a square matrix that is not invertible is called singular or degenerate. A square matrix with entries in a field is singular if and only if its determinant is zero. Singular matrices are rare in the sense that if a square matrix's entries are randomly selected from any bounded region on the number line or complex plane, the probability that the matrix is singular is 0, that is, it will "almost never" be singular. Non-square matrices, i.e. m-by-n matrices for which m ≠ n, do not have an inverse. However, in some cases such a matrix may have a left inverse or right inverse. If A is m-by-n and the rank of A is equal to n (n ≤ m), then A has a left inverse, an n-by-m matrix B such that $BA = I_n$. If A has rank m (m ≤ n), then it has a right inverse, an n-by-m matrix B such that $AB = I_m$.
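For illustration, here is a minimal NumPy sketch (the matrix values are illustrative, not from the source) of a left inverse for a full-column-rank matrix, computed by the standard formula B = (AᵀA)⁻¹Aᵀ:

```python
import numpy as np

# A is 3-by-2 (m = 3, n = 2) with rank 2, so it has a left inverse but no right inverse.
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 7.0]])

# One left inverse: B = (A^T A)^(-1) A^T, an n-by-m (2-by-3) matrix.
B = np.linalg.inv(A.T @ A) @ A.T

print(np.allclose(B @ A, np.eye(2)))   # True: BA = I_n
print(np.allclose(A @ B, np.eye(3)))   # False: AB is only a projection, not I_m
```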


While the most common case is that of matrices over the real or complex numbers, all these definitions can be given for matrices over any algebraic structure equipped with addition and multiplication (i.e. rings). However, when the ring is commutative, the condition for a square matrix to be invertible is that its determinant is invertible in the ring, which in general is a stricter requirement than being nonzero. For a noncommutative ring, the usual determinant is not defined. The conditions for the existence of a left inverse or right inverse are more complicated, since a notion of rank does not exist over rings.
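As a brief illustration (an example added here, not from the source): over the ring of integers $\mathbb{Z}$, a nonzero determinant is not enough, because the determinant must be a unit of the ring (here, ±1). For instance,
$$A = \begin{pmatrix} 2 & 0 \\ 0 & 1 \end{pmatrix}, \qquad \det A = 2 \neq 0, \qquad A^{-1} = \begin{pmatrix} \tfrac{1}{2} & 0 \\ 0 & 1 \end{pmatrix},$$
so A is invertible over the rationals but has no inverse with integer entries.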


The set of n × n invertible matrices with entries from a ring R, together with the operation of matrix multiplication, forms a group, the general linear group of degree n, denoted $\operatorname{GL}_n(R)$.

Properties

The invertible matrix theorem

Let A be a square n-by-n matrix over a field K (e.g., the field of real numbers). The following statements are equivalent, i.e., they are either all true or all false for any given matrix:[2]

- A is invertible, i.e. it has an inverse under matrix multiplication.
- A is row-equivalent to the n-by-n identity matrix $I_n$.
- det A ≠ 0.
- A has full rank, i.e. rank A = n.
- The equation Ax = 0 has only the trivial solution x = 0.
- The columns of A are linearly independent and span $K^n$.
- The transpose $A^{\mathsf T}$ is invertible.
- The number 0 is not an eigenvalue of A.
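For illustration (not part of the cited theorem statement), several of these conditions can be checked numerically with NumPy for a sample matrix:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [5.0, 3.0]])    # an illustrative invertible 2-by-2 matrix (det = 1)

# A few of the equivalent conditions, verified numerically:
print(not np.isclose(np.linalg.det(A), 0))               # determinant is nonzero
print(np.linalg.matrix_rank(A) == A.shape[0])            # A has full rank n
print(not np.any(np.isclose(np.linalg.eigvals(A), 0)))   # 0 is not an eigenvalue

# Ax = 0 has only the trivial solution when A is invertible:
x = np.linalg.solve(A, np.zeros(2))
print(np.allclose(x, 0))                                 # True
```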

Methods of matrix inversion

Gaussian elimination

Gaussian elimination is a useful and easy way to compute the inverse of a matrix. To compute a matrix inverse using this method, an augmented matrix is first created with the left side being the matrix to invert and the right side being the identity matrix. Then, Gaussian elimination is used to convert the left side into the identity matrix, which causes the right side to become the inverse of the input matrix.


For example, take the following matrix:
$$A = \begin{pmatrix} -1 & \tfrac{3}{2} \\ 1 & -1 \end{pmatrix}.$$


The first step to compute its inverse is to create the augmented matrix
$$\left(\begin{array}{cc|cc} -1 & \tfrac{3}{2} & 1 & 0 \\ 1 & -1 & 0 & 1 \end{array}\right).$$


Call the first row of this matrix $R_1$ and the second row $R_2$. Then, add row 1 to row 2 ($R_2 \to R_2 + R_1$). This yields
$$\left(\begin{array}{cc|cc} -1 & \tfrac{3}{2} & 1 & 0 \\ 0 & \tfrac{1}{2} & 1 & 1 \end{array}\right).$$


Next, subtract row 2, multiplied by 3, from row 1 ($R_1 \to R_1 - 3R_2$), which yields
$$\left(\begin{array}{cc|cc} -1 & 0 & -2 & -3 \\ 0 & \tfrac{1}{2} & 1 & 1 \end{array}\right).$$


Finally, multiply row 1 by −1 ($R_1 \to -R_1$) and row 2 by 2 ($R_2 \to 2R_2$). This yields the identity matrix on the left side and the inverse matrix on the right:
$$\left(\begin{array}{cc|cc} 1 & 0 & 2 & 3 \\ 0 & 1 & 2 & 2 \end{array}\right).$$


Thus,
$$A^{-1} = \begin{pmatrix} 2 & 3 \\ 2 & 2 \end{pmatrix}.$$
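As a quick check (added here for illustration), multiplying A by the matrix just obtained does give the identity:
$$A A^{-1} = \begin{pmatrix} -1 & \tfrac{3}{2} \\ 1 & -1 \end{pmatrix} \begin{pmatrix} 2 & 3 \\ 2 & 2 \end{pmatrix} = \begin{pmatrix} -2+3 & -3+3 \\ 2-2 & 3-2 \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}.$$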


The reason this works is that the process of Gaussian elimination can be viewed as a sequence of left multiplications by elementary matrices ($E_1, E_2, \ldots, E_k$), each performing one elementary row operation, such that
$$E_k E_{k-1} \cdots E_2 E_1 A = I.$$


Applying right-multiplication by $A^{-1}$ to both sides, we get $E_k E_{k-1} \cdots E_2 E_1 A A^{-1} = I A^{-1}$, and the right side $I A^{-1} = A^{-1}$ is the inverse we want; that is, $E_k E_{k-1} \cdots E_2 E_1 I = A^{-1}$, so the same sequence of row operations applied to the identity matrix produces the inverse.
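Concretely, for the worked example above, the three row-operation steps correspond to the elementary matrices (written out here for illustration; the last step combines the two row scalings into one diagonal matrix):
$$E_1 = \begin{pmatrix} 1 & 0 \\ 1 & 1 \end{pmatrix}, \qquad E_2 = \begin{pmatrix} 1 & -3 \\ 0 & 1 \end{pmatrix}, \qquad E_3 = \begin{pmatrix} -1 & 0 \\ 0 & 2 \end{pmatrix},$$
and indeed
$$E_3 E_2 E_1 = \begin{pmatrix} 2 & 3 \\ 2 & 2 \end{pmatrix} = A^{-1}.$$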


To obtain $A^{-1}$, we therefore create the augmented matrix $(A \mid I)$ by combining A with I and apply Gaussian elimination. The two portions are transformed by the same sequence of elementary row operations; when the left portion becomes I, the right portion, having undergone that same sequence, has become $A^{-1}$.
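A minimal Python sketch of this procedure (a hand-rolled Gauss–Jordan elimination with partial pivoting; the function name is my own, and in practice one would typically call numpy.linalg.inv instead):

```python
import numpy as np

def gauss_jordan_inverse(A):
    """Invert a square matrix by reducing the augmented matrix [A | I] to [I | A^-1]."""
    n = A.shape[0]
    aug = np.hstack([A.astype(float), np.eye(n)])    # build the augmented matrix [A | I]

    for col in range(n):
        # Partial pivoting: swap in the row with the largest entry in this column.
        pivot = col + np.argmax(np.abs(aug[col:, col]))
        if np.isclose(aug[pivot, col], 0.0):
            raise ValueError("matrix is singular")
        aug[[col, pivot]] = aug[[pivot, col]]

        aug[col] /= aug[col, col]                     # scale the pivot row so the pivot is 1
        for row in range(n):                          # eliminate this column in all other rows
            if row != col:
                aug[row] -= aug[row, col] * aug[col]

    return aug[:, n:]                                 # the right half is now A^-1

A = np.array([[-1.0, 1.5],
              [ 1.0, -1.0]])                          # the example matrix from the text
print(gauss_jordan_inverse(A))                        # [[2. 3.], [2. 2.]]
```

Running it on the 2-by-2 example above returns the same inverse derived by hand.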

Newton's method

A generalization of Newton's method as used for a multiplicative inverse algorithm may be convenient if a suitable starting seed is easy to find:
$$X_{k+1} = 2X_k - X_k A X_k.$$
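A brief NumPy sketch of this iteration (assuming the Newton–Schulz update $X_{k+1} = 2X_k - X_k A X_k$ and the common starting seed $X_0 = A^{\mathsf T} / (\lVert A\rVert_1 \lVert A\rVert_\infty)$; the tolerance, iteration cap, and sample matrix are illustrative choices):

```python
import numpy as np

def newton_inverse(A, tol=1e-12, max_iter=100):
    """Approximate A^-1 by the Newton-Schulz iteration X <- X (2I - A X)."""
    n = A.shape[0]
    # A standard starting seed that makes the iteration converge:
    X = A.T / (np.linalg.norm(A, 1) * np.linalg.norm(A, np.inf))
    I = np.eye(n)
    for _ in range(max_iter):
        X_next = X @ (2 * I - A @ X)          # equivalent to 2X - X A X
        if np.linalg.norm(X_next - X) < tol:
            return X_next
        X = X_next
    return X

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
X = newton_inverse(A)
print(np.allclose(A @ X, np.eye(2)))          # True
```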

Generalized inverse

Some of the properties of inverse matrices are shared by generalized inverses (for example, the Moore–Penrose inverse), which can be defined for any m-by-n matrix.[17]
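For example (an illustrative sketch, not from the cited source), NumPy's numpy.linalg.pinv computes the Moore–Penrose inverse even for a rank-deficient, non-square matrix:

```python
import numpy as np

# A rank-deficient 3-by-2 matrix: it has no ordinary inverse and no left or right inverse.
A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])

A_plus = np.linalg.pinv(A)               # Moore-Penrose inverse, defined for any m-by-n matrix

# It satisfies the defining Penrose condition A A+ A = A, even though A+ A is not the identity.
print(np.allclose(A @ A_plus @ A, A))    # True
```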

"Inversion of a matrix", Encyclopedia of Mathematics, EMS Press, 2001 [1994].

Cormen, Thomas H.; Leiserson, Charles E.; Rivest, Ronald L.; Stein, Clifford (2001) [1990]. "28.4: Inverting matrices". Introduction to Algorithms (2nd ed.). MIT Press and McGraw-Hill. pp. 755–760. ISBN 0-262-03293-7.

Bernstein, Dennis S. (2009). Matrix Mathematics: Theory, Facts, and Formulas (2nd ed.). Princeton University Press. ISBN 978-0691140391 – via Google Books.

Petersen, Kaare Brandt; Pedersen, Michael Syskind (November 15, 2012). "The Matrix Cookbook" (PDF). pp. 17–23.

Sanderson, Grant (August 15, 2016). "Inverse Matrices, Column Space and Null Space". Essence of Linear Algebra. Archived from the original on 2021-11-03 – via YouTube.

Strang, Gilbert. "Linear Algebra Lecture on Inverse Matrices". MIT OpenCourseWare.

Moore-Penrose Inverse Matrix