Rank factorization

In mathematics, given a field $\mathbb{F}$, nonnegative integers $m, n$, and a matrix $A \in \mathbb{F}^{m \times n}$, a rank decomposition or rank factorization of $A$ is a factorization of $A$ of the form $A = CF$, where $C \in \mathbb{F}^{m \times r}$ and $F \in \mathbb{F}^{r \times n}$, where $r$ is the rank of $A$.
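A minimal numerical sketch of the definition (my own illustration, not part of the article; it assumes NumPy, and the random factors have full rank almost surely): any product of an $m \times r$ matrix with full column rank and an $r \times n$ matrix with full row rank is a matrix of rank $r$.

    import numpy as np

    rng = np.random.default_rng(0)
    m, n, r = 5, 4, 2
    C = rng.standard_normal((m, r))   # m x r, full column rank (almost surely)
    F = rng.standard_normal((r, n))   # r x n, full row rank (almost surely)
    A = C @ F                         # A = CF
    assert np.linalg.matrix_rank(A) == r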

Existence

Every finite-dimensional matrix has a rank decomposition: Let $A$ be an $m \times n$ matrix whose column rank is $r$. Therefore, there are $r$ linearly independent columns in $A$; equivalently, the dimension of the column space of $A$ is $r$. Let $c_1, c_2, \ldots, c_r$ be any basis for the column space of $A$ and place them as column vectors to form the $m \times r$ matrix $C = \begin{pmatrix} c_1 & c_2 & \cdots & c_r \end{pmatrix}$. Therefore, every column vector of $A$ is a linear combination of the columns of $C$. To be precise, if $A = \begin{pmatrix} a_1 & a_2 & \cdots & a_n \end{pmatrix}$ is an $m \times n$ matrix with $a_j$ as the $j$-th column, then

$$a_j = f_{1j} c_1 + f_{2j} c_2 + \cdots + f_{rj} c_r,$$

where the $f_{ij}$'s are the scalar coefficients of $a_j$ in terms of the basis $c_1, c_2, \ldots, c_r$. This implies that $A = CF$, where $f_{ij}$ is the $(i, j)$-th element of $F$.
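This existence argument translates directly into a computation. A hedged sketch (my own code, not the article's; it assumes NumPy and that a basis of the column space is already known): stack the basis vectors as the columns of $C$, then recover the coefficients $f_{ij}$ by solving $CF = A$ column by column; least squares is exact here because every column of $A$ lies in the column space of $C$.

    import numpy as np

    A = np.array([[1., 2., 3.],
                  [2., 4., 6.],
                  [1., 0., 1.]])                 # rank 2: column 3 = column 1 + column 2
    C = A[:, :2]                                 # columns 1 and 2 form a basis of col(A)
    F = np.linalg.lstsq(C, A, rcond=None)[0]     # coefficients of each column of A in that basis
    assert np.allclose(C @ F, A)                 # A = CF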

Non-uniqueness

If $A = C_1 F_1$ is a rank factorization, taking $C_2 = C_1 R$ and $F_2 = R^{-1} F_1$ gives another rank factorization for any invertible matrix $R$ of compatible dimensions.

Conversely, if $A = C_1 F_1 = C_2 F_2$ are two rank factorizations of $A$, then there exists an invertible matrix $R$ such that $C_1 = C_2 R$ and $F_1 = R^{-1} F_2$.[1]
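A short numerical sketch of this non-uniqueness (my own illustration, not the article's; it assumes NumPy): any invertible $r \times r$ matrix $R$ converts one rank factorization into another, since $(C_1 R)(R^{-1} F_1) = C_1 F_1 = A$.

    import numpy as np

    rng = np.random.default_rng(1)
    C1 = rng.standard_normal((4, 2))
    F1 = rng.standard_normal((2, 3))
    A = C1 @ F1
    R = np.array([[2., 1.],
                  [0., 1.]])                     # any invertible 2 x 2 matrix
    C2, F2 = C1 @ R, np.linalg.solve(R, F1)      # C2 = C1 R, F2 = R^{-1} F1
    assert np.allclose(C2 @ F2, A)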

Construction

Rank factorization from reduced row echelon forms

In practice, we can construct one specific rank factorization as follows: we can compute $B$, the reduced row echelon form of $A$. Then $C$ is obtained by removing from $A$ all non-pivot columns (which can be determined by looking for columns in $B$ which do not contain a pivot), and $F$ is obtained by eliminating any all-zero rows of $B$.

Note: For a full-rank square matrix (i.e. when $n = r$), this procedure will yield the trivial result $C = A$ and $F = B = I_n$ (the $n \times n$ identity matrix).
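A hedged sketch of this construction (my own code, not the article's; it assumes SymPy, and the function name rank_factorization is mine): Matrix.rref() returns the reduced row echelon form $B$ together with the pivot column indices, from which $C$ and $F$ are read off as described above.

    import sympy as sp

    def rank_factorization(A: sp.Matrix):
        B, pivot_cols = A.rref()
        r = len(pivot_cols)
        C = A.extract(list(range(A.rows)), list(pivot_cols))   # pivot columns of A
        F = B[:r, :]                                           # nonzero rows of B
        return C, F

    A = sp.Matrix([[1, 2, 0, 1],
                   [0, 1, 1, 1],
                   [1, 3, 1, 2]])   # rank 2 (row 3 = row 1 + row 2)
    C, F = rank_factorization(A)
    assert C * F == A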

Example

Consider the matrix

$$A = \begin{pmatrix} 1 & 3 & 1 & 4 \\ 2 & 7 & 3 & 9 \\ 1 & 5 & 3 & 1 \\ 1 & 2 & 0 & 8 \end{pmatrix} \sim \begin{pmatrix} 1 & 0 & -2 & 0 \\ 0 & 1 & 1 & 0 \\ 0 & 0 & 0 & 1 \\ 0 & 0 & 0 & 0 \end{pmatrix} = B.$$

$B$ is in reduced echelon form.

Then $C$ is obtained by removing the third column of $A$, the only one which is not a pivot column, and $F$ by getting rid of the last row of zeroes from $B$, so

$$C = \begin{pmatrix} 1 & 3 & 4 \\ 2 & 7 & 9 \\ 1 & 5 & 1 \\ 1 & 2 & 8 \end{pmatrix}, \qquad F = \begin{pmatrix} 1 & 0 & -2 & 0 \\ 0 & 1 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix}.$$

It is straightforward to check that

$$A = \begin{pmatrix} 1 & 3 & 1 & 4 \\ 2 & 7 & 3 & 9 \\ 1 & 5 & 3 & 1 \\ 1 & 2 & 0 & 8 \end{pmatrix} = \begin{pmatrix} 1 & 3 & 4 \\ 2 & 7 & 9 \\ 1 & 5 & 1 \\ 1 & 2 & 8 \end{pmatrix} \begin{pmatrix} 1 & 0 & -2 & 0 \\ 0 & 1 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix} = CF.$$
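The product can also be checked numerically; a quick sketch (my own code, not the article's; it assumes NumPy):

    import numpy as np

    A = np.array([[1, 3, 1, 4], [2, 7, 3, 9], [1, 5, 3, 1], [1, 2, 0, 8]])
    C = np.array([[1, 3, 4], [2, 7, 9], [1, 5, 1], [1, 2, 8]])
    F = np.array([[1, 0, -2, 0], [0, 1, 1, 0], [0, 0, 0, 1]])
    assert (C @ F == A).all()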

Proof

Let $P$ be an $n \times n$ permutation matrix such that $AP = \begin{pmatrix} C & D \end{pmatrix}$ in block partitioned form, where the columns of $C$ are the pivot columns of $A$. Every column of $D$ is a linear combination of the columns of $C$, so there is a matrix $G$ such that $D = CG$, where the columns of $G$ contain the coefficients of each of those linear combinations. So $AP = \begin{pmatrix} C & CG \end{pmatrix} = C \begin{pmatrix} I_r & G \end{pmatrix}$, $I_r$ being the $r \times r$ identity matrix. We will show now that $FP = \begin{pmatrix} I_r & G \end{pmatrix}$.

Transforming $A$ into its reduced row echelon form $B$ amounts to left-multiplying by a matrix $E$ which is a product of elementary matrices, so $EAP = BP = \begin{pmatrix} EC & ED \end{pmatrix}$, where $EC = \begin{pmatrix} I_r \\ 0 \end{pmatrix}$. We then can write $BP = \begin{pmatrix} I_r & G \\ 0 & 0 \end{pmatrix}$, which allows us to identify $FP = \begin{pmatrix} I_r & G \end{pmatrix}$, i.e. the nonzero rows of the reduced echelon form, with the same permutation on the columns as we did for $A$. We thus have $AP = C \begin{pmatrix} I_r & G \end{pmatrix} = CFP$, and since $P$ is invertible this implies $A = CF$, and the proof is complete.
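A hedged numerical illustration of this block structure (my own code, not the article's; it assumes SymPy and reuses the example matrix from above): permuting the pivot columns of $A$ to the front gives $AP = \begin{pmatrix} C & D \end{pmatrix}$, and $G$ can be read off the nonzero rows of the permuted echelon form $BP$.

    import sympy as sp

    A = sp.Matrix([[1, 3, 1, 4], [2, 7, 3, 9], [1, 5, 3, 1], [1, 2, 0, 8]])
    B, pivots = A.rref()
    r = len(pivots)
    perm = list(pivots) + [j for j in range(A.cols) if j not in pivots]
    P = sp.eye(A.cols).extract(list(range(A.cols)), perm)   # column permutation matrix
    AP, BP = A * P, B * P
    C = AP[:, :r]                        # pivot columns of A, moved to the front
    G = BP[:r, r:]                       # BP = [[I_r, G], [0, 0]]
    assert BP[:r, :r] == sp.eye(r)
    assert AP[:, r:] == C * G            # D = CG, i.e. AP = C (I_r | G)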

Singular value decomposition

If $\mathbb{F}$ is the field of real or complex numbers, then one can also construct a full-rank factorization of $A$ via a singular value decomposition

$$A = U \Sigma V^{*} = \begin{pmatrix} U_1 & U_2 \end{pmatrix} \begin{pmatrix} \Sigma_r & 0 \\ 0 & 0 \end{pmatrix} \begin{pmatrix} V_1^{*} \\ V_2^{*} \end{pmatrix} = U_1 \Sigma_r V_1^{*}.$$

Since $U_1$ is a full-column-rank matrix and $\Sigma_r V_1^{*}$ is a full-row-rank matrix, we can take $C = U_1$ and $F = \Sigma_r V_1^{*}$.
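A hedged sketch of the SVD-based construction (my own code, not the article's; it assumes NumPy, and the rank tolerance is an arbitrary choice):

    import numpy as np

    A = np.array([[1., 3., 1., 4.],
                  [2., 7., 3., 9.],
                  [1., 5., 3., 1.],
                  [1., 2., 0., 8.]])
    U, s, Vh = np.linalg.svd(A, full_matrices=False)
    r = int(np.sum(s > 1e-10 * s[0]))    # numerical rank
    C = U[:, :r]                         # U_1: full column rank
    F = s[:r, None] * Vh[:r, :]          # Sigma_r V_1^*: full row rank
    assert np.allclose(C @ F, A)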

Consequences

rank(A) = rank(Aᵀ)

An immediate consequence of rank factorization is that the rank of $A$ is equal to the rank of its transpose $A^\mathsf{T}$. Since the columns of $A$ are the rows of $A^\mathsf{T}$, the column rank of $A$ equals its row rank.[2]

Proof: To see why this is true, let us first define rank to mean column rank. Since $A = CF$, it follows that $A^\mathsf{T} = F^\mathsf{T} C^\mathsf{T}$. From the definition of matrix multiplication, this means that each column of $A^\mathsf{T}$ is a linear combination of the columns of $F^\mathsf{T}$. Therefore, the column space of $A^\mathsf{T}$ is contained within the column space of $F^\mathsf{T}$ and, hence, $\operatorname{rank}(A^\mathsf{T}) \leq \operatorname{rank}(F^\mathsf{T})$.

Now, $F^\mathsf{T}$ is $n \times r$, so there are $r$ columns in $F^\mathsf{T}$ and, hence, $\operatorname{rank}(A^\mathsf{T}) \leq r = \operatorname{rank}(A)$. This proves that $\operatorname{rank}(A^\mathsf{T}) \leq \operatorname{rank}(A)$.

Now apply the result to $A^\mathsf{T}$ to obtain the reverse inequality: since $(A^\mathsf{T})^\mathsf{T} = A$, we can write $\operatorname{rank}(A) = \operatorname{rank}((A^\mathsf{T})^\mathsf{T}) \leq \operatorname{rank}(A^\mathsf{T})$. This proves $\operatorname{rank}(A) \leq \operatorname{rank}(A^\mathsf{T})$.

We have, therefore, proved $\operatorname{rank}(A^\mathsf{T}) \leq \operatorname{rank}(A)$ and $\operatorname{rank}(A) \leq \operatorname{rank}(A^\mathsf{T})$, so $\operatorname{rank}(A) = \operatorname{rank}(A^\mathsf{T})$.
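A quick numerical check of this consequence (my own code, not the article's; it assumes NumPy, and the random factors have full rank almost surely):

    import numpy as np

    rng = np.random.default_rng(2)
    A = rng.standard_normal((6, 3)) @ rng.standard_normal((3, 5))   # rank 3
    assert np.linalg.matrix_rank(A) == np.linalg.matrix_rank(A.T) == 3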

Notes

  1. ^ Piziak, R.; Odell, P. L. (1 June 1999). "Full Rank Factorization of Matrices". Mathematics Magazine. 72 (3): 193. doi:10.2307/2690882. JSTOR 2690882.
  2. ^ Banerjee, Sudipto; Roy, Anindya (2014), Linear Algebra and Matrix Analysis for Statistics, Texts in Statistical Science (1st ed.), Chapman and Hall/CRC, ISBN 978-1420095388

References

  • Banerjee, Sudipto; Roy, Anindya (2014), Linear Algebra and Matrix Analysis for Statistics, Texts in Statistical Science (1st ed.), Chapman and Hall/CRC, ISBN 978-1420095388
  • Lay, David C. (2005), Linear Algebra and its Applications (3rd ed.), Addison Wesley, ISBN 978-0-201-70970-4
  • Golub, Gene H.; Van Loan, Charles F. (1996), Matrix Computations, Johns Hopkins Studies in Mathematical Sciences (3rd ed.), The Johns Hopkins University Press, ISBN 978-0-8018-5414-9
  • Stewart, Gilbert W. (1998), Matrix Algorithms. I. Basic Decompositions, SIAM, ISBN 978-0-89871-414-2
  • Piziak, R.; Odell, P. L. (1 June 1999). "Full Rank Factorization of Matrices". Mathematics Magazine. 72 (3): 193. doi:10.2307/2690882. JSTOR 2690882.