I have been trying to develop a program that solves a system $Ax=b$ for a square matrix $A$ using LU decomposition. However, I realized that this decomposition does not always exist (one way to tell: if Gaussian elimination succeeds without requiring any row exchanges, then the decomposition exists). Still, many sources describe this as an excellent method for computing solutions to $Ax=b$.

My question is: how often does one come across a matrix that does not have an LU decomposition? If one does encounter such a matrix, how should one handle it? Should one fall back on a separate method, such as Gaussian elimination with row exchanges, just in case?

Please provide me with some insight on this. Thanks in advance.

Note: I am trying to use this information to solve $A^TAx=A^Tb$, i.e., to find a mathematical model using least squares.


Taken from Wikipedia, in its most concise form:

Any square matrix $A$ admits an LUP factorization. If $A$ is invertible, then it admits an LU (or LDU) factorization if and only if all its leading principal minors are non-zero. If $A$ is a singular matrix of rank $k$, then it admits an LU factorization if the first $k$ leading principal minors are non-zero, although the converse is not true.
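To make the "LUP factorization always exists" point concrete, here is a minimal sketch of LU decomposition with partial pivoting (i.e., $PA = LU$) in plain Python. The function name `lup_decompose` and the representation of the permutation as a list of row indices are my own choices, not from any particular library:

```python
def lup_decompose(A):
    """Return (L, U, perm) such that the row-permuted matrix A[perm]
    equals L*U. Partial pivoting: at each step the row with the largest
    pivot in absolute value is swapped up, so the factorization exists
    for any invertible A (and tiny pivots are avoided)."""
    n = len(A)
    U = [row[:] for row in A]            # working copy, becomes U
    L = [[0.0] * n for _ in range(n)]
    perm = list(range(n))
    for k in range(n):
        # choose the pivot row with the largest |U[i][k]|
        p = max(range(k, n), key=lambda i: abs(U[i][k]))
        if U[p][k] == 0.0:
            raise ValueError("matrix is singular")
        # swap rows k and p in U, in the computed part of L, and in perm
        U[k], U[p] = U[p], U[k]
        L[k], L[p] = L[p], L[k]
        perm[k], perm[p] = perm[p], perm[k]
        L[k][k] = 1.0
        for i in range(k + 1, n):
            m = U[i][k] / U[k][k]        # elimination multiplier
            L[i][k] = m
            for j in range(k, n):
                U[i][j] -= m * U[k][j]
    return L, U, perm
```

For example, $A = \begin{pmatrix}0&1\\1&1\end{pmatrix}$ has no plain LU factorization (its first leading principal minor is zero), but after the row swap recorded in `perm` the elimination goes through without trouble.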

I don't have an implementation fully written out, but this looks involved. Depending on your matrix, there may exist simpler numerical schemes that reduce the problem for you.

As for how often one comes across such a matrix: that depends entirely on the matrices your application produces, so it is impossible to answer in general. If you do encounter one, switch to another scheme.

One scheme that I have used often in practice is Gauss-Seidel. Wikipedia has a completely worked-out description of it.
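As a sketch of the Gauss-Seidel idea (my own minimal version, assuming the iteration converges, which is guaranteed e.g. for strictly diagonally dominant or symmetric positive definite matrices):

```python
def gauss_seidel(A, b, x0=None, tol=1e-10, max_iter=1000):
    """Iteratively solve Ax = b. Each sweep updates x[i] in place using
    the most recent values of the other components; stops when the
    largest per-component change falls below tol."""
    n = len(A)
    x = list(x0) if x0 is not None else [0.0] * n
    for _ in range(max_iter):
        max_change = 0.0
        for i in range(n):
            s = sum(A[i][j] * x[j] for j in range(n) if j != i)
            new = (b[i] - s) / A[i][i]
            max_change = max(max_change, abs(new - x[i]))
            x[i] = new
        if max_change < tol:
            break
    return x
```

Note that Gauss-Seidel is a good fit for the asker's normal equations, since $A^TA$ is symmetric and (when $A$ has full column rank) positive definite.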

  • Thank you for the insight on the Gauss-Seidel method! I will try to incorporate it in my program. – Matthew Yang Mar 12 at 6:39

The LU decomposition of an invertible matrix exists if and only if all leading principal minors of the matrix are non-zero.

From your actual question, you are solving:

$$A^TAx = A^Tb$$

$A^TA$ is a square symmetric matrix, so it can be diagonalized as $A^TA = R^{-1}DR$ (with $R$ orthogonal, since the matrix is symmetric), and you can always rearrange this to find $x$. You need non-zero eigenvalues for this to work.

A (square) matrix is invertible if and only if it does not have a zero eigenvalue.

I think solving the system via Gaussian elimination might be the best solution.
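Putting this together for the least-squares use case: form the normal equations $A^TAx = A^Tb$ and solve them by Gaussian elimination with partial pivoting. This is a minimal sketch under my own naming (`lstsq_normal`); for ill-conditioned $A$, a QR-based approach is numerically preferable, since forming $A^TA$ squares the condition number:

```python
def lstsq_normal(A, b):
    """Least squares for an m-by-n matrix A (m >= n, full column rank):
    form G = A^T A and g = A^T b, then solve G x = g by Gaussian
    elimination with partial pivoting and back substitution."""
    m, n = len(A), len(A[0])
    G = [[sum(A[k][i] * A[k][j] for k in range(m)) for j in range(n)]
         for i in range(n)]
    g = [sum(A[k][i] * b[k] for k in range(m)) for i in range(n)]
    # forward elimination with partial pivoting
    for col in range(n):
        p = max(range(col, n), key=lambda r: abs(G[r][col]))
        G[col], G[p] = G[p], G[col]
        g[col], g[p] = g[p], g[col]
        for r in range(col + 1, n):
            f = G[r][col] / G[col][col]
            for c in range(col, n):
                G[r][c] -= f * G[col][c]
            g[r] -= f * g[col]
    # back substitution
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(G[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (g[i] - s) / G[i][i]
    return x
```

For instance, fitting a line $y = c_0 + c_1 t$ through the points $(0,1)$, $(1,3)$, $(2,5)$ recovers intercept $1$ and slope $2$ exactly.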
