Hello.
I think my question is straightforward, but I'm not finding the answer I need in the manual.
I have an overdetermined set of linear equations A*x = b + epsilon.
A is an MxN matrix where M>N. A is known.
b and epsilon are Mx1 column vectors. b is known. epsilon is unknown.
x is an Nx1 column vector. x is unknown.
I want to find the value of x that minimizes the L1 norm of epsilon.
What is the recommended way to do this in Octave?
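For context, my current thinking is to recast it as a linear program and hand it to glpk. Below is a sketch of my own (with made-up example data standing in for my real A and b), so I may well be missing a more direct route:

    % Least-absolute-deviations fit via linear programming with glpk.
    % Reformulation: minimize sum(t) subject to -t <= A*x - b <= t,
    % with decision variables z = [x; t].

    % Example data standing in for my real A and b:
    M = 20;  N = 3;
    A = randn (M, N);
    b = A * [1; -2; 0.5] + 0.1 * randn (M, 1);

    c = [zeros(N, 1); ones(M, 1)];        % objective: sum of slacks t
    Aineq = [ A, -eye(M);                 %  A*x - t <= b
             -A, -eye(M)];                % -A*x - t <= -b
    bineq = [b; -b];

    lb = [-Inf(N, 1); zeros(M, 1)];       % x free, t >= 0
    ub = Inf (N + M, 1);                  % no upper bounds
    ctype = repmat ("U", 1, 2*M);         % all rows are <= constraints
    vartype = repmat ("C", 1, N + M);     % all variables continuous
    sense = 1;                            % minimize

    [z, fmin] = glpk (c, Aineq, bineq, lb, ub, ctype, vartype, sense);
    x = z(1:N);                           % L1-optimal x

Is there a built-in or Octave Forge function that does this more directly, or is a glpk reformulation like this the usual approach?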