"LU matrix is singular" Error for well-behaving data (LuDecomposition.solve in ml-matrix) #10

Open
saunus opened this issue Oct 14, 2023 · 7 comments

saunus commented Oct 14, 2023

Hi,

For the data shown in the example below, I get the following error when I try to fit well-behaved data with fit order 2:

Error: LU matrix is singular
    at LuDecomposition.solve (/home/damner/code/node_modules/ml-matrix/matrix.js:3201:13)
    at solve (/home/damner/code/node_modules/ml-matrix/matrix.js:3977:43)
    at regress (/home/damner/code/node_modules/ml-regression-polynomial/lib/index.js:174:45)
    at new PolynomialRegression (/home/damner/code/node_modules/ml-regression-polynomial/lib/index.js:54:28)

The x-data is monotonic, and the y-data looks like a noisy parabola. MATLAB has no issue fitting it.

Any ideas why this is failing?

  • Removing the first, second, or last point solves the issue.
  • Changing the first x-value from 404.26220703125 to 404.26220703124 solves the issue. This happens for a couple of elements, but not for all of them.
  • I guess it has something to do with the %.11f precision (which is not required, but simply comes from the imported .csv). However, I do not want to just round the values before I understand the problem.
  • Currently, I do not know how to set breakpoints inside a module, so I did not look at the input to LuDecomposition.solve.

Example:

const mlr = require('ml-regression-polynomial');
const x = [404.26220703125, 404.27349853515625, 404.28448486328125, 404.29541015625, 404.3040771484375, 404.3128356933594, 404.3204040527344, 404.33154296875, 404.3409423828125, 404.3497314453125, 404.3594665527344, 404.36865234375, 404.37451171875, 404.3879699707031, 404.3951416015625, 404.40545654296875, 404.4149475097656, 404.4233703613281, 404.431884765625, 404.4431457519531, 404.4515686035156, 404.4609069824219, 404.4708251953125, 404.479248046875, 404.48883056640625, 404.4971923828125, 404.50799560546875, 404.5179443359375, 404.5263977050781, 404.5354919433594, 404.5458068847656, 404.5533447265625];
const y = [0.3631210884030956, 0.38180362489729697, 0.391453923949123, 0.4037110236952629, 0.4163452490432181, 0.4288353165530643, 0.4430856652809317, 0.45358586770200177, 0.4562565407929964, 0.4650388402896439, 0.47573011091885675, 0.4837785283897874, 0.4905844679839888, 0.49022240374688425, 0.49039250930602896, 0.49193229796429716, 0.490495250991277, 0.504985674313608, 0.49096439100589284, 0.4781324538477614, 0.48063366656261153, 0.46976621057011786, 0.4587271348758588, 0.44576288402868225, 0.43918945480010974, 0.42895113530071644, 0.41687014727334876, 0.3938250656311999, 0.38222703735756586, 0.3651908364214456, 0.34985174702375216, 0.34842848544450433];

const regression = new mlr.PolynomialRegression(x, y, 2);

Thanks for your help.

ghost commented Oct 16, 2023

Hi @saunus, and sorry about that.

I don't have MATLAB installed, but does this result look correct? (At least the last term is negative.)

PolynomialRegression {
  degree: 2,
  powers: [ 0, 1, 2 ],
  coefficients: [
    0.0004257708983743669,
    0.08609266575876973,
    -0.00021019886491302486
  ]
}

Also, regression.predict(404.5) gives 0.43211824353537054.

I have tested older and newer versions of the software, and I think we just need to pass true for the useSVD argument of the solve function.

(The mathematical reason could be that, due to rounding errors, some rows end up being multiples of each other, which would make the system of equations unsolvable without SVD.)
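
For context, a minimal sketch of that call against ml-matrix's public API (x and y being the arrays from the issue; the variable names are mine):

const { Matrix, solve } = require('ml-matrix');

// Vandermonde-style design matrix for a degree-2 fit: [1, x, x^2].
const F = new Matrix(x.map((xi) => [1, xi, xi * xi]));
const Y = Matrix.columnVector(y);

// The third argument switches to the SVD-based solver instead of LU;
// it does not throw on (numerically) singular systems.
const coefficients = solve(F, Y, true).to1DArray();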

So I may send a PR with this fix.

saunus commented Oct 16, 2023

Thanks for digging into that!

Unfortunately, solving the singular matrix via SVD did not yield the correct result; it suppresses the quadratic term:
[image: plot of the SVD fit, which is almost linear because the quadratic term is suppressed]

Correct result would be: [-1126496.33992227, -6.88810931148790, 5571.1518772892]

  • I already tried rounding the values using several precision settings, without success for all fit curves (~200 for my typical data sets). At least one would always result in a singularity.
  • Next, I tried normalizing the data by subtracting the mean in x and y (I did not need to adjust the scale). This worked in all cases; see the sketch below.

I think the current implementation of polynomial regression is not suited to fitting data whose offset is much larger than its variation.
The question is whether data normalization should be part of the fitting function or the user's responsibility.
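
For illustration, a minimal sketch of the mean-centering workaround described above (the helper names are mine; only PolynomialRegression comes from the package):

const { PolynomialRegression } = require('ml-regression-polynomial');

const mean = (a) => a.reduce((sum, v) => sum + v, 0) / a.length;

// Fit on mean-centered data: the offsets are removed before fitting
// and re-applied at prediction time.
function centeredFit(x, y, degree) {
  const mx = mean(x);
  const my = mean(y);
  const model = new PolynomialRegression(
    x.map((v) => v - mx),
    y.map((v) => v - my),
    degree,
  );
  return { predict: (x0) => model.predict(x0 - mx) + my };
}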

Thanks for your help

ghost commented Oct 16, 2023

I'd assume this is left in the user's hands, at least for now, since it would require quite a few changes.

Maybe @targos has a different idea; otherwise we may update the docs and close the issue.

ghost added the enhancement label Oct 16, 2023

ghost commented Oct 18, 2023

I've been doing some research on this issue, and it seems that computing XtX, which is what the current algorithm does, amplifies the error in the result: the condition number of XtX is the square of the condition number of X.
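
This can be checked numerically; here is a sketch using ml-matrix's SingularValueDecomposition, which exposes the condition number (x being the array from the issue):

const { Matrix, SingularValueDecomposition } = require('ml-matrix');

const F = new Matrix(x.map((xi) => [1, xi, xi * xi])); // design matrix
const FtF = F.transpose().mmul(F); // normal-equations matrix

// cond(FtF) is roughly cond(F) squared, so solving the normal equations
// loses about twice as many significant digits as working on F directly.
console.log(new SingularValueDecomposition(F).condition);
console.log(new SingularValueDecomposition(FtF).condition);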

I have replaced it with a QR decomposition and got these results:

PolynomialRegression {
  degree: 2,
  powers: [ 0, 1, 2 ],
  coefficients: [ -1126496.3326794403, 5571.151841472426, -6.888109267208169 ]
}

These coefficients, although in a different order, are very close to what was reported:

Correct result would be: [-1126496.33992227, -6.88810931148790, 5571.1518772892]

I could try to find some backing for the claim, if there is a will to change the algorithm (only a few lines).

Basically, it now does not form the transpose product but does this:

  // F is the design (Vandermonde) matrix and Y the column vector of targets.
  const qrF = new QrDecomposition(F);
  console.log(qrF, F, Y); // temporary
  return {
    coefficients: qrF.isFullRank()
      ? qrF.solve(Y).to1DArray()
      : solve(F, Y).to1DArray(),
    degree: Math.max(...powers),
    powers,
  };

So it at least seems to extend the range of usability. Currently all tests pass but one.

@saunus @targos

saunus commented Oct 18, 2023

Hi,

I did not dare to ask for QR decomposition in the first place:

  • I do not understand the linear algebra behind it.
  • I read somewhere that it has lower performance but more stable results.
  • I was not able to figure out whether it could be applied to non-square matrices.

But maybe something to give you confidence: QR decomposition is used by MATLAB internally for least squares:

function [p,S,mu] = polyfit(x,y,n)

% Construct the Vandermonde matrix V = [x.^n ... x.^2 x ones(size(x))]
V(:,n+1) = ones(length(x),1,class(x));
for j = n:-1:1
    V(:,j) = x.*V(:,j+1);
end

% Solve least squares problem p = V\y to get polynomial coefficients p.
[p, rankV, QRfactor, perm] = matlab.internal.math.leastSquaresFit(V,y1);

where leastSquaresFit is

function [p, rankV, QRfactor, perm] = leastSquaresFit(V,y)

% Solve least squares problem p = V\y to get polynomial coefficients p.
[QRfactor, tau, perm, rankV, ~] = matlab.internal.decomposition.builtin.qrFactor(V, -2);

% use nonzero diagonal entries to determine rank for qrSolve.
rV = sum(abs(getDiag(QRfactor)) ~= 0);

% QR solve with rank = rV.
p = matlab.internal.decomposition.builtin.qrSolve(QRfactor, tau, perm, y, rV);

I cannot comment on what downsides the change to QR decomposition might have, but I would definitely appreciate having the option to use it in this package.

Thanks for your help

ghost commented Oct 19, 2023

It makes sense. One approach could be to use QR only as a fallback when LU fails; another would be to use QR first.

I will try the second approach for now and add a benchmark to see how bad it gets.
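
For reference, a minimal sketch of the fallback variant, assuming ml-matrix's QrDecomposition and solve (F and Y as in the snippet above):

const { QrDecomposition, solve } = require('ml-matrix');

// Try the default (LU-backed) solver first, and fall back to QR only
// when it reports a singular matrix. QR itself can still fail on a
// rank-deficient F; this only illustrates the idea.
function solveWithFallback(F, Y) {
  try {
    return solve(F, Y).to1DArray();
  } catch (err) {
    return new QrDecomposition(F).solve(Y).to1DArray();
  }
}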

For the moment, the results seem better (though I think they do not get as close as the results you shared).

  • This is before (curve labels are wrong):

[before1: plot of the fits with the current LU-based solver]

  • This is after the changes:

[after1: plot of the fits after the QR change]

It will take a little while though, because I don't want to introduce a more complex approach that we can't fix later on. Any extra data (just the arrays of numbers) where it fails would be great.

ghost self-assigned this Oct 19, 2023

saunus commented Oct 23, 2023

Hi,

I extracted a full data set of 450 curves to fit:
fitdata.txt

The file contains alternating x/y arrays.

Some of them throw a singularity error, some of them work, and some of them give bad fit results while not throwing an error! Unfortunately, I cannot spend the time preparing each case.

The following wrapper solves the issue for me in all cases and does not noticeably slow down the regression. I have only tested it on the 450 data sets with fit order 2, and on the same amount of data with fit order 1, but not with higher orders. A usage example follows after the code.

import PolynomialRegression from "ml-regression-polynomial";

export class NormalizedPolynomialRegression {
  constructor(x, y, order) {

    // Calculate normalization
    const offsX = -Math.min(...x);
    const factX = 1 / (Math.max(...x) - Math.min(...x));

    const offsY = -Math.min(...y);
    const factY = 1 / (Math.max(...y) - Math.min(...y));

    // Apply normalization
    const x1 = x.map((xi) => (xi + offsX) * factX);
    const y1 = y.map((yi) => (yi + offsY) * factY);

    // Regression on normalized data
    const regression = new PolynomialRegression(x1, y1, order);
    const normCoefficients = regression.coefficients;

    // Retrieve original-scale coefficients from the normalized coefficients
    let coeffs = Array(order + 1).fill(0);

    coeffs[0] = normCoefficients[0];

    // Expand normCoefficients[n] * ((x + offsX) * factX)^n via the binomial theorem.
    for (let n = 1; n < order + 1; n++) {
      const powers = calculatePowersOfElements(n);

      const currentPreFact = normCoefficients[n] * factX ** n;

      for (let j = 0; j < powers.length; j++) {
        const currentExp = powers[j].powerA;
        const currentFact = powers[j].factor * offsX ** powers[j].powerB;
        coeffs[currentExp] = coeffs[currentExp] + currentPreFact * currentFact;
      }
    }

    coeffs = coeffs.map((coeff) => coeff / factY);
    coeffs[0] = coeffs[0] - offsY;

    this.coefficients = coeffs;
  }

  predict(x0) {
    let val = 0;
    for (let n = 0; n < this.coefficients.length; n++)
      val = val + this.coefficients[n] * x0 ** n;
    return val;
  }
}

function calculatePowersOfElements(n) {
  const powers = [];

  for (let i = 0; i <= n; i++) {
    const aPower = i;
    const bPower = n - i;

    powers.push({
      factor: binomialCoefficient(n, i),
      powerA: aPower,
      powerB: bPower,
    });
  }

  return powers;
}

function binomialCoefficient(n, k) {
  if (k < 0 || k > n) {
    return 0;
  }

  if (k === 0 || k === n) {
    return 1;
  }

  let result = 1;
  for (let i = 1; i <= k; i++) {
    result *= (n - i + 1) / i;
  }

  return result;
}
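
Used as a drop-in replacement (x and y being the arrays from my first post):

const regression = new NormalizedPolynomialRegression(x, y, 2);
console.log(regression.coefficients); // should be close to the MATLAB result quoted above
console.log(regression.predict(404.5));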

Thanks for helping out.

ghost moved this to 🟠 In Progress in NMR and Cheminfo projects organisation Oct 24, 2023
ghost removed their assignment May 16, 2024