warnings in LogisticRegression with prox newton #215

Open
mathurinm opened this issue Dec 10, 2021 · 7 comments

@mathurinm (Owner)

Running pytest:

celer/tests/test_logreg.py::test_LogisticRegression[True]
  /home/mathurin/workspace/celer/celer/homotopy.py:311: ConvergenceWarning: Objective did not converge: duality gap: 0.841892524980679, tolerance: 0.005545177444479563. Increasing `tol` may make the solver faster without affecting the results much. 
  Fitting data with very small alpha causes precision issues.
    sol = newton_celer(

celer/tests/test_logreg.py::test_LogisticRegression[True]
  /home/mathurin/workspace/celer/celer/homotopy.py:311: ConvergenceWarning: Objective did not converge: duality gap: 23.09072008747797, tolerance: 0.006931471805599453. Increasing `tol` may make the solver faster without affecting the results much. 
  Fitting data with very small alpha causes precision issues.
    sol = newton_celer(

celer/tests/test_logreg.py::test_LogisticRegression[True]
  /home/mathurin/workspace/celer/celer/homotopy.py:311: ConvergenceWarning: Objective did not converge: duality gap: 2.1031969926275593, tolerance: 0.006931471805599453. Increasing `tol` may make the solver faster without affecting the results much. 
  Fitting data with very small alpha causes precision issues.
    sol = newton_celer(
@mathurinm (Owner, Author)

@Badr-MOUFAD can you have a look? You could start a PR with a script that isolates and reproduces the issue (ideally with a single alpha) on the data from the test.
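Something along these lines could be a starting point (just a sketch: make_classification stands in for the test data, and the C/tol values are placeholders, not the ones from the test):

```python
# Hypothetical reproduction sketch: fit celer's LogisticRegression with very
# weak regularization (large C, i.e. small alpha) and a tight tol, and record
# any ConvergenceWarning raised by homotopy.py. The data below is a stand-in
# for the data used in test_LogisticRegression.
import warnings

from sklearn.datasets import make_classification
from sklearn.exceptions import ConvergenceWarning

from celer import LogisticRegression

X, y = make_classification(n_samples=50, n_features=100, random_state=0)

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always", ConvergenceWarning)
    # single alpha: large C means weak regularization, i.e. small alpha
    LogisticRegression(C=1e4, tol=1e-14).fit(X, y)

for w in caught:
    print(w.category.__name__, ":", w.message)
```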

@Badr-MOUFAD (Collaborator) commented Apr 1, 2022

@mathurinm, we only get the warning when gap > tol, i.e. when the solver did not reach the requested tolerance.
By playing with alpha and tol, I was able to reproduce the warning:

celer/tests/test_logreg.py::test_reproduce_error[False]
  c:\users\hp\desktop\celer-repo\celer\celer\homotopy.py:313: ConvergenceWarning:

  Objective did not converge: duality gap: 1.1778627197155299e-08, tolerance: 2.0794415416798358e-15. Increasing `tol` may make the solver faster without affecting the results much.
  Fitting data with very small alpha causes precision issues.

However, I could not find a case where the duality gap is greater than 1, as in your example.

NB: I reproduced the warning using unrealistic values of alpha and tol (alpha = 1e-10 and tol = 1e-16).
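For reference, a hypothetical pytest-style version of that reproduction (not the actual test_reproduce_error code, which is not shown here; the data and estimator parameters are illustrative):

```python
# Hypothetical pytest test mirroring the reproduction described above: with an
# extreme C (roughly alpha = 1e-10) and tol = 1e-16, the duality gap is expected
# to stay above tol, so homotopy.py should emit a ConvergenceWarning.
import pytest
from sklearn.datasets import make_classification
from sklearn.exceptions import ConvergenceWarning

from celer import LogisticRegression


def test_convergence_warning_is_raised():
    X, y = make_classification(n_samples=30, n_features=60, random_state=0)
    clf = LogisticRegression(C=1e10, tol=1e-16)  # deliberately unrealistic values
    with pytest.warns(ConvergenceWarning):
        clf.fit(X, y)
```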

@mathurinm (Owner, Author)

Did you take the same data and the same alpha as in the test?

@Badr-MOUFAD (Collaborator)

@mathurinm, I literally reused the code in celer/tests/test_LogisticRegression.

@mathurinm (Owner, Author) commented Apr 1, 2022 via email

@Badr-MOUFAD (Collaborator)

I do get the warning.
I was just testing things locally, so I didn't push any code.

@Badr-MOUFAD (Collaborator)

@mathurinm,
We get the warning when we run check_estimator (the utility from sklearn).
I know that it checks whether the estimator, in our case LogisticRegression, complies with sklearn's conventions. However, I don't know exactly which checks it runs, so I don't know which data causes the warning.

I pushed the code to this branch: https://github.com/Badr-MOUFAD/celer/tree/conv-warning-issue
I only commented the check_estimator lines in tests/test_logreg.
To reproduce, just run pytest .\celer\tests\test_logreg.py
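For context, the check_estimator part boils down to something like the sketch below (escalating the warning to an error is just one way to locate the first offending check; this is not the code from the branch):

```python
# Sketch: run scikit-learn's estimator checks on celer's LogisticRegression and
# escalate ConvergenceWarning to an error, so the first check whose generated
# data triggers the warning fails loudly and can be identified.
import warnings

from sklearn.exceptions import ConvergenceWarning
from sklearn.utils.estimator_checks import check_estimator

from celer import LogisticRegression

with warnings.catch_warnings():
    warnings.simplefilter("error", ConvergenceWarning)
    check_estimator(LogisticRegression())
```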
