Fix AttributeError use_fallback_lbfgs_solve for newton-cholesky when fitting with max_iter=0 (#26653)

Co-authored-by: Christian Lorentzen <lorentzen.ch@gmail.com>
Co-authored-by: Jérémie du Boisberranger <34657725+jeremiedbb@users.noreply.github.com>
3 people committed Jun 29, 2023
1 parent ee89eb8 commit eb011fd
Showing 3 changed files with 32 additions and 0 deletions.
5 changes: 5 additions & 0 deletions doc/whats_new/v1.3.rst
@@ -455,6 +455,11 @@ Changelog
on linearly separable problems.
:pr:`25214` by `Tom Dupre la Tour`_.

- |Fix| Fix a crash when calling `fit` on
:class:`linear_model.LogisticRegression(solver="newton-cholesky", max_iter=0)`
which failed to inspect the state of the model prior to the first parameter update.
:pr:`26653` by :user:`Olivier Grisel <ogrisel>`.

- |API| Deprecates `n_iter` in favor of `max_iter` in
:class:`linear_model.BayesianRidge` and :class:`linear_model.ARDRegression`.
`n_iter` will be removed in scikit-learn 1.5. This change makes those
1 change: 1 addition & 0 deletions sklearn/linear_model/_glm/_newton_solver.py
@@ -375,6 +375,7 @@ def solve(self, X, y, sample_weight):

self.iteration = 1
self.converged = False
self.use_fallback_lbfgs_solve = False

while self.iteration <= self.max_iter and not self.converged:
if self.verbose:
26 changes: 26 additions & 0 deletions sklearn/linear_model/tests/test_logistic.py
@@ -2063,3 +2063,29 @@ def test_liblinear_not_stuck():
with warnings.catch_warnings():
warnings.simplefilter("error", ConvergenceWarning)
clf.fit(X_prep, y)


@pytest.mark.parametrize("solver", SOLVERS)
def test_zero_max_iter(solver):
# Make sure we can inspect the state of LogisticRegression right after
# initialization (before the first weight update).
X, y = load_iris(return_X_y=True)
y = y == 2
with ignore_warnings(category=ConvergenceWarning):
clf = LogisticRegression(solver=solver, max_iter=0).fit(X, y)
if solver not in ["saga", "sag"]:
# XXX: sag and saga have n_iter_ = [1]...
assert clf.n_iter_ == 0

if solver != "lbfgs":
# XXX: lbfgs has already started to update the coefficients...
assert_allclose(clf.coef_, np.zeros_like(clf.coef_))
assert_allclose(
clf.decision_function(X),
np.full(shape=X.shape[0], fill_value=clf.intercept_),
)
assert_allclose(
clf.predict_proba(X),
np.full(shape=(X.shape[0], 2), fill_value=0.5),
)
assert clf.score(X, y) < 0.7
