Infer: Don't minimise to Nothing if there's an upper bound #16786
Conversation
```scala
// Skip = minimisedSelected "hold off instantiating"
// False = return false
//
// there are 9 combinations:
```
I love this documentation!
Very nice tests and docs! And the actual change is correct, of course.
EDIT: I think I commented too fast, and now see problems with the second change.
```diff
@@ -183,7 +183,7 @@ object Inferencing {
      // else hold off instantiating unbounded unconstrained variable
      else if direction != 0 then
        instantiate(tvar, fromBelow = direction < 0)
      else if variance >= 0 && (force.ifBottom == IfBottom.ok || tvar.hasLowerBound) then
```
There's no motivation why the `!tvar.hasUpperBound` is added. It does not make sense to me. `IfBottom` being `ok` means we are allowed to minimize to `Nothing`. Why stop doing this if there is an upper bound?

I agree that the issue this fixes is a real one. But I am missing the reasoning why this PR is the correct fix, in particular since the PR caused regressions elsewhere.
From the logic it seems we're happy to minimise if there's a lower bound and maximise if there's an upper bound. `IfBottom`, to me, looks like it's mostly there for its `IfBottom.fail` and `IfBottom.flip` alternatives - as in, `IfBottom.ok` is the "default"/"standard" behaviour. So under that condition, we don't want to minimise when we have an upper bound. Doing so causes the issue we're trying to fix: we have an `S1 <: Pet` parameter, and there are no further constraints. Consistently with everything else, it should maximise to `Pet`, not minimise to `Nothing`.
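For concreteness, here is a hypothetical sketch of the scenario described above; only the `S1 <: Pet` upper bound comes from the comment, all other names and shapes are made up for illustration:

```scala
trait Pet
class Cat extends Pet

// Hypothetical method with a bounded type parameter and no other source of constraints.
def register[S1 <: Pet](cleanup: S1 => Unit): Unit = ()

// The only constraint this call adds is `S1 <: Pet`, i.e. exactly the declared bound,
// so there are "no further constraints" in the sense used in this thread. The question
// under discussion is whether S1 should then be maximised to Pet or minimised to Nothing.
val r = register((p: Pet) => ())
```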
The way I read it is: if the lower bound is (missing or) `Nothing`, then we are allowed to instantiate to the lower bound only if `IfBottom` is `ok`. The upper bound has nothing to do with it.
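Restated as a small standalone sketch (the `IfBottom` case names mirror the ones used in this thread; `mayMinimise` is a hypothetical helper, not compiler code):

```scala
// Hypothetical model of this reading; not the compiler's actual definitions.
enum IfBottom:
  case ok, fail, flip // alternatives referenced elsewhere in this thread

// Under this reading, minimisation is allowed when there is a proper lower bound,
// or when the lower bound is (missing or) Nothing and the caller opted in via IfBottom.ok.
// The upper bound never enters the decision.
def mayMinimise(hasProperLowerBound: Boolean, ifBottom: IfBottom): Boolean =
  hasProperLowerBound || ifBottom == IfBottom.ok
```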
I don't understand what justifies ignoring an existing upper bound, which is exactly the i14218 case.
I had a closer look at it now. Here's a slight adaptation of the original issue:

```scala
class A
class B extends A
class Z[S](v: S => Unit)
val x = new Z((s: B) => ())
```

`x` has type `Z[B]`, as expected.

Now add a bound to `S`:

```scala
class Z[S <: A](v: S => Unit)
```

`x` still has type `Z[B]`!

But if we sharpen the bound to `B`:

```scala
class Z[S <: B](v: S => Unit)
```

then `x` has type `Z[Nothing]`.
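Putting the three variants side by side (a self-contained sketch; `Z0`/`Z1`/`Z2` are renamed so they can coexist in one file, and the inferred types in the comments are the ones reported above):

```scala
class A
class B extends A

class Z0[S](v: S => Unit)      // no bound
class Z1[S <: A](v: S => Unit) // bound weaker than the constraint from the argument
class Z2[S <: B](v: S => Unit) // bound equal to the constraint from the argument

val x0 = new Z0((s: B) => ()) // inferred as Z0[B], as expected
val x1 = new Z1((s: B) => ()) // still inferred as Z1[B]
val x2 = new Z2((s: B) => ()) // reported above as Z2[Nothing]: the behaviour this PR targets
```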
The reason this happens is because of code in Inferencing before the line in question:

```scala
val direction = instDirection(tvar.origin)
...
else if direction != 0 then
  instantiate(tvar, fromBelow = direction < 0)
```

The `direction` is set to -1 if the variable is constrained only from below and to 1 if the variable is constrained only from above. "Is constrained" means: there is a constraint stronger than the variable's bound. That's what goes wrong here: the added constraint is exactly the variable's bound, so it does not count. It's hard to change this, since the information that we also added a constraint (not just recorded the bound) is lost if the constraint is not stronger than the bound.
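A rough standalone model of the direction computation described above (`Bounds`, `directionSketch`, and `isSubtype` are made-up names for illustration, not the compiler's API):

```scala
// Declared bounds of the type parameter vs. the bounds accumulated in the constraint.
final case class Bounds[T](lo: T, hi: T)

def directionSketch[T](declared: Bounds[T], constrained: Bounds[T])(isSubtype: (T, T) => Boolean): Int =
  // A constraint only "counts" if it is strictly stronger than the declared bound.
  val strongerBelow = !isSubtype(constrained.lo, declared.lo) // lower constraint not implied by the declared lower bound
  val strongerAbove = !isSubtype(declared.hi, constrained.hi) // upper constraint not implied by the declared upper bound
  (if strongerAbove then 1 else 0) - (if strongerBelow then 1 else 0)

// direction < 0  => constrained only from below: instantiate from below (minimise)
// direction > 0  => constrained only from above: instantiate from above (maximise)
// direction == 0 => fall through to the IfBottom-based logic discussed in this thread,
//                   which is what happens when the added constraint equals the bound.
```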