SIP-56-extended: Refined provablyDisjoint test. #18416
Conversation
For now, we only have `SubTypeTest` and `LegacyPatMat`.

* `SubTypeTest` is used when there is no type capture.
* `LegacyPatMat` is used when there are captures.

In the match type reduction algorithm, we already have a simpler path for `SubTypeTest`. The `LegacyPatMat` path is basically the same as before, but with static knowledge that we have an `HKTypeLambda`.
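Roughly, the two kinds line up with case patterns like these (a hypothetical user-level illustration, not code from this PR):

```scala
// A case with no type capture goes through the simpler SubTypeTest path;
// a case that binds a capture goes through the LegacyPatMat path.
type Elem[X] = X match
  case String      => Char   // no capture: SubTypeTest
  case Iterable[t] => t      // captures `t`: LegacyPatMat
```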
This is the first step in using the new specified algorithm for match type reduction. When the pattern of a case satisfies the eligibility conditions, we use the new algorithm. Otherwise, we fall back on the legacy algorithm.

To be eligible, a pattern with at least one capture must be an applied *class* type constructor whose arguments are each (illustrated below):

- either a type capture,
- or a fully defined type that contains no inner capture,
- or in covariant position and recursively satisfying the eligibility conditions.

With those criteria, all the type captures can be *computed* using `baseType`, instead of inferred through the full `TypeComparer` machinery.

The new algorithm directly handles preventing the widening of abstract types when doing so would leave captures under-defined. With the legacy algorithm, this prevention is scattered elsewhere in the type comparer. Making it centralized improves the error messages in those situations; it seems they were previously entirely misleading (see changed check files).
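A rough sketch of the distinction, under my reading of the criteria above (the `Box`/`Inv` names are made up for illustration):

```scala
class Box[+A]  // covariant class type constructor
class Inv[A]   // invariant class type constructor

type Eligible[X] = X match
  case List[t]      => t   // applied class tycon whose argument is a capture
  case Box[List[t]] => t   // capture nested only under covariant positions

type FallsBack[X] = X match
  case Inv[List[t]] => t   // capture under an invariant argument: legacy fallback
```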
So that they do not fall into the legacy fallback.
An abstract tycon can be matched if it is exactly equal to the scrutinee's tycon.
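For instance (a made-up sketch, not a test from this PR):

```scala
trait Module:
  type F[A]   // abstract type constructor

  // The pattern's tycon is the abstract F itself; the case can only match a
  // scrutinee whose tycon is exactly this same F.
  type Unwrap[X] = X match
    case F[t] => t
```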
… type reduction. We introduce a new flag `-Yno-legacy-match-types`, which forbids the reduction of "legacy" match types. Like `-Yno-deep-subtypes`, it is meant to be used in our CI. With it, we check that we do not unexpectedly fall back on legacy match types in tests for which the specced match types are enough. Later, we should consider integrating that behavior with the source level so that it reaches users.
Instead of through the flag `-Yno-legacy-match-types`. Codebases that require legacy match types will need to use `-source:3.3` to compile from now on.
And improve the error message on selecting a class member of an unstable prefix.
```scala
val f1: Int = f
//val f2: Int = f

type R = FieldType["A", Int] *: FieldType["B", Double] *: FieldType["C", String] *: FieldType["D", Boolean] *: EmptyTuple
```
I've observed that a lot of people want to use this kind of pattern for type-level records; what is the solution? Use a class instead of an opaque type?
Yes, you have to use classes. In fact, this test was very obviously wrong: `FieldType["A", Int]` is clearly not disjoint from `FieldType["B", Int]`, since they both map to `Int`! The fact that it passed before was definitely a bug, not a feature.
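A minimal sketch of the class-based alternative mentioned above (the names are mine, not code from this PR): with a class type constructor, the key is a real type argument, so fields with different keys stay provably disjoint and a lookup can skip past non-matching fields.

```scala
final class FieldType[K, +V]

type Rec = FieldType["A", Int] *: FieldType["B", Double] *: EmptyTuple

// Looks up the value type associated with key K in a tuple of fields.
type Lookup[Fields <: Tuple, K] = Fields match
  case FieldType[K, v] *: _ => v
  case _ *: rest            => Lookup[rest, K]
```

With an opaque-type encoding, both `FieldType["A", Int]` and `FieldType["B", Int]` ultimately map to `Int`, which is why the disjointness assumed by the original test could not hold.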
Instead of having an ad hoc criterion on type parameters' bounds. This makes sense because the spec wants a finite derivation tree for the `provablyDisjoint` relation: if we hit the same pair of types again in the tree, that branch of the exploration will not lead to a finite subtree.
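An illustrative sketch of the cycle-detection idea on a toy type language (this is not the compiler's actual code or data structures): a `pending` set records the pairs of types currently under exploration, and revisiting a pair cuts that branch off, since it can never yield a finite derivation.

```scala
enum Ty:
  case Name(n: String)
  case App(tycon: String, args: List[Ty])

def disjoint(t1: Ty, t2: Ty, pending: Set[(Ty, Ty)] = Set.empty): Boolean =
  if pending((t1, t2)) then false   // same pair seen again: no finite subtree here
  else
    val pending1 = pending + ((t1, t2))
    (t1, t2) match
      case (Ty.App(c1, as1), Ty.App(c2, as2)) =>
        if c1 != c2 then true       // toy assumption: distinct class-like tycons are disjoint
        else as1.lazyZip(as2).exists((a1, a2) => disjoint(a1, a2, pending1))
      case (Ty.Name(n1), Ty.Name(n2)) => n1 != n2
      case _                          => false  // conservatively not provably disjoint
```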
It does not seem to bring anything in practice, and it is easier to specify things without it.
Previously, the `disjointnessBoundary` of `HKTypeLambda`s was implicitly their `resultType`, through the use of `superTypeNormalized`. This was fine as long as both sides of `provablyDisjoint` ended up being `HKTypeLambda`s at the same time, but that may not always be the case (notably with any-kinded types). It is safer to consider type lambdas as boundaries themselves, and to explicitly recurse on the result types when arities match.

This change surfaced a weird case in `TypeTestsCasts`, which called `provablyDisjoint` with ill-kinded types. We now explicitly apply what I suspect are partially-erased types to wildcards to recover appropriate kinds.
… instead. Fundamentally, the `provablyEmpty(scrut)` test was meant to prevent situations where both `scrut <: pattern` and `provablyDisjoint(scrut, pattern)` are true. That is a problem because it allows a match type to reduce in two different ways depending on the context. Instead, we basically use that combination of `scrut <: pattern` and `provablyDisjoint(scrut, pattern)` as the *definition* of `provablyEmpty`: when both conditions arise together, we refuse to reduce the match type. This allows one example to pass that did not pass before, but that particular example does not seem to cause unsoundness. In a sense, `provablyEmpty` was too strong here.
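A made-up example of the situation described above: with an empty scrutinee such as `Int & String`, one can argue both `Int & String <: Int` and that `Int & String` is disjoint from `Int`, so the match could be reduced in two different ways. Per the change above, the reducer refuses to reduce when both conditions hold together.

```scala
type M[X] = X match
  case Int    => "int"
  case String => "str"

// M[Int & String] is left unreduced rather than reduced inconsistently.
```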
Closes scala#17121. Closes scala#17944. Closes scala#18488.
This is now integrated into #18262.