- "Dark mode is hard to see, can you switch to light mode?" switches to light mode "Aaaargh!"
Currently, Roslyn generates less-than-optimal code for list patterns, doing more expensive slicing before attempting to see if easily-accessible
elements of a collection match. This is because, today, we emit the list pattern's tests left-to-right (at least when considering a single pattern).
The language has always reserved the space to reorder pattern tests as it sees fit, but this is the first time that we're actually looking at a
specific case within a single pattern where we think we should reorder for performance reasons. There are certain cases where the guarantees around
reordering can be painful: for example, `immutableArray is { IsDefault: false, Length: > 1 }` is not guaranteed to check `IsDefault` first, and the
`Length` check could null-ref if it was checked on a default `ImmutableArray`. However, we don't think that this is one of those cases. There could
be some collection types where slicing is actually cheaper than indexing (a linked list, for example, needs to walk the whole thing anyway), but
we don't think that this is the majority case. At the same time, we do think that we should take another look at our ordering rules sometime during
the C# 13 timeframe and see if we can make stronger guarantees. We're not sure about this; in particular, the way that we deduplicate checks across
patterns in a switch statement or expression makes it extremely hard to guarantee anything if properties are checked in different orders across those
patterns. But we think it's worth a bit of effort to see if we can make it better.
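To make that concrete, here is a sketch of the kind of pattern in question (an illustrative example of our own, not code from the meeting). Emitting the tests in source order means the subarray for `.. var middle` is produced before the cheap check that the last element is 9:

```csharp
// Illustrative sketch: with strict left-to-right emission, the slice for
// `.. var middle` is created before the last-element check, so the expensive
// subarray is paid for even when the check that the last element is 9 would
// have failed.
static bool Matches(int[] values) =>
    values is [1, .. var middle, 9] && middle.Length > 2;
```

Reordering would let the compiler check the first and last elements before slicing, so no subarray is produced when those cheap checks fail.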
We are OK with reordering list pattern testing. We will see if we can be more specific about the types of reordering in general.
There's an interesting issue with array variance, introduced in .NET 7, that breaks some expectations of pattern matching: when getting the subarray of a variantly-converted array, the runtime will now make the subarray the variantly-converted type, not the original type of the array. This breaks code that was type-testing the subarray against the original type. Moreover, we think the code that was broken was perfectly reasonable. We'll talk with the runtime team to understand what the costs of reverting this change would be, and how we might be able to work around it in the compiler if we cannot revert it.
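As a sketch of the breakage (our own illustration of the behavior described above, not code from the meeting):

```csharp
using System;

string[] strings = { "a", "b", "c" };
object[] objects = strings;               // array covariance

if (objects is [_, .. var rest])
{
    // Before .NET 7 the subarray kept the original runtime type (string[]);
    // starting with .NET 7 it has the variantly-converted type (object[]),
    // so a type test against the original type, which used to succeed, now fails.
    Console.WriteLine(rest is string[]);
}
```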
We will follow up after talking with the runtime team about the behavior here.
Finally today, LDM looked at a community-proposed specification for making `is` expressions constant when possible. On initial look, we didn't find much
objectionable in the specification, just a few comments on how it could be cleaned up to be more general. We were also interested in expanding the
proposal to `switch` expressions. However, when we started drilling down on the existing subsumption behavior, we started to be a bit more concerned about
behavioral complexity. Specifically, we looked at some examples like this:
```csharp
public const int A = 4;
public bool B1 = A is 4;                  // warning CS8520: The given expression always matches the provided constant.
public bool B2 = A switch { 4 => true };  // warning CS8509: The switch expression does not handle all possible values of its input type (it is not exhaustive). For example, the pattern '0' is not covered.
public bool B3 = true is true and false;  // error CS8518: An expression of type 'bool' can never match the provided pattern
```
Today, these warnings and errors are all sensible (except perhaps `B2`), as they would affect the code that the compiler would generate. However, if we
started making patterns constant expressions everywhere, these warnings and errors might go away in more places than we expect, as the language would be
unable to tell when we should warn about subsumption and when we shouldn't. There could be a version of this where we suppress these warnings when the
expression occurs within a location that must be a constant expression, i.e., in a `const` variable initializer, parameter default value, or other such
location. But we're a little concerned about the potential complexity of such a change compared to the potential benefit of using patterns in these locations.
After all, there's no new behavior with these patterns; it would just be a more concise way of expressing existing constant expressions. Given that, we think
we want to see an updated version of the specification with these rules to understand how big of a change it would be, and decide whether we're comfortable with
it at that point.
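For reference, this is roughly the kind of usage such a restricted version would enable (a hypothetical sketch based on the proposal, not syntax or rules the LDM has signed off on):

```csharp
public class FeatureFlags
{
    public const int Version = 4;

    // Under the proposal, a pattern over constant operands could itself be a
    // constant expression, allowing it in const initializers and default
    // parameter values. Today, neither of these declarations compiles.
    public const bool IsModern = Version is >= 3 and not 7;

    public void Configure(bool useNewPath = Version is >= 3) { }
}
```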
We need to see an updated specification with rules for dealing with subsumption before making a final decision on this feature.