
Dart infers "num" for result of generic function when only ints are involved #55315

Closed
dgreensp opened this issue Mar 27, 2024 · 2 comments

@dgreensp

I ran into this error in some complex code, but I've reduced the reproduction down to this:

T first<T>(List<T> list) {
  return list[0];
}

int getInt() {
  final ints = [1, 2, 3]; // List<int>
  final n = 1 + first(ints); // num???
  return n; // error at analysis time, num is not int
}

The same problem doesn't occur if you change 1 + first(ints) to first(ints) + 1, or if you first assign the result of first(ints) to a local variable.

Reproduced in Dart 3.3.2 (stable) and main channel.

FWIW, this is easy to work around. You can write first<int>(ints). It just shouldn't be necessary.

My guess is the context of being after + somehow causes inference as num (even though only ints are involved).
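For reference, all three workarounds mentioned above compile cleanly; this is a minimal sketch (function names are illustrative):

```dart
T first<T>(List<T> list) => list[0];

int viaReversedOrder() {
  final ints = [1, 2, 3];
  final n = first(ints) + 1; // first(ints) is inferred as first<int>(ints).
  return n; // OK: int + int has static type int.
}

int viaLocalVariable() {
  final ints = [1, 2, 3];
  final f = first(ints); // No surrounding context type, so T is inferred as int.
  return 1 + f; // OK: int + int has static type int.
}

int viaExplicitTypeArgument() {
  final ints = [1, 2, 3];
  return 1 + first<int>(ints); // OK: T is given explicitly.
}
```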

@eernstg
Member

eernstg commented Mar 27, 2024

This is all working as specified. Check out the signature of operator + on a receiver of type int:

num operator +(num other);

This operator must work with all numbers, both of type int and of type double, and this is achieved by accepting an argument of type num. The result is then guaranteed to be of type num, but we can't promise that it will be an int or a double, because the declared return type must be consistent with everything we know when the argument type is num.

We do have special rules about the typing of expressions of the form a + b where a and b both have the static type int (and other special combinations), but if the argument has type num then the result will also have the type num.

In this case the argument does indeed have the type num.
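As a minimal sketch of those special rules (variable names are illustrative):

```dart
void main() {
  int i = 1;
  num m = 2;
  final a = i + i; // Special rule: int + int has static type int.
  final b = i + m; // Declared signature applies: the result has static type num.
  int x = a; // OK: `a` is statically an int.
  // int y = b; // Compile-time error: a num can't be assigned to an int.
  print('$x $b');
}
```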

My guess is the context of being after + somehow causes inference as num

Exactly! The reason is that Dart type inference gives a high priority to the context type, and this causes first(ints) to be inferred as first<num>(ints). This is type correct (ints has static type List<int>, and List<int> is a subtype of List<num> because int is a subtype of num), so the program compiles up to this point.

Of course, n will then have the inferred type num, so we can't return it when the return type is int.

You can force the choice of type argument by specifying it explicitly, as you mention:

T first<T>(List<T> list) {
  return list[0];
}

int getInt() {
  final ints = [1, 2, 3]; // List<int>.
  final n = 1 + first<int>(ints);
  return n; // OK.
}

The main reason why Dart type inference gives a high priority to the context type is that mutable state plus dynamically checked covariance works best when the chosen type argument is what the context specifies, because this avoids introducing covariance in many situations. Here is an illustration of why this matters:

void main() {
  List<num> xs = [1]; // OK, inferred as `<num>[1]`.
  xs.add(1.5); // OK statically and at run time.

  List<num> ys = <int>[1]; // OK, by covariance.
  ys.add(1.5); // OK statically, throws at run time.
}

It's convenient to allow a variable of type List<num> to refer to an object of type List<int>. However, this is only type safe as long as you use members where the type parameter of List occurs in covariant positions (e.g., as the return type of a method or getter). In contrast, the method add has the type parameter as a parameter type, and that is unsafe when the reference is actually covariant (e.g., when a List<int> is accessed through the static type List<num>).

So, in summary: Dynamically checked covariance causes Dart type inference to give a high priority to the context type. Operator + must work with all numbers, not just ints, so the argument type is num. Hence, first(ints) gets inferred as first<num>(ints). This means that we "forget" some of the precision of the type of the result of this addition. However, if you encounter this kind of loss of precision then you can restore the precise type by giving the desired type argument explicitly.

I'll close the issue because it doesn't report on anything new, or anything that we are likely to change.

However, if you wish to support the introduction of statically checked variance then you can vote for dart-lang/language#524.

@eernstg
Member

eernstg commented May 13, 2024

Just in passing, note that the specialized analysis of numeric operations will allow us to avoid the compile-time error in the original version of the example simply by adding the desired type to the variable n.

T first<T>(List<T> list) {
  return list[0];
}

int getInt() {
  final ints = [1, 2, 3]; // List<int>.
  final int n = 1 + first(ints); // <-- Just give it a hint, and it's `first<int>(...)`.
  return n; // OK
}

The point is that first(ints) gets the context type int when the context type for the entire expression is int and the receiver has type int as well.

Just thought this might be useful to know. 😄
