
Make by-name types first-class value types #14225

Closed
wants to merge 17 commits

Conversation

odersky
Contributor

@odersky odersky commented Jan 6, 2022

A big revision that makes types indicating call-by-name passing regular value types. The
by-name type => T now behaves like a nullary context function type () ?=> T. Nullary context function types still cannot be written explicitly, but the type mechanics for cbn types and nullary context function types are the same.
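For context, this is the user-level call-by-name semantics that the PR re-encodes internally (plain Scala, independent of this PR's internals): the argument of a by-name parameter is not evaluated at the call site, but re-evaluated at each use, just like applying a nullary function.

```scala
// By-name semantics at the source level: the argument expression is
// re-evaluated at each use of the parameter.
var evaluations = 0
def tick(): Int = { evaluations += 1; 1 }
def twice(x: => Int): Int = x + x // each use of `x` re-evaluates the argument

val r = twice(tick()) // tick() runs twice: once per use of `x`
```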

Benefits:

  • Overall, a significant simplification. The previous transformation pipeline had to do a complicated dance to expand call-by-name types. This is no longer necessary.
  • Call-by-name types are now regular value types. This removes some ad-hoc restrictions, such as that cbn methods cannot be eta expanded to function values.
  • Delayed call-by-name arguments are marked explicitly from Typer onwards. This means follow-on analyses such as initialization checking can be simplified.
  • Potential to drop more restrictions, in particular
    • By-name types could be allowed to appear everywhere, not just in parameter position
    • Classes could have by-name value parameters
      These generalizations should be considered for future PRs.
  • Opens the way to refine by-name types to be pure or impure, or to let them have specific capture capabilities.
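The eta-expansion restriction mentioned in the benefits list can be seen in today's workaround: before this change, a cbn method cannot be eta-expanded over its by-name parameter, so one goes through an explicit Function0 wrapper. A user-level sketch (not part of this PR's changes):

```scala
def force(x: => Int): Int = x

// Workaround used today: wrap the by-name parameter in an explicit thunk,
// since a function value over a by-name parameter cannot be written directly.
val g: (() => Int) => Int = thunk => force(thunk())
val n = g(() => 41 + 1)
```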

Things to do:

  • Update scaladoc to handle the new by-name types (the scaladoc build currently fails, as do two scaladoc tests in the scala3-bootstrapped/test task).
  • Adapt macro expansion and reflect.quotes to the new scheme. Currently 6 tests in BootstrappedOnlyCompilationTests fail.
  • Investigate why sbt-test/analyzer-plugin fails.

@odersky
Contributor Author

odersky commented Jan 11, 2022

@smarter To make by-name types context functions, we need to overcome a tricky interaction between type variable avoidance and implicit search divergence checking. The problem is that
we introduce empty parameter closures for by-name args in Typer, which means that any nested variables will get higher levels than the environment. I can see that in some changed error messages where type variables now get (avoid) or (lower) prefixes. That's not nice in error messages, but there is a much bigger problem:

We can now generate new type variables for every level of by-name implicit search. This violates the basic assumption underlying implicit divergence checking that the set of type variables is fixed. Consequently, we get infinite recursion on neg/byname-implicits-11 and neg/by-name-implicits-16. Previously, this problem did not manifest itself, since we did not generate synthesized terms with nested scopes. But now we do. (I guess even before, this could have arisen if we
generated terms for context function types, but the interaction with divergence checking was never studied.)

I see two possible routes to a solution:

  • Don't bump the level for nullary closures, since they do not introduce anything of interest that needs to be avoided. That would fix the problem for the new cbn encoding, and might be a good idea for performance anyway, but we would still be vulnerable to implicit search of context function types.

  • For the purpose of implicit divergence checking, treat the original variables and their lifted versions as being the same. Is there a way to map a lifted variable back to its original?

Right now I am blocked by this, so if you have some advice that would be great!

@smarter
Member

smarter commented Jan 11, 2022

Is there a way to map a lifted variable back to its original?

Yes, we use the same trick as depTypeVar and can map back using representedParamRef.

@odersky odersky force-pushed the cft-byname branch 3 times, most recently from d2dabb8 to a9dfc91 on January 17, 2022 at 08:55
I tried to extend specialization to all context functions, not just ones of arity 0.
But that runs into problems for dependent context functions, since the necessary casts
get complicated. Since context functions over primitive types are an anti-pattern anyway,
I don't think we need to optimize this case after all.
The regular changeOwner can cause cycles in transformations. Transformations like ByNameLambda should use
the less demanding changeOwnerAfter.
There was an omission before, but as long as byname parameters were ExprTypes it did
not lead to problems.
@odersky odersky changed the title Allow nullary context functions Make by-name types first-class value types Jan 17, 2022
@odersky
Contributor Author

odersky commented Jan 17, 2022

test performance please

@odersky odersky marked this pull request as ready for review January 17, 2022 14:02
@dottybot
Member

performance test scheduled: 1 job(s) in queue, 1 running.

@@ -726,6 +726,14 @@ trait TypedTreeInfo extends TreeInfo[Type] { self: Trees.Instance[Type] =>
case _ => tree
}

/** An anonyous function and a closure node referring to it in a block, without any wrappigs */


typo:
anonyous -> anonymous
wrappigs -> wrappings

* that they are cbv, and we have to correct this assumption if the actual resolved
* method is cbn. If the call is non-overloaded, we do the right thing from the start.
* Inserting a byName call just makes clear that the argumnent is cbn. There is no
* special treatemnt in the compiler associated with that method; it is just the


typo: treatemnt -> treatment

@odersky
Contributor Author

odersky commented Jan 17, 2022

@smarter I solved the problem by delaying the closure generation. In typer a call-by-name argument is represented as <by-name>(...) where <by-name> is a synthetic function in <special-ops>, analogous to throw. These applications are then replaced with closures in phase ByNameLambda.

@odersky odersky added the needs-minor-release This PR cannot be merged until the next minor release label Jan 17, 2022
@dottybot
Member

Performance test finished successfully:

Visit https://dotty-bench.epfl.ch/14225/ to see the changes.

Benchmarks are based on merging with master (7939ddb)

Instead, drop <by-name> applications when pickling and reconstitute them
based on formal parameter types when unpickling.
@nicolasstucki
Contributor

@odersky I added some patches in dotty-staging@36f4598. You can cherry-pick that commit. It fixes most of the macro related issues.

ByNameLambda needs to run in a group after ElimRepeated since ElimRepeated
works on ByName arguments but not converted closures, and it sees the arguments
after transformations by subsequent miniphases in the same group.
ByName nodes in arguments are not pickled, which means that we
can use the same Tasty version as before.
@odersky odersky removed the needs-minor-release This PR cannot be merged until the next minor release label Jan 19, 2022
@odersky
Contributor Author

odersky commented Jan 19, 2022

I made the following changes

  • by-name arguments are now wrapped in a new kind of tree node of class ByName instead of being represented as applications of a magic function.
  • ByName nodes are not pickled. Instead they are reconstituted on unpickling by looking at the formal parameter types of applications.

This means that no new Tasty version is needed. The pickled Tasty stays the same.

@bishabosha
Member

bishabosha commented Jan 19, 2022

  • ByName nodes are not pickled. Instead they are reconstituted on unpickling by looking at the formal parameter types of applications.

This means that no new Tasty version is needed. The pickled Tasty stays the same.

@sjrd Should it be considered a Tasty incompatibility that def foo(arg: => Int) = { arg; arg } in tasty now looks like

def foo(arg: => Int) = { arg.apply(); arg.apply() }

i.e. we now have Apply(SELECTin(arg.type, apply [@apply (): java.lang.Object]), Nil) rather than just arg.type

@odersky
Contributor Author

odersky commented Jan 19, 2022

Overall, here are the major representation changes, from phase Typer on:

  • The type of a by-name parameter with underlying type T is represented as AppliedType(defn.ContextFunction0, T) instead of as ExprType(T). That is, => T is represented as () ?=> T.
  • An argument to a call-by-name parameter is wrapped in a ByName tree node. A new phase ByNameLambda rewrites these nodes to closures.
  • A reference to a call-by-name parameter p is automatically applied. That is, it's now p.apply() instead of p.
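Since ContextFunction0 cannot be written in source, a rough user-level analogy of the new encoding uses an explicit thunk. This is only an approximation of the internal AppliedType(defn.ContextFunction0, T) form; in the real encoding the compiler inserts the .apply() calls itself:

```scala
def byNameStyle(x: => Int): Int = x + x           // internally: x.apply() + x.apply()
def encodedAnalogy(x: () => Int): Int = x() + x() // explicit-thunk analogue of the encoding

val a = byNameStyle(21)
val b = encodedAnalogy(() => 21)
```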

@sjrd
Member

sjrd commented Jan 19, 2022

  • ByName nodes are not pickled. Instead they are reconstituted on unpickling by looking at the formal parameter types of applications.

This means that no new Tasty version is needed. The pickled Tasty stays the same.

@sjrd Should it be considered a Tasty incompatibility that def foo(arg: => Int) = { arg; arg } in tasty now looks like

def foo(arg: => Int) = { arg.apply(); arg.apply() }

i.e. we now have Apply(SELECTin(arg.type, apply [@apply (): java.lang.Object]), Nil) rather than just arg.type

Writing things differently is not an issue per se, as long as reading the old stuff still works (backward compat) and an old compiler can still read the new stuff (forward compat).

So, if the new compiler sees only arg.type, will it work? If yes, it is backward compatible. If not, it is not.

If the old compiler sees the Apply(SELECTin(...), Nil), will it work? If yes, it is forward compatible. Otherwise, it is not.

 - Handle combinations of NamedArg and ByName
 - Accept and evaluate any remaining ByName applications
@pikinier20
Contributor

Well, the errors from Scaladoc don't look scary. There are just slight differences in signatures with call-by-name params. Scaladoc now presents the => A type as () => A (it probably fell into another case in our signature logic). According to the PR message we would like to represent them as () ?=> A, right? Can I commit a solution to this PR?

@smarter
Member

smarter commented Jan 19, 2022

According to the PR message we would like to represent them as () ?=> A, right?

That's the desugared form, but the form users should write and we should use for pretty-printing is still => A.

@odersky
Contributor Author

odersky commented Jan 19, 2022

@pikinier20 No, by-name parameter types should still be rendered as => T. The analogy to () ?=> T is just for the compiler's internal purposes.

@pikinier20
Contributor

Okay, good that we cleared this up. I'm going to fix it tomorrow if that's ok.

@odersky
Contributor Author

odersky commented Jan 19, 2022

@nicolasstucki I integrated your changes. Now there are 5 tests still failing.

Test Report
================================================================================

9 suites passed, 2 failed, 11 total
    tests/run-macros/i5119 failed
    tests/run-macros/tasty-interpolation-1 failed
    tests/run-macros/xml-interpolation-2 failed
    tests/run-macros/tasty-unsafe-let failed
    tests/pos-macros/null-by-name failed

@odersky
Contributor Author

odersky commented Jan 20, 2022

The Tasty interop problems look daunting, unfortunately.

  • The new compiler can probably be made to work with old Expr representations, if we generalize a bit. It will be hard to test this extensively, though. Nevertheless, we can hope for backwards compatibility.
  • Old compilers cannot work with new context function representations. That would require a major retrofit of old compiler codebases.

So this means we need a mode where we still generate the old Expr-based representations under a release flag, which would mean that any gain in simplicity is lost and the new implementation is strictly more complex than the old one.

Maybe we should put this work on hold a bit and decide afterwards whether it's worth continuing. /cc @nicolasstucki @pikinier20.

@smarter
Member

smarter commented Jan 20, 2022

Old compilers cannot work with new context function representations.

Is this because of the extra .apply() when using a by-name? Could we strip the .apply at pickling time and re-introduce them when unpickling?

It will be hard to test this extensively, though.

This is made much easier by #14156, which added support for multi-compiler tests to our test infrastructure; from http://dotty.epfl.ch/docs/contributing/testing.html:

There are also other suffixes indicating how some particular files are compiled:

  • _c${compilerVersion} - compile a file with a specific version of the compiler instead of the one developed on the current branch (e.g. Foo_c3.0.2.scala)
  • _r${release} - compile a file with a given value of -Yscala-release flag (e.g. Foo_r3.0.scala)

Different suffixes can be mixed together (their order is not important, although consistency is advised), e.g. Foo_1_r3.0, Bar_2_c3.0.2.

@odersky
Contributor Author

odersky commented Jan 20, 2022

Is this because of the extra .apply() when using a by-name? Could we strip the .apply at pickling time and re-introduce them when unpickling?

It's not only that. When the expected type is a () ?=> T, the new compiler would generate a ByName tree, but the old compiler would generate a closure. Generating closures
means we subtly change avoidance patterns, and some tests start to fail (notably recursive implicit searches).

@smarter
Member

smarter commented Jan 20, 2022

Generating closures means we subtly change avoidance patterns and some tests start to fail

We should be able to prevent avoidance from kicking in by not increasing the nesting level when typing a def with no parameters; the relevant code is here:
https://github.com/lampepfl/dotty/blob/2654b5649dbe84ee4743c455ab36a410068917fd/compiler/src/dotty/tools/dotc/typer/Namer.scala#L779-L780
If original has no type parameters or term parameters, then it seems like we can safely reuse the same nestingLevel.

@bishabosha
Member

bishabosha commented Jan 20, 2022

It's not only that. When the expected type is a () ?=> T, the new compiler would generate a ByName tree, but the old compiler would generate a closure. Generating closures means we subtly change avoidance patterns, and some tests start to fail (notably recursive implicit searches).

In the TreePickler, can we replace () ?=> T with BYNAMEtype? E.g. it appears in a refinement type in the upper bound of F:

class Foo {
  def foo(arg: => Any): String = arg.toString
}

class Bar[F <: Foo { def foo(arg: => String): String }] {
  def bar(foo: F, arg: String): String = foo.foo(arg)
}

Otherwise, when a by-name parameter is in tree position, we still have BYNAMEtpt.

@odersky
Contributor Author

odersky commented Jan 21, 2022

Withdrawn in favor of #14295. The latter is more conservative and less principled, but much simpler to implement, and it causes no inter-version headaches.

@odersky odersky closed this Jan 21, 2022
8 participants