diff --git a/docs/docs/reference/changed-features/compiler-plugins.md b/docs/docs/reference/changed-features/compiler-plugins.md index 7cd178a2927d..0d76ba186a96 100644 --- a/docs/docs/reference/changed-features/compiler-plugins.md +++ b/docs/docs/reference/changed-features/compiler-plugins.md @@ -66,7 +66,8 @@ class DivideZero extends StandardPlugin { val name: String = "divideZero" override val description: String = "divide zero check" - def init(options: List[String]): List[PluginPhase] = (new DivideZeroPhase) :: Nil + def init(options: List[String]): List[PluginPhase] = + (new DivideZeroPhase) :: Nil } class DivideZeroPhase extends PluginPhase { diff --git a/docs/docs/reference/changed-features/implicit-resolution.md b/docs/docs/reference/changed-features/implicit-resolution.md index 41523c30507c..6f848c040d0d 100644 --- a/docs/docs/reference/changed-features/implicit-resolution.md +++ b/docs/docs/reference/changed-features/implicit-resolution.md @@ -24,7 +24,7 @@ where the type may still be inferred: ... } ``` -**2.** Nesting is now taken into account for selecting an implicit.Consider for instance the following scenario: +**2.** Nesting is now taken into account for selecting an implicit. Consider for instance the following scenario: ```scala def f(implicit i: C) = { def g(implicit j: C) = { @@ -55,7 +55,7 @@ have only `b` in its implicit search scope but not `a`. In more detail, here are the rules for what constitutes the implicit scope of a type: -**Definition:** A reference is an _anchor_ if it refers to an object, a class, a trait, an abstract type, an opaque type alias, or a match type alias. References to packages and package objects are anchors only under -source:3.0-migration. +**Definition:** A reference is an _anchor_ if it refers to an object, a class, a trait, an abstract type, an opaque type alias, or a match type alias. References to packages and package objects are anchors only under `-source:3.0-migration`. 
**Definition:** The _anchors_ of a type _T_ is a set of references defined as follows: @@ -122,7 +122,7 @@ most (but not all) divergence errors in Scala 2 would terminate the implicit sea def buzz(y: A) = ??? buzz(1) // error: ambiguous ``` -**7.** The rule for picking a _most specific_ alternative among a set of overloaded or implicit alternatives is refined to take context parameters into account. All else being equal, an alternative that takes some context parameters is taken to be less specific than an alternative that takes none. If both alternatives take context parameters, we try to choose between them as if they were methods with regular parameters. The following paragraph in the SLS is affected by this change: +**7.** The rule for picking a _most specific_ alternative among a set of overloaded or implicit alternatives is refined to take context parameters into account. All else being equal, an alternative that takes some context parameters is taken to be less specific than an alternative that takes none. If both alternatives take context parameters, we try to choose between them as if they were methods with regular parameters. The following paragraph in the [SLS §6.26.3](https://scala-lang.org/files/archive/spec/2.13/06-expressions.html#overloading-resolution) is affected by this change: _Original version:_ diff --git a/docs/docs/reference/changed-features/interpolation-escapes.md b/docs/docs/reference/changed-features/interpolation-escapes.md index 1b313bebe8d6..aafeaacf15c2 100644 --- a/docs/docs/reference/changed-features/interpolation-escapes.md +++ b/docs/docs/reference/changed-features/interpolation-escapes.md @@ -3,7 +3,7 @@ layout: doc-page title: Escapes in interpolations --- -In Scala 2 there was no straightforward way to represent a single quote character `"` in a single quoted interpolation. 
A \ character can't be used for that because interpolators themselves decide how to handle escaping, so the parser doesn't know whether the " should be escaped or used as a terminator. +In Scala 2 there was no straightforward way to represent a double quote character `"` in a single quoted interpolation. A `\` character can't be used for that because interpolators themselves decide how to handle escaping, so the parser doesn't know whether the `"` should be escaped or used as a terminator. In Dotty, you can use the `$` meta character of interpolations to escape a `"` character. diff --git a/docs/docs/reference/changed-features/match-syntax.md b/docs/docs/reference/changed-features/match-syntax.md index e83f5f9ead86..f5b5b3800188 100644 --- a/docs/docs/reference/changed-features/match-syntax.md +++ b/docs/docs/reference/changed-features/match-syntax.md @@ -16,6 +16,7 @@ The syntactical precedence of match expressions has been changed. case "empty" => 0 case "nonempty" => 1 } + ``` 2. `match` may follow a period: @@ -26,6 +27,7 @@ The syntactical precedence of match expressions has been changed. } then "nonempty" else "empty" + ``` 3. The scrutinee of a match expression must be an `InfixExpr`. Previously the scrutinee could be followed by a type ascription `: T`, but this is no longer supported. So `x : T match { ... }` now has to be written `(x: T) match { ... }`. 
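The new `match` precedence rules above can be illustrated with a small sketch (`describe` is an invented name, not part of the patch):

```scala
def describe(xs: List[Int]): String =
  // `match` may now follow a period:
  xs.match {
    case Nil    => "empty"
    case _ :: _ => "nonempty"
  }

// A scrutinee with a type ascription must be parenthesized:
// (x: Any) match { ... }   — `x: Any match { ... }` is no longer legal
```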
diff --git a/docs/docs/reference/changed-features/numeric-literals.md b/docs/docs/reference/changed-features/numeric-literals.md index 05477a15533d..cb8c74547a18 100644 --- a/docs/docs/reference/changed-features/numeric-literals.md +++ b/docs/docs/reference/changed-features/numeric-literals.md @@ -79,8 +79,8 @@ numbers that can have both a decimal point and an exponent: ```scala object FromDigits { - /** A subclass of `FromDigits` that also allows to convert whole number literals - * with a radix other than 10 + /** A subclass of `FromDigits` that also allows converting whole + * number literals with a radix other than 10 */ trait WithRadix[T] extends FromDigits[T] { def fromDigits(digits: String): T = fromDigits(digits, 10) diff --git a/docs/docs/reference/changed-features/operators.md b/docs/docs/reference/changed-features/operators.md index 55c416753007..aebf4bbd7995 100644 --- a/docs/docs/reference/changed-features/operators.md +++ b/docs/docs/reference/changed-features/operators.md @@ -47,7 +47,7 @@ one of the following conditions holds: - the operator is followed by an opening brace. An alphanumeric operator is an operator consisting entirely of letters, digits, the `$` and `_` characters, or -any unicode character `c` for which `java.lang.Character.isIdentifierPart(c)` returns `true`. +any Unicode character `c` for which `java.lang.Character.isIdentifierPart(c)` returns `true`. Infix operations involving symbolic operators are always allowed, so `infix` is redundant for methods with symbolic names. @@ -89,9 +89,9 @@ The purpose of the `infix` modifier is to achieve consistency across a code base 5. To smooth migration to Scala 3.0, alphanumeric operators will only be deprecated from Scala 3.1 onwards, or if the `-source 3.1` option is given in Dotty/Scala 3. 
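As a hedged sketch of the `infix` modifier discussed in the operators hunk above (`MultiSet` and `union` are invented names):

```scala
class MultiSet[T](val elems: List[T]):
  // `infix` marks this alphanumeric method as intended for infix use,
  // so `s union t` stays valid without future deprecation warnings.
  infix def union(that: MultiSet[T]): MultiSet[T] =
    MultiSet(elems ++ that.elems)
```

With this definition, `MultiSet(List(1)) union MultiSet(List(2))` is accepted as infix notation; without `infix`, alphanumeric infix use is slated for deprecation as the text above describes.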
-## The @targetName Annotation +## The `@targetName` Annotation -It is recommended that definitions of symbolic operators carry a [@targetName annotation](../other-new-features/targetName.html) that provides an encoding of the operator with an alphanumeric name. This has several benefits: +It is recommended that definitions of symbolic operators carry a [`@targetName` annotation](../other-new-features/targetName.md) that provides an encoding of the operator with an alphanumeric name. This has several benefits: - It helps interoperability between Scala and other languages. One can call a Scala-defined symbolic operator from another language using its target name, @@ -115,7 +115,7 @@ def condition = || xs.exists(_ > 0) || xs.isEmpty ``` -Previously, these expressions would have been rejected, since the compiler's semicolon inference +Previously, those expressions would have been rejected, since the compiler's semicolon inference would have treated the continuations `++ " world"` or `|| xs.isEmpty` as separate statements. To make this syntax work, the rules are modified to not infer semicolons in front of leading infix operators. diff --git a/docs/docs/reference/changed-features/overload-resolution.md b/docs/docs/reference/changed-features/overload-resolution.md index 3eb348dcae7c..aa7e4fdea0c7 100644 --- a/docs/docs/reference/changed-features/overload-resolution.md +++ b/docs/docs/reference/changed-features/overload-resolution.md @@ -13,8 +13,8 @@ are in the first argument list. Overloading resolution now can take argument lists into account when choosing among a set of overloaded alternatives. 
-For example, the following code compiles in Dotty, while it results in an -ambiguous overload error in Scala2: +For example, the following code compiles in Scala 3, while it results in an +ambiguous overload error in Scala 2: ```scala def f(x: Int)(y: String): Int = 0 @@ -33,7 +33,7 @@ g(2)(3)(4) // ok g(2)(3)("") // ok ``` -To make this work, the rules for overloading resolution in section 6.23.3 of the SLS are augmented +To make this work, the rules for overloading resolution in [SLS §6.26.3](https://www.scala-lang.org/files/archive/spec/2.13/06-expressions.html#overloading-resolution) are augmented as follows: > In a situation where a function is applied to more than one argument list, if overloading @@ -57,7 +57,7 @@ def f(x: String, f2: String => String) = f2(x) f("a", _.toUpperCase) f(2, _ * 2) ``` -To make this work, the rules for overloading resolution in section 6.23.3 of the SLS are modified +To make this work, the rules for overloading resolution in [SLS §6.26.3](https://www.scala-lang.org/files/archive/spec/2.13/06-expressions.html#overloading-resolution) are modified as follows: Replace the sentence diff --git a/docs/docs/reference/changed-features/pattern-bindings.md b/docs/docs/reference/changed-features/pattern-bindings.md index de5c32ddc010..65c170717b47 100644 --- a/docs/docs/reference/changed-features/pattern-bindings.md +++ b/docs/docs/reference/changed-features/pattern-bindings.md @@ -25,13 +25,13 @@ want to decompose it like this: ```scala val first :: rest = elems // error ``` -This works in Scala 2. In fact it is a typical use case for Scala 2's rules. But in Scala 3.1 it will give a type error. One can avoid the error by marking the pattern with an @unchecked annotation: +This works in Scala 2. In fact it is a typical use case for Scala 2's rules. But in Scala 3.1 it will give a type error. 
One can avoid the error by marking the pattern with an `@unchecked` annotation: ```scala val first :: rest : @unchecked = elems // OK ``` This will make the compiler accept the pattern binding. It might give an error at runtime instead, if the underlying assumption that `elems` can never be empty is wrong. -## Pattern Bindings in For Expressions +## Pattern Bindings in `for` Expressions Analogous changes apply to patterns in `for` expressions. For instance: diff --git a/docs/docs/reference/changed-features/pattern-matching.md b/docs/docs/reference/changed-features/pattern-matching.md index 14628b0e36f1..3b4a88b90c3c 100644 --- a/docs/docs/reference/changed-features/pattern-matching.md +++ b/docs/docs/reference/changed-features/pattern-matching.md @@ -3,9 +3,9 @@ layout: doc-page title: "Option-less pattern matching" --- -Dotty implementation of pattern matching was greatly simplified compared to scalac. From a user perspective, this means that Dotty generated patterns are a *lot* easier to debug, as variables all show up in debug modes and positions are correctly preserved. +Dotty implementation of pattern matching was greatly simplified compared to Scala 2. From a user perspective, this means that Scala 3 generated patterns are a *lot* easier to debug, as variables all show up in debug modes and positions are correctly preserved. -Dotty supports a superset of scalac's [extractors](https://www.scala-lang.org/files/archive/spec/2.13/08-pattern-matching.html#extractor-patterns). +Dotty supports a superset of Scala 2 [extractors](https://www.scala-lang.org/files/archive/spec/2.13/08-pattern-matching.html#extractor-patterns). 
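The `for`-expression change mentioned above can be sketched as follows (`elems` is an invented example list; under the stricter rules, a refutable pattern in a generator needs a `case` prefix to get filtering semantics):

```scala
val elems: List[Any] = List((1, 2), "hello", (3, 4))

// The `case` prefix requests filtering: non-matching elements
// ("hello" here) are skipped rather than causing an error.
for case (x, y) <- elems do println(s"$x $y")
```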
## Extractors @@ -54,7 +54,7 @@ A usage of a fixed-arity extractor is irrefutable if one of the following condit - `U = true` - the extractor is used as a product match -- `U = Some[T]` (for Scala2 compatibility) +- `U = Some[T]` (for Scala 2 compatibility) - `U <: R` and `U <: { def isEmpty: false }` ### Variadic Extractors @@ -84,7 +84,7 @@ and `S` conforms to one of the two matches above. The former form of `unapplySeq` has higher priority, and _sequence match_ has higher precedence over _product-sequence match_. -A usage of a variadic extractor is irrefutable if one of the following condition holds: +A usage of a variadic extractor is irrefutable if one of the following conditions holds: - the extractor is used directly as a sequence match or product-sequence match - `U = Some[T]` (for Scala2 compatibility) @@ -230,7 +230,8 @@ object CharList { ```Scala class Foo(val name: String, val children: Int *) object Foo { - def unapplySeq(f: Foo): Option[(String, Seq[Int])] = Some((f.name, f.children)) + def unapplySeq(f: Foo): Option[(String, Seq[Int])] = + Some((f.name, f.children)) } def foo(f: Foo) = f match { diff --git a/docs/docs/reference/changed-features/structural-types.md b/docs/docs/reference/changed-features/structural-types.md index e840b6df4ea2..def43b047b95 100644 --- a/docs/docs/reference/changed-features/structural-types.md +++ b/docs/docs/reference/changed-features/structural-types.md @@ -42,7 +42,7 @@ Here's an example of a structural type `Person`: val age: Int } ``` -The person type adds a _refinement_ to its parent type `Record` that defines `name` and `age` fields. We say the refinement is _structural_ since `name` and `age` are not defined in the parent type. But they exist nevertheless as members of class `Person`. For instance, the following +The type `Person` adds a _refinement_ to its parent type `Record` that defines the two fields `name` and `age`. 
We say the refinement is _structural_ since `name` and `age` are not defined in the parent type. But they exist nevertheless as members of class `Person`. For instance, the following program would print "Emma is 42 years old.": ```scala val person = Record("name" -> "Emma", "age" -> 42).asInstanceOf[Person] diff --git a/docs/docs/reference/contextual/extension-methods.md b/docs/docs/reference/contextual/extension-methods.md index efdb2e9dd506..587e833a015d 100644 --- a/docs/docs/reference/contextual/extension-methods.md +++ b/docs/docs/reference/contextual/extension-methods.md @@ -267,7 +267,7 @@ def position(s: String)(ch: Char, n: Int): Int = ### Syntax Here are the syntax changes for extension methods and collective extensions relative -to the [current syntax](../../internals/syntax.md). +to the [current syntax](../syntax.md). ``` BlockStat ::= ... | Extension diff --git a/docs/docs/reference/contextual/given-imports.md b/docs/docs/reference/contextual/given-imports.md index 911641ce9988..8e2f37fc99df 100644 --- a/docs/docs/reference/contextual/given-imports.md +++ b/docs/docs/reference/contextual/given-imports.md @@ -18,7 +18,7 @@ object B { } ``` -In the code above, the `import A._` clause of object `B` will import all members +In the code above, the `import A._` clause in object `B` imports all members of `A` _except_ the given instance `tc`. Conversely, the second import `import A.given` will import _only_ that given instance. 
The two import clauses can also be merged into one: @@ -67,7 +67,7 @@ object Instances { } ``` -the import +the import clause ```scala import Instances.{given Ordering[?], given ExecutionContext} diff --git a/docs/docs/reference/contextual/givens.md b/docs/docs/reference/contextual/givens.md index b63607dfb68c..e1f9925d9f0d 100644 --- a/docs/docs/reference/contextual/givens.md +++ b/docs/docs/reference/contextual/givens.md @@ -4,7 +4,7 @@ title: "Given Instances" --- Given instances (or, simply, "givens") define "canonical" values of certain types -that serve for synthesizing arguments to [context parameters](./using-clauses.html). Example: +that serve for synthesizing arguments to [context parameters](./using-clauses.md). Example: ```scala trait Ord[T] { @@ -34,7 +34,7 @@ for `Ord[List[T]]` for all types `T` that come with a given instance for `Ord[T] themselves. The `using` clause in `listOrd` defines a condition: There must be a given of type `Ord[T]` for a given of type `Ord[List[T]]` to exist. Such conditions are expanded by the compiler to [context -parameters](./using-clauses.html). +parameters](./using-clauses.md). ## Anonymous Givens @@ -108,7 +108,7 @@ In each case, a pattern-bound given instance consists of `given` and a type `T`. ## Negated Givens -Scala 2's somewhat puzzling behavior with respect to ambiguity has been exploited to implement the analogue of a "negated" search in implicit resolution, where a query Q1 fails if some other query Q2 succeeds and Q1 succeeds if Q2 fails. With the new cleaned up behavior these techniques no longer work. But there is now a new special type `scala.util.NotGiven` which implements negation directly. +Scala 2's somewhat puzzling behavior with respect to ambiguity has been exploited to implement the analogue of a "negated" search in implicit resolution, where a query Q1 fails if some other query Q2 succeeds and Q1 succeeds if Q2 fails. With the new cleaned up behavior these techniques no longer work. 
But the new special type `scala.util.NotGiven` now implements negation directly. For any query type `Q`, `NotGiven[Q]` succeeds if and only if the implicit search for `Q` fails, for example: diff --git a/docs/docs/reference/contextual/motivation.md b/docs/docs/reference/contextual/motivation.md index 22da4305ac42..a0bd53e03919 100644 --- a/docs/docs/reference/contextual/motivation.md +++ b/docs/docs/reference/contextual/motivation.md @@ -8,7 +8,7 @@ title: "Overview" Scala's implicits are its most distinguished feature. They are _the_ fundamental way to abstract over context. They represent a unified paradigm with a great variety of use cases, among them: implementing type classes, establishing context, dependency injection, expressing capabilities, computing new types and proving relationships between them. Following Haskell, Scala was the second popular language to have some form of implicits. Other languages have followed suit. E.g Rust's traits or Swift's protocol extensions. Design proposals are also on the table for Kotlin as [compile time dependency resolution](https://github.com/Kotlin/KEEP/blob/e863b25f8b3f2e9b9aaac361c6ee52be31453ee0/proposals/compile-time-dependency-resolution.md), for C# as [Shapes and Extensions](https://github.com/dotnet/csharplang/issues/164) -or for F# as [Traits](https://github.com/MattWindsor91/visualfsharp/blob/hackathon-vs/examples/fsconcepts.md). Implicits are also a common feature of theorem provers such as Coq or Agda. +or for F# as [Traits](https://github.com/MattWindsor91/visualfsharp/blob/hackathon-vs/examples/fsconcepts.md). Implicits are also a common feature of theorem provers such as Coq or [Agda](https://agda.readthedocs.io/en/latest/language/implicit-arguments.html). Even though these designs use widely different terminology, they are all variants of the core idea of _term inference_. Given a type, the compiler synthesizes a "canonical" term that has that type. 
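A minimal sketch of the `scala.util.NotGiven` mechanism described above (the `Tagged` trait and `untagged` method are invented for illustration):

```scala
import scala.util.NotGiven

trait Tagged[A]  // invented marker type class

// Resolves only when no `Tagged[A]` given is in scope.
def untagged[A](using NotGiven[Tagged[A]]): String = "untagged"

given Tagged[Int] = new Tagged[Int] {}

val ok = untagged[String] // compiles: no Tagged[String] given exists
// untagged[Int]          // would not compile: a Tagged[Int] is given
```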
Scala embodies the idea in a purer form than most other languages: An implicit parameter directly leads to an inferred argument term that could also be written down explicitly. By contrast, type class based designs are less direct since they hide term inference behind some form of type classification and do not offer the option of writing the inferred quantities (typically, dictionaries) explicitly. diff --git a/docs/docs/reference/contextual/type-classes.md b/docs/docs/reference/contextual/type-classes.md index 05bc5deff3c5..f12531ef2274 100644 --- a/docs/docs/reference/contextual/type-classes.md +++ b/docs/docs/reference/contextual/type-classes.md @@ -97,7 +97,7 @@ assertTransformation(List("a1", "b1"), List("a", "b"), elt => s"${elt}1") ``` That's a first step, but in practice we probably would like the `map` function to be a method directly accessible on the type `F`. So that we can call `map` directly on instances of `F`, and get rid of the `summon[Functor[F]]` part. -As in the previous example of Monoids, [`extension` methods](extension-methods.html) help achieving that. Let's re-define the `Functor` type class with extension methods. +As in the previous example of Monoids, [`extension` methods](extension-methods.md) help achieve that. Let's re-define the `Functor` type class with extension methods. 
```scala trait Functor[F[_]]: @@ -234,7 +234,7 @@ given configDependentMonad: Monad[ConfigDependent] with end configDependentMonad ``` -The type `ConfigDependent` can be written using [type lambdas](../new-types/type-lambdas.html): +The type `ConfigDependent` can be written using [type lambdas](../new-types/type-lambdas.md): ```scala type ConfigDependent = [Result] =>> Config => Result diff --git a/docs/docs/reference/contextual/using-clauses.md b/docs/docs/reference/contextual/using-clauses.md index 63423e283bd8..33bfcc145728 100644 --- a/docs/docs/reference/contextual/using-clauses.md +++ b/docs/docs/reference/contextual/using-clauses.md @@ -9,13 +9,13 @@ functions. Context parameters can help here since they enable the compiler to sy repetitive arguments instead of the programmer having to write them explicitly. For example, with the [given instances](./givens.md) defined previously, -a maximum function that works for any arguments for which an ordering exists can be defined as follows: +a `max` function that works for any arguments for which an ordering exists can be defined as follows: ```scala def max[T](x: T, y: T)(using ord: Ord[T]): T = if ord.compare(x, y) < 0 then y else x ``` Here, `ord` is a _context parameter_ introduced with a `using` clause. -The `max` method can be applied as follows: +The `max` function can be applied as follows: ```scala max(2, 3)(using intOrd) ``` @@ -38,7 +38,7 @@ def maximum[T](xs: List[T])(using Ord[T]): T = `maximum` takes a context parameter of type `Ord` only to pass it on as an inferred argument to `max`. The name of the parameter is left out. -Generally, context parameters may be defined either as a full parameter list `(p_1: T_1, ..., p_n: T_n)` or just as a sequence of types `T_1, ..., T_n`. Vararg parameters are not supported in using clauses. +Generally, context parameters may be defined either as a full parameter list `(p_1: T_1, ..., p_n: T_n)` or just as a sequence of types `T_1, ..., T_n`. 
Vararg parameters are not supported in `using` clauses. ## Inferring Complex Arguments @@ -60,18 +60,18 @@ maximum(xs)(using descending(using listOrd)) maximum(xs)(using descending(using listOrd(using intOrd))) ``` -## Multiple Using Clauses +## Multiple `using` Clauses -There can be several using clauses in a definition and using clauses can be freely mixed with normal parameter clauses. Example: +There can be several `using` clauses in a definition and `using` clauses can be freely mixed with normal parameter clauses. Example: ```scala def f(u: Universe)(using ctx: u.Context)(using s: ctx.Symbol, k: ctx.Kind) = ... ``` -Multiple using clauses are matched left-to-right in applications. Example: +Multiple `using` clauses are matched left-to-right in applications. Example: ```scala object global extends Universe { type Context = ... } -given ctx : global.Context with { type Symbol = ...; type Kind = ... } -given sym : ctx.Symbol -given kind : ctx.Kind +given ctx : global.Context with { type Symbol = ...; type Kind = ... } +given sym : ctx.Symbol +given kind: ctx.Kind ``` Then the following calls are all valid (and normalize to the last one) ```scala @@ -96,7 +96,7 @@ def summon[T](using x: T): x.type = x ## Syntax -Here is the new syntax of parameters and arguments seen as a delta from the [standard context free syntax of Scala 3](../../internals/syntax.md). `using` is a soft keyword, recognized only at the start of a parameter or argument list. It can be used as a normal identifier everywhere else. +Here is the new syntax of parameters and arguments seen as a delta from the [standard context free syntax of Scala 3](../syntax.md). `using` is a soft keyword, recognized only at the start of a parameter or argument list. It can be used as a normal identifier everywhere else. ``` ClsParamClause ::= ... | UsingClsParamClause DefParamClauses ::= ... 
| UsingParamClause diff --git a/docs/docs/reference/dropped-features/class-shadowing.md b/docs/docs/reference/dropped-features/class-shadowing.md index 721a4d18139d..d9b4c6fd74d8 100644 --- a/docs/docs/reference/dropped-features/class-shadowing.md +++ b/docs/docs/reference/dropped-features/class-shadowing.md @@ -1,6 +1,6 @@ --- layout: doc-page -title: Dropped: Class Shadowing +title: "Dropped: Class Shadowing" --- Scala so far allowed patterns like this: diff --git a/docs/docs/reference/dropped-features/delayed-init.md b/docs/docs/reference/dropped-features/delayed-init.md index d38c84e6222f..f391cb07143c 100644 --- a/docs/docs/reference/dropped-features/delayed-init.md +++ b/docs/docs/reference/dropped-features/delayed-init.md @@ -1,6 +1,6 @@ --- layout: doc-page -title: Dropped: Delayedinit +title: "Dropped: DelayedInit" --- The special handling of the `DelayedInit` trait is no longer diff --git a/docs/docs/reference/dropped-features/do-while.md b/docs/docs/reference/dropped-features/do-while.md index 6f76207208c6..6032b4e0a774 100644 --- a/docs/docs/reference/dropped-features/do-while.md +++ b/docs/docs/reference/dropped-features/do-while.md @@ -1,6 +1,6 @@ --- layout: doc-page -title: Dropped: Do-While +title: "Dropped: Do-While" --- The syntax construct @@ -43,7 +43,7 @@ while { ### Why Drop The Construct? - - `do-while` is used relatively rarely and it can expressed faithfully using just while. So there seems to be little point in having it as a separate syntax construct. - - Under the [new syntax rules](../other-new-features/control-syntax) `do` is used + - `do-while` is used relatively rarely and it can be expressed faithfully using just `while`. So there seems to be little point in having it as a separate syntax construct. - - Under the [new syntax rules](../other-new-features/control-syntax.md) `do` is used as a statement continuation, which would clash with its meaning as a statement introduction. 
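The rewrite suggested by the dropped `do-while` page can be sketched as follows (the loop body is invented):

```scala
// Scala 2's `do { i += 1 } while (i < 3)` becomes a plain `while`
// whose condition block also runs the body, so the body still
// executes at least once:
var i = 0
while ({ i += 1; i < 3 }) ()
```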
diff --git a/docs/docs/reference/dropped-features/early-initializers.md b/docs/docs/reference/dropped-features/early-initializers.md index 91dc3f1946f5..a0b24fa36162 100644 --- a/docs/docs/reference/dropped-features/early-initializers.md +++ b/docs/docs/reference/dropped-features/early-initializers.md @@ -1,6 +1,6 @@ --- layout: doc-page -title: Dropped: Early Initializers +title: "Dropped: Early Initializers" --- Early initializers of the form diff --git a/docs/docs/reference/dropped-features/existential-types.md b/docs/docs/reference/dropped-features/existential-types.md index 3a9d638adeff..0fcc7a256d95 100644 --- a/docs/docs/reference/dropped-features/existential-types.md +++ b/docs/docs/reference/dropped-features/existential-types.md @@ -1,9 +1,9 @@ --- layout: doc-page -title: Dropped: Existential Types +title: "Dropped: Existential Types" --- -Existential types using `forSome` have been dropped. The reasons for dropping them were: +Existential types using `forSome` have been dropped. The reasons for dropping them are: - Existential types violate a type soundness principle on which DOT and Dotty are constructed. That principle says that every diff --git a/docs/docs/reference/dropped-features/limit22.md b/docs/docs/reference/dropped-features/limit22.md index ed1fe7e44435..05142d56661a 100644 --- a/docs/docs/reference/dropped-features/limit22.md +++ b/docs/docs/reference/dropped-features/limit22.md @@ -1,14 +1,14 @@ --- layout: doc-page -title: Dropped: Limit 22 +title: "Dropped: Limit 22" --- The limits of 22 for the maximal number of parameters of function types and the maximal number of fields in tuple types have been dropped. Functions can now have an arbitrary number of -parameters. Functions beyond Function22 are erased to a new trait -`scala.FunctionXXL` and tuples beyond Tuple22 are erased to a new trait `scala.TupleXXL`. +parameters. 
Functions beyond `Function22` are erased to a new trait +`scala.FunctionXXL` and tuples beyond `Tuple22` are erased to a new trait `scala.TupleXXL`. Both of these are implemented using arrays. Tuples can also have an arbitrary number of fields. Furthermore, they support generic operation such as concatenation and indexing. diff --git a/docs/docs/reference/dropped-features/macros.md b/docs/docs/reference/dropped-features/macros.md index 8cd8861ebad2..2742416e2c0a 100644 --- a/docs/docs/reference/dropped-features/macros.md +++ b/docs/docs/reference/dropped-features/macros.md @@ -1,9 +1,11 @@ --- layout: doc-page -title: Dropped: Scala 2 Macros +title: "Dropped: Scala 2 Macros" --- -The previous, experimental macro system has been dropped. Instead, there is a cleaner, more restricted system based on two complementary concepts: `inline` and `'{ ... }`/`${ ... }` code generation. +The previous, experimental macro system has been dropped. + +Instead, there is a cleaner, more restricted system based on two complementary concepts: `inline` and `'{ ... }`/`${ ... }` code generation. `'{ ... }` delays the compilation of the code and produces an object containing the code, dually `${ ... }` evaluates an expression which produces code and inserts it in the surrounding `${ ... }`. In this setting, a definition marked as inlined containing a `${ ... }` is a macro, the code inside the `${ ... }` is executed at compile-time and produces code in the form of `'{ ... }`. Additionally, the contents of code can be inspected and created with a more complex reflection API (TASTy Reflect) as an extension of `'{ ... }`/`${ ... }` framework. 
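The `inline` plus `'{ ... }`/`${ ... }` scheme described in the macros hunk above can be sketched as follows (`twice` is an invented example macro, not part of the patch):

```scala
import scala.quoted.*

// The `${...}` body runs at compile time and produces code
// in the form of `'{...}`, which is spliced into the call site.
inline def twice(inline x: Int): Int = ${ twiceImpl('x) }

def twiceImpl(x: Expr[Int])(using Quotes): Expr[Int] =
  '{ $x + $x }
```

As with all Scala 3 macros, the implementation (`twiceImpl`) must be compiled before any call site of `twice`, i.e. it has to live in a separate compilation run.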
diff --git a/docs/docs/reference/dropped-features/nonlocal-returns.md b/docs/docs/reference/dropped-features/nonlocal-returns.md index 075eef088f75..e3af0d1de2d1 100644 --- a/docs/docs/reference/dropped-features/nonlocal-returns.md +++ b/docs/docs/reference/dropped-features/nonlocal-returns.md @@ -1,9 +1,11 @@ --- layout: doc-page -title: Deprecated: Nonlocal Returns +title: "Deprecated: Nonlocal Returns" --- -Returning from nested anonymous functions has been deprecated. Nonlocal returns are implemented by throwing and catching `scala.runtime.NonLocalReturnException`-s. This is rarely what is intended by the programmer. It can be problematic because of the hidden performance cost of throwing and catching exceptions. Furthermore, it is a leaky implementation: a catch-all exception handler can intercept a `NonLocalReturnException`. +Returning from nested anonymous functions has been deprecated. + +Nonlocal returns are implemented by throwing and catching `scala.runtime.NonLocalReturnException`-s. This is rarely what is intended by the programmer. It can be problematic because of the hidden performance cost of throwing and catching exceptions. Furthermore, it is a leaky implementation: a catch-all exception handler can intercept a `NonLocalReturnException`. 
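A hedged sketch of the `scala.util.control.NonLocalReturns` replacement introduced in this hunk (`firstNegative` is an invented example): `throwReturn` aborts the enclosing `returning` block with the given result.

```scala
import scala.util.control.NonLocalReturns.*

def firstNegative(xs: List[Int]): Option[Int] = returning {
  for x <- xs do
    if x < 0 then throwReturn(Some(x)) // nonlocal exit from the block
  None
}
```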
A drop-in library replacement is provided in `scala.util.control.NonLocalReturns`: diff --git a/docs/docs/reference/dropped-features/procedure-syntax.md b/docs/docs/reference/dropped-features/procedure-syntax.md index 9a5416e37db4..a345cd9e0eb9 100644 --- a/docs/docs/reference/dropped-features/procedure-syntax.md +++ b/docs/docs/reference/dropped-features/procedure-syntax.md @@ -1,6 +1,6 @@ --- layout: doc-page -title: Dropped: Procedure Syntax +title: "Dropped: Procedure Syntax" --- Procedure syntax diff --git a/docs/docs/reference/dropped-features/symlits.md b/docs/docs/reference/dropped-features/symlits.md index 211ceab1aef0..ecb03816b4cd 100644 --- a/docs/docs/reference/dropped-features/symlits.md +++ b/docs/docs/reference/dropped-features/symlits.md @@ -1,7 +1,9 @@ --- layout: doc-page -title: Dropped: Symbol Literals +title: "Dropped: Symbol Literals" --- -Symbol literals are no longer supported. The `scala.Symbol` class still exists, so a +Symbol literals are no longer supported. + +The `scala.Symbol` class still exists, so a literal translation of the symbol literal `'xyz` is `Symbol("xyz")`. However, it is recommended to use a plain string literal `"xyz"` instead. (The `Symbol` class will be deprecated and removed in the future). diff --git a/docs/docs/reference/dropped-features/this-qualifier.md b/docs/docs/reference/dropped-features/this-qualifier.md index 4e50048b8d76..9d7ba7e5ad41 100644 --- a/docs/docs/reference/dropped-features/this-qualifier.md +++ b/docs/docs/reference/dropped-features/this-qualifier.md @@ -1,14 +1,14 @@ --- layout: doc-page -title: Dropped: private[this] and protected[this] +title: "Dropped: private[this] and protected[this]" --- The `private[this]` and `protected[this]` access modifiers are deprecated and will be phased out. 
-Previously, these modifiers were needed +Previously, these modifiers were needed for - - for avoiding the generation of getters and setters - - for excluding code under a `private[this]` from variance checks. (Scala 2 also excludes `protected[this]` but this was found to be unsound and was therefore removed). + - avoiding the generation of getters and setters + - excluding code under a `private[this]` from variance checks. (Scala 2 also excludes `protected[this]` but this was found to be unsound and was therefore removed). The compiler now infers for `private` members the fact that they are only accessed via `this`. Such members are treated as if they had been declared `private[this]`. `protected[this]` is dropped without a replacement. diff --git a/docs/docs/reference/dropped-features/type-projection.md b/docs/docs/reference/dropped-features/type-projection.md index ccc4259e2230..3e926151f672 100644 --- a/docs/docs/reference/dropped-features/type-projection.md +++ b/docs/docs/reference/dropped-features/type-projection.md @@ -1,6 +1,6 @@ --- layout: doc-page -title: Dropped: General Type Projection +title: "Dropped: General Type Projection" --- Scala so far allowed general type projection `T#A` where `T` is an arbitrary type diff --git a/docs/docs/reference/dropped-features/weak-conformance.md b/docs/docs/reference/dropped-features/weak-conformance.md index 2ed20cd07af1..2e64a2dcdf19 100644 --- a/docs/docs/reference/dropped-features/weak-conformance.md +++ b/docs/docs/reference/dropped-features/weak-conformance.md @@ -1,6 +1,6 @@ --- layout: doc-page -title: Dropped: Weak Conformance +title: "Dropped: Weak Conformance" --- In some situations, Scala used a _weak conformance_ relation when diff --git a/docs/docs/reference/dropped-features/xml.md b/docs/docs/reference/dropped-features/xml.md index 8e8769336e53..aa90d02f5ba5 100644 --- a/docs/docs/reference/dropped-features/xml.md +++ b/docs/docs/reference/dropped-features/xml.md @@ -1,6 +1,6 @@ --- layout: doc-page 
-title: Dropped: XML Literals +title: "Dropped: XML Literals" --- XML Literals are still supported, but will be dropped in the near future, to diff --git a/docs/docs/reference/enums/adts.md b/docs/docs/reference/enums/adts.md index 61258a7c490e..53985f616190 100644 --- a/docs/docs/reference/enums/adts.md +++ b/docs/docs/reference/enums/adts.md @@ -3,7 +3,7 @@ layout: doc-page title: "Algebraic Data Types" --- -The [`enum` concept](./enums.html) is general enough to also support algebraic data +The [`enum` concept](./enums.md) is general enough to also support algebraic data types (ADTs) and their generalized version (GADTs). Here is an example how an `Option` type can be represented as an ADT: @@ -32,7 +32,7 @@ enum Option[+T] { Note that the parent type of the `None` value is inferred as `Option[Nothing]`. Generally, all covariant type parameters of the enum -class are minimized in a compiler-generated extends clause whereas all +class are minimized in a compiler-generated `extends` clause whereas all contravariant type parameters are maximized. If `Option` was non-variant, you would need to give the extends clause of `None` explicitly. @@ -145,9 +145,10 @@ enum View[-T, +U] extends (T => U): ### Syntax of Enums Changes to the syntax fall in two categories: enum definitions and cases inside enums. -The changes are specified below as deltas with respect to the Scala syntax given [here](../../internals/syntax.md) +The changes are specified below as deltas with respect to the Scala syntax given [here](../syntax.md) 1. Enum definitions are defined as follows: + ``` TmplDef ::= `enum' EnumDef EnumDef ::= id ClassConstr [`extends' [ConstrApps]] EnumBody @@ -155,10 +156,13 @@ The changes are specified below as deltas with respect to the Scala syntax given EnumStat ::= TemplateStat | {Annotation [nl]} {Modifier} EnumCase ``` + 2. 
Cases of enums are defined as follows: + ``` EnumCase ::= `case' (id ClassConstr [`extends' ConstrApps]] | ids) ``` + ### Reference For more info, see [Issue #1970](https://github.com/lampepfl/dotty/issues/1970). diff --git a/docs/docs/reference/enums/desugarEnums.md b/docs/docs/reference/enums/desugarEnums.md index beed276b92ff..4e613873feb4 100644 --- a/docs/docs/reference/enums/desugarEnums.md +++ b/docs/docs/reference/enums/desugarEnums.md @@ -174,6 +174,7 @@ If `E` contains at least one simple case, its companion object will define in ad - A private method `$new` which defines a new simple case value with given ordinal number and name. This method can be thought as being defined as follows. + ```scala private def $new(_$ordinal: Int, $name: String) = new E with runtime.EnumValue { def ordinal = _$ordinal @@ -209,5 +210,5 @@ Cases such as `case C` expand to a `@static val` as opposed to a `val`. This all `scala.reflect.Enum`. This ensures that the only cases of an enum are the ones that are explicitly declared in it. - - If an enum case has an extends clause, the enum class must be one of the + - If an enum case has an `extends` clause, the enum class must be one of the classes that's extended. diff --git a/docs/docs/reference/features-classification.md b/docs/docs/reference/features-classification.md index c508ee18f846..e2e99991b37b 100644 --- a/docs/docs/reference/features-classification.md +++ b/docs/docs/reference/features-classification.md @@ -14,7 +14,7 @@ The current document reflects the state of things as of April, 2019. It will be ## Essential Foundations -These new constructs directly model core features of DOT, higher-kinded types, and the [SI calculus for implicit resolution](https://infoscience.epfl.ch/record/229878/files/simplicitly_1.pdf). 
+These new constructs directly model core features of [DOT](https://www.scala-lang.org/blog/2016/02/03/essence-of-scala.html), higher-kinded types, and the [SI calculus for implicit resolution](https://infoscience.epfl.ch/record/229878/files/simplicitly_1.pdf). - [Intersection types](new-types/intersection-types.md), replacing compound types, - [Union types](new-types/union-types.md), diff --git a/docs/docs/reference/metaprogramming/erased-terms.md b/docs/docs/reference/metaprogramming/erased-terms.md index 2313538c570a..c5d583149eea 100644 --- a/docs/docs/reference/metaprogramming/erased-terms.md +++ b/docs/docs/reference/metaprogramming/erased-terms.md @@ -3,7 +3,7 @@ layout: doc-page title: "Erased Terms" --- -# Why erased terms? +## Why erased terms? Let's describe the motivation behind erased terms with an example. In the following we show a simple state machine which can be in a state `On` or `Off`. @@ -44,7 +44,7 @@ introduce _erased terms_ to overcome this limitation: we are able to enforce the right constrains on terms at compile time. These terms have no run time semantics and they are completely erased. -# How to define erased terms? +## How to define erased terms? Parameters of methods and functions can be declared as erased, placing `erased` in front of a parameter list (like `given`). @@ -76,7 +76,7 @@ erased val erasedEvidence: Ev = ... methodWithErasedEv(erasedEvidence) ``` -# What happens with erased values at runtime? +## What happens with erased values at runtime? As `erased` are guaranteed not to be used in computations, they can and will be erased. @@ -93,7 +93,7 @@ erased val erasedEvidence3: Ev = ... // does not exist at runtime methodWithErasedEv(evidence1) ``` -# State machine with erased evidence example +## State machine with erased evidence example The following example is an extended implementation of a simple state machine which can be in a state `On` or `Off`. 
The machine can change state from `Off` @@ -121,14 +121,16 @@ final class Off extends State @implicitNotFound("State must be Off") class IsOff[S <: State] object IsOff { - // will not be called at runtime for turnedOn, the compiler will only require that this evidence exists + // will not be called at runtime for turnedOn, the + // compiler will only require that this evidence exists given IsOff[Off] = new IsOff[Off] } @implicitNotFound("State must be On") class IsOn[S <: State] object IsOn { - // will not exist at runtime, the compiler will only require that this evidence exists at compile time + // will not exist at runtime, the compiler will only + // require that this evidence exists at compile time erased given IsOn[On] = new IsOn[On] } diff --git a/docs/docs/reference/metaprogramming/inline.md b/docs/docs/reference/metaprogramming/inline.md index b4df434e3750..9812744169ab 100644 --- a/docs/docs/reference/metaprogramming/inline.md +++ b/docs/docs/reference/metaprogramming/inline.md @@ -142,11 +142,13 @@ funkyAssertEquals(computeActual(), computeExpected(), computeDelta()) // if (actual - expected).abs > computeDelta() then // throw new AssertionError(s"difference between ${expected} and ${actual} was larger than ${computeDelta()}") ``` + ### Rules for Overriding Inline methods can override other non-inline methods. The rules are as follows: 1. If an inline method `f` implements or overrides another, non-inline method, the inline method can also be invoked at runtime. For instance, consider the scenario: + ```scala abstract class A { def f(): Int @@ -170,6 +172,7 @@ Inline methods can override other non-inline methods. The rules are as follows: 2. Inline methods are effectively final. 3. Inline methods can also be abstract. An abstract inline method can be implemented only by other inline methods. It cannot be invoked directly: + ```scala abstract class A { inline def f(): Int @@ -182,9 +185,9 @@ Inline methods can override other non-inline methods. 
The rules are as follows: a.f() // error: cannot inline f() in A. ``` -### Relationship to @inline +### Relationship to `@inline` -Scala also defines a `@inline` annotation which is used as a hint +Scala 2 also defines a `@inline` annotation which is used as a hint for the backend to inline. The `inline` modifier is a more powerful option: Expansion is guaranteed instead of best effort, it happens in the frontend instead of in the backend, and it also applies @@ -223,8 +226,7 @@ pure expressions of constant type. #### The definition of constant expression Right-hand sides of inline values and of arguments for inline parameters must be -constant expressions in the sense defined by the [SLS § -6.24](https://www.scala-lang.org/files/archive/spec/2.12/06-expressions.html#constant-expressions), +constant expressions in the sense defined by the [SLS §6.24](https://www.scala-lang.org/files/archive/spec/2.13/06-expressions.html#constant-expressions), including _platform-specific_ extensions such as constant folding of pure numeric computations. @@ -498,8 +500,7 @@ val conjunction: true && true = true val multiplication: 3 * 5 = 15 ``` -Many of these singleton operation types are meant to be used infix (as in [SLS § -3.2.8](https://www.scala-lang.org/files/archive/spec/2.12/03-types.html#infix-types)). +Many of these singleton operation types are meant to be used infix (as in [SLS §3.2.10](https://www.scala-lang.org/files/archive/spec/2.13/03-types.html#infix-types)). 
Since type aliases have the same precedence rules as their term-level equivalents, the operations compose with the expected precedence rules: diff --git a/docs/docs/reference/metaprogramming/macros-spec.md b/docs/docs/reference/metaprogramming/macros-spec.md index 9e10c15c95cf..ffa4bb874dc6 100644 --- a/docs/docs/reference/metaprogramming/macros-spec.md +++ b/docs/docs/reference/metaprogramming/macros-spec.md @@ -7,7 +7,7 @@ title: "Macros Spec" ### Syntax -Compared to the [Dotty reference grammar](../../internals/syntax.md) +Compared to the [Dotty reference grammar](../syntax.md) there are the following syntax changes: ``` SimpleExpr ::= ... diff --git a/docs/docs/reference/metaprogramming/macros.md b/docs/docs/reference/metaprogramming/macros.md index 163eff7b5c0e..ed282fe59741 100644 --- a/docs/docs/reference/metaprogramming/macros.md +++ b/docs/docs/reference/metaprogramming/macros.md @@ -3,7 +3,7 @@ layout: doc-page title: "Macros" --- -### Macros: Quotes and Splices +## Macros: Quotes and Splices Macros are built on two well-known fundamental operations: quotation and splicing. Quotation is expressed as `'{...}` for expressions and as `'[...]` @@ -73,7 +73,7 @@ ${'[T]} = T '[${T}] = T ``` -### Types for Quotations +## Types for Quotations The type signatures of quotes and splices can be described using two fundamental types: @@ -100,7 +100,7 @@ these types are provided by the system. One way to construct values of these types is by quoting, the other is by type-specific lifting operations that will be discussed later on. -### The Phase Consistency Principle +## The Phase Consistency Principle A fundamental *phase consistency principle* (PCP) regulates accesses to free variables in quoted and spliced code: @@ -137,7 +137,7 @@ environment, with some restrictions and caveats since such accesses involve serialization. However, this does not constitute a fundamental gain in expressiveness. 
-### From `Expr`s to Functions and Back +## From `Expr`s to Functions and Back It is possible to convert any `Expr[T => R]` into `Expr[T] => Expr[R]` and back. These conversions can be implemented as follows: @@ -184,7 +184,7 @@ result of beta-reducing `f(x)` if `f` is a known lambda expression. Expr.betaReduce(_): Expr[(T1, ..., Tn) => R] => ((Expr[T1], ..., Expr[Tn]) => Expr[R]) ``` -### Lifting Types +## Lifting Types Types are not directly affected by the phase consistency principle. It is possible to use types defined at any level in any other level. @@ -225,7 +225,7 @@ to the context bound `: Type`), and the reference to that value is phase-correct. If that was not the case, the phase inconsistency for `T` would be reported as an error. -### Lifting Expressions +## Lifting Expressions Consider the following implementation of a staged interpreter that implements a compiler through staging. @@ -357,7 +357,7 @@ def showExpr[T](expr: Expr[T])(using Quotes): Expr[String] = { That is, the `showExpr` method converts its `Expr` argument to a string (`code`), and lifts the result back to an `Expr[String]` using `Expr.apply`. -### Lifting Types +## Lifting Types The previous section has shown that the metaprogramming framework has to be able to take a type `T` and convert it to a type tree of type @@ -388,7 +388,7 @@ In fact Scala 2's type tag feature can be understood as a more ad-hoc version of `quoted.Type`. As was the case for type tags, the implicit search for a `quoted.Type` is handled by the compiler, using the algorithm sketched above. -### Relationship with Inline +## Relationship with Inline Seen by itself, principled metaprogramming looks more like a framework for runtime metaprogramming than one for compile-time metaprogramming with macros. 
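As a compact sketch of how quotation, splicing and lifting interact in practice (`staticLength` is a hypothetical macro, not part of any library; `Expr(...)` lifts a value into code, `.value` unlifts a statically-known one):

```scala
import scala.quoted.*

// Hypothetical macro: if the argument is a statically-known string,
// lift its length as a compile-time constant; otherwise splice code
// that computes the length at runtime.
inline def staticLength(inline s: String): Int = ${ lengthImpl('s) }

private def lengthImpl(s: Expr[String])(using Quotes): Expr[Int] =
  s.value match {
    case Some(str) => Expr(str.length) // lifting: Int => Expr[Int]
    case None      => '{ ($s).length } // quote containing a splice of s
  }
```

Because macros are expanded at compile time, `staticLength` must be compiled before (in a separate compilation run from) its call sites.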
@@ -488,7 +488,7 @@ private def powerCode(x: Expr[Double], n: Int)(using Quotes): Expr[Double] = else '{ $x * ${ powerCode(x, n - 1) } } ``` -### Scope Extrusion +## Scope Extrusion Quotes and splices are duals as far as the PCP is concerned. But there is an additional restriction that needs to be imposed on splices to guarantee @@ -531,7 +531,7 @@ appearing in splices. In a base language with side effects we would have to do t anyway: Since `run` runs arbitrary code it can always produce a side effect if the code it runs produces one. -### Example Expansion +## Example Expansion Assume we have two methods, one `map` that takes an `Expr[Array[T]]` and a function `f` and one `sum` that performs a sum by delegating to `map`. @@ -615,7 +615,7 @@ while (i < arr.length) { sum ``` -### Find implicits within a macro +## Find implicits within a macro Similarly to the `summonFrom` construct, it is possible to make implicit search available in a quote context. For this we simply provide `scala.quoted.Expr.summon`: @@ -631,7 +631,7 @@ def setForExpr[T: Type](using Quotes): Expr[Set[T]] = { } ``` -### Relationship with Whitebox Inline +## Relationship with Whitebox Inline [Inline](./inline.md) documents inlining. The code below introduces a whitebox inline method that can calculate either a value of type `Int` or a value of type @@ -651,7 +651,7 @@ val b: String = defaultOf("string") ``` -### Defining a macro and using it in a single project +## Defining a macro and using it in a single project It is possible to define macros and use them in the same project as long as the implementation of the macros does not have run-time dependencies on code in the file where it is used. @@ -663,7 +663,7 @@ If there are any suspended files when the compilation ends, the compiler will au compilation of the suspended files using the output of the previous (partial) compilation as macro classpath. 
In case all files are suspended due to cyclic dependencies the compilation will fail with an error. -### Pattern matching on quoted expressions +## Pattern matching on quoted expressions It is possible to deconstruct or extract values out of `Expr` using pattern matching. @@ -691,7 +691,7 @@ private def sumExpr(argsExpr: Expr[Seq[Int]])(using Quotes): Expr[Int] = argsExp } ``` -#### Quoted patterns +### Quoted patterns Quoted pattens allow deconstructing complex code that contains a precise structure, types or methods. Patterns `'{ ... }` can be placed in any location where Scala expects a pattern. @@ -728,7 +728,7 @@ private def sumExpr(args1: Seq[Expr[Int]])(using Quotes): Expr[Int] = { } ``` -#### Recovering precise types using patterns +### Recovering precise types using patterns Sometimes it is necessary to get a more precise type for an expression. This can be achived using the following pattern match. @@ -772,7 +772,7 @@ trait Show[-T] { } ``` -#### Open code patterns +### Open code patterns Quote pattern matching also provides higher-order patterns to match open terms. If a quoted term contains a definition, then the rest of the quote can refer to this definition. @@ -820,6 +820,6 @@ eval { // expands to the code: (16: Int) We can also close over several bindings using `$b(a1, a2, ..., an)`. To match an actual application we can use braces on the function part `${b}(a1, a2, ..., an)`. -### More details +## More details [More details](./macros-spec.md) diff --git a/docs/docs/reference/metaprogramming/toc.md b/docs/docs/reference/metaprogramming/toc.md index edf512f8caa4..773491e67583 100644 --- a/docs/docs/reference/metaprogramming/toc.md +++ b/docs/docs/reference/metaprogramming/toc.md @@ -6,7 +6,7 @@ title: "Overview" The following pages introduce the redesign of metaprogramming in Scala. They introduce the following fundamental facilities: -1. [Inline](./inline.md) `inline` is a new modifier that guarantees that +1. 
[`inline`](./inline.md) is a new modifier that guarantees that a definition will be inlined at the point of use. The primary motivation behind inline is to reduce the overhead behind function calls and access to values. The expansion will be performed by the Scala compiler during the @@ -17,7 +17,7 @@ introduce the following fundamental facilities: programming), macros (enabling compile-time, generative, metaprogramming) and runtime code generation (multi-stage programming). -2. [Macros](./macros.md) Macros are built on two well-known fundamental +2. [Macros](./macros.md) are built on two well-known fundamental operations: quotation and splicing. Quotation converts program code to data, specifically, a (tree-like) representation of this code. It is expressed as `'{...}` for expressions and as `'[...]` for types. Splicing, diff --git a/docs/docs/reference/other-new-features/explicit-nulls.md b/docs/docs/reference/other-new-features/explicit-nulls.md index 40e43219af1c..8abc2fd6c087 100644 --- a/docs/docs/reference/other-new-features/explicit-nulls.md +++ b/docs/docs/reference/other-new-features/explicit-nulls.md @@ -71,7 +71,7 @@ y == x // ok (x: Any) == null // ok ``` -## Working with Null +## Working with `Null` To make working with nullable values easier, we propose adding a few utilities to the standard library. So far, we have found the following useful: @@ -89,9 +89,9 @@ So far, we have found the following useful: Don't use `.nn` on mutable variables directly, because it may introduce an unknown type into the type of the variable. -## Java Interop +## Java Interoperability -The compiler can load Java classes in two ways: from source or from bytecode. In either case, +The Scala compiler can load Java classes in two ways: from source or from bytecode. In either case, when a Java class is loaded, we "patch" the type of its members to reflect that Java types remain implicitly nullable. 
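A short sketch of what this looks like from the user's side (the commented-out line is rejected only when the `-Yexplicit-nulls` flag is enabled; without it the code still compiles and runs):

```scala
// Sketch: nullable values are expressed as explicit unions with `Null`.
val s: String | Null = "hello"
// val t: String = null          // error under -Yexplicit-nulls

def describe(x: String | Null): String =
  if (x != null) x               // flow typing: x has type String here
  else "<null>"

// `.nn` (from Predef) asserts non-nullness, throwing if the value is null.
val forced: String = s.nn
```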
@@ -223,15 +223,17 @@ Specifically, we patch } ``` - The annotation must be from the list below to be recognized as NotNull by the compiler. + The annotation must be from the list below to be recognized as `NotNull` by the compiler. Check `Definitions.scala` for an updated list. ```scala - // A list of annotations that are commonly used to indicate that a field/method argument or return - // type is not null. These annotations are used by the nullification logic in JavaNullInterop to - // improve the precision of type nullification. - // We don't require that any of these annotations be present in the class path, but we want to - // create Symbols for the ones that are present, so they can be checked during nullification. + // A list of annotations that are commonly used to indicate + // that a field/method argument or return type is not null. + // These annotations are used by the nullification logic in + // JavaNullInterop to improve the precision of type nullification. + // We don't require that any of these annotations be present + // in the class path, but we want to create Symbols for the + // ones that are present, so they can be checked during nullification. @tu lazy val NotNullAnnots: List[ClassSymbol] = ctx.getClassesIfDefined( "javax.annotation.Nonnull" :: "edu.umd.cs.findbugs.annotations.NonNull" :: @@ -391,42 +393,42 @@ while (xs != null) { When dealing with local mutable variables, there are two questions: 1. Whether to track a local mutable variable during flow typing. - We track a local mutable variable iff the variable is not assigned in a closure. - For example, in the following code `x` is assigned to by the closure `y`, so we do not - do flow typing on `x`. - - ```scala - var x: String|Null = ??? 
- def y = { - x = null - } - if (x != null) { - // y can be called here, which would break the fact - val a: String = x // error: x is captured and mutated by the closure, not trackable - } - ``` + We track a local mutable variable iff the variable is not assigned in a closure. + For example, in the following code `x` is assigned to by the closure `y`, so we do not + do flow typing on `x`. + + ```scala + var x: String|Null = ??? + def y = { + x = null + } + if (x != null) { + // y can be called here, which would break the fact + val a: String = x // error: x is captured and mutated by the closure, not trackable + } + ``` 2. Whether to generate and use flow typing on a specific _use_ of a local mutable variable. - We only want to do flow typing on a use that belongs to the same method as the definition - of the local variable. - For example, in the following code, even `x` is not assigned to by a closure, but we can only - use flow typing in one of the occurrences (because the other occurrence happens within a nested - closure). - - ```scala - var x: String|Null = ??? - def y = { + We only want to do flow typing on a use that belongs to the same method as the definition + of the local variable. + For example, in the following code, even though `x` is not assigned to by a closure, we can only + use flow typing in one of the occurrences (because the other occurrence happens within a nested + closure). + + ```scala + var x: String|Null = ???
+ def y = { + if (x != null) { + // not safe to use the fact (x != null) here + // since y can be executed at the same time as the outer block + val _: String = x + } + } if (x != null) { - // not safe to use the fact (x != null) here - // since y can be executed at the same time as the outer block - val _: String = x + val a: String = x // ok to use the fact here + x = null } - } - if (x != null) { - val a: String = x // ok to use the fact here - x = null - } - ``` + ``` See more examples in `tests/explicit-nulls/neg/var-ref-in-closure.scala`. diff --git a/docs/docs/reference/other-new-features/indentation.md b/docs/docs/reference/other-new-features/indentation.md index 5ee17a411246..13b4a5532a77 100644 --- a/docs/docs/reference/other-new-features/indentation.md +++ b/docs/docs/reference/other-new-features/indentation.md @@ -63,9 +63,10 @@ There are two rules: - after a ": at end of line" token (see below) - after one of the following tokens: - ``` - = => <- if then else while do try catch finally for yield match return - ``` + ``` + = => <- catch do else finally for + if match return then try while yield + ``` If an `` is inserted, the indentation width of the token on the next line is pushed onto `IW`, which makes it the new current indentation width. @@ -80,7 +81,7 @@ There are two rules: then else do catch finally yield match ``` - the first token on the next line is not a - [leading infix operator](../changed-features/operators.html). + [leading infix operator](../changed-features/operators.md). If an `` is inserted, the top element is popped from `IW`. If the indentation width of the token on the next line is still less than the new current indentation width, step (2) repeats. 
Therefore, several `` tokens diff --git a/docs/docs/reference/overview.md b/docs/docs/reference/overview.md index 86b9fd9cf22d..4f10f7d81d83 100644 --- a/docs/docs/reference/overview.md +++ b/docs/docs/reference/overview.md @@ -48,7 +48,7 @@ These constructs replace existing constructs with the aim of making the language With the exception of early initializers and old-style vararg patterns, all superseded constructs continue to be available in Scala 3.0. The plan is to deprecate and phase them out later. -Value classes (superseded by opaque type aliases) are a special case. There are currently no deprecation plans for value classes, since we might want to bring them back in a more general form if they are supported natively by the JVM as is planned by project Valhalla. +Value classes (superseded by opaque type aliases) are a special case. There are currently no deprecation plans for value classes, since we might want to bring them back in a more general form if they are supported natively by the JVM as is planned by [project Valhalla](https://openjdk.java.net/projects/valhalla/). ## Restrictions @@ -109,11 +109,11 @@ These are additions to the language that make it more powerful or pleasant to us - [Dependent Function Types](new-types/dependent-function-types.md) generalize dependent methods to dependent function values and types. - [Polymorphic Function Types](new-types/polymorphic-function-types.md) generalize polymorphic methods to polymorphic function values and types. _Current status_: There is a proposal and a merged prototype implementation, but the implementation has not been finalized (it is notably missing type inference support). - [Kind Polymorphism](other-new-features/kind-polymorphism.md) allows the definition of operators working equally on types and type constructors. 
- - [@targetName Annotations](other-new-features/targetName.md) make it easier to interoperate with code written in other languages and give more flexibility for avoiding name clashes. + - [`@targetName` Annotations](other-new-features/targetName.md) make it easier to interoperate with code written in other languages and give more flexibility for avoiding name clashes. ## Metaprogramming -The following constructs together aim to put metaprogramming in Scala on a new basis. So far, metaprogramming was achieved by a combination of macros and libraries such as Shapeless that were in turn based on some key macros. Current Scala 2 macro mechanisms are a thin veneer on top the current Scala 2 compiler, which makes them fragile and in many cases impossible to port to Scala 3. +The following constructs together aim to put metaprogramming in Scala on a new basis. So far, metaprogramming was achieved by a combination of macros and libraries such as [Shapeless](https://github.com/milessabin/shapeless) that were in turn based on some key macros. Current Scala 2 macro mechanisms are a thin veneer on top of the current Scala 2 compiler, which makes them fragile and in many cases impossible to port to Scala 3. It's worth noting that macros were never included in the Scala 2 language specification and were so far made available only under an `-experimental` flag. This has not prevented their widespread usage. diff --git a/docs/docs/reference/soft-modifier.md b/docs/docs/reference/soft-modifier.md index 14617cb0cbd3..24bcfc970890 100644 --- a/docs/docs/reference/soft-modifier.md +++ b/docs/docs/reference/soft-modifier.md @@ -12,4 +12,3 @@ Worth maintaining? or maybe better refer to internal/syntax.md ? It is treated as a potential modifier of a definition, if it is followed by a hard modifier or a keyword combination starting a definition (`def`, `val`, `var`, `type`, `class`, `case class`, `trait`, `object`, `case object`, `enum`).
Between the two words there may be a sequence of newline tokens and soft modifiers. It is treated as a potential modifier of a parameter binding unless it is followed by `:`. - diff --git a/docs/docs/reference/syntax.md b/docs/docs/reference/syntax.md index 073a23649703..c46d150ba836 100644 --- a/docs/docs/reference/syntax.md +++ b/docs/docs/reference/syntax.md @@ -90,14 +90,14 @@ colonEol ::= ": at end of line that can start a template body" ### Regular keywords ``` -abstract case catch class def do else enum -export extends false final finally for given if -implicit import lazy match new null object package -private protected override return super sealed then throw -trait true try type val var while with -yield -: = <- => <: :> # @ -=>> ?=> +abstract case catch class def do else +enum export extends false final finally for +given if implicit import lazy match new +null object package private protected override return +super sealed then throw trait true try +type val var while with yield +: = <- => <: >: # +@ =>> ?=> ``` ### Soft keywords