docs/docs/reference/changed-features/implicit-resolution.md (+3 −3)

@@ -24,7 +24,7 @@ where the type may still be inferred:
   ...
 }
 ```
-**2.** Nesting is now taken into account for selecting an implicit. Consider for instance the following scenario:
+**2.** Nesting is now taken into account for selecting an implicit. Consider for instance the following scenario:
 ```scala
 def f(implicit i: C) = {
   def g(implicit j: C) = {
@@ -55,7 +55,7 @@ have only `b` in its implicit search scope but not `a`.
 In more detail, here are the rules for what constitutes the implicit scope of
 a type:

-**Definition:** A reference is an _anchor_ if it refers to an object, a class, a trait, an abstract type, an opaque type alias, or a match type alias. References to packages and package objects are anchors only under -source:3.0-migration.
+**Definition:** A reference is an _anchor_ if it refers to an object, a class, a trait, an abstract type, an opaque type alias, or a match type alias. References to packages and package objects are anchors only under `-source:3.0-migration`.

 **Definition:** The _anchors_ of a type _T_ is a set of references defined as follows:
@@ -122,7 +122,7 @@ most (but not all) divergence errors in Scala 2 would terminate the implicit search
   def buzz(y: A) = ???
   buzz(1)   // error: ambiguous
 ```
-**7.** The rule for picking a _most specific_ alternative among a set of overloaded or implicit alternatives is refined to take context parameters into account. All else being equal, an alternative that takes some context parameters is taken to be less specific than an alternative that takes none. If both alternatives take context parameters, we try to choose between them as if they were methods with regular parameters. The following paragraph in the SLS is affected by this change:
+**7.** The rule for picking a _most specific_ alternative among a set of overloaded or implicit alternatives is refined to take context parameters into account. All else being equal, an alternative that takes some context parameters is taken to be less specific than an alternative that takes none. If both alternatives take context parameters, we try to choose between them as if they were methods with regular parameters. The following paragraph in the [SLS §6.26.3](https://scala-lang.org/files/archive/spec/2.13/06-expressions.html#overloading-resolution) is affected by this change:
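For reference, a minimal sketch of the nesting rule (point **2.** above); the class `C`, the method bodies, and the `@main` runner are assumptions added only for illustration:

```scala
// Rule 2 sketch: the more deeply nested implicit is selected.
class C(val name: String)

def f(implicit i: C) = {
  def g(implicit j: C) =
    implicitly[C]        // resolves to `j`, the innermost eligible implicit
  g(new C("inner"))      // Scala 2 reported an ambiguity between `i` and `j` here
}

@main def nestingDemo(): Unit =
  println(f(new C("outer")).name)   // prints "inner"
```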
docs/docs/reference/changed-features/interpolation-escapes.md (+1 −1)

@@ -3,7 +3,7 @@ layout: doc-page
 title: Escapes in interpolations
 ---

-In Scala 2 there was no straightforward way to represent a single quote character `"` in a single quoted interpolation. A \ character can't be used for that because interpolators themselves decide how to handle escaping, so the parser doesn't know whether the " should be escaped or used as a terminator.
+In Scala 2 there was no straightforward way to represent a single quote character `"` in a single quoted interpolation. A `\` character can't be used for that because interpolators themselves decide how to handle escaping, so the parser doesn't know whether the `"` should be escaped or used as a terminator.

 In Dotty, you can use the `$` meta character of interpolations to escape a `"` character.
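A small sketch of the escape in use; the values are made up for this example:

```scala
val name = "Ada"
val greeting = s"$name said $"hello$" to everyone"
// `$"` yields a literal double quote, so greeting == "Ada said \"hello\" to everyone"
```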
docs/docs/reference/changed-features/match-syntax.md (+2)

@@ -16,6 +16,7 @@ The syntactical precedence of match expressions has been changed.
      case "empty" => 0
      case "nonempty" => 1
    }
+   ```

 2. `match` may follow a period:

@@ -26,6 +27,7 @@ The syntactical precedence of match expressions has been changed.
      }
    then "nonempty"
    else "empty"
+   ```

 3. The scrutinee of a match expression must be an `InfixExpr`. Previously the scrutinee could be followed by a type ascription `: T`, but this is no longer supported. So `x : T match { ... }` now has to be
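For context, a short sketch of the first two points described on that page; `xs` and the literals are made up for this example:

```scala
val xs = List(1, 2, 3)

// 2. `match` may follow a period:
val nonEmpty: Boolean = xs.match {
  case Nil => false
  case _   => true
}

// 1. match expressions can be chained:
val code = xs match {
  case Nil => "empty"
  case _   => "nonempty"
} match {
  case "empty"    => 0
  case "nonempty" => 1
}
```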
docs/docs/reference/changed-features/operators.md (+4 −4)

@@ -47,7 +47,7 @@ one of the following conditions holds:
 - the operator is followed by an opening brace.

 An alphanumeric operator is an operator consisting entirely of letters, digits, the `$` and `_` characters, or
-any unicode character `c` for which `java.lang.Character.isIdentifierPart(c)` returns `true`.
+any Unicode character `c` for which `java.lang.Character.isIdentifierPart(c)` returns `true`.

 Infix operations involving symbolic operators are always allowed, so `infix` is redundant for methods with symbolic names.

@@ -89,9 +89,9 @@ The purpose of the `infix` modifier is to achieve consistency across a code base
 5. To smooth migration to Scala 3.0, alphanumeric operators will only be deprecated from Scala 3.1 onwards,
    or if the `-source 3.1` option is given in Dotty/Scala 3.

-## The @targetName Annotation
+## The `@targetName` Annotation

-It is recommended that definitions of symbolic operators carry a [@targetName annotation](../other-new-features/targetName.html) that provides an encoding of the operator with an alphanumeric name. This has several benefits:
+It is recommended that definitions of symbolic operators carry a [`@targetName` annotation](../other-new-features/targetName.md) that provides an encoding of the operator with an alphanumeric name. This has several benefits:

 - It helps interoperability between Scala and other languages. One can call
   a Scala-defined symbolic operator from another language using its target name,
@@ -115,7 +115,7 @@ def condition =
   || xs.exists(_ > 0)
   || xs.isEmpty
 ```
-Previously, these expressions would have been rejected, since the compiler's semicolon inference
+Previously, those expressions would have been rejected, since the compiler's semicolon inference
 would have treated the continuations `++ " world"` or `|| xs.isEmpty` as separate statements.

 To make this syntax work, the rules are modified to not infer semicolons in front of leading infix operators.
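As a hedged sketch of the `@targetName` recommendation, assuming a small `Vec` wrapper and the target name `add` invented for this example:

```scala
import scala.annotation.targetName

final case class Vec(elems: List[Double]):
  // `add` is the name used in the generated code, which eases calls from other JVM languages.
  @targetName("add")
  def +|+ (other: Vec): Vec = Vec(elems.zip(other.elems).map(_ + _))

val sum = Vec(List(1.0, 2.0)) +|+ Vec(List(3.0, 4.0))   // Vec(List(4.0, 6.0))
```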
docs/docs/reference/changed-features/overload-resolution.md (+4 −4)

@@ -13,8 +13,8 @@ are in the first argument list.

 Overloading resolution now can take argument lists into account when
 choosing among a set of overloaded alternatives.
-For example, the following code compiles in Dotty, while it results in an
-ambiguous overload error in Scala2:
+For example, the following code compiles in Scala 3, while it results in an
+ambiguous overload error in Scala 2:

 ```scala
 def f(x: Int)(y: String): Int = 0
@@ -33,7 +33,7 @@ g(2)(3)(4) // ok
 g(2)(3)("") // ok
 ```

-To make this work, the rules for overloading resolution in section 6.23.3 of the SLS are augmented
+To make this work, the rules for overloading resolution in [SLS §6.26.3](https://www.scala-lang.org/files/archive/spec/2.13/06-expressions.html#overloading-resolution) are augmented
 as follows:

 > In a situation where a function is applied to more than one argument list, if overloading

-To make this work, the rules for overloading resolution in section 6.23.3 of the SLS are modified
+To make this work, the rules for overloading resolution in [SLS §6.26.3](https://www.scala-lang.org/files/archive/spec/2.13/06-expressions.html#overloading-resolution) are modified
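To illustrate the second refinement (letting a function value with missing parameter types participate in overload resolution), here is a minimal sketch; the overloads of `g` are assumptions made for this example:

```scala
// The missing parameter type of the function literal is inferred from the
// overloaded alternative that the other argument selects.
def g(x: Int, op: Int => Int): Int = op(x)
def g(x: String, op: String => String): String = op(x)

val a = g(2, _ * 2)             // Int overload: `_ * 2` gets type Int => Int
val b = g("hi", _.toUpperCase)  // String overload: `_.toUpperCase` gets type String => String
```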
docs/docs/reference/changed-features/pattern-bindings.md (+2 −2)

@@ -25,13 +25,13 @@ want to decompose it like this:
 ```scala
 val first :: rest = elems   // error
 ```
-This works in Scala 2. In fact it is a typical use case for Scala 2's rules. But in Scala 3.1 it will give a type error. One can avoid the error by marking the pattern with an @unchecked annotation:
+This works in Scala 2. In fact it is a typical use case for Scala 2's rules. But in Scala 3.1 it will give a type error. One can avoid the error by marking the pattern with an `@unchecked` annotation:
 ```scala
 val first :: rest : @unchecked = elems   // OK
 ```
 This will make the compiler accept the pattern binding. It might give an error at runtime instead, if the underlying assumption that `elems` can never be empty is wrong.

-## Pattern Bindings in For Expressions
+## Pattern Bindings in `for` Expressions

 Analogous changes apply to patterns in `for` expressions. For instance:
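A sketch of the `for` case; the `elems` value below is made up for illustration:

```scala
val elems: List[Any] = List((1, 2), "hello", (3, 4))

// The pair pattern does not cover every element of `elems`, so under the new rules
// the generator is prefixed with `case`, which filters out non-matching elements.
val swapped = for case (x, y) <- elems yield (y, x)   // List((2, 1), (4, 3))
```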
docs/docs/reference/changed-features/pattern-matching.md (+6 −5)

@@ -3,9 +3,9 @@ layout: doc-page
 title: "Option-less pattern matching"
 ---

-Dotty implementation of pattern matching was greatly simplified compared to scalac. From a user perspective, this means that Dotty generated patterns are a *lot* easier to debug, as variables all show up in debug modes and positions are correctly preserved.
+Dotty implementation of pattern matching was greatly simplified compared to Scala 2. From a user perspective, this means that Scala 3 generated patterns are a *lot* easier to debug, as variables all show up in debug modes and positions are correctly preserved.

-Dotty supports a superset of scalac's [extractors](https://www.scala-lang.org/files/archive/spec/2.13/08-pattern-matching.html#extractor-patterns).
+Dotty supports a superset of Scala 2 [extractors](https://www.scala-lang.org/files/archive/spec/2.13/08-pattern-matching.html#extractor-patterns).

 ## Extractors

@@ -54,7 +54,7 @@ A usage of a fixed-arity extractor is irrefutable if one of the following conditions holds:

 - `U = true`
 - the extractor is used as a product match
-- `U = Some[T]` (for Scala2 compatibility)
+- `U = Some[T]` (for Scala 2 compatibility)
 - `U <: R` and `U <: { def isEmpty: false }`

 ### Variadic Extractors
@@ -84,7 +84,7 @@ and `S` conforms to one of the two matches above.
 The former form of `unapplySeq` has higher priority, and _sequence match_ has higher
 precedence over _product-sequence match_.

-A usage of a variadic extractor is irrefutable if one of the following condition holds:
+A usage of a variadic extractor is irrefutable if one of the following conditions holds:

 - the extractor is used directly as a sequence match or product-sequence match
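For context, a minimal sketch of an option-less, name-based fixed-arity extractor; `Nat` and `describe` are names invented for this example:

```scala
// The unapply result type only needs `isEmpty` and `get`; no Option is allocated.
class Nat(val x: Int):
  def isEmpty: Boolean = x < 0
  def get: Int = x

object Nat:
  def unapply(x: Int): Nat = new Nat(x)

def describe(n: Int): String = n match
  case Nat(value) => s"$value is a natural number"
  case _          => "not a natural number"
```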
docs/docs/reference/changed-features/structural-types.md (+1 −1)

@@ -42,7 +42,7 @@ Here's an example of a structural type `Person`:
   val age: Int
 }
 ```
-The person type adds a _refinement_ to its parent type `Record` that defines `name` and `age` fields. We say the refinement is _structural_ since `name` and `age` are not defined in the parent type. But they exist nevertheless as members of class `Person`. For instance, the following
+The type `Person` adds a _refinement_ to its parent type `Record` that defines the two fields `name` and `age`. We say the refinement is _structural_ since `name` and `age` are not defined in the parent type. But they exist nevertheless as members of class `Person`. For instance, the following
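A self-contained sketch of the `Person` example, assuming a `Selectable`-based `Record` class along the lines used on that page:

```scala
class Record(elems: (String, Any)*) extends Selectable:
  private val fields = elems.toMap
  // Structural member selections on a refinement of Record go through selectDynamic.
  def selectDynamic(name: String): Any = fields(name)

type Person = Record { val name: String; val age: Int }

val person = Record("name" -> "Emma", "age" -> 42).asInstanceOf[Person]
val greeting = s"${person.name} is ${person.age} years old"
```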
docs/docs/reference/contextual/givens.md (+3 −3)

@@ -4,7 +4,7 @@ title: "Given Instances"
 ---

 Given instances (or, simply, "givens") define "canonical" values of certain types
-that serve for synthesizing arguments to [context parameters](./using-clauses.html). Example:
+that serve for synthesizing arguments to [context parameters](./using-clauses.md). Example:

 ```scala
 trait Ord[T] {
@@ -34,7 +34,7 @@ for `Ord[List[T]]` for all types `T` that come with a given instance for `Ord[T]`
 themselves. The `using` clause in `listOrd` defines a condition: There must be a
 given of type `Ord[T]` for a given of type `Ord[List[T]]` to exist.
 Such conditions are expanded by the compiler to [context
-parameters](./using-clauses.html).
+parameters](./using-clauses.md).

 ## Anonymous Givens

@@ -108,7 +108,7 @@ In each case, a pattern-bound given instance consists of `given` and a type `T`.

 ## Negated Givens

-Scala 2's somewhat puzzling behavior with respect to ambiguity has been exploited to implement the analogue of a "negated" search in implicit resolution, where a query Q1 fails if some other query Q2 succeeds and Q1 succeeds if Q2 fails. With the new cleaned up behavior these techniques no longer work. But there is now a new special type `scala.util.NotGiven` which implements negation directly.
+Scala 2's somewhat puzzling behavior with respect to ambiguity has been exploited to implement the analogue of a "negated" search in implicit resolution, where a query Q1 fails if some other query Q2 succeeds and Q1 succeeds if Q2 fails. With the new cleaned up behavior these techniques no longer work. But the new special type `scala.util.NotGiven` now implements negation directly.

 For any query type `Q`, `NotGiven[Q]` succeeds if and only if the implicit
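A hedged sketch of `NotGiven` in action; `Tagged`, `Foo`, and the given names are assumptions made for this example:

```scala
import scala.util.NotGiven

trait Tagged[A]
case class Foo[A](value: Boolean)

given fooTagged[A](using Tagged[A]): Foo[A] = Foo(true)
given fooNotTagged[A](using NotGiven[Tagged[A]]): Foo[A] = Foo(false)

given Tagged[Int] = new Tagged[Int] {}

@main def notGivenDemo(): Unit =
  assert(summon[Foo[Int]].value)       // Tagged[Int] exists, so fooTagged is selected
  assert(!summon[Foo[String]].value)   // no Tagged[String], so NotGiven[Tagged[String]] succeeds
```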
docs/docs/reference/contextual/motivation.md (+1 −1)

@@ -8,7 +8,7 @@ title: "Overview"
 Scala's implicits are its most distinguished feature. They are _the_ fundamental way to abstract over context. They represent a unified paradigm with a great variety of use cases, among them: implementing type classes, establishing context, dependency injection, expressing capabilities, computing new types and proving relationships between them.

 Following Haskell, Scala was the second popular language to have some form of implicits. Other languages have followed suit. E.g. Rust's traits or Swift's protocol extensions. Design proposals are also on the table for Kotlin as [compile time dependency resolution](https://github.com/Kotlin/KEEP/blob/e863b25f8b3f2e9b9aaac361c6ee52be31453ee0/proposals/compile-time-dependency-resolution.md), for C# as [Shapes and Extensions](https://github.com/dotnet/csharplang/issues/164)
-or for F# as [Traits](https://github.com/MattWindsor91/visualfsharp/blob/hackathon-vs/examples/fsconcepts.md). Implicits are also a common feature of theorem provers such as Coq or Agda.
+or for F# as [Traits](https://github.com/MattWindsor91/visualfsharp/blob/hackathon-vs/examples/fsconcepts.md). Implicits are also a common feature of theorem provers such as Coq or [Agda](https://agda.readthedocs.io/en/latest/language/implicit-arguments.html).

 Even though these designs use widely different terminology, they are all variants of the core idea of _term inference_. Given a type, the compiler synthesizes a "canonical" term that has that type. Scala embodies the idea in a purer form than most other languages: An implicit parameter directly leads to an inferred argument term that could also be written down explicitly. By contrast, type class based designs are less direct since they hide term inference behind some form of type classification and do not offer the option of writing the inferred quantities (typically, dictionaries) explicitly.
 That's a first step, but in practice we probably would like the `map` function to be a method directly accessible on the type `F`. So that we can call `map` directly on instances of `F`, and get rid of the `summon[Functor[F]]` part.
-As in the previous example of Monoids, [`extension` methods](extension-methods.html) help achieving that. Let's re-define the `Functor` type class with extension methods.
+As in the previous example of Monoids, [`extension` methods](extension-methods.md) help achieving that. Let's re-define the `Functor` type class with extension methods.

 ```scala
 trait Functor[F[_]]:
@@ -234,7 +234,7 @@ given configDependentMonad: Monad[ConfigDependent] with
 end configDependentMonad
 ```

-The type `ConfigDependent` can be written using [type lambdas](../new-types/type-lambdas.html):
+The type `ConfigDependent` can be written using [type lambdas](../new-types/type-lambdas.md):
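For context on that last line, a short sketch of a type-lambda alias of this shape; the `Config` class here is an assumption for the example:

```scala
case class Config(port: Int)

// A type constructor written as a type lambda: ConfigDependent[Result] = Config => Result
type ConfigDependent = [Result] =>> Config => Result

val readPort: ConfigDependent[Int] = config => config.port
```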