[docs/reference] more fixes in Markdown files #11062

Merged: 1 commit, Jan 11, 2021

18 changes: 6 additions & 12 deletions docs/docs/reference/changed-features/implicit-conversions-spec.md
@@ -81,9 +81,7 @@ implicit val myConverter: Int => String = _.toString
implicit val myConverter: Conversion[Int, String] = _.toString
```

Note that implicit conversions are also affected by the
[changes to implicit resolution](implicit-resolution.md) between Scala 2 and
Scala 3.
Note that implicit conversions are also affected by the [changes to implicit resolution](implicit-resolution.md) between Scala 2 and Scala 3.

## Motivation for the changes

@@ -99,24 +97,20 @@ val x: String = 1 // Scala 2: assigns "abc" to x
// Scala 3: type error
```

This snippet contains a type error. The right hand side of `val x`
This snippet contains a type error. The right-hand side of `val x`
does not conform to type `String`. In Scala 2, the compiler will use
`m` as an implicit conversion from `Int` to `String`, whereas Scala 3
will report a type error, because `Map` isn't an instance of
`Conversion`.
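
To keep the Scala 2 behaviour in Scala 3, the conversion can be made explicit by wrapping the map in a `Conversion` given. A minimal sketch (the given's name is invented here):

```scala
import scala.language.implicitConversions

val m = Map(1 -> "abc")
given intToString: Conversion[Int, String] = (i: Int) => m(i)

val x: String = 1   // the compiler inserts intToString(1), as Scala 2 did with `m`
```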

## Migration path

Implicit values that are used as views should see their type changed
to `Conversion`.
Implicit values that are used as views should see their type changed to `Conversion`.

For the migration of implicit conversions that are affected by the
changes to implicit resolution, refer to the [Changes in Implicit
Resolution](implicit-resolution.md) for more information.
changes to implicit resolution, refer to the [Changes in Implicit Resolution](implicit-resolution.md) for more information.

## Reference

For more information about implicit resolution, see [Changes in
Implicit Resolution](implicit-resolution.md).
Other details are available in
[PR #2065](https://github.com/lampepfl/dotty/pull/2065).
For more information about implicit resolution, see [Changes in Implicit Resolution](implicit-resolution.md).
Other details are available in [PR #2065](https://github.com/lampepfl/dotty/pull/2065).
2 changes: 1 addition & 1 deletion docs/docs/reference/changed-features/numeric-literals.md
@@ -199,7 +199,7 @@ BigFloat.FromDigits.fromDigits("1e100000000000")
Evaluating this expression throws a `NumberTooLarge` exception at run time. We would like it to
produce a compile-time error instead. We can achieve this by tweaking the `BigFloat` class
with a small dose of metaprogramming. The idea is to turn the `fromDigits` method
into a macro, i.e. make it an inline method with a splice as right hand side.
into a macro, i.e. make it an inline method with a splice as right-hand side.
To do this, replace the `FromDigits` instance in the `BigFloat` object by the following two definitions:

```scala
10 changes: 8 additions & 2 deletions docs/docs/reference/changed-features/overload-resolution.md
@@ -40,8 +40,8 @@ as follows:
resolution yields several competing alternatives when `n >= 1` parameter lists are taken
into account, then resolution is re-tried using `n + 1` argument lists.

This change is motivated by the new language feature [extension
methods](../contextual/extension-methods.md), where emerges the need to do
This change is motivated by the new language feature
[extension methods](../contextual/extension-methods.md), where the need arises to do
overload resolution based on additional argument blocks.
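
As an illustration, a small sketch (names invented here) of two alternatives that agree on their first parameter list and are disambiguated only by the second argument list:

```scala
def add(x: Int)(y: Int): Int = x + y
def add(x: Int)(y: String): String = s"$x $y"

val a = add(1)(2)       // both alternatives match `add(1)`; the second argument list picks the Int variant
val b = add(1)("two")   // likewise, the String variant is chosen here
```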

## Parameter Types of Function Values
@@ -51,12 +51,14 @@ pass such values in the first argument list of an overloaded application, provided
that the remaining parameters suffice for picking a variant of the overloaded function.
For example, the following code compiles in Scala 3, while it results in a
missing parameter type error in Scala 2:

```scala
def f(x: Int, f2: Int => Int) = f2(x)
def f(x: String, f2: String => String) = f2(x)
f("a", _.toUpperCase)
f(2, _ * 2)
```

To make this work, the rules for overloading resolution in [SLS §6.26.3](https://www.scala-lang.org/files/archive/spec/2.13/06-expressions.html#overloading-resolution) are modified
as follows:

@@ -75,11 +77,15 @@ is determined as follows:
- Otherwise the known type of `E` is the result of typing `E` with an undefined expected type.

A pattern matching closure

```scala
{ case P1 => B1 ... case P_n => B_n }
```

is treated as if it was expanded to the function value

```scala
x => x match { case P1 => B1 ... case P_n => B_n }
```

and is therefore also approximated with a `? => ?` type.
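
For example, in the following sketch (names invented here) the pattern-matching closure is approximated as `? => ?`, and the first argument alone selects the applicable alternative:

```scala
def g(x: Int, f: Int => Int): Int = f(x)
def g(x: Option[Int], f: Option[Int] => Int): Int = f(x)

// the closure is approximated as `? => ?`; the first argument already selects the
// second alternative, and the closure is then typed against Option[Int] => Int
val r = g(Some(1), {
  case Some(n) => n
  case None    => 0
})
```
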
4 changes: 2 additions & 2 deletions docs/docs/reference/changed-features/pattern-bindings.md
@@ -13,7 +13,7 @@ From Scala 3.1 on, type checking rules will be tightened so that warnings are re
```scala
val xs: List[Any] = List(1, 2, 3)
val (x: String) :: _ = xs // error: pattern's type String is more specialized
// than the right hand side expression's type Any
// than the right-hand side expression's type Any
```
This code gives a compile-time warning in Scala 3.1 (and also in Scala 3.0 under the `-source 3.1` setting), whereas it will fail at runtime with a `ClassCastException` in Scala 2. In Scala 3.1, a pattern binding is only allowed if the pattern is _irrefutable_, that is, if the right-hand side's type conforms to the pattern's type. For instance, the following is OK:
```scala
@@ -38,7 +38,7 @@ Analogous changes apply to patterns in `for` expressions. For instance:
```scala
val elems: List[Any] = List((1, 2), "hello", (3, 4))
for (x, y) <- elems yield (y, x) // error: pattern's type (Any, Any) is more specialized
// than the right hand side expression's type Any
// than the right-hand side expression's type Any
```
This code gives a compile-time warning in Scala 3.1, whereas in Scala 2 the list `elems`
is filtered to retain only the elements of tuple type that match the pattern `(x, y)`.
2 changes: 1 addition & 1 deletion docs/docs/reference/changed-features/pattern-matching.md
@@ -87,7 +87,7 @@ precedence over _product-sequence match_.
A usage of a variadic extractor is irrefutable if one of the following conditions holds:

- the extractor is used directly as a sequence match or product-sequence match
- `U = Some[T]` (for Scala2 compatibility)
- `U = Some[T]` (for Scala 2 compatibility)
- `U <: R` and `U <: { def isEmpty: false }`
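
For instance, a hypothetical extractor whose `unapplySeq` returns `Some[Seq[Char]]` falls under the `Some[T]` clause above, so its usages count as irrefutable:

```scala
// made-up extractor for illustration
object Chars:
  def unapplySeq(s: String): Some[Seq[Char]] = Some(s.toSeq)

val Chars(first, rest*) = "abc"   // accepted as irrefutable: first = 'a', rest = Seq('b', 'c')
```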

## Boolean Match
2 changes: 1 addition & 1 deletion docs/docs/reference/contextual/context-functions-spec.md
@@ -53,7 +53,7 @@ new scala.ContextFunctionN[T1, ..., Tn, T]:

A context parameter may also be a wildcard represented by an underscore `_`. In that case, a fresh name for the parameter is chosen arbitrarily.
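
For instance, a context function literal whose body does not need the parameter can name it `_` (a small sketch with a made-up `Config` type):

```scala
case class Config(verbose: Boolean)

// the context parameter is not used in the body, so it is written as a wildcard
val banner: Config ?=> String = (_: Config) ?=> "hello"

val greeting = banner(using Config(verbose = true))
```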

Note: The closing paragraph of the
**Note:** The closing paragraph of the
[Anonymous Functions section](https://www.scala-lang.org/files/archive/spec/2.13/06-expressions.html#anonymous-functions)
of Scala 2.13 is subsumed by context function types and should be removed.

5 changes: 2 additions & 3 deletions docs/docs/reference/contextual/derivation.md
@@ -326,8 +326,7 @@ inline def derived[A](using gen: K0.Generic[A]) as Eq[A] =
The framework described here enables all three of these approaches without mandating any of them.

For a brief discussion on how to use macros to write a type class `derived`
method please read more at [How to write a type class `derived` method using
macros](./derivation-macro.md).
method, see [How to write a type class `derived` method using macros](./derivation-macro.md).

### Deriving instances elsewhere

@@ -353,7 +352,7 @@ ConstrApps ::= ConstrApp {‘with’ ConstrApp}
| ConstrApp {‘,’ ConstrApp}
```

Note: To align `extends` clauses and `derives` clauses, Scala 3 also allows multiple
**Note:** To align `extends` clauses and `derives` clauses, Scala 3 also allows multiple
extended types to be separated by commas. So the following is now legal:

```scala
7 changes: 3 additions & 4 deletions docs/docs/reference/contextual/givens.md
@@ -33,8 +33,7 @@ a given for the type `Ord[Int]` whereas `listOrd[T]` defines givens
for `Ord[List[T]]` for all types `T` that come with a given instance for `Ord[T]`
themselves. The `using` clause in `listOrd` defines a condition: There must be a
given of type `Ord[T]` for a given of type `Ord[List[T]]` to exist.
Such conditions are expanded by the compiler to [context
parameters](./using-clauses.md).
Such conditions are expanded by the compiler to [context parameters](./using-clauses.md).

## Anonymous Givens

@@ -98,7 +97,7 @@ transparent inline given mkAnnotations[A, T]: Annotations[A, T] = ${
}
```

Since `mkAnnotations` is `transparent`, the type of an application is the type of its right hand side, which can be a proper subtype of the declared result type `Annotations[A, T]`.
Since `mkAnnotations` is `transparent`, the type of an application is the type of its right-hand side, which can be a proper subtype of the declared result type `Annotations[A, T]`.

## Pattern-Bound Given Instances

@@ -167,5 +166,5 @@ of given instances:

- A _structural instance_ contains one or more types or constructor applications,
followed by `with` and a template body that contains member definitions of the instance.
- An _alias instance_ contains a type, followed by `=` and a right hand side expression.
- An _alias instance_ contains a type, followed by `=` and a right-hand side expression.
- An _abstract instance_ contains just the type, which is not followed by anything.
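
A brief sketch of the three forms, assuming a small `Ord` type class (all names invented here, written in Scala 3.0 syntax):

```scala
trait Ord[T]:
  def compare(x: T, y: T): Int

// structural instance: a constructor application, `with`, and a template body
given intOrd: Ord[Int] with
  def compare(x: Int, y: Int): Int = x - y

// alias instance: a type, `=`, and a right-hand side expression
given reverseIntOrd: Ord[Int] = (x, y) => intOrd.compare(y, x)

// abstract instance: just the type, as an abstract member
trait OrdModule:
  given stringOrd: Ord[String]
```
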
2 changes: 1 addition & 1 deletion docs/docs/reference/contextual/multiversal-equality.md
@@ -43,7 +43,7 @@ This definition effectively says that values of type `T` can (only) be
compared to other values of type `T` when using `==` or `!=`. The definition
affects type checking but it has no significance for runtime
behavior, since `==` always maps to `equals` and `!=` always maps to
the negation of `equals`. The right hand side `CanEqual.derived` of the definition
the negation of `equals`. The right-hand side `CanEqual.derived` of the definition
is a value that has any `CanEqual` instance as its type. Here is the definition of class
`CanEqual` and its companion object:

2 changes: 1 addition & 1 deletion docs/docs/reference/contextual/relationship-implicits.md
@@ -38,7 +38,7 @@ Given instances can be mapped to combinations of implicit objects, classes and i
```

3. Alias givens map to implicit methods or implicit lazy vals. If an alias has neither type nor context parameters,
it is treated as a lazy val, unless the right hand side is a simple reference, in which case we can use a forwarder to
it is treated as a lazy val, unless the right-hand side is a simple reference, in which case we can use a forwarder to
that reference without caching it.

Examples:
2 changes: 1 addition & 1 deletion docs/docs/reference/contextual/using-clauses.md
@@ -59,7 +59,7 @@ def minimum[T](xs: List[T])(using Ord[T]) =
maximum(xs)(using descending)
```

The `minimum` method's right hand side passes `descending` as an explicit argument to `maximum(xs)`.
The `minimum` method's right-hand side passes `descending` as an explicit argument to `maximum(xs)`.
With this setup, the following calls are all well-formed, and they all normalize to the last one:

```scala
14 changes: 6 additions & 8 deletions docs/docs/reference/features-classification.md
@@ -1,6 +1,6 @@
---
layout: doc-page
title: A Classification of Proposed Language Features
title: "A Classification of Proposed Language Features"
date: April 6, 2019
author: Martin Odersky
---
@@ -52,7 +52,7 @@ These constructs replace existing constructs with the aim of making the language

With the exception of early initializers and old-style vararg patterns, all superseded constructs continue to be available in Scala 3.0. The plan is to deprecate and phase them out later.

Value classes (superseded by opaque type aliases) are a special case. There are currently no deprecation plans for value classes, since we might want to bring them back in a more general form if they are supported natively by the JVM as is planned by project Valhalla.
Value classes (superseded by opaque type aliases) are a special case. There are currently no deprecation plans for value classes, since we might bring them back in a more general form if they are supported natively by the JVM as is planned by project Valhalla.

**Status: bimodal: now or never / can delay**

@@ -121,7 +121,7 @@ Currently implemented features could stay around indefinitely. Updated docs may
**Migration cost: moderate to high**

Dropped features require rewrites to avoid their use in programs. These rewrites can sometimes be automatic (e.g. for procedure syntax, symbol literals, auto application)
and sometimes need to be manual (e.g. class shadowing, auto tupling). Sometimes the rewrites would have to be non-local, affecting use sites as well as definition sites (e.g., in the case of DelayedInit, unless we find a solution).
and sometimes need to be manual (e.g. class shadowing, auto tupling). Sometimes the rewrites would have to be non-local, affecting use sites as well as definition sites (e.g., in the case of `DelayedInit`, unless we find a solution).

## Changes

@@ -164,7 +164,7 @@ Being new features, existing code migrates without changes. To be sure, sometime

The following constructs together aim to put metaprogramming in Scala on a new basis. So far, metaprogramming was achieved by a combination of macros and libraries such as [Shapeless](https://github.com/milessabin/shapeless) that were in turn based on some key macros. Current Scala 2 macro mechanisms are a thin veneer on top of the current Scala 2 compiler, which makes them fragile and in many cases impossible to port to Scala 3.

It's worth noting that macros were never included in the Scala 2 language specification and were so far made available only under an `-experimental` flag. This has not prevented their widespread usage.
It's worth noting that macros were never included in the [Scala 2 language specification](https://scala-lang.org/files/archive/spec/2.13/) and were so far made available only under an `-experimental` flag. This has not prevented their widespread usage.

To enable porting most uses of macros, we are experimenting with the advanced language constructs listed below. These designs are more provisional than the rest of the proposed language constructs for Scala 3.0. There might still be some changes until the final release. Stabilizing the feature set needed for metaprogramming is our first priority.

@@ -185,9 +185,7 @@ Existing macro libraries will have to be rewritten from the ground up. In many c

## Changes to Type Checking and Inference

The Scala 3 compiler uses a new algorithm for type inference, which relies on
a general subtype constraint solver. The new algorithm often
[works better than the old](https://contributors.scala-lang.org/t/better-type-inference-for-scala-send-us-your-problematic-cases/2410), but there are inevitably situations where the results of both algorithms differ, leading to errors diagnosed by Scala 3 for programs that the Scala 2 compiler accepts.
The Scala 3 compiler uses a new algorithm for type inference, which relies on a general subtype constraint solver. The new algorithm often [works better than the old](https://contributors.scala-lang.org/t/better-type-inference-for-scala-send-us-your-problematic-cases/2410), but there are inevitably situations where the results of both algorithms differ, leading to errors diagnosed by Scala 3 for programs that the Scala 2 compiler accepts.

**Status: essential**

@@ -197,6 +195,6 @@ The new type-checking and inference algorithms are the essential core of the new

Some existing programs will break and, given the complex nature of type inference, it will not always be clear what change caused the breakage and how to fix it.

In our experience, macros and changes in type and implicit argument inference together cause the large majority of problems encountered when porting existing code to Scala 3. The latter source of problems could be addressed systematically by a tool that added all inferred types and implicit arguments to a Scala 2 source code file. Most likely such a tool would be implemented as a Scala 2 compiler plugin. The resulting code would have a greatly increased likelihood to compile under Scala 3, but would often be bulky to the point of being unreadable. A second part of the rewriting tool should then selectively and iteratively remove type and implicit annotations that were synthesized by the first part as long as they compile under Scala 3. This second part could be implemented as a program that invokes the Scala 3 compiler `scalac` programmatically.
In our experience, macros and changes in type and implicit argument inference together cause the large majority of problems encountered when porting existing code to Scala 3. The latter source of problems could be addressed systematically by a tool that added all inferred types and implicit arguments to a Scala 2 source code file. Most likely such a tool would be implemented as a [Scala 2 compiler plugin](https://docs.scala-lang.org/overviews/plugins/index.html). The resulting code would have a greatly increased likelihood to compile under Scala 3, but would often be bulky to the point of being unreadable. A second part of the rewriting tool should then selectively and iteratively remove type and implicit annotations that were synthesized by the first part as long as they compile under Scala 3. This second part could be implemented as a program that invokes the Scala 3 compiler `scalac` programmatically.

Several people have proposed such a tool for some time now. I believe it is time we find the will and the resources to actually implement it.
2 changes: 1 addition & 1 deletion docs/docs/reference/metaprogramming/erased-terms-spec.md
@@ -43,7 +43,7 @@ title: "Erased Terms Spec"
if `def f(erased x: T): U` then `f: (erased T) => U`.


5. Erasure Semantics
5. Erasure semantics
* All `erased` parameters are removed from the function
* Arguments to `erased` parameters are not passed to the function
* All `erased` definitions are removed
4 changes: 2 additions & 2 deletions docs/docs/reference/new-types/dependent-function-types.md
@@ -35,8 +35,8 @@ This type describes function values that take any argument `e` of type
`Entry` and return a result of type `e.Key`.

Recall that a normal function type `A => B` is represented as an
instance of the `Function1` trait (i.e. `Function1[A, B]`) and
analogously for functions with more parameters. Dependent functions
instance of the [`Function1` trait](https://dotty.epfl.ch/api/scala/Function1.html)
(i.e. `Function1[A, B]`) and analogously for functions with more parameters. Dependent functions
are also represented as instances of these traits, but they get an additional
refinement. In fact, the dependent function type above is just syntactic sugar for

5 changes: 3 additions & 2 deletions docs/docs/reference/new-types/match-types.md
@@ -22,7 +22,7 @@ Elem[List[Float]] =:= Float
Elem[Nil.type] =:= Nothing
```

Here `=:=` is understood to mean that left and right hand sides are mutually
Here `=:=` is understood to mean that left and right-hand sides are mutually
subtypes of each other.
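
These equalities can be checked mechanically by summoning `=:=` evidence, which compiles only when both sides are mutual subtypes. A small sketch, assuming the `Elem` match type defined earlier on this page:

```scala
// each line compiles only because the corresponding reduction holds
val ev1 = summon[Elem[String] =:= Char]
val ev2 = summon[Elem[List[Float]] =:= Float]
val ev3 = summon[Elem[Nil.type] =:= Nothing]
```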

In general, a match type is of the form
@@ -214,7 +214,8 @@ be caught and turned into a compile-time error that indicates a trace of the
subtype tests that caused the overflow without showing a full stack trace.

## Variance Laws for Match Types
NOTE: This section does not reflect the current implementation.

**Note:** This section does not reflect the current implementation.

Within a match type `Match(S, Cs) <: B`, all occurrences of type variables count
as covariant. By the nature of the cases `Ci` this means that occurrences in
2 changes: 1 addition & 1 deletion docs/docs/reference/new-types/type-lambdas-spec.md
@@ -96,7 +96,7 @@ with types that satisfy these constraints. Likewise
```scala
opaque type O[X] = List[X]
```
`O` is known to be invariant (and not covariant, as its right hand side would suggest). On the other hand, a transparent alias
`O` is known to be invariant (and not covariant, as its right-hand side would suggest). On the other hand, a transparent alias
```scala
type O2[X] = List[X]
```