[SPARK-43836][BUILD] Make Scala 2.13 as default in Spark 3.5 #41344
Conversation
Thank you, @MaxGekk. Merged to master.
Thank you, @dongjoon-hyun.
Yes, I agree with you, @bjornjorgensen. I'm also planning to send this out as part of the following (sorted by ID).
I'm very happy to see this change, @dongjoon-hyun :)
Thank you always in many ways, @LuciferYang! :)
For the record, I sent an email to dev@spark here.
What changes were proposed in this pull request?
This PR aims to make Scala 2.13 the default Scala version in Apache Spark 3.5.
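As a rough sketch of what the default switch means for a downstream build (illustrative only; the artifact coordinates and version numbers below are assumptions, not taken from this PR): the default published artifacts carry the Scala binary suffix, so an sbt project consuming Spark 3.5 would pin a matching 2.13 release.

```scala
// build.sbt: hypothetical downstream project tracking the new default.
// The Scala version here must match the binary suffix (_2.13) of the
// Spark artifacts being resolved; 2.13.10 is an illustrative choice.
ThisBuild / scalaVersion := "2.13.10"

// %% appends the Scala binary suffix, so this resolves spark-sql_2.13.
libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.5.0"
```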
Why are the changes needed?
The current releases from the Scala community are `Scala 3.2.2` and `Scala 2.13.10`.
- https://scala-lang.org/download/all.html

Although the Apache Spark community has used Scala 2.12 by default since Apache Spark 3.0, and the Scala community will release Scala 2.12.18 for Java 21 support, we had better focus more on `Scala 2.13+` from the Apache Spark 3.5 timeline onward to follow the Scala community's activity.

Since SPARK-25075 added Scala 2.13 support in Apache Spark 3.2.0, the Apache Spark community has maintained it as a second Scala version. This PR switches only the default Scala version from 2.12 to 2.13; Apache Spark will still support both Scala 2.12 and 2.13.
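For context on what tracking 2.13 entails for user code, here is a minimal sketch (not from this PR) of one well-known source incompatibility between the two Scala versions:

```scala
// The CanBuildFrom-based `to[C]` conversion from Scala 2.12 was removed
// in Scala 2.13, whose `to` takes a collection factory as a value instead.
object ToVectorExample extends App {
  val xs = List(1, 2, 3)

  // Scala 2.12 only; does not compile on 2.13:
  // val v12: Vector[Int] = xs.to[Vector]

  // Scala 2.13 spelling (the factory is passed as a value):
  val v13: Vector[Int] = xs.to(Vector)
  println(v13) // Vector(1, 2, 3)
}
```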
Does this PR introduce any user-facing change?
Yes. The default Scala version changes from 2.12 to 2.13, but Scala 2.12 remains supported.
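Since both versions remain supported, a downstream library can stay compatible by cross-building. A minimal sbt sketch, with illustrative patch releases (the exact version numbers are assumptions):

```scala
// build.sbt: hypothetical library cross-built against both supported
// Scala lines; prefixed commands like `sbt +test` run for each entry.
ThisBuild / scalaVersion       := "2.13.10" // matches Spark 3.5's new default
ThisBuild / crossScalaVersions := Seq("2.12.17", "2.13.10")

// Provided scope: the Spark runtime supplies these classes on the cluster.
libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.5.0" % Provided
```

Spark's own build has historically offered a way to select the non-default Scala version as well (a `dev/change-scala-version.sh` script plus a Maven profile); the exact invocation for 3.5 should be checked against the repository's build documentation.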
How was this patch tested?
Pass the CIs.