[SPARK-21708][BUILD] Migrate build to sbt 1.x #759
Conversation
```diff
@@ -28,9 +28,6 @@ all-branches-and-tags: &all-branches-and-tags
 # Step templates

 step_templates:
-  restore-build-binaries-cache: &restore-build-binaries-cache
```
This had to be removed because it picks up the older `build/sbt` and doesn't get the updated versions.
```diff
@@ -390,7 +390,8 @@ def build_spark_assembly_sbt(extra_profiles, checkstyle=False):
     if checkstyle:
         run_java_style_checks(build_profiles)

-    build_spark_unidoc_sbt(extra_profiles)
+    # TODO(lmartini): removed because broken, checks generated classes
+    # build_spark_unidoc_sbt(extra_profiles)
```
We don't need unidoc, and this broke when attempting to generate docs for autogenerated files. I didn't want to invest too much into this, but can try if needed.
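(For reference, an alternative to dropping the step entirely would be filtering the autogenerated sources out of unidoc's input. A minimal sketch, assuming sbt-unidoc's standard keys; the path-based "generated" filter is a hypothetical heuristic, not code from this PR:)

```scala
// Sketch only: exclude autogenerated sources from unidoc instead of disabling it.
// The "generated" path filter is an assumed heuristic.
import sbtunidoc.BaseUnidocPlugin.autoImport._
import sbtunidoc.ScalaUnidocPlugin.autoImport._

unidocAllSources in (ScalaUnidoc, unidoc) := {
  (unidocAllSources in (ScalaUnidoc, unidoc)).value
    .map(_.filterNot(_.getAbsolutePath.contains("generated")))
}
```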
```diff
 val organization = "org.apache.spark"
-val previousSparkVersion = "2.4.0"
+val previousSparkVersion = "3.0.0"
```
This file is cleanly picked from upstream. It seems like they forgot to do this bump earlier and added it in this PR out of convenience (context: apache#29286 (comment) and apache#22977 (comment)).
If we keep `previousSparkVersion` at 2.4.0, the number of binary breaks is 200+, so this is the best option anyway.
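For context, `previousSparkVersion` is the baseline MiMa compares every module against, so bumping it shrinks the set of reported breaks. A minimal sketch of that wiring, using the standard sbt-mima-plugin key rather than Spark's exact MimaBuild code:

```scala
// Sketch: how a version constant drives MiMa's comparison baseline.
import com.typesafe.tools.mima.plugin.MimaKeys._

val organization = "org.apache.spark"
val previousSparkVersion = "3.0.0"

// Each module is diffed against its previously released artifact:
mimaPreviousArtifacts := Set(organization %% moduleName.value % previousSparkVersion)
```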
```diff
+// TODO(lmartini): Additional excludes not in upstream but unique to palantir fork
+ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.SparkContext.initializeForcefully"),
+ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.SparkContext.initializeForcefully"),
+ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.broadcast.Broadcast.initializeForcefully"),
```
These last three excludes had to be added in our fork only. It should be OK.
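For readers unfamiliar with MimaExcludes: the filter lists are keyed by release line, so fork-only filters simply sit inside the list for the current version. A simplified sketch of that structure (not the exact file):

```scala
// Simplified sketch: MimaExcludes selects a filter list per release line,
// and the fork-only filters above live inside v30excludes.
def excludes(version: String) = version match {
  case v if v.startsWith("3.0") => v30excludes
  case _                        => Seq.empty
}
```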
```diff
 dependencyOverrides ++= MavenHelper.fromPom { pom =>
   for {
     dep <- pom.getDependencyManagement.getDependencies.asScala
   } yield MavenHelper.convertDep(dep)
-}.value.toSet
+}.value.toSeq
```
sbt got stricter and `toSet` was no longer accepted.
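Concretely, sbt 1.x changed the type of `dependencyOverrides` from `Set[ModuleID]` to `Seq[ModuleID]`, so appending a `Set` no longer type-checks:

```scala
// sbt 0.13: dependencyOverrides was a Set[ModuleID].
// sbt 1.x:  dependencyOverrides is a Seq[ModuleID], so append a Seq instead.
// (Coordinate below is illustrative only.)
dependencyOverrides ++= Seq("org.apache.avro" % "avro" % "1.10.1")
```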
```scala
}.value
test := (test andFinally Def.taskDyn {
  copyTestReportsToCircle
}).value
```
Again, sbt got stricter, and the fancy tuple plugin syntax was removed. This is the equivalent of the above.
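For context, the sbt 0.13 `<<=` tuple enrichment is gone in sbt 1.x; post-test steps are now sequenced with the `:=` macro and `andFinally`, as the new code above does. Roughly:

```scala
// sbt 0.13 style (no longer compiles under sbt 1.x):
//   test <<= test andFinally { ... }
// sbt 1.x equivalent, mirroring the code above:
test := (test andFinally Def.taskDyn {
  copyTestReportsToCircle  // task defined elsewhere in this build
}).value
```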
```diff
-// need to make changes to uptake sbt 1.0 support in "com.cavorite" % "sbt-avro-1-7" % "1.1.2"
-addSbtPlugin("com.cavorite" % "sbt-avro" % "0.3.2")
+addSbtPlugin("com.cavorite" % "sbt-avro" % "2.1.1")
+libraryDependencies += "org.apache.avro" % "avro-compiler" % "1.10.1"
```
The upstream PR changed the avro-compiler version to 1.8.2, but we're already ahead of it.
```diff
-lazy val sparkGenjavadocSettings: Seq[sbt.Def.Setting[_]] = Seq(
-  libraryDependencies += compilerPlugin(
-    "com.typesafe.genjavadoc" %% "genjavadoc-plugin" % unidocGenjavadocVersion.value cross CrossVersion.full),
+lazy val sparkGenjavadocSettings: Seq[sbt.Def.Setting[_]] = GenJavadocPlugin.projectSettings ++ Seq(
```
Looks like this is why the unidoc check started failing.
```scala
scalacOptions ++= Seq(
  "-P:genjavadoc:out=" + (target.value / "java"),
```
Or this, actually.
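Roughly, `GenJavadocPlugin.projectSettings` (from sbt-unidoc) now supplies what the removed lines declared by hand: the genjavadoc compiler plugin plus its output flag. A sketch of the manual equivalent, based on the old code in this diff:

```scala
// Approximately what GenJavadocPlugin.projectSettings contributes (sketch):
libraryDependencies += compilerPlugin(
  "com.typesafe.genjavadoc" %% "genjavadoc-plugin" % unidocGenjavadocVersion.value
    cross CrossVersion.full)
scalacOptions += "-P:genjavadoc:out=" + (target.value / "java")
```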
```diff
-analysis.infos.allInfos.foreach { case (k, i) =>
-  i.reportedProblems foreach { p =>
-    val deprecation = p.message.contains("is deprecated")
+analysis.asInstanceOf[sbt.internal.inc.Analysis].infos.allInfos.foreach { case (k, i) =>
+  i.getReportedProblems foreach { p =>
+    val deprecation = p.message.contains("deprecated")
```
(This was previously `is deprecated` in our fork, but checking for `deprecated` only should give you a superset.)
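Concretely, any message containing `is deprecated` also contains `deprecated`, so the relaxed check can only flag more problems, never fewer (the message wordings below are illustrative):

```scala
// Hypothetical compiler message wordings:
val messages = Seq(
  "method initializeForcefully in class SparkContext is deprecated", // both checks match
  "FileSplit in org.apache.hadoop.mapred has been deprecated")       // only the new check matches
assert(messages.count(_.contains("deprecated")) >=
  messages.count(_.contains("is deprecated")))
```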
Original PR description

What changes were proposed in this pull request?
Migrate the sbt-launcher URL to download one for sbt 1.x.
Update plugin versions where required by the sbt update.
Change the sbt version used to the latest released at the moment, 1.3.13.
Adjust build settings according to the plugin and sbt changes.

Why are the changes needed?
Migration to sbt 1.x:
1. enhances the dev experience in development
2. updates build plugins to bring their new features / to fix bugs in them
3. enhances build performance on the sbt side
4. eases movement to Scala 3 / dotty

Does this PR introduce any user-facing change?
No.

How was this patch tested?
All existing tests passed, both on Jenkins and via GitHub Actions, also manually for the Scala 2.13 profile.

Closes apache#29286 from gemelen/feature/sbt-1.x.
Authored-by: Denis Pyshev <git@gemelen.net>
Signed-off-by: Dongjoon Hyun <dhyun@apple.com>
Upstream SPARK-XXXXX ticket and PR link (if not applicable, explain)
[SPARK-21708][BUILD] Migrate build to sbt 1.x
Commit: apache@6daa2ae
PR: apache#29286
What changes were proposed in this pull request?
Bump sbt version to 1.x.
There are a few non-trivial changes related to versions.
First of all, the original PR was introduced on top of Spark 3.1, so we had to adapt a few things for it to work on top of Spark 3.0.
The breaks are listed under `v30excludes` instead of the upstream's `v31excludes` because the current version is 3.0.0.

Why are the changes needed?
The sbt bump is needed because it:
1. enhances the dev experience in development
2. updates build plugins to bring their new features / to fix bugs in them
3. enhances build performance on the sbt side
4. eases movement to Scala 3 / dotty
Does this PR introduce any user-facing change?
No