Confusing compilation errors in a project matrix #75
You can see a build begin to fail here. Note the inclusion of scopt in the …
I'll also note that this setup has absolutely wrecked the IntelliJ project. IntelliJ appears to be importing the primary project that is erroring. This is also the first time I've used project matrix with IntelliJ, so pointers there are welcome, too.
Your build has 3 …
Playing with this in a very small window of time just now… I wanted to do something like this, but `.value` can only be called inside a setting or task macro, so a `match` on `scalaBinaryVersion.value` in the `settings` argument doesn't compile:

```scala
.jvmPlatform(
scalaVersions = Seq(scala211, scala212, scala213),
settings = scalaBinaryVersion.value match {
case "2.11" =>
Seq(
circeVersion := "0.11.2",
circeYamlVersion := "0.10.1",
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.3.4" % Provided,
(Compile / runMain) := Defaults.runMainTask(Compile / fullClasspath, Compile / run / runner).evaluated,
generateTestData := { (Compile / runMain).toTask(" com.target.data_validator.GenTestData").value }
)
case "2.12" =>
Seq(
circeVersion := "0.14.2",
circeYamlVersion := "0.14.1",
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.8" % Provided
)
case "2.13" =>
Seq(
circeVersion := "0.14.2",
circeYamlVersion := "0.14.1",
libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.2.1" % Provided
)
}
)
```

I got:

```scala
lazy val root = (projectMatrix in file("."))
.enablePlugins(BuildInfoPlugin)
.settings(commonSettings)
.jvmPlatform(
scalaVersions = Seq(scala211, scala212, scala213),
settings = Seq(
circeVersion := (scalaBinaryVersion.value match {
case "2.11" => "0.11.2"
case "2.12" | "2.13" => "0.14.2"
}),
circeYamlVersion := (scalaBinaryVersion.value match {
case "2.11" => "0.10.1"
case "2.12" | "2.13" => "0.14.1"
}),
libraryDependencies ++= (scalaBinaryVersion.value match {
case "2.11" => Seq("org.apache.spark" %% "spark-sql" % "2.3.4" % Provided)
case "2.12" => Seq("org.apache.spark" %% "spark-sql" % "2.4.8" % Provided)
case "2.13" => Seq("org.apache.spark" %% "spark-sql" % "3.2.1" % Provided)
}),
//(Compile / runMain) := Defaults.runMainTask(Compile / fullClasspath, Compile / run / runner).evaluated,
//generateTestData := { (Compile / runMain).toTask(" com.target.data_validator.GenTestData").value }
)
)
```

I feel like I'm getting warmer…
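An alternative worth considering (a sketch, not from the thread) that avoids matching on `scalaBinaryVersion` entirely: sbt-projectmatrix lets you call `.jvmPlatform` more than once, and each call should add rows for the listed Scala versions with its own static settings, so no `.value` is needed. Names like `scala211`, `circeVersion`, and `commonSettings` are taken from the snippets above, and the version pairings mirror them:

```scala
lazy val root = (projectMatrix in file("."))
  .enablePlugins(BuildInfoPlugin)
  .settings(commonSettings)
  // One .jvmPlatform call per Scala version: the settings here are
  // plain values, known at load time, so no macro gymnastics.
  .jvmPlatform(
    scalaVersions = Seq(scala211),
    settings = Seq(
      circeVersion := "0.11.2",
      circeYamlVersion := "0.10.1",
      libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.3.4" % Provided
      // the 2.11-only runMain/generateTestData wiring could live here too
    )
  )
  .jvmPlatform(
    scalaVersions = Seq(scala212),
    settings = Seq(
      circeVersion := "0.14.2",
      circeYamlVersion := "0.14.1",
      libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.8" % Provided
    )
  )
  .jvmPlatform(
    scalaVersions = Seq(scala213),
    settings = Seq(
      circeVersion := "0.14.2",
      circeYamlVersion := "0.14.1",
      libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.2.1" % Provided
    )
  )
```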
I aim to build a Spark app for Scala 2.11, 2.12, and 2.13, with a set of dependencies for each on top of the base dependencies.

My configuration seems to let the 2.11 and 2.12 projects build and test correctly (e.g. `sbt root2_11/test`), while the 2.13 one still has some errors (working on those). However, there seems to be another project that's also building, and it can't find any dependencies, so it errors when I run `sbt test` as CI does.

I think what's happening is that the root project still thinks it should compile, when I only want the projects from the matrix's `jvmPlatform` declarations to be active. I think I need to disable this root project somehow, but I can't find a way to do that. I'd welcome some pointers in the right direction. I'm so close to getting this cross-version build to work!
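For the stray project: when no project claims the build's base directory, sbt synthesizes a default root that aggregates everything and tries to compile with a bare classpath, which matches the symptom above. A minimal sketch of one common workaround, under the assumption that the matrix is moved out of `file(".")` and renamed to a hypothetical `core`, so an explicit do-nothing root can aggregate the generated rows (`projectRefs` is the sbt-projectmatrix accessor for them):

```scala
// Hypothetical layout: sources move under core/ so the matrix no
// longer sits at the build root.
lazy val core = (projectMatrix in file("core"))
  .enablePlugins(BuildInfoPlugin)
  .settings(commonSettings)
  .jvmPlatform(scalaVersions = Seq(scala211, scala212, scala213))

// Explicit root at the base directory: it aggregates the generated
// core2_11 / core2_12 / core2_13 projects so a bare `sbt test` fans
// out to them, but it has no sources or publishable artifacts itself.
lazy val root = (project in file("."))
  .aggregate(core.projectRefs: _*)
  .settings(
    publish / skip := true
  )
```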