[SPARK-25235] [Build] [SHELL] [FOLLOWUP] Fix repl compile for 2.12 #22280
Conversation
Hm, why would this help? There is no scala-2.12 source dir anymore. Does this method exist in both 2.11 and 2.12? Then we can just add the override, I guess.
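For context, the override issue under discussion can be sketched with a minimal pair of hypothetical classes (the names `Handler`/`SparkHandler` are illustrative, not Spark's actual classes): in Scala, redefining a concrete member inherited from a superclass requires the `override` modifier, which is exactly the error the 2.12 build hits when `ImportHandler` already defines `importableSymbolsWithRenames`.

```scala
// Hypothetical minimal reproduction of the "needs `override' modifier" error.
// In 2.12, the parent class already defines the lazy val concretely...
class Handler {
  lazy val importableSymbolsWithRenames: List[(String, String)] = Nil
}

// ...so a subclass redefining it must say `override`, or compilation fails.
class SparkHandler extends Handler {
  override lazy val importableSymbolsWithRenames: List[(String, String)] =
    List(("sym", "renamed"))
}
```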
We may not care about whether the dirs are empty; we care about consistent behavior with sbt.
This PR intends to fix the build error in the mvn build. Maybe we should spare some time to eliminate the code under the 2.11-only directory.
This PR, together with #22264, should make the scala-2.12 Jenkins job work.
Yes, I set up these dirs and POM profile. The profile adds a source directory for 2.12 that previously held 2.12-specific implementations. It doesn't exist anymore, so adding these source dirs shouldn't do anything. See ff8dcc1 for the work that unified this, although we still have a separate dir for 2.11-only code that can't be removed.

The problem is really that we (I) forgot in the review that removing the 2.12-specific profile means we fail to override the `extra.source.dir` properties that should only exist for 2.11! So yes, your fix is right for that reason.

It's a little odd to refer to the old dirs that don't exist, though. I wonder if just setting the property values to blank strings in the 2.12 profile works as well? Really, what we should have is a scala-2.11 profile that adds the extra source dirs only when 2.11 is building. That's cleaner; are you familiar with how to write it? I can open a PR too. CC @dbtsai
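The cleaner alternative suggested above could look roughly like the following POM fragment. This is only a sketch under assumptions: the property names `extra.source.dir`/`extra.testsource.dir` come from this PR's description, but the directory values, version numbers, and profile layout here are hypothetical, not Spark's actual `pom.xml`.

```xml
<!-- Hypothetical sketch: move the extra source dirs into a scala-2.11
     profile so the default (2.12) build never references them.
     (The other idea discussed above would be to keep a scala-2.12
     profile that overrides these properties with blank strings.) -->
<profile>
  <id>scala-2.11</id>
  <properties>
    <scala.binary.version>2.11</scala.binary.version>
    <extra.source.dir>scala-2.11/src/main/scala</extra.source.dir>
    <extra.testsource.dir>scala-2.11/src/test/scala</extra.testsource.dir>
  </properties>
</profile>
```

With this shape, building with `-Pscala-2.11` would activate the extra dirs, and no profile is needed at all for 2.12.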
Jenkins add to whitelist |
Test build #4307 has finished for PR 22280 at commit
The test failure here is almost certainly spurious; let's see what the other one does. |
Test build #95478 has finished for PR 22280 at commit
I'm going to merge this as a partial revert of the other commit, as the dummy dir this reintroduces doesn't matter. We can tweak it more later, and I want to unblock finishing 2.12 integration. |
## What changes were proposed in this pull request?

Error messages from https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test/job/spark-master-test-maven-hadoop-2.7-ubuntu-scala-2.12/183/

```
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) spark-repl_2.12 ---
[INFO] Using zinc server for incremental compilation
[warn] Pruning sources from previous analysis, due to incompatible CompileSetup.
[info] Compiling 6 Scala sources to /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7-ubuntu-scala-2.12/repl/target/scala-2.12/classes...
[error] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7-ubuntu-scala-2.12/repl/scala-2.11/src/main/scala/org/apache/spark/repl/SparkILoopInterpreter.scala:80: overriding lazy value importableSymbolsWithRenames in class ImportHandler of type List[(this.intp.global.Symbol, this.intp.global.Name)];
[error]  lazy value importableSymbolsWithRenames needs `override' modifier
[error]   lazy val importableSymbolsWithRenames: List[(Symbol, Name)] = {
[error]            ^
[warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7-ubuntu-scala-2.12/repl/src/main/scala/org/apache/spark/repl/SparkILoop.scala:53: variable addedClasspath in class ILoop is deprecated (since 2.11.0): use reset, replay or require to update class path
[warn]     if (addedClasspath != "") {
[warn]         ^
[warn] /home/jenkins/workspace/spark-master-test-maven-hadoop-2.7-ubuntu-scala-2.12/repl/src/main/scala/org/apache/spark/repl/SparkILoop.scala:54: variable addedClasspath in class ILoop is deprecated (since 2.11.0): use reset, replay or require to update class path
[warn]       settings.classpath append addedClasspath
[warn]       ^
[warn] two warnings found
[error] one error found
[error] Compile failed at Aug 29, 2018 5:28:22 PM [0.679s]
```

Re-add the profile for `scala-2.12`. Using `-Pscala-2.12` overrides `extra.source.dir` and `extra.testsource.dir` with two non-existent directories.

## How was this patch tested?

First, make sure it compiles.
```
dev/change-scala-version.sh 2.12
mvn -Pscala-2.12 -DskipTests compile install
```

Then, make a distribution to try the repl: `./dev/make-distribution.sh --name custom-spark --tgz -Phadoop-2.7 -Phive -Pyarn -Pscala-2.12`

```
18/08/30 16:04:50 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Spark context Web UI available at http://172.16.131.140:4040
Spark context available as 'sc' (master = local[*], app id = local-1535616298812).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.4.0-SNAPSHOT
      /_/

Using Scala version 2.12.6 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_112)
Type in expressions to have them evaluated.
Type :help for more information.

scala> spark.sql("select percentile(key, 1) from values (1, 1),(2, 1) T(key, value)").show
+-------------------------------------+
|percentile(key, CAST(1 AS DOUBLE), 1)|
+-------------------------------------+
|                                  2.0|
+-------------------------------------+
```

Closes apache#22280 from sadhen/SPARK_24785_FOLLOWUP.

Authored-by: 忍冬 <rendong@wacai.com>
Signed-off-by: Sean Owen <sean.owen@databricks.com>