[SPARK-33343][BUILD] Fix the build with sbt to copy hadoop-client-runtime.jar #30250
Conversation
I don't know whether this is the best way to fix this issue.
Kubernetes integration test starting
Kubernetes integration test status success
Test build #130607 has finished for PR 30250 at commit
retest this please.
cc @sunchao
For me,
$ build/sbt -Pyarn -Phive -Phive-thriftserver -Psparkr test:package
$ bin/spark-shell
20/11/04 08:56:42 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Spark context available as 'sc' (master = local[*], app id = local-1604509006686).
Spark session available as 'spark'.
Welcome to
____ __
/ __/__ ___ _____/ /__
_\ \/ _ \/ _ `/ __/ '_/
/___/ .__/\_,_/_/ /_/\_\ version 3.1.0-SNAPSHOT
/_/
Using Scala version 2.12.10 (OpenJDK 64-Bit Server VM, Java 1.8.0_272)
Type in expressions to have them evaluated.
Type :help for more information.
scala> :quit
I built with
Thank you, @sarutak. I also confirmed the issue.
According to the
Kubernetes integration test starting
Kubernetes integration test status failure
retest this please.
Thanks @sarutak for reporting the issue. Yeah I was able to reproduce it as well with SBT, but not Maven. In particular the
Kubernetes integration test starting
Kubernetes integration test status failure
This was discussed some time back in sbt/sbt-assembly#120, but I don't know whether there is already a fix or not.
Test build #130609 has finished for PR 30250 at commit
Test build #130614 has finished for PR 30250 at commit
cc @srowen
What changes were proposed in this pull request?
This PR fixes an issue where spark-shell doesn't work when Spark is built with
sbt package
(without any profiles specified). The cause is that hadoop-client-runtime.jar isn't copied to assembly/target/scala-2.12/jars.
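One way to verify a build is affected is to check whether the jar lands in the assembly output directory. The sketch below is self-contained (it mocks the directory with a stand-in file); the commented line shows the equivalent check against a real Spark checkout. The mock path and jar version are illustrative, not taken from the PR:

```shell
# Self-contained sketch: a mock directory standing in for
# assembly/target/scala-2.12/jars in a real Spark checkout.
mkdir -p /tmp/spark-jars-demo
touch /tmp/spark-jars-demo/hadoop-client-runtime-3.2.0.jar  # stand-in jar

# Against a real checkout, the equivalent check would be:
#   ls assembly/target/scala-2.12/jars | grep hadoop-client-runtime

if ls /tmp/spark-jars-demo | grep -q 'hadoop-client-runtime'; then
  echo "hadoop-client-runtime jar is present"
else
  echo "hadoop-client-runtime jar is MISSING"
fi
```

If the grep finds nothing after an sbt build, the shaded Hadoop client classes are absent from the runtime classpath, which matches the spark-shell failure described above.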
Why are the changes needed?
This is a bug.
Does this PR introduce any user-facing change?
No.
How was this patch tested?
Ran spark-shell and confirmed it works.