[SPARK-37529][K8S][TESTS] Support K8s integration tests for Java 17 #34790
Conversation
Kubernetes integration test starting
Kubernetes integration test status failure
Test build #145885 has finished for PR 34790 at commit
Kubernetes integration test starting
Kubernetes integration test status failure
Test build #145917 has finished for PR 34790 at commit
dongjoon-hyun left a comment
Do we need to change more? (since this is WIP)
@dongjoon-hyun Oh, I just forgot to modify the title. Thank you!
SPARK_TGZ="$2"
shift
;;
--docker-file)
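For context, the excerpt above is the option-parsing `case` block where a `--docker-file` case is added after the existing `SPARK_TGZ` handling. Below is a minimal, self-contained sketch of how such an option can be parsed and forwarded as the `spark.kubernetes.test.dockerFile` system property; the variable names `DOCKER_FILE` and `PROPERTIES` are assumptions for illustration, not necessarily the script's actual names.

```
#!/usr/bin/env bash
# Sketch only (not the actual dev script): parse a --docker-file option and
# forward it as the spark.kubernetes.test.dockerFile system property.
# DOCKER_FILE and PROPERTIES are assumed names used for illustration.
DOCKER_FILE=
while (( "$#" )); do
  case $1 in
    --docker-file)
      DOCKER_FILE="$2"
      shift
      ;;
    *)
      break
      ;;
  esac
  shift
done

PROPERTIES=()
if [ -n "$DOCKER_FILE" ]; then
  PROPERTIES+=(-Dspark.kubernetes.test.dockerFile="$DOCKER_FILE")
fi
echo "Forwarding: ${PROPERTIES[*]}"
```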
Do we need to update resource-managers/kubernetes/integration-tests/README.md?
+1, LGTM. Thank you so much, @sarutak.
I also verified that the test runner works correctly with SBT, Zulu Java 17, the Java 17 Docker image, macOS (Intel), Minikube v1.24.0, and K8s v1.22.3. Although I hit some test failures, those can be handled separately, like updating README.md.
$ java -version
openjdk version "17.0.1" 2021-10-19 LTS
OpenJDK Runtime Environment Zulu17.30+15-CA (build 17.0.1+12-LTS)
OpenJDK 64-Bit Server VM Zulu17.30+15-CA (build 17.0.1+12-LTS, mixed mode, sharing)
$ minikube version --short
v1.24.0
$ k version --short
Client Version: v1.22.3
Server Version: v1.22.3
$ build/sbt -Dspark.kubernetes.test.dockerFile=resource-managers/kubernetes/docker/src/main/dockerfiles/spark/Dockerfile.java17 -Pkubernetes -Pkubernetes-integration-tests package "kubernetes-integration-tests/test"
Using /Users/dongjoon/.jenv/versions/zulu17 as default JAVA_HOME.
...
[info] KubernetesSuite:
[info] - Run SparkPi with no resources (14 seconds, 550 milliseconds)
[info] - Run SparkPi with no resources & statefulset allocation (13 seconds, 849 milliseconds)
[info] - Run SparkPi with a very long application name. (14 seconds, 863 milliseconds)
[info] - Use SparkLauncher.NO_RESOURCE (13 seconds, 817 milliseconds)
[info] - Run SparkPi with a master URL without a scheme. (13 seconds, 901 milliseconds)
[info] - Run SparkPi with an argument. (14 seconds, 814 milliseconds)
[info] - Run SparkPi with custom labels, annotations, and environment variables. (13 seconds, 809 milliseconds)
[info] - All pods have the same service account by default (13 seconds, 818 milliseconds)
[info] - Run extraJVMOptions check on driver (7 seconds, 754 milliseconds)
[info] - Run SparkRemoteFileTest using a remote data file *** FAILED *** (3 minutes, 4 seconds)
...
[info] - Verify logging configuration is picked from the provided SPARK_CONF_DIR/log4j.properties (13 seconds, 784 milliseconds)
[info] - Run SparkPi with env and mount secrets. (25 seconds, 40 milliseconds)
[info] - Run PySpark on simple pi.py example (15 seconds, 830 milliseconds)
[info] - Run PySpark to test a pyfiles example (17 seconds, 977 milliseconds)
[info] - Run PySpark with memory customization (15 seconds, 833 milliseconds)
[info] - Run in client mode. (11 seconds, 434 milliseconds)
[info] - Start pod creation from template (13 seconds, 893 milliseconds)
[info] - PVs with local hostpath storage on statefulsets *** FAILED *** (3 minutes, 4 seconds)
...
[info] - PVs with local hostpath and storageClass on statefulsets *** FAILED *** (3 minutes, 4 seconds)
...
[info] - PVs with local storage *** FAILED *** (3 minutes, 5 seconds)
[info] - Launcher client dependencies (41 seconds, 424 milliseconds)
[info] - SPARK-33615: Launcher client archives (26 seconds, 386 milliseconds)
[info] - SPARK-33748: Launcher python client respecting PYSPARK_PYTHON (31 seconds, 927 milliseconds)
[info] - SPARK-33748: Launcher python client respecting spark.pyspark.python and spark.pyspark.driver.python (31 seconds, 580 milliseconds)
[info] - Launcher python client dependencies using a zip file (33 seconds, 228 milliseconds)
[info] - Test basic decommissioning (48 seconds, 505 milliseconds)
[info] - Test basic decommissioning with shuffle cleanup (48 seconds, 677 milliseconds)
[info] - Test decommissioning with dynamic allocation & shuffle cleanups *** FAILED *** (4 minutes, 19 seconds)
...
[info] - Test decommissioning timeouts (47 seconds, 685 milliseconds)
[info] - Run SparkR on simple dataframe.R example *** FAILED *** (3 minutes, 4 seconds)
...
[info] Run completed in 29 minutes, 24 seconds.
[info] Total number of tests run: 30
[info] Suites: completed 1, aborted 0
[info] Tests: succeeded 24, failed 6, canceled 0, ignored 0, pending 0
[info] *** 6 TESTS FAILED ***
[error] Failed tests:
[error] org.apache.spark.deploy.k8s.integrationtest.KubernetesSuite
[error] (kubernetes-integration-tests / Test / test) sbt.TestsFailedException: Tests unsuccessful
[error] Total time: 1978 s (32:58), completed Dec 4, 2021, 3:33:51 PM
$ docker images | grep dev
spark-r dev 636af4cb030f 4 minutes ago 1.42GB
spark-py dev 2284e581ac6b 5 minutes ago 1.26GB
spark dev f45786f9002f 6 minutes ago 916MB
$ docker run -it --rm spark:dev java -version | tail -n3
openjdk version "17.0.1" 2021-10-19
OpenJDK Runtime Environment (build 17.0.1+12-Debian-1deb11u2)
OpenJDK 64-Bit Server VM (build 17.0.1+12-Debian-1deb11u2, mixed mode, sharing)
Thank you, @dongjoon-hyun. I'll update the README.md.
…h to take a custom Dockerfile

### What changes were proposed in this pull request?

This PR changes `dev-run-integration-tests.sh` to allow it to take a custom Dockerfile like #34790 did. With this change, the script accepts a `--docker-file` option, which takes a path to a custom Dockerfile.

```
$ ./dev/run-integration-tests.sh --docker-file /path/to/dockerfile
```

### Why are the changes needed?

As of #34790, we can specify a custom Dockerfile via the `spark.kubernetes.test.dockerFile` property when we run the K8s integration tests using Maven. We can also run the integration tests via `dev-run-integration-tests.sh`, but there is no way to specify a custom Dockerfile there.

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

Confirmed that the K8s integration tests run with the following commands using `Dockerfile.java17`.

```
cd resource-managers/kubernetes/integration-tests
./dev/dev-run-integration-tests.sh --docker-file ../docker/src/main/dockerfiles/spark/Dockerfile.java17
```

Closes #34818 from sarutak/kube-integration-test-java17-2.

Authored-by: Kousuke Saruta <sarutak@oss.nttdata.com>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
What changes were proposed in this pull request?
This PR aims to support K8s integration tests for Java 17 using Maven and SBT.
The new system property `spark.kubernetes.test.dockerFile` is introduced to specify a Dockerfile. By setting `Dockerfile.java17` to the property, the integration tests run with Java 17.

This PR also revised the change brought by SPARK-37354 (#34628) by changing `SparkBuild.scala` so that it can recognize the system property `spark.kubernetes.test.javaImageTag`, like the integration tests with Maven do.

If both `spark.kubernetes.test.dockerFile` and `spark.kubernetes.test.javaImageTag` are set, `spark.kubernetes.test.dockerFile` is preferred.

Why are the changes needed?
To ensure Spark works on K8s with Java 17.
Does this PR introduce any user-facing change?
No.
How was this patch tested?
Confirmed that the intended version of Java is used for each pattern.
- Neither `spark.kubernetes.test.javaImageTag` nor `spark.kubernetes.test.dockerFile` is set (SBT)
- Neither `spark.kubernetes.test.javaImageTag` nor `spark.kubernetes.test.dockerFile` is set (Maven)
- `spark.kubernetes.test.javaImageTag` is set (SBT)
- `spark.kubernetes.test.javaImageTag` is set (Maven)
- `spark.kubernetes.test.dockerFile` is set (SBT)
- `spark.kubernetes.test.dockerFile` is set (Maven)
- Both `spark.kubernetes.test.javaImageTag` and `spark.kubernetes.test.dockerFile` are set (SBT)
- Both `spark.kubernetes.test.javaImageTag` and `spark.kubernetes.test.dockerFile` are set (Maven)
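As an illustration of the precedence rule for the last two cases, the invocation below is a sketch based on the SBT command shown earlier in this thread, with both properties set. The `17-jre` value for `spark.kubernetes.test.javaImageTag` is an assumed, illustrative tag and is not taken from this PR; with both properties present, the image should be built from `Dockerfile.java17`, since `spark.kubernetes.test.dockerFile` is preferred.

```
# Sketch: both properties set; spark.kubernetes.test.dockerFile should win.
# "17-jre" is an illustrative tag, not a value verified in this PR.
build/sbt \
  -Dspark.kubernetes.test.javaImageTag=17-jre \
  -Dspark.kubernetes.test.dockerFile=resource-managers/kubernetes/docker/src/main/dockerfiles/spark/Dockerfile.java17 \
  -Pkubernetes -Pkubernetes-integration-tests \
  package "kubernetes-integration-tests/test"
```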