[SPARK-31786][K8S][BUILD] Upgrade kubernetes-client to 4.9.2
### What changes were proposed in this pull request?

This PR aims to upgrade the `kubernetes-client` library to bring in the JDK8-related fixes. Note that JDK11 works fine without any problem.
- https://github.com/fabric8io/kubernetes-client/releases/tag/v4.9.2
  - JDK8 always uses the http/1.1 protocol (prevents OkHttp from wrongly enabling http/2)

### Why are the changes needed?

OkHttp "wrongly" detects the Platform as `Jdk9Platform` on JDK 8u251 and newer, and therefore enables http/2:
- fabric8io/kubernetes-client#2212
- https://stackoverflow.com/questions/61565751/why-am-i-not-able-to-run-sparkpi-example-on-a-kubernetes-k8s-cluster

Although there are workarounds (`export HTTP2_DISABLE=true`, or downgrading the JDK or K8s), we had better avoid this problematic situation.
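For reference, a minimal sketch of carrying the environment-variable workaround into a Spark-on-K8s application; this is illustrative only and not part of this PR. It assumes the standard `spark.kubernetes.driverEnv.*` config, and the shell that runs `spark-submit` still needs `export HTTP2_DISABLE=true`, since the first Kubernetes client is created in the submission JVM:

```scala
import org.apache.spark.SparkConf

// Illustrative workaround only (not part of this PR). The fabric8 client
// honors the HTTP2_DISABLE environment variable, so it must reach every
// JVM that talks to the API server: the spark-submit process (export it
// in the shell) and the driver pod (set it via driverEnv).
val conf = new SparkConf()
  .set("spark.kubernetes.driverEnv.HTTP2_DISABLE", "true")
```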

### Does this PR introduce _any_ user-facing change?

No. This will recover the failures on JDK 8u252.

### How was this patch tested?

- [x] Pass the Jenkins UT (#28601 (comment))
- [x] Pass the Jenkins K8S IT with the K8s 1.13 (#28601 (comment))
- [x] Manual testing with K8s 1.17.3. (Below)

**v1.17.6 result (on Minikube)**
```
KubernetesSuite:
- Run SparkPi with no resources
- Run SparkPi with a very long application name.
- Use SparkLauncher.NO_RESOURCE
- Run SparkPi with a master URL without a scheme.
- Run SparkPi with an argument.
- Run SparkPi with custom labels, annotations, and environment variables.
- All pods have the same service account by default
- Run extraJVMOptions check on driver
- Run SparkRemoteFileTest using a remote data file
- Run SparkPi with env and mount secrets.
- Run PySpark on simple pi.py example
- Run PySpark with Python2 to test a pyfiles example
- Run PySpark with Python3 to test a pyfiles example
- Run PySpark with memory customization
- Run in client mode.
- Start pod creation from template
- PVs with local storage
- Launcher client dependencies
- Test basic decommissioning
Run completed in 8 minutes, 27 seconds.
Total number of tests run: 19
Suites: completed 2, aborted 0
Tests: succeeded 19, failed 0, canceled 0, ignored 0, pending 0
All tests passed.
```

Closes #28601 from dongjoon-hyun/SPARK-K8S-CLIENT.

Authored-by: Dongjoon Hyun <dongjoon@apache.org>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
(cherry picked from commit 64ffc66)
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
(cherry picked from commit f05a26a)
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
dongjoon-hyun committed May 24, 2020
1 parent 1d1a207 commit 8bbc46a
Showing 12 changed files with 47 additions and 48 deletions.
11 changes: 6 additions & 5 deletions dev/deps/spark-deps-hadoop-2.6
@@ -91,6 +91,7 @@ jackson-core-asl/1.9.13//jackson-core-asl-1.9.13.jar
 jackson-core/2.6.7//jackson-core-2.6.7.jar
 jackson-databind/2.6.7.3//jackson-databind-2.6.7.3.jar
 jackson-dataformat-yaml/2.6.7//jackson-dataformat-yaml-2.6.7.jar
+jackson-datatype-jsr310/2.10.3//jackson-datatype-jsr310-2.10.3.jar
 jackson-jaxrs/1.9.13//jackson-jaxrs-1.9.13.jar
 jackson-mapper-asl/1.9.13//jackson-mapper-asl-1.9.13.jar
 jackson-module-jaxb-annotations/2.6.7//jackson-module-jaxb-annotations-2.6.7.jar
@@ -130,14 +131,14 @@ jta/1.1//jta-1.1.jar
 jtransforms/2.4.0//jtransforms-2.4.0.jar
 jul-to-slf4j/1.7.16//jul-to-slf4j-1.7.16.jar
 kryo-shaded/4.0.2//kryo-shaded-4.0.2.jar
-kubernetes-client/4.6.1//kubernetes-client-4.6.1.jar
-kubernetes-model-common/4.6.1//kubernetes-model-common-4.6.1.jar
-kubernetes-model/4.6.1//kubernetes-model-4.6.1.jar
+kubernetes-client/4.9.2//kubernetes-client-4.9.2.jar
+kubernetes-model-common/4.9.2//kubernetes-model-common-4.9.2.jar
+kubernetes-model/4.9.2//kubernetes-model-4.9.2.jar
 leveldbjni-all/1.8//leveldbjni-all-1.8.jar
 libfb303/0.9.3//libfb303-0.9.3.jar
 libthrift/0.9.3//libthrift-0.9.3.jar
 log4j/1.2.17//log4j-1.2.17.jar
-logging-interceptor/3.12.0//logging-interceptor-3.12.0.jar
+logging-interceptor/3.12.6//logging-interceptor-3.12.6.jar
 lz4-java/1.4.0//lz4-java-1.4.0.jar
 machinist_2.11/0.6.1//machinist_2.11-0.6.1.jar
 macro-compat_2.11/1.1.1//macro-compat_2.11-1.1.1.jar
@@ -150,7 +151,7 @@ minlog/1.3.0//minlog-1.3.0.jar
 netty-all/4.1.47.Final//netty-all-4.1.47.Final.jar
 netty/3.9.9.Final//netty-3.9.9.Final.jar
 objenesis/2.5.1//objenesis-2.5.1.jar
-okhttp/3.12.0//okhttp-3.12.0.jar
+okhttp/3.12.6//okhttp-3.12.6.jar
 okio/1.15.0//okio-1.15.0.jar
 opencsv/2.3//opencsv-2.3.jar
 orc-core/1.5.5/nohive/orc-core-1.5.5-nohive.jar
11 changes: 6 additions & 5 deletions dev/deps/spark-deps-hadoop-2.7
@@ -91,6 +91,7 @@ jackson-core-asl/1.9.13//jackson-core-asl-1.9.13.jar
 jackson-core/2.6.7//jackson-core-2.6.7.jar
 jackson-databind/2.6.7.3//jackson-databind-2.6.7.3.jar
 jackson-dataformat-yaml/2.6.7//jackson-dataformat-yaml-2.6.7.jar
+jackson-datatype-jsr310/2.10.3//jackson-datatype-jsr310-2.10.3.jar
 jackson-jaxrs/1.9.13//jackson-jaxrs-1.9.13.jar
 jackson-mapper-asl/1.9.13//jackson-mapper-asl-1.9.13.jar
 jackson-module-jaxb-annotations/2.6.7//jackson-module-jaxb-annotations-2.6.7.jar
@@ -131,14 +132,14 @@ jta/1.1//jta-1.1.jar
 jtransforms/2.4.0//jtransforms-2.4.0.jar
 jul-to-slf4j/1.7.16//jul-to-slf4j-1.7.16.jar
 kryo-shaded/4.0.2//kryo-shaded-4.0.2.jar
-kubernetes-client/4.6.1//kubernetes-client-4.6.1.jar
-kubernetes-model-common/4.6.1//kubernetes-model-common-4.6.1.jar
-kubernetes-model/4.6.1//kubernetes-model-4.6.1.jar
+kubernetes-client/4.9.2//kubernetes-client-4.9.2.jar
+kubernetes-model-common/4.9.2//kubernetes-model-common-4.9.2.jar
+kubernetes-model/4.9.2//kubernetes-model-4.9.2.jar
 leveldbjni-all/1.8//leveldbjni-all-1.8.jar
 libfb303/0.9.3//libfb303-0.9.3.jar
 libthrift/0.9.3//libthrift-0.9.3.jar
 log4j/1.2.17//log4j-1.2.17.jar
-logging-interceptor/3.12.0//logging-interceptor-3.12.0.jar
+logging-interceptor/3.12.6//logging-interceptor-3.12.6.jar
 lz4-java/1.4.0//lz4-java-1.4.0.jar
 machinist_2.11/0.6.1//machinist_2.11-0.6.1.jar
 macro-compat_2.11/1.1.1//macro-compat_2.11-1.1.1.jar
@@ -151,7 +152,7 @@ minlog/1.3.0//minlog-1.3.0.jar
 netty-all/4.1.47.Final//netty-all-4.1.47.Final.jar
 netty/3.9.9.Final//netty-3.9.9.Final.jar
 objenesis/2.5.1//objenesis-2.5.1.jar
-okhttp/3.12.0//okhttp-3.12.0.jar
+okhttp/3.12.6//okhttp-3.12.6.jar
 okio/1.15.0//okio-1.15.0.jar
 opencsv/2.3//opencsv-2.3.jar
 orc-core/1.5.5/nohive/orc-core-1.5.5-nohive.jar
11 changes: 6 additions & 5 deletions dev/deps/spark-deps-hadoop-3.1
@@ -91,6 +91,7 @@ jackson-core-asl/1.9.13//jackson-core-asl-1.9.13.jar
 jackson-core/2.6.7//jackson-core-2.6.7.jar
 jackson-databind/2.6.7.3//jackson-databind-2.6.7.3.jar
 jackson-dataformat-yaml/2.6.7//jackson-dataformat-yaml-2.6.7.jar
+jackson-datatype-jsr310/2.10.3//jackson-datatype-jsr310-2.10.3.jar
 jackson-jaxrs-base/2.7.8//jackson-jaxrs-base-2.7.8.jar
 jackson-jaxrs-json-provider/2.7.8//jackson-jaxrs-json-provider-2.7.8.jar
 jackson-mapper-asl/1.9.13//jackson-mapper-asl-1.9.13.jar
@@ -147,14 +148,14 @@ kerby-pkix/1.0.1//kerby-pkix-1.0.1.jar
 kerby-util/1.0.1//kerby-util-1.0.1.jar
 kerby-xdr/1.0.1//kerby-xdr-1.0.1.jar
 kryo-shaded/4.0.2//kryo-shaded-4.0.2.jar
-kubernetes-client/4.6.1//kubernetes-client-4.6.1.jar
-kubernetes-model-common/4.6.1//kubernetes-model-common-4.6.1.jar
-kubernetes-model/4.6.1//kubernetes-model-4.6.1.jar
+kubernetes-client/4.9.2//kubernetes-client-4.9.2.jar
+kubernetes-model-common/4.9.2//kubernetes-model-common-4.9.2.jar
+kubernetes-model/4.9.2//kubernetes-model-4.9.2.jar
 leveldbjni-all/1.8//leveldbjni-all-1.8.jar
 libfb303/0.9.3//libfb303-0.9.3.jar
 libthrift/0.9.3//libthrift-0.9.3.jar
 log4j/1.2.17//log4j-1.2.17.jar
-logging-interceptor/3.12.0//logging-interceptor-3.12.0.jar
+logging-interceptor/3.12.6//logging-interceptor-3.12.6.jar
 lz4-java/1.4.0//lz4-java-1.4.0.jar
 machinist_2.11/0.6.1//machinist_2.11-0.6.1.jar
 macro-compat_2.11/1.1.1//macro-compat_2.11-1.1.1.jar
@@ -170,7 +171,7 @@ netty/3.9.9.Final//netty-3.9.9.Final.jar
 nimbus-jose-jwt/4.41.1//nimbus-jose-jwt-4.41.1.jar
 objenesis/2.5.1//objenesis-2.5.1.jar
 okhttp/2.7.5//okhttp-2.7.5.jar
-okhttp/3.12.0//okhttp-3.12.0.jar
+okhttp/3.12.6//okhttp-3.12.6.jar
 okio/1.15.0//okio-1.15.0.jar
 opencsv/2.3//opencsv-2.3.jar
 orc-core/1.5.5/nohive/orc-core-1.5.5-nohive.jar
3 changes: 2 additions & 1 deletion resource-managers/kubernetes/core/pom.xml
@@ -29,7 +29,8 @@
   <name>Spark Project Kubernetes</name>
   <properties>
     <sbt.project.name>kubernetes</sbt.project.name>
-    <kubernetes.client.version>4.6.1</kubernetes.client.version>
+    <!-- Note: Please update the kubernetes client version in kubernetes/integration-tests/pom.xml -->
+    <kubernetes.client.version>4.9.2</kubernetes.client.version>
   </properties>

   <dependencies>
10 changes: 3 additions & 7 deletions resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/features/BasicDriverFeatureStep.scala
@@ -63,14 +63,10 @@ private[spark] class BasicDriverFeatureStep(
         .build()
     }

-    val driverCpuQuantity = new QuantityBuilder(false)
-      .withAmount(driverCpuCores)
-      .build()
-    val driverMemoryQuantity = new QuantityBuilder(false)
-      .withAmount(s"${driverMemoryWithOverheadMiB}Mi")
-      .build()
+    val driverCpuQuantity = new Quantity(driverCpuCores)
+    val driverMemoryQuantity = new Quantity(s"${driverMemoryWithOverheadMiB}Mi")
     val maybeCpuLimitQuantity = driverLimitCores.map { limitCores =>
-      ("cpu", new QuantityBuilder(false).withAmount(limitCores).build())
+      ("cpu", new Quantity(limitCores))
     }

     val driverPort = conf.sparkConf.getInt("spark.driver.port", DEFAULT_DRIVER_PORT)
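Both feature steps move from `QuantityBuilder(false).withAmount(...)` to the plain `Quantity` constructor because, between kubernetes-client 4.6.1 and 4.9.2, `Quantity` started parsing the unit suffix out of its input. A minimal sketch of the behavioral difference, inferred from the test updates later in this commit:

```scala
import io.fabric8.kubernetes.api.model.Quantity

object QuantitySketch extends App {
  // In 4.6.x, "456Mi" was stored as one opaque string, so getAmount
  // returned "456Mi". In 4.9.x the one-argument constructor splits the
  // value into a numeric amount and a format (unit) suffix.
  val memory = new Quantity("456Mi")
  assert(memory.getAmount == "456") // numeric part only
  assert(memory.getFormat == "Mi")  // unit suffix is now a separate field

  // A bare number has an empty format, which is why the updated suites
  // compare getAmount + getFormat against the old combined strings.
  val cpu = new Quantity("2")
  assert(cpu.getAmount + cpu.getFormat == "2")
}
```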
12 changes: 3 additions & 9 deletions resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/features/BasicExecutorFeatureStep.scala
@@ -85,12 +85,8 @@ private[spark] class BasicExecutorFeatureStep(
     // name as the hostname. This preserves uniqueness since the end of name contains
     // executorId
     val hostname = name.substring(Math.max(0, name.length - 63))
-    val executorMemoryQuantity = new QuantityBuilder(false)
-      .withAmount(s"${executorMemoryTotal}Mi")
-      .build()
-    val executorCpuQuantity = new QuantityBuilder(false)
-      .withAmount(executorCoresRequest)
-      .build()
+    val executorMemoryQuantity = new Quantity(s"${executorMemoryTotal}Mi")
+    val executorCpuQuantity = new Quantity(executorCoresRequest)
     val executorExtraClasspathEnv = executorExtraClasspath.map { cp =>
       new EnvVarBuilder()
         .withName(ENV_CLASSPATH)
@@ -152,9 +148,7 @@ private[spark] class BasicExecutorFeatureStep(
       .addToArgs("executor")
       .build()
     val containerWithLimitCores = executorLimitCores.map { limitCores =>
-      val executorCpuLimitQuantity = new QuantityBuilder(false)
-        .withAmount(limitCores)
-        .build()
+      val executorCpuLimitQuantity = new Quantity(limitCores)
       new ContainerBuilder(executorContainer)
         .editResources()
           .addToLimits("cpu", executorCpuLimitQuantity)
2 changes: 1 addition & 1 deletion resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/features/MountVolumesFeatureStep.scala
@@ -70,7 +70,7 @@ private[spark] class MountVolumesFeatureStep(
         new VolumeBuilder()
           .withEmptyDir(
             new EmptyDirVolumeSource(medium.getOrElse(""),
-              new Quantity(sizeLimit.orNull)))
+              sizeLimit.map(new Quantity(_)).orNull))
       }

     val volume = volumeBuilder.withName(spec.volumeName).build()
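This call site needs more than the mechanical builder swap: with the parsing constructor, "no size limit" is now represented by a null `Quantity` rather than a `Quantity` wrapping a null amount, which is what the updated `MountVolumesFeatureStepSuite` later in this commit asserts. A small sketch of the intent (the helper name is illustrative, not Spark's API):

```scala
import io.fabric8.kubernetes.api.model.Quantity

object SizeLimitSketch extends App {
  // Illustrative helper: map an optional emptyDir size limit to a
  // Quantity the way the updated step does.
  def toSizeLimit(sizeLimit: Option[String]): Quantity =
    sizeLimit.map(new Quantity(_)).orNull

  assert(toSizeLimit(None) == null)                // no limit -> no Quantity
  assert(toSizeLimit(Some("6G")).getAmount == "6") // amount and unit split
  assert(toSizeLimit(Some("6G")).getFormat == "G")
}
```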
12 changes: 7 additions & 5 deletions resource-managers/kubernetes/core/src/test/scala/org/apache/spark/deploy/k8s/features/BasicDriverFeatureStepSuite.scala
@@ -18,7 +18,7 @@ package org.apache.spark.deploy.k8s.features

 import scala.collection.JavaConverters._

-import io.fabric8.kubernetes.api.model.{ContainerPort, ContainerPortBuilder, LocalObjectReferenceBuilder}
+import io.fabric8.kubernetes.api.model.{ContainerPort, ContainerPortBuilder, LocalObjectReferenceBuilder, Quantity}

 import org.apache.spark.{SparkConf, SparkFunSuite}
 import org.apache.spark.deploy.k8s.{KubernetesConf, KubernetesDriverSpecificConf, SparkPod}
@@ -114,11 +114,11 @@ class BasicDriverFeatureStepSuite extends SparkFunSuite {

     val resourceRequirements = configuredPod.container.getResources
     val requests = resourceRequirements.getRequests.asScala
-    assert(requests("cpu").getAmount === "2")
-    assert(requests("memory").getAmount === "456Mi")
+    assert(amountAndFormat(requests("cpu")) === "2")
+    assert(amountAndFormat(requests("memory")) === "456Mi")
     val limits = resourceRequirements.getLimits.asScala
-    assert(limits("memory").getAmount === "456Mi")
-    assert(limits("cpu").getAmount === "4")
+    assert(amountAndFormat(limits("memory")) === "456Mi")
+    assert(amountAndFormat(limits("cpu")) === "4")

     val driverPodMetadata = configuredPod.pod.getMetadata
     assert(driverPodMetadata.getName === "spark-driver-pod")
@@ -216,4 +216,6 @@
       .withContainerPort(portNumber)
       .withProtocol("TCP")
       .build()
+
+  private def amountAndFormat(quantity: Quantity): String = quantity.getAmount + quantity.getFormat
 }
8 changes: 5 additions & 3 deletions resource-managers/kubernetes/core/src/test/scala/org/apache/spark/deploy/k8s/features/BasicExecutorFeatureStepSuite.scala
@@ -104,8 +104,8 @@ class BasicExecutorFeatureStepSuite
     assert(executor.container.getImage === EXECUTOR_IMAGE)
     assert(executor.container.getVolumeMounts.isEmpty)
     assert(executor.container.getResources.getLimits.size() === 1)
-    assert(executor.container.getResources
-      .getLimits.get("memory").getAmount === "1408Mi")
+    assert(amountAndFormat(executor.container.getResources
+      .getLimits.get("memory")) === "1408Mi")

     // The pod has no node selector, volumes.
     assert(executor.pod.getSpec.getNodeSelector.isEmpty)
@@ -182,7 +182,7 @@
         Seq.empty[String]))
     val executor = step.configurePod(SparkPod.initialPod())
     // This is checking that basic executor + executorMemory = 1408 + 42 = 1450
-    assert(executor.container.getResources.getRequests.get("memory").getAmount === "1450Mi")
+    assert(amountAndFormat(executor.container.getResources.getRequests.get("memory")) === "1450Mi")
   }

   // There is always exactly one controller reference, and it points to the driver pod.
@@ -209,4 +209,6 @@
     }.toMap
     assert(defaultEnvs === mapEnvs)
   }
+
+  private def amountAndFormat(quantity: Quantity): String = quantity.getAmount + quantity.getFormat
 }
5 changes: 3 additions & 2 deletions resource-managers/kubernetes/core/src/test/scala/org/apache/spark/deploy/k8s/features/MountVolumesFeatureStepSuite.scala
@@ -92,7 +92,8 @@ class MountVolumesFeatureStepSuite extends SparkFunSuite {
     assert(configuredPod.pod.getSpec.getVolumes.size() === 1)
     val emptyDir = configuredPod.pod.getSpec.getVolumes.get(0).getEmptyDir
     assert(emptyDir.getMedium === "Memory")
-    assert(emptyDir.getSizeLimit.getAmount === "6G")
+    assert(emptyDir.getSizeLimit.getAmount === "6")
+    assert(emptyDir.getSizeLimit.getFormat === "G")
     assert(configuredPod.container.getVolumeMounts.size() === 1)
     assert(configuredPod.container.getVolumeMounts.get(0).getMountPath === "/tmp")
     assert(configuredPod.container.getVolumeMounts.get(0).getName === "testVolume")
@@ -113,7 +114,7 @@ class MountVolumesFeatureStepSuite extends SparkFunSuite {
     assert(configuredPod.pod.getSpec.getVolumes.size() === 1)
     val emptyDir = configuredPod.pod.getSpec.getVolumes.get(0).getEmptyDir
     assert(emptyDir.getMedium === "")
-    assert(emptyDir.getSizeLimit.getAmount === null)
+    assert(emptyDir.getSizeLimit === null)
     assert(configuredPod.container.getVolumeMounts.size() === 1)
     assert(configuredPod.container.getVolumeMounts.get(0).getMountPath === "/tmp")
     assert(configuredPod.container.getVolumeMounts.get(0).getName === "testVolume")
2 changes: 1 addition & 1 deletion resource-managers/kubernetes/integration-tests/pom.xml
@@ -29,7 +29,7 @@
     <download-maven-plugin.version>1.3.0</download-maven-plugin.version>
     <exec-maven-plugin.version>1.4.0</exec-maven-plugin.version>
     <extraScalaTestArgs></extraScalaTestArgs>
-    <kubernetes-client.version>4.6.1</kubernetes-client.version>
+    <kubernetes-client.version>4.9.2</kubernetes-client.version>
     <scala-maven-plugin.version>3.2.2</scala-maven-plugin.version>
     <scalatest-maven-plugin.version>1.0</scalatest-maven-plugin.version>
     <sbt.project.name>kubernetes-integration-tests</sbt.project.name>
8 changes: 4 additions & 4 deletions resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/KubernetesSuite.scala
@@ -56,15 +56,15 @@ private[spark] class KubernetesSuite extends SparkFunSuite
   protected var appLocator: String = _

   // Default memory limit is 1024M + 384M (minimum overhead constant)
-  private val baseMemory = s"${1024 + 384}Mi"
+  private val baseMemory = s"${1024 + 384}"
   protected val memOverheadConstant = 0.8
-  private val standardNonJVMMemory = s"${(1024 + 0.4*1024).toInt}Mi"
+  private val standardNonJVMMemory = s"${(1024 + 0.4*1024).toInt}"
   protected val additionalMemory = 200
   // 209715200 is 200Mi
   protected val additionalMemoryInBytes = 209715200
-  private val extraDriverTotalMemory = s"${(1024 + memOverheadConstant*1024).toInt}Mi"
+  private val extraDriverTotalMemory = s"${(1024 + memOverheadConstant*1024).toInt}"
   private val extraExecTotalMemory =
-    s"${(1024 + memOverheadConstant*1024 + additionalMemory).toInt}Mi"
+    s"${(1024 + memOverheadConstant*1024 + additionalMemory).toInt}"

   override def beforeAll(): Unit = {
     // The scalatest-maven-plugin gives system properties that are referenced but not set null
