From 7ba8b769fe8e3fcb0c9a72fea91d5ef25fc9981b Mon Sep 17 00:00:00 2001
From: yanghua
Date: Mon, 27 Feb 2023 11:18:59 +0800
Subject: [PATCH] [Addon #603] Add Apache Spark as an experimental addon

Signed-off-by: yanghua
---
 .../spark-kubernetes-operator/README.md | 47 ++++++++++++-------
 1 file changed, 29 insertions(+), 18 deletions(-)

diff --git a/experimental/addons/spark-kubernetes-operator/README.md b/experimental/addons/spark-kubernetes-operator/README.md
index 1ddc395cb..38f70700c 100644
--- a/experimental/addons/spark-kubernetes-operator/README.md
+++ b/experimental/addons/spark-kubernetes-operator/README.md
@@ -32,25 +32,36 @@ vela ls -A | grep spark
 * Secondly, show the component type `spark-cluster`, so we know how to use it in one application. As a spark user, you can choose the parameter to set for your spark cluster.
 
 ```
-vela show spark-cluster
+vela show spark-application
 # Specification
-# Specification
-+---------------------+-------------------------------------------------------------------------------+--------+----------+---------+
-| NAME                | DESCRIPTION                                                                   | TYPE   | REQUIRED | DEFAULT |
-+---------------------+-------------------------------------------------------------------------------+--------+----------+---------+
-| name                | Specify the spark application name.                                           | string | true     |         |
-| namespace           | Specify the namespace for spark application to install.                      | string | true     |         |
-| type                | Specify the application language type, e.g. "Scala", "Python", "Java" or "R". | string | true     |         |
-| pythonVersion       | Specify the python version.                                                   | string | true     |         |
-| mode                | Specify the deploy mode, e.go "cluster", "client" or "in-cluster-client".    | string | true     |         |
-| image               | Specify the container image for the driver, executor, and init-container.    | string | true     |         |
-| imagePullPolicy     | Specify the image pull policy for the driver, executor, and init-container.  | string | true     |         |
-| mainClass           | Specify the fully-qualified main class of the Spark application.             | string | true     |         |
-| mainApplicationFile | Specify the path to a bundled JAR, Python, or R file of the application.     | string | true     |         |
-| sparkVersion        | Specify the version of Spark the application uses.                           | string | true     |         |
-| driverCores         | Specify the number of CPU cores to request for the driver pod.               | int    | true     |         |
-| executorCores       | Specify the number of CPU cores to request for the executor pod.             | int    | true     |         |
-+---------------------+-------------------------------------------------------------------------------+--------+----------+---------+
++---------------------+------------------------------------------------------------------------------------------------------+-------------------+----------+---------+
+| NAME                | DESCRIPTION                                                                                          | TYPE              | REQUIRED | DEFAULT |
++---------------------+------------------------------------------------------------------------------------------------------+-------------------+----------+---------+
+| name                | Specify the Spark application name.                                                                  | string            | true     |         |
+| namespace           | Specify the namespace in which to install the Spark application.                                     | string            | true     |         |
+| type                | Specify the application language type, e.g. "Scala", "Python", "Java" or "R".                        | string            | true     |         |
+| pythonVersion       | Specify the Python version.                                                                          | string            | false    |         |
+| mode                | Specify the deploy mode, e.g. "cluster", "client" or "in-cluster-client".                            | string            | true     |         |
+| image               | Specify the container image for the driver, executor, and init-container.                            | string            | true     |         |
+| imagePullPolicy     | Specify the image pull policy for the driver, executor, and init-container.                          | string            | true     |         |
+| mainClass           | Specify the fully-qualified main class of the Spark application.                                     | string            | true     |         |
+| mainApplicationFile | Specify the path to a bundled JAR, Python, or R file of the application.                             | string            | true     |         |
+| sparkVersion        | Specify the version of Spark the application uses.                                                   | string            | true     |         |
+| driverCores         | Specify the number of CPU cores to request for the driver pod.                                       | int               | true     |         |
+| executorCores       | Specify the number of CPU cores to request for the executor pod.                                     | int               | true     |         |
+| arguments           | Specify a list of arguments to be passed to the application.                                         | []string          | false    |         |
+| sparkConf           | Specify the config information that carries user-specified Spark configuration properties as they   | map[string]string | false    |         |
+|                     | would use the "--conf" option in spark-submit.                                                       |                   |          |         |
+| hadoopConf          | Specify the config information that carries user-specified Hadoop configuration properties as they  | map[string]string | false    |         |
+|                     | would use the "--conf" option in spark-submit. The SparkApplication controller automatically adds   |                   |          |         |
+|                     | the prefix "spark.hadoop." to Hadoop configuration properties.                                       |                   |          |         |
+| sparkConfigMap      | Specify the name of the ConfigMap containing Spark configuration files such as log4j.properties.    | string            | false    |         |
+|                     | The controller will set the environment variable SPARK_CONF_DIR to the path where the ConfigMap is  |                   |          |         |
+|                     | mounted.                                                                                             |                   |          |         |
+| hadoopConfigMap     | Specify the name of the ConfigMap containing Hadoop configuration files such as core-site.xml.      | string            | false    |         |
+|                     | The controller will set the environment variable HADOOP_CONF_DIR to the path where the ConfigMap is |                   |          |         |
+|                     | mounted.                                                                                             |                   |          |         |
++---------------------+------------------------------------------------------------------------------------------------------+-------------------+----------+---------+
 ```
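
As a usage sketch to go with the parameter table above, the following Application manifest shows how these fields might be wired into a `spark-application` component. It is illustrative only: the component type name is taken from the `vela show spark-application` command in this patch, and the image, main class, JAR path, and Spark version are placeholder values in the style of the upstream spark-on-k8s-operator examples, not values defined by this addon.

```yaml
# Hypothetical example of using the spark-application component type.
# Property names follow the parameter table above; all values are illustrative.
apiVersion: core.oam.dev/v1beta1
kind: Application
metadata:
  name: spark-pi-example
  namespace: vela-system
spec:
  components:
    - name: spark-pi
      type: spark-application
      properties:
        name: spark-pi
        namespace: default                 # namespace where the Spark application is created
        type: Scala
        mode: cluster
        image: gcr.io/spark-operator/spark:v3.1.1   # placeholder image
        imagePullPolicy: Always
        mainClass: org.apache.spark.examples.SparkPi
        mainApplicationFile: local:///opt/spark/examples/jars/spark-examples_2.12-3.1.1.jar
        sparkVersion: 3.1.1
        driverCores: 1
        executorCores: 1
        arguments: ["1000"]                # optional list of application arguments
        sparkConf:                         # optional "--conf"-style properties
          spark.eventLog.enabled: "false"
```

If saved as `spark-pi.yaml`, the manifest can then be deployed with the usual `vela up -f spark-pi.yaml`; the optional map-valued fields such as `sparkConf` and `hadoopConf` take plain key/value pairs, as shown.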