diff --git a/docs/configuration.md b/docs/configuration.md
index bd67144007e9..d2fdef02df9e 100644
--- a/docs/configuration.md
+++ b/docs/configuration.md
@@ -59,6 +59,7 @@ The following format is accepted:
     1p or 1pb (pebibytes = 1024 tebibytes)
 
 ## Dynamically Loading Spark Properties
+
 In some cases, you may want to avoid hard-coding certain configurations in a `SparkConf`. For
 instance, if you'd like to run the same application with different masters or different amounts of
 memory. Spark allows you to simply create an empty conf:
@@ -106,7 +107,8 @@ line will appear. For all other configuration properties, you can assume the def
 Most of the properties that control internal settings have reasonable default values. Some
 of the most common options to set are:
 
-#### Application Properties
+### Application Properties
+
 <table class="table">
 <tr><th>Property Name</th><th>Default</th><th>Meaning</th></tr>
 <tr>
@@ -215,7 +217,8 @@ of the most common options to set are:
 
 Apart from these, the following properties are also available, and may be useful in some situations:
 
-#### Runtime Environment
+### Runtime Environment
+
 <table class="table">
 <tr><th>Property Name</th><th>Default</th><th>Meaning</th></tr>
 <tr>
@@ -471,7 +474,8 @@ Apart from these, the following properties are also available, and may be useful
 </tr>
 </table>
 
-#### Shuffle Behavior
+### Shuffle Behavior
+
 <table class="table">
 <tr><th>Property Name</th><th>Default</th><th>Meaning</th></tr>
 <tr>
@@ -612,7 +616,8 @@ Apart from these, the following properties are also available, and may be useful
 </tr>
 </table>
 
-#### Spark UI
+### Spark UI
+
 <table class="table">
 <tr><th>Property Name</th><th>Default</th><th>Meaning</th></tr>
 <tr>
@@ -745,7 +750,8 @@ Apart from these, the following properties are also available, and may be useful
 </tr>
 </table>
 
-#### Compression and Serialization
+### Compression and Serialization
+
 <table class="table">
 <tr><th>Property Name</th><th>Default</th><th>Meaning</th></tr>
 <tr>
@@ -891,7 +897,8 @@ Apart from these, the following properties are also available, and may be useful
 </tr>
 </table>
 
-#### Memory Management
+### Memory Management
+
 <table class="table">
 <tr><th>Property Name</th><th>Default</th><th>Meaning</th></tr>
 <tr>
@@ -981,7 +988,8 @@ Apart from these, the following properties are also available, and may be useful
 </tr>
 </table>
 
-#### Execution Behavior
+### Execution Behavior
+
 <table class="table">
 <tr><th>Property Name</th><th>Default</th><th>Meaning</th></tr>
 <tr>
@@ -1108,7 +1116,8 @@ Apart from these, the following properties are also available, and may be useful
 </tr>
 </table>
 
-#### Networking
+### Networking
+
 <table class="table">
 <tr><th>Property Name</th><th>Default</th><th>Meaning</th></tr>
 <tr>
@@ -1139,13 +1148,13 @@ Apart from these, the following properties are also available, and may be useful
 <tr>
   <td><code>spark.driver.bindAddress</code></td>
   <td>(value of spark.driver.host)</td>
   <td>
-    Hostname or IP address where to bind listening sockets. This config overrides the SPARK_LOCAL_IP
-    environment variable (see below).
-    <br /><br />
-    It also allows a different address from the local one to be advertised to executors or external systems.
+    Hostname or IP address where to bind listening sockets. This config overrides the SPARK_LOCAL_IP
+    environment variable (see below).
+
+    It also allows a different address from the local one to be advertised to executors or external systems.
     This is useful, for example, when running containers with bridged networking. For this to properly work,
     the different ports used by the driver (RPC, block manager and UI) need to be forwarded from the
-    container's host.
+    container's host.
   </td>
 </tr>
@@ -1217,7 +1226,8 @@ Apart from these, the following properties are also available, and may be useful
 </tr>
 </table>
 
-#### Scheduling
+### Scheduling
+
 <table class="table">
 <tr><th>Property Name</th><th>Default</th><th>Meaning</th></tr>
 <tr>
@@ -1467,7 +1477,8 @@ Apart from these, the following properties are also available, and may be useful
 </tr>
 </table>
 
-#### Dynamic Allocation
+### Dynamic Allocation
+
 <table class="table">
 <tr><th>Property Name</th><th>Default</th><th>Meaning</th></tr>
 <tr>
@@ -1548,7 +1559,8 @@ Apart from these, the following properties are also available, and may be useful
 </tr>
 </table>
 
-#### Security
+### Security
+
 <table class="table">
 <tr><th>Property Name</th><th>Default</th><th>Meaning</th></tr>
 <tr>
@@ -1729,7 +1741,7 @@ Apart from these, the following properties are also available, and may be useful
 </tr>
 </table>
 
-#### TLS / SSL
+### TLS / SSL
 
 <table class="table">
 <tr><th>Property Name</th><th>Default</th><th>Meaning</th></tr>
@@ -1737,21 +1749,21 @@ Apart from these, the following properties are also available, and may be useful
 </tr>
 <tr>
   <td><code>spark.ssl.enabled</code></td>
   <td>false</td>
   <td>
-    Whether to enable SSL connections on all supported protocols.
-    <br /><br />
-    When <code>spark.ssl.enabled</code> is configured, <code>spark.ssl.protocol</code>
-    is required.
-    <br /><br />
-    All the SSL settings like <code>spark.ssl.xxx</code> where <code>xxx</code> is a
-    particular configuration property, denote the global configuration for all the supported protocols. In order to override the global configuration for the particular protocol,
-    the properties must be overwritten in the protocol-specific namespace.
-    <br /><br />
-    Use <code>spark.ssl.YYY.XXX</code> settings to overwrite the global configuration for
-    particular protocol denoted by <code>YYY</code>. Example values for <code>YYY</code> include <code>fs</code>, <code>ui</code>, <code>standalone</code>, and <code>historyServer</code>. See <a href="security.html#ssl-configuration">SSL
-    Configuration</a> for details on hierarchical SSL configuration for services.
+    Whether to enable SSL connections on all supported protocols.
+
+    When <code>spark.ssl.enabled</code> is configured, <code>spark.ssl.protocol</code>
+    is required.
+
+    All the SSL settings like <code>spark.ssl.xxx</code> where <code>xxx</code> is a
+    particular configuration property, denote the global configuration for all the supported protocols. In order to override the global configuration for the particular protocol,
+    the properties must be overwritten in the protocol-specific namespace.
+
+    Use <code>spark.ssl.YYY.XXX</code> settings to overwrite the global configuration for
+    particular protocol denoted by <code>YYY</code>. Example values for <code>YYY</code> include <code>fs</code>, <code>ui</code>, <code>standalone</code>, and <code>historyServer</code>. See <a href="security.html#ssl-configuration">SSL
+    Configuration</a> for details on hierarchical SSL configuration for services.
   </td>
 </tr>
 <tr>
@@ -1835,7 +1847,8 @@ Apart from these, the following properties are also available, and may be useful
 </tr>
 </table>
 
-#### Spark SQL
+### Spark SQL
+
 Running the `SET -v` command will show the entire list of the SQL configuration.
 
 <div class="codetabs">
@@ -1877,7 +1890,8 @@ showDF(properties, numRows = 200, truncate = FALSE)
 </div>
 </div>
 
-#### Spark Streaming
+### Spark Streaming
+
 <table class="table">
 <tr><th>Property Name</th><th>Default</th><th>Meaning</th></tr>
 <tr>
@@ -1998,7 +2012,8 @@ showDF(properties, numRows = 200, truncate = FALSE)
 </tr>
 </table>
 
-#### SparkR
+### SparkR
+
 <table class="table">
 <tr><th>Property Name</th><th>Default</th><th>Meaning</th></tr>
 <tr>
@@ -2047,7 +2062,7 @@ showDF(properties, numRows = 200, truncate = FALSE)
 </tr>
 </table>
 
-#### Deploy
+### Deploy
 
 <table class="table">
 <tr><th>Property Name</th><th>Default</th><th>Meaning</th></tr>
@@ -2070,15 +2085,16 @@ showDF(properties, numRows = 200, truncate = FALSE)
 </tr>
 </table>
 
-#### Cluster Managers
+### Cluster Managers
+
 Each cluster manager in Spark has additional configuration options. Configurations
 can be found on the pages for each mode:
 
-##### [YARN](running-on-yarn.html#configuration)
+#### [YARN](running-on-yarn.html#configuration)
 
-##### [Mesos](running-on-mesos.html#configuration)
+#### [Mesos](running-on-mesos.html#configuration)
 
-##### [Standalone Mode](spark-standalone.html#cluster-launch-scripts)
+#### [Standalone Mode](spark-standalone.html#cluster-launch-scripts)
 
 # Environment Variables
 