```diff
@@ -737,7 +737,7 @@ package object config {
         "application ends.")
       .version("3.3.0")
       .booleanConf
-      .createWithDefault(false)
+      .createWithDefault(true)
 
   private[spark] val SHUFFLE_SERVICE_FETCH_RDD_ENABLED =
     ConfigBuilder(Constants.SHUFFLE_SERVICE_FETCH_RDD_ENABLED)
```
```diff
@@ -936,6 +936,7 @@ class MapOutputTrackerSuite extends SparkFunSuite with LocalSparkContext {
     val newConf = new SparkConf
     newConf.set("spark.shuffle.push.enabled", "true")
     newConf.set("spark.shuffle.service.enabled", "true")
+    newConf.set("spark.shuffle.service.removeShuffle", "false")
     newConf.set(SERIALIZER, "org.apache.spark.serializer.KryoSerializer")
     newConf.set(IS_TESTING, true)
```
docs/configuration.md (1 addition, 1 deletion)

```diff
@@ -1152,7 +1152,7 @@ Apart from these, the following properties are also available, and may be useful
 </tr>
 <tr>
   <td><code>spark.shuffle.service.removeShuffle</code></td>
-  <td>false</td>
+  <td>true</td>
   <td>
     Whether to use the ExternalShuffleService for deleting shuffle blocks for
     deallocated executors when the shuffle is no longer needed. Without this enabled,
```
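The documented property is an ordinary Spark configuration key, so it can be pinned explicitly rather than relying on the default. A minimal sketch, assuming a Spark build that includes this change (the key and `ExternalShuffleService` behavior come from the diff above; the surrounding setup is illustrative):

```scala
import org.apache.spark.SparkConf

// Shuffle-block removal via the external shuffle service only takes effect
// when the service itself is enabled; pinning both keys makes the intent
// explicit regardless of which default the running Spark version ships with.
val conf = new SparkConf()
  .setAppName("shuffle-cleanup-example")
  .set("spark.shuffle.service.enabled", "true")
  .set("spark.shuffle.service.removeShuffle", "true")
```

Setting the value explicitly also insulates jobs from the default flip when clusters run mixed Spark versions.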
docs/core-migration-guide.md (2 additions, 0 deletions)

```diff
@@ -40,6 +40,8 @@ license: |
 
 - Since Spark 4.0, Spark uses `~/.ivy2.5.2` as Ivy user directory by default to isolate the existing systems from Apache Ivy's incompatibility. To restore the legacy behavior, you can set `spark.jars.ivy` to `~/.ivy2`.
 
+- Since Spark 4.0, Spark uses the external shuffle service for deleting shuffle blocks for deallocated executors when the shuffle is no longer needed. To restore the legacy behavior, you can set `spark.shuffle.service.removeShuffle` to `false`.
+
 ## Upgrading from Core 3.4 to 3.5
 
 - Since Spark 3.5, `spark.yarn.executor.failuresValidityInterval` is deprecated. Use `spark.executor.failuresValidityInterval` instead.
```
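For users who depend on the pre-4.0 behavior (shuffle files retained by the external shuffle service until the application ends), the migration note above amounts to a one-line opt-out. A minimal sketch, assuming Spark 4.0 or later (only the configuration key is taken from this change; the rest of the setup is illustrative):

```scala
import org.apache.spark.SparkConf

// Restore the legacy behavior: the external shuffle service will not delete
// shuffle blocks for deallocated executors until the application ends.
val legacyConf = new SparkConf()
  .set("spark.shuffle.service.removeShuffle", "false")
```

The same opt-out can be passed at submit time, e.g. `--conf spark.shuffle.service.removeShuffle=false`, without changing application code.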