18 changes: 5 additions & 13 deletions docs/configuration.md
@@ -161,14 +161,6 @@ Apart from these, the following properties are also available, and may be useful
#### Runtime Environment
<table class="table">
<tr><th>Property Name</th><th>Default</th><th>Meaning</th></tr>
<tr>
<td><code>spark.executor.memory</code></td>
<td>512m</td>
<td>
Amount of memory to use per executor process, in the same format as JVM memory strings
(e.g. <code>512m</code>, <code>2g</code>).
</td>
</tr>
<tr>
<td><code>spark.executor.extraJavaOptions</code></td>
<td>(none)</td>
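
Aside on the hunk above: the `spark.executor.memory` row is being dropped from this table, but the property itself can still be set programmatically, alongside `spark.executor.extraJavaOptions`. A minimal sketch, assuming a standard SparkConf/SparkContext setup; the app name and values are illustrative, not from the PR:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Illustrative only: set executor memory and extra JVM options
// before the SparkContext is created.
val conf = new SparkConf()
  .setAppName("executor-settings-example")          // placeholder app name
  .set("spark.executor.memory", "2g")               // JVM memory string, e.g. 512m or 2g
  .set("spark.executor.extraJavaOptions", "-XX:+PrintGCDetails -XX:+PrintGCTimeStamps")

val sc = new SparkContext(conf)
```
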
@@ -365,7 +357,7 @@ Apart from these, the following properties are also available, and may be useful
<td><code>spark.ui.port</code></td>
<td>4040</td>
<td>
Port for your application's dashboard, which shows memory and workload data
Port for your application's dashboard, which shows memory and workload data.
</td>
</tr>
<tr>
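
A hedged sketch of how `spark.ui.port` (the row above) is typically overridden when the default 4040 is already in use; the port value is arbitrary:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Illustrative only: move the application UI off the default port 4040.
val conf = new SparkConf()
  .setAppName("ui-port-example")   // placeholder app name
  .set("spark.ui.port", "4050")    // any free port; 4050 is arbitrary

val sc = new SparkContext(conf)
```
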
@@ -880,8 +872,8 @@ Apart from these, the following properties are also available, and may be useful
<td><code>spark.scheduler.revive.interval</code></td>
<td>1000</td>
<td>
The interval length for the scheduler to revive the worker resource offers to run tasks.
(in milliseconds)
The interval length for the scheduler to revive the worker resource offers to run tasks
(in milliseconds).
</td>
</tr>
</tr>
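
For the `spark.scheduler.revive.interval` row above, a small sketch of how the interval might be tuned; the value is an assumption, and it is interpreted in milliseconds per the table's description:

```scala
import org.apache.spark.SparkConf

// Illustrative: lengthen the revive interval from the default 1000 ms to 2000 ms.
// The value is given in milliseconds, matching the table's description.
val conf = new SparkConf()
  .set("spark.scheduler.revive.interval", "2000")
```
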
@@ -893,7 +885,7 @@ Apart from these, the following properties are also available, and may be useful
to wait for before scheduling begins. Specified as a double between 0 and 1.
Regardless of whether the minimum ratio of resources has been reached,
the maximum amount of time it will wait before scheduling begins is controlled by config
<code>spark.scheduler.maxRegisteredResourcesWaitingTime</code>
<code>spark.scheduler.maxRegisteredResourcesWaitingTime</code>.
</td>
</tr>
<tr>
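
The hunk above describes a ratio of registered resources to wait for before scheduling begins; the name of that property is cut off by the diff, so the ratio key used below is an assumption, while `spark.scheduler.maxRegisteredResourcesWaitingTime` comes from the text. A hedged sketch with illustrative values:

```scala
import org.apache.spark.SparkConf

// Illustrative values only. The ratio property name is assumed (the row's
// name is cut off in the hunk); the waiting-time cap is the one named in
// the description and is given here in milliseconds.
val conf = new SparkConf()
  .set("spark.scheduler.minRegisteredResourcesRatio", "0.8")          // assumed property name
  .set("spark.scheduler.maxRegisteredResourcesWaitingTime", "30000")  // wait at most 30 s
```
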
@@ -962,7 +954,7 @@ Apart from these, the following properties are also available, and may be useful
<code>spark.&lt;class name of filter&gt;.params='param1=value1,param2=value2'</code><br />
For example: <br />
<code>-Dspark.ui.filters=com.test.filter1</code> <br />
<code>-Dspark.com.test.filter1.params='param1=foo,param2=testing'</code>
<code>-Dspark.com.test.filter1.params='param1=foo,param2=testing'</code>.
Contributor: There shouldn't be a period here, but I'll fix this when I merge it.

</td>
</tr>
<tr>
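
Finally, the `spark.ui.filters` example in the last hunk uses `-D` system properties; the same configuration can also be expressed through SparkConf. A hedged sketch reusing the filter class and parameters from the doc text (`com.test.filter1` is the documentation's placeholder, not a real class):

```scala
import org.apache.spark.SparkConf

// Mirrors the -Dspark.ui.filters example from the doc in SparkConf form.
// com.test.filter1 is the documentation's placeholder filter class.
val conf = new SparkConf()
  .set("spark.ui.filters", "com.test.filter1")
  .set("spark.com.test.filter1.params", "param1=foo,param2=testing")
```
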