core/src/main/scala/org/apache/spark/deploy/master/ui/ApplicationPage.scala

@@ -24,7 +24,7 @@ import scala.xml.Node
 import org.apache.spark.deploy.DeployMessages.{MasterStateResponse, RequestMasterState}
 import org.apache.spark.deploy.ExecutorState
 import org.apache.spark.deploy.master.ExecutorDesc
-import org.apache.spark.ui.{UIUtils, WebUIPage}
+import org.apache.spark.ui.{ToolTips, UIUtils, WebUIPage}
 import org.apache.spark.util.Utils

 private[ui] class ApplicationPage(parent: MasterWebUI) extends WebUIPage("app") {
@@ -69,6 +69,16 @@ private[ui] class ApplicationPage(parent: MasterWebUI) extends WebUIPage("app")
       }
     }
   </li>
+  <li>
+    <span data-toggle="tooltip" title={ToolTips.APPLICATION_EXECUTOR_LIMIT}
+          data-placement="right">
+      <strong>Executor Limit: </strong>
+      {
+        if (app.executorLimit == Int.MaxValue) "Unlimited" else app.executorLimit
Inline review thread on the executorLimit line:

Contributor:
I noticed that ApplicationDescription has a maxCores field which is displayed on the Master UI's applications page. Do you think that there's any possibility for confusion between the limits displayed there (which are measured in terms of cores) and the executor limits shown here?

Member Author:
Agreed. Actually, there is another confusing difference between executor limit and maxCores: maxCores is fixed for an application, while executorLimit is mutable. I'm not sure how to clarify the difference just in the UI.

Contributor:
I think for now it's OK. Conceptually the number of executors is constrained by the lower of the two: the executor limit and the cores limit (when translated into executors).

+      }
+      ({app.executors.size} granted)
+    </span>
+  </li>
   <li>
     <strong>Executor Memory:</strong>
     {Utils.megabytesToString(app.desc.memoryPerExecutorMB)}
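
To make the reviewer's point concrete, here is a minimal, hedged sketch of how the two caps interact, assuming a min-based combination; effectiveExecutorCap, maxCores, and coresPerExecutor are illustrative names, not the Master's actual scheduling code:

// Illustrative only: how a mutable executor limit and a fixed cores limit jointly
// cap the number of executors an application can hold. Names are hypothetical.
object ExecutorCapSketch {
  def effectiveExecutorCap(
      executorLimit: Int,       // mutable; adjusted by dynamic allocation requests
      maxCores: Option[Int],    // fixed per application (e.g. spark.cores.max), if set
      coresPerExecutor: Int): Int = {
    val coresCap = maxCores.map(_ / coresPerExecutor).getOrElse(Int.MaxValue)
    math.min(executorLimit, coresCap)
  }

  def main(args: Array[String]): Unit = {
    // Executor limit 8, 20 cores at 4 cores per executor => cores allow only 5 executors.
    println(effectiveExecutorCap(executorLimit = 8, maxCores = Some(20), coresPerExecutor = 4))  // 5
    // No cores limit and an unlimited executor limit => Int.MaxValue, shown as "Unlimited".
    println(effectiveExecutorCap(Int.MaxValue, None, coresPerExecutor = 1))  // 2147483647
  }
}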
core/src/main/scala/org/apache/spark/ui/ToolTips.scala (6 additions, 0 deletions)
@@ -90,4 +90,10 @@ private[spark] object ToolTips {

   val TASK_TIME =
     "Shaded red when garbage collection (GC) time is over 10% of task time"
+
+  val APPLICATION_EXECUTOR_LIMIT =
+    """Maximum number of executors that this application will use. This limit is finite only when
+       dynamic allocation is enabled. The number of granted executors may exceed the limit
+       ephemerally when executors are being killed.
+    """
 }
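
As a rough illustration of what the new tooltip describes, here is a hedged sketch of the label the application page would render; formatExecutorLimit is a hypothetical helper, not part of this patch:

// Hypothetical helper mirroring the rendering added in ApplicationPage.scala above:
// Int.MaxValue means no effective limit (dynamic allocation disabled), and the granted
// count may briefly exceed a finite limit while executors are still being killed.
object ExecutorLimitLabelSketch {
  def formatExecutorLimit(executorLimit: Int, grantedExecutors: Int): String = {
    val limitText = if (executorLimit == Int.MaxValue) "Unlimited" else executorLimit.toString
    s"Executor Limit: $limitText ($grantedExecutors granted)"
  }

  def main(args: Array[String]): Unit = {
    println(formatExecutorLimit(Int.MaxValue, 4))  // Executor Limit: Unlimited (4 granted)
    println(formatExecutorLimit(10, 12))           // limit lowered to 10; 12 still granted until the kills complete
  }
}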