@@ -25,6 +25,10 @@
* and combine the metrics at the driver side. How to combine task metrics is defined by the
* metric class with the same metric name.
*
 * When Spark needs to aggregate task metrics, it internally constructs an instance of the
 * custom metric class defined in the data source using reflection. Spark therefore requires
 * any class implementing this interface to have a 0-arg constructor.
*
* @since 3.2.0
*/
@Evolving
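A minimal sketch of what this contract implies. The `CustomMetric` stand-in below is a simplified copy of the interface for illustration (the real one lives in `org.apache.spark.sql.connector.metric`), and `BytesReadMetric` is a hypothetical implementation, not part of Spark:

```java
import java.util.Arrays;

public class CustomMetricSketch {
    // Simplified stand-in for Spark's CustomMetric interface.
    interface CustomMetric {
        String name();
        String description();
        String aggregateTaskMetrics(long[] taskMetrics);
    }

    // Hypothetical metric. Spark instantiates the class by reflection on the
    // driver, so the (implicit) 0-arg constructor is mandatory.
    public static class BytesReadMetric implements CustomMetric {
        @Override public String name() { return "bytesRead"; }
        @Override public String description() { return "total bytes read"; }
        @Override public String aggregateTaskMetrics(long[] taskMetrics) {
            // Combine per-task values at the driver side; here, a simple sum.
            return String.valueOf(Arrays.stream(taskMetrics).sum());
        }
    }

    public static void main(String[] args) throws Exception {
        // Mimic what the driver does: look up the class by name and call
        // the 0-arg constructor via reflection.
        CustomMetric m = (CustomMetric) Class
            .forName("CustomMetricSketch$BytesReadMetric")
            .getDeclaredConstructor()
            .newInstance();
        System.out.println(m.aggregateTaskMetrics(new long[] {10L, 20L, 12L}));
    }
}
```

A class with only non-0-arg constructors would make the reflective `getDeclaredConstructor().newInstance()` call above throw, which is exactly the failure the listener change below guards against.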
@@ -220,7 +220,10 @@ class SQLAppStatusListener(
metricAggregationMap.put(className, method)
method
} catch {
case NonFatal(_) =>
case NonFatal(e) =>
logWarning(s"Unable to load custom metric object for class `$className`. " +
"Please make sure that the custom metric class is in the classpath and " +
"it has 0-arg constructor.", e)
Member

Utils#loadExtensions also allows "single-argument constructor that accepts SparkConf", do we need to mention that?

Member Author

I don't think we should. Actually, we don't require the implementing class to support that constructor.

// Cannot initialize the custom metric object; we might be in the history server,
// which does not have the custom metric class on its classpath.
val defaultMethod = (_: Array[Long], _: Array[Long]) => "N/A"
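The load-with-fallback pattern above can be sketched as follows. This is a simplified, hypothetical `loadAggregator` standing in for the listener's real lookup logic (the fallback function is reduced to a single argument, and a plain `catch (Exception)` stands in for Scala's `NonFatal`):

```java
import java.util.function.Function;

public class AggregatorFallback {
    // Try to instantiate the named metric class via its 0-arg constructor.
    // On any failure -- e.g. in the history server, where the data source's
    // metric class is not on the classpath -- log a warning and fall back to
    // a method that renders "N/A" instead of failing the listener.
    static Function<long[], String> loadAggregator(String className) {
        try {
            Object metric = Class.forName(className)
                .getDeclaredConstructor()
                .newInstance();
            // In Spark this would dispatch to metric.aggregateTaskMetrics(...).
            return values -> metric.getClass().getSimpleName() + " loaded";
        } catch (Exception e) {
            System.err.println("Unable to load custom metric object for class `"
                + className + "`. Please make sure that the custom metric class "
                + "is in the classpath and it has a 0-arg constructor.");
            return values -> "N/A";
        }
    }

    public static void main(String[] args) {
        // An unknown class degrades gracefully to "N/A".
        System.out.println(
            loadAggregator("com.example.NoSuchMetric").apply(new long[] {1L, 2L}));
    }
}
```

The key design point mirrored here is that a missing class is a recoverable condition: the listener keeps running and merely reports "N/A" for that metric.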