feat: Record Arrow FFI metrics #1128
base: main
Conversation
@@ -88,6 +90,7 @@ impl ScanExec {
     ) -> Result<Self, CometError> {
         let metrics_set = ExecutionPlanMetricsSet::default();
         let baseline_metrics = BaselineMetrics::new(&metrics_set, 0);
+        let arrow_ffi_time = MetricBuilder::new(&metrics_set).subset_time("arrow_ffi_time", 0);
I see now that this isn't just FFI time. It is the cost of calling CometBatchIterator.next(), so it includes the cost of that method fetching the next input batch as well as the FFI export cost ...
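The conflation described above can be sketched with a minimal, std-only example. The helper names (`produce_input_batch`, `export_via_ffi`, `next_batch`) are hypothetical stand-ins, not Comet APIs; the point is only that a single timer around the whole call measures both costs together:

```rust
use std::time::{Duration, Instant};

// Hypothetical stand-ins for what happens inside CometBatchIterator.next():
// producing the next input batch (upstream work) and exporting it over FFI
// occur inside the same call.
fn produce_input_batch() -> u64 {
    std::thread::sleep(Duration::from_millis(20)); // simulated upstream batch cost
    1
}

fn export_via_ffi(batch: u64) -> u64 {
    std::thread::sleep(Duration::from_millis(5)); // simulated FFI export cost
    batch + 1
}

fn next_batch() -> u64 {
    export_via_ffi(produce_input_batch())
}

// Times the entire next() call, the way a single subset_time-style metric would.
fn timed_next_batch() -> (u64, u128) {
    let start = Instant::now();
    let batch = next_batch();
    (batch, start.elapsed().as_millis())
}

fn main() {
    let (batch, ms) = timed_next_batch();
    // elapsed covers roughly all 25 ms of work, not just the ~5 ms FFI portion
    println!("batch={batch} elapsed_ms={ms}");
}
```

A metric that isolated only the FFI export would need the timer moved inside the call, around the export step alone.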
Moving this to draft for now while I think about this more
LGTM, thanks @andygrove.
One thing I'd like to mention: are you planning to keep this permanently, or to enable these internal metrics based on some Spark config key, so that resources are spent on metrics only when they are really needed?
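The gating suggested above could look roughly like the following std-only sketch. The `FfiMetrics` struct and the idea of wiring `enabled` to a config key are hypothetical illustrations, not Comet's actual implementation; when the flag is off, no clock reads happen at all:

```rust
use std::time::Instant;

// Hypothetical metric holder; `enabled` would be driven by a Spark config key.
struct FfiMetrics {
    enabled: bool,
    total_nanos: u64,
}

impl FfiMetrics {
    fn new(enabled: bool) -> Self {
        Self { enabled, total_nanos: 0 }
    }

    // Runs `f`, accumulating its wall-clock time only when metrics are enabled.
    fn time<T>(&mut self, f: impl FnOnce() -> T) -> T {
        if !self.enabled {
            return f(); // fast path: no Instant::now() calls at all
        }
        let start = Instant::now();
        let out = f();
        self.total_nanos += start.elapsed().as_nanos() as u64;
        out
    }
}

fn main() {
    let mut off = FfiMetrics::new(false);
    let value = off.time(|| 21 * 2);
    println!("value={value} recorded_nanos={}", off.total_nanos);
}
```

With the flag off, the only overhead is a branch per call, which is one way to keep always-on instrumentation cheap.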
Which issue does this PR close?
N/A
Rationale for this change
This is a subset of #1111, separated out to make reviews easier.
What changes are included in this PR?
Record time spent performing Arrow FFI to transfer batches between JVM and Rust code.
Note that these timings won't be fully exposed to Spark UI until we merge #1111.
How are these changes tested?