
[SparkConnector] Add proper time column support for Spark connector segment writer #14556

Merged
4 commits merged into apache:master on Dec 3, 2024

Conversation

cbalci
Contributor

@cbalci cbalci commented Nov 27, 2024

Adding proper time column support to the Spark writer so that time information is correctly populated in segment metadata, which was previously missing.

The full API for configuring the time column when writing out segments is as follows:

    dataframe.write.format("pinot")
      .mode("append")
      .option("table", "airlineStats")
      .option("tableType", "OFFLINE")
      .option("segmentNameFormat", "{table}_{startTime}_{endTime}_{partitionId:03}")
      .option("timeColumnName", "ts")                  // <- column
      .option("timeFormat", "EPOCH|SECONDS")           // <- format
      .option("timeGranularity", "1:SECONDS")          // <- granularity
      .save("myPath")

This follows Pinot's time column configuration options, timeFormat and timeGranularity.
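To illustrate the shape of these option strings, here is a minimal sketch of how a granularity value such as "1:SECONDS" could be split into a size and a java.util.concurrent.TimeUnit. The parseGranularity helper is hypothetical, for illustration only, and is not the connector's actual parsing code:

```scala
import java.util.concurrent.TimeUnit

// Hypothetical helper: splits a Pinot-style "size:UNIT" granularity string
// into its numeric size and time unit. Illustrative sketch only; the
// connector's real parsing logic may differ.
def parseGranularity(granularity: String): (Int, TimeUnit) =
  granularity.split(":") match {
    case Array(size, unit) => (size.toInt, TimeUnit.valueOf(unit))
    case _ => throw new IllegalArgumentException(s"Bad granularity: $granularity")
  }

// parseGranularity("1:SECONDS") == (1, TimeUnit.SECONDS)
```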

I also added two new template variables, startTime and endTime, which can be used when building the segment name. This is useful for efficiently purging out-of-retention segments without having to open the segment metadata.
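A minimal sketch of how a segment name template using the new variables might expand. The SegmentNameTemplate helper and the example values are hypothetical, shown only to illustrate the substitution; the connector's actual implementation may differ:

```scala
// Hypothetical template expander for segment name formats like
// "{table}_{startTime}_{endTime}_{partitionId:03}". Illustrative only.
object SegmentNameTemplate {
  // Matches {name} or {name:0N} where N is a zero-pad width.
  private val Token = raw"\{(\w+)(?::0?(\d+))?\}".r

  def render(template: String, vars: Map[String, Any]): String =
    Token.replaceAllIn(template, m => {
      val value = vars(m.group(1)).toString
      Option(m.group(2)) match {
        case Some(width) => value.reverse.padTo(width.toInt, '0').reverse
        case None        => value
      }
    })
}

// Example values are made up:
SegmentNameTemplate.render(
  "{table}_{startTime}_{endTime}_{partitionId:03}",
  Map("table" -> "airlineStats", "startTime" -> 1638316800L,
      "endTime" -> 1638403199L, "partitionId" -> 7))
// "airlineStats_1638316800_1638403199_007"
```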

Testing
Updated unit/integration tests.
I also tested and validated this behavior in a pre-production environment.

Labels: feature, bugfix

@cbalci cbalci requested a review from ankitsultana November 27, 2024 21:12
@codecov-commenter

codecov-commenter commented Nov 27, 2024

Codecov Report

Attention: Patch coverage is 69.23077% with 4 lines in your changes missing coverage. Please review.

Project coverage is 63.96%. Comparing base (59551e4) to head (2663d3b).
Report is 1413 commits behind head on master.

Files with missing lines Patch % Lines
...tor/spark/common/PinotDataSourceWriteOptions.scala 69.23% 2 Missing and 2 partials ⚠️
Additional details and impacted files
@@             Coverage Diff              @@
##             master   #14556      +/-   ##
============================================
+ Coverage     61.75%   63.96%   +2.21%     
- Complexity      207     1570    +1363     
============================================
  Files          2436     2683     +247     
  Lines        133233   147348   +14115     
  Branches      20636    22590    +1954     
============================================
+ Hits          82274    94249   +11975     
- Misses        44911    46145    +1234     
- Partials       6048     6954     +906     
Flag Coverage Δ
custom-integration1 100.00% <ø> (+99.99%) ⬆️
integration 100.00% <ø> (+99.99%) ⬆️
integration1 100.00% <ø> (+99.99%) ⬆️
integration2 0.00% <ø> (ø)
java-11 63.92% <69.23%> (+2.21%) ⬆️
java-21 63.85% <69.23%> (+2.22%) ⬆️
skip-bytebuffers-false 63.95% <69.23%> (+2.21%) ⬆️
skip-bytebuffers-true 63.81% <69.23%> (+36.08%) ⬆️
temurin 63.96% <69.23%> (+2.21%) ⬆️
unittests 63.95% <69.23%> (+2.21%) ⬆️
unittests1 55.54% <ø> (+8.65%) ⬆️
unittests2 34.61% <69.23%> (+6.88%) ⬆️

Flags with carried forward coverage won't be shown.


@@ -94,6 +115,8 @@ class PinotDataWriter[InternalRow](
val variables = Map(
"partitionId" -> partitionId,
"table" -> tableName,
"startTime" -> startTime,
Contributor
nit: you can also update the Javadocs to show the new variables for segment name generation. But do call out that they only work for numeric time columns.

@ankitsultana ankitsultana merged commit d0d4419 into apache:master Dec 3, 2024
20 of 21 checks passed