[SPARK-17456][CORE] Utility for parsing Spark versions #15017
Conversation
CC: @yanboliang @MLnick Does one of you have time to review this? It should be very quick. Thanks!
Would it make sense to replace some of the hacks with this util at the same time?
Test build #65115 has finished for PR 15017 at commit
I like splitting things into separate PRs, but I'll go ahead and make a follow-up task to check for places which can be fixed.
Sounds good :)
LGTM, merged into master. Thanks!
We can replace the hacks with the new utility in SPARK-17462. Thanks!
Thanks @yanboliang!
Actually, I'd like to backport this to 2.0, since I'd like to put SPARK-16240 (https://issues.apache.org/jira/browse/SPARK-16240) into 2.0 as well. I'll do the backport.
## What changes were proposed in this pull request?

This patch adds methods for extracting major and minor versions as Int types in Scala from a Spark version string.

Motivation: There are many hacks within Spark's codebase to identify and compare Spark versions. We should add a simple utility to standardize these code paths, especially since there have been mistakes made in the past. This will let us add unit tests as well. Currently, I want this functionality to check Spark versions to provide backwards compatibility for ML model persistence.

## How was this patch tested?

Unit tests

Author: Joseph K. Bradley <joseph@databricks.com>

Closes #15017 from jkbradley/version-parsing.

(cherry picked from commit 65b814b)
Signed-off-by: Joseph K. Bradley <joseph@databricks.com>
OK, I did the backport. I checked out branch-2.0 and ran the test locally.
What changes were proposed in this pull request?
This patch adds methods for extracting major and minor versions as Int types in Scala from a Spark version string.
Motivation: There are many hacks within Spark's codebase to identify and compare Spark versions. We should add a simple utility to standardize these code paths, especially since there have been mistakes made in the past. This will let us add unit tests as well. Currently, I want this functionality to check Spark versions to provide backwards compatibility for ML model persistence.
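To make this concrete, here is a minimal sketch of what such a utility can look like in Scala. The object and method names (`VersionUtils`, `majorMinorVersion`, etc.) and the exact regex are illustrative assumptions; the authoritative implementation is in this PR's diff.

```scala
import scala.util.matching.Regex

// Minimal sketch (illustrative names; see the PR diff for the real implementation).
object VersionUtils {

  // Accepts "major.minor" optionally followed by ".<anything>", e.g. "2.0.1-SNAPSHOT".
  private val majorMinorRegex: Regex = """^(\d+)\.(\d+)(\..*)?$""".r

  // Returns (major, minor) as Ints, failing fast on malformed input so that
  // callers cannot silently mis-compare versions.
  def majorMinorVersion(sparkVersion: String): (Int, Int) = sparkVersion match {
    case majorMinorRegex(major, minor, _) => (major.toInt, minor.toInt)
    case _ => throw new IllegalArgumentException(
      s"Could not find major and minor version numbers in Spark version string '$sparkVersion'.")
  }

  // "2.0.1" => 2
  def majorVersion(sparkVersion: String): Int = majorMinorVersion(sparkVersion)._1

  // "2.0.1" => 0
  def minorVersion(sparkVersion: String): Int = majorMinorVersion(sparkVersion)._2
}
```

For example, `VersionUtils.majorMinorVersion("2.0.1-SNAPSHOT")` yields `(2, 0)`, which is the comparison key the ML persistence code needs for its backwards-compatibility checks.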
How was this patch tested?
Unit tests
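For a rough idea of the test shape, here is a hedged sketch assuming the utility sketched above; the suite name and test cases are illustrative, and the real tests ship with this PR.

```scala
import org.scalatest.FunSuite

// Hypothetical sketch of the kind of unit tests described above;
// the actual suite is part of this PR.
class VersionUtilsSuite extends FunSuite {

  test("parses release and snapshot version strings") {
    assert(VersionUtils.majorMinorVersion("2.0.1") === (2, 0))
    assert(VersionUtils.majorMinorVersion("2.0.1-SNAPSHOT") === (2, 0))
    assert(VersionUtils.majorVersion("2.0.1") === 2)
    assert(VersionUtils.minorVersion("2.0.1") === 0)
  }

  test("rejects malformed version strings") {
    intercept[IllegalArgumentException] {
      VersionUtils.majorMinorVersion("not-a-version")
    }
  }
}
```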