Conversation

@jkbradley
Member

What changes were proposed in this pull request?

This patch adds utility methods for extracting the major and minor versions, as Scala Ints, from a Spark version string.

Motivation: There are many hacks within Spark's codebase for identifying and comparing Spark versions. We should add a simple utility to standardize these code paths, especially since mistakes have been made in the past, and a shared utility lets us add unit tests. Right now, I want this functionality so that ML model persistence can check Spark versions for backwards compatibility.

How was this patch tested?

Unit tests
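
For readers skimming this thread, here is a minimal sketch of the kind of utility described above. The object and method names (`VersionUtils`, `majorMinorVersion`, and so on) are illustrative assumptions, not necessarily the names in the merged patch:

```scala
import scala.util.matching.Regex

// Illustrative sketch only: parses a Spark version string such as
// "2.0.1" or "2.1.0-SNAPSHOT" into integer (major, minor) components.
object VersionUtils {
  private val majorMinorRegex: Regex = """^(\d+)\.(\d+)(\..*)?$""".r

  /** Returns the (major, minor) version pair, e.g. "2.0.1" => (2, 0). */
  def majorMinorVersion(sparkVersion: String): (Int, Int) = sparkVersion match {
    case majorMinorRegex(major, minor, _) =>
      (major.toInt, minor.toInt)
    case _ =>
      throw new IllegalArgumentException(
        s"Could not parse Spark version string '$sparkVersion' into major/minor versions.")
  }

  /** Returns only the major version, e.g. "2.0.1" => 2. */
  def majorVersion(sparkVersion: String): Int = majorMinorVersion(sparkVersion)._1

  /** Returns only the minor version, e.g. "2.0.1" => 0. */
  def minorVersion(sparkVersion: String): Int = majorMinorVersion(sparkVersion)._2
}
```

Under this sketch, `majorMinorVersion("2.0.1-SNAPSHOT")` would yield `(2, 0)`, so callers can compare versions numerically instead of doing ad-hoc string handling at each call site.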

@jkbradley
Member Author

CC: @yanboliang @MLnick Does one of you have time to review this? It should be very quick. Thanks!

@holdenk
Contributor

holdenk commented Sep 8, 2016

Would it make sense to replace some of the hacks with this util at the same time?

@SparkQA

SparkQA commented Sep 8, 2016

Test build #65115 has finished for PR 15017 at commit 8978cb3.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@jkbradley
Member Author

I like splitting things into separate PRs, so I'll make a follow-up task to check for places that can be fixed.

@holdenk
Contributor

holdenk commented Sep 9, 2016

Sounds good :)

@yanboliang
Contributor

LGTM, merged into master. Thanks!

@asfgit closed this in 65b814b Sep 9, 2016
@yanboliang
Contributor

We can replace the hacks with the new utility in SPARK-17462. Thanks!
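
To illustrate the kind of replacement SPARK-17462 is tracking, here is a hypothetical before/after. The `VersionUtils` names follow the sketch earlier in this thread and are assumptions, not actual call sites from the codebase:

```scala
// Hypothetical before/after for a call site that needs the major version.
// `versionString` stands in for something like SparkContext.version.

// Before: ad-hoc string splitting, duplicated across call sites and easy
// to get wrong for strings with suffixes such as "2.0-SNAPSHOT".
def majorVersionOld(versionString: String): Int =
  versionString.split("\\.")(0).toInt

// After: one shared, unit-tested code path (names as sketched earlier).
def majorVersionNew(versionString: String): Int =
  VersionUtils.majorVersion(versionString)
```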

@jkbradley
Member Author

Thanks @yanboliang !

@jkbradley deleted the version-parsing branch September 9, 2016 17:40
@jkbradley restored the version-parsing branch September 9, 2016 17:40
@jkbradley
Member Author

Actually, I'd like to backport this to 2.0, since I want to put SPARK-16240 (https://issues.apache.org/jira/browse/SPARK-16240) into 2.0 as well. I'll do the backport.

asfgit pushed a commit that referenced this pull request Sep 9, 2016
Author: Joseph K. Bradley <joseph@databricks.com>

Closes #15017 from jkbradley/version-parsing.

(cherry picked from commit 65b814b)
Signed-off-by: Joseph K. Bradley <joseph@databricks.com>
@jkbradley
Member Author

OK I did the backport. I checked out branch-2.0 and ran the test locally.

@jkbradley deleted the version-parsing branch September 9, 2016 17:49