Bump Spark version to 1.4 #752
Comments
It seems so.
See #751 for a change that needs to be included.
Correct.
So, I realized something awkward about this that maybe others have processed but that I hadn't yet: there are two Spark versions in play, the one we compile against and the one used at runtime. The former is only relevant to the Spark classes we link against, which AFAIK have not changed in a way we care about. The latter informs which scripts (e.g. Spark's launch scripts) we should expect to work.
Yep, the compile-time dependency shouldn't matter, unless it does. :) E.g. if a binary incompatibility in 1.x versions of Spark or one of its transitive dependencies slips through. Is #754 backward compatible script-wise with previous versions of Spark? A few simple examples I tried worked for me on Spark 1.3.1. It would be best if we didn't require it.
Good point about transitive dependencies. If the net result of #754 ends up being that we don't need it, even better.
The transitive dependency thing is both nasty and common, unfortunately...
Would it be a fair assumption that those should be on the user's PATH?
Probably depends on the user. It never is for me, because I often use different versions of Spark. One option would be to check the user's PATH first and fall back to a configured Spark install.
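That check-then-fall-back idea can be sketched in shell. The helper name is hypothetical, and using `SPARK_HOME` as the fallback location is an assumption (though it is Spark's conventional environment variable):

```shell
# Resolve a usable spark-submit: prefer one on $PATH, otherwise fall back
# to $SPARK_HOME/bin. resolve_spark_submit is a hypothetical helper name,
# not part of any existing project script.
resolve_spark_submit() {
  if command -v spark-submit >/dev/null 2>&1; then
    command -v spark-submit
  elif [ -n "$SPARK_HOME" ] && [ -x "$SPARK_HOME/bin/spark-submit" ]; then
    echo "$SPARK_HOME/bin/spark-submit"
  else
    echo "spark-submit not found on PATH and SPARK_HOME is not set" >&2
    return 1
  fi
}
```

A wrapper script could call this once and then exec the resolved binary with the user's arguments.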
+1
Ah yea, I don't have those on my PATH. Checking both sgtm2. Is this done now that #754 is in? Should we bump the POM version? I can file a separate issue for that if necessary; I was just noting while doing README refactoring in #763 / #764 that we say we continuous-build against a specific Spark version.
Well, the POM is at 1.4.1 now, right? How about we move CI to 1.4.1 as well? That seems like the simplest solution. I'll prep a PR.
Nope! 1.2.0. Unless I am really out of it.
You are correct, never mind! I am OK with having the Jenkins sanity test script exercise multiple Spark versions.
Having Jenkins test a matrix of Sparks sgtm @fnothaft. I was going to just bump the Spark version in the POM via GitHub's web-edit-file flow, but then I remembered your warnings about transitive deps above. Do you have some system for evaluating the danger of such an upgrade?
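One low-tech way to evaluate such an upgrade (a sketch, not an existing project tool) is to diff the resolved dependency trees at the old and new versions. This assumes the POM exposes a `spark.version` property; the commands are printed rather than run here:

```shell
# Sketch of a cheap upgrade-risk check: resolve the full dependency tree at
# the old and new Spark versions and diff the results. Dry run: the Maven
# commands are printed, not invoked.
print_upgrade_check() {
  old="$1"; new="$2"
  echo "mvn -q dependency:tree -Dspark.version=$old -DoutputFile=deps-$old.txt"
  echo "mvn -q dependency:tree -Dspark.version=$new -DoutputFile=deps-$new.txt"
  echo "diff deps-$old.txt deps-$new.txt"
}
print_upgrade_check 1.2.0 1.4.1
```

Any artifact whose resolved version changes in the diff is a candidate for the runtime binary-incompatibility problems discussed above.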
Do we want to matrix-test Spark at both the build and the executable level? I am OK with either.
"Build level": basically run the Maven build against each Spark version. Other random question: now that SPARK-8057 is in, will we be able to support Hadoop 1 again in newer Spark releases?
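The build-level half of such a matrix might look like this (a sketch; the `spark.version` property name and the version list are assumptions about the POM and CI setup). It prints the Maven commands rather than invoking them:

```shell
# Sketch of a build-level Spark test matrix: run the Maven build once per
# Spark version by overriding a spark.version POM property. Dry run: prints
# the commands instead of invoking Maven.
build_matrix() {
  for v in "$@"; do
    echo "mvn clean test -Dspark.version=$v"
  done
}
build_matrix 1.2.0 1.3.1 1.4.1
```

The executable level would then take the same list of versions but launch the jobs through each Spark distribution's own submit scripts instead of through Maven.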
Exactly!
+1
Right, some of the potential binary incompatibility issues with transitive dependencies won't show up at build time, and there is a possibility that the classpath in test scope could differ from the runtime classpath.
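One build-time guard that surfaces part of this is Maven's enforcer plugin with its `dependencyConvergence` rule, which fails the build when two dependency paths resolve different versions of the same artifact. A sketch of the plugin configuration (this is not necessarily something the project uses):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-enforcer-plugin</artifactId>
  <executions>
    <execution>
      <id>enforce-convergence</id>
      <goals><goal>enforce</goal></goals>
      <configuration>
        <rules>
          <!-- Fail the build when two paths pull different versions
               of the same transitive dependency. -->
          <dependencyConvergence/>
        </rules>
      </configuration>
    </execution>
  </executions>
</plugin>
```

It only catches version divergence inside one resolved tree, not incompatibilities between the build-time tree and whatever Spark ships at runtime, so it complements rather than replaces runtime testing.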
If I have everything right: #750 has been closed.
That looks right, I just closed #659.
Closed by 7e8eb05.
This issue will track any progress necessary for that.
See #750.
See #659.