This repository has been archived by the owner on Nov 16, 2019. It is now read-only.

Upgraded to Spark 2.0. Old Version. Closed in favor of #79 #77

Closed · wants to merge 2 commits

Conversation

@javadba (Contributor) commented Jun 8, 2016

… compatible.

@junshi15 (Collaborator) commented Jun 8, 2016

Thanks for your contribution. I have a few comments/questions.

  1. Spark 2.0 is still in "preview". Can we defer this PR until it is officially released?

  2. What do you mean by "backward compatibility to Spark 1.X"? Your pom.xml specifies Spark 2.0, so the project will be built against Spark 2.0. Do you mean the compiled jar can be taken as-is and launched with Spark 1.6.1, which is the latest release?

  3. Why do you need a new Logger? I notice you copied it from org.apache.spark.internal. Does the existing Logger in CaffeOnSpark not meet your needs?

@javadba closed this Jun 8, 2016
@javadba (Contributor, Author) commented Jun 8, 2016

Superseded by #79

@javadba (Contributor, Author) commented Jun 8, 2016

@junshi15

I have created a completely new PR, #79, that makes the whole process simpler.

To answer your questions:

  1. My team is building only on top of Spark 2.0, so at the very least we need this now. My intention is to help out the overall project. My original PR recommended maintaining a separate 2.0-only branch; @mriduljain asked that it be made backwards compatible on the same branch, so the new PR addresses this.

  2. The new PR takes care of this.

  3. The Logger is required because one of the CaffeOnSpark classes improperly depends on Spark's internal Logging class, which is noted in 1.x as deprecated for external use and is in fact private in 2.x. Take a look at LmdbRDD's "with Logging". A minimal sketch of a standalone replacement is shown after this list.
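
For illustration, here is a minimal sketch (not the exact code in this PR) of a self-contained, SLF4J-backed Logging trait that classes such as LmdbRDD could mix in instead of Spark's internal one. The trait and method names (logInfo/logWarning/logError) mirror Spark's convention but are assumptions here, as is the LmdbReader class used to show usage.

```scala
// Sketch only: a standalone SLF4J-backed Logging trait, replacing the
// dependency on org.apache.spark.internal.Logging (private[spark] in 2.x).
// Trait and method names are illustrative, not the exact code in the PR.
import org.slf4j.{Logger, LoggerFactory}

trait Logging {
  // One logger per concrete class that mixes in the trait; created lazily
  // and excluded from serialization (RDDs are shipped to executors).
  @transient private lazy val log: Logger =
    LoggerFactory.getLogger(getClass.getName.stripSuffix("$"))

  protected def logInfo(msg: => String): Unit =
    if (log.isInfoEnabled) log.info(msg)

  protected def logWarning(msg: => String): Unit =
    if (log.isWarnEnabled) log.warn(msg)

  protected def logError(msg: => String, t: Throwable = null): Unit =
    if (log.isErrorEnabled) {
      if (t == null) log.error(msg) else log.error(msg, t)
    }
}

// Hypothetical usage: the class keeps "with Logging", but the trait now
// comes from CaffeOnSpark's own package rather than Spark internals.
class LmdbReader extends Serializable with Logging {
  def open(path: String): Unit = logInfo(s"opening LMDB at $path")
}
```

Backing the trait with SLF4J keeps log output routed through whatever logging binding the Spark distribution already ships, so behaviour should stay consistent across 1.x and 2.x builds.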

@mriduljain (Contributor) commented

This looks fine. I will test it out locally soon, and then we should be good. Meanwhile, you should be able to move ahead anyway.

@javadba changed the title from "Upgraded to Spark 2.0. Contains breaking changes - not backwards…" to "Upgraded to Spark 2.0. Now it is backwards compatible and defaults to spark 1.x" on Jun 11, 2016
@javadba changed the title from "Upgraded to Spark 2.0. Now it is backwards compatible and defaults to spark 1.x" to "Upgraded to Spark 2.0. Old Version. Closed in favor of #79" on Jun 11, 2016