Conversation

@pwendell
Contributor

There are two relevant 'skip' configurations in the build: the first
is for "mvn install" and the second is for "mvn deploy". As of 1.2,
we actually use "mvn install" to generate our deployed artifacts,
because we have some customization of the Nexus upload due to having
to cross-compile for Scala 2.10 and 2.11.

There is no reason to have different settings for these values;
this patch simply cleans this up for the repl/ and yarn/
projects.
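
For context, the two 'skip' settings mentioned above live in the plugin configuration of a module's pom.xml. This is a sketch of the usual shape; the element names follow the maven-install-plugin and maven-deploy-plugin documentation, but the exact layout in Spark's poms is assumed, not quoted:

```xml
<build>
  <plugins>
    <!-- Skips "mvn install" for this module -->
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-install-plugin</artifactId>
      <configuration>
        <skip>true</skip>
      </configuration>
    </plugin>
    <!-- Skips "mvn deploy" for this module -->
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-deploy-plugin</artifactId>
      <configuration>
        <skip>true</skip>
      </configuration>
    </plugin>
  </plugins>
</build>
```

The mismatch this patch fixes is when one of these is true and the other false for the same module.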

@pwendell
Contributor Author

@vanzin or @srowen - mind taking a quick look?

@SparkQA

SparkQA commented Jan 16, 2015

Test build #25683 has started for PR 4078 at commit 07ac835.

  • This patch merges cleanly.

@SparkQA

SparkQA commented Jan 16, 2015

Test build #25683 has finished for PR 4078 at commit 07ac835.

  • This patch fails Scala style tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
    • case class SparkListenerJobEnd(
    • class SparkILoop(
    • * @param id The id (variable name, method name, class name, etc) whose
    • * Retrieves the class representing the id (variable name, method name,
    • * @param id The id (variable name, method name, class name, etc) whose
    • * @return Some containing term name (id) class if exists, else None
    • * @param id The id (variable name, method name, class name, etc) whose
    • * @param id The id (variable name, method name, class name, etc) whose
    • * Retrieves the runtime class and type representing the id (variable name,
    • * @param id The id (variable name, method name, class name, etc) whose
    • * @param id The id (variable name, method name, class name, etc) whose

@AmplabJenkins

Test FAILed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/25683/

@srowen
Member

srowen commented Jan 16, 2015

So deploy is not really used for anything now? Then its config can be completely removed. There are more occurrences of this configuration in tools, java8-tests, examples, and assembly.

@pwendell
Contributor Author

I think it is likely still used by companies that publish Spark via the traditional route, either internally or for forks of Spark. For the upstream release publishing we do something a bit fancier, because the built-in Maven deploy plug-in does not nicely support cross-building.

@vanzin
Contributor

vanzin commented Jan 16, 2015

Hi @pwendell ,

I actually removed the skip from the install plugin in #2982. Sorry, I didn't realize that the build scripts used it to actually publish artifacts. But skipping install breaks some developer workflows - e.g., I'm pretty sure something like "mvn -pl :spark-assembly_2.10 package" breaks if you don't have all the needed artifacts installed in your local repo.

Maybe the build scripts can have a whitelist of things to publish, or the build can have a variable that controls whether the install plugin is enabled for these "internal" artifacts or not?
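
The second option above could look something like this sketch; the property name skipLocalInstall is hypothetical, not from the Spark build:

```xml
<!-- In the parent pom: a property that release scripts can flip. -->
<properties>
  <skipLocalInstall>false</skipLocalInstall>
</properties>

<!-- In the "internal" modules (repl/, yarn/, etc.): skip follows the property. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-install-plugin</artifactId>
  <configuration>
    <skip>${skipLocalInstall}</skip>
  </configuration>
</plugin>
```

Developers would get a normal "mvn install" by default, while the publish scripts could pass something like -DskipLocalInstall=true to suppress installing artifacts they don't want uploaded.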
