
Conversation

@vanzin
Contributor

@vanzin vanzin commented Jan 29, 2015

One side-effect of shading guava is that it disappears as a transitive
dependency. For Hadoop 2.x, this was masked by the fact that Hadoop
itself depends on guava. But certain versions of Hadoop 1.x also
shade guava, leaving either no guava or some random version pulled
by another dependency on the classpath.

So be explicit about the dependency in modules that use guava directly,
which is the right thing to do anyway.
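A minimal sketch of what such an explicit declaration might look like in a module's pom.xml (the exact scope and version are assumptions here; in Spark the version is managed by the parent pom):

```xml
<!-- Hypothetical sketch: declaring guava explicitly in a module that uses it
     directly, instead of relying on it arriving transitively through Hadoop.
     The version is assumed to be managed by the parent pom's
     dependencyManagement section. -->
<dependency>
  <groupId>com.google.guava</groupId>
  <artifactId>guava</artifactId>
</dependency>
```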

Marcelo Vanzin added 2 commits January 29, 2015 10:10
@pwendell
Contributor

Thanks for figuring this out @vanzin!

@SparkQA

SparkQA commented Jan 29, 2015

Test build #26326 has finished for PR 4272 at commit e3f30e5.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@janzhou

janzhou commented Jan 29, 2015

This patch works. Thanks @vanzin

@SparkQA

SparkQA commented Jan 29, 2015

Test build #26325 has finished for PR 4272 at commit d3b2c84.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@pwendell
Contributor

LGTM - pulling this in.

@asfgit asfgit closed this in f9e5694 Jan 29, 2015
@vanzin vanzin deleted the SPARK-5466 branch January 30, 2015 02:09
@tsingfu

tsingfu commented Apr 14, 2015

@vanzin, today I found that sql/hive-thriftserver also needs an explicit guava dependency. But in sql/hive-thriftserver/pom.xml the scope of com.google.guava:guava is runtime, which causes the same build error as in SPARK-5466.

My test: with the scope changed to compile, the build works fine.
vi sql/hive-thriftserver/pom.xml

    <dependency>
      <groupId>com.google.guava</groupId>
      <artifactId>guava</artifactId>
      <!-- <scope>runtime</scope> -->
    </dependency>

cc @pwendell

@WangTaoTheTonic
Contributor

@tsingfu Could you describe in more detail how you hit the build error? I got the same build error message when I added some code to the sql/hive-thriftserver module that invoked a method in Utils.class (in the core module).

I am not sure in which cases it occurs and in which it does not.

@vanzin I am no expert on compilation, so please share an explanation if you have more information about this. I would appreciate it a lot.

@tsingfu

tsingfu commented Apr 14, 2015

@WangTaoTheTonic Hi, today I got the error when I did a fresh clean build of the updated master branch as follows:

cd spark
//in master branch, update my local master from remote
git pull 
mvn clean
mvn generate-sources
mvn package -Dhadoop.version=2.5.0-cdh5.3.1  -DskipTests -Pyarn -Phive-0.13.1 -Phive-thriftserver -Pspark-ganglia-lgpl

then the error occurred.

@WangTaoTheTonic
Contributor

@tsingfu Okay, I tested it too and the error message is:

[INFO] compiler plugin: BasicArtifact(org.scalamacros,paradise_2.10.4,2.0.1,null)
[info] Compiling 8 Scala sources to /home/wangtao111/github/spark/sql/hive-thriftserver/target/scala-2.10/classes...
[error] bad symbolic reference. A signature in Utils.class refers to term util
[error] in package com.google.common which is not available.
[error] It may be completely missing from the current classpath, or the version on
[error] the classpath might be incompatible with the version used when compiling Utils.class.
[error]
[error]  while compiling: /home/wangtao111/github/spark/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkSQLEnv.scala
[error]     during phase: erasure
[error]  library version: version 2.10.4
[error] compiler version: version 2.10.4

The build command is: mvn -Pbigtop-dist -Phive-thriftserver -Phive -Pyarn -Phadoop-2.4 -Dhadoop.version=2.6.0 -DskipTests clean package

It seems SparkSQLEnv.scala uses Utils.class, and Utils.scala has import com.google.common.xxx. So shading guava in the core module can cause this kind of problem?

Update: #5507 is fixing this now.
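The "not available" complaint above is, at bottom, a classpath-visibility problem: after shading relocates Guava's classes, the original com.google.common names may no longer resolve. The general situation can be illustrated with a small runtime probe (a hypothetical sketch, not Spark code; the class and method names here are invented for illustration):

```java
// Hypothetical sketch: probe whether a class is visible on the current
// classpath by its original name. After shading relocates a library's
// classes to a new package, probes for the original package fail, which
// mirrors what scalac reports at compile time for signatures that
// reference the relocated types.
public class ClasspathProbe {
    static boolean classVisible(String name) {
        try {
            Class.forName(name);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Always present in the JDK:
        System.out.println(classVisible("java.lang.String"));
        // Present only if unshaded Guava is on the classpath:
        System.out.println(classVisible("com.google.common.base.Preconditions"));
    }
}
```

If the second probe returns false while module code (or a dependency's class signatures, like Utils.class here) still references com.google.common types, compilation of dependent modules fails unless the module declares guava explicitly at compile scope.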
