
vcf2adam -> Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.$conforms()Lscala/Predef$$less$colon$less; #871

Closed
jerryivanhoe opened this issue Oct 27, 2015 · 18 comments

Comments

@jerryivanhoe

Hi,
I installed version 0.17.1 on top of AWS EMR with Spark 1.3.1.
Then I loaded chromosome 1 from the 1000 Genomes project as VCF and copied it into HDFS:
[hadoop@ip-172-31-21-10 jerry]$ adam-submit vcf2adam hdfs://chr1/chr1 hdfs://chr1/chr1.adam
Using SPARK_SUBMIT=/home/hadoop/spark/bin/spark-submit
15/10/27 10:05:51 INFO cli.ADAMMain: ADAM invoked with args: "vcf2adam" "hdfs://chr1/chr1" "hdfs://chr1/chr1.adam"
Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.$conforms()Lscala/Predef$$less$colon$less;
at org.bdgenomics.adam.cli.ADAMMain$.main(ADAMMain.scala:105)

Any ideas would be highly appreciated.
Jerry

@heuermh
Member

heuermh commented Oct 27, 2015

I believe this happens when Spark and/or ADAM have been built for Scala 2.11 and are then run with Scala 2.10. If you could confirm that the Scala versions of Spark and ADAM match, that should help.
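One quick way to check the Spark side is to look at the scala-library jar bundled with the Spark distribution, since its file name encodes the Scala version. This is only a sketch: the lib/ layout shown is typical of Spark 1.x installs, and the default path is illustrative.

```shell
# Sketch: find the Scala version a Spark 1.x build bundles.
# SPARK_HOME default and the lib/ layout are assumptions; adjust for your install.
ls "${SPARK_HOME:-/home/hadoop/spark}"/lib/*.jar 2>/dev/null \
  | grep -o 'scala-library-[0-9][0-9.]*[0-9]'
```

If this prints scala-library-2.10.x, the matching ADAM artifact is the _2.10 build; likewise for 2.11.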

@jerryivanhoe
Author

Thank you Michael.
I installed the two Scala 2.10 distributions:
drwxr-xr-x 6 hadoop hadoop 4096 Oct 28 05:36 adam-distribution_2.10-0.17.1
drwxr-xr-x 6 hadoop hadoop 4096 Oct 28 05:36 adam-distribution_2.10-0.18.1
This looks a bit better, because the error now appears later, e.g. with adam-distribution_2.10-0.17.1:
15/10/28 05:40:56 INFO cluster.YarnClientSchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.8
Exception in thread "main" java.lang.IllegalArgumentException: java.net.UnknownHostException: chr1
at org.apache.hadoop.security.SecurityUtil.buildTokenService(SecurityUtil.java:377)
I attached the full output.
[Attachments: convert_1 through convert_4 (screenshots of the full output)]

@laserson
Contributor

HDFS URIs typically look like this:

hdfs://namenode-host/path/to/whatever

So Hadoop/Spark thinks that the host running the namenode server is called chr1. You probably meant something like

hdfs:///chr1/chr1

(note the difference with the slashes). This will use the default namenode that is set with any associated Hadoop client config.
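The netloc/path split is easy to see with a quick sketch. Python's urlparse is used here purely for illustration (Hadoop does its own URI parsing, but it splits authority and path the same way):

```python
from urllib.parse import urlparse

# Two slashes: "chr1" is parsed as the authority (i.e. the namenode host).
broken = urlparse("hdfs://chr1/chr1")
print(broken.netloc, broken.path)        # chr1 /chr1

# Three slashes: empty authority, so Hadoop falls back to the default
# namenode from the client config, and /chr1/chr1 is the full path.
fixed = urlparse("hdfs:///chr1/chr1")
print(repr(fixed.netloc), fixed.path)    # '' /chr1/chr1
```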

@jerryivanhoe
Author

Wow, Uri! Thanks so much! It's running now.

It's only using 2 out of 4 cluster nodes, but that's another issue, which I will try to fix next!

Best!
-Jerry

@heuermh
Member

heuermh commented Oct 28, 2015

Good to hear!

There is a minor issue with the 0.18.1 distribution, reported here (#872). If you run into that, just move or remove the -sources.jar as shown in the issue description.

@laserson
Contributor

Great, closing this. Definitely report it if you have additional issues.

@jerryivanhoe
Author

Perfect job! Thanks a lot, Michael and Uri!

@ankushreddy

Hi @laserson,
I am still facing the same error when trying to run ADAM 0.18.1:

./adam-submit transform /shared/avocado_test/NA06984.454.ssaha.SRP000033.2009_10.bam /shared/avocado_out/NA06984.454.ssaha.SRP000033.2009_10.bam_tags.adam -add_md_tags /shared/avocado_test/human_b36_male.fa
Using ADAM_MAIN=org.bdgenomics.adam.cli.ADAMMain
Using SPARK_SUBMIT=/usr/hdp/2.2.4.2-2/spark-1.4.1/bin/spark-submit
16/01/28 14:33:15 INFO cli.ADAMMain: ADAM invoked with args: "transform" "/shared/avocado_test/NA06984.454.ssaha.SRP000033.2009_10.bam" "/shared/avocado_out/NA06984.454.ssaha.SRP000033.2009_10.bam_tags.adam" "-add_md_tags" "/shared/avocado_test/human_b36_male.fa"
Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.$conforms()Lscala/Predef$$less$colon$less;
at org.bdgenomics.adam.cli.ADAMMain.apply(ADAMMain.scala:120)
at org.bdgenomics.adam.cli.ADAMMain$.main(ADAMMain.scala:77)
at org.bdgenomics.adam.cli.ADAMMain.main(ADAMMain.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:665)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:170)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:193)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:112)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
16/01/28 14:33:15 INFO util.Utils: Shutdown hook called

Versions used: Spark 1.4.1, Hadoop 2.6.0.

Is there any workaround for this issue?
I used this Maven command to build:
mvn clean package -Dscala-2.10.5 -DskipTests

and also tried:
mvn clean package -DskipTests

but even then I am facing the same issue.

@fnothaft
Member

Hi @ankushreddy

We do not have a -Dscala-2.10.5 switch for our build. Are you running on a Spark build that is built for Scala 2.10 or 2.11?

@ankushreddy

Hi @fnothaft,

We are running Spark on Scala 2.10.

@fnothaft
Member

When you do git status in the ADAM repository, what do you get?

@fnothaft fnothaft reopened this Jan 28, 2016
@ankushreddy

I actually downloaded the ADAM 0.18 source code from
https://github.com/bigdatagenomics/adam/releases
and compiled it on my Hadoop cluster. I also cloned the repository to my desktop, where I get this status:

C:\Users\asugured\Documents\GitHub\adam [master]> git status
On branch master
Your branch is up-to-date with 'origin/master'.
nothing to commit, working directory clean

@fnothaft
Member

Which release of ADAM 0.18 did you download? We package both Scala 2.10 and 2.11 artifacts with each release. The error you're getting (missing method from inside of a Scala standard library) indicates that you have a Scala version mismatch between your Spark and ADAM distros. This can happen if you downloaded the ADAM 0.18 Scala 2.11 artifacts, but are running with Spark built for Scala 2.10. Can you run git log | head -n 1 | awk '{ print $2 }'? This will give us the commit hash for the latest commit, which will confirm which set of ADAM artifacts you've pulled down. You can also run grep scala.version pom.xml, which will return either 2.10 or 2.11.
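As a sketch of that pom.xml check (assuming the Scala version is declared as a <scala.version> property in pom.xml, as in the ADAM build of that era), run this from the checkout root:

```shell
# Sketch: extract the Scala version an ADAM checkout is configured for.
# Assumes pom.xml declares it as a <scala.version> property.
grep -m 1 '<scala.version>' pom.xml \
  | sed -e 's/.*<scala.version>//' -e 's#</scala.version>.*##'
```

A 2.10.x result means the checkout matches a Spark build for Scala 2.10; a 2.11.x result means it targets Scala 2.11.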

@faissl-asu

I performed the “grep scala.version pom.xml” in Ankush’s ADAM directory and got:

 Scala Version: 2.10

It sounds like we need to update the pom.xml file?


@ankushreddy

Thanks for the reply @fnothaft and @faissl-asu. We actually have Spark built on Scala 2.10, so we need to download adam-0.18_2.10 for Scala 2.10 (and adam-0.18_2.11 for Scala 2.11).

Correct me if I am wrong, @fnothaft.

Thanks for directing me in the correct way :)

Thanks & Regards,
Ankush Reddy.

@fnothaft
Member

> Thanks for the reply @fnothaft and @faissl-asu. We actually have Spark built on Scala 2.10, so we need to download adam-0.18_2.10 for Scala 2.10 (and adam-0.18_2.11 for Scala 2.11).

This is correct.

> I performed the “grep scala.version pom.xml” in Ankush’s ADAM directory and got:
>
> Scala Version: 2.10
>
> It sounds like we need to update the pom.xml file?

No, this looks like the correct version if you're running Spark built for Scala 2.10.

> Thanks for directing me in the correct way :)

Definitely! Hopefully we can resolve this soon.

@fnothaft
Member

fnothaft commented Jul 6, 2016

Closing as resolved.

@fnothaft fnothaft closed this as completed Jul 6, 2016