ADAM to BAM conversion fails using relative path #1012

Closed
jpdna opened this issue Apr 24, 2016 · 3 comments

@jpdna
Member

jpdna commented Apr 24, 2016

On the local filesystem (I didn't try HDFS), ADAM to BAM conversion fails when a relative path is given for the output BAM file. Running:

adam-submit transform /mypath/mydataset.adam bam_limit_proj_marked/limit_proj_marked_v2.bam -single -sort_reads

fails with:

java.io.FileNotFoundException: File bam_limit_proj_marked/limit_proj_marked_v2.bam_head does not exist

The same error also occurs if only the name of the intended output BAM file (with no path) is provided.

However, the conversion succeeds when the absolute path to the output BAM file is specified, e.g.:

adam-submit -- transform /mypath/mydataset.adam /mypath/bam_limit_proj_marked_v2/limit_proj_marked.bam -single -sort_reads
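
For what it's worth, here is a minimal sketch (not ADAM code; the object name RelativePathResolution is made up) of the assumption behind the report: that Hadoop's local filesystem qualifies a relative Path against its working directory, so the relative output path only resolves correctly if the resolving process is sitting in the directory the command was launched from. The paths reuse the ones from the commands above:

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}

object RelativePathResolution {
  def main(args: Array[String]): Unit = {
    // Local Hadoop filesystem, as in the report (no HDFS involved).
    val fs = FileSystem.getLocal(new Configuration())

    // Relative path from the failing command vs. absolute path from the working one.
    val relative = new Path("bam_limit_proj_marked/limit_proj_marked_v2.bam_head")
    val absolute = new Path("/mypath/bam_limit_proj_marked_v2/limit_proj_marked.bam_head")

    // makeQualified resolves a relative Path against fs.getWorkingDirectory,
    // so the relative form points at different locations depending on where
    // the resolving process happens to be running.
    println(fs.makeQualified(relative))
    println(fs.makeQualified(absolute))
    println(fs.exists(relative) + " / " + fs.exists(absolute))
  }
}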

@fnothaft
Member

That's odd! What line is throwing this error? Can you post the full stack trace?

@fnothaft fnothaft added the bug label Apr 24, 2016
@jpdna
Member Author

jpdna commented Apr 24, 2016

Here you go:

Using ADAM_MAIN=org.bdgenomics.adam.cli.ADAMMain
Using SPARK_SUBMIT=/home/jp/Apps/Spark/1.6_on_scala_2.11/spark-1.6.1/bin/spark-submit
2016-04-24 08:20:17 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2016-04-24 08:21:26 WARN  TaskSetManager:70 - Lost task 11.0 in stage 2.0 (TID 493, localhost.localdomain): java.io.FileNotFoundException: File try2.bam_head does not exist
    at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:511)
    at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:724)
    at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:501)
    at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:397)
    at org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSInputChecker.<init>(ChecksumFileSystem.java:137)
    at org.apache.hadoop.fs.ChecksumFileSystem.open(ChecksumFileSystem.java:339)
    at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:764)
    at org.seqdoop.hadoop_bam.util.SAMHeaderReader.readSAMHeaderFrom(SAMHeaderReader.java:46)
    at org.seqdoop.hadoop_bam.KeyIgnoringBAMOutputFormat.readSAMHeaderFrom(KeyIgnoringBAMOutputFormat.java:66)
    at org.bdgenomics.adam.rdd.read.ADAMBAMOutputFormatHeaderLess.getRecordWriter(ADAMBAMOutputFormat.scala:94)
    at org.apache.spark.rdd.InstrumentedOutputFormat.getRecordWriter(InstrumentedOutputFormat.scala:35)
    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1$$anonfun$12.apply(PairRDDFunctions.scala:1107)
    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1$$anonfun$12.apply(PairRDDFunctions.scala:1091)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
    at org.apache.spark.scheduler.Task.run(Task.scala:89)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
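
In case it helps narrow this down, a small diagnostic sketch (not ADAM code; HeadFileProbe and the probe logic are made up). It assumes the _head file is missed because the executor task resolves the relative path against a different working directory than the driver process that wrote it; try2.bam_head is the file name from the trace above:

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}
import org.apache.spark.{SparkConf, SparkContext}

object HeadFileProbe {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("head-file-probe"))

    // Driver-side view of the relative _head path.
    val driverFs = FileSystem.getLocal(new Configuration())
    println("driver:   " + driverFs.makeQualified(new Path("try2.bam_head")))

    // Executor-side view of the same relative path, evaluated inside a task.
    val executorView = sc.parallelize(Seq(1)).map { _ =>
      val fs = FileSystem.getLocal(new Configuration())
      val p = fs.makeQualified(new Path("try2.bam_head"))
      (p.toString, fs.exists(p))
    }.collect()

    executorView.foreach { case (p, exists) =>
      println(s"executor: $p exists=$exists")
    }

    sc.stop()
  }
}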

@fnothaft fnothaft added this to the 0.21.0 milestone Jul 20, 2016
@heuermh heuermh modified the milestones: 0.21.0, 0.22.0 Oct 13, 2016
@fnothaft
Member

fnothaft commented Mar 3, 2017

Closing as we haven't been able to reproduce.

@fnothaft fnothaft closed this as completed Mar 3, 2017