
How to use SparkBWA from the Spark shell


The SparkBWA API in the Spark shell

The Spark shell is available in Scala and in Python. Since SparkBWA is written in Java, only the Scala shell can be used with it.

To use SparkBWA from the Spark shell, first change into the SparkBWA build directory:

cd target

After that, start the Spark shell, indicating that the SparkBWA jar should be added to the classpath:

spark-shell --jars SparkBWA-0.2.jar

In the previous step you can also pass the usual Spark options, such as the number of executors, the memory per executor, and so on, as shown in the example below.
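As an illustration, a launch command with resource options might look like the following. The flags are standard spark-shell options; the master setting and the specific executor counts, memory, and cores are placeholder values that depend on your cluster, not values prescribed by SparkBWA.

spark-shell --jars SparkBWA-0.2.jar \
  --master yarn \
  --num-executors 32 \
  --executor-memory 6g \
  --executor-cores 1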
