
15/09/27 09:45:26 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable #837

Closed
dbl001 opened this issue Sep 27, 2015 · 22 comments

dbl001 commented Sep 27, 2015

I'm getting a warning when running 'bin/adam-shell' on OS X 10.10.5:

15/09/27 09:45:26 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

I had this problem with the binary distributions of Hadoop (e.g., 2.3.0 and 2.6.1) until I built Hadoop from source with the 'native' option enabled and set this environment variable:

HADOOP_COMMON_LIB_NATIVE_DIR=/Users/davidlaxer/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT/lib/native
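For reference, the build-from-source workaround boiled down to roughly the following (the Maven profile is the one documented in Hadoop's BUILDING.txt; the path is specific to my checkout):

```shell
# Build Hadoop with the native libraries enabled (needs cmake, protobuf, etc.).
# -Pdist,native is the profile documented in Hadoop's BUILDING.txt.
mvn package -Pdist,native -DskipTests -Dtar

# Point Hadoop at the freshly built native libs:
export HADOOP_COMMON_LIB_NATIVE_DIR=/Users/davidlaxer/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT/lib/native
```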

Is this a Spark warning? Do I have to set a Spark environment variable?
Or is there an ADAM option I must set?
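Things I'm considering trying, in case the fix is on the Spark side (unverified sketch: spark.driver.extraLibraryPath / spark.executor.extraLibraryPath are documented Spark configuration options, but I haven't confirmed that adam-shell forwards extra flags through to spark-shell):

```shell
# Hypothetical: point the Spark driver and executors at the native Hadoop libs.
SPARK_NATIVE=/Users/davidlaxer/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT/lib/native

bin/adam-shell \
  --driver-library-path "$SPARK_NATIVE" \
  --conf spark.executor.extraLibraryPath="$SPARK_NATIVE"
```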

$ bin/adam-shell
Using SPARK_SHELL=/Users/davidlaxer/spark/bin/spark-shell
15/09/27 09:45:26 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/09/27 09:45:27 INFO spark.SecurityManager: Changing view acls to: davidlaxer
15/09/27 09:45:27 INFO spark.SecurityManager: Changing modify acls to: davidlaxer
15/09/27 09:45:27 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(davidlaxer); users with modify permissions: Set(davidlaxer)
15/09/27 09:45:28 INFO spark.HttpServer: Starting HTTP Server
15/09/27 09:45:29 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/09/27 09:45:29 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:60826
15/09/27 09:45:29 INFO util.Utils: Successfully started service 'HTTP class server' on port 60826.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.5.0-SNAPSHOT
      /_/

Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_05)
Type in expressions to have them evaluated.
Type :help for more information.
15/09/27 09:45:50 INFO spark.SparkContext: Running Spark version 1.5.0-SNAPSHOT
15/09/27 09:45:50 INFO spark.SecurityManager: Changing view acls to: davidlaxer
15/09/27 09:45:50 INFO spark.SecurityManager: Changing modify acls to: davidlaxer
15/09/27 09:45:50 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(davidlaxer); users with modify permissions: Set(davidlaxer)
15/09/27 09:45:52 INFO slf4j.Slf4jLogger: Slf4jLogger started
15/09/27 09:45:53 INFO Remoting: Starting remoting
15/09/27 09:45:54 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@10.0.1.5:60837]
15/09/27 09:45:54 INFO util.Utils: Successfully started service 'sparkDriver' on port 60837.
15/09/27 09:45:54 INFO spark.SparkEnv: Registering MapOutputTracker
15/09/27 09:45:54 INFO spark.SparkEnv: Registering BlockManagerMaster
15/09/27 09:45:54 INFO storage.DiskBlockManager: Created local directory at /private/var/folders/nj/nphdkhyj6s1dttb0pd9zb2wc0000gn/T/blockmgr-57acd6fd-754f-4d86-9226-bbcead030b70
15/09/27 09:45:54 INFO storage.MemoryStore: MemoryStore started with capacity 530.0 MB
15/09/27 09:45:54 INFO spark.HttpFileServer: HTTP File server directory is /private/var/folders/nj/nphdkhyj6s1dttb0pd9zb2wc0000gn/T/spark-c735407c-acd9-4be7-ac3a-2de4e82af5e4/httpd-72e12a03-5d15-41cc-a524-c74725d6ec0f
15/09/27 09:45:54 INFO spark.HttpServer: Starting HTTP Server
15/09/27 09:45:54 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/09/27 09:45:54 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:60838
15/09/27 09:45:54 INFO util.Utils: Successfully started service 'HTTP file server' on port 60838.
15/09/27 09:45:54 INFO spark.SparkEnv: Registering OutputCommitCoordinator
15/09/27 09:45:55 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/09/27 09:45:55 WARN component.AbstractLifeCycle: FAILED SelectChannelConnector@0.0.0.0:4040: java.net.BindException: Address already in use
java.net.BindException: Address already in use
    at sun.nio.ch.Net.bind0(Native Method)
    at sun.nio.ch.Net.bind(Net.java:414)
    at sun.nio.ch.Net.bind(Net.java:406)
    at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
    at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
    at org.eclipse.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
    at org.eclipse.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
    at org.eclipse.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
    at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
    at org.eclipse.jetty.server.Server.doStart(Server.java:293)
    at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
    at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:240)
    at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:250)
    at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:250)
    at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1912)
    at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
    at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1903)
    at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:250)
    at org.apache.spark.ui.WebUI.bind(WebUI.scala:136)
    at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:465)
    at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:465)
    at scala.Option.foreach(Option.scala:236)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:465)
    at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
    at $line3.$read$$iwC$$iwC.<init>(<console>:9)
    at $line3.$read$$iwC.<init>(<console>:18)
    at $line3.$read.<init>(<console>:20)
    at $line3.$read$.<init>(<console>:24)
    at $line3.$read$.<clinit>(<console>)
    at $line3.$eval$.<init>(<console>:7)
    at $line3.$eval$.<clinit>(<console>)
    at $line3.$eval.$print(<console>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
    at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
    at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
    at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
    at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
    at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
    at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
    at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
    at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
    at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
    at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
    at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
    at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
    at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
    at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
    at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
    at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
    at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
    at org.apache.spark.repl.Main$.main(Main.scala:31)
    at org.apache.spark.repl.Main.main(Main.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/09/27 09:45:55 WARN component.AbstractLifeCycle: FAILED org.eclipse.jetty.server.Server@22686ddb: java.net.BindException: Address already in use
java.net.BindException: Address already in use
    at sun.nio.ch.Net.bind0(Native Method)
    at sun.nio.ch.Net.bind(Net.java:414)
    at sun.nio.ch.Net.bind(Net.java:406)
    at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
    at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
    at org.eclipse.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
    at org.eclipse.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
    at org.eclipse.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
    at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
    at org.eclipse.jetty.server.Server.doStart(Server.java:293)
    at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
    at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:240)
    at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:250)
    at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:250)
    at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1912)
    at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
    at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1903)
    at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:250)
    at org.apache.spark.ui.WebUI.bind(WebUI.scala:136)
    at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:465)
    at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:465)
    at scala.Option.foreach(Option.scala:236)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:465)
    at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
    at $line3.$read$$iwC$$iwC.<init>(<console>:9)
    at $line3.$read$$iwC.<init>(<console>:18)
    at $line3.$read.<init>(<console>:20)
    at $line3.$read$.<init>(<console>:24)
    at $line3.$read$.<clinit>(<console>)
    at $line3.$eval$.<init>(<console>:7)
    at $line3.$eval$.<clinit>(<console>)
    at $line3.$eval.$print(<console>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
    at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
    at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
    at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
    at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
    at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
    at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
    at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
    at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
    at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
    at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
    at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
    at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
    at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
    at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
    at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
    at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
    at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
    at org.apache.spark.repl.Main$.main(Main.scala:31)
    at org.apache.spark.repl.Main.main(Main.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage/kill,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/api,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/static,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/threadDump/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/threadDump,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/environment/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/environment,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/rdd/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/rdd,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/pool/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/pool,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs/job/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs/job,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs,null}
15/09/27 09:45:56 WARN util.Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
15/09/27 09:45:56 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/09/27 09:45:56 INFO server.AbstractConnector: Started SelectChannelConnector@0.0.0.0:4041
15/09/27 09:45:56 INFO util.Utils: Successfully started service 'SparkUI' on port 4041.
15/09/27 09:45:56 INFO ui.SparkUI: Started SparkUI at http://10.0.1.5:4041
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/commons-cli/commons-cli/1.2/commons-cli-1.2.jar at http://10.0.1.5:60838/jars/commons-cli-1.2.jar with timestamp 1443372356368
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/commons-httpclient/commons-httpclient/3.1/commons-httpclient-3.1.jar at http://10.0.1.5:60838/jars/commons-httpclient-3.1.jar with timestamp 1443372356374
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/commons-codec/commons-codec/1.4/commons-codec-1.4.jar at http://10.0.1.5:60838/jars/commons-codec-1.4.jar with timestamp 1443372356376
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/commons-logging/commons-logging/1.1.1/commons-logging-1.1.1.jar at http://10.0.1.5:60838/jars/commons-logging-1.1.1.jar with timestamp 1443372356381
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar at http://10.0.1.5:60838/jars/commons-compress-1.4.1.jar with timestamp 1443372356403
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/slf4j/slf4j-api/1.7.10/slf4j-api-1.7.10.jar at http://10.0.1.5:60838/jars/slf4j-api-1.7.10.jar with timestamp 1443372356436
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/log4j/log4j/1.2.17/log4j-1.2.17.jar at http://10.0.1.5:60838/jars/log4j-1.2.17.jar with timestamp 1443372356443
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/xerial/snappy/snappy-java/1.1.1.7/snappy-java-1.1.1.7.jar at http://10.0.1.5:60838/jars/snappy-java-1.1.1.7.jar with timestamp 1443372356501
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/thoughtworks/paranamer/paranamer/2.6/paranamer-2.6.jar at http://10.0.1.5:60838/jars/paranamer-2.6.jar with timestamp 1443372356512
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/utils/utils-io_2.10/0.2.3/utils-io_2.10-0.2.3.jar at http://10.0.1.5:60838/jars/utils-io_2.10-0.2.3.jar with timestamp 1443372356519
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/utils/utils-misc_2.10/0.2.3/utils-misc_2.10-0.2.3.jar at http://10.0.1.5:60838/jars/utils-misc_2.10-0.2.3.jar with timestamp 1443372356521
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/httpcomponents/httpclient/4.3.2/httpclient-4.3.2.jar at http://10.0.1.5:60838/jars/httpclient-4.3.2.jar with timestamp 1443372356574
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/httpcomponents/httpcore/4.3.1/httpcore-4.3.1.jar at http://10.0.1.5:60838/jars/httpcore-4.3.1.jar with timestamp 1443372356601
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/utils/utils-cli_2.10/0.2.3/utils-cli_2.10-0.2.3.jar at http://10.0.1.5:60838/jars/utils-cli_2.10-0.2.3.jar with timestamp 1443372356623
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-avro/1.8.1/parquet-avro-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-avro-1.8.1.jar with timestamp 1443372356655
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-column/1.8.1/parquet-column-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-column-1.8.1.jar with timestamp 1443372356687
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-common/1.8.1/parquet-common-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-common-1.8.1.jar with timestamp 1443372356695
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-encoding/1.8.1/parquet-encoding-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-encoding-1.8.1.jar with timestamp 1443372356702
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-hadoop/1.8.1/parquet-hadoop-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-hadoop-1.8.1.jar with timestamp 1443372356708
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-jackson/1.8.1/parquet-jackson-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-jackson-1.8.1.jar with timestamp 1443372356722
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-format/2.3.0-incubating/parquet-format-2.3.0-incubating.jar at http://10.0.1.5:60838/jars/parquet-format-2.3.0-incubating.jar with timestamp 1443372356747
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/utils/utils-metrics_2.10/0.2.3/utils-metrics_2.10-0.2.3.jar at http://10.0.1.5:60838/jars/utils-metrics_2.10-0.2.3.jar with timestamp 1443372356755
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/netflix/servo/servo-core/0.5.5/servo-core-0.5.5.jar at http://10.0.1.5:60838/jars/servo-core-0.5.5.jar with timestamp 1443372356802
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/google/code/findbugs/annotations/2.0.0/annotations-2.0.0.jar at http://10.0.1.5:60838/jars/annotations-2.0.0.jar with timestamp 1443372356808
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/scoverage/scalac-scoverage-plugin_2.10/0.99.2/scalac-scoverage-plugin_2.10-0.99.2.jar at http://10.0.1.5:60838/jars/scalac-scoverage-plugin_2.10-0.99.2.jar with timestamp 1443372356831
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/commons-io/commons-io/1.3.2/commons-io-1.3.2.jar at http://10.0.1.5:60838/jars/commons-io-1.3.2.jar with timestamp 1443372356867
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/bdg-formats/bdg-formats/0.4.0/bdg-formats-0.4.0.jar at http://10.0.1.5:60838/jars/bdg-formats-0.4.0.jar with timestamp 1443372356869
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/avro/avro/1.7.6/avro-1.7.6.jar at http://10.0.1.5:60838/jars/avro-1.7.6.jar with timestamp 1443372356882
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/codehaus/jackson/jackson-core-asl/1.9.13/jackson-core-asl-1.9.13.jar at http://10.0.1.5:60838/jars/jackson-core-asl-1.9.13.jar with timestamp 1443372356886
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/codehaus/jackson/jackson-mapper-asl/1.9.13/jackson-mapper-asl-1.9.13.jar at http://10.0.1.5:60838/jars/jackson-mapper-asl-1.9.13.jar with timestamp 1443372356915
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/adam/adam-core_2.10/0.17.2-SNAPSHOT/adam-core_2.10-0.17.2-SNAPSHOT.jar at http://10.0.1.5:60838/jars/adam-core_2.10-0.17.2-SNAPSHOT.jar with timestamp 1443372356953
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/esotericsoftware/kryo/kryo/2.21/kryo-2.21.jar at http://10.0.1.5:60838/jars/kryo-2.21.jar with timestamp 1443372356995
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/esotericsoftware/reflectasm/reflectasm/1.07/reflectasm-1.07-shaded.jar at http://10.0.1.5:60838/jars/reflectasm-1.07-shaded.jar with timestamp 1443372357147
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/ow2/asm/asm/4.0/asm-4.0.jar at http://10.0.1.5:60838/jars/asm-4.0.jar with timestamp 1443372357152
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/esotericsoftware/minlog/minlog/1.2/minlog-1.2.jar at http://10.0.1.5:60838/jars/minlog-1.2.jar with timestamp 1443372357153
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/objenesis/objenesis/1.2/objenesis-1.2.jar at http://10.0.1.5:60838/jars/objenesis-1.2.jar with timestamp 1443372357155
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/it/unimi/dsi/fastutil/6.4.4/fastutil-6.4.4.jar at http://10.0.1.5:60838/jars/fastutil-6.4.4.jar with timestamp 1443372357424
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-scala_2.10/1.8.1/parquet-scala_2.10-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-scala_2.10-1.8.1.jar with timestamp 1443372357485
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/seqdoop/hadoop-bam/7.0.0/hadoop-bam-7.0.0.jar at http://10.0.1.5:60838/jars/hadoop-bam-7.0.0.jar with timestamp 1443372357534
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/github/samtools/htsjdk/1.133/htsjdk-1.133.jar at http://10.0.1.5:60838/jars/htsjdk-1.133.jar with timestamp 1443372357561
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/commons/commons-jexl/2.1.1/commons-jexl-2.1.1.jar at http://10.0.1.5:60838/jars/commons-jexl-2.1.1.jar with timestamp 1443372357576
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/tukaani/xz/1.5/xz-1.5.jar at http://10.0.1.5:60838/jars/xz-1.5.jar with timestamp 1443372357581
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/ant/ant/1.8.2/ant-1.8.2.jar at http://10.0.1.5:60838/jars/ant-1.8.2.jar with timestamp 1443372357612
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/ant/ant-launcher/1.8.2/ant-launcher-1.8.2.jar at http://10.0.1.5:60838/jars/ant-launcher-1.8.2.jar with timestamp 1443372357622
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/testng/testng/6.8.8/testng-6.8.8.jar at http://10.0.1.5:60838/jars/testng-6.8.8.jar with timestamp 1443372357639
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/beanshell/bsh/2.0b4/bsh-2.0b4.jar at http://10.0.1.5:60838/jars/bsh-2.0b4.jar with timestamp 1443372357685
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/beust/jcommander/1.27/jcommander-1.27.jar at http://10.0.1.5:60838/jars/jcommander-1.27.jar with timestamp 1443372357692
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/google/guava/guava/14.0.1/guava-14.0.1.jar at http://10.0.1.5:60838/jars/guava-14.0.1.jar with timestamp 1443372357744
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/adam/adam-apis_2.10/0.17.2-SNAPSHOT/adam-apis_2.10-0.17.2-SNAPSHOT.jar at http://10.0.1.5:60838/jars/adam-apis_2.10-0.17.2-SNAPSHOT.jar with timestamp 1443372357758
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/scala-lang/scala-library/2.10.4/scala-library-2.10.4.jar at http://10.0.1.5:60838/jars/scala-library-2.10.4.jar with timestamp 1443372357947
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/slf4j/slf4j-log4j12/1.7.5/slf4j-log4j12-1.7.5.jar at http://10.0.1.5:60838/jars/slf4j-log4j12-1.7.5.jar with timestamp 1443372357980
15/09/27 09:45:58 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/args4j/args4j/2.0.23/args4j-2.0.23.jar at http://10.0.1.5:60838/jars/args4j-2.0.23.jar with timestamp 1443372358026
15/09/27 09:45:58 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/adam/adam-cli_2.10/0.17.2-SNAPSHOT/adam-cli_2.10-0.17.2-SNAPSHOT.jar at http://10.0.1.5:60838/jars/adam-cli_2.10-0.17.2-SNAPSHOT.jar with timestamp 1443372358030
15/09/27 09:45:58 WARN metrics.MetricsSystem: Using default name DAGScheduler for source because spark.app.id is not set.
15/09/27 09:45:58 INFO executor.Executor: Starting executor ID driver on host localhost
15/09/27 09:45:58 INFO executor.Executor: Using REPL class URI: http://10.0.1.5:60826
15/09/27 09:45:59 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 60842.
15/09/27 09:45:59 INFO netty.NettyBlockTransferService: Server created on 60842
15/09/27 09:45:59 INFO storage.BlockManagerMaster: Trying to register BlockManager
15/09/27 09:45:59 INFO storage.BlockManagerMasterEndpoint: Registering block manager localhost:60842 with 530.0 MB RAM, BlockManagerId(driver, localhost, 60842)
15/09/27 09:45:59 INFO storage.BlockManagerMaster: Registered BlockManager
15/09/27 09:46:00 INFO repl.SparkILoop: Created spark context..
Spark context available as sc.
15/09/27 09:46:04 INFO repl.SparkILoop: Created sql context..
SQL context available as sqlContext.

scala> Stopping spark context.
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/static/sql,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/SQL/execution/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/SQL/execution,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/SQL/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/SQL,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/metrics/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage/kill,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/api,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/static,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/threadDump/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/threadDump,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/environment/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/environment,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/rdd/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/rdd,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/pool/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/pool,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs/job/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs/job,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs,null}
15/09/27 09:46:54 INFO ui.SparkUI: Stopped Spark web UI at http://10.0.1.5:4041
15/09/27 09:46:54 INFO scheduler.DAGScheduler: Stopping DAGScheduler
15/09/27 09:46:54 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
15/09/27 09:46:54 INFO storage.MemoryStore: MemoryStore cleared
15/09/27 09:46:54 INFO storage.BlockManager: BlockManager stopped
15/09/27 09:46:54 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
15/09/27 09:46:54 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
15/09/27 09:46:54 INFO spark.SparkContext: Successfully stopped SparkContext
15/09/27 09:46:54 INFO util.ShutdownHookManager: Shutdown hook called
15/09/27 09:46:54 INFO remote.RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
15/09/27 09:46:54 INFO util.ShutdownHookManager: Deleting directory /private/var/folders/nj/nphdkhyj6s1dttb0pd9zb2wc0000gn/T/spark-94dfb979-530a-4b9d-8109-2dec5a277611
15/09/27 09:46:54 INFO remote.RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
15/09/27 09:46:54 INFO util.ShutdownHookManager: Deleting directory /private/var/folders/nj/nphdkhyj6s1dttb0pd9zb2wc0000gn/T/spark-c735407c-acd9-4be7-ac3a-2de4e82af5e4
@ryan-williams
Member

FWIW, I think I see that warning every time I run adam-shell; I don't
think it hurts anything, but any leads on making it go away are welcome.
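
One such lead, sketched here under two assumptions: that the native libraries under the `HADOOP_COMMON_LIB_NATIVE_DIR` path reported above were actually built for OS X, and that your Spark's `spark-shell` honors `SPARK_SUBMIT_OPTS` (it does in the 1.x scripts). The warning comes from Hadoop's `NativeCodeLoader`, which searches the JVM's `java.library.path` for `libhadoop`, so pointing the driver JVM at the native directory before launching should silence it:

```shell
# Path taken from the report above -- substitute your own Hadoop build location.
HADOOP_NATIVE=/Users/davidlaxer/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT/lib/native

# NativeCodeLoader resolves libhadoop via java.library.path; add the native
# directory to the driver JVM options that spark-shell picks up.
export SPARK_SUBMIT_OPTS="-Djava.library.path=$HADOOP_NATIVE"

# then launch as before:
#   bin/adam-shell
```

Equivalent leads: pass `--driver-java-options "-Djava.library.path=..."` to `spark-submit`, or export the directory via `LD_LIBRARY_PATH` (Linux) / `DYLD_LIBRARY_PATH` (OS X) in `conf/spark-env.sh`.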

On Sun, Sep 27, 2015 at 12:50 PM dbl notifications@github.com wrote:

I'm getting a warning running 'bin/adam-shell', on OS X 10.10.5:

15/09/27 09:45:26 WARN util.NativeCodeLoader: Unable to load native-hadoop
library for your platform... using builtin-java classes where applicable

I had this problem with the binary distributions of Hadoop (e.g. - 2.3.0,
2.6.1) until I built hadoop from source with the 'native' option enabled
and set this environment variable:

HADOOP_COMMON_LIB_NATIVE_DIR=/Users/davidlaxer/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT/lib/native

Is this a Spark Warning? Do I have to set a SPARK environment variable?
Is there an ADAM option I must set?

$ bin/adam-shell
Using SPARK_SHELL=/Users/davidlaxer/spark/bin/spark-shell
15/09/27 09:45:26 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/09/27 09:45:27 INFO spark.SecurityManager: Changing view acls to: davidlaxer
15/09/27 09:45:27 INFO spark.SecurityManager: Changing modify acls to: davidlaxer
15/09/27 09:45:27 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(davidlaxer); users with modify permissions: Set(davidlaxer)
15/09/27 09:45:28 INFO spark.HttpServer: Starting HTTP Server
15/09/27 09:45:29 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/09/27 09:45:29 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:60826
15/09/27 09:45:29 INFO util.Utils: Successfully started service 'HTTP class server' on port 60826.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.5.0-SNAPSHOT
      /_/

Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_05)
Type in expressions to have them evaluated.
Type :help for more information.
15/09/27 09:45:50 INFO spark.SparkContext: Running Spark version 1.5.0-SNAPSHOT
15/09/27 09:45:50 INFO spark.SecurityManager: Changing view acls to: davidlaxer
15/09/27 09:45:50 INFO spark.SecurityManager: Changing modify acls to: davidlaxer
15/09/27 09:45:50 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(davidlaxer); users with modify permissions: Set(davidlaxer)
15/09/27 09:45:52 INFO slf4j.Slf4jLogger: Slf4jLogger started
15/09/27 09:45:53 INFO Remoting: Starting remoting
15/09/27 09:45:54 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@10.0.1.5:60837]
15/09/27 09:45:54 INFO util.Utils: Successfully started service 'sparkDriver' on port 60837.
15/09/27 09:45:54 INFO spark.SparkEnv: Registering MapOutputTracker
15/09/27 09:45:54 INFO spark.SparkEnv: Registering BlockManagerMaster
15/09/27 09:45:54 INFO storage.DiskBlockManager: Created local directory at /private/var/folders/nj/nphdkhyj6s1dttb0pd9zb2wc0000gn/T/blockmgr-57acd6fd-754f-4d86-9226-bbcead030b70
15/09/27 09:45:54 INFO storage.MemoryStore: MemoryStore started with capacity 530.0 MB
15/09/27 09:45:54 INFO spark.HttpFileServer: HTTP File server directory is /private/var/folders/nj/nphdkhyj6s1dttb0pd9zb2wc0000gn/T/spark-c735407c-acd9-4be7-ac3a-2de4e82af5e4/httpd-72e12a03-5d15-41cc-a524-c74725d6ec0f
15/09/27 09:45:54 INFO spark.HttpServer: Starting HTTP Server
15/09/27 09:45:54 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/09/27 09:45:54 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:60838
15/09/27 09:45:54 INFO util.Utils: Successfully started service 'HTTP file server' on port 60838.
15/09/27 09:45:54 INFO spark.SparkEnv: Registering OutputCommitCoordinator
15/09/27 09:45:55 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/09/27 09:45:55 WARN component.AbstractLifeCycle: FAILED SelectChannelConnector@0.0.0.0:4040: java.net.BindException: Address already in use
java.net.BindException: Address already in use
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:414)
at sun.nio.ch.Net.bind(Net.java:406)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at org.eclipse.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
at org.eclipse.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
at org.eclipse.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.eclipse.jetty.server.Server.doStart(Server.java:293)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:240)
at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:250)
at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:250)
at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1912)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1903)
at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:250)
at org.apache.spark.ui.WebUI.bind(WebUI.scala:136)
at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:465)
at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:465)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.SparkContext.(SparkContext.scala:465)
at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
at $line3.$read$$iwC$$iwC.(:9)
at $line3.$read$$iwC.(:18)
at $line3.$read.(:20)
at $line3.$read$.(:24)
at $line3.$read$.()
at $line3.$eval$.(:7)
at $line3.$eval$.()
at $line3.$eval.$print()
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/09/27 09:45:55 WARN component.AbstractLifeCycle: FAILED org.eclipse.jetty.server.Server@22686ddb: java.net.BindException: Address already in use
java.net.BindException: Address already in use
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:414)
at sun.nio.ch.Net.bind(Net.java:406)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at org.eclipse.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
at org.eclipse.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
at org.eclipse.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.eclipse.jetty.server.Server.doStart(Server.java:293)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:240)
at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:250)
at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:250)
at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1912)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1903)
at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:250)
at org.apache.spark.ui.WebUI.bind(WebUI.scala:136)
at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:465)
at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:465)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.SparkContext.(SparkContext.scala:465)
at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
at $line3.$read$$iwC$$iwC.(:9)
at $line3.$read$$iwC.(:18)
at $line3.$read.(:20)
at $line3.$read$.(:24)
at $line3.$read$.()
at $line3.$eval$.(:7)
at $line3.$eval$.()
at $line3.$eval.$print()
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage/kill,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/api,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/static,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/threadDump/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/threadDump,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/environment/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/environment,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/rdd/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/rdd,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/pool/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/pool,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs/job/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs/job,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs,null}
15/09/27 09:45:56 WARN util.Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
15/09/27 09:45:56 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/09/27 09:45:56 INFO server.AbstractConnector: Started SelectChannelConnector@0.0.0.0:4041
15/09/27 09:45:56 INFO util.Utils: Successfully started service 'SparkUI' on port 4041.
15/09/27 09:45:56 INFO ui.SparkUI: Started SparkUI at http://10.0.1.5:4041
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/commons-cli/commons-cli/1.2/commons-cli-1.2.jar at http://10.0.1.5:60838/jars/commons-cli-1.2.jar with timestamp 1443372356368
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/commons-httpclient/commons-httpclient/3.1/commons-httpclient-3.1.jar at http://10.0.1.5:60838/jars/commons-httpclient-3.1.jar with timestamp 1443372356374
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/commons-codec/commons-codec/1.4/commons-codec-1.4.jar at http://10.0.1.5:60838/jars/commons-codec-1.4.jar with timestamp 1443372356376
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/commons-logging/commons-logging/1.1.1/commons-logging-1.1.1.jar at http://10.0.1.5:60838/jars/commons-logging-1.1.1.jar with timestamp 1443372356381
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar at http://10.0.1.5:60838/jars/commons-compress-1.4.1.jar with timestamp 1443372356403
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/slf4j/slf4j-api/1.7.10/slf4j-api-1.7.10.jar at http://10.0.1.5:60838/jars/slf4j-api-1.7.10.jar with timestamp 1443372356436
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/log4j/log4j/1.2.17/log4j-1.2.17.jar at http://10.0.1.5:60838/jars/log4j-1.2.17.jar with timestamp 1443372356443
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/xerial/snappy/snappy-java/1.1.1.7/snappy-java-1.1.1.7.jar at http://10.0.1.5:60838/jars/snappy-java-1.1.1.7.jar with timestamp 1443372356501
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/thoughtworks/paranamer/paranamer/2.6/paranamer-2.6.jar at http://10.0.1.5:60838/jars/paranamer-2.6.jar with timestamp 1443372356512
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/utils/utils-io_2.10/0.2.3/utils-io_2.10-0.2.3.jar at http://10.0.1.5:60838/jars/utils-io_2.10-0.2.3.jar with timestamp 1443372356519
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/utils/utils-misc_2.10/0.2.3/utils-misc_2.10-0.2.3.jar at http://10.0.1.5:60838/jars/utils-misc_2.10-0.2.3.jar with timestamp 1443372356521
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/httpcomponents/httpclient/4.3.2/httpclient-4.3.2.jar at http://10.0.1.5:60838/jars/httpclient-4.3.2.jar with timestamp 1443372356574
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/httpcomponents/httpcore/4.3.1/httpcore-4.3.1.jar at http://10.0.1.5:60838/jars/httpcore-4.3.1.jar with timestamp 1443372356601
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/utils/utils-cli_2.10/0.2.3/utils-cli_2.10-0.2.3.jar at http://10.0.1.5:60838/jars/utils-cli_2.10-0.2.3.jar with timestamp 1443372356623
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-avro/1.8.1/parquet-avro-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-avro-1.8.1.jar with timestamp 1443372356655
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-column/1.8.1/parquet-column-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-column-1.8.1.jar with timestamp 1443372356687
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-common/1.8.1/parquet-common-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-common-1.8.1.jar with timestamp 1443372356695
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-encoding/1.8.1/parquet-encoding-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-encoding-1.8.1.jar with timestamp 1443372356702
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-hadoop/1.8.1/parquet-hadoop-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-hadoop-1.8.1.jar with timestamp 1443372356708
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-jackson/1.8.1/parquet-jackson-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-jackson-1.8.1.jar with timestamp 1443372356722
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-format/2.3.0-incubating/parquet-format-2.3.0-incubating.jar at http://10.0.1.5:60838/jars/parquet-format-2.3.0-incubating.jar with timestamp 1443372356747
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/utils/utils-metrics_2.10/0.2.3/utils-metrics_2.10-0.2.3.jar at http://10.0.1.5:60838/jars/utils-metrics_2.10-0.2.3.jar with timestamp 1443372356755
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/netflix/servo/servo-core/0.5.5/servo-core-0.5.5.jar at http://10.0.1.5:60838/jars/servo-core-0.5.5.jar with timestamp 1443372356802
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/google/code/findbugs/annotations/2.0.0/annotations-2.0.0.jar at http://10.0.1.5:60838/jars/annotations-2.0.0.jar with timestamp 1443372356808
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/scoverage/scalac-scoverage-plugin_2.10/0.99.2/scalac-scoverage-plugin_2.10-0.99.2.jar at http://10.0.1.5:60838/jars/scalac-scoverage-plugin_2.10-0.99.2.jar with timestamp 1443372356831
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/commons-io/commons-io/1.3.2/commons-io-1.3.2.jar at http://10.0.1.5:60838/jars/commons-io-1.3.2.jar with timestamp 1443372356867
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/bdg-formats/bdg-formats/0.4.0/bdg-formats-0.4.0.jar at http://10.0.1.5:60838/jars/bdg-formats-0.4.0.jar with timestamp 1443372356869
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/avro/avro/1.7.6/avro-1.7.6.jar at http://10.0.1.5:60838/jars/avro-1.7.6.jar with timestamp 1443372356882
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/codehaus/jackson/jackson-core-asl/1.9.13/jackson-core-asl-1.9.13.jar at http://10.0.1.5:60838/jars/jackson-core-asl-1.9.13.jar with timestamp 1443372356886
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/codehaus/jackson/jackson-mapper-asl/1.9.13/jackson-mapper-asl-1.9.13.jar at http://10.0.1.5:60838/jars/jackson-mapper-asl-1.9.13.jar with timestamp 1443372356915
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/adam/adam-core_2.10/0.17.2-SNAPSHOT/adam-core_2.10-0.17.2-SNAPSHOT.jar at http://10.0.1.5:60838/jars/adam-core_2.10-0.17.2-SNAPSHOT.jar with timestamp 1443372356953
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/esotericsoftware/kryo/kryo/2.21/kryo-2.21.jar at http://10.0.1.5:60838/jars/kryo-2.21.jar with timestamp 1443372356995
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/esotericsoftware/reflectasm/reflectasm/1.07/reflectasm-1.07-shaded.jar at http://10.0.1.5:60838/jars/reflectasm-1.07-shaded.jar with timestamp 1443372357147
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/ow2/asm/asm/4.0/asm-4.0.jar at http://10.0.1.5:60838/jars/asm-4.0.jar with timestamp 1443372357152
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/esotericsoftware/minlog/minlog/1.2/minlog-1.2.jar at http://10.0.1.5:60838/jars/minlog-1.2.jar with timestamp 1443372357153
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/objenesis/objenesis/1.2/objenesis-1.2.jar at http://10.0.1.5:60838/jars/objenesis-1.2.jar with timestamp 1443372357155
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/it/unimi/dsi/fastutil/6.4.4/fastutil-6.4.4.jar at http://10.0.1.5:60838/jars/fastutil-6.4.4.jar with timestamp 1443372357424
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-scala_2.10/1.8.1/parquet-scala_2.10-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-scala_2.10-1.8.1.jar with timestamp 1443372357485
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/seqdoop/hadoop-bam/7.0.0/hadoop-bam-7.0.0.jar at http://10.0.1.5:60838/jars/hadoop-bam-7.0.0.jar with timestamp 1443372357534
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/github/samtools/htsjdk/1.133/htsjdk-1.133.jar at http://10.0.1.5:60838/jars/htsjdk-1.133.jar with timestamp 1443372357561
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/commons/commons-jexl/2.1.1/commons-jexl-2.1.1.jar at http://10.0.1.5:60838/jars/commons-jexl-2.1.1.jar with timestamp 1443372357576
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/tukaani/xz/1.5/xz-1.5.jar at http://10.0.1.5:60838/jars/xz-1.5.jar with timestamp 1443372357581
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/ant/ant/1.8.2/ant-1.8.2.jar at http://10.0.1.5:60838/jars/ant-1.8.2.jar with timestamp 1443372357612
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/ant/ant-launcher/1.8.2/ant-launcher-1.8.2.jar at http://10.0.1.5:60838/jars/ant-launcher-1.8.2.jar with timestamp 1443372357622
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/testng/testng/6.8.8/testng-6.8.8.jar at http://10.0.1.5:60838/jars/testng-6.8.8.jar with timestamp 1443372357639
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/beanshell/bsh/2.0b4/bsh-2.0b4.jar at http://10.0.1.5:60838/jars/bsh-2.0b4.jar with timestamp 1443372357685
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/beust/jcommander/1.27/jcommander-1.27.jar at http://10.0.1.5:60838/jars/jcommander-1.27.jar with timestamp 1443372357692
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/google/guava/guava/14.0.1/guava-14.0.1.jar at http://10.0.1.5:60838/jars/guava-14.0.1.jar with timestamp 1443372357744
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/adam/adam-apis_2.10/0.17.2-SNAPSHOT/adam-apis_2.10-0.17.2-SNAPSHOT.jar at http://10.0.1.5:60838/jars/adam-apis_2.10-0.17.2-SNAPSHOT.jar with timestamp 1443372357758
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/scala-lang/scala-library/2.10.4/scala-library-2.10.4.jar at http://10.0.1.5:60838/jars/scala-library-2.10.4.jar with timestamp 1443372357947
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/slf4j/slf4j-log4j12/1.7.5/slf4j-log4j12-1.7.5.jar at http://10.0.1.5:60838/jars/slf4j-log4j12-1.7.5.jar with timestamp 1443372357980
15/09/27 09:45:58 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/args4j/args4j/2.0.23/args4j-2.0.23.jar at http://10.0.1.5:60838/jars/args4j-2.0.23.jar with timestamp 1443372358026
15/09/27 09:45:58 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/adam/adam-cli_2.10/0.17.2-SNAPSHOT/adam-cli_2.10-0.17.2-SNAPSHOT.jar at http://10.0.1.5:60838/jars/adam-cli_2.10-0.17.2-SNAPSHOT.jar with timestamp 1443372358030
15/09/27 09:45:58 WARN metrics.MetricsSystem: Using default name DAGScheduler for source because spark.app.id is not set.
15/09/27 09:45:58 INFO executor.Executor: Starting executor ID driver on host localhost
15/09/27 09:45:58 INFO executor.Executor: Using REPL class URI: http://10.0.1.5:60826
15/09/27 09:45:59 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 60842.
15/09/27 09:45:59 INFO netty.NettyBlockTransferService: Server created on 60842
15/09/27 09:45:59 INFO storage.BlockManagerMaster: Trying to register BlockManager
15/09/27 09:45:59 INFO storage.BlockManagerMasterEndpoint: Registering block manager localhost:60842 with 530.0 MB RAM, BlockManagerId(driver, localhost, 60842)
15/09/27 09:45:59 INFO storage.BlockManagerMaster: Registered BlockManager
15/09/27 09:46:00 INFO repl.SparkILoop: Created spark context..
Spark context available as sc.
15/09/27 09:46:04 INFO repl.SparkILoop: Created sql context..
SQL context available as sqlContext.

scala> Stopping spark context.
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/static/sql,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/SQL/execution/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/SQL/execution,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/SQL/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/SQL,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/metrics/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage/kill,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/api,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/static,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/threadDump/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/threadDump,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/environment/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/environment,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/rdd/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/rdd,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/pool/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/pool,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs/job/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs/job,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs,null}
15/09/27 09:46:54 INFO ui.SparkUI: Stopped Spark web UI at http://10.0.1.5:4041
15/09/27 09:46:54 INFO scheduler.DAGScheduler: Stopping DAGScheduler
15/09/27 09:46:54 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
15/09/27 09:46:54 INFO storage.MemoryStore: MemoryStore cleared
15/09/27 09:46:54 INFO storage.BlockManager: BlockManager stopped
15/09/27 09:46:54 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
15/09/27 09:46:54 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
15/09/27 09:46:54 INFO spark.SparkContext: Successfully stopped SparkContext
15/09/27 09:46:54 INFO util.ShutdownHookManager: Shutdown hook called
15/09/27 09:46:54 INFO remote.RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
15/09/27 09:46:54 INFO util.ShutdownHookManager: Deleting directory /private/var/folders/nj/nphdkhyj6s1dttb0pd9zb2wc0000gn/T/spark-94dfb979-530a-4b9d-8109-2dec5a277611
15/09/27 09:46:54 INFO remote.RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
15/09/27 09:46:54 INFO util.ShutdownHookManager: Deleting directory /private/var/folders/nj/nphdkhyj6s1dttb0pd9zb2wc0000gn/T/spark-c735407c-acd9-4be7-ac3a-2de4e82af5e4



dbl001 commented Sep 27, 2015

It’s coming from Spark. I’m investigating ...

http://stackoverflow.com/questions/32800018/hadoop-2-6-1-warning-warn-util-nativecodeloader/32800241?noredirect=1#comment53458542_32800241
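
For reference, two common workarounds for this warning, as a sketch only: the native-library path below is the example path from this thread, so substitute your own Hadoop build's `lib/native` directory.

```shell
# Option 1: point Spark's JVM at the native Hadoop libraries, e.g. in
# conf/spark-env.sh (path is the example build from this thread):
export SPARK_SUBMIT_OPTS="-Djava.library.path=/Users/davidlaxer/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT/lib/native"

# Option 2: if no native build exists for your platform, silence just this
# logger in conf/log4j.properties instead of the whole WARN level:
#   log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR
```

Option 2 only hides the message; Hadoop still falls back to the builtin-java classes, which is functionally fine but slower for compression codecs.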

On Sep 27, 2015, at 10:15 AM, Ryan Williams notifications@github.com wrote:

FWIW I think I see that warning every time I run an adam-shell; I don't think it hurts anything, though any leads on making it go away are welcome.

On Sun, Sep 27, 2015 at 12:50 PM dbl notifications@github.com wrote:

15/09/27 09:45:29 INFO util.Utils: Successfully started service 'HTTP class server' on port 60826.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.5.0-SNAPSHOT
      /_/

Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_05)
Type in expressions to have them evaluated.
Type :help for more information.
15/09/27 09:45:50 INFO spark.SparkContext: Running Spark version 1.5.0-SNAPSHOT
15/09/27 09:45:50 INFO spark.SecurityManager: Changing view acls to: davidlaxer
15/09/27 09:45:50 INFO spark.SecurityManager: Changing modify acls to: davidlaxer
15/09/27 09:45:50 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(davidlaxer); users with modify permissions: Set(davidlaxer)
15/09/27 09:45:52 INFO slf4j.Slf4jLogger: Slf4jLogger started
15/09/27 09:45:53 INFO Remoting: Starting remoting
15/09/27 09:45:54 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@10.0.1.5:60837]
15/09/27 09:45:54 INFO util.Utils: Successfully started service 'sparkDriver' on port 60837.
15/09/27 09:45:54 INFO spark.SparkEnv: Registering MapOutputTracker
15/09/27 09:45:54 INFO spark.SparkEnv: Registering BlockManagerMaster
15/09/27 09:45:54 INFO storage.DiskBlockManager: Created local directory at /private/var/folders/nj/nphdkhyj6s1dttb0pd9zb2wc0000gn/T/blockmgr-57acd6fd-754f-4d86-9226-bbcead030b70
15/09/27 09:45:54 INFO storage.MemoryStore: MemoryStore started with capacity 530.0 MB
15/09/27 09:45:54 INFO spark.HttpFileServer: HTTP File server directory is /private/var/folders/nj/nphdkhyj6s1dttb0pd9zb2wc0000gn/T/spark-c735407c-acd9-4be7-ac3a-2de4e82af5e4/httpd-72e12a03-5d15-41cc-a524-c74725d6ec0f
15/09/27 09:45:54 INFO spark.HttpServer: Starting HTTP Server
15/09/27 09:45:54 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/09/27 09:45:54 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:60838
15/09/27 09:45:54 INFO util.Utils: Successfully started service 'HTTP file server' on port 60838.
15/09/27 09:45:54 INFO spark.SparkEnv: Registering OutputCommitCoordinator
15/09/27 09:45:55 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/09/27 09:45:55 WARN component.AbstractLifeCycle: FAILED SelectChannelConnector@0.0.0.0:4040: java.net.BindException: Address already in use
java.net.BindException: Address already in use
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:414)
at sun.nio.ch.Net.bind(Net.java:406)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at org.eclipse.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
at org.eclipse.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
at org.eclipse.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.eclipse.jetty.server.Server.doStart(Server.java:293)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:240)
at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:250)
at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:250)
at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1912)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1903)
at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:250)
at org.apache.spark.ui.WebUI.bind(WebUI.scala:136)
at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:465)
at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:465)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:465)
at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
at $line3.$read$$iwC$$iwC.<init>(<console>:9)
at $line3.$read$$iwC.<init>(<console>:18)
at $line3.$read.<init>(<console>:20)
at $line3.$read$.<init>(<console>:24)
at $line3.$read$.<clinit>(<console>)
at $line3.$eval$.<init>(<console>:7)
at $line3.$eval$.<clinit>(<console>)
at $line3.$eval.$print()
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/09/27 09:45:55 WARN component.AbstractLifeCycle: FAILED org.eclipse.jetty.server.Server@22686ddb: java.net.BindException: Address already in use
java.net.BindException: Address already in use
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:414)
at sun.nio.ch.Net.bind(Net.java:406)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at org.eclipse.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
at org.eclipse.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
at org.eclipse.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.eclipse.jetty.server.Server.doStart(Server.java:293)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:240)
at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:250)
at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:250)
at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1912)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1903)
at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:250)
at org.apache.spark.ui.WebUI.bind(WebUI.scala:136)
at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:465)
at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:465)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:465)
at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
at $line3.$read$$iwC$$iwC.<init>(<console>:9)
at $line3.$read$$iwC.<init>(<console>:18)
at $line3.$read.<init>(<console>:20)
at $line3.$read$.<init>(<console>:24)
at $line3.$read$.<clinit>(<console>)
at $line3.$eval$.<init>(<console>:7)
at $line3.$eval$.<clinit>(<console>)
at $line3.$eval.$print()
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage/kill,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/api,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/static,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/threadDump/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/threadDump,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/environment/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/environment,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/rdd/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/rdd,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/pool/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/pool,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs/job/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs/job,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs,null}
15/09/27 09:45:56 WARN util.Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
15/09/27 09:45:56 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/09/27 09:45:56 INFO server.AbstractConnector: Started SelectChannelConnector@0.0.0.0:4041
15/09/27 09:45:56 INFO util.Utils: Successfully started service 'SparkUI' on port 4041.
15/09/27 09:45:56 INFO ui.SparkUI: Started SparkUI at http://10.0.1.5:4041
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/commons-cli/commons-cli/1.2/commons-cli-1.2.jar at http://10.0.1.5:60838/jars/commons-cli-1.2.jar with timestamp 1443372356368
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/commons-httpclient/commons-httpclient/3.1/commons-httpclient-3.1.jar at http://10.0.1.5:60838/jars/commons-httpclient-3.1.jar with timestamp 1443372356374
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/commons-codec/commons-codec/1.4/commons-codec-1.4.jar at http://10.0.1.5:60838/jars/commons-codec-1.4.jar with timestamp 1443372356376
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/commons-logging/commons-logging/1.1.1/commons-logging-1.1.1.jar at http://10.0.1.5:60838/jars/commons-logging-1.1.1.jar with timestamp 1443372356381
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar at http://10.0.1.5:60838/jars/commons-compress-1.4.1.jar with timestamp 1443372356403
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/slf4j/slf4j-api/1.7.10/slf4j-api-1.7.10.jar at http://10.0.1.5:60838/jars/slf4j-api-1.7.10.jar with timestamp 1443372356436
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/log4j/log4j/1.2.17/log4j-1.2.17.jar at http://10.0.1.5:60838/jars/log4j-1.2.17.jar with timestamp 1443372356443
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/xerial/snappy/snappy-java/1.1.1.7/snappy-java-1.1.1.7.jar at http://10.0.1.5:60838/jars/snappy-java-1.1.1.7.jar with timestamp 1443372356501
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/thoughtworks/paranamer/paranamer/2.6/paranamer-2.6.jar at http://10.0.1.5:60838/jars/paranamer-2.6.jar with timestamp 1443372356512
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/utils/utils-io_2.10/0.2.3/utils-io_2.10-0.2.3.jar at http://10.0.1.5:60838/jars/utils-io_2.10-0.2.3.jar with timestamp 1443372356519
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/utils/utils-misc_2.10/0.2.3/utils-misc_2.10-0.2.3.jar at http://10.0.1.5:60838/jars/utils-misc_2.10-0.2.3.jar with timestamp 1443372356521
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/httpcomponents/httpclient/4.3.2/httpclient-4.3.2.jar at http://10.0.1.5:60838/jars/httpclient-4.3.2.jar with timestamp 1443372356574
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/httpcomponents/httpcore/4.3.1/httpcore-4.3.1.jar at http://10.0.1.5:60838/jars/httpcore-4.3.1.jar with timestamp 1443372356601
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/utils/utils-cli_2.10/0.2.3/utils-cli_2.10-0.2.3.jar at http://10.0.1.5:60838/jars/utils-cli_2.10-0.2.3.jar with timestamp 1443372356623
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-avro/1.8.1/parquet-avro-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-avro-1.8.1.jar with timestamp 1443372356655
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-column/1.8.1/parquet-column-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-column-1.8.1.jar with timestamp 1443372356687
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-common/1.8.1/parquet-common-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-common-1.8.1.jar with timestamp 1443372356695
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-encoding/1.8.1/parquet-encoding-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-encoding-1.8.1.jar with timestamp 1443372356702
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-hadoop/1.8.1/parquet-hadoop-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-hadoop-1.8.1.jar with timestamp 1443372356708
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-jackson/1.8.1/parquet-jackson-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-jackson-1.8.1.jar with timestamp 1443372356722
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-format/2.3.0-incubating/parquet-format-2.3.0-incubating.jar at http://10.0.1.5:60838/jars/parquet-format-2.3.0-incubating.jar with timestamp 1443372356747
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/utils/utils-metrics_2.10/0.2.3/utils-metrics_2.10-0.2.3.jar at http://10.0.1.5:60838/jars/utils-metrics_2.10-0.2.3.jar with timestamp 1443372356755
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/netflix/servo/servo-core/0.5.5/servo-core-0.5.5.jar at http://10.0.1.5:60838/jars/servo-core-0.5.5.jar with timestamp 1443372356802
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/google/code/findbugs/annotations/2.0.0/annotations-2.0.0.jar at http://10.0.1.5:60838/jars/annotations-2.0.0.jar with timestamp 1443372356808
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/scoverage/scalac-scoverage-plugin_2.10/0.99.2/scalac-scoverage-plugin_2.10-0.99.2.jar at http://10.0.1.5:60838/jars/scalac-scoverage-plugin_2.10-0.99.2.jar with timestamp 1443372356831
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/commons-io/commons-io/1.3.2/commons-io-1.3.2.jar at http://10.0.1.5:60838/jars/commons-io-1.3.2.jar with timestamp 1443372356867
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/bdg-formats/bdg-formats/0.4.0/bdg-formats-0.4.0.jar at http://10.0.1.5:60838/jars/bdg-formats-0.4.0.jar with timestamp 1443372356869
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/avro/avro/1.7.6/avro-1.7.6.jar at http://10.0.1.5:60838/jars/avro-1.7.6.jar with timestamp 1443372356882
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/codehaus/jackson/jackson-core-asl/1.9.13/jackson-core-asl-1.9.13.jar at http://10.0.1.5:60838/jars/jackson-core-asl-1.9.13.jar with timestamp 1443372356886
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/codehaus/jackson/jackson-mapper-asl/1.9.13/jackson-mapper-asl-1.9.13.jar at http://10.0.1.5:60838/jars/jackson-mapper-asl-1.9.13.jar with timestamp 1443372356915
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/adam/adam-core_2.10/0.17.2-SNAPSHOT/adam-core_2.10-0.17.2-SNAPSHOT.jar at http://10.0.1.5:60838/jars/adam-core_2.10-0.17.2-SNAPSHOT.jar with timestamp 1443372356953
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/esotericsoftware/kryo/kryo/2.21/kryo-2.21.jar at http://10.0.1.5:60838/jars/kryo-2.21.jar with timestamp 1443372356995
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/esotericsoftware/reflectasm/reflectasm/1.07/reflectasm-1.07-shaded.jar at http://10.0.1.5:60838/jars/reflectasm-1.07-shaded.jar with timestamp 1443372357147
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/ow2/asm/asm/4.0/asm-4.0.jar at http://10.0.1.5:60838/jars/asm-4.0.jar with timestamp 1443372357152
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/esotericsoftware/minlog/minlog/1.2/minlog-1.2.jar at http://10.0.1.5:60838/jars/minlog-1.2.jar with timestamp 1443372357153
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/objenesis/objenesis/1.2/objenesis-1.2.jar at http://10.0.1.5:60838/jars/objenesis-1.2.jar with timestamp 1443372357155
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/it/unimi/dsi/fastutil/6.4.4/fastutil-6.4.4.jar at http://10.0.1.5:60838/jars/fastutil-6.4.4.jar with timestamp 1443372357424
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-scala_2.10/1.8.1/parquet-scala_2.10-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-scala_2.10-1.8.1.jar with timestamp 1443372357485
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/seqdoop/hadoop-bam/7.0.0/hadoop-bam-7.0.0.jar at http://10.0.1.5:60838/jars/hadoop-bam-7.0.0.jar with timestamp 1443372357534
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/github/samtools/htsjdk/1.133/htsjdk-1.133.jar at http://10.0.1.5:60838/jars/htsjdk-1.133.jar with timestamp 1443372357561
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/commons/commons-jexl/2.1.1/commons-jexl-2.1.1.jar at http://10.0.1.5:60838/jars/commons-jexl-2.1.1.jar with timestamp 1443372357576
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/tukaani/xz/1.5/xz-1.5.jar at http://10.0.1.5:60838/jars/xz-1.5.jar with timestamp 1443372357581
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/ant/ant/1.8.2/ant-1.8.2.jar at http://10.0.1.5:60838/jars/ant-1.8.2.jar with timestamp 1443372357612
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/ant/ant-launcher/1.8.2/ant-launcher-1.8.2.jar at http://10.0.1.5:60838/jars/ant-launcher-1.8.2.jar with timestamp 1443372357622
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/testng/testng/6.8.8/testng-6.8.8.jar at http://10.0.1.5:60838/jars/testng-6.8.8.jar with timestamp 1443372357639
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/beanshell/bsh/2.0b4/bsh-2.0b4.jar at http://10.0.1.5:60838/jars/bsh-2.0b4.jar with timestamp 1443372357685
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/beust/jcommander/1.27/jcommander-1.27.jar at http://10.0.1.5:60838/jars/jcommander-1.27.jar with timestamp 1443372357692
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/google/guava/guava/14.0.1/guava-14.0.1.jar at http://10.0.1.5:60838/jars/guava-14.0.1.jar with timestamp 1443372357744
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/adam/adam-apis_2.10/0.17.2-SNAPSHOT/adam-apis_2.10-0.17.2-SNAPSHOT.jar at http://10.0.1.5:60838/jars/adam-apis_2.10-0.17.2-SNAPSHOT.jar with timestamp 1443372357758
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/scala-lang/scala-library/2.10.4/scala-library-2.10.4.jar at http://10.0.1.5:60838/jars/scala-library-2.10.4.jar with timestamp 1443372357947
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/slf4j/slf4j-log4j12/1.7.5/slf4j-log4j12-1.7.5.jar at http://10.0.1.5:60838/jars/slf4j-log4j12-1.7.5.jar with timestamp 1443372357980
15/09/27 09:45:58 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/args4j/args4j/2.0.23/args4j-2.0.23.jar at http://10.0.1.5:60838/jars/args4j-2.0.23.jar with timestamp 1443372358026
15/09/27 09:45:58 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/adam/adam-cli_2.10/0.17.2-SNAPSHOT/adam-cli_2.10-0.17.2-SNAPSHOT.jar at http://10.0.1.5:60838/jars/adam-cli_2.10-0.17.2-SNAPSHOT.jar with timestamp 1443372358030
15/09/27 09:45:58 WARN metrics.MetricsSystem: Using default name DAGScheduler for source because spark.app.id is not set.
15/09/27 09:45:58 INFO executor.Executor: Starting executor ID driver on host localhost
15/09/27 09:45:58 INFO executor.Executor: Using REPL class URI: http://10.0.1.5:60826
15/09/27 09:45:59 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 60842.
15/09/27 09:45:59 INFO netty.NettyBlockTransferService: Server created on 60842
15/09/27 09:45:59 INFO storage.BlockManagerMaster: Trying to register BlockManager
15/09/27 09:45:59 INFO storage.BlockManagerMasterEndpoint: Registering block manager localhost:60842 with 530.0 MB RAM, BlockManagerId(driver, localhost, 60842)
15/09/27 09:45:59 INFO storage.BlockManagerMaster: Registered BlockManager
15/09/27 09:46:00 INFO repl.SparkILoop: Created spark context..
Spark context available as sc.
15/09/27 09:46:04 INFO repl.SparkILoop: Created sql context..
SQL context available as sqlContext.

scala> Stopping spark context.
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/static/sql,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/SQL/execution/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/SQL/execution,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/SQL/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/SQL,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/metrics/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage/kill,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/api,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/static,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/threadDump/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/threadDump,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/environment/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/environment,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/rdd/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/rdd,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/pool/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/pool,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs/job/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs/job,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs,null}
15/09/27 09:46:54 INFO ui.SparkUI: Stopped Spark web UI at http://10.0.1.5:4041
15/09/27 09:46:54 INFO scheduler.DAGScheduler: Stopping DAGScheduler
15/09/27 09:46:54 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
15/09/27 09:46:54 INFO storage.MemoryStore: MemoryStore cleared
15/09/27 09:46:54 INFO storage.BlockManager: BlockManager stopped
15/09/27 09:46:54 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
15/09/27 09:46:54 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
15/09/27 09:46:54 INFO spark.SparkContext: Successfully stopped SparkContext
15/09/27 09:46:54 INFO util.ShutdownHookManager: Shutdown hook called
15/09/27 09:46:54 INFO remote.RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
15/09/27 09:46:54 INFO util.ShutdownHookManager: Deleting directory /private/var/folders/nj/nphdkhyj6s1dttb0pd9zb2wc0000gn/T/spark-94dfb979-530a-4b9d-8109-2dec5a277611
15/09/27 09:46:54 INFO remote.RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
15/09/27 09:46:54 INFO util.ShutdownHookManager: Deleting directory /private/var/folders/nj/nphdkhyj6s1dttb0pd9zb2wc0000gn/T/spark-c735407c-acd9-4be7-ac3a-2de4e82af5e4




dbl001 commented Sep 28, 2015

Still no solution.

Do you know why adam-shell gets these Java errors:

r@0.0.0.0:4040: java.net.BindException: Address already in use
java.net.BindException: Address already in use
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:414)
at sun.nio.ch.Net.bind(Net.java:406)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at org.spark-project.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
at org.spark-project.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
at org.spark-project.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.spark-project.jetty.server.Server.doStart(Server.java:293)
at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:236)
at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:246)
at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:246)
at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1913)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1904)
at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:246)
at org.apache.spark.ui.WebUI.bind(WebUI.scala:136)
at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:474)
at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:474)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.SparkContext.(SparkContext.scala:474)
at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
at $line3.$read$$iwC$$iwC.(:9)
at $line3.$read$$iwC.(:18)
at $line3.$read.(:20)
at $line3.$read$.(:24)
at $line3.$read$.()
at $line3.$eval$.(:7)
at $line3.$eval$.()
at $line3.$eval.$print()
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/09/27 19:39:33 WARN component.AbstractLifeCycle: FAILED org.spark-project.jetty.server.Server@7d64a960: java.net.BindException: Address already in use
java.net.BindException: Address already in use
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:414)
at sun.nio.ch.Net.bind(Net.java:406)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at org.spark-project.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
at org.spark-project.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
at org.spark-project.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.spark-project.jetty.server.Server.doStart(Server.java:293)
at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:236)
at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:246)
at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:246)
at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1913)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1904)
at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:246)
at org.apache.spark.ui.WebUI.bind(WebUI.scala:136)
at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:474)
at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:474)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.SparkContext.(SparkContext.scala:474)
at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
at $line3.$read$$iwC$$iwC.(:9)
at $line3.$read$$iwC.(:18)
at $line3.$read.(:20)
at $line3.$read$.(:24)
at $line3.$read$.()
at $line3.$eval$.(:7)
at $line3.$eval$.()
at $line3.$eval.$print()
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/09/27 19:39:33 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
15/09/27 19:39:33 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/api,null}
15/09/27 19:39:33 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHan
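The repeated `java.net.BindException: Address already in use` above just means something (often a previous Spark shell that never exited) is still bound to the web UI port 4040, so Spark retries 4041, 4042, and so on. A minimal sketch to check the ports before launching — the helper name and host are my own, not part of Spark:

```python
import socket

def port_in_use(port, host="127.0.0.1"):
    """Return True if something is already listening on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(1.0)
        return s.connect_ex((host, port)) == 0

# Spark's web UI defaults to 4040; if taken, Spark falls back to 4041, 4042, ...
for p in (4040, 4041):
    print(p, "in use" if port_in_use(p) else "free")
```

On OS X, `lsof -nP -iTCP:4040` will show which process is holding the port so it can be killed.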

On Sep 27, 2015, at 11:03 AM, David Laxer davidl@softintel.com wrote:

It’s coming from Spark. I’m investigating ...

http://stackoverflow.com/questions/32800018/hadoop-2-6-1-warning-warn-util-nativecodeloader/32800241?noredirect=1#comment53458542_32800241

On Sep 27, 2015, at 10:15 AM, Ryan Williams <notifications@github.com> wrote:

FWIW I think I see that warning every time I run an adam-shell; I don't
think it hurts anything though any leads on making it go away are welcome.
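For what it's worth, two approaches that are commonly suggested for this warning — a sketch only, assuming native Hadoop libs were built under `$HADOOP_HOME/lib/native` and that Spark picks up `conf/log4j.properties`; the exact paths are illustrative:

```shell
# Point the JVM that spark-shell launches at Hadoop's native libraries.
# (JAVA_LIBRARY_PATH is honored by the Hadoop scripts; SPARK_SUBMIT_OPTS
# passes -Djava.library.path to the driver JVM. Adjust the path to wherever
# your libhadoop actually lives.)
export JAVA_LIBRARY_PATH="$HADOOP_HOME/lib/native"
export SPARK_SUBMIT_OPTS="-Djava.library.path=$HADOOP_HOME/lib/native"

# Or, to just silence the warning, raise the log level for NativeCodeLoader:
echo 'log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR' \
  >> "$SPARK_HOME/conf/log4j.properties"
```

Note the warning is harmless on its own: Spark simply falls back to the builtin-java implementations.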

On Sun, Sep 27, 2015 at 12:50 PM dbl <notifications@github.com> wrote:

I'm getting a warning running 'bin/adam-shell', on OS X 10.10.5:

15/09/27 09:45:26 WARN util.NativeCodeLoader: Unable to load native-hadoop
library for your platform... using builtin-java classes where applicable

I had this problem with the binary distributions of Hadoop (e.g. - 2.3.0,
2.6.1) until I built hadoop from source with the 'native' option enabled
and set this environment variable:

HADOOP_COMMON_LIB_NATIVE_DIR=/Users/davidlaxer/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT/lib/native

Is this a Spark Warning? Do I have to set a SPARK environment variable?
Is there an ADAM option I must set?

$ bin/adam-shell
Using SPARK_SHELL=/Users/davidlaxer/spark/bin/spark-shell
15/09/27 09:45:26 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/09/27 09:45:27 INFO spark.SecurityManager: Changing view acls to: davidlaxer
15/09/27 09:45:27 INFO spark.SecurityManager: Changing modify acls to: davidlaxer
15/09/27 09:45:27 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(davidlaxer); users with modify permissions: Set(davidlaxer)
15/09/27 09:45:28 INFO spark.HttpServer: Starting HTTP Server
15/09/27 09:45:29 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/09/27 09:45:29 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:60826
15/09/27 09:45:29 INFO util.Utils: Successfully started service 'HTTP class server' on port 60826.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.5.0-SNAPSHOT
      /_/

Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_05)
Type in expressions to have them evaluated.
Type :help for more information.
15/09/27 09:45:50 INFO spark.SparkContext: Running Spark version 1.5.0-SNAPSHOT
15/09/27 09:45:50 INFO spark.SecurityManager: Changing view acls to: davidlaxer
15/09/27 09:45:50 INFO spark.SecurityManager: Changing modify acls to: davidlaxer
15/09/27 09:45:50 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(davidlaxer); users with modify permissions: Set(davidlaxer)
15/09/27 09:45:52 INFO slf4j.Slf4jLogger: Slf4jLogger started
15/09/27 09:45:53 INFO Remoting: Starting remoting
15/09/27 09:45:54 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@10.0.1.5:60837]
15/09/27 09:45:54 INFO util.Utils: Successfully started service 'sparkDriver' on port 60837.
15/09/27 09:45:54 INFO spark.SparkEnv: Registering MapOutputTracker
15/09/27 09:45:54 INFO spark.SparkEnv: Registering BlockManagerMaster
15/09/27 09:45:54 INFO storage.DiskBlockManager: Created local directory at /private/var/folders/nj/nphdkhyj6s1dttb0pd9zb2wc0000gn/T/blockmgr-57acd6fd-754f-4d86-9226-bbcead030b70
15/09/27 09:45:54 INFO storage.MemoryStore: MemoryStore started with capacity 530.0 MB
15/09/27 09:45:54 INFO spark.HttpFileServer: HTTP File server directory is /private/var/folders/nj/nphdkhyj6s1dttb0pd9zb2wc0000gn/T/spark-c735407c-acd9-4be7-ac3a-2de4e82af5e4/httpd-72e12a03-5d15-41cc-a524-c74725d6ec0f
15/09/27 09:45:54 INFO spark.HttpServer: Starting HTTP Server
15/09/27 09:45:54 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/09/27 09:45:54 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:60838
15/09/27 09:45:54 INFO util.Utils: Successfully started service 'HTTP file server' on port 60838.
15/09/27 09:45:54 INFO spark.SparkEnv: Registering OutputCommitCoordinator
15/09/27 09:45:55 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/09/27 09:45:55 WARN component.AbstractLifeCycle: FAILED SelectChannelConnector@0.0.0.0:4040: java.net.BindException: Address already in use
java.net.BindException: Address already in use
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:414)
at sun.nio.ch.Net.bind(Net.java:406)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at org.eclipse.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
at org.eclipse.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
at org.eclipse.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.eclipse.jetty.server.Server.doStart(Server.java:293)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:240)
at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:250)
at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:250)
at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1912)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1903)
at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:250)
at org.apache.spark.ui.WebUI.bind(WebUI.scala:136)
at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:465)
at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:465)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.SparkContext.(SparkContext.scala:465)
at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
at $line3.$read$$iwC$$iwC.(:9)
at $line3.$read$$iwC.(:18)
at $line3.$read.(:20)
at $line3.$read$.(:24)
at $line3.$read$.()
at $line3.$eval$.(:7)
at $line3.$eval$.()
at $line3.$eval.$print()
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/09/27 09:45:55 WARN component.AbstractLifeCycle: FAILED org.eclipse.jetty.server.Server@22686ddb: java.net.BindException: Address already in use
java.net.BindException: Address already in use
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:414)
at sun.nio.ch.Net.bind(Net.java:406)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at org.eclipse.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
at org.eclipse.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
at org.eclipse.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.eclipse.jetty.server.Server.doStart(Server.java:293)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:240)
at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:250)
at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:250)
at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1912)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1903)
at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:250)
at org.apache.spark.ui.WebUI.bind(WebUI.scala:136)
at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:465)
at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:465)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:465)
at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
at $line3.$read$$iwC$$iwC.<init>(<console>:9)
at $line3.$read$$iwC.<init>(<console>:18)
at $line3.$read.<init>(<console>:20)
at $line3.$read$.<init>(<console>:24)
at $line3.$read$.<clinit>(<console>)
at $line3.$eval$.<init>(<console>:7)
at $line3.$eval$.<clinit>(<console>)
at $line3.$eval.$print()
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage/kill,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/api,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/static,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/threadDump/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/threadDump,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/environment/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/environment,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/rdd/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/rdd,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/pool/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/pool,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs/job/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs/job,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs,null}
15/09/27 09:45:56 WARN util.Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
15/09/27 09:45:56 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/09/27 09:45:56 INFO server.AbstractConnector: Started SelectChannelConnector@0.0.0.0:4041
15/09/27 09:45:56 INFO util.Utils: Successfully started service 'SparkUI' on port 4041.
15/09/27 09:45:56 INFO ui.SparkUI: Started SparkUI at http://10.0.1.5:4041
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/commons-cli/commons-cli/1.2/commons-cli-1.2.jar at http://10.0.1.5:60838/jars/commons-cli-1.2.jar with timestamp 1443372356368
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/commons-httpclient/commons-httpclient/3.1/commons-httpclient-3.1.jar at http://10.0.1.5:60838/jars/commons-httpclient-3.1.jar with timestamp 1443372356374
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/commons-codec/commons-codec/1.4/commons-codec-1.4.jar at http://10.0.1.5:60838/jars/commons-codec-1.4.jar with timestamp 1443372356376
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/commons-logging/commons-logging/1.1.1/commons-logging-1.1.1.jar at http://10.0.1.5:60838/jars/commons-logging-1.1.1.jar with timestamp 1443372356381
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar at http://10.0.1.5:60838/jars/commons-compress-1.4.1.jar with timestamp 1443372356403
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/slf4j/slf4j-api/1.7.10/slf4j-api-1.7.10.jar at http://10.0.1.5:60838/jars/slf4j-api-1.7.10.jar with timestamp 1443372356436
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/log4j/log4j/1.2.17/log4j-1.2.17.jar at http://10.0.1.5:60838/jars/log4j-1.2.17.jar with timestamp 1443372356443
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/xerial/snappy/snappy-java/1.1.1.7/snappy-java-1.1.1.7.jar at http://10.0.1.5:60838/jars/snappy-java-1.1.1.7.jar with timestamp 1443372356501
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/thoughtworks/paranamer/paranamer/2.6/paranamer-2.6.jar at http://10.0.1.5:60838/jars/paranamer-2.6.jar with timestamp 1443372356512
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/utils/utils-io_2.10/0.2.3/utils-io_2.10-0.2.3.jar at http://10.0.1.5:60838/jars/utils-io_2.10-0.2.3.jar with timestamp 1443372356519
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/utils/utils-misc_2.10/0.2.3/utils-misc_2.10-0.2.3.jar at http://10.0.1.5:60838/jars/utils-misc_2.10-0.2.3.jar with timestamp 1443372356521
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/httpcomponents/httpclient/4.3.2/httpclient-4.3.2.jar at http://10.0.1.5:60838/jars/httpclient-4.3.2.jar with timestamp 1443372356574
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/httpcomponents/httpcore/4.3.1/httpcore-4.3.1.jar at http://10.0.1.5:60838/jars/httpcore-4.3.1.jar with timestamp 1443372356601
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/utils/utils-cli_2.10/0.2.3/utils-cli_2.10-0.2.3.jar at http://10.0.1.5:60838/jars/utils-cli_2.10-0.2.3.jar with timestamp 1443372356623
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-avro/1.8.1/parquet-avro-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-avro-1.8.1.jar with timestamp 1443372356655
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-column/1.8.1/parquet-column-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-column-1.8.1.jar with timestamp 1443372356687
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-common/1.8.1/parquet-common-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-common-1.8.1.jar with timestamp 1443372356695
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-encoding/1.8.1/parquet-encoding-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-encoding-1.8.1.jar with timestamp 1443372356702
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-hadoop/1.8.1/parquet-hadoop-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-hadoop-1.8.1.jar with timestamp 1443372356708
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-jackson/1.8.1/parquet-jackson-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-jackson-1.8.1.jar with timestamp 1443372356722
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-format/2.3.0-incubating/parquet-format-2.3.0-incubating.jar at http://10.0.1.5:60838/jars/parquet-format-2.3.0-incubating.jar with timestamp 1443372356747
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/utils/utils-metrics_2.10/0.2.3/utils-metrics_2.10-0.2.3.jar at http://10.0.1.5:60838/jars/utils-metrics_2.10-0.2.3.jar with timestamp 1443372356755
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/netflix/servo/servo-core/0.5.5/servo-core-0.5.5.jar at http://10.0.1.5:60838/jars/servo-core-0.5.5.jar with timestamp 1443372356802
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/google/code/findbugs/annotations/2.0.0/annotations-2.0.0.jar at http://10.0.1.5:60838/jars/annotations-2.0.0.jar with timestamp 1443372356808
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/scoverage/scalac-scoverage-plugin_2.10/0.99.2/scalac-scoverage-plugin_2.10-0.99.2.jar at http://10.0.1.5:60838/jars/scalac-scoverage-plugin_2.10-0.99.2.jar with timestamp 1443372356831
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/commons-io/commons-io/1.3.2/commons-io-1.3.2.jar at http://10.0.1.5:60838/jars/commons-io-1.3.2.jar with timestamp 1443372356867
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/bdg-formats/bdg-formats/0.4.0/bdg-formats-0.4.0.jar at http://10.0.1.5:60838/jars/bdg-formats-0.4.0.jar with timestamp 1443372356869
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/avro/avro/1.7.6/avro-1.7.6.jar at http://10.0.1.5:60838/jars/avro-1.7.6.jar with timestamp 1443372356882
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/codehaus/jackson/jackson-core-asl/1.9.13/jackson-core-asl-1.9.13.jar at http://10.0.1.5:60838/jars/jackson-core-asl-1.9.13.jar with timestamp 1443372356886
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/codehaus/jackson/jackson-mapper-asl/1.9.13/jackson-mapper-asl-1.9.13.jar at http://10.0.1.5:60838/jars/jackson-mapper-asl-1.9.13.jar with timestamp 1443372356915
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/adam/adam-core_2.10/0.17.2-SNAPSHOT/adam-core_2.10-0.17.2-SNAPSHOT.jar at http://10.0.1.5:60838/jars/adam-core_2.10-0.17.2-SNAPSHOT.jar with timestamp 1443372356953
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/esotericsoftware/kryo/kryo/2.21/kryo-2.21.jar at http://10.0.1.5:60838/jars/kryo-2.21.jar with timestamp 1443372356995
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/esotericsoftware/reflectasm/reflectasm/1.07/reflectasm-1.07-shaded.jar at http://10.0.1.5:60838/jars/reflectasm-1.07-shaded.jar with timestamp 1443372357147
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/ow2/asm/asm/4.0/asm-4.0.jar at http://10.0.1.5:60838/jars/asm-4.0.jar with timestamp 1443372357152
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/esotericsoftware/minlog/minlog/1.2/minlog-1.2.jar at http://10.0.1.5:60838/jars/minlog-1.2.jar with timestamp 1443372357153
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/objenesis/objenesis/1.2/objenesis-1.2.jar at http://10.0.1.5:60838/jars/objenesis-1.2.jar with timestamp 1443372357155
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/it/unimi/dsi/fastutil/6.4.4/fastutil-6.4.4.jar at http://10.0.1.5:60838/jars/fastutil-6.4.4.jar with timestamp 1443372357424
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-scala_2.10/1.8.1/parquet-scala_2.10-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-scala_2.10-1.8.1.jar with timestamp 1443372357485
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/seqdoop/hadoop-bam/7.0.0/hadoop-bam-7.0.0.jar at http://10.0.1.5:60838/jars/hadoop-bam-7.0.0.jar with timestamp 1443372357534
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/github/samtools/htsjdk/1.133/htsjdk-1.133.jar at http://10.0.1.5:60838/jars/htsjdk-1.133.jar with timestamp 1443372357561
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/commons/commons-jexl/2.1.1/commons-jexl-2.1.1.jar at http://10.0.1.5:60838/jars/commons-jexl-2.1.1.jar with timestamp 1443372357576
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/tukaani/xz/1.5/xz-1.5.jar at http://10.0.1.5:60838/jars/xz-1.5.jar with timestamp 1443372357581
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/ant/ant/1.8.2/ant-1.8.2.jar at http://10.0.1.5:60838/jars/ant-1.8.2.jar with timestamp 1443372357612
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/ant/ant-launcher/1.8.2/ant-launcher-1.8.2.jar at http://10.0.1.5:60838/jars/ant-launcher-1.8.2.jar with timestamp 1443372357622
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/testng/testng/6.8.8/testng-6.8.8.jar at http://10.0.1.5:60838/jars/testng-6.8.8.jar with timestamp 1443372357639
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/beanshell/bsh/2.0b4/bsh-2.0b4.jar at http://10.0.1.5:60838/jars/bsh-2.0b4.jar with timestamp 1443372357685
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/beust/jcommander/1.27/jcommander-1.27.jar at http://10.0.1.5:60838/jars/jcommander-1.27.jar with timestamp 1443372357692
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/google/guava/guava/14.0.1/guava-14.0.1.jar at http://10.0.1.5:60838/jars/guava-14.0.1.jar with timestamp 1443372357744
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/adam/adam-apis_2.10/0.17.2-SNAPSHOT/adam-apis_2.10-0.17.2-SNAPSHOT.jar at http://10.0.1.5:60838/jars/adam-apis_2.10-0.17.2-SNAPSHOT.jar with timestamp 1443372357758
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/scala-lang/scala-library/2.10.4/scala-library-2.10.4.jar at http://10.0.1.5:60838/jars/scala-library-2.10.4.jar with timestamp 1443372357947
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/slf4j/slf4j-log4j12/1.7.5/slf4j-log4j12-1.7.5.jar at http://10.0.1.5:60838/jars/slf4j-log4j12-1.7.5.jar with timestamp 1443372357980
15/09/27 09:45:58 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/args4j/args4j/2.0.23/args4j-2.0.23.jar at http://10.0.1.5:60838/jars/args4j-2.0.23.jar with timestamp 1443372358026
15/09/27 09:45:58 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/adam/adam-cli_2.10/0.17.2-SNAPSHOT/adam-cli_2.10-0.17.2-SNAPSHOT.jar at http://10.0.1.5:60838/jars/adam-cli_2.10-0.17.2-SNAPSHOT.jar with timestamp 1443372358030
15/09/27 09:45:58 WARN metrics.MetricsSystem: Using default name DAGScheduler for source because spark.app.id is not set.
15/09/27 09:45:58 INFO executor.Executor: Starting executor ID driver on host localhost
15/09/27 09:45:58 INFO executor.Executor: Using REPL class URI: http://10.0.1.5:60826
15/09/27 09:45:59 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 60842.
15/09/27 09:45:59 INFO netty.NettyBlockTransferService: Server created on 60842
15/09/27 09:45:59 INFO storage.BlockManagerMaster: Trying to register BlockManager
15/09/27 09:45:59 INFO storage.BlockManagerMasterEndpoint: Registering block manager localhost:60842 with 530.0 MB RAM, BlockManagerId(driver, localhost, 60842)
15/09/27 09:45:59 INFO storage.BlockManagerMaster: Registered BlockManager
15/09/27 09:46:00 INFO repl.SparkILoop: Created spark context..
Spark context available as sc.
15/09/27 09:46:04 INFO repl.SparkILoop: Created sql context..
SQL context available as sqlContext.

scala> Stopping spark context.
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/static/sql,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/SQL/execution/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/SQL/execution,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/SQL/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/SQL,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/metrics/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage/kill,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/api,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/static,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/threadDump/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/threadDump,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/environment/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/environment,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/rdd/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/rdd,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/pool/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/pool,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs/job/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs/job,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs,null}
15/09/27 09:46:54 INFO ui.SparkUI: Stopped Spark web UI at http://10.0.1.5:4041
15/09/27 09:46:54 INFO scheduler.DAGScheduler: Stopping DAGScheduler
15/09/27 09:46:54 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
15/09/27 09:46:54 INFO storage.MemoryStore: MemoryStore cleared
15/09/27 09:46:54 INFO storage.BlockManager: BlockManager stopped
15/09/27 09:46:54 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
15/09/27 09:46:54 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
15/09/27 09:46:54 INFO spark.SparkContext: Successfully stopped SparkContext
15/09/27 09:46:54 INFO util.ShutdownHookManager: Shutdown hook called
15/09/27 09:46:54 INFO remote.RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
15/09/27 09:46:54 INFO util.ShutdownHookManager: Deleting directory /private/var/folders/nj/nphdkhyj6s1dttb0pd9zb2wc0000gn/T/spark-94dfb979-530a-4b9d-8109-2dec5a277611
15/09/27 09:46:54 INFO remote.RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
15/09/27 09:46:54 INFO util.ShutdownHookManager: Deleting directory /private/var/folders/nj/nphdkhyj6s1dttb0pd9zb2wc0000gn/T/spark-c735407c-acd9-4be7-ac3a-2de4e82af5e4


@ryan-williams
Member

If you're referring to the "BindException"s, that's Spark's UI trying to bind to any free port starting from 4040 and incrementing by 1. You can ignore those, for most intents and purposes.

Ironically, I was just running >16 Spark apps in yarn-client mode on the same machine just now and discovered that if Spark tries 16 ports and they are all unavailable, it fails the app before it even starts. I'm using Spree (https://github.com/hammerlab/spree) instead of Spark's web UI for these apps, so I just disabled the web UI altogether with "--conf spark.ui.enabled=false".

But in general, you can ignore them; they're definitely annoying, and Spark should probably do something more graceful here, but that's another discussion.

On Sun, Sep 27, 2015 at 10:41 PM dbl notifications@github.com wrote:

Still no solution.

Do you know why adam-shell gets these Java errors:

r@0.0.0.0:4040: java.net.BindException: Address already in use
java.net.BindException: Address already in use
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:414)
at sun.nio.ch.Net.bind(Net.java:406)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at org.spark-project.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
at org.spark-project.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
at org.spark-project.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.spark-project.jetty.server.Server.doStart(Server.java:293)
at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:236)
at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:246)
at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:246)
at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1913)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1904)
at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:246)
at org.apache.spark.ui.WebUI.bind(WebUI.scala:136)
at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:474)
at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:474)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.SparkContext.(SparkContext.scala:474)
at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
at $line3.$read$$iwC$$iwC.(:9)
at $line3.$read$$iwC.(:18)
at $line3.$read.(:20)
at $line3.$read$.(:24)
at $line3.$read$.()
at $line3.$eval$.(:7)
at $line3.$eval$.()
at $line3.$eval.$print()
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/09/27 19:39:33 WARN component.AbstractLifeCycle: FAILED org.spark-project.jetty.server.Server@7d64a960: java.net.BindException: Address already in use
java.net.BindException: Address already in use
15/09/27 19:39:33 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
15/09/27 19:39:33 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/api,null}
15/09/27 19:39:33 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHan

On Sep 27, 2015, at 11:03 AM, David Laxer <davidl@softintel.com> wrote:

It's coming from Spark. I'm investigating ...

http://stackoverflow.com/questions/32800018/hadoop-2-6-1-warning-warn-util-nativecodeloader/32800241?noredirect=1#comment53458542_32800241

On Sep 27, 2015, at 10:15 AM, Ryan Williams <notifications@github.com> wrote:

FWIW I think I see that warning every time I run an adam-shell; I don't think it hurts anything, though any leads on making it go away are welcome.

On Sun, Sep 27, 2015 at 12:50 PM dbl <notifications@github.com> wrote:

I'm getting a warning running 'bin/adam-shell', on OS X 10.10.5:

15/09/27 09:45:26 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

I had this problem with the binary distributions of Hadoop (e.g. - 2.3.0, 2.6.1) until I built hadoop from source with the 'native' option enabled and set this environment variable:

HADOOP_COMMON_LIB_NATIVE_DIR=/Users/davidlaxer/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT/lib/native

Is this a Spark Warning? Do I have to set a SPARK environment variable? Is there an ADAM option I must set?

$ bin/adam-shell
Using SPARK_SHELL=/Users/davidlaxer/spark/bin/spark-shell
15/09/27 09:45:26 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/09/27 09:45:27 INFO spark.SecurityManager: Changing view acls to: davidlaxer
15/09/27 09:45:27 INFO spark.SecurityManager: Changing modify acls to: davidlaxer
15/09/27 09:45:27 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(davidlaxer); users with modify permissions: Set(davidlaxer)
15/09/27 09:45:28 INFO spark.HttpServer: Starting HTTP Server
15/09/27 09:45:29 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/09/27 09:45:29 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:60826
15/09/27 09:45:29 INFO util.Utils: Successfully started service 'HTTP class server' on port 60826.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.5.0-SNAPSHOT
      /_/

Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_05)
Type in expressions to have them evaluated.
Type :help for more information.
15/09/27 09:45:50 INFO spark.SparkContext: Running Spark version 1.5.0-SNAPSHOT
15/09/27 09:45:50 INFO spark.SecurityManager: Changing view acls to: davidlaxer
15/09/27 09:45:50 INFO spark.SecurityManager: Changing modify acls to: davidlaxer
15/09/27 09:45:50 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(davidlaxer); users with modify permissions: Set(davidlaxer)
15/09/27 09:45:52 INFO slf4j.Slf4jLogger: Slf4jLogger started
15/09/27 09:45:53 INFO Remoting: Starting remoting
15/09/27 09:45:54 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@10.0.1.5:60837]
15/09/27 09:45:54 INFO util.Utils: Successfully started service 'sparkDriver' on port 60837.
15/09/27 09:45:54 INFO spark.SparkEnv: Registering MapOutputTracker
15/09/27 09:45:54 INFO spark.SparkEnv: Registering BlockManagerMaster
15/09/27 09:45:54 INFO storage.DiskBlockManager: Created local directory at /private/var/folders/nj/nphdkhyj6s1dttb0pd9zb2wc0000gn/T/blockmgr-57acd6fd-754f-4d86-9226-bbcead030b70
15/09/27 09:45:54 INFO storage.MemoryStore: MemoryStore started with capacity 530.0 MB
15/09/27 09:45:54 INFO spark.HttpFileServer: HTTP File server directory is /private/var/folders/nj/nphdkhyj6s1dttb0pd9zb2wc0000gn/T/spark-c735407c-acd9-4be7-ac3a-2de4e82af5e4/httpd-72e12a03-5d15-41cc-a524-c74725d6ec0f
15/09/27 09:45:54 INFO spark.HttpServer: Starting HTTP Server
15/09/27 09:45:54 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/09/27 09:45:54 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:60838
15/09/27 09:45:54 INFO util.Utils: Successfully started service 'HTTP file server' on port 60838.
15/09/27 09:45:54 INFO spark.SparkEnv: Registering OutputCommitCoordinator
15/09/27 09:45:55 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/09/27 09:45:55 WARN component.AbstractLifeCycle: FAILED SelectChannelConnector@0.0.0.0:4040: java.net.BindException: Address already in use
java.net.BindException: Address already in use
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:414)
at sun.nio.ch.Net.bind(Net.java:406)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at org.eclipse.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
at org.eclipse.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
at org.eclipse.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.eclipse.jetty.server.Server.doStart(Server.java:293)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:240)
at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:250)
at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:250)
at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1912)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1903)
at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:250)
at org.apache.spark.ui.WebUI.bind(WebUI.scala:136)
at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:465)
at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:465)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.SparkContext.(SparkContext.scala:465)
at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
at $line3.$read$$iwC$$iwC.(:9)
at $line3.$read$$iwC.(:18)
at $line3.$read.(:20)
at $line3.$read$.(:24)
at $line3.$read$.()
at $line3.$eval$.(:7)
at $line3.$eval$.()
at $line3.$eval.$print()
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/09/27 09:45:55 WARN component.AbstractLifeCycle: FAILED org.eclipse.jetty.server.Server@22686ddb: java.net.BindException: Address already in use
java.net.BindException: Address already in use
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage/kill,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/api,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/static,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/threadDump/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/threadDump,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/environment/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/environment,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/rdd/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/rdd,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/pool/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/pool,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs/job/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs/job,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs,null}
15/09/27 09:45:56 WARN util.Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
15/09/27 09:45:56 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/09/27 09:45:56 INFO server.AbstractConnector: Started SelectChannelConnector@0.0.0.0:4041
15/09/27 09:45:56 INFO util.Utils: Successfully started service 'SparkUI' on port 4041.
15/09/27 09:45:56 INFO ui.SparkUI: Started SparkUI at http://10.0.1.5:4041
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/commons-cli/commons-cli/1.2/commons-cli-1.2.jar at http://10.0.1.5:60838/jars/commons-cli-1.2.jar with timestamp 1443372356368
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/commons-httpclient/commons-httpclient/3.1/commons-httpclient-3.1.jar at http://10.0.1.5:60838/jars/commons-httpclient-3.1.jar with timestamp 1443372356374
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/commons-codec/commons-codec/1.4/commons-codec-1.4.jar at http://10.0.1.5:60838/jars/commons-codec-1.4.jar with timestamp 1443372356376
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/commons-logging/commons-logging/1.1.1/commons-logging-1.1.1.jar at http://10.0.1.5:60838/jars/commons-logging-1.1.1.jar with timestamp 1443372356381
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar at http://10.0.1.5:60838/jars/commons-compress-1.4.1.jar with timestamp 1443372356403
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/slf4j/slf4j-api/1.7.10/slf4j-api-1.7.10.jar at http://10.0.1.5:60838/jars/slf4j-api-1.7.10.jar with timestamp 1443372356436
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/log4j/log4j/1.2.17/log4j-1.2.17.jar at http://10.0.1.5:60838/jars/log4j-1.2.17.jar with timestamp 1443372356443
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/xerial/snappy/snappy-java/1.1.1.7/snappy-java-1.1.1.7.jar at http://10.0.1.5:60838/jars/snappy-java-1.1.1.7.jar with timestamp 1443372356501
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/thoughtworks/paranamer/paranamer/2.6/paranamer-2.6.jar at http://10.0.1.5:60838/jars/paranamer-2.6.jar with timestamp 1443372356512
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/utils/utils-io_2.10/0.2.3/utils-io_2.10-0.2.3.jar at http://10.0.1.5:60838/jars/utils-io_2.10-0.2.3.jar with timestamp 1443372356519
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/utils/utils-misc_2.10/0.2.3/utils-misc_2.10-0.2.3.jar at http://10.0.1.5:60838/jars/utils-misc_2.10-0.2.3.jar with timestamp 1443372356521
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/httpcomponents/httpclient/4.3.2/httpclient-4.3.2.jar at http://10.0.1.5:60838/jars/httpclient-4.3.2.jar with timestamp 1443372356574
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/httpcomponents/httpcore/4.3.1/httpcore-4.3.1.jar at http://10.0.1.5:60838/jars/httpcore-4.3.1.jar with timestamp 1443372356601
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/utils/utils-cli_2.10/0.2.3/utils-cli_2.10-0.2.3.jar at http://10.0.1.5:60838/jars/utils-cli_2.10-0.2.3.jar with timestamp 1443372356623
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-avro/1.8.1/parquet-avro-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-avro-1.8.1.jar with timestamp 1443372356655
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-column/1.8.1/parquet-column-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-column-1.8.1.jar with timestamp 1443372356687
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-common/1.8.1/parquet-common-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-common-1.8.1.jar with timestamp 1443372356695
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-encoding/1.8.1/parquet-encoding-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-encoding-1.8.1.jar with timestamp 1443372356702
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-hadoop/1.8.1/parquet-hadoop-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-hadoop-1.8.1.jar with timestamp 1443372356708
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-jackson/1.8.1/parquet-jackson-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-jackson-1.8.1.jar with timestamp 1443372356722
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-format/2.3.0-incubating/parquet-format-2.3.0-incubating.jar at http://10.0.1.5:60838/jars/parquet-format-2.3.0-incubating.jar with timestamp 1443372356747
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/utils/utils-metrics_2.10/0.2.3/utils-metrics_2.10-0.2.3.jar at http://10.0.1.5:60838/jars/utils-metrics_2.10-0.2.3.jar with timestamp 1443372356755
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/netflix/servo/servo-core/0.5.5/servo-core-0.5.5.jar at http://10.0.1.5:60838/jars/servo-core-0.5.5.jar with timestamp 1443372356802
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/google/code/findbugs/annotations/2.0.0/annotations-2.0.0.jar at http://10.0.1.5:60838/jars/annotations-2.0.0.jar with timestamp 1443372356808
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/scoverage/scalac-scoverage-plugin_2.10/0.99.2/scalac-scoverage-plugin_2.10-0.99.2.jar at http://10.0.1.5:60838/jars/scalac-scoverage-plugin_2.10-0.99.2.jar with timestamp 1443372356831
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/commons-io/commons-io/1.3.2/commons-io-1.3.2.jar at http://10.0.1.5:60838/jars/commons-io-1.3.2.jar with timestamp 1443372356867
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/bdg-formats/bdg-formats/0.4.0/bdg-formats-0.4.0.jar at http://10.0.1.5:60838/jars/bdg-formats-0.4.0.jar with timestamp 1443372356869
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/avro/avro/1.7.6/avro-1.7.6.jar at http://10.0.1.5:60838/jars/avro-1.7.6.jar with timestamp 1443372356882
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/codehaus/jackson/jackson-core-asl/1.9.13/jackson-core-asl-1.9.13.jar at http://10.0.1.5:60838/jars/jackson-core-asl-1.9.13.jar with timestamp 1443372356886
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/codehaus/jackson/jackson-mapper-asl/1.9.13/jackson-mapper-asl-1.9.13.jar at http://10.0.1.5:60838/jars/jackson-mapper-asl-1.9.13.jar with timestamp 1443372356915
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/adam/adam-core_2.10/0.17.2-SNAPSHOT/adam-core_2.10-0.17.2-SNAPSHOT.jar at http://10.0.1.5:60838/jars/adam-core_2.10-0.17.2-SNAPSHOT.jar with timestamp 1443372356953
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/esotericsoftware/kryo/kryo/2.21/kryo-2.21.jar at http://10.0.1.5:60838/jars/kryo-2.21.jar with timestamp 1443372356995
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/esotericsoftware/reflectasm/reflectasm/1.07/reflectasm-1.07-shaded.jar at http://10.0.1.5:60838/jars/reflectasm-1.07-shaded.jar with timestamp 1443372357147
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/ow2/asm/asm/4.0/asm-4.0.jar at http://10.0.1.5:60838/jars/asm-4.0.jar with timestamp 1443372357152
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/esotericsoftware/minlog/minlog/1.2/minlog-1.2.jar at http://10.0.1.5:60838/jars/minlog-1.2.jar with timestamp 1443372357153
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/objenesis/objenesis/1.2/objenesis-1.2.jar at http://10.0.1.5:60838/jars/objenesis-1.2.jar with timestamp 1443372357155
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/it/unimi/dsi/fastutil/6.4.4/fastutil-6.4.4.jar at http://10.0.1.5:60838/jars/fastutil-6.4.4.jar with timestamp 1443372357424
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-scala_2.10/1.8.1/parquet-scala_2.10-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-scala_2.10-1.8.1.jar with timestamp 1443372357485
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/seqdoop/hadoop-bam/7.0.0/hadoop-bam-7.0.0.jar at http://10.0.1.5:60838/jars/hadoop-bam-7.0.0.jar with timestamp 1443372357534
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/github/samtools/htsjdk/1.133/htsjdk-1.133.jar at http://10.0.1.5:60838/jars/htsjdk-1.133.jar with timestamp 1443372357561
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/commons/commons-jexl/2.1.1/commons-jexl-2.1.1.jar at http://10.0.1.5:60838/jars/commons-jexl-2.1.1.jar with timestamp 1443372357576
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/tukaani/xz/1.5/xz-1.5.jar at http://10.0.1.5:60838/jars/xz-1.5.jar with timestamp 1443372357581
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/ant/ant/1.8.2/ant-1.8.2.jar at http://10.0.1.5:60838/jars/ant-1.8.2.jar with timestamp 1443372357612
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/ant/ant-launcher/1.8.2/ant-launcher-1.8.2.jar at http://10.0.1.5:60838/jars/ant-launcher-1.8.2.jar with timestamp 1443372357622
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/testng/testng/6.8.8/testng-6.8.8.jar at http://10.0.1.5:60838/jars/testng-6.8.8.jar with timestamp 1443372357639
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/beanshell/bsh/2.0b4/bsh-2.0b4.jar at http://10.0.1.5:60838/jars/bsh-2.0b4.jar with timestamp 1443372357685
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/beust/jcommander/1.27/jcommander-1.27.jar at http://10.0.1.5:60838/jars/jcommander-1.27.jar with timestamp 1443372357692
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/google/guava/guava/14.0.1/guava-14.0.1.jar at http://10.0.1.5:60838/jars/guava-14.0.1.jar with timestamp 1443372357744
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/adam/adam-apis_2.10/0.17.2-SNAPSHOT/adam-apis_2.10-0.17.2-SNAPSHOT.jar at http://10.0.1.5:60838/jars/adam-apis_2.10-0.17.2-SNAPSHOT.jar with timestamp 1443372357758
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/scala-lang/scala-library/2.10.4/scala-library-2.10.4.jar at http://10.0.1.5:60838/jars/scala-library-2.10.4.jar with timestamp 1443372357947
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/slf4j/slf4j-log4j12/1.7.5/slf4j-log4j12-1.7.5.jar at http://10.0.1.5:60838/jars/slf4j-log4j12-1.7.5.jar with timestamp 1443372357980
15/09/27 09:45:58 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/args4j/args4j/2.0.23/args4j-2.0.23.jar at http://10.0.1.5:60838/jars/args4j-2.0.23.jar with timestamp 1443372358026
15/09/27 09:45:58 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/adam/adam-cli_2.10/0.17.2-SNAPSHOT/adam-cli_2.10-0.17.2-SNAPSHOT.jar at http://10.0.1.5:60838/jars/adam-cli_2.10-0.17.2-SNAPSHOT.jar with timestamp 1443372358030
15/09/27 09:45:58 WARN metrics.MetricsSystem: Using default name
DAGScheduler for source because spark.app.id is not set.
15/09/27 09:45:58 INFO executor.Executor: Starting executor ID driver
on host localhost
15/09/27 09:45:58 INFO executor.Executor: Using REPL class URI: http://10.0.1.5:60826
15/09/27 09:45:59 INFO util.Utils: Successfully started service
'org.apache.spark.network.netty.NettyBlockTransferService' on port 60842.
15/09/27 09:45:59 INFO netty.NettyBlockTransferService: Server
created on 60842
15/09/27 09:45:59 INFO storage.BlockManagerMaster: Trying to register
BlockManager
15/09/27 09:45:59 INFO storage.BlockManagerMasterEndpoint:
Registering block manager localhost:60842 with 530.0 MB RAM,
BlockManagerId(driver, localhost, 60842)
15/09/27 09:45:59 INFO storage.BlockManagerMaster: Registered
BlockManager
15/09/27 09:46:00 INFO repl.SparkILoop: Created spark context..
Spark context available as sc.
15/09/27 09:46:04 INFO repl.SparkILoop: Created sql context..
SQL context available as sqlContext.

scala> Stopping spark context.
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/static/sql,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/SQL/execution/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/SQL/execution,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/SQL/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/SQL,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/metrics/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/stages/stage/kill,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/api,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/static,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/executors/threadDump/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/executors/threadDump,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/executors/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/executors,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/environment/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/environment,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/storage/rdd/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/storage/rdd,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/storage/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/storage,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/stages/pool/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/stages/pool,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/stages/stage/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/stages/stage,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/stages/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/stages,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/jobs/job/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/jobs/job,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/jobs/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/jobs,null}
15/09/27 09:46:54 INFO ui.SparkUI: Stopped Spark web UI at http://10.0.1.5:4041
15/09/27 09:46:54 INFO scheduler.DAGScheduler: Stopping DAGScheduler
15/09/27 09:46:54 INFO spark.MapOutputTrackerMasterEndpoint:
MapOutputTrackerMasterEndpoint stopped!
15/09/27 09:46:54 INFO storage.MemoryStore: MemoryStore cleared
15/09/27 09:46:54 INFO storage.BlockManager: BlockManager stopped
15/09/27 09:46:54 INFO storage.BlockManagerMaster: BlockManagerMaster
stopped
15/09/27 09:46:54 INFO
scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint:
OutputCommitCoordinator stopped!
15/09/27 09:46:54 INFO spark.SparkContext: Successfully stopped
SparkContext
15/09/27 09:46:54 INFO util.ShutdownHookManager: Shutdown hook called
15/09/27 09:46:54 INFO
remote.RemoteActorRefProvider$RemotingTerminator: Shutting down remote
daemon.
15/09/27 09:46:54 INFO util.ShutdownHookManager: Deleting directory
/private/var/folders/nj/nphdkhyj6s1dttb0pd9zb2wc0000gn/T/spark-94dfb979-530a-4b9d-8109-2dec5a277611
15/09/27 09:46:54 INFO
remote.RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down;
proceeding with flushing remote transports.
15/09/27 09:46:54 INFO util.ShutdownHookManager: Deleting directory
/private/var/folders/nj/nphdkhyj6s1dttb0pd9zb2wc0000gn/T/spark-c735407c-acd9-4be7-ac3a-2de4e82af5e4


Reply to this email directly or view it on GitHub: #837 (https://github.com/bigdatagenomics/adam/issues/837).

@dbl001
Author

dbl001 commented Sep 28, 2015

Thanks! I’ll check out Spree.

Just curious - have you ever tried:

  1. Jupyter Scala
  2. Apache Incubator Zeppelin

to run ADAM, instead of adam-shell?

On Sep 27, 2015, at 7:55 PM, Ryan Williams notifications@github.com wrote:

If you're referring to the "BindException"s, that's Spark's UI trying to
bind to a free port, starting from 4040 and incrementing by 1. You can
ignore those, for most intents and purposes.

Ironically, I was just running more than 16 Spark apps in yarn-client mode
on the same machine and discovered that if Spark tries 16 ports and they
are all unavailable, it fails the app before it even starts. I'm using
Spree (https://github.com/hammerlab/spree) instead of Spark's web UI
for these apps, so I just disabled the web UI altogether with "--conf
spark.ui.enabled=false".

But in general, you can ignore them; they're definitely annoying, and
Spark should probably handle this more gracefully, but that's another
discussion.

On Sun, Sep 27, 2015 at 10:41 PM dbl notifications@github.com wrote:

Still no solution.

Do you know why adam-shell gets these Java errors:

r@0.0.0.0:4040: java.net.BindException: Address already in use
java.net.BindException: Address already in use
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:414)
at sun.nio.ch.Net.bind(Net.java:406)
at
sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at
org.spark-project.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
at
org.spark-project.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
at
org.spark-project.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
at
org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.spark-project.jetty.server.Server.doStart(Server.java:293)
at
org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at
org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:236)
at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:246)
at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:246)
at
org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1913)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1904)
at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:246)
at org.apache.spark.ui.WebUI.bind(WebUI.scala:136)
at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:474)
at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:474)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:474)
at
org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
at $line3.$read$$iwC$$iwC.<init>(<console>:9)
at $line3.$read$$iwC.<init>(<console>:18)
at $line3.$read.<init>(<console>:20)
at $line3.$read$.<init>(<console>:24)
at $line3.$read$.<clinit>(<console>)
at $line3.$eval$.<init>(<console>:7)
at $line3.$eval$.<clinit>(<console>)
at $line3.$eval.$print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at
org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at
org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at
org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at
org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
at
org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
at
org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
at
org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
at
org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
at
org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
at
org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
at
org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at
org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at
scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/09/27 19:39:33 WARN component.AbstractLifeCycle: FAILED
org.spark-project.jetty.server.Server@7d64a960: java.net.BindException:
Address already in use
java.net.BindException: Address already in use
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:414)
at sun.nio.ch.Net.bind(Net.java:406)
at
sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at
org.spark-project.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
at
org.spark-project.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
at
org.spark-project.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
at
org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.spark-project.jetty.server.Server.doStart(Server.java:293)
at
org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at
org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:236)
at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:246)
at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:246)
at
org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1913)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1904)
at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:246)
at org.apache.spark.ui.WebUI.bind(WebUI.scala:136)
at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:474)
at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:474)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:474)
at
org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
at $line3.$read$$iwC$$iwC.<init>(<console>:9)
at $line3.$read$$iwC.<init>(<console>:18)
at $line3.$read.<init>(<console>:20)
at $line3.$read$.<init>(<console>:24)
at $line3.$read$.<clinit>(<console>)
at $line3.$eval$.<init>(<console>:7)
at $line3.$eval$.<clinit>(<console>)
at $line3.$eval.$print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at
org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at
org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at
org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at
org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
at
org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
at
org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
at
org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
at
org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
at
org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
at
org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
at
org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at
org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at
scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/09/27 19:39:33 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
15/09/27 19:39:33 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/api,null}
15/09/27 19:39:33 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHan

On Sep 27, 2015, at 11:03 AM, David Laxer <davidl@softintel.com> wrote:

It’s coming from Spark. I’m investigating ...

http://stackoverflow.com/questions/32800018/hadoop-2-6-1-warning-warn-util-nativecodeloader/32800241?noredirect=1#comment53458542_32800241

On Sep 27, 2015, at 10:15 AM, Ryan Williams <notifications@github.com> wrote:

FWIW I think I see that warning every time I run an adam-shell; I don't think it hurts anything, though any leads on making it go away are welcome.

On Sun, Sep 27, 2015 at 12:50 PM dbl <notifications@github.com> wrote:

I'm getting a warning running 'bin/adam-shell' on OS X 10.10.5:

15/09/27 09:45:26 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

I had this problem with the binary distributions of Hadoop (e.g. 2.3.0, 2.6.1) until I built Hadoop from source with the 'native' option enabled and set this environment variable:

HADOOP_COMMON_LIB_NATIVE_DIR=/Users/davidlaxer/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT/lib/native

Is this a Spark warning? Do I have to set a Spark environment variable?
Is there an ADAM option I must set?
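The env-var fix described above can be sketched in shell form. This is a hedged sketch, not a confirmed fix for ADAM: the warning is printed by the JVM that spark-shell launches, so the native directory has to end up on that JVM's java.library.path; `SPARK_SUBMIT_OPTS` is one standard way to pass JVM options through spark-shell/spark-submit. The Hadoop build path below is the machine-specific one from this report; adjust it for your system.

```shell
# Sketch only: the build path is machine-specific (taken from this report).
HADOOP_COMMON_LIB_NATIVE_DIR="$HOME/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT/lib/native"
export HADOOP_COMMON_LIB_NATIVE_DIR

# The warning comes from the driver JVM, not from Hadoop itself, so the
# directory must reach that JVM's java.library.path. SPARK_SUBMIT_OPTS is
# read by spark-shell/spark-submit and is one way to inject the option.
export SPARK_SUBMIT_OPTS="-Djava.library.path=$HADOOP_COMMON_LIB_NATIVE_DIR"
```

After exporting these, launching `bin/adam-shell` (which wraps spark-shell) should pick the options up; spark-shell's `--driver-library-path` flag is an equivalent alternative.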

$ bin/adam-shell
Using SPARK_SHELL=/Users/davidlaxer/spark/bin/spark-shell
15/09/27 09:45:26 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/09/27 09:45:27 INFO spark.SecurityManager: Changing view acls to: davidlaxer
15/09/27 09:45:27 INFO spark.SecurityManager: Changing modify acls to: davidlaxer
15/09/27 09:45:27 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(davidlaxer); users with modify permissions: Set(davidlaxer)
15/09/27 09:45:28 INFO spark.HttpServer: Starting HTTP Server
15/09/27 09:45:29 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/09/27 09:45:29 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:60826
15/09/27 09:45:29 INFO util.Utils: Successfully started service 'HTTP class server' on port 60826.
Welcome to

      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.5.0-SNAPSHOT
      /_/

Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_05)
Type in expressions to have them evaluated.
Type :help for more information.
15/09/27 09:45:50 INFO spark.SparkContext: Running Spark version 1.5.0-SNAPSHOT
15/09/27 09:45:50 INFO spark.SecurityManager: Changing view acls to: davidlaxer
15/09/27 09:45:50 INFO spark.SecurityManager: Changing modify acls to: davidlaxer
15/09/27 09:45:50 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(davidlaxer); users with modify permissions: Set(davidlaxer)
15/09/27 09:45:52 INFO slf4j.Slf4jLogger: Slf4jLogger started
15/09/27 09:45:53 INFO Remoting: Starting remoting
15/09/27 09:45:54 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@10.0.1.5:60837]
15/09/27 09:45:54 INFO util.Utils: Successfully started service 'sparkDriver' on port 60837.
15/09/27 09:45:54 INFO spark.SparkEnv: Registering MapOutputTracker
15/09/27 09:45:54 INFO spark.SparkEnv: Registering BlockManagerMaster
15/09/27 09:45:54 INFO storage.DiskBlockManager: Created local directory at /private/var/folders/nj/nphdkhyj6s1dttb0pd9zb2wc0000gn/T/blockmgr-57acd6fd-754f-4d86-9226-bbcead030b70
15/09/27 09:45:54 INFO storage.MemoryStore: MemoryStore started with capacity 530.0 MB
15/09/27 09:45:54 INFO spark.HttpFileServer: HTTP File server directory is /private/var/folders/nj/nphdkhyj6s1dttb0pd9zb2wc0000gn/T/spark-c735407c-acd9-4be7-ac3a-2de4e82af5e4/httpd-72e12a03-5d15-41cc-a524-c74725d6ec0f
15/09/27 09:45:54 INFO spark.HttpServer: Starting HTTP Server
15/09/27 09:45:54 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/09/27 09:45:54 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:60838
15/09/27 09:45:54 INFO util.Utils: Successfully started service 'HTTP file server' on port 60838.
15/09/27 09:45:54 INFO spark.SparkEnv: Registering OutputCommitCoordinator
15/09/27 09:45:55 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/09/27 09:45:55 WARN component.AbstractLifeCycle: FAILED SelectChannelConnector@0.0.0.0:4040: java.net.BindException: Address already in use
java.net.BindException: Address already in use
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:414)
at sun.nio.ch.Net.bind(Net.java:406)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at org.eclipse.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
at org.eclipse.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
at org.eclipse.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.eclipse.jetty.server.Server.doStart(Server.java:293)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:240)
at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:250)
at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:250)
at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1912)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1903)
at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:250)
at org.apache.spark.ui.WebUI.bind(WebUI.scala:136)
at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:465)
at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:465)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:465)
at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
at $line3.$read$$iwC$$iwC.<init>(<console>:9)
at $line3.$read$$iwC.<init>(<console>:18)
at $line3.$read.<init>(<console>:20)
at $line3.$read$.<init>(<console>:24)
at $line3.$read$.<clinit>(<console>)
at $line3.$eval$.<init>(<console>:7)
at $line3.$eval$.<clinit>(<console>)
at $line3.$eval.$print()
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/09/27 09:45:55 WARN component.AbstractLifeCycle: FAILED org.eclipse.jetty.server.Server@22686ddb: java.net.BindException: Address already in use
java.net.BindException: Address already in use
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:414)
at sun.nio.ch.Net.bind(Net.java:406)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at org.eclipse.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
at org.eclipse.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
at org.eclipse.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.eclipse.jetty.server.Server.doStart(Server.java:293)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:240)
at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:250)
at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:250)
at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1912)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1903)
at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:250)
at org.apache.spark.ui.WebUI.bind(WebUI.scala:136)
at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:465)
at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:465)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:465)
at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
at $line3.$read$$iwC$$iwC.<init>(<console>:9)
at $line3.$read$$iwC.<init>(<console>:18)
at $line3.$read.<init>(<console>:20)
at $line3.$read$.<init>(<console>:24)
at $line3.$read$.<clinit>(<console>)
at $line3.$eval$.<init>(<console>:7)
at $line3.$eval$.<clinit>(<console>)
at $line3.$eval.$print()
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage/kill,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/api,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/static,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/threadDump/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/threadDump,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/environment/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/environment,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/rdd/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/rdd,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/pool/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/pool,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs/job/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs/job,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs,null}
15/09/27 09:45:56 WARN util.Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
15/09/27 09:45:56 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/09/27 09:45:56 INFO server.AbstractConnector: Started SelectChannelConnector@0.0.0.0:4041
15/09/27 09:45:56 INFO util.Utils: Successfully started service 'SparkUI' on port 4041.
15/09/27 09:45:56 INFO ui.SparkUI: Started SparkUI at http://10.0.1.5:4041
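The BindException in the log above is a port clash, not an ADAM problem: some other process (often a previous spark-shell that is still running) already holds the default SparkUI port 4040, and Spark recovers on its own by probing the next port. A hedged diagnostic sketch, assuming the standard `lsof` tool and Spark's documented `spark.port.maxRetries` setting:

```shell
# See which process already holds the default SparkUI port (macOS/Linux).
# The fallback echo keeps the command from failing when the port is free.
lsof -nP -iTCP:4040 -sTCP:LISTEN || echo "nothing is listening on port 4040"

# Alternatively, give Spark more headroom before it gives up; it retries
# successive ports, 16 times by default:
#   bin/adam-shell --conf spark.port.maxRetries=32
```

Killing the stale shell holding 4040 also makes the warning disappear; the retry config only matters when many shells legitimately run side by side.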
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR
file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/commons-cli/commons-cli/1.2/commons-cli-1.2.jar
at http://10.0.1.5:60838/jars/commons-cli-1.2.jar <
http://10.0.1.5:60838/jars/commons-cli-1.2.jar> with timestamp
1443372356368
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR
file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/commons-httpclient/commons-httpclient/3.1/commons-httpclient-3.1.jar
at http://10.0.1.5:60838/jars/commons-httpclient-3.1.jar <
http://10.0.1.5:60838/jars/commons-httpclient-3.1.jar> with timestamp
1443372356374
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR
file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/commons-codec/commons-codec/1.4/commons-codec-1.4.jar
at http://10.0.1.5:60838/jars/commons-codec-1.4.jar <
http://10.0.1.5:60838/jars/commons-codec-1.4.jar> with timestamp
1443372356376
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR
file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/commons-logging/commons-logging/1.1.1/commons-logging-1.1.1.jar
at http://10.0.1.5:60838/jars/commons-logging-1.1.1.jar <
http://10.0.1.5:60838/jars/commons-logging-1.1.1.jar> with timestamp
1443372356381
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR
file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar
at http://10.0.1.5:60838/jars/commons-compress-1.4.1.jar <
http://10.0.1.5:60838/jars/commons-compress-1.4.1.jar> with timestamp
1443372356403
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR
file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/slf4j/slf4j-api/1.7.10/slf4j-api-1.7.10.jar
at http://10.0.1.5:60838/jars/slf4j-api-1.7.10.jar <
http://10.0.1.5:60838/jars/slf4j-api-1.7.10.jar> with timestamp
1443372356436
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR
file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/log4j/log4j/1.2.17/log4j-1.2.17.jar
at http://10.0.1.5:60838/jars/log4j-1.2.17.jar <
http://10.0.1.5:60838/jars/log4j-1.2.17.jar> with timestamp 1443372356443
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR
file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/xerial/snappy/snappy-java/
1.1.1.7/snappy-java-1.1.1.7.jar at
http://10.0.1.5:60838/jars/snappy-java-1.1.1.7.jar <
http://10.0.1.5:60838/jars/snappy-java-1.1.1.7.jar> with timestamp
1443372356501
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR
file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/thoughtworks/paranamer/paranamer/2.6/paranamer-2.6.jar
at http://10.0.1.5:60838/jars/paranamer-2.6.jar <
http://10.0.1.5:60838/jars/paranamer-2.6.jar> with timestamp 1443372356512
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR
file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/utils/utils-io_2.10/0.2.3/utils-io_2.10-0.2.3.jar
at http://10.0.1.5:60838/jars/utils-io_2.10-0.2.3.jar <
http://10.0.1.5:60838/jars/utils-io_2.10-0.2.3.jar> with timestamp
1443372356519
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR
file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/utils/utils-misc_2.10/0.2.3/utils-misc_2.10-0.2.3.jar
at http://10.0.1.5:60838/jars/utils-misc_2.10-0.2.3.jar <
http://10.0.1.5:60838/jars/utils-misc_2.10-0.2.3.jar> with timestamp
1443372356521
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR
file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/httpcomponents/httpclient/4.3.2/httpclient-4.3.2.jar
at http://10.0.1.5:60838/jars/httpclient-4.3.2.jar <
http://10.0.1.5:60838/jars/httpclient-4.3.2.jar> with timestamp
1443372356574
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR
file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/httpcomponents/httpcore/4.3.1/httpcore-4.3.1.jar
at http://10.0.1.5:60838/jars/httpcore-4.3.1.jar <
http://10.0.1.5:60838/jars/httpcore-4.3.1.jar> with timestamp
1443372356601
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR
file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/utils/utils-cli_2.10/0.2.3/utils-cli_2.10-0.2.3.jar
at http://10.0.1.5:60838/jars/utils-cli_2.10-0.2.3.jar <
http://10.0.1.5:60838/jars/utils-cli_2.10-0.2.3.jar> with timestamp
1443372356623
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR
file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-avro/1.8.1/parquet-avro-1.8.1.jar
at http://10.0.1.5:60838/jars/parquet-avro-1.8.1.jar <
http://10.0.1.5:60838/jars/parquet-avro-1.8.1.jar> with timestamp
1443372356655
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR
file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-column/1.8.1/parquet-column-1.8.1.jar
at http://10.0.1.5:60838/jars/parquet-column-1.8.1.jar <
http://10.0.1.5:60838/jars/parquet-column-1.8.1.jar> with timestamp
1443372356687
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR
file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-common/1.8.1/parquet-common-1.8.1.jar
at http://10.0.1.5:60838/jars/parquet-common-1.8.1.jar <
http://10.0.1.5:60838/jars/parquet-common-1.8.1.jar> with timestamp
1443372356695
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR
file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-encoding/1.8.1/parquet-encoding-1.8.1.jar
at http://10.0.1.5:60838/jars/parquet-encoding-1.8.1.jar <
http://10.0.1.5:60838/jars/parquet-encoding-1.8.1.jar> with timestamp
1443372356702
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR
file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-hadoop/1.8.1/parquet-hadoop-1.8.1.jar
at http://10.0.1.5:60838/jars/parquet-hadoop-1.8.1.jar <
http://10.0.1.5:60838/jars/parquet-hadoop-1.8.1.jar> with timestamp
1443372356708
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR
file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-jackson/1.8.1/parquet-jackson-1.8.1.jar
at http://10.0.1.5:60838/jars/parquet-jackson-1.8.1.jar <
http://10.0.1.5:60838/jars/parquet-jackson-1.8.1.jar> with timestamp
1443372356722
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR
file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-format/2.3.0-incubating/parquet-format-2.3.0-incubating.jar
at http://10.0.1.5:60838/jars/parquet-format-2.3.0-incubating.jar <
http://10.0.1.5:60838/jars/parquet-format-2.3.0-incubating.jar> with
timestamp 1443372356747
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR
file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/utils/utils-metrics_2.10/0.2.3/utils-metrics_2.10-0.2.3.jar
at http://10.0.1.5:60838/jars/utils-metrics_2.10-0.2.3.jar <
http://10.0.1.5:60838/jars/utils-metrics_2.10-0.2.3.jar> with timestamp
1443372356755
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR
file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/netflix/servo/servo-core/0.5.5/servo-core-0.5.5.jar
at http://10.0.1.5:60838/jars/servo-core-0.5.5.jar <
http://10.0.1.5:60838/jars/servo-core-0.5.5.jar> with timestamp
1443372356802
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR
file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/google/code/findbugs/annotations/2.0.0/annotations-2.0.0.jar
at http://10.0.1.5:60838/jars/annotations-2.0.0.jar <
http://10.0.1.5:60838/jars/annotations-2.0.0.jar> with timestamp
1443372356808
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR
file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/scoverage/scalac-scoverage-plugin_2.10/0.99.2/scalac-scoverage-plugin_2.10-0.99.2.jar
at http://10.0.1.5:60838/jars/scalac-scoverage-plugin_2.10-0.99.2.jar <
http://10.0.1.5:60838/jars/scalac-scoverage-plugin_2.10-0.99.2.jar> with
timestamp 1443372356831
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR
file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/commons-io/commons-io/1.3.2/commons-io-1.3.2.jar
at http://10.0.1.5:60838/jars/commons-io-1.3.2.jar <
http://10.0.1.5:60838/jars/commons-io-1.3.2.jar> with timestamp
1443372356867
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR
file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/bdg-formats/bdg-formats/0.4.0/bdg-formats-0.4.0.jar
at http://10.0.1.5:60838/jars/bdg-formats-0.4.0.jar <
http://10.0.1.5:60838/jars/bdg-formats-0.4.0.jar> with timestamp
1443372356869
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR
file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/avro/avro/1.7.6/avro-1.7.6.jar
at http://10.0.1.5:60838/jars/avro-1.7.6.jar <
http://10.0.1.5:60838/jars/avro-1.7.6.jar> with timestamp 1443372356882
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR
file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/codehaus/jackson/jackson-core-asl/1.9.13/jackson-core-asl-1.9.13.jar
at http://10.0.1.5:60838/jars/jackson-core-asl-1.9.13.jar <
http://10.0.1.5:60838/jars/jackson-core-asl-1.9.13.jar> with timestamp
1443372356886
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR
file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/codehaus/jackson/jackson-mapper-asl/1.9.13/jackson-mapper-asl-1.9.13.jar
at http://10.0.1.5:60838/jars/jackson-mapper-asl-1.9.13.jar <
http://10.0.1.5:60838/jars/jackson-mapper-asl-1.9.13.jar> with timestamp
1443372356915
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR
file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/adam/adam-core_2.10/0.17.2-SNAPSHOT/adam-core_2.10-0.17.2-SNAPSHOT.jar
at http://10.0.1.5:60838/jars/adam-core_2.10-0.17.2-SNAPSHOT.jar <
http://10.0.1.5:60838/jars/adam-core_2.10-0.17.2-SNAPSHOT.jar> with
timestamp 1443372356953
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR
file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/esotericsoftware/kryo/kryo/2.21/kryo-2.21.jar
at http://10.0.1.5:60838/jars/kryo-2.21.jar <
http://10.0.1.5:60838/jars/kryo-2.21.jar> with timestamp 1443372356995
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR
file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/esotericsoftware/reflectasm/reflectasm/1.07/reflectasm-1.07-shaded.jar
at http://10.0.1.5:60838/jars/reflectasm-1.07-shaded.jar <
http://10.0.1.5:60838/jars/reflectasm-1.07-shaded.jar> with timestamp
1443372357147
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR
file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/ow2/asm/asm/4.0/asm-4.0.jar
at http://10.0.1.5:60838/jars/asm-4.0.jar <
http://10.0.1.5:60838/jars/asm-4.0.jar> with timestamp 1443372357152
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR
file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/esotericsoftware/minlog/minlog/1.2/minlog-1.2.jar
at http://10.0.1.5:60838/jars/minlog-1.2.jar <
http://10.0.1.5:60838/jars/minlog-1.2.jar> with timestamp 1443372357153
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR
file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/objenesis/objenesis/1.2/objenesis-1.2.jar
at http://10.0.1.5:60838/jars/objenesis-1.2.jar <
http://10.0.1.5:60838/jars/objenesis-1.2.jar> with timestamp 1443372357155
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/it/unimi/dsi/fastutil/6.4.4/fastutil-6.4.4.jar at http://10.0.1.5:60838/jars/fastutil-6.4.4.jar with timestamp 1443372357424
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-scala_2.10/1.8.1/parquet-scala_2.10-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-scala_2.10-1.8.1.jar with timestamp 1443372357485
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/seqdoop/hadoop-bam/7.0.0/hadoop-bam-7.0.0.jar at http://10.0.1.5:60838/jars/hadoop-bam-7.0.0.jar with timestamp 1443372357534
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/github/samtools/htsjdk/1.133/htsjdk-1.133.jar at http://10.0.1.5:60838/jars/htsjdk-1.133.jar with timestamp 1443372357561
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/commons/commons-jexl/2.1.1/commons-jexl-2.1.1.jar at http://10.0.1.5:60838/jars/commons-jexl-2.1.1.jar with timestamp 1443372357576
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/tukaani/xz/1.5/xz-1.5.jar at http://10.0.1.5:60838/jars/xz-1.5.jar with timestamp 1443372357581
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/ant/ant/1.8.2/ant-1.8.2.jar at http://10.0.1.5:60838/jars/ant-1.8.2.jar with timestamp 1443372357612
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/ant/ant-launcher/1.8.2/ant-launcher-1.8.2.jar at http://10.0.1.5:60838/jars/ant-launcher-1.8.2.jar with timestamp 1443372357622
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/testng/testng/6.8.8/testng-6.8.8.jar at http://10.0.1.5:60838/jars/testng-6.8.8.jar with timestamp 1443372357639
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/beanshell/bsh/2.0b4/bsh-2.0b4.jar at http://10.0.1.5:60838/jars/bsh-2.0b4.jar with timestamp 1443372357685
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/beust/jcommander/1.27/jcommander-1.27.jar at http://10.0.1.5:60838/jars/jcommander-1.27.jar with timestamp 1443372357692
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/google/guava/guava/14.0.1/guava-14.0.1.jar at http://10.0.1.5:60838/jars/guava-14.0.1.jar with timestamp 1443372357744
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/adam/adam-apis_2.10/0.17.2-SNAPSHOT/adam-apis_2.10-0.17.2-SNAPSHOT.jar at http://10.0.1.5:60838/jars/adam-apis_2.10-0.17.2-SNAPSHOT.jar with timestamp 1443372357758
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/scala-lang/scala-library/2.10.4/scala-library-2.10.4.jar at http://10.0.1.5:60838/jars/scala-library-2.10.4.jar with timestamp 1443372357947
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/slf4j/slf4j-log4j12/1.7.5/slf4j-log4j12-1.7.5.jar at http://10.0.1.5:60838/jars/slf4j-log4j12-1.7.5.jar with timestamp 1443372357980
15/09/27 09:45:58 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/args4j/args4j/2.0.23/args4j-2.0.23.jar at http://10.0.1.5:60838/jars/args4j-2.0.23.jar with timestamp 1443372358026
15/09/27 09:45:58 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/adam/adam-cli_2.10/0.17.2-SNAPSHOT/adam-cli_2.10-0.17.2-SNAPSHOT.jar at http://10.0.1.5:60838/jars/adam-cli_2.10-0.17.2-SNAPSHOT.jar with timestamp 1443372358030
15/09/27 09:45:58 WARN metrics.MetricsSystem: Using default name
DAGScheduler for source because spark.app.id is not set.
15/09/27 09:45:58 INFO executor.Executor: Starting executor ID driver
on host localhost
15/09/27 09:45:58 INFO executor.Executor: Using REPL class URI:
http://10.0.1.5:60826
15/09/27 09:45:59 INFO util.Utils: Successfully started service
'org.apache.spark.network.netty.NettyBlockTransferService' on port 60842.
15/09/27 09:45:59 INFO netty.NettyBlockTransferService: Server
created on 60842
15/09/27 09:45:59 INFO storage.BlockManagerMaster: Trying to register
BlockManager
15/09/27 09:45:59 INFO storage.BlockManagerMasterEndpoint:
Registering block manager localhost:60842 with 530.0 MB RAM,
BlockManagerId(driver, localhost, 60842)
15/09/27 09:45:59 INFO storage.BlockManagerMaster: Registered
BlockManager
15/09/27 09:46:00 INFO repl.SparkILoop: Created spark context..
Spark context available as sc.
15/09/27 09:46:04 INFO repl.SparkILoop: Created sql context..
SQL context available as sqlContext.

scala> Stopping spark context.
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/static/sql,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/SQL/execution/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/SQL/execution,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/SQL/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/SQL,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/metrics/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/stages/stage/kill,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/api,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/static,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/executors/threadDump/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/executors/threadDump,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/executors/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/executors,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/environment/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/environment,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/storage/rdd/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/storage/rdd,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/storage/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/storage,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/stages/pool/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/stages/pool,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/stages/stage/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/stages/stage,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/stages/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/stages,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/jobs/job/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/jobs/job,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/jobs/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/jobs,null}
15/09/27 09:46:54 INFO ui.SparkUI: Stopped Spark web UI at
http://10.0.1.5:4041
15/09/27 09:46:54 INFO scheduler.DAGScheduler: Stopping DAGScheduler
15/09/27 09:46:54 INFO spark.MapOutputTrackerMasterEndpoint:
MapOutputTrackerMasterEndpoint stopped!
15/09/27 09:46:54 INFO storage.MemoryStore: MemoryStore cleared
15/09/27 09:46:54 INFO storage.BlockManager: BlockManager stopped
15/09/27 09:46:54 INFO storage.BlockManagerMaster: BlockManagerMaster
stopped
15/09/27 09:46:54 INFO
scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint:
OutputCommitCoordinator stopped!
15/09/27 09:46:54 INFO spark.SparkContext: Successfully stopped
SparkContext
15/09/27 09:46:54 INFO util.ShutdownHookManager: Shutdown hook called
15/09/27 09:46:54 INFO
remote.RemoteActorRefProvider$RemotingTerminator: Shutting down remote
daemon.
15/09/27 09:46:54 INFO util.ShutdownHookManager: Deleting directory
/private/var/folders/nj/nphdkhyj6s1dttb0pd9zb2wc0000gn/T/spark-94dfb979-530a-4b9d-8109-2dec5a277611
15/09/27 09:46:54 INFO
remote.RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down;
proceeding with flushing remote transports.
15/09/27 09:46:54 INFO util.ShutdownHookManager: Deleting directory
/private/var/folders/nj/nphdkhyj6s1dttb0pd9zb2wc0000gn/T/spark-c735407c-acd9-4be7-ac3a-2de4e82af5e4


Reply to this email directly or view it on GitHub:
https://github.com/bigdatagenomics/adam/issues/837



@ryan-williams
Member

I have not, though I recently checked out spark-notebook and have used it
with success to run ADAM things.

I am also meaning to check out Jupyter Scala. I heard mixed reviews about
Zeppelin, so I have planned to investigate the other two first.

On Sun, Sep 27, 2015 at 11:14 PM dbl notifications@github.com wrote:

Thanks! I’ll check out Spree.

Just curious - have you ever tried:

  1. Jupyter Scala
  2. Apache Zeppelin (incubating)

to run ADAM, instead of adam-shell?

On Sep 27, 2015, at 7:55 PM, Ryan Williams notifications@github.com
wrote:

If you're referring to the "BindException"s, that's Spark's UI trying to
bind to any free port starting from 4040 and incrementing by 1. You can
ignore those, for most intents and purposes.

Ironically, I was just running >16 Spark apps in yarn-client mode on the
same machine and discovered that if Spark tries 16 ports and they are all
unavailable, it fails the app before it even starts. I'm using Spree
(https://github.com/hammerlab/spree) instead of Spark's web UI for these
apps, so I just disabled the web UI altogether with "--conf
spark.ui.enabled=false".

But in general, you can ignore them; they're definitely annoying, and
Spark should probably handle this more gracefully, but that's another
discussion.
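The workaround above, plus the related port knob, can be written down as a Spark conf sketch (assumption: the cap on port probing described above is `spark.port.maxRetries`, which defaults to 16; adjust the value to taste):

```properties
# spark-defaults.conf sketch (equivalently, pass each as --conf key=value)

# Option 1: skip the web UI entirely, as in the message above.
spark.ui.enabled        false

# Option 2: keep the UI, but let Spark probe more ports above 4040
# before giving up (the default is 16 attempts).
spark.port.maxRetries   32
```

With option 2, drivers beyond the 17th on one machine still get a UI instead of failing at startup.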

On Sun, Sep 27, 2015 at 10:41 PM dbl notifications@github.com wrote:

Still no solution.

Do you know why adam-shell gets these Java errors:

15/09/27 19:39:33 WARN component.AbstractLifeCycle: FAILED SelectChannelConnector@0.0.0.0:4040: java.net.BindException: Address already in use
java.net.BindException: Address already in use
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:414)
at sun.nio.ch.Net.bind(Net.java:406)
at

sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at

org.spark-project.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
at

org.spark-project.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
at

org.spark-project.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
at

org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.spark-project.jetty.server.Server.doStart(Server.java:293)
at

org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at

org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:236)
at
org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:246)
at
org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:246)
at

org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1913)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1904)
at
org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:246)
at org.apache.spark.ui.WebUI.bind(WebUI.scala:136)
at
org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:474)
at
org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:474)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:474)
at

org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
at $line3.$read$$iwC$$iwC.<init>(<console>:9)
at $line3.$read$$iwC.<init>(<console>:18)
at $line3.$read.<init>(<console>:20)
at $line3.$read$.<init>(<console>:24)
at $line3.$read$.<clinit>(<console>)
at $line3.$eval$.<init>(<console>:7)
at $line3.$eval$.<clinit>(<console>)
at $line3.$eval.$print()
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at

sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at

sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at

org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at

org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
at
org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at
org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at

org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at

org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
at

org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
at

org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
at
org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
at

org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
at

org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
at

org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
at
org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
at

org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
at

org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at

org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at

scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at

sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at

sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at

org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
at
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/09/27 19:39:33 WARN component.AbstractLifeCycle: FAILED
org.spark-project.jetty.server.Server@7d64a960:
java.net.BindException:
Address already in use
java.net.BindException: Address already in use
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:414)
at sun.nio.ch.Net.bind(Net.java:406)
at

sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at

org.spark-project.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
at

org.spark-project.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
at

org.spark-project.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
at

org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.spark-project.jetty.server.Server.doStart(Server.java:293)
at

org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at

org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:236)
at
org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:246)
at
org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:246)
at

org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1913)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1904)
at
org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:246)
at org.apache.spark.ui.WebUI.bind(WebUI.scala:136)
at
org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:474)
at
org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:474)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:474)
at

org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
at $line3.$read$$iwC$$iwC.<init>(<console>:9)
at $line3.$read$$iwC.<init>(<console>:18)
at $line3.$read.<init>(<console>:20)
at $line3.$read$.<init>(<console>:24)
at $line3.$read$.<clinit>(<console>)
at $line3.$eval$.<init>(<console>:7)
at $line3.$eval$.<clinit>(<console>)
at $line3.$eval.$print()
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at

sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at

sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at

org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at

org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
at
org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at
org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at

org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at

org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
at

org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
at

org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
at
org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
at

org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
at

org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
at

org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
at
org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
at

org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
at

org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at

org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at

scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at

sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at

sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at

org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
at
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/09/27 19:39:33 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
15/09/27 19:39:33 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/api,null}
15/09/27 19:39:33 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHan

On Sep 27, 2015, at 11:03 AM, David Laxer davidl@softintel.com
wrote:

It’s coming from Spark. I’m investigating ...

http://stackoverflow.com/questions/32800018/hadoop-2-6-1-warning-warn-util-nativecodeloader/32800241?noredirect=1#comment53458542_32800241

On Sep 27, 2015, at 10:15 AM, Ryan Williams <notifications@github.com> wrote:

FWIW I think I see that warning every time I run an adam-shell; I
don't
think it hurts anything though any leads on making it go away are
welcome.
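Two possible leads, sketched as config fragments. Both knobs are standard (`log4j.logger.*` levels and Spark's `extraLibraryPath` settings); the native-library path below is the example build location mentioned in this thread, not a standard install location:

```properties
# conf/log4j.properties -- silence just this one logger:
log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR

# conf/spark-defaults.conf -- or actually load the native Hadoop libs
# (example path from this thread's source build of Hadoop):
spark.driver.extraLibraryPath   /Users/davidlaxer/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT/lib/native
spark.executor.extraLibraryPath /Users/davidlaxer/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT/lib/native
```

The first option only hides the warning; the second makes Spark's JVMs actually find the native code, which is what removes it for good.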

On Sun, Sep 27, 2015 at 12:50 PM dbl <notifications@github.com> wrote:

I'm getting a warning running 'bin/adam-shell', on OS X 10.10.5:

15/09/27 09:45:26 WARN util.NativeCodeLoader: Unable to load
native-hadoop
library for your platform... using builtin-java classes where
applicable

I had this problem with the binary distributions of Hadoop (e.g. -
2.3.0,
2.6.1) until I built hadoop from source with the 'native' option
enabled
and set this environment variable:

HADOOP_COMMON_LIB_NATIVE_DIR=/Users/davidlaxer/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT/lib/native

Is this a Spark Warning? Do I have to set a SPARK environment
variable?
Is there an ADAM option I must set?

$ bin/adam-shell
Using SPARK_SHELL=/Users/davidlaxer/spark/bin/spark-shell
15/09/27 09:45:26 WARN util.NativeCodeLoader: Unable to load
native-hadoop library for your platform... using builtin-java classes
where
applicable
15/09/27 09:45:27 INFO spark.SecurityManager: Changing view acls
to:
davidlaxer
15/09/27 09:45:27 INFO spark.SecurityManager: Changing modify acls
to: davidlaxer
15/09/27 09:45:27 INFO spark.SecurityManager: SecurityManager:
authentication disabled; ui acls disabled; users with view permissions:
Set(davidlaxer); users with modify permissions: Set(davidlaxer)
15/09/27 09:45:28 INFO spark.HttpServer: Starting HTTP Server
15/09/27 09:45:29 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/09/27 09:45:29 INFO server.AbstractConnector: Started
SocketConnector@0.0.0.0:60826
15/09/27 09:45:29 INFO util.Utils: Successfully started service
'HTTP
class server' on port 60826.
Welcome to


      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.5.0-SNAPSHOT
      /_/

Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM,
Java
1.8.0_05)
Type in expressions to have them evaluated.
Type :help for more information.
15/09/27 09:45:50 INFO spark.SparkContext: Running Spark version
1.5.0-SNAPSHOT
15/09/27 09:45:50 INFO spark.SecurityManager: Changing view acls
to:
davidlaxer
15/09/27 09:45:50 INFO spark.SecurityManager: Changing modify acls
to: davidlaxer
15/09/27 09:45:50 INFO spark.SecurityManager: SecurityManager:
authentication disabled; ui acls disabled; users with view permissions:
Set(davidlaxer); users with modify permissions: Set(davidlaxer)
15/09/27 09:45:52 INFO slf4j.Slf4jLogger: Slf4jLogger started
15/09/27 09:45:53 INFO Remoting: Starting remoting
15/09/27 09:45:54 INFO Remoting: Remoting started; listening on
addresses :[akka.tcp://sparkDriver@10.0.1.5:60837]
15/09/27 09:45:54 INFO util.Utils: Successfully started service
'sparkDriver' on port 60837.
15/09/27 09:45:54 INFO spark.SparkEnv: Registering
MapOutputTracker
15/09/27 09:45:54 INFO spark.SparkEnv: Registering
BlockManagerMaster
15/09/27 09:45:54 INFO storage.DiskBlockManager: Created local
directory at

/private/var/folders/nj/nphdkhyj6s1dttb0pd9zb2wc0000gn/T/blockmgr-57acd6fd-754f-4d86-9226-bbcead030b70

15/09/27 09:45:54 INFO storage.MemoryStore: MemoryStore started
with
capacity 530.0 MB
15/09/27 09:45:54 INFO spark.HttpFileServer: HTTP File server
directory is

/private/var/folders/nj/nphdkhyj6s1dttb0pd9zb2wc0000gn/T/spark-c735407c-acd9-4be7-ac3a-2de4e82af5e4/httpd-72e12a03-5d15-41cc-a524-c74725d6ec0f

15/09/27 09:45:54 INFO spark.HttpServer: Starting HTTP Server
15/09/27 09:45:54 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/09/27 09:45:54 INFO server.AbstractConnector: Started
SocketConnector@0.0.0.0:60838
15/09/27 09:45:54 INFO util.Utils: Successfully started service
'HTTP
file server' on port 60838.
15/09/27 09:45:54 INFO spark.SparkEnv: Registering
OutputCommitCoordinator
15/09/27 09:45:55 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/09/27 09:45:55 WARN component.AbstractLifeCycle: FAILED
SelectChannelConnector@0.0.0.0:4040:
java.net.BindException: Address already in use
java.net.BindException: Address already in use
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:414)
at sun.nio.ch.Net.bind(Net.java:406)
at

sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)

at
sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at

org.eclipse.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)

at

org.eclipse.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)

at

org.eclipse.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)

at

org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)

at org.eclipse.jetty.server.Server.doStart(Server.java:293)
at

org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)

at

org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:240)

at
org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:250)
at
org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:250)
at

org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1912)

at
scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
at
org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1903)
at
org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:250)
at org.apache.spark.ui.WebUI.bind(WebUI.scala:136)
at
org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:465)
at
org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:465)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:465)
at

org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)

at $line3.$read$$iwC$$iwC.<init>(<console>:9)
at $line3.$read$$iwC.<init>(<console>:18)
at $line3.$read.<init>(<console>:20)
at $line3.$read$.<init>(<console>:24)
at $line3.$read$.<clinit>(<console>)
at $line3.$eval$.<init>(<console>:7)
at $line3.$eval$.<clinit>(<console>)
at $line3.$eval.$print()
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at

sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)

at

sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

at java.lang.reflect.Method.invoke(Method.java:483)
at

org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)

at

org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)

at
org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at
org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at
org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at

org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)

at

org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)

at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at

org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)

at

org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)

at
org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
at

org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)

at
org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
at

org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)

at

org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)

at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
at

org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)

at

org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)

at

org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)

at

org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)

at

org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)

at

scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)

at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at

sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)

at

sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

at java.lang.reflect.Method.invoke(Method.java:483)
at

org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)

at
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at
org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at
org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/09/27 09:45:55 WARN component.AbstractLifeCycle: FAILED
org.eclipse.jetty.server.Server@22686ddb: java.net.BindException:
Address
already in use
java.net.BindException: Address already in use
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:414)
at sun.nio.ch.Net.bind(Net.java:406)
at

sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)

at
sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at

org.eclipse.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)

at

org.eclipse.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)

at

org.eclipse.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)

at

org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)

at org.eclipse.jetty.server.Server.doStart(Server.java:293)
at

org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)

at

org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:240)

at
org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:250)
at
org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:250)
at

org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1912)

at
scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
at
org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1903)
at
org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:250)
at org.apache.spark.ui.WebUI.bind(WebUI.scala:136)
at
org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:465)
at
org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:465)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.SparkContext.(SparkContext.scala:465)
at

org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)

at $line3.$read$$iwC$$iwC.(:9)
at $line3.$read$$iwC.(:18)
at $line3.$read.(:20)
at $line3.$read$.(:24)
at $line3.$read$.()
at $line3.$eval$.(:7)
at $line3.$eval$.()
at $line3.$eval.$print()
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at

sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)

at

sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

at java.lang.reflect.Method.invoke(Method.java:483)
at

org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)

at

org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)

at
org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at
org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at
org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at

org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)

at

org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)

at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at

org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)

at

org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)

at
org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
at

org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)

at
org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
at

org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)

at

org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)

at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
at

org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)

at

org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)

at

org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)

at

org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)

at

org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)

at

scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)

at org.apache.spark.repl.SparkILoop.org
$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at

sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)

at

sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

at java.lang.reflect.Method.invoke(Method.java:483)
at

org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)

at
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at
org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at
org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage/kill,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/api,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/static,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/threadDump/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/threadDump,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/environment/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/environment,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/rdd/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/rdd,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/pool/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/pool,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs/job/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs/job,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs,null}
15/09/27 09:45:56 WARN util.Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
15/09/27 09:45:56 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/09/27 09:45:56 INFO server.AbstractConnector: Started SelectChannelConnector@0.0.0.0:4041
15/09/27 09:45:56 INFO util.Utils: Successfully started service 'SparkUI' on port 4041.
15/09/27 09:45:56 INFO ui.SparkUI: Started SparkUI at http://10.0.1.5:4041
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/commons-cli/commons-cli/1.2/commons-cli-1.2.jar at http://10.0.1.5:60838/jars/commons-cli-1.2.jar with timestamp 1443372356368
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/commons-httpclient/commons-httpclient/3.1/commons-httpclient-3.1.jar at http://10.0.1.5:60838/jars/commons-httpclient-3.1.jar with timestamp 1443372356374
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/commons-codec/commons-codec/1.4/commons-codec-1.4.jar at http://10.0.1.5:60838/jars/commons-codec-1.4.jar with timestamp 1443372356376
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/commons-logging/commons-logging/1.1.1/commons-logging-1.1.1.jar at http://10.0.1.5:60838/jars/commons-logging-1.1.1.jar with timestamp 1443372356381
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar at http://10.0.1.5:60838/jars/commons-compress-1.4.1.jar with timestamp 1443372356403
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/slf4j/slf4j-api/1.7.10/slf4j-api-1.7.10.jar at http://10.0.1.5:60838/jars/slf4j-api-1.7.10.jar with timestamp 1443372356436
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/log4j/log4j/1.2.17/log4j-1.2.17.jar at http://10.0.1.5:60838/jars/log4j-1.2.17.jar with timestamp 1443372356443
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/xerial/snappy/snappy-java/1.1.1.7/snappy-java-1.1.1.7.jar at http://10.0.1.5:60838/jars/snappy-java-1.1.1.7.jar with timestamp 1443372356501
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/thoughtworks/paranamer/paranamer/2.6/paranamer-2.6.jar at http://10.0.1.5:60838/jars/paranamer-2.6.jar with timestamp 1443372356512
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/utils/utils-io_2.10/0.2.3/utils-io_2.10-0.2.3.jar at http://10.0.1.5:60838/jars/utils-io_2.10-0.2.3.jar with timestamp 1443372356519
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/utils/utils-misc_2.10/0.2.3/utils-misc_2.10-0.2.3.jar at http://10.0.1.5:60838/jars/utils-misc_2.10-0.2.3.jar with timestamp 1443372356521
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/httpcomponents/httpclient/4.3.2/httpclient-4.3.2.jar at http://10.0.1.5:60838/jars/httpclient-4.3.2.jar with timestamp 1443372356574
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/httpcomponents/httpcore/4.3.1/httpcore-4.3.1.jar at http://10.0.1.5:60838/jars/httpcore-4.3.1.jar with timestamp 1443372356601
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/utils/utils-cli_2.10/0.2.3/utils-cli_2.10-0.2.3.jar at http://10.0.1.5:60838/jars/utils-cli_2.10-0.2.3.jar with timestamp 1443372356623
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-avro/1.8.1/parquet-avro-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-avro-1.8.1.jar with timestamp 1443372356655
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-column/1.8.1/parquet-column-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-column-1.8.1.jar with timestamp 1443372356687
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-common/1.8.1/parquet-common-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-common-1.8.1.jar with timestamp 1443372356695
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-encoding/1.8.1/parquet-encoding-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-encoding-1.8.1.jar with timestamp 1443372356702
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-hadoop/1.8.1/parquet-hadoop-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-hadoop-1.8.1.jar with timestamp 1443372356708
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-jackson/1.8.1/parquet-jackson-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-jackson-1.8.1.jar with timestamp 1443372356722
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-format/2.3.0-incubating/parquet-format-2.3.0-incubating.jar at http://10.0.1.5:60838/jars/parquet-format-2.3.0-incubating.jar with timestamp 1443372356747
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/utils/utils-metrics_2.10/0.2.3/utils-metrics_2.10-0.2.3.jar at http://10.0.1.5:60838/jars/utils-metrics_2.10-0.2.3.jar with timestamp 1443372356755
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/netflix/servo/servo-core/0.5.5/servo-core-0.5.5.jar at http://10.0.1.5:60838/jars/servo-core-0.5.5.jar with timestamp 1443372356802
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/google/code/findbugs/annotations/2.0.0/annotations-2.0.0.jar at http://10.0.1.5:60838/jars/annotations-2.0.0.jar with timestamp 1443372356808
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/scoverage/scalac-scoverage-plugin_2.10/0.99.2/scalac-scoverage-plugin_2.10-0.99.2.jar at http://10.0.1.5:60838/jars/scalac-scoverage-plugin_2.10-0.99.2.jar with timestamp 1443372356831
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/commons-io/commons-io/1.3.2/commons-io-1.3.2.jar at http://10.0.1.5:60838/jars/commons-io-1.3.2.jar with timestamp 1443372356867
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/bdg-formats/bdg-formats/0.4.0/bdg-formats-0.4.0.jar at http://10.0.1.5:60838/jars/bdg-formats-0.4.0.jar with timestamp 1443372356869
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/avro/avro/1.7.6/avro-1.7.6.jar at http://10.0.1.5:60838/jars/avro-1.7.6.jar with timestamp 1443372356882
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/codehaus/jackson/jackson-core-asl/1.9.13/jackson-core-asl-1.9.13.jar at http://10.0.1.5:60838/jars/jackson-core-asl-1.9.13.jar with timestamp 1443372356886
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/codehaus/jackson/jackson-mapper-asl/1.9.13/jackson-mapper-asl-1.9.13.jar at http://10.0.1.5:60838/jars/jackson-mapper-asl-1.9.13.jar with timestamp 1443372356915
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/adam/adam-core_2.10/0.17.2-SNAPSHOT/adam-core_2.10-0.17.2-SNAPSHOT.jar at http://10.0.1.5:60838/jars/adam-core_2.10-0.17.2-SNAPSHOT.jar with timestamp 1443372356953
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/esotericsoftware/kryo/kryo/2.21/kryo-2.21.jar at http://10.0.1.5:60838/jars/kryo-2.21.jar with timestamp 1443372356995
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/esotericsoftware/reflectasm/reflectasm/1.07/reflectasm-1.07-shaded.jar at http://10.0.1.5:60838/jars/reflectasm-1.07-shaded.jar with timestamp 1443372357147
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/ow2/asm/asm/4.0/asm-4.0.jar at http://10.0.1.5:60838/jars/asm-4.0.jar with timestamp 1443372357152
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/esotericsoftware/minlog/minlog/1.2/minlog-1.2.jar at http://10.0.1.5:60838/jars/minlog-1.2.jar with timestamp 1443372357153
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/objenesis/objenesis/1.2/objenesis-1.2.jar at http://10.0.1.5:60838/jars/objenesis-1.2.jar with timestamp 1443372357155
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/it/unimi/dsi/fastutil/6.4.4/fastutil-6.4.4.jar at http://10.0.1.5:60838/jars/fastutil-6.4.4.jar with timestamp 1443372357424
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-scala_2.10/1.8.1/parquet-scala_2.10-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-scala_2.10-1.8.1.jar with timestamp 1443372357485
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/seqdoop/hadoop-bam/7.0.0/hadoop-bam-7.0.0.jar at http://10.0.1.5:60838/jars/hadoop-bam-7.0.0.jar with timestamp 1443372357534
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/github/samtools/htsjdk/1.133/htsjdk-1.133.jar at http://10.0.1.5:60838/jars/htsjdk-1.133.jar with timestamp 1443372357561
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/commons/commons-jexl/2.1.1/commons-jexl-2.1.1.jar at http://10.0.1.5:60838/jars/commons-jexl-2.1.1.jar with timestamp 1443372357576
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/tukaani/xz/1.5/xz-1.5.jar at http://10.0.1.5:60838/jars/xz-1.5.jar with timestamp 1443372357581
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/ant/ant/1.8.2/ant-1.8.2.jar at http://10.0.1.5:60838/jars/ant-1.8.2.jar with timestamp 1443372357612
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/ant/ant-launcher/1.8.2/ant-launcher-1.8.2.jar at http://10.0.1.5:60838/jars/ant-launcher-1.8.2.jar with timestamp 1443372357622
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/testng/testng/6.8.8/testng-6.8.8.jar at http://10.0.1.5:60838/jars/testng-6.8.8.jar with timestamp 1443372357639
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/beanshell/bsh/2.0b4/bsh-2.0b4.jar at http://10.0.1.5:60838/jars/bsh-2.0b4.jar with timestamp 1443372357685
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/beust/jcommander/1.27/jcommander-1.27.jar at http://10.0.1.5:60838/jars/jcommander-1.27.jar with timestamp 1443372357692
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/google/guava/guava/14.0.1/guava-14.0.1.jar at http://10.0.1.5:60838/jars/guava-14.0.1.jar with timestamp 1443372357744
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/adam/adam-apis_2.10/0.17.2-SNAPSHOT/adam-apis_2.10-0.17.2-SNAPSHOT.jar at http://10.0.1.5:60838/jars/adam-apis_2.10-0.17.2-SNAPSHOT.jar with timestamp 1443372357758
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/scala-lang/scala-library/2.10.4/scala-library-2.10.4.jar at http://10.0.1.5:60838/jars/scala-library-2.10.4.jar with timestamp 1443372357947
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/slf4j/slf4j-log4j12/1.7.5/slf4j-log4j12-1.7.5.jar at http://10.0.1.5:60838/jars/slf4j-log4j12-1.7.5.jar with timestamp 1443372357980
15/09/27 09:45:58 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/args4j/args4j/2.0.23/args4j-2.0.23.jar at http://10.0.1.5:60838/jars/args4j-2.0.23.jar with timestamp 1443372358026
15/09/27 09:45:58 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/adam/adam-cli_2.10/0.17.2-SNAPSHOT/adam-cli_2.10-0.17.2-SNAPSHOT.jar at http://10.0.1.5:60838/jars/adam-cli_2.10-0.17.2-SNAPSHOT.jar with timestamp 1443372358030

15/09/27 09:45:58 WARN metrics.MetricsSystem: Using default name DAGScheduler for source because spark.app.id is not set.
15/09/27 09:45:58 INFO executor.Executor: Starting executor ID driver on host localhost
15/09/27 09:45:58 INFO executor.Executor: Using REPL class URI: http://10.0.1.5:60826
15/09/27 09:45:59 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 60842.
15/09/27 09:45:59 INFO netty.NettyBlockTransferService: Server created on 60842
15/09/27 09:45:59 INFO storage.BlockManagerMaster: Trying to register BlockManager
15/09/27 09:45:59 INFO storage.BlockManagerMasterEndpoint: Registering block manager localhost:60842 with 530.0 MB RAM, BlockManagerId(driver, localhost, 60842)
15/09/27 09:45:59 INFO storage.BlockManagerMaster: Registered BlockManager
15/09/27 09:46:00 INFO repl.SparkILoop: Created spark context..
Spark context available as sc.
15/09/27 09:46:04 INFO repl.SparkILoop: Created sql context..
SQL context available as sqlContext.

scala> Stopping spark context.
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/static/sql,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/SQL/execution/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/SQL/execution,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/SQL/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/SQL,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/metrics/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage/kill,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/api,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/static,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/threadDump/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/threadDump,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/environment/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/environment,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/rdd/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/rdd,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/pool/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/pool,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs/job/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs/job,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs,null}
15/09/27 09:46:54 INFO ui.SparkUI: Stopped Spark web UI at http://10.0.1.5:4041
15/09/27 09:46:54 INFO scheduler.DAGScheduler: Stopping DAGScheduler
15/09/27 09:46:54 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
15/09/27 09:46:54 INFO storage.MemoryStore: MemoryStore cleared
15/09/27 09:46:54 INFO storage.BlockManager: BlockManager stopped
15/09/27 09:46:54 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
15/09/27 09:46:54 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
15/09/27 09:46:54 INFO spark.SparkContext: Successfully stopped SparkContext
15/09/27 09:46:54 INFO util.ShutdownHookManager: Shutdown hook called
15/09/27 09:46:54 INFO remote.RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
15/09/27 09:46:54 INFO util.ShutdownHookManager: Deleting directory /private/var/folders/nj/nphdkhyj6s1dttb0pd9zb2wc0000gn/T/spark-94dfb979-530a-4b9d-8109-2dec5a277611
15/09/27 09:46:54 INFO remote.RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
15/09/27 09:46:54 INFO util.ShutdownHookManager: Deleting directory /private/var/folders/nj/nphdkhyj6s1dttb0pd9zb2wc0000gn/T/spark-c735407c-acd9-4be7-ac3a-2de4e82af5e4


Reply to this email directly or view it on GitHub (#837).

@dbl001
Author

dbl001 commented Sep 28, 2015

Did you change the ‘adam-shell’ script to include the Spree requirements?
E.g.:

If using Spark ≥ 1.5.0, simply pass the following flags to spark-{shell,submit}:

--packages org.hammerlab:spark-json-relay:2.0.0
--conf spark.extraListeners=org.apache.spark.JsonRelay
Or:

Include JsonRelay on the driver's classpath:

--driver-class-path /path/to/json-relay-2.0.0.jar

Register JsonRelay as a SparkListener:

--conf spark.extraListeners=org.apache.spark.JsonRelay

Point it at your slim instance (default: localhost:8123):

--conf spark.slim.host=…
--conf spark.slim.port=…
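For reference, the flags above could be strung together into a single adam-shell invocation roughly like this — a sketch only: the package coordinates and listener class are the ones from the Spree README quoted above, and the `localhost:8123` host/port pair is just its documented default, not something verified in this thread.

```shell
# Sketch: assembling the Spree/JsonRelay flags for adam-shell (Spark >= 1.5.0).
# Package coordinates and listener class are from the Spree README; the
# slim host/port shown are its stated defaults, assumed here for illustration.
SPREE_FLAGS="--packages org.hammerlab:spark-json-relay:2.0.0"
SPREE_FLAGS="$SPREE_FLAGS --conf spark.extraListeners=org.apache.spark.JsonRelay"
SPREE_FLAGS="$SPREE_FLAGS --conf spark.slim.host=localhost"
SPREE_FLAGS="$SPREE_FLAGS --conf spark.slim.port=8123"

# adam-shell forwards its arguments to spark-shell, so this would launch the
# REPL with the relay registered:
echo "bin/adam-shell $SPREE_FLAGS"
```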

Any ideas about this error with slim?
$ sudo npm install -g slim.js
Password:

kerberos@0.0.14 install /opt/local/lib/node_modules/slim.js/node_modules/mongodb/node_modules/mongodb-core/node_modules/kerberos
(node-gyp rebuild) || (exit 0)

CXX(target) Release/obj.target/kerberos/lib/kerberos.o
CXX(target) Release/obj.target/kerberos/lib/worker.o
CC(target) Release/obj.target/kerberos/lib/kerberosgss.o
CC(target) Release/obj.target/kerberos/lib/base64.o
CXX(target) Release/obj.target/kerberos/lib/kerberos_context.o
SOLINK_MODULE(target) Release/kerberos.node
SOLINK_MODULE(target) Release/kerberos.node: Finished
/opt/local/bin/slim -> /opt/local/lib/node_modules/slim.js/slim
slim.js@1.2.1 /opt/local/lib/node_modules/slim.js
├── deep-equal@1.0.0
├── line-reader@0.2.4
├── minimist@1.1.1
├── async@1.3.0
├── mkdirp@0.5.1 (minimist@0.0.8)
├── node.extend@1.1.5 (is@3.1.0)
├── shelljs@0.5.1
├── moment@2.10.3
├── tracer@0.7.4 (tinytim@0.1.1, colors@1.0.3, dateformat@1.0.11)
├── oboe@2.1.2 (http-https@1.0.0)
└── mongodb@2.0.42 (es6-promise@2.1.1, readable-stream@1.0.31, mongodb-core@1.2.10)
David-Laxers-MacBook-Pro:spree davidlaxer$ slim
module.js:338
throw err;
^
Error: Cannot find module '/Users/davidlaxer/spree/slim.js'
at Function.Module._resolveFilename (module.js:336:15)
at Function.Module._load (module.js:278:25)
at Function.Module.runMain (module.js:501:10)
at startup (node.js:129:16)
at node.js:814:3

On Sep 27, 2015, at 9:12 PM, Ryan Williams notifications@github.com wrote:

I have not, though I did do a bunch of checking out spark-notebook recently
and have used it with success to run ADAM things.

I am also meaning to check out Jupyter Scala. I heard mixed reviews about
Zeppelin so have planned to investigate the other two first.

On Sun, Sep 27, 2015 at 11:14 PM dbl notifications@github.com wrote:

Thanks! I’ll check out Spree.

Just curious - have you ever tried:

  1. Jupyter Scala
  2. Apache Incubator Zeppelin

to run Adam, instead of adam-shell?

On Sep 27, 2015, at 7:55 PM, Ryan Williams notifications@github.com
wrote:

If you're referring to the "BindException"s, that's Spark's UI trying to
bind to any free port starting from 4040 and incrementing by 1. You can
ignore those, for most intents and purposes.

Ironically, I was just running >16 spark apps in yarn-client mode on the
same machine just now and discovered that if Spark tries 16 ports and
they
are all unavailable, it fails the app before it even starts. I'm using
Spree
https://github.com/hammerlab/spree instead of Spark's web UI for these
apps so I just disabled the web UI altogether with "--conf
spark.ui.enabled=false".

But in general, you can ignore them; they're definitely annoying, and Spark
should probably handle this more gracefully, but that's another
discussion.
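Both mitigations mentioned above are standard Spark configuration keys; a minimal sketch (the retry value 64 is illustrative, Spark's default is 16):

```shell
# Option 1: disable the web UI entirely, so no bind on port 4040 is attempted.
UI_OFF="--conf spark.ui.enabled=false"
# Option 2: let Spark probe more ports before failing the app.
MORE_RETRIES="--conf spark.port.maxRetries=64"
echo "$UI_OFF"
echo "$MORE_RETRIES"
# e.g.: bin/adam-shell $UI_OFF    or    bin/adam-shell $MORE_RETRIES
```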

On Sun, Sep 27, 2015 at 10:41 PM dbl notifications@github.com wrote:

Still no solution.

Do you know why adam-shell gets these Java errors:

SelectChannelConnector@0.0.0.0:4040: java.net.BindException: Address already in use
java.net.BindException: Address already in use
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:414)
at sun.nio.ch.Net.bind(Net.java:406)
at

sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at

org.spark-project.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
at

org.spark-project.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
at

org.spark-project.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
at

org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.spark-project.jetty.server.Server.doStart(Server.java:293)
at

org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at

org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:236)
at
org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:246)
at
org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:246)
at

org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1913)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1904)
at
org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:246)
at org.apache.spark.ui.WebUI.bind(WebUI.scala:136)
at
org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:474)
at
org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:474)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.SparkContext.(SparkContext.scala:474)
at

org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
at $line3.$read$$iwC$$iwC.(:9)
at $line3.$read$$iwC.(:18)
at $line3.$read.(:20)
at $line3.$read$.(:24)
at $line3.$read$.()
at $line3.$eval$.(:7)
at $line3.$eval$.()
at $line3.$eval.$print()
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at

sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at

sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at

org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at

org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
at
org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at
org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at

org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at

org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
at

org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
at

org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
at
org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
at

org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
at

org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
at

org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
at
org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
at

org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
at

org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at

org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at

scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at

sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at

sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at

org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
at
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/09/27 19:39:33 WARN component.AbstractLifeCycle: FAILED
org.spark-project.jetty.server.Server@7d64a960:
java.net.BindException:
Address already in use
java.net.BindException: Address already in use
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:414)
at sun.nio.ch.Net.bind(Net.java:406)
at

sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at

org.spark-project.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
at

org.spark-project.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
at

org.spark-project.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
at

org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.spark-project.jetty.server.Server.doStart(Server.java:293)
at

org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at

org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:236)
at
org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:246)
at
org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:246)
at

org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1913)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1904)
at
org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:246)
at org.apache.spark.ui.WebUI.bind(WebUI.scala:136)
at
org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:474)
at
org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:474)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.SparkContext.(SparkContext.scala:474)
at

org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
at $line3.$read$$iwC$$iwC.(:9)
at $line3.$read$$iwC.(:18)
at $line3.$read.(:20)
at $line3.$read$.(:24)
at $line3.$read$.()
at $line3.$eval$.(:7)
at $line3.$eval$.()
at $line3.$eval.$print()
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at

sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at

sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at

org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at

org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
at
org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at
org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at

org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at

org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
at

org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
at

org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
at
org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
at

org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
at

org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
at

org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
at
org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
at

org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
at

org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at

org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at

scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at

sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at

sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at

org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
at
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/09/27 19:39:33 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
15/09/27 19:39:33 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHandler{/api,null}
15/09/27 19:39:33 INFO handler.ContextHandler: stopped
o.s.j.s.ServletContextHan

On Sep 27, 2015, at 11:03 AM, David Laxer davidl@softintel.com
wrote:

It’s coming from Spark. I’m investigating ...

http://stackoverflow.com/questions/32800018/hadoop-2-6-1-warning-warn-util-nativecodeloader/32800241?noredirect=1#comment53458542_32800241

On Sep 27, 2015, at 10:15 AM, Ryan Williams notifications@github.com wrote:

FWIW I think I see that warning every time I run an adam-shell; I
don't
think it hurts anything though any leads on making it go away are
welcome.
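One lead worth trying (a sketch, not verified on OS X): the JVM locates libhadoop via java.library.path, and SPARK_SUBMIT_OPTS is a standard hook for passing driver JVM options to spark-shell; the Hadoop path below is the one from this report, substitute your own.

```shell
# Point the driver JVM at Hadoop's native libraries before launching adam-shell.
export HADOOP_HOME=/Users/davidlaxer/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT
export SPARK_SUBMIT_OPTS="-Djava.library.path=$HADOOP_HOME/lib/native"
echo "$SPARK_SUBMIT_OPTS"
# then: bin/adam-shell
```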

On Sun, Sep 27, 2015 at 12:50 PM dbl notifications@github.com wrote:

I'm getting a warning running 'bin/adam-shell', on OS X 10.10.5:

15/09/27 09:45:26 WARN util.NativeCodeLoader: Unable to load
native-hadoop
library for your platform... using builtin-java classes where
applicable

I had this problem with the binary distributions of Hadoop (e.g. -
2.3.0,
2.6.1) until I built hadoop from source with the 'native' option
enabled
and set this environment variable:

HADOOP_COMMON_LIB_NATIVE_DIR=/Users/davidlaxer/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT/lib/native

Is this a Spark Warning? Do I have to set a SPARK environment
variable?
Is there an ADAM option I must set?

$ bin/adam-shell
Using SPARK_SHELL=/Users/davidlaxer/spark/bin/spark-shell
15/09/27 09:45:26 WARN util.NativeCodeLoader: Unable to load
native-hadoop library for your platform... using builtin-java classes
where
applicable
15/09/27 09:45:27 INFO spark.SecurityManager: Changing view acls
to:
davidlaxer
15/09/27 09:45:27 INFO spark.SecurityManager: Changing modify acls
to: davidlaxer
15/09/27 09:45:27 INFO spark.SecurityManager: SecurityManager:
authentication disabled; ui acls disabled; users with view permissions:
Set(davidlaxer); users with modify permissions: Set(davidlaxer)
15/09/27 09:45:28 INFO spark.HttpServer: Starting HTTP Server
15/09/27 09:45:29 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/09/27 09:45:29 INFO server.AbstractConnector: Started
SocketConnector@0.0.0.0:60826
15/09/27 09:45:29 INFO util.Utils: Successfully started service
'HTTP
class server' on port 60826.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.5.0-SNAPSHOT
      /_/

Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM,
Java
1.8.0_05)
Type in expressions to have them evaluated.
Type :help for more information.
15/09/27 09:45:50 INFO spark.SparkContext: Running Spark version
1.5.0-SNAPSHOT
15/09/27 09:45:50 INFO spark.SecurityManager: Changing view acls
to:
davidlaxer
15/09/27 09:45:50 INFO spark.SecurityManager: Changing modify acls
to: davidlaxer
15/09/27 09:45:50 INFO spark.SecurityManager: SecurityManager:
authentication disabled; ui acls disabled; users with view permissions:
Set(davidlaxer); users with modify permissions: Set(davidlaxer)
15/09/27 09:45:52 INFO slf4j.Slf4jLogger: Slf4jLogger started
15/09/27 09:45:53 INFO Remoting: Starting remoting
15/09/27 09:45:54 INFO Remoting: Remoting started; listening on
addresses :[akka.tcp://sparkDriver@10.0.1.5:60837]
15/09/27 09:45:54 INFO util.Utils: Successfully started service
'sparkDriver' on port 60837.
15/09/27 09:45:54 INFO spark.SparkEnv: Registering
MapOutputTracker
15/09/27 09:45:54 INFO spark.SparkEnv: Registering
BlockManagerMaster
15/09/27 09:45:54 INFO storage.DiskBlockManager: Created local
directory at

/private/var/folders/nj/nphdkhyj6s1dttb0pd9zb2wc0000gn/T/blockmgr-57acd6fd-754f-4d86-9226-bbcead030b70

15/09/27 09:45:54 INFO storage.MemoryStore: MemoryStore started
with
capacity 530.0 MB
15/09/27 09:45:54 INFO spark.HttpFileServer: HTTP File server
directory is

/private/var/folders/nj/nphdkhyj6s1dttb0pd9zb2wc0000gn/T/spark-c735407c-acd9-4be7-ac3a-2de4e82af5e4/httpd-72e12a03-5d15-41cc-a524-c74725d6ec0f

15/09/27 09:45:54 INFO spark.HttpServer: Starting HTTP Server
15/09/27 09:45:54 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/09/27 09:45:54 INFO server.AbstractConnector: Started
SocketConnector@0.0.0.0:60838
15/09/27 09:45:54 INFO util.Utils: Successfully started service
'HTTP
file server' on port 60838.
15/09/27 09:45:54 INFO spark.SparkEnv: Registering
OutputCommitCoordinator
15/09/27 09:45:55 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/09/27 09:45:55 WARN component.AbstractLifeCycle: FAILED
SelectChannelConnector@0.0.0.0:4040:
java.net.BindException: Address already in use
java.net.BindException: Address already in use
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:414)
at sun.nio.ch.Net.bind(Net.java:406)
at

sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)

at
sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at

org.eclipse.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)

at

org.eclipse.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)

at

org.eclipse.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)

at

org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)

at org.eclipse.jetty.server.Server.doStart(Server.java:293)
at

org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)

at

org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:240)

at
org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:250)
at
org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:250)
at

org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1912)

at
scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
at
org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1903)
at
org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:250)
at org.apache.spark.ui.WebUI.bind(WebUI.scala:136)
at
org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:465)
at
org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:465)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.SparkContext.(SparkContext.scala:465)
at

org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)

at $line3.$read$$iwC$$iwC.(:9)
at $line3.$read$$iwC.(:18)
at $line3.$read.(:20)
at $line3.$read$.(:24)
at $line3.$read$.()
at $line3.$eval$.(:7)
at $line3.$eval$.()
at $line3.$eval.$print()
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at

sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)

at

sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

at java.lang.reflect.Method.invoke(Method.java:483)
at

org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)

at

org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)

at
org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at
org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at
org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at

org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)

at

org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)

at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at

org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)

at

org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)

at
org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
at

org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)

at
org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
at

org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)

at

org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)

at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
at

org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)

at

org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)

at

org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)

at

org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)

at

org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)

at

scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)

at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at

sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)

at

sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

at java.lang.reflect.Method.invoke(Method.java:483)
at

org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)

at
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at
org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at
org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/09/27 09:45:55 WARN component.AbstractLifeCycle: FAILED
org.eclipse.jetty.server.Server@22686ddb: java.net.BindException:
Address
already in use
java.net.BindException: Address already in use
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:414)
at sun.nio.ch.Net.bind(Net.java:406)
at

sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)

at
sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at

org.eclipse.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)

at

org.eclipse.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)

at

org.eclipse.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)

at

org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)

at org.eclipse.jetty.server.Server.doStart(Server.java:293)
at

org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)

at

org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:240)

at
org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:250)
at
org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:250)
at

org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1912)

at
scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
at
org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1903)
at
org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:250)
at org.apache.spark.ui.WebUI.bind(WebUI.scala:136)
at
org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:465)
at
org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:465)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.SparkContext.(SparkContext.scala:465)
at

org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)

at $line3.$read$$iwC$$iwC.(:9)
at $line3.$read$$iwC.(:18)
at $line3.$read.(:20)
at $line3.$read$.(:24)
at $line3.$read$.()
at $line3.$eval$.(:7)
at $line3.$eval$.()
at $line3.$eval.$print()
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at

sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)

at

sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

at java.lang.reflect.Method.invoke(Method.java:483)
at

org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)

at

org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)

at
org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at
org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at
org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at

org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)

at

org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)

at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at

org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)

at

org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)

at
org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
at

org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)

at
org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
at

org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)

at

org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)

at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
at

org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)

at

org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)

at

org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)

at

org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)

at

org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)

at

scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)

at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at

sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)

at

sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

at java.lang.reflect.Method.invoke(Method.java:483)
at

org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)

at
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at
org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at
org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/09/27 09:45:55 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/stages/stage/kill,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/api,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/static,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/executors/threadDump/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/executors/threadDump,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/executors/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/executors,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/environment/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/environment,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/storage/rdd/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/storage/rdd,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/storage/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/storage,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/stages/pool/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/stages/pool,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/stages/stage/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/stages/stage,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/stages/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/stages,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/jobs/job/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/jobs/job,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/jobs/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped
o.e.j.s.ServletContextHandler{/jobs,null}
15/09/27 09:45:56 WARN util.Utils: Service 'SparkUI' could not
bind
on port 4040. Attempting port 4041.
15/09/27 09:45:56 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/09/27 09:45:56 INFO server.AbstractConnector: Started
SelectChannelConnector@0.0.0.0:4041
15/09/27 09:45:56 INFO util.Utils: Successfully started service
'SparkUI' on port 4041.
15/09/27 09:45:56 INFO ui.SparkUI: Started SparkUI at
http://10.0.1.5:4041
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/commons-cli/commons-cli/1.2/commons-cli-1.2.jar
at http://10.0.1.5:60838/jars/commons-cli-1.2.jar with timestamp 1443372356368

15/09/27 09:45:56 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/commons-httpclient/commons-httpclient/3.1/commons-httpclient-3.1.jar
at http://10.0.1.5:60838/jars/commons-httpclient-3.1.jar with timestamp 1443372356374

15/09/27 09:45:56 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/commons-codec/commons-codec/1.4/commons-codec-1.4.jar
at http://10.0.1.5:60838/jars/commons-codec-1.4.jar with timestamp 1443372356376

15/09/27 09:45:56 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/commons-logging/commons-logging/1.1.1/commons-logging-1.1.1.jar
at http://10.0.1.5:60838/jars/commons-logging-1.1.1.jar with timestamp 1443372356381

15/09/27 09:45:56 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar
at http://10.0.1.5:60838/jars/commons-compress-1.4.1.jar with timestamp 1443372356403

15/09/27 09:45:56 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/slf4j/slf4j-api/1.7.10/slf4j-api-1.7.10.jar
at http://10.0.1.5:60838/jars/slf4j-api-1.7.10.jar with timestamp 1443372356436

15/09/27 09:45:56 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/log4j/log4j/1.2.17/log4j-1.2.17.jar
at http://10.0.1.5:60838/jars/log4j-1.2.17.jar <
http://10.0.1.5:60838/jars/log4j-1.2.17.jar> with timestamp
1443372356443

15/09/27 09:45:56 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/xerial/snappy/snappy-java/
1.1.1.7/snappy-java-1.1.1.7.jar at
http://10.0.1.5:60838/jars/snappy-java-1.1.1.7.jar <
http://10.0.1.5:60838/jars/snappy-java-1.1.1.7.jar> with timestamp
1443372356501

15/09/27 09:45:56 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/thoughtworks/paranamer/paranamer/2.6/paranamer-2.6.jar
at http://10.0.1.5:60838/jars/paranamer-2.6.jar <
http://10.0.1.5:60838/jars/paranamer-2.6.jar> with timestamp
1443372356512

15/09/27 09:45:56 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/utils/utils-io_2.10/0.2.3/utils-io_2.10-0.2.3.jar
at http://10.0.1.5:60838/jars/utils-io_2.10-0.2.3.jar <
http://10.0.1.5:60838/jars/utils-io_2.10-0.2.3.jar> with timestamp
1443372356519

15/09/27 09:45:56 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/utils/utils-misc_2.10/0.2.3/utils-misc_2.10-0.2.3.jar
at http://10.0.1.5:60838/jars/utils-misc_2.10-0.2.3.jar <
http://10.0.1.5:60838/jars/utils-misc_2.10-0.2.3.jar> with timestamp
1443372356521

15/09/27 09:45:56 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/httpcomponents/httpclient/4.3.2/httpclient-4.3.2.jar
at http://10.0.1.5:60838/jars/httpclient-4.3.2.jar <
http://10.0.1.5:60838/jars/httpclient-4.3.2.jar> with timestamp
1443372356574

15/09/27 09:45:56 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/httpcomponents/httpcore/4.3.1/httpcore-4.3.1.jar
at http://10.0.1.5:60838/jars/httpcore-4.3.1.jar <
http://10.0.1.5:60838/jars/httpcore-4.3.1.jar> with timestamp
1443372356601

15/09/27 09:45:56 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/utils/utils-cli_2.10/0.2.3/utils-cli_2.10-0.2.3.jar
at http://10.0.1.5:60838/jars/utils-cli_2.10-0.2.3.jar <
http://10.0.1.5:60838/jars/utils-cli_2.10-0.2.3.jar> with timestamp
1443372356623

15/09/27 09:45:56 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-avro/1.8.1/parquet-avro-1.8.1.jar
at http://10.0.1.5:60838/jars/parquet-avro-1.8.1.jar <
http://10.0.1.5:60838/jars/parquet-avro-1.8.1.jar> with timestamp
1443372356655

15/09/27 09:45:56 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-column/1.8.1/parquet-column-1.8.1.jar
at http://10.0.1.5:60838/jars/parquet-column-1.8.1.jar <
http://10.0.1.5:60838/jars/parquet-column-1.8.1.jar> with timestamp
1443372356687

15/09/27 09:45:56 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-common/1.8.1/parquet-common-1.8.1.jar
at http://10.0.1.5:60838/jars/parquet-common-1.8.1.jar <
http://10.0.1.5:60838/jars/parquet-common-1.8.1.jar> with timestamp
1443372356695

15/09/27 09:45:56 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-encoding/1.8.1/parquet-encoding-1.8.1.jar
at http://10.0.1.5:60838/jars/parquet-encoding-1.8.1.jar <
http://10.0.1.5:60838/jars/parquet-encoding-1.8.1.jar> with timestamp
1443372356702

15/09/27 09:45:56 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-hadoop/1.8.1/parquet-hadoop-1.8.1.jar
at http://10.0.1.5:60838/jars/parquet-hadoop-1.8.1.jar <
http://10.0.1.5:60838/jars/parquet-hadoop-1.8.1.jar> with timestamp
1443372356708

15/09/27 09:45:56 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-jackson/1.8.1/parquet-jackson-1.8.1.jar
at http://10.0.1.5:60838/jars/parquet-jackson-1.8.1.jar <
http://10.0.1.5:60838/jars/parquet-jackson-1.8.1.jar> with timestamp
1443372356722

15/09/27 09:45:56 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-format/2.3.0-incubating/parquet-format-2.3.0-incubating.jar
at http://10.0.1.5:60838/jars/parquet-format-2.3.0-incubating.jar <
http://10.0.1.5:60838/jars/parquet-format-2.3.0-incubating.jar> with
timestamp 1443372356747

15/09/27 09:45:56 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/utils/utils-metrics_2.10/0.2.3/utils-metrics_2.10-0.2.3.jar
at http://10.0.1.5:60838/jars/utils-metrics_2.10-0.2.3.jar <
http://10.0.1.5:60838/jars/utils-metrics_2.10-0.2.3.jar> with
timestamp
1443372356755

15/09/27 09:45:56 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/netflix/servo/servo-core/0.5.5/servo-core-0.5.5.jar
at http://10.0.1.5:60838/jars/servo-core-0.5.5.jar <
http://10.0.1.5:60838/jars/servo-core-0.5.5.jar> with timestamp
1443372356802

15/09/27 09:45:56 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/google/code/findbugs/annotations/2.0.0/annotations-2.0.0.jar
at http://10.0.1.5:60838/jars/annotations-2.0.0.jar <
http://10.0.1.5:60838/jars/annotations-2.0.0.jar> with timestamp
1443372356808

15/09/27 09:45:56 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/scoverage/scalac-scoverage-plugin_2.10/0.99.2/scalac-scoverage-plugin_2.10-0.99.2.jar
at http://10.0.1.5:60838/jars/scalac-scoverage-plugin_2.10-0.99.2.jar
<
http://10.0.1.5:60838/jars/scalac-scoverage-plugin_2.10-0.99.2.jar>
with
timestamp 1443372356831

15/09/27 09:45:56 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/commons-io/commons-io/1.3.2/commons-io-1.3.2.jar
at http://10.0.1.5:60838/jars/commons-io-1.3.2.jar <
http://10.0.1.5:60838/jars/commons-io-1.3.2.jar> with timestamp
1443372356867

15/09/27 09:45:56 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/bdg-formats/bdg-formats/0.4.0/bdg-formats-0.4.0.jar
at http://10.0.1.5:60838/jars/bdg-formats-0.4.0.jar <
http://10.0.1.5:60838/jars/bdg-formats-0.4.0.jar> with timestamp
1443372356869

15/09/27 09:45:56 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/avro/avro/1.7.6/avro-1.7.6.jar
at http://10.0.1.5:60838/jars/avro-1.7.6.jar <
http://10.0.1.5:60838/jars/avro-1.7.6.jar> with timestamp
1443372356882

15/09/27 09:45:56 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/codehaus/jackson/jackson-core-asl/1.9.13/jackson-core-asl-1.9.13.jar
at http://10.0.1.5:60838/jars/jackson-core-asl-1.9.13.jar <
http://10.0.1.5:60838/jars/jackson-core-asl-1.9.13.jar> with timestamp
1443372356886

15/09/27 09:45:56 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/codehaus/jackson/jackson-mapper-asl/1.9.13/jackson-mapper-asl-1.9.13.jar
at http://10.0.1.5:60838/jars/jackson-mapper-asl-1.9.13.jar <
http://10.0.1.5:60838/jars/jackson-mapper-asl-1.9.13.jar> with
timestamp
1443372356915

15/09/27 09:45:56 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/adam/adam-core_2.10/0.17.2-SNAPSHOT/adam-core_2.10-0.17.2-SNAPSHOT.jar
at http://10.0.1.5:60838/jars/adam-core_2.10-0.17.2-SNAPSHOT.jar <
http://10.0.1.5:60838/jars/adam-core_2.10-0.17.2-SNAPSHOT.jar> with
timestamp 1443372356953

15/09/27 09:45:56 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/esotericsoftware/kryo/kryo/2.21/kryo-2.21.jar
at http://10.0.1.5:60838/jars/kryo-2.21.jar <
http://10.0.1.5:60838/jars/kryo-2.21.jar> with timestamp 1443372356995

15/09/27 09:45:57 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/esotericsoftware/reflectasm/reflectasm/1.07/reflectasm-1.07-shaded.jar
at http://10.0.1.5:60838/jars/reflectasm-1.07-shaded.jar <
http://10.0.1.5:60838/jars/reflectasm-1.07-shaded.jar> with timestamp
1443372357147

15/09/27 09:45:57 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/ow2/asm/asm/4.0/asm-4.0.jar
at http://10.0.1.5:60838/jars/asm-4.0.jar <
http://10.0.1.5:60838/jars/asm-4.0.jar> with timestamp 1443372357152

15/09/27 09:45:57 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/esotericsoftware/minlog/minlog/1.2/minlog-1.2.jar
at http://10.0.1.5:60838/jars/minlog-1.2.jar <
http://10.0.1.5:60838/jars/minlog-1.2.jar> with timestamp
1443372357153

15/09/27 09:45:57 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/objenesis/objenesis/1.2/objenesis-1.2.jar
at http://10.0.1.5:60838/jars/objenesis-1.2.jar <
http://10.0.1.5:60838/jars/objenesis-1.2.jar> with timestamp
1443372357155

15/09/27 09:45:57 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/it/unimi/dsi/fastutil/6.4.4/fastutil-6.4.4.jar
at http://10.0.1.5:60838/jars/fastutil-6.4.4.jar <
http://10.0.1.5:60838/jars/fastutil-6.4.4.jar> with timestamp
1443372357424

15/09/27 09:45:57 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-scala_2.10/1.8.1/parquet-scala_2.10-1.8.1.jar
at http://10.0.1.5:60838/jars/parquet-scala_2.10-1.8.1.jar <
http://10.0.1.5:60838/jars/parquet-scala_2.10-1.8.1.jar> with
timestamp
1443372357485

15/09/27 09:45:57 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/seqdoop/hadoop-bam/7.0.0/hadoop-bam-7.0.0.jar
at http://10.0.1.5:60838/jars/hadoop-bam-7.0.0.jar <
http://10.0.1.5:60838/jars/hadoop-bam-7.0.0.jar> with timestamp
1443372357534

15/09/27 09:45:57 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/github/samtools/htsjdk/1.133/htsjdk-1.133.jar
at http://10.0.1.5:60838/jars/htsjdk-1.133.jar <
http://10.0.1.5:60838/jars/htsjdk-1.133.jar> with timestamp
1443372357561

15/09/27 09:45:57 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/commons/commons-jexl/2.1.1/commons-jexl-2.1.1.jar
at http://10.0.1.5:60838/jars/commons-jexl-2.1.1.jar <
http://10.0.1.5:60838/jars/commons-jexl-2.1.1.jar> with timestamp
1443372357576

15/09/27 09:45:57 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/tukaani/xz/1.5/xz-1.5.jar
at http://10.0.1.5:60838/jars/xz-1.5.jar <
http://10.0.1.5:60838/jars/xz-1.5.jar> with timestamp 1443372357581

15/09/27 09:45:57 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/ant/ant/1.8.2/ant-1.8.2.jar
at http://10.0.1.5:60838/jars/ant-1.8.2.jar <
http://10.0.1.5:60838/jars/ant-1.8.2.jar> with timestamp 1443372357612

15/09/27 09:45:57 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/ant/ant-launcher/1.8.2/ant-launcher-1.8.2.jar
at http://10.0.1.5:60838/jars/ant-launcher-1.8.2.jar <
http://10.0.1.5:60838/jars/ant-launcher-1.8.2.jar> with timestamp
1443372357622

15/09/27 09:45:57 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/testng/testng/6.8.8/testng-6.8.8.jar
at http://10.0.1.5:60838/jars/testng-6.8.8.jar <
http://10.0.1.5:60838/jars/testng-6.8.8.jar> with timestamp
1443372357639

15/09/27 09:45:57 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/beanshell/bsh/2.0b4/bsh-2.0b4.jar
at http://10.0.1.5:60838/jars/bsh-2.0b4.jar <
http://10.0.1.5:60838/jars/bsh-2.0b4.jar> with timestamp 1443372357685

15/09/27 09:45:57 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/beust/jcommander/1.27/jcommander-1.27.jar
at http://10.0.1.5:60838/jars/jcommander-1.27.jar <
http://10.0.1.5:60838/jars/jcommander-1.27.jar> with timestamp
1443372357692

15/09/27 09:45:57 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/google/guava/guava/14.0.1/guava-14.0.1.jar
at http://10.0.1.5:60838/jars/guava-14.0.1.jar <
http://10.0.1.5:60838/jars/guava-14.0.1.jar> with timestamp
1443372357744

15/09/27 09:45:57 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/adam/adam-apis_2.10/0.17.2-SNAPSHOT/adam-apis_2.10-0.17.2-SNAPSHOT.jar
at http://10.0.1.5:60838/jars/adam-apis_2.10-0.17.2-SNAPSHOT.jar <
http://10.0.1.5:60838/jars/adam-apis_2.10-0.17.2-SNAPSHOT.jar> with
timestamp 1443372357758

15/09/27 09:45:57 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/scala-lang/scala-library/2.10.4/scala-library-2.10.4.jar
at http://10.0.1.5:60838/jars/scala-library-2.10.4.jar <
http://10.0.1.5:60838/jars/scala-library-2.10.4.jar> with timestamp
1443372357947

15/09/27 09:45:57 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/slf4j/slf4j-log4j12/1.7.5/slf4j-log4j12-1.7.5.jar
at http://10.0.1.5:60838/jars/slf4j-log4j12-1.7.5.jar <
http://10.0.1.5:60838/jars/slf4j-log4j12-1.7.5.jar> with timestamp
1443372357980

15/09/27 09:45:58 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/args4j/args4j/2.0.23/args4j-2.0.23.jar
at http://10.0.1.5:60838/jars/args4j-2.0.23.jar <
http://10.0.1.5:60838/jars/args4j-2.0.23.jar> with timestamp
1443372358026

15/09/27 09:45:58 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/adam/adam-cli_2.10/0.17.2-SNAPSHOT/adam-cli_2.10-0.17.2-SNAPSHOT.jar
at http://10.0.1.5:60838/jars/adam-cli_2.10-0.17.2-SNAPSHOT.jar <
http://10.0.1.5:60838/jars/adam-cli_2.10-0.17.2-SNAPSHOT.jar> with
timestamp 1443372358030

15/09/27 09:45:58 WARN metrics.MetricsSystem: Using default name DAGScheduler for source because spark.app.id is not set.
15/09/27 09:45:58 INFO executor.Executor: Starting executor ID driver on host localhost
15/09/27 09:45:58 INFO executor.Executor: Using REPL class URI: http://10.0.1.5:60826
15/09/27 09:45:59 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 60842.
15/09/27 09:45:59 INFO netty.NettyBlockTransferService: Server created on 60842
15/09/27 09:45:59 INFO storage.BlockManagerMaster: Trying to register BlockManager
15/09/27 09:45:59 INFO storage.BlockManagerMasterEndpoint: Registering block manager localhost:60842 with 530.0 MB RAM, BlockManagerId(driver, localhost, 60842)
15/09/27 09:45:59 INFO storage.BlockManagerMaster: Registered BlockManager
15/09/27 09:46:00 INFO repl.SparkILoop: Created spark context..
Spark context available as sc.
15/09/27 09:46:04 INFO repl.SparkILoop: Created sql context..
SQL context available as sqlContext.

scala> Stopping spark context.
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/static/sql,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/SQL/execution/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/SQL/execution,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/SQL/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/SQL,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/metrics/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage/kill,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/api,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/static,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/threadDump/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/threadDump,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/environment/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/environment,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/rdd/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/rdd,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/pool/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/pool,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs/job/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs/job,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs,null}
15/09/27 09:46:54 INFO ui.SparkUI: Stopped Spark web UI at http://10.0.1.5:4041
15/09/27 09:46:54 INFO scheduler.DAGScheduler: Stopping DAGScheduler
15/09/27 09:46:54 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
15/09/27 09:46:54 INFO storage.MemoryStore: MemoryStore cleared
15/09/27 09:46:54 INFO storage.BlockManager: BlockManager stopped
15/09/27 09:46:54 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
15/09/27 09:46:54 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
15/09/27 09:46:54 INFO spark.SparkContext: Successfully stopped SparkContext
15/09/27 09:46:54 INFO util.ShutdownHookManager: Shutdown hook called
15/09/27 09:46:54 INFO remote.RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
15/09/27 09:46:54 INFO util.ShutdownHookManager: Deleting directory /private/var/folders/nj/nphdkhyj6s1dttb0pd9zb2wc0000gn/T/spark-94dfb979-530a-4b9d-8109-2dec5a277611
15/09/27 09:46:54 INFO remote.RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
15/09/27 09:46:54 INFO util.ShutdownHookManager: Deleting directory /private/var/folders/nj/nphdkhyj6s1dttb0pd9zb2wc0000gn/T/spark-c735407c-acd9-4be7-ac3a-2de4e82af5e4


@ryan-williams
Member

Hey David, that's weird, I've not seen that. Can you jump into the Spree
gitter room https://gitter.im/hammerlab/spree so we can discuss further?

On Mon, Sep 28, 2015 at 10:34 AM dbl notifications@github.com wrote:

Did you change the ‘adam-shell’ to include the Spree requirements?
E.g.

If using Spark ≥ 1.5.0, simply pass the following flags to
spark-{shell,submit}:

--packages org.hammerlab:spark-json-relay:2.0.0
--conf spark.extraListeners=org.apache.spark.JsonRelay
Or

Include JsonRelay on the driver's classpath

--driver-class-path /path/to/json-relay-2.0.0.jar

Register your JsonRelay as a SparkListener

--conf spark.extraListeners=org.apache.spark.JsonRelay

Point it at your slim instance; default: localhost:8123

--conf spark.slim.host=…
--conf spark.slim.port=…
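
Put together, the Spree setup quoted above amounts to a single launch command. A hedged sketch, assuming Spark ≥ 1.5.0, a slim instance on the default localhost:8123, and that extra arguments are passed straight through to spark-shell (if adam-shell does not forward them, the same flags work on spark-shell directly):

```shell
# Sketch: enabling Spree's JsonRelay listener at launch.
# The spark.slim.host/spark.slim.port lines are only needed when slim
# is not running on the default localhost:8123.
spark-shell \
  --packages org.hammerlab:spark-json-relay:2.0.0 \
  --conf spark.extraListeners=org.apache.spark.JsonRelay \
  --conf spark.slim.host=localhost \
  --conf spark.slim.port=8123
```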

Any ideas about this error with slim:
$ sudo npm install -g slim.js
Password:

kerberos@0.0.14 install
/opt/local/lib/node_modules/slim.js/node_modules/mongodb/node_modules/mongodb-core/node_modules/kerberos
(node-gyp rebuild) || (exit 0)

CXX(target) Release/obj.target/kerberos/lib/kerberos.o
CXX(target) Release/obj.target/kerberos/lib/worker.o
CC(target) Release/obj.target/kerberos/lib/kerberosgss.o
CC(target) Release/obj.target/kerberos/lib/base64.o
CXX(target) Release/obj.target/kerberos/lib/kerberos_context.o
SOLINK_MODULE(target) Release/kerberos.node
SOLINK_MODULE(target) Release/kerberos.node: Finished
/opt/local/bin/slim -> /opt/local/lib/node_modules/slim.js/slim
slim.js@1.2.1 /opt/local/lib/node_modules/slim.js
├── deep-equal@1.0.0
├── line-reader@0.2.4
├── minimist@1.1.1
├── async@1.3.0
├── mkdirp@0.5.1 (minimist@0.0.8)
├── node.extend@1.1.5 (is@3.1.0)
├── shelljs@0.5.1
├── moment@2.10.3
├── tracer@0.7.4 (tinytim@0.1.1, colors@1.0.3, dateformat@1.0.11)
├── oboe@2.1.2 (http-https@1.0.0)
└── mongodb@2.0.42 (es6-promise@2.1.1, readable-stream@1.0.31,
mongodb-core@1.2.10)
David-Laxers-MacBook-Pro:spree davidlaxer$ slim
module.js:338
throw err;
^
Error: Cannot find module '/Users/davidlaxer/spree/slim.js'
at Function.Module._resolveFilename (module.js:336:15)
at Function.Module._load (module.js:278:25)
at Function.Module.runMain (module.js:501:10)
at startup (node.js:129:16)
at node.js:814:3

On Sep 27, 2015, at 9:12 PM, Ryan Williams notifications@github.com wrote:

I have not, though I did check out spark-notebook recently and have used
it successfully to run ADAM things.

I am also meaning to check out Jupyter Scala. I heard mixed reviews about
Zeppelin, so I have planned to investigate the other two first.

On Sun, Sep 27, 2015 at 11:14 PM dbl notifications@github.com wrote:

Thanks! I’ll check out Spree.

Just curious - have you ever tried:

  1. Jupyter Scala
  2. Apache Zeppelin (incubating)

to run Adam, instead of adam-shell?

On Sep 27, 2015, at 7:55 PM, Ryan Williams notifications@github.com wrote:

If you're referring to the "BindException"s, that's Spark's UI trying to
bind to any free port starting from 4040 and incrementing by 1. You can
ignore those, for most intents and purposes.

Ironically, I was just running >16 Spark apps in yarn-client mode on the
same machine just now and discovered that if Spark tries 16 ports and they
are all unavailable, it fails the app before it even starts. I'm using
Spree (https://github.com/hammerlab/spree) instead of Spark's web UI for
these apps, so I just disabled the web UI altogether with "--conf
spark.ui.enabled=false".

But in general, you can ignore them; they're definitely annoying and Spark
should probably do something more graceful here, but that's another
discussion.
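
The two workarounds described here can be written out as launch flags. A hedged sketch (spark.ui.enabled, spark.ui.port, and spark.port.maxRetries are standard Spark configuration keys; the 16-port limit corresponds to spark.port.maxRetries' default):

```shell
# Sketch: avoiding the UI BindException when many shells share a machine.
# Option 1: skip the web UI entirely (e.g. when Spree replaces it):
spark-shell --conf spark.ui.enabled=false

# Option 2: keep the UI but let Spark probe more ports beyond 4040:
spark-shell --conf spark.ui.port=4040 --conf spark.port.maxRetries=64
```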

On Sun, Sep 27, 2015 at 10:41 PM dbl notifications@github.com wrote:

Still no solution.

Do you know why adam-shell gets these Java errors:

r@0.0.0.0:4040: java.net.BindException: Address already in use
java.net.BindException: Address already in use
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:414)
at sun.nio.ch.Net.bind(Net.java:406)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at org.spark-project.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
at org.spark-project.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
at org.spark-project.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.spark-project.jetty.server.Server.doStart(Server.java:293)
at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:236)
at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:246)
at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:246)
at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1913)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1904)
at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:246)
at org.apache.spark.ui.WebUI.bind(WebUI.scala:136)
at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:474)
at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:474)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:474)
at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
at $line3.$read$$iwC$$iwC.<init>(<console>:9)
at $line3.$read$$iwC.<init>(<console>:18)
at $line3.$read.<init>(<console>:20)
at $line3.$read$.<init>(<console>:24)
at $line3.$read$.<clinit>(<console>)
at $line3.$eval$.<init>(<console>:7)
at $line3.$eval$.<clinit>(<console>)
at $line3.$eval.$print()
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/09/27 19:39:33 WARN component.AbstractLifeCycle: FAILED org.spark-project.jetty.server.Server@7d64a960: java.net.BindException: Address already in use
java.net.BindException: Address already in use
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:414)
at sun.nio.ch.Net.bind(Net.java:406)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at org.spark-project.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
at org.spark-project.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
at org.spark-project.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.spark-project.jetty.server.Server.doStart(Server.java:293)
at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:236)
at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:246)
at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:246)
at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1913)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1904)
at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:246)
at org.apache.spark.ui.WebUI.bind(WebUI.scala:136)
at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:474)
at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:474)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:474)
at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
at $line3.$read$$iwC$$iwC.<init>(<console>:9)
at $line3.$read$$iwC.<init>(<console>:18)
at $line3.$read.<init>(<console>:20)
at $line3.$read$.<init>(<console>:24)
at $line3.$read$.<clinit>(<console>)
at $line3.$eval$.<init>(<console>:7)
at $line3.$eval$.<clinit>(<console>)
at $line3.$eval.$print()
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)

at

sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

at java.lang.reflect.Method.invoke(Method.java:483)
at

org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)

at
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at
org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/09/27 19:39:33 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
15/09/27 19:39:33 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/api,null}
15/09/27 19:39:33 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHan

On Sep 27, 2015, at 11:03 AM, David Laxer <davidl@softintel.com> wrote:

It's coming from Spark. I'm investigating ...

http://stackoverflow.com/questions/32800018/hadoop-2-6-1-warning-warn-util-nativecodeloader/32800241?noredirect=1#comment53458542_32800241

On Sep 27, 2015, at 10:15 AM, Ryan Williams <notifications@github.com> wrote:

FWIW I think I see that warning every time I run an adam-shell; I don't think it hurts anything, though any leads on making it go away are welcome.
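One lead, if you already have a native-hadoop build like the one described at the top of the thread: put those libraries on the driver JVM's `java.library.path` before launching. This is a sketch, not a verified fix; the path below is the one from the original report (adjust it to your build), and it relies on `bin/adam-shell` delegating to `spark-shell`, which passes `SPARK_SUBMIT_OPTS` to the driver JVM.

```shell
# Assumed path, taken from the original report -- point this at your own
# native-hadoop build directory (the one containing libhadoop).
HADOOP_NATIVE=/Users/davidlaxer/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT/lib/native

# spark-shell forwards SPARK_SUBMIT_OPTS to the driver JVM, which is
# where NativeCodeLoader looks for the native library.
export SPARK_SUBMIT_OPTS="-Djava.library.path=$HADOOP_NATIVE"

echo "$SPARK_SUBMIT_OPTS"   # sanity-check, then run bin/adam-shell as usual
```

If the warning persists, it only means Spark falls back to the builtin-java classes; functionality is unaffected.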

On Sun, Sep 27, 2015 at 12:50 PM dbl <notifications@github.com> wrote:

15/09/27 09:45:29 INFO util.Utils: Successfully started service 'HTTP class server' on port 60826.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.5.0-SNAPSHOT
      /_/

Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_05)
Type in expressions to have them evaluated.
Type :help for more information.
15/09/27 09:45:50 INFO spark.SparkContext: Running Spark version 1.5.0-SNAPSHOT
15/09/27 09:45:50 INFO spark.SecurityManager: Changing view acls to: davidlaxer
15/09/27 09:45:50 INFO spark.SecurityManager: Changing modify acls to: davidlaxer
15/09/27 09:45:50 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(davidlaxer); users with modify permissions: Set(davidlaxer)
15/09/27 09:45:52 INFO slf4j.Slf4jLogger: Slf4jLogger started
15/09/27 09:45:53 INFO Remoting: Starting remoting
15/09/27 09:45:54 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@10.0.1.5:60837]
15/09/27 09:45:54 INFO util.Utils: Successfully started service 'sparkDriver' on port 60837.
15/09/27 09:45:54 INFO spark.SparkEnv: Registering MapOutputTracker
15/09/27 09:45:54 INFO spark.SparkEnv: Registering BlockManagerMaster
15/09/27 09:45:54 INFO storage.DiskBlockManager: Created local directory at /private/var/folders/nj/nphdkhyj6s1dttb0pd9zb2wc0000gn/T/blockmgr-57acd6fd-754f-4d86-9226-bbcead030b70
15/09/27 09:45:54 INFO storage.MemoryStore: MemoryStore started with capacity 530.0 MB
15/09/27 09:45:54 INFO spark.HttpFileServer: HTTP File server directory is /private/var/folders/nj/nphdkhyj6s1dttb0pd9zb2wc0000gn/T/spark-c735407c-acd9-4be7-ac3a-2de4e82af5e4/httpd-72e12a03-5d15-41cc-a524-c74725d6ec0f
15/09/27 09:45:54 INFO spark.HttpServer: Starting HTTP Server
15/09/27 09:45:54 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/09/27 09:45:54 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:60838
15/09/27 09:45:54 INFO util.Utils: Successfully started service 'HTTP file server' on port 60838.
15/09/27 09:45:54 INFO spark.SparkEnv: Registering OutputCommitCoordinator
15/09/27 09:45:55 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/09/27 09:45:55 WARN component.AbstractLifeCycle: FAILED SelectChannelConnector@0.0.0.0:4040: java.net.BindException: Address already in use
java.net.BindException: Address already in use
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:414)
at sun.nio.ch.Net.bind(Net.java:406)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at org.eclipse.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
at org.eclipse.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
at org.eclipse.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.eclipse.jetty.server.Server.doStart(Server.java:293)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:240)
at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:250)
at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:250)
at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1912)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1903)
at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:250)
at org.apache.spark.ui.WebUI.bind(WebUI.scala:136)
at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:465)
at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:465)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:465)
at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
at $line3.$read$$iwC$$iwC.<init>(<console>:9)
at $line3.$read$$iwC.<init>(<console>:18)
at $line3.$read.<init>(<console>:20)
at $line3.$read$.<init>(<console>:24)
at $line3.$read$.<clinit>(<console>)
at $line3.$eval$.<init>(<console>:7)
at $line3.$eval$.<clinit>(<console>)
at $line3.$eval.$print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/09/27 09:45:55 WARN component.AbstractLifeCycle: FAILED org.eclipse.jetty.server.Server@22686ddb: java.net.BindException: Address already in use
java.net.BindException: Address already in use
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:414)
at sun.nio.ch.Net.bind(Net.java:406)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at org.eclipse.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
at org.eclipse.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
at org.eclipse.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.eclipse.jetty.server.Server.doStart(Server.java:293)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:240)
at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:250)
at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:250)
at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1912)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1903)
at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:250)
at org.apache.spark.ui.WebUI.bind(WebUI.scala:136)
at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:465)
at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:465)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:465)
at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
at $line3.$read$$iwC$$iwC.<init>(<console>:9)
at $line3.$read$$iwC.<init>(<console>:18)
at $line3.$read.<init>(<console>:20)
at $line3.$read$.<init>(<console>:24)
at $line3.$read$.<clinit>(<console>)
at $line3.$eval$.<init>(<console>:7)
at $line3.$eval$.<clinit>(<console>)
at $line3.$eval.$print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage/kill,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/api,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/static,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/threadDump/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/threadDump,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/environment/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/environment,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/rdd/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/rdd,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/pool/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/pool,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs/job/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs/job,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs,null}
15/09/27 09:45:56 WARN util.Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
15/09/27 09:45:56 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/09/27 09:45:56 INFO server.AbstractConnector: Started SelectChannelConnector@0.0.0.0:4041
15/09/27 09:45:56 INFO util.Utils: Successfully started service 'SparkUI' on port 4041.
15/09/27 09:45:56 INFO ui.SparkUI: Started SparkUI at http://10.0.1.5:4041
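Note that the BindException traces above are unrelated to the native-library warning: something (most likely an earlier spark-shell still running) already held port 4040, and Spark simply retried on 4041. A quick way to check, sketched below; the port in the commented `--conf` line is just an example value.

```shell
# See whether anything is still listening on Spark's default UI port.
# (lsof ships with OS X; the || branch covers the port-free case.)
lsof -nP -iTCP:4040 -sTCP:LISTEN 2>/dev/null || echo "port 4040 is free"

# Alternatively, pin the UI to an explicit port via standard Spark conf:
# bin/adam-shell --conf spark.ui.port=4050
```

Either killing the stale shell or pinning `spark.ui.port` avoids the noisy fallback traces; they are otherwise harmless.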
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/commons-cli/commons-cli/1.2/commons-cli-1.2.jar at http://10.0.1.5:60838/jars/commons-cli-1.2.jar with timestamp 1443372356368
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/commons-httpclient/commons-httpclient/3.1/commons-httpclient-3.1.jar at http://10.0.1.5:60838/jars/commons-httpclient-3.1.jar with timestamp 1443372356374
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/commons-codec/commons-codec/1.4/commons-codec-1.4.jar at http://10.0.1.5:60838/jars/commons-codec-1.4.jar with timestamp 1443372356376
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/commons-logging/commons-logging/1.1.1/commons-logging-1.1.1.jar at http://10.0.1.5:60838/jars/commons-logging-1.1.1.jar with timestamp 1443372356381
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar at http://10.0.1.5:60838/jars/commons-compress-1.4.1.jar with timestamp 1443372356403
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/slf4j/slf4j-api/1.7.10/slf4j-api-1.7.10.jar at http://10.0.1.5:60838/jars/slf4j-api-1.7.10.jar with timestamp 1443372356436
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/log4j/log4j/1.2.17/log4j-1.2.17.jar at http://10.0.1.5:60838/jars/log4j-1.2.17.jar with timestamp 1443372356443
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/xerial/snappy/snappy-java/1.1.1.7/snappy-java-1.1.1.7.jar at http://10.0.1.5:60838/jars/snappy-java-1.1.1.7.jar with timestamp 1443372356501
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/thoughtworks/paranamer/paranamer/2.6/paranamer-2.6.jar at http://10.0.1.5:60838/jars/paranamer-2.6.jar with timestamp 1443372356512
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/utils/utils-io_2.10/0.2.3/utils-io_2.10-0.2.3.jar at http://10.0.1.5:60838/jars/utils-io_2.10-0.2.3.jar with timestamp 1443372356519
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/utils/utils-misc_2.10/0.2.3/utils-misc_2.10-0.2.3.jar at http://10.0.1.5:60838/jars/utils-misc_2.10-0.2.3.jar with timestamp 1443372356521
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/httpcomponents/httpclient/4.3.2/httpclient-4.3.2.jar at http://10.0.1.5:60838/jars/httpclient-4.3.2.jar with timestamp 1443372356574
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/httpcomponents/httpcore/4.3.1/httpcore-4.3.1.jar at http://10.0.1.5:60838/jars/httpcore-4.3.1.jar with timestamp 1443372356601
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/utils/utils-cli_2.10/0.2.3/utils-cli_2.10-0.2.3.jar at http://10.0.1.5:60838/jars/utils-cli_2.10-0.2.3.jar with timestamp 1443372356623
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-avro/1.8.1/parquet-avro-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-avro-1.8.1.jar with timestamp 1443372356655
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-column/1.8.1/parquet-column-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-column-1.8.1.jar with timestamp 1443372356687
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-common/1.8.1/parquet-common-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-common-1.8.1.jar with timestamp 1443372356695
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-encoding/1.8.1/parquet-encoding-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-encoding-1.8.1.jar with timestamp 1443372356702
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-hadoop/1.8.1/parquet-hadoop-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-hadoop-1.8.1.jar with timestamp 1443372356708
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-jackson/1.8.1/parquet-jackson-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-jackson-1.8.1.jar with timestamp 1443372356722
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-format/2.3.0-incubating/parquet-format-2.3.0-incubating.jar at http://10.0.1.5:60838/jars/parquet-format-2.3.0-incubating.jar with timestamp 1443372356747
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/utils/utils-metrics_2.10/0.2.3/utils-metrics_2.10-0.2.3.jar at http://10.0.1.5:60838/jars/utils-metrics_2.10-0.2.3.jar with timestamp 1443372356755
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/netflix/servo/servo-core/0.5.5/servo-core-0.5.5.jar at http://10.0.1.5:60838/jars/servo-core-0.5.5.jar with timestamp 1443372356802
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/google/code/findbugs/annotations/2.0.0/annotations-2.0.0.jar at http://10.0.1.5:60838/jars/annotations-2.0.0.jar with timestamp 1443372356808
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/scoverage/scalac-scoverage-plugin_2.10/0.99.2/scalac-scoverage-plugin_2.10-0.99.2.jar at http://10.0.1.5:60838/jars/scalac-scoverage-plugin_2.10-0.99.2.jar with timestamp 1443372356831
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/commons-io/commons-io/1.3.2/commons-io-1.3.2.jar at http://10.0.1.5:60838/jars/commons-io-1.3.2.jar with timestamp 1443372356867
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/bdg-formats/bdg-formats/0.4.0/bdg-formats-0.4.0.jar at http://10.0.1.5:60838/jars/bdg-formats-0.4.0.jar with timestamp 1443372356869
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/avro/avro/1.7.6/avro-1.7.6.jar at http://10.0.1.5:60838/jars/avro-1.7.6.jar with timestamp 1443372356882
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/codehaus/jackson/jackson-core-asl/1.9.13/jackson-core-asl-1.9.13.jar at http://10.0.1.5:60838/jars/jackson-core-asl-1.9.13.jar with timestamp 1443372356886
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/codehaus/jackson/jackson-mapper-asl/1.9.13/jackson-mapper-asl-1.9.13.jar at http://10.0.1.5:60838/jars/jackson-mapper-asl-1.9.13.jar with timestamp 1443372356915
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/adam/adam-core_2.10/0.17.2-SNAPSHOT/adam-core_2.10-0.17.2-SNAPSHOT.jar at http://10.0.1.5:60838/jars/adam-core_2.10-0.17.2-SNAPSHOT.jar with timestamp 1443372356953
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/esotericsoftware/kryo/kryo/2.21/kryo-2.21.jar at http://10.0.1.5:60838/jars/kryo-2.21.jar with timestamp 1443372356995
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/esotericsoftware/reflectasm/reflectasm/1.07/reflectasm-1.07-shaded.jar at http://10.0.1.5:60838/jars/reflectasm-1.07-shaded.jar with timestamp 1443372357147
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/ow2/asm/asm/4.0/asm-4.0.jar at http://10.0.1.5:60838/jars/asm-4.0.jar with timestamp 1443372357152
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/esotericsoftware/minlog/minlog/1.2/minlog-1.2.jar at http://10.0.1.5:60838/jars/minlog-1.2.jar with timestamp 1443372357153
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/objenesis/objenesis/1.2/objenesis-1.2.jar at http://10.0.1.5:60838/jars/objenesis-1.2.jar with timestamp 1443372357155
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/it/unimi/dsi/fastutil/6.4.4/fastutil-6.4.4.jar at http://10.0.1.5:60838/jars/fastutil-6.4.4.jar with timestamp 1443372357424
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-scala_2.10/1.8.1/parquet-scala_2.10-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-scala_2.10-1.8.1.jar with timestamp 1443372357485
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/seqdoop/hadoop-bam/7.0.0/hadoop-bam-7.0.0.jar at http://10.0.1.5:60838/jars/hadoop-bam-7.0.0.jar with timestamp 1443372357534
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/github/samtools/htsjdk/1.133/htsjdk-1.133.jar at http://10.0.1.5:60838/jars/htsjdk-1.133.jar with timestamp 1443372357561
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/commons/commons-jexl/2.1.1/commons-jexl-2.1.1.jar at http://10.0.1.5:60838/jars/commons-jexl-2.1.1.jar with timestamp 1443372357576
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/tukaani/xz/1.5/xz-1.5.jar

at http://10.0.1.5:60838/jars/xz-1.5.jar <
http://10.0.1.5:60838/jars/xz-1.5.jar> with timestamp
1443372357581

15/09/27 09:45:57 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/ant/ant/1.8.2/ant-1.8.2.jar

at http://10.0.1.5:60838/jars/ant-1.8.2.jar <
http://10.0.1.5:60838/jars/ant-1.8.2.jar> with timestamp
1443372357612

15/09/27 09:45:57 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/ant/ant-launcher/1.8.2/ant-launcher-1.8.2.jar

at http://10.0.1.5:60838/jars/ant-launcher-1.8.2.jar <
http://10.0.1.5:60838/jars/ant-launcher-1.8.2.jar> with timestamp
1443372357622

15/09/27 09:45:57 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/testng/testng/6.8.8/testng-6.8.8.jar

at http://10.0.1.5:60838/jars/testng-6.8.8.jar <
http://10.0.1.5:60838/jars/testng-6.8.8.jar> with timestamp
1443372357639

15/09/27 09:45:57 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/beanshell/bsh/2.0b4/bsh-2.0b4.jar

at http://10.0.1.5:60838/jars/bsh-2.0b4.jar <
http://10.0.1.5:60838/jars/bsh-2.0b4.jar> with timestamp
1443372357685

15/09/27 09:45:57 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/beust/jcommander/1.27/jcommander-1.27.jar

at http://10.0.1.5:60838/jars/jcommander-1.27.jar <
http://10.0.1.5:60838/jars/jcommander-1.27.jar> with timestamp
1443372357692

15/09/27 09:45:57 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/google/guava/guava/14.0.1/guava-14.0.1.jar

at http://10.0.1.5:60838/jars/guava-14.0.1.jar <
http://10.0.1.5:60838/jars/guava-14.0.1.jar> with timestamp
1443372357744

15/09/27 09:45:57 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/adam/adam-apis_2.10/0.17.2-SNAPSHOT/adam-apis_2.10-0.17.2-SNAPSHOT.jar

at http://10.0.1.5:60838/jars/adam-apis_2.10-0.17.2-SNAPSHOT.jar <
http://10.0.1.5:60838/jars/adam-apis_2.10-0.17.2-SNAPSHOT.jar>
with
timestamp 1443372357758

15/09/27 09:45:57 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/scala-lang/scala-library/2.10.4/scala-library-2.10.4.jar

at http://10.0.1.5:60838/jars/scala-library-2.10.4.jar <
http://10.0.1.5:60838/jars/scala-library-2.10.4.jar> with
timestamp
1443372357947

15/09/27 09:45:57 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/slf4j/slf4j-log4j12/1.7.5/slf4j-log4j12-1.7.5.jar

at http://10.0.1.5:60838/jars/slf4j-log4j12-1.7.5.jar <
http://10.0.1.5:60838/jars/slf4j-log4j12-1.7.5.jar> with timestamp
1443372357980

15/09/27 09:45:58 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/args4j/args4j/2.0.23/args4j-2.0.23.jar

at http://10.0.1.5:60838/jars/args4j-2.0.23.jar <
http://10.0.1.5:60838/jars/args4j-2.0.23.jar> with timestamp
1443372358026

15/09/27 09:45:58 INFO spark.SparkContext: Added JAR

file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/adam/adam-cli_2.10/0.17.2-SNAPSHOT/adam-cli_2.10-0.17.2-SNAPSHOT.jar

at http://10.0.1.5:60838/jars/adam-cli_2.10-0.17.2-SNAPSHOT.jar <
http://10.0.1.5:60838/jars/adam-cli_2.10-0.17.2-SNAPSHOT.jar> with
timestamp 1443372358030

15/09/27 09:45:58 WARN metrics.MetricsSystem: Using default name DAGScheduler for source because spark.app.id is not set.
15/09/27 09:45:58 INFO executor.Executor: Starting executor ID driver on host localhost
15/09/27 09:45:58 INFO executor.Executor: Using REPL class URI: http://10.0.1.5:60826
15/09/27 09:45:59 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 60842.
15/09/27 09:45:59 INFO netty.NettyBlockTransferService: Server created on 60842
15/09/27 09:45:59 INFO storage.BlockManagerMaster: Trying to register BlockManager
15/09/27 09:45:59 INFO storage.BlockManagerMasterEndpoint: Registering block manager localhost:60842 with 530.0 MB RAM, BlockManagerId(driver, localhost, 60842)
15/09/27 09:45:59 INFO storage.BlockManagerMaster: Registered BlockManager
15/09/27 09:46:00 INFO repl.SparkILoop: Created spark context..
Spark context available as sc.
15/09/27 09:46:04 INFO repl.SparkILoop: Created sql context..
SQL context available as sqlContext.

scala> Stopping spark context.
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/static/sql,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/SQL/execution/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/SQL/execution,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/SQL/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/SQL,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/metrics/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage/kill,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/api,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/static,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/threadDump/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/threadDump,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/environment/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/environment,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/rdd/json,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/rdd,null}
15/09/27 09:46:54 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/json,null}
15/09/27

@dbl001

dbl001 commented Sep 28, 2015

I’m there now ...

On Sep 28, 2015, at 7:40 AM, Ryan Williams notifications@github.com wrote:

Hey David, that's weird, I've not seen that, can you jump in to the Spree gitter room https://gitter.im/hammerlab/spree and we can discuss further?

On Mon, Sep 28, 2015 at 10:34 AM dbl notifications@github.com wrote:

Did you change the ‘adam-shell’ to include the Spree requirements?
E.g.

If using Spark ≥ 1.5.0, simply pass the following flags to spark-{shell,submit}:

--packages org.hammerlab:spark-json-relay:2.0.0
--conf spark.extraListeners=org.apache.spark.JsonRelay
Or

Include JsonRelay on the driver's classpath

--driver-class-path /path/to/json-relay-2.0.0.jar

Register your JsonRelay as a SparkListener

--conf spark.extraListeners=org.apache.spark.JsonRelay

Point it at your slim instance; default: localhost:8123

--conf spark.slim.host=…
--conf spark.slim.port=…
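For reference, the flags quoted above can be assembled into a single adam-shell invocation. A sketch only: it uses the default slim coordinates (localhost:8123) mentioned above, and assumes Spark ≥ 1.5.0 so `--packages` works:

```shell
# Sketch: wiring JsonRelay into adam-shell for Spree (Spark >= 1.5.0).
# localhost:8123 is slim's documented default, shown here explicitly.
SPREE_FLAGS="--packages org.hammerlab:spark-json-relay:2.0.0"
SPREE_FLAGS="$SPREE_FLAGS --conf spark.extraListeners=org.apache.spark.JsonRelay"
SPREE_FLAGS="$SPREE_FLAGS --conf spark.slim.host=localhost"
SPREE_FLAGS="$SPREE_FLAGS --conf spark.slim.port=8123"

# bin/adam-shell $SPREE_FLAGS
echo "$SPREE_FLAGS"
```

Since adam-shell delegates to spark-shell, any extra arguments like these are passed straight through.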

Any ideas about this error with slim:
$ sudo npm install -g slim.js
Password:

kerberos@0.0.14 install /opt/local/lib/node_modules/slim.js/node_modules/mongodb/node_modules/mongodb-core/node_modules/kerberos
(node-gyp rebuild) || (exit 0)

CXX(target) Release/obj.target/kerberos/lib/kerberos.o
CXX(target) Release/obj.target/kerberos/lib/worker.o
CC(target) Release/obj.target/kerberos/lib/kerberosgss.o
CC(target) Release/obj.target/kerberos/lib/base64.o
CXX(target) Release/obj.target/kerberos/lib/kerberos_context.o
SOLINK_MODULE(target) Release/kerberos.node
SOLINK_MODULE(target) Release/kerberos.node: Finished
/opt/local/bin/slim -> /opt/local/lib/node_modules/slim.js/slim
slim.js@1.2.1 /opt/local/lib/node_modules/slim.js
├── deep-equal@1.0.0
├── line-reader@0.2.4
├── minimist@1.1.1
├── async@1.3.0
├── mkdirp@0.5.1 (minimist@0.0.8)
├── node.extend@1.1.5 (is@3.1.0)
├── shelljs@0.5.1
├── moment@2.10.3
├── tracer@0.7.4 (tinytim@0.1.1, colors@1.0.3, dateformat@1.0.11)
├── oboe@2.1.2 (http-https@1.0.0)
└── mongodb@2.0.42 (es6-promise@2.1.1, readable-stream@1.0.31, mongodb-core@1.2.10)
David-Laxers-MacBook-Pro:spree davidlaxer$ slim
module.js:338
throw err;
^
Error: Cannot find module '/Users/davidlaxer/spree/slim.js'
at Function.Module._resolveFilename (module.js:336:15)
at Function.Module._load (module.js:278:25)
at Function.Module.runMain (module.js:501:10)
at startup (node.js:129:16)
at node.js:814:3

On Sep 27, 2015, at 9:12 PM, Ryan Williams notifications@github.com wrote:

I have not, though I did do a bunch of checking out spark-notebook
recently
and have used it with success to run ADAM things.

I am also meaning to check out Jupyter Scala. I heard mixed reviews about Zeppelin so have planned to investigate the other two first.

On Sun, Sep 27, 2015 at 11:14 PM dbl notifications@github.com wrote:

Thanks! I’ll check out Spree.

Just curious - have you ever tried:

  1. Jupyter Scala
  2. Apache incubator Zeppelin

to run Adam, instead of adam-shell?

On Sep 27, 2015, at 7:55 PM, Ryan Williams notifications@github.com wrote:

If you're referring to the "BindException"s, that's Spark's UI trying to bind to any free port starting from 4040 and incrementing by 1. You can ignore those, for most intents and purposes.

Ironically, I was just running >16 spark apps in yarn-client mode on the same machine just now and discovered that if Spark tries 16 ports and they are all unavailable, it fails the app before it even starts. I'm using Spree https://github.com/hammerlab/spree instead of Spark's web UI for these apps so I just disabled the web UI altogether with "--conf spark.ui.enabled=false".

But in general, you can ignore them; they're definitely annoying and Spark should do something more graceful here probably, but that's another discussion.
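The two workarounds described above can both be expressed as launch flags. A sketch: `spark.ui.enabled` and `spark.port.maxRetries` are standard Spark configuration keys, though the retry count of 64 below is only an example value:

```shell
# Sketch: two ways to avoid the port-4040 BindException when many
# driver JVMs share one machine.
UI_OFF="--conf spark.ui.enabled=false"        # skip binding a web UI at all
MORE_PORTS="--conf spark.port.maxRetries=64"  # probe more ports above 4040 before failing

# bin/adam-shell $UI_OFF
# bin/adam-shell $MORE_PORTS
echo "$UI_OFF"
```

Raising `spark.port.maxRetries` keeps the UI available; disabling the UI avoids the bind attempt entirely, which is why it also sidesteps the hard failure after 16 exhausted ports.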

On Sun, Sep 27, 2015 at 10:41 PM dbl notifications@github.com wrote:

Still no solution.

Do you know why adam-shell gets these Java errors:

r@0.0.0.0:4040: java.net.BindException: Address already in use
java.net.BindException: Address already in use
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:414)
at sun.nio.ch.Net.bind(Net.java:406)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at org.spark-project.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
at org.spark-project.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
at org.spark-project.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.spark-project.jetty.server.Server.doStart(Server.java:293)
at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:236)
at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:246)
at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:246)
at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1913)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1904)
at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:246)
at org.apache.spark.ui.WebUI.bind(WebUI.scala:136)
at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:474)
at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:474)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:474)
at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
at $line3.$read$$iwC$$iwC.<init>(<console>:9)
at $line3.$read$$iwC.<init>(<console>:18)
at $line3.$read.<init>(<console>:20)
at $line3.$read$.<init>(<console>:24)
at $line3.$read$.<clinit>(<console>)
at $line3.$eval$.<init>(<console>:7)
at $line3.$eval$.<clinit>(<console>)
at $line3.$eval.$print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/09/27 19:39:33 WARN component.AbstractLifeCycle: FAILED org.spark-project.jetty.server.Server@7d64a960: java.net.BindException: Address already in use
java.net.BindException: Address already in use
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:414)
at sun.nio.ch.Net.bind(Net.java:406)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at org.spark-project.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
at org.spark-project.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
at org.spark-project.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.spark-project.jetty.server.Server.doStart(Server.java:293)
at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:236)
at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:246)
at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:246)
at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1913)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1904)
at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:246)
at org.apache.spark.ui.WebUI.bind(WebUI.scala:136)
at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:474)
at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:474)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:474)
at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
at $line3.$read$$iwC$$iwC.<init>(<console>:9)
at $line3.$read$$iwC.<init>(<console>:18)
at $line3.$read.<init>(<console>:20)
at $line3.$read$.<init>(<console>:24)
at $line3.$read$.<clinit>(<console>)
at $line3.$eval$.<init>(<console>:7)
at $line3.$eval$.<clinit>(<console>)
at $line3.$eval.$print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/09/27 19:39:33 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
15/09/27 19:39:33 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/api,null}
15/09/27 19:39:33 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHan

On Sep 27, 2015, at 11:03 AM, David Laxer davidl@softintel.com wrote:

It’s coming from Spark. I’m investigating ...

http://stackoverflow.com/questions/32800018/hadoop-2-6-1-warning-warn-util-nativecodeloader/32800241?noredirect=1#comment53458542_32800241

On Sep 27, 2015, at 10:15 AM, Ryan Williams notifications@github.com wrote:

FWIW I think I see that warning every time I run an adam-shell; I don't think it hurts anything though any leads on making it go away are welcome.

On Sun, Sep 27, 2015 at 12:50 PM dbl notifications@github.com wrote:

I'm getting a warning running 'bin/adam-shell', on OS X 10.10.5:

15/09/27 09:45:26 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

I had this problem with the binary distributions of Hadoop (e.g. - 2.3.0, 2.6.1) until I built hadoop from source with the 'native' option enabled and set this environment variable:

HADOOP_COMMON_LIB_NATIVE_DIR=/Users/davidlaxer/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT/lib/native

Is this a Spark Warning? Do I have to set a SPARK environment variable?
Is there an ADAM option I must set?
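For reference, the warning comes from Hadoop's NativeCodeLoader running inside the Spark driver JVM, so the usual way to silence it is to put the native-library directory on that JVM's `java.library.path`. A sketch, assuming the native libs live at the directory built from source in this report, and that `SPARK_SUBMIT_OPTS` (honored by spark-shell, which adam-shell wraps) is an acceptable place to set JVM options:

```shell
# Sketch: point the Spark driver JVM at the natively built Hadoop libraries.
# NATIVE_DIR is the lib/native directory from the source build in this report.
NATIVE_DIR="/Users/davidlaxer/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT/lib/native"
export SPARK_SUBMIT_OPTS="-Djava.library.path=$NATIVE_DIR"

# bin/adam-shell   # if the .dylib/.so files actually load, the warning disappears
echo "$SPARK_SUBMIT_OPTS"
```

Note that on OS X the warning is often unavoidable with stock Hadoop binaries, since they ship Linux `.so` files; it is harmless either way.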

$ bin/adam-shell
Using SPARK_SHELL=/Users/davidlaxer/spark/bin/spark-shell
15/09/27 09:45:26 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/09/27 09:45:27 INFO spark.SecurityManager: Changing view acls to: davidlaxer
15/09/27 09:45:27 INFO spark.SecurityManager: Changing modify acls to: davidlaxer
15/09/27 09:45:27 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(davidlaxer); users with modify permissions: Set(davidlaxer)
15/09/27 09:45:28 INFO spark.HttpServer: Starting HTTP Server
15/09/27 09:45:29 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/09/27 09:45:29 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:60826
15/09/27 09:45:29 INFO util.Utils: Successfully started service 'HTTP class server' on port 60826.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.5.0-SNAPSHOT
      /_/

Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_05)
Type in expressions to have them evaluated.
Type :help for more information.
15/09/27 09:45:50 INFO spark.SparkContext: Running Spark version 1.5.0-SNAPSHOT
15/09/27 09:45:50 INFO spark.SecurityManager: Changing view acls to: davidlaxer
15/09/27 09:45:50 INFO spark.SecurityManager: Changing modify acls to: davidlaxer
15/09/27 09:45:50 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(davidlaxer); users with modify permissions: Set(davidlaxer)
15/09/27 09:45:52 INFO slf4j.Slf4jLogger: Slf4jLogger started
15/09/27 09:45:53 INFO Remoting: Starting remoting
15/09/27 09:45:54 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@10.0.1.5:60837]
15/09/27 09:45:54 INFO util.Utils: Successfully started service 'sparkDriver' on port 60837.
15/09/27 09:45:54 INFO spark.SparkEnv: Registering MapOutputTracker
15/09/27 09:45:54 INFO spark.SparkEnv: Registering BlockManagerMaster
15/09/27 09:45:54 INFO storage.DiskBlockManager: Created local directory at /private/var/folders/nj/nphdkhyj6s1dttb0pd9zb2wc0000gn/T/blockmgr-57acd6fd-754f-4d86-9226-bbcead030b70
15/09/27 09:45:54 INFO storage.MemoryStore: MemoryStore started with capacity 530.0 MB
15/09/27 09:45:54 INFO spark.HttpFileServer: HTTP File server directory is /private/var/folders/nj/nphdkhyj6s1dttb0pd9zb2wc0000gn/T/spark-c735407c-acd9-4be7-ac3a-2de4e82af5e4/httpd-72e12a03-5d15-41cc-a524-c74725d6ec0f
15/09/27 09:45:54 INFO spark.HttpServer: Starting HTTP Server
15/09/27 09:45:54 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/09/27 09:45:54 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:60838
15/09/27 09:45:54 INFO util.Utils: Successfully started service 'HTTP file server' on port 60838.
15/09/27 09:45:54 INFO spark.SparkEnv: Registering OutputCommitCoordinator
15/09/27 09:45:55 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/09/27 09:45:55 WARN component.AbstractLifeCycle: FAILED SelectChannelConnector@0.0.0.0:4040: java.net.BindException: Address already in use
java.net.BindException: Address already in use
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:414)
at sun.nio.ch.Net.bind(Net.java:406)
at

sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)

at
sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at

org.eclipse.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)

at

org.eclipse.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)

at

org.eclipse.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)

at

org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)

at org.eclipse.jetty.server.Server.doStart(Server.java:293)
at

org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)

at

org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:240)

at

org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:250)

at

org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:250)

at

org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1912)

at
scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
at
org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1903)
at

org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:250)

at org.apache.spark.ui.WebUI.bind(WebUI.scala:136)
at

org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:465)

at

org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:465)

at scala.Option.foreach(Option.scala:236)
at
org.apache.spark.SparkContext.(SparkContext.scala:465)
at

org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)

at $line3.$read$$iwC$$iwC.(:9)
at $line3.$read$$iwC.(:18)
at $line3.$read.(:20)
at $line3.$read$.(:24)
at $line3.$read$.()
at $line3.$eval$.(:7)
at $line3.$eval$.()
at $line3.$eval.$print()
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at

sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)

at

sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/09/27 09:45:55 WARN component.AbstractLifeCycle: FAILED org.eclipse.jetty.server.Server@22686ddb: java.net.BindException: Address already in use
java.net.BindException: Address already in use
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:414)
at sun.nio.ch.Net.bind(Net.java:406)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at org.eclipse.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
at org.eclipse.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
at org.eclipse.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.eclipse.jetty.server.Server.doStart(Server.java:293)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:240)
at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:250)
at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:250)
at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1912)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1903)
at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:250)
at org.apache.spark.ui.WebUI.bind(WebUI.scala:136)
at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:465)
at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:465)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:465)
at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
at $line3.$read$$iwC$$iwC.<init>(<console>:9)
at $line3.$read$$iwC.<init>(<console>:18)
at $line3.$read.<init>(<console>:20)
at $line3.$read$.<init>(<console>:24)
at $line3.$read$.<clinit>(<console>)
at $line3.$eval$.<init>(<console>:7)
at $line3.$eval$.<clinit>(<console>)
at $line3.$eval.$print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage/kill,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/api,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/static,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/threadDump/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/threadDump,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/executors,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/environment/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/environment,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/rdd/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/rdd,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/storage,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/pool/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/pool,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/stage,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/stages,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs/job/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs/job,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs/json,null}
15/09/27 09:45:55 INFO handler.ContextHandler: stopped o.e.j.s.ServletContextHandler{/jobs,null}
15/09/27 09:45:56 WARN util.Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
15/09/27 09:45:56 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/09/27 09:45:56 INFO server.AbstractConnector: Started SelectChannelConnector@0.0.0.0:4041
15/09/27 09:45:56 INFO util.Utils: Successfully started service 'SparkUI' on port 4041.
15/09/27 09:45:56 INFO ui.SparkUI: Started SparkUI at http://10.0.1.5:4041
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/commons-cli/commons-cli/1.2/commons-cli-1.2.jar at http://10.0.1.5:60838/jars/commons-cli-1.2.jar with timestamp 1443372356368
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/commons-httpclient/commons-httpclient/3.1/commons-httpclient-3.1.jar at http://10.0.1.5:60838/jars/commons-httpclient-3.1.jar with timestamp 1443372356374
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/commons-codec/commons-codec/1.4/commons-codec-1.4.jar at http://10.0.1.5:60838/jars/commons-codec-1.4.jar with timestamp 1443372356376
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/commons-logging/commons-logging/1.1.1/commons-logging-1.1.1.jar at http://10.0.1.5:60838/jars/commons-logging-1.1.1.jar with timestamp 1443372356381
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar at http://10.0.1.5:60838/jars/commons-compress-1.4.1.jar with timestamp 1443372356403
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/slf4j/slf4j-api/1.7.10/slf4j-api-1.7.10.jar at http://10.0.1.5:60838/jars/slf4j-api-1.7.10.jar with timestamp 1443372356436
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/log4j/log4j/1.2.17/log4j-1.2.17.jar at http://10.0.1.5:60838/jars/log4j-1.2.17.jar with timestamp 1443372356443
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/xerial/snappy/snappy-java/1.1.1.7/snappy-java-1.1.1.7.jar at http://10.0.1.5:60838/jars/snappy-java-1.1.1.7.jar with timestamp 1443372356501
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/thoughtworks/paranamer/paranamer/2.6/paranamer-2.6.jar at http://10.0.1.5:60838/jars/paranamer-2.6.jar with timestamp 1443372356512
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/utils/utils-io_2.10/0.2.3/utils-io_2.10-0.2.3.jar at http://10.0.1.5:60838/jars/utils-io_2.10-0.2.3.jar with timestamp 1443372356519
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/utils/utils-misc_2.10/0.2.3/utils-misc_2.10-0.2.3.jar at http://10.0.1.5:60838/jars/utils-misc_2.10-0.2.3.jar with timestamp 1443372356521
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/httpcomponents/httpclient/4.3.2/httpclient-4.3.2.jar at http://10.0.1.5:60838/jars/httpclient-4.3.2.jar with timestamp 1443372356574
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/httpcomponents/httpcore/4.3.1/httpcore-4.3.1.jar at http://10.0.1.5:60838/jars/httpcore-4.3.1.jar with timestamp 1443372356601
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/utils/utils-cli_2.10/0.2.3/utils-cli_2.10-0.2.3.jar at http://10.0.1.5:60838/jars/utils-cli_2.10-0.2.3.jar with timestamp 1443372356623
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-avro/1.8.1/parquet-avro-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-avro-1.8.1.jar with timestamp 1443372356655
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-column/1.8.1/parquet-column-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-column-1.8.1.jar with timestamp 1443372356687
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-common/1.8.1/parquet-common-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-common-1.8.1.jar with timestamp 1443372356695
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-encoding/1.8.1/parquet-encoding-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-encoding-1.8.1.jar with timestamp 1443372356702
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-hadoop/1.8.1/parquet-hadoop-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-hadoop-1.8.1.jar with timestamp 1443372356708
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-jackson/1.8.1/parquet-jackson-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-jackson-1.8.1.jar with timestamp 1443372356722
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-format/2.3.0-incubating/parquet-format-2.3.0-incubating.jar at http://10.0.1.5:60838/jars/parquet-format-2.3.0-incubating.jar with timestamp 1443372356747
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/utils/utils-metrics_2.10/0.2.3/utils-metrics_2.10-0.2.3.jar at http://10.0.1.5:60838/jars/utils-metrics_2.10-0.2.3.jar with timestamp 1443372356755
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/netflix/servo/servo-core/0.5.5/servo-core-0.5.5.jar at http://10.0.1.5:60838/jars/servo-core-0.5.5.jar with timestamp 1443372356802
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/google/code/findbugs/annotations/2.0.0/annotations-2.0.0.jar at http://10.0.1.5:60838/jars/annotations-2.0.0.jar with timestamp 1443372356808
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/scoverage/scalac-scoverage-plugin_2.10/0.99.2/scalac-scoverage-plugin_2.10-0.99.2.jar at http://10.0.1.5:60838/jars/scalac-scoverage-plugin_2.10-0.99.2.jar with timestamp 1443372356831
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/commons-io/commons-io/1.3.2/commons-io-1.3.2.jar at http://10.0.1.5:60838/jars/commons-io-1.3.2.jar with timestamp 1443372356867
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/bdg-formats/bdg-formats/0.4.0/bdg-formats-0.4.0.jar at http://10.0.1.5:60838/jars/bdg-formats-0.4.0.jar with timestamp 1443372356869
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/avro/avro/1.7.6/avro-1.7.6.jar at http://10.0.1.5:60838/jars/avro-1.7.6.jar with timestamp 1443372356882
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/codehaus/jackson/jackson-core-asl/1.9.13/jackson-core-asl-1.9.13.jar at http://10.0.1.5:60838/jars/jackson-core-asl-1.9.13.jar with timestamp 1443372356886
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/codehaus/jackson/jackson-mapper-asl/1.9.13/jackson-mapper-asl-1.9.13.jar at http://10.0.1.5:60838/jars/jackson-mapper-asl-1.9.13.jar with timestamp 1443372356915
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/adam/adam-core_2.10/0.17.2-SNAPSHOT/adam-core_2.10-0.17.2-SNAPSHOT.jar at http://10.0.1.5:60838/jars/adam-core_2.10-0.17.2-SNAPSHOT.jar with timestamp 1443372356953
15/09/27 09:45:56 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/esotericsoftware/kryo/kryo/2.21/kryo-2.21.jar at http://10.0.1.5:60838/jars/kryo-2.21.jar with timestamp 1443372356995
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/esotericsoftware/reflectasm/reflectasm/1.07/reflectasm-1.07-shaded.jar at http://10.0.1.5:60838/jars/reflectasm-1.07-shaded.jar with timestamp 1443372357147
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/ow2/asm/asm/4.0/asm-4.0.jar at http://10.0.1.5:60838/jars/asm-4.0.jar with timestamp 1443372357152
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/esotericsoftware/minlog/minlog/1.2/minlog-1.2.jar at http://10.0.1.5:60838/jars/minlog-1.2.jar with timestamp 1443372357153
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/objenesis/objenesis/1.2/objenesis-1.2.jar at http://10.0.1.5:60838/jars/objenesis-1.2.jar with timestamp 1443372357155
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/it/unimi/dsi/fastutil/6.4.4/fastutil-6.4.4.jar at http://10.0.1.5:60838/jars/fastutil-6.4.4.jar with timestamp 1443372357424
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/parquet/parquet-scala_2.10/1.8.1/parquet-scala_2.10-1.8.1.jar at http://10.0.1.5:60838/jars/parquet-scala_2.10-1.8.1.jar with timestamp 1443372357485
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/seqdoop/hadoop-bam/7.0.0/hadoop-bam-7.0.0.jar at http://10.0.1.5:60838/jars/hadoop-bam-7.0.0.jar with timestamp 1443372357534
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/github/samtools/htsjdk/1.133/htsjdk-1.133.jar at http://10.0.1.5:60838/jars/htsjdk-1.133.jar with timestamp 1443372357561
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/commons/commons-jexl/2.1.1/commons-jexl-2.1.1.jar at http://10.0.1.5:60838/jars/commons-jexl-2.1.1.jar with timestamp 1443372357576
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/tukaani/xz/1.5/xz-1.5.jar at http://10.0.1.5:60838/jars/xz-1.5.jar with timestamp 1443372357581
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/ant/ant/1.8.2/ant-1.8.2.jar at http://10.0.1.5:60838/jars/ant-1.8.2.jar with timestamp 1443372357612
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/apache/ant/ant-launcher/1.8.2/ant-launcher-1.8.2.jar at http://10.0.1.5:60838/jars/ant-launcher-1.8.2.jar with timestamp 1443372357622
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/testng/testng/6.8.8/testng-6.8.8.jar at http://10.0.1.5:60838/jars/testng-6.8.8.jar with timestamp 1443372357639
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/beanshell/bsh/2.0b4/bsh-2.0b4.jar at http://10.0.1.5:60838/jars/bsh-2.0b4.jar with timestamp 1443372357685
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/beust/jcommander/1.27/jcommander-1.27.jar at http://10.0.1.5:60838/jars/jcommander-1.27.jar with timestamp 1443372357692
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/com/google/guava/guava/14.0.1/guava-14.0.1.jar at http://10.0.1.5:60838/jars/guava-14.0.1.jar with timestamp 1443372357744
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/adam/adam-apis_2.10/0.17.2-SNAPSHOT/adam-apis_2.10-0.17.2-SNAPSHOT.jar at http://10.0.1.5:60838/jars/adam-apis_2.10-0.17.2-SNAPSHOT.jar with timestamp 1443372357758
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/scala-lang/scala-library/2.10.4/scala-library-2.10.4.jar at http://10.0.1.5:60838/jars/scala-library-2.10.4.jar with timestamp 1443372357947
15/09/27 09:45:57 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/slf4j/slf4j-log4j12/1.7.5/slf4j-log4j12-1.7.5.jar at http://10.0.1.5:60838/jars/slf4j-log4j12-1.7.5.jar with timestamp 1443372357980
15/09/27 09:45:58 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/args4j/args4j/2.0.23/args4j-2.0.23.jar at http://10.0.1.5:60838/jars/args4j-2.0.23.jar with timestamp 1443372358026
15/09/27 09:45:58 INFO spark.SparkContext: Added JAR file:/Users/davidlaxer/adam/adam-cli/target/appassembler/repo/org/bdgenomics/adam/adam-cli_2.10/0.17.2-SNAPSHOT/adam-cli_2.10-0.17.2-SNAPSHOT.jar at http://10.0.1.5:60838/jars/adam-cli_2.10-0.17.2-SNAPSHOT.jar with timestamp 1443372358030

15/09/27 09:45:58 WARN metrics.MetricsSystem: Using default name DAGScheduler for source because spark.app.id is not set.
15/09/27 09:45:58 INFO executor.Executor: Starting executor ID driver on host localhost
15/09/27 09:45:58 INFO executor.Executor: Using REPL class URI: http://10.0.1.5:60826
15/09/27 09:45:59 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 60842.
15/09/27 09:45:59 INFO netty.NettyBlockTransferService: Server created on 60842
15/09/27 09:45:59 INFO storage.BlockManagerMaster: Trying to register BlockManager
15/09/27 09:45:59 INFO storag

@ryan-williams
Member

FWIW these two Spark JIRAs are relevant to the BindException "warnings": SPARK-7623 SPARK-1902.

Let's leave this issue open to focus on just the NativeCodeLoader issue referenced in the issue name.

@dbl001
Author

dbl001 commented Sep 28, 2015

I solved the native code loader issue with Hadoop, but not yet with Spark.

http://stackoverflow.com/questions/32800018/hadoop-2-6-1-warning-warn-util-nativecodeloader/32803314#32803314
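A sketch of the Hadoop-side workaround described above, adapted for Spark; this is not a verified fix, and the native-library directory is the example path from this thread, so substitute the location of your own native build. Spark's JVM looks for `libhadoop` on `java.library.path` rather than via the Hadoop environment variable:

```shell
# Hypothetical paths: adjust to wherever your native Hadoop build landed.
export HADOOP_COMMON_LIB_NATIVE_DIR="$HOME/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT/lib/native"
# Pass the directory to the Spark driver JVM via java.library.path.
export SPARK_SUBMIT_OPTS="-Djava.library.path=$HADOOP_COMMON_LIB_NATIVE_DIR"
```

With these exported, `bin/spark-shell` (and `bin/adam-shell`, which wraps it) should pick up the setting.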


@laserson
Contributor

IIRC, the hadoop native libraries are not supported for anything other than *nix platforms. Also see here:

http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/NativeLibraries.html

I wouldn't worry about this if you're running on a Mac or Windows, as presumably any production jobs would be running on a "real" cluster that's running Linux.
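Before chasing the warning further, it can help to confirm whether a native Hadoop library is present at all. The snippet below is a hedged sketch (the `HADOOP_HOME` fallback path is an assumption); if the `hadoop` CLI is installed, `hadoop checknative -a` reports the same information in more detail:

```shell
# Look for libhadoop.so / libhadoop.dylib under the native lib directory.
# /usr/local/hadoop is a hypothetical default; set HADOOP_HOME for your install.
NATIVE_DIR="${HADOOP_HOME:-/usr/local/hadoop}/lib/native"
if ls "$NATIVE_DIR"/libhadoop.* >/dev/null 2>&1; then
  echo "native hadoop library found in $NATIVE_DIR"
else
  echo "no native hadoop library in $NATIVE_DIR; builtin-java classes will be used"
fi
```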

@dbl001
Author

dbl001 commented Sep 28, 2015

Got it! Thanks!


@ryan-williams
Member

Sure enough, I don't see this on Linux, only OSX, and only in ADAM, not in spark-shells.

Do we know what ADAM is using that is exercising this, that Spark does not?

@dbl001
Author

dbl001 commented Sep 28, 2015

It’s coming from spark:

$ spark-shell
15/09/28 13:50:13 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable


@ryan-williams
Member

Weird, I can't reproduce it with a spark-shell from any recent Spark version on OSX, but I do get it every time from adam-shell
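One way to see *why* the load fails (hedged: this assumes the log4j 1.x backend that Spark 1.x ships with) is to raise the logger for Hadoop's `NativeCodeLoader` to DEBUG; it then logs the `java.library.path` it searched and the underlying `UnsatisfiedLinkError`:

```properties
# Add to Spark's conf/log4j.properties to see the reason the native library
# fails to load when adam-shell starts.
log4j.logger.org.apache.hadoop.util.NativeCodeLoader=DEBUG
```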

@dbl001
Author

dbl001 commented Sep 28, 2015

I pulled Spark version 1.5.1 from github yesterday.

Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.5.1
      /_/

Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_05)


@laserson
Contributor

snappy maybe?

@dbl001
Author

dbl001 commented Sep 28, 2015

Perhaps.
$ cd snappy
$ git pull
remote: Counting objects: 76, done.
remote: Total 76 (delta 38), reused 38 (delta 38), pack-reused 38
Unpacking objects: 100% (76/76), done.
From https://github.com/google/snappy
   1ff9be9..0852af7  master     -> origin/master
 * [new branch]      gh-pages   -> origin/gh-pages
 * [new tag]         1.1.3      -> 1.1.3
Updating 1ff9be9..0852af7
Fast-forward
    COPYING | 2 +-
    ChangeLog | 4384 ++++++++++++++++++++++++++--------------------
    NEWS | 12 +
    configure.ac | 4 +-
    snappy-c.h | 6 +-
    snappy-internal.h | 72 +-
    snappy-sinksource.cc | 33 +
    snappy-sinksource.h | 57 +-
    snappy-stubs-internal.h | 6 +-
    snappy-stubs-public.h.in | 6 +-
    snappy-test.cc | 7 +-
    snappy-test.h | 25 +-
    snappy.cc | 439 +++--
    snappy.h | 27 +-
    snappy_unittest.cc | 244 ++-
    15 files changed, 3158 insertions(+), 2166 deletions(-)

$ make check
PASS: snappy_unittest
============================================================================
Testsuite summary for snappy 1.1.3
============================================================================
# TOTAL: 1
# PASS:  1
# SKIP:  0
# XFAIL: 0
# FAIL:  0
# XPASS: 0
# ERROR: 0

Recompiling Spark ...


@dbl001
Author

dbl001 commented Sep 28, 2015

Hmmm…

http://apache-spark-user-list.1001560.n3.nabble.com/Unable-to-load-native-hadoop-library-td509.html


@dbl001
Copy link
Author

dbl001 commented Sep 28, 2015

Same result …

bin/spark-shell
15/09/28 16:04:18 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable


@laserson
Copy link
Contributor

I wouldn't worry about this unless you're seeing it on a Linux box; on OS X the warning is benign and Hadoop falls back to the pure-Java implementations.
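If you do want to see exactly which native codecs Hadoop can load, the `checknative` subcommand (available in Hadoop 2.4 and later) reports this directly; a quick diagnostic sketch:

```shell
# Prints, for each native component (hadoop, zlib, snappy, lz4, bzip2,
# openssl), whether it could be loaded and from which library path.
# With -a, the command exits non-zero if any check fails.
hadoop checknative -a
```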

@fnothaft fnothaft closed this as completed Jan 7, 2016