
Exception in thread "main" java.lang.IllegalArgumentException: requirement failed: Number of partitions (0) must be positive #30

Open
vutrungduc7593 opened this issue May 4, 2018 · 0 comments


vutrungduc7593 commented May 4, 2018

I hit this error when running the make LIMIT=1 local-ingest command. Can someone help me? Thank you.

ducvu@DUC-VU:~/projects/geotrellis-landsat-emr-demo$ make LIMIT=1 local-ingest
spark-submit --name "Landsat Demo ducvu Ingest" --master "local[4]" --driver-memory 4G \
ingest/target/scala-2.11/ingest-assembly-0.1.0.jar \
--backend-profiles "file:////home/ducvu/projects/geotrellis-landsat-emr-demo/conf/backend-profiles.json" \
--input "file:///home/ducvu/projects/geotrellis-landsat-emr-demo/conf/input-local.json" \
--output "file:///home/ducvu/projects/geotrellis-landsat-emr-demo/conf/output-local.json"
2018-05-04 11:01:36 WARN  Utils:66 - Your hostname, DUC-VU resolves to a loopback address: 127.0.1.1; using 192.168.0.107 instead (on interface eno1)
2018-05-04 11:01:36 WARN  Utils:66 - Set SPARK_LOCAL_IP if you need to bind to another address
2018-05-04 11:01:36 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2018-05-04 11:01:36 INFO  LandsatIngestMain$:63 - Arguments: WrappedArray(--backend-profiles, file:////home/ducvu/projects/geotrellis-landsat-emr-demo/conf/backend-profiles.json, --input, file:///home/ducvu/projects/geotrellis-landsat-emr-demo/conf/input-local.json, --output, file:///home/ducvu/projects/geotrellis-landsat-emr-demo/conf/output-local.json)
2018-05-04 11:01:36 INFO  SparkContext:54 - Running Spark version 2.3.0
2018-05-04 11:01:36 INFO  SparkContext:54 - Submitted application: GeoTrellis Landsat Ingest
2018-05-04 11:01:36 INFO  SecurityManager:54 - Changing view acls to: ducvu,datdt
2018-05-04 11:01:36 INFO  SecurityManager:54 - Changing modify acls to: ducvu,datdt
2018-05-04 11:01:36 INFO  SecurityManager:54 - Changing view acls groups to: 
2018-05-04 11:01:36 INFO  SecurityManager:54 - Changing modify acls groups to: 
2018-05-04 11:01:36 INFO  SecurityManager:54 - SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(ducvu, datdt); groups with view permissions: Set(); users  with modify permissions: Set(ducvu, datdt); groups with modify permissions: Set()
2018-05-04 11:01:36 INFO  Utils:54 - Successfully started service 'sparkDriver' on port 36911.
2018-05-04 11:01:36 INFO  SparkEnv:54 - Registering MapOutputTracker
2018-05-04 11:01:36 INFO  SparkEnv:54 - Registering BlockManagerMaster
2018-05-04 11:01:36 INFO  BlockManagerMasterEndpoint:54 - Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
2018-05-04 11:01:36 INFO  BlockManagerMasterEndpoint:54 - BlockManagerMasterEndpoint up
2018-05-04 11:01:36 INFO  DiskBlockManager:54 - Created local directory at /tmp/blockmgr-a703712d-d752-409d-b26d-dc8abfd36381
2018-05-04 11:01:36 INFO  MemoryStore:54 - MemoryStore started with capacity 2004.6 MB
2018-05-04 11:01:36 INFO  SparkEnv:54 - Registering OutputCommitCoordinator
2018-05-04 11:01:36 INFO  log:192 - Logging initialized @1554ms
2018-05-04 11:01:37 INFO  Server:346 - jetty-9.3.z-SNAPSHOT
2018-05-04 11:01:37 INFO  Server:414 - Started @1626ms
2018-05-04 11:01:37 INFO  AbstractConnector:278 - Started ServerConnector@3d4d3fe7{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2018-05-04 11:01:37 INFO  Utils:54 - Successfully started service 'SparkUI' on port 4040.
2018-05-04 11:01:37 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@22680f52{/jobs,null,AVAILABLE,@Spark}
2018-05-04 11:01:37 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@7cd1ac19{/jobs/json,null,AVAILABLE,@Spark}
2018-05-04 11:01:37 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@2f40a43{/jobs/job,null,AVAILABLE,@Spark}
2018-05-04 11:01:37 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@69c43e48{/jobs/job/json,null,AVAILABLE,@Spark}
2018-05-04 11:01:37 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@1804f60d{/stages,null,AVAILABLE,@Spark}
2018-05-04 11:01:37 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@3a80515c{/stages/json,null,AVAILABLE,@Spark}
2018-05-04 11:01:37 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@547e29a4{/stages/stage,null,AVAILABLE,@Spark}
2018-05-04 11:01:37 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@1b39fd82{/stages/stage/json,null,AVAILABLE,@Spark}
2018-05-04 11:01:37 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@3e2fc448{/stages/pool,null,AVAILABLE,@Spark}
2018-05-04 11:01:37 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@21680803{/stages/pool/json,null,AVAILABLE,@Spark}
2018-05-04 11:01:37 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@588ab592{/storage,null,AVAILABLE,@Spark}
2018-05-04 11:01:37 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@c8b96ec{/storage/json,null,AVAILABLE,@Spark}
2018-05-04 11:01:37 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@4cc61eb1{/storage/rdd,null,AVAILABLE,@Spark}
2018-05-04 11:01:37 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@2d8f2f3a{/storage/rdd/json,null,AVAILABLE,@Spark}
2018-05-04 11:01:37 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@2024293c{/environment,null,AVAILABLE,@Spark}
2018-05-04 11:01:37 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@7048f722{/environment/json,null,AVAILABLE,@Spark}
2018-05-04 11:01:37 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@c074c0c{/executors,null,AVAILABLE,@Spark}
2018-05-04 11:01:37 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@58a55449{/executors/json,null,AVAILABLE,@Spark}
2018-05-04 11:01:37 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@5949eba8{/executors/threadDump,null,AVAILABLE,@Spark}
2018-05-04 11:01:37 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@6e0ff644{/executors/threadDump/json,null,AVAILABLE,@Spark}
2018-05-04 11:01:37 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@58dea0a5{/static,null,AVAILABLE,@Spark}
2018-05-04 11:01:37 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@50b8ae8d{/,null,AVAILABLE,@Spark}
2018-05-04 11:01:37 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@255990cc{/api,null,AVAILABLE,@Spark}
2018-05-04 11:01:37 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@40e4ea87{/jobs/job/kill,null,AVAILABLE,@Spark}
2018-05-04 11:01:37 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@58783f6c{/stages/stage/kill,null,AVAILABLE,@Spark}
2018-05-04 11:01:37 INFO  SparkUI:54 - Bound SparkUI to 0.0.0.0, and started at http://192.168.0.107:4040
2018-05-04 11:01:37 INFO  SparkContext:54 - Added JAR file:/home/ducvu/projects/geotrellis-landsat-emr-demo/ingest/target/scala-2.11/ingest-assembly-0.1.0.jar at spark://192.168.0.107:36911/jars/ingest-assembly-0.1.0.jar with timestamp 1525406497164
2018-05-04 11:01:37 INFO  Executor:54 - Starting executor ID driver on host localhost
2018-05-04 11:01:37 INFO  Utils:54 - Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 33925.
2018-05-04 11:01:37 INFO  NettyBlockTransferService:54 - Server created on 192.168.0.107:33925
2018-05-04 11:01:37 INFO  BlockManager:54 - Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
2018-05-04 11:01:37 INFO  BlockManagerMaster:54 - Registering BlockManager BlockManagerId(driver, 192.168.0.107, 33925, None)
2018-05-04 11:01:37 INFO  BlockManagerMasterEndpoint:54 - Registering block manager 192.168.0.107:33925 with 2004.6 MB RAM, BlockManagerId(driver, 192.168.0.107, 33925, None)
2018-05-04 11:01:37 INFO  BlockManagerMaster:54 - Registered BlockManager BlockManagerId(driver, 192.168.0.107, 33925, None)
2018-05-04 11:01:37 INFO  BlockManager:54 - Initialized BlockManager: BlockManagerId(driver, 192.168.0.107, 33925, None)
2018-05-04 11:01:37 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@66434cc8{/metrics/json,null,AVAILABLE,@Spark}
2018-05-04 11:01:41 INFO  TemporalMultibandLandsatInput:49 - Found 0 landsat images
[ERROR] [05/04/2018 11:01:41.473] [00http_query_66de0202-cc9d-462d-a121-c70eb4333f3a-akka.actor.default-dispatcher-2] [akka.actor.ActorSystemImpl(00http_query_66de0202-cc9d-462d-a121-c70eb4333f3a)] Outgoing request stream error (akka.stream.AbruptTerminationException)
Exception in thread "main" java.lang.IllegalArgumentException: requirement failed: Number of partitions (0) must be positive.
        at scala.Predef$.require(Predef.scala:224)
        at org.apache.spark.rdd.RDD$$anonfun$coalesce$1.apply(RDD.scala:450)
        at org.apache.spark.rdd.RDD$$anonfun$coalesce$1.apply(RDD.scala:449)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
        at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
        at org.apache.spark.rdd.RDD.coalesce(RDD.scala:449)
        at org.apache.spark.rdd.RDD$$anonfun$repartition$1.apply(RDD.scala:421)
        at org.apache.spark.rdd.RDD$$anonfun$repartition$1.apply(RDD.scala:421)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
        at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
        at org.apache.spark.rdd.RDD.repartition(RDD.scala:420)
        at demo.etl.landsat.LandsatInput.fetch(LandsatInput.scala:80)
        at demo.etl.landsat.TemporalMultibandLandsatInput.apply(TemporalMultibandLandsatInput.scala:53)
        at demo.LandsatIngestMain$$anonfun$main$1.apply(LandsatIngest.scala:69)
        at demo.LandsatIngestMain$$anonfun$main$1.apply(LandsatIngest.scala:65)
        at scala.collection.immutable.List.foreach(List.scala:381)
        at demo.LandsatIngestMain$.main(LandsatIngest.scala:65)
        at demo.LandsatIngestMain.main(LandsatIngest.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:879)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:197)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:227)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:136)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
2018-05-04 11:01:41 INFO  SparkContext:54 - Invoking stop() from shutdown hook
2018-05-04 11:01:41 INFO  AbstractConnector:318 - Stopped Spark@3d4d3fe7{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2018-05-04 11:01:41 INFO  SparkUI:54 - Stopped Spark web UI at http://192.168.0.107:4040
2018-05-04 11:01:41 INFO  MapOutputTrackerMasterEndpoint:54 - MapOutputTrackerMasterEndpoint stopped!
2018-05-04 11:01:41 INFO  MemoryStore:54 - MemoryStore cleared
2018-05-04 11:01:41 INFO  BlockManager:54 - BlockManager stopped
2018-05-04 11:01:41 INFO  BlockManagerMaster:54 - BlockManagerMaster stopped
2018-05-04 11:01:41 INFO  OutputCommitCoordinator$OutputCommitCoordinatorEndpoint:54 - OutputCommitCoordinator stopped!
2018-05-04 11:01:41 INFO  SparkContext:54 - Successfully stopped SparkContext
2018-05-04 11:01:41 INFO  ShutdownHookManager:54 - Shutdown hook called
2018-05-04 11:01:41 INFO  ShutdownHookManager:54 - Deleting directory /tmp/spark-7209680f-23da-4983-b18c-f29360175b15
2018-05-04 11:01:41 INFO  ShutdownHookManager:54 - Deleting directory /tmp/spark-70e0f1c9-87a6-4a8c-85ae-8c0d4552ed9b
Makefile:136: recipe for target 'local-ingest' failed
make: *** [local-ingest] Error 1
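The key line is "Found 0 landsat images" just before the failure: the scene query in input-local.json matched nothing, so LandsatInput.fetch ends up calling RDD.repartition(0), and Spark requires a positive partition count. As a minimal illustration of the guard the caller would need (this is a hypothetical sketch in plain Python, not the demo's actual Scala code; partitions_for is an invented name):

```python
def partitions_for(image_count, partitions_per_image=1):
    """Hypothetical guard mirroring Spark's requirement that
    RDD.repartition receive a positive partition count.

    Returns None to signal "no work to do" when the scene query
    came back empty (the "Found 0 landsat images" case above),
    instead of passing 0 through to repartition.
    """
    if image_count <= 0:
        return None  # skip the ingest rather than call repartition(0)
    # At least one partition, scaled by how many partitions each image needs.
    return max(1, image_count * partitions_per_image)
```

In practice the fix is usually upstream of any guard: check the query parameters in conf/input-local.json (date range, bounding box, cloud cover limits) so the Landsat scene search actually returns images.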