Conversation

@zsxwing
Member

@zsxwing zsxwing commented Jan 11, 2016

@JoshRosen
Contributor

Jenkins, retest this please.

@SparkQA

SparkQA commented Jan 11, 2016

Test build #49095 has finished for PR 10692 at commit a3e3e17.

  • This patch fails from timeout after a configured wait of 250m.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Jan 11, 2016

Test build #49116 has finished for PR 10692 at commit a3e3e17.

  • This patch fails from timeout after a configured wait of 250m.
  • This patch merges cleanly.
  • This patch adds no public classes.

@zsxwing
Member Author

zsxwing commented Jan 11, 2016

retest this please

@SparkQA

SparkQA commented Jan 11, 2016

Test build #49157 has finished for PR 10692 at commit a3e3e17.

  • This patch fails PySpark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@zsxwing
Member Author

zsxwing commented Jan 11, 2016

Traceback (most recent call last):
  File "/usr/lib64/python2.6/runpy.py", line 122, in _run_module_as_main
    "__main__", fname, loader, pkg_name)
  File "/usr/lib64/python2.6/runpy.py", line 34, in _run_code
    exec code in run_globals
  File "/home/jenkins/workspace/SparkPullRequestBuilder/python/pyspark/streaming/tests.py", line 1638, in <module>
    kafka_assembly_jar = search_kafka_assembly_jar()
  File "/home/jenkins/workspace/SparkPullRequestBuilder/python/pyspark/streaming/tests.py", line 1564, in search_kafka_assembly_jar
    "remove all but one") % (", ".join(jars)))
Exception: Found multiple Spark Streaming Kafka assembly JARs: /home/jenkins/workspace/SparkPullRequestBuilder/external/kafka-assembly/target/scala-2.10/spark-streaming-kafka-assembly-2.0.0-SNAPSHOT.jar, /home/jenkins/workspace/SparkPullRequestBuilder/external/kafka-assembly/target/spark-streaming-kafka-assembly_2.10-spark-581766.jar; please remove all but one

This is pretty weird. Where did /home/jenkins/workspace/SparkPullRequestBuilder/external/kafka-assembly/target/spark-streaming-kafka-assembly_2.10-spark-581766.jar come from? @JoshRosen did you see any similar issue?
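For reference, the check that raises this error works roughly like the following sketch. This is a simplified, hypothetical reconstruction of the harness logic in python/pyspark/streaming/tests.py, not the actual code; the `spark_home` parameter and glob patterns are assumptions for illustration.

```python
import glob
import os


def search_kafka_assembly_jar(spark_home):
    """Return the single Kafka assembly JAR under external/kafka-assembly/target,
    raising if zero or more than one candidate is found."""
    target_dir = os.path.join(spark_home, "external", "kafka-assembly", "target")
    # Match JARs both directly under target/ and one level down (e.g. scala-2.10/).
    jars = (glob.glob(os.path.join(target_dir, "spark-streaming-kafka-assembly*.jar")) +
            glob.glob(os.path.join(target_dir, "*", "spark-streaming-kafka-assembly*.jar")))
    if not jars:
        raise Exception("Failed to find Spark Streaming Kafka assembly JAR in %s"
                        % target_dir)
    if len(jars) > 1:
        # Both the regular build output under scala-2.10/ and a stray JAR left in
        # target/ by another build step would match, producing the error above.
        raise Exception("Found multiple Spark Streaming Kafka assembly JARs: %s; "
                        "please remove all but one" % ", ".join(sorted(jars)))
    return jars[0]
```

In this PR's failure, the stray `spark-streaming-kafka-assembly_2.10-spark-581766.jar` sat directly in `target/` alongside the normal assembly under `scala-2.10/`, so the second branch fired.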

@zsxwing
Member Author

zsxwing commented Jan 11, 2016

retest this please

@JoshRosen
Contributor

@zsxwing, that file is generated as part of the dummy publishing step used to produce POMs for the dev/test-dependencies checks. Ping me if it happens again and I'll look at the build logs to see if I can figure out why those aren't being cleaned up. We might have to add an additional clean step somewhere.

@JoshRosen
Contributor

Actually, I think I've spotted the problem. Working on a small fix now.

@JoshRosen
Contributor

I've opened #10704 to fix this.

@zsxwing
Member Author

zsxwing commented Jan 11, 2016

By the way, I packaged py4j-0.9.1-src.zip myself since I could not find it anywhere.

@JoshRosen
Contributor

Jenkins, retest this please.

@JoshRosen
Contributor

PY4J zip packaging and source diffs look good to me, so I think this is good-to-go pending tests.

@zsxwing
Member Author

zsxwing commented Jan 11, 2016

PY4J zip packaging and source diffs look good to me, so I think this is good-to-go pending tests.

This is not ready for merging. I plan to revert our workarounds for Py4J issues in this PR.

@SparkQA

SparkQA commented Jan 11, 2016

Test build #49164 has finished for PR 10692 at commit a3e3e17.

  • This patch fails PySpark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

…nDStream.registerSerializer is called only once"

    This partially reverts commit 6cfe341 because the fix to make sure PySpark reads the checkpoint only once is still worth keeping
@SparkQA

SparkQA commented Jan 11, 2016

Test build #49170 has finished for PR 10692 at commit a3e3e17.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@zsxwing zsxwing changed the title [SPARK-12652][PySpark][WIP]Upgrade Py4J to 0.9.1 [SPARK-12652][PySpark]Upgrade Py4J to 0.9.1 Jan 11, 2016
@SparkQA

SparkQA commented Jan 12, 2016

Test build #49188 has finished for PR 10692 at commit bfd4b5c.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@zsxwing
Member Author

zsxwing commented Jan 12, 2016

@JoshRosen @davies could you take a look?

@davies
Contributor

davies commented Jan 12, 2016

@zsxwing Could you also update the patch for the callback server to use the new API? See py4j/py4j#147

@zsxwing
Member Author

zsxwing commented Jan 12, 2016

@zsxwing Could you also update the patch for the callback server to use the new API? See py4j/py4j#147

Updated

@SparkQA

SparkQA commented Jan 12, 2016

Test build #49215 has finished for PR 10692 at commit 3db993c.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@zsxwing
Member Author

zsxwing commented Jan 12, 2016

@davies could you take a look at my latest commit?

@davies
Contributor

davies commented Jan 12, 2016

LGTM

@zsxwing
Member Author

zsxwing commented Jan 12, 2016

Thanks, merging to master

@asfgit asfgit closed this in 4f60651 Jan 12, 2016
@zsxwing zsxwing deleted the py4j-0.9.1 branch January 12, 2016 22:29
@sarathjiguru

Hello @zsxwing,

This PR is not merged into branch-1.6. Is there a reason for that?
I believe the latest version of Py4J is needed for production-ready Spark Streaming jobs. Please let me know if that is not the case.

Commits on Jan 13:
Master: https://github.com/apache/spark/commits/master?page=38
v1.6.1: https://github.com/apache/spark/commits/v1.6.1?page=3
branch-1.6: https://github.com/apache/spark/commits/branch-1.6?page=5

This particular commit is not merged into branch-1.6.

@zsxwing
Member Author

zsxwing commented May 2, 2016

This PR is not merged into branch-1.6. Is there a reason for that?

@sarathjiguru since we have already merged the workarounds into branch-1.6, it's not necessary to upgrade Py4J for 1.6.
