Conversation

@tdas (Contributor) commented Jun 28, 2016

What changes were proposed in this pull request?

  • Moved DataStreamReader/Writer from pyspark.sql to pyspark.sql.streaming to make them consistent with the Scala packaging (see the import sketch below)
  • Exposed the necessary classes in the sql.streaming package so that they appear in the docs
  • Added the pyspark.sql.streaming module to the docs

How was this patch tested?

  • Updated the unit tests.
  • Generated the docs to verify that the pyspark.sql.streaming classes are visible.
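
As a quick illustration of the re-packaging described above, here is a minimal sketch of how the classes are reached after the move; the app name, schema, and input path are placeholders, not part of the PR itself.

```python
# Minimal sketch of the new import path after this change; the app name,
# schema, and input path are placeholders, not part of the PR itself.
from pyspark.sql import SparkSession
from pyspark.sql.streaming import DataStreamReader, DataStreamWriter
from pyspark.sql.types import StructType, StringType

spark = SparkSession.builder.appName("streaming-packaging-check").getOrCreate()

# spark.readStream now hands back a pyspark.sql.streaming.DataStreamReader.
reader = spark.readStream
assert isinstance(reader, DataStreamReader)

# Building a streaming DataFrame from a directory of JSON files would look like
# this (the path is hypothetical, so it is left commented out):
schema = StructType().add("value", StringType())
# stream_df = reader.schema(schema).json("/tmp/streaming-input")
# stream_df.writeStream would likewise be a pyspark.sql.streaming.DataStreamWriter.

spark.stop()
```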

@tdas (Contributor Author) commented Jun 28, 2016

@zsxwing Can you take a look?

SparkQA commented Jun 28, 2016

Test build #61417 has finished for PR 13955 at commit 94d6019.

  • This patch fails Python style tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

SparkQA commented Jun 28, 2016

Test build #61419 has finished for PR 13955 at commit 8269e4b.

  • This patch fails PySpark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

SparkQA commented Jun 28, 2016

Test build #61420 has finished for PR 13955 at commit 7dc42d9.

  • This patch fails PySpark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

SparkQA commented Jun 29, 2016

Test build #61431 has finished for PR 13955 at commit cb517c1.

  • This patch fails PySpark unit tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
    • public final class JavaStructuredNetworkWordCount

Inline review comment on streaming.py, at the definition of class DataStreamReader(OptionUtils):

Contributor:
Can we create a separate file? This file is already very long.

@tdas (Contributor Author):
Then we would probably have to create a streaming directory and move all the code in streaming.py into separate files. I wanted to minimize the changes; we can do that bookkeeping later. What do you think?
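
For context on this exchange, the split being deferred here would only change the file layout behind pyspark.sql.streaming, not the import path callers use, so a later move from a single streaming.py to a streaming/ package with a re-exporting __init__.py would be invisible to user code. A small check of that point, assuming a local pyspark installation:

```python
# Callers only see the pyspark.sql.streaming module path, not the file layout
# behind it, so a later module-to-package split would be transparent to them.
import pyspark.sql.streaming as streaming

# Today this points at pyspark/sql/streaming.py; after a hypothetical split it
# would point at a package __init__.py instead.
print(streaming.__file__)

# The classes exposed by this PR stay reachable through the same module either way.
print(hasattr(streaming, "DataStreamReader"), hasattr(streaming, "DataStreamWriter"))
```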

SparkQA commented Jun 29, 2016

Test build #61432 has finished for PR 13955 at commit e8f4f1d.

  • This patch fails PySpark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

SparkQA commented Jun 29, 2016

Test build #61439 has finished for PR 13955 at commit 50626bd.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@zsxwing (Member) commented Jun 29, 2016

LGTM. Merging to master and 2.0. Thanks!

@asfgit closed this in f454a7f on Jun 29, 2016
asfgit pushed a commit that referenced this pull request on Jun 29, 2016, with the (truncated) title:
…rk.sql to pyspark.sql.streaming

## What changes were proposed in this pull request?

- Moved DataStreamReader/Writer from pyspark.sql to pyspark.sql.streaming to make them consistent with scala packaging
- Exposed the necessary classes in sql.streaming package so that they appear in the docs
- Added pyspark.sql.streaming module to the docs

## How was this patch tested?
- updated unit tests.
- generated docs for testing visibility of pyspark.sql.streaming classes.

Author: Tathagata Das <tathagata.das1565@gmail.com>

Closes #13955 from tdas/SPARK-16266.