Conversation

@srowen srowen commented Mar 1, 2019

What changes were proposed in this pull request?

Docs still say that Spark will be available on PyPI "in the future"; this just needs to be updated.

How was this patch tested?

Doc build

@srowen srowen self-assigned this Mar 1, 2019

SparkQA commented Mar 1, 2019

Test build #102931 has finished for PR 23933 at commit 859d3ba.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.


@dongjoon-hyun dongjoon-hyun left a comment


+1, LGTM.

@HyukjinKwon

Merged to master.

 Users can also download a "Hadoop free" binary and run Spark with any Hadoop version
 [by augmenting Spark's classpath](hadoop-provided.html).
-Scala and Java users can include Spark in their projects using its Maven coordinates and in the future Python users can also install Spark from PyPI.
+Scala and Java users can include Spark in their projects using its Maven coordinates and Python users can install Spark from PyPI.
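For reference, the PyPI path the new wording describes looks like this (a minimal sketch; `pyspark` is the PyPI package name, and the version-check line is illustrative):

```shell
# Install the PySpark distribution from PyPI
# (this installs PySpark, not a full Spark cluster distribution)
pip install pyspark

# Verify the install by printing the packaged Spark version
python -c "import pyspark; print(pyspark.__version__)"
```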

BTW, we might have to clarify that it installs "PySpark" rather than "Spark" ... (given the discussion we had before at #23715)

@srowen srowen deleted the SPARK-26807 branch March 4, 2019 16:25