Commit 6ff7d1f

(DOCSP-21416) Update connector api, version constant, and release notes for v10 (#94)
* (DOCSP-21416) Finalize v10 docs
1 parent: 721045f

17 files changed: +50 -496 lines

snooty.toml
Lines changed: 1 addition & 1 deletion

@@ -6,7 +6,7 @@ intersphinx = ["https://www.mongodb.com/docs/manual/objects.inv"]
 toc_landing_pages = ["configuration"]
 
 [constants]
-current-version = "10.0"
+current-version = "10.0.0"
 spark-core-version = "3.0.1"
 spark-sql-version = "3.0.1"
 scala-version = "2.12"

source/includes/extracts-command-line.yaml
Lines changed: 8 additions & 8 deletions

@@ -17,11 +17,11 @@ content: |
 ---
 ref: list-configuration-explanation
 content: |
-  - The :ref:`spark.mongodb.input.uri <spark-input-conf>` specifies the
+  - The :ref:`spark.mongodb.read.uri <spark-input-conf>` specifies the
     MongoDB server address (``127.0.0.1``), the database to connect
     (``test``), and the collection (``myCollection``) from which to read
     data, and the read preference.
-  - The :ref:`spark.mongodb.output.uri <spark-output-conf>` specifies the
+  - The :ref:`spark.mongodb.write.uri <spark-output-conf>` specifies the
     MongoDB server address (``127.0.0.1``), the database to connect
     (``test``), and the collection (``myCollection``) to which to write
     data. Connects to port ``27017`` by default.
@@ -37,8 +37,8 @@ content: |
 
    .. code-block:: sh
 
-      ./bin/spark-shell --conf "spark.mongodb.input.uri=mongodb://127.0.0.1/test.myCollection?readPreference=primaryPreferred" \
-                        --conf "spark.mongodb.output.uri=mongodb://127.0.0.1/test.myCollection" \
+      ./bin/spark-shell --conf "spark.mongodb.read.uri=mongodb://127.0.0.1/test.myCollection?readPreference=primaryPreferred" \
+                        --conf "spark.mongodb.write.uri=mongodb://127.0.0.1/test.myCollection" \
                         --packages org.mongodb.spark:mongo-spark-connector_{+scala-version+}:{+current-version+}
 
    .. include:: /includes/extracts/list-configuration-explanation.rst
@@ -54,8 +54,8 @@ content: |
 
    .. code-block:: sh
 
-      ./bin/pyspark --conf "spark.mongodb.input.uri=mongodb://127.0.0.1/test.myCollection?readPreference=primaryPreferred" \
-                    --conf "spark.mongodb.output.uri=mongodb://127.0.0.1/test.myCollection" \
+      ./bin/pyspark --conf "spark.mongodb.read.uri=mongodb://127.0.0.1/test.myCollection?readPreference=primaryPreferred" \
+                    --conf "spark.mongodb.write.uri=mongodb://127.0.0.1/test.myCollection" \
                     --packages org.mongodb.spark:mongo-spark-connector_{+scala-version+}:{+current-version+}
 
    .. include:: /includes/extracts/list-configuration-explanation.rst
@@ -71,8 +71,8 @@ content: |
 
    .. code-block:: sh
 
-      ./bin/sparkR --conf "spark.mongodb.input.uri=mongodb://127.0.0.1/test.myCollection?readPreference=primaryPreferred" \
-                   --conf "spark.mongodb.output.uri=mongodb://127.0.0.1/test.myCollection" \
+      ./bin/sparkR --conf "spark.mongodb.read.uri=mongodb://127.0.0.1/test.myCollection?readPreference=primaryPreferred" \
+                   --conf "spark.mongodb.write.uri=mongodb://127.0.0.1/test.myCollection" \
                    --packages org.mongodb.spark:mongo-spark-connector_{+scala-version+}:{+current-version+}
 
    .. include:: /includes/extracts/list-configuration-explanation.rst
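
A minimal Java sketch of the same configuration set on the session builder instead of via --conf flags. The option keys come from the diff above; the class and main wrapper are illustrative, not part of the docs source.

    import org.apache.spark.sql.SparkSession;

    public final class ConnectorConfigExample {
        public static void main(String[] args) {
            // Session-level read/write URIs, using the renamed v10 keys
            SparkSession spark = SparkSession.builder()
                    .master("local")
                    .appName("ConnectorConfigExample")
                    .config("spark.mongodb.read.uri",
                            "mongodb://127.0.0.1/test.myCollection?readPreference=primaryPreferred")
                    .config("spark.mongodb.write.uri",
                            "mongodb://127.0.0.1/test.myCollection")
                    .getOrCreate();

            spark.stop();
        }
    }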

source/includes/list-prerequisites.rst
Lines changed: 2 additions & 4 deletions

@@ -2,8 +2,6 @@
 :manual:`MongoDB documentation </>` and `Spark documentation
 <https://spark.apache.org/docs/latest/>`_ for more details.
 
-- Running MongoDB instance (version 2.6 or later).
+- Running MongoDB instance (version 3.6 or later).
 
-- Spark 2.4.x.
-
-- Scala 2.12.x
+- Spark 3.2.x.

Lines changed: 2 additions & 2 deletions

@@ -1,8 +1,8 @@
 
 .. note::
 
-   If you use ``SparkConf`` to set the connector's input
+   If you use ``SparkConf`` to set the connector's read
    configurations, prefix each property with
-   ``spark.mongodb.input.partitionerOptions.``
+   ``spark.mongodb.read.partitionerOptions.``
    instead of ``partitioner.options.``.
 
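A short Java sketch of the prefix rule in this note; ``partitionSizeMB`` is an illustrative option name, not taken from this diff.

    import org.apache.spark.SparkConf;

    // SparkConf-level partitioner option: full prefix required
    SparkConf conf = new SparkConf()
            .set("spark.mongodb.read.partitionerOptions.partitionSizeMB", "64");

    // A per-read configuration would instead use the short form
    // "partitioner.options.partitionSizeMB" (also illustrative).
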
source/index.txt
Lines changed: 6 additions & 73 deletions

@@ -41,80 +41,13 @@ versions of Apache Spark and MongoDB:
      - MongoDB Version
 
    * - **{+current-version+}**
-     - **3.0.x**, **3.1.x**
-     - **2.6 or later**
+     - **3.2.x**
+     - **3.6 or later**
 
-   * - 2.4.4
-     - 2.4.x
-     - 2.6 or later
+Announcements
+-------------
 
-   * - 2.3.6
-     - 2.3.x
-     - 2.6 or later
-
-   * - 2.2.10
-     - 2.2.x
-     - 2.6 or later
-
-   * - 2.1.9
-     - 2.1.x
-     - 2.6 or later
-
-   * - 2.0.0
-     - 2.0.x
-     - 2.6 or later
-
-   * - 1.1.0
-     - 1.6.x
-     - 2.6 or later
-
-
-----
-
-.. note:: **Announcements**
-
-   - **August 17, 2020**, `MongoDB Connector for Spark version v3.0.0
-     <https://www.mongodb.com/products/spark-connector>`_ Released.
-
-   - **Jun 10, 2020**, `MongoDB Connector for Spark versions v2.4.2,
-     v2.3.4, v2.2.8, and v2.1.7
-     <https://www.mongodb.com/products/spark-connector>`_ Released.
-
-   - **Jun 06, 2019**, `MongoDB Connector for Spark versions v2.4.1,
-     v2.3.3, v2.2.7, and v2.1.6
-     <https://www.mongodb.com/products/spark-connector>`_ Released.
-
-   - **Dec 07, 2018**, `MongoDB Connector for Spark versions v2.4.0,
-     v2.3.2, v2.2.6, and v2.1.5
-     <https://www.mongodb.com/products/spark-connector>`_ Released.
-
-   - **Oct 08, 2018**, `MongoDB Connector for Spark versions v2.3.1,
-     v2.2.5, and v2.1.4
-     <https://www.mongodb.com/products/spark-connector>`_ Released.
-
-   - **July 30, 2018**, `MongoDB Connector for Spark versions v2.3.0,
-     v2.2.4, and v2.1.3
-     <https://www.mongodb.com/products/spark-connector>`_ Released.
-
-   - **June 19, 2018**, `MongoDB Connector for Spark versions v2.2.3 and
-     v2.1.2
-     <https://www.mongodb.com/products/spark-connector>`_ Released.
-
-   - **April 18, 2018**, `MongoDB Connector for Spark version v2.2.2
-     <https://www.mongodb.com/products/spark-connector>`_ Released.
-
-   - **Oct 31, 2017**, `MongoDB Connector for Spark version v2.2.1
-     <https://www.mongodb.com/products/spark-connector>`_ Released.
-
-   - **July 13, 2017**, `MongoDB Connector for Spark version v2.2.0
-     <https://www.mongodb.com/products/spark-connector>`_ Released.
-
-   - **July 12, 2017**, `MongoDB Connector for Spark versions v2.2.0 and
-     v2.1.0
-     <https://www.mongodb.com/products/spark-connector>`_ Released.
-
-   - **November 1, 2016**, `MongoDB Connector for Spark v2.0.0
-     <https://www.mongodb.com/products/spark-connector>`_ Released.
+- **March 31, 2022**, `MongoDB Connector for Spark version v10.0.0 <https://www.mongodb.com/products/spark-connector>`_ released.
 
 .. toctree::
    :titlesonly:
@@ -126,4 +59,4 @@ versions of Apache Spark and MongoDB:
    structured-streaming
    faq
    release-notes
-   API Docs <https://www.javadoc.io/doc/org.mongodb.spark/mongo-spark-connector_{+scala-version+}/{+current-version+}>
+   API Docs <https://www.javadoc.io/doc/org.mongodb.spark/mongo-spark-connector/latest/index.html>

source/java/aggregation.txt
Lines changed: 2 additions & 2 deletions

@@ -20,8 +20,8 @@
       SparkSession spark = SparkSession.builder()
         .master("local")
         .appName("MongoSparkConnectorIntro")
-        .config("spark.mongodb.input.uri", "mongodb://127.0.0.1/test.myCollection")
-        .config("spark.mongodb.output.uri", "mongodb://127.0.0.1/test.myCollection")
+        .config("spark.mongodb.read.uri", "mongodb://127.0.0.1/test.myCollection")
+        .config("spark.mongodb.write.uri", "mongodb://127.0.0.1/test.myCollection")
         .getOrCreate();
 
       // Create a JavaSparkContext using the SparkSession's SparkContext object
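
A hedged sketch of running an aggregation through the DataFrame API, reusing the ``spark`` session from the snippet above. The ``mongodb`` source name and the ``aggregation.pipeline`` option are assumptions about the v10 API, not shown in this diff.

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;

    // Push a $match stage down to the server; the pipeline is extended JSON
    Dataset<Row> filtered = spark.read()
            .format("mongodb")
            .option("aggregation.pipeline", "[{ \"$match\": { \"age\": { \"$gte\": 21 } } }]")
            .load();
    filtered.show();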

source/java/api.txt
Lines changed: 4 additions & 4 deletions

@@ -59,8 +59,8 @@ The Java API provides a ``JavaSparkContext`` that takes a
       SparkSession spark = SparkSession.builder()
         .master("local")
         .appName("MongoSparkConnectorIntro")
-        .config("spark.mongodb.input.uri", "mongodb://127.0.0.1/test.myCollection")
-        .config("spark.mongodb.output.uri", "mongodb://127.0.0.1/test.myCollection")
+        .config("spark.mongodb.read.uri", "mongodb://127.0.0.1/test.myCollection")
+        .config("spark.mongodb.write.uri", "mongodb://127.0.0.1/test.myCollection")
         .getOrCreate();
 
       // Create a JavaSparkContext using the SparkSession's SparkContext object
@@ -73,12 +73,12 @@ The Java API provides a ``JavaSparkContext`` that takes a
     }
   }
 
-- The :ref:`spark.mongodb.input.uri <spark-input-conf>` specifies the
+- The :ref:`spark.mongodb.read.uri <spark-input-conf>` specifies the
   MongoDB server address(``127.0.0.1``), the database to connect
   (``test``), and the collection (``myCollection``) from which to read
  data, and the read preference.
 
-- The :ref:`spark.mongodb.output.uri <spark-output-conf>` specifies the
+- The :ref:`spark.mongodb.write.uri <spark-output-conf>` specifies the
   MongoDB server address(``127.0.0.1``), the database to connect
   (``test``), and the collection (``myCollection``) to which to write
   data.
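
For reference, a self-contained version of the snippet in this diff, with the imports the excerpt omits; the class name and placeholder body are illustrative.

    import org.apache.spark.api.java.JavaSparkContext;
    import org.apache.spark.sql.SparkSession;

    public final class MongoSparkConnectorIntro {
        public static void main(String[] args) {
            SparkSession spark = SparkSession.builder()
                    .master("local")
                    .appName("MongoSparkConnectorIntro")
                    .config("spark.mongodb.read.uri", "mongodb://127.0.0.1/test.myCollection")
                    .config("spark.mongodb.write.uri", "mongodb://127.0.0.1/test.myCollection")
                    .getOrCreate();

            // Create a JavaSparkContext using the SparkSession's SparkContext object
            JavaSparkContext jsc = new JavaSparkContext(spark.sparkContext());

            // ... application code ...

            jsc.close();
            spark.stop();
        }
    }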

source/java/datasets-and-sql.txt
Lines changed: 2 additions & 2 deletions

@@ -46,8 +46,8 @@ Consider a collection named ``characters``:
       SparkSession spark = SparkSession.builder()
         .master("local")
         .appName("MongoSparkConnectorIntro")
-        .config("spark.mongodb.input.uri", "mongodb://127.0.0.1/test.myCollection")
-        .config("spark.mongodb.output.uri", "mongodb://127.0.0.1/test.myCollection")
+        .config("spark.mongodb.read.uri", "mongodb://127.0.0.1/test.myCollection")
+        .config("spark.mongodb.write.uri", "mongodb://127.0.0.1/test.myCollection")
         .getOrCreate();
 
       // Create a JavaSparkContext using the SparkSession's SparkContext object
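
A hedged sketch of the Datasets-and-SQL flow with the renamed configuration, reusing ``spark`` from above: load the ``characters`` collection, register a temp view, and query it with Spark SQL. The ``mongodb`` source name and the explicit database/collection options are assumptions about the v10 API; the field names are illustrative.

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;

    // Load the collection as a DataFrame and query it with Spark SQL
    Dataset<Row> characters = spark.read()
            .format("mongodb")
            .option("database", "test")
            .option("collection", "characters")
            .load();
    characters.createOrReplaceTempView("characters");
    spark.sql("SELECT name, age FROM characters WHERE age >= 100").show();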

source/java/read-from-mongodb.txt
Lines changed: 4 additions & 4 deletions

@@ -21,8 +21,8 @@ saved as part of the :ref:`write example <java-write>`.
       SparkSession spark = SparkSession.builder()
         .master("local")
         .appName("MongoSparkConnectorIntro")
-        .config("spark.mongodb.input.uri", "mongodb://127.0.0.1/test.myCollection")
-        .config("spark.mongodb.output.uri", "mongodb://127.0.0.1/test.myCollection")
+        .config("spark.mongodb.read.uri", "mongodb://127.0.0.1/test.myCollection")
+        .config("spark.mongodb.write.uri", "mongodb://127.0.0.1/test.myCollection")
         .getOrCreate();
 
       // Create a JavaSparkContext using the SparkSession's SparkContext object
@@ -69,8 +69,8 @@ Specify a ``ReadConfig``
       SparkSession spark = SparkSession.builder()
         .master("local")
         .appName("MongoSparkConnectorIntro")
-        .config("spark.mongodb.input.uri", "mongodb://127.0.0.1/test.myCollection")
-        .config("spark.mongodb.output.uri", "mongodb://127.0.0.1/test.myCollection")
+        .config("spark.mongodb.read.uri", "mongodb://127.0.0.1/test.myCollection")
+        .config("spark.mongodb.write.uri", "mongodb://127.0.0.1/test.myCollection")
         .getOrCreate();
 
       // Create a JavaSparkContext using the SparkSession's SparkContext object
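
A hedged sketch of a read with the session configured as above. With no per-read options, the connector would fall back to the session-level read URI; the ``mongodb`` source name is an assumption about the v10 API, not shown in this diff.

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;

    // Load the collection and print the schema inferred from sampled documents
    Dataset<Row> df = spark.read().format("mongodb").load();
    df.printSchema();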

source/java/write-to-mongodb.txt
Lines changed: 4 additions & 4 deletions

@@ -21,8 +21,8 @@
       SparkSession spark = SparkSession.builder()
         .master("local")
         .appName("MongoSparkConnectorIntro")
-        .config("spark.mongodb.input.uri", "mongodb://127.0.0.1/test.myCollection")
-        .config("spark.mongodb.output.uri", "mongodb://127.0.0.1/test.myCollection")
+        .config("spark.mongodb.read.uri", "mongodb://127.0.0.1/test.myCollection")
+        .config("spark.mongodb.write.uri", "mongodb://127.0.0.1/test.myCollection")
         .getOrCreate();
 
       JavaSparkContext jsc = new JavaSparkContext(spark.sparkContext());
@@ -77,8 +77,8 @@ Using a ``WriteConfig``
       SparkSession spark = SparkSession.builder()
         .master("local")
         .appName("MongoSparkConnectorIntro")
-        .config("spark.mongodb.input.uri", "mongodb://127.0.0.1/test.myCollection")
-        .config("spark.mongodb.output.uri", "mongodb://127.0.0.1/test.myCollection")
+        .config("spark.mongodb.read.uri", "mongodb://127.0.0.1/test.myCollection")
+        .config("spark.mongodb.write.uri", "mongodb://127.0.0.1/test.myCollection")
         .getOrCreate();
 
       JavaSparkContext jsc = new JavaSparkContext(spark.sparkContext());
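
A hedged sketch of the corresponding write, assuming ``df`` is a Dataset<Row> built earlier in the page's example. The ``mongodb`` source name and the explicit save mode are assumptions about the v10 API, not shown in this diff.

    // Append the DataFrame's rows to the collection named in the write URI
    df.write()
            .format("mongodb")
            .mode("append")
            .save();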
