
Commit 737d450

Author: Chris Cho
fixes to Kafka Connector tutorials (#16)
* fixes to Kafka Connector tutorials
1 parent bcce86b commit 737d450

6 files changed: +43 −39 lines changed

source/tutorials/explore-change-streams.txt

Lines changed: 9 additions & 9 deletions
@@ -40,8 +40,8 @@ Explore Change Streams

 .. step:: Open a Change Stream

-   In **ChangeStreamShell1**, create a Python script to open a change stream using
-   the PyMongo driver.
+   In **ChangeStreamShell1**, create a Python script to open a change
+   stream using the PyMongo driver.

    .. code-block:: bash
       :copyable: true
@@ -62,7 +62,7 @@ Explore Change Streams
       with db.orders.watch() as stream:
         print('\nChange Stream is opened on the Tutorial1.orders namespace. Currently watching ...\n\n')
         for change in stream:
-          print(dumps(change, indent=2))
+          print(dumps(change, indent = 2))

    Run the Python script:

@@ -91,7 +91,7 @@ Explore Change Streams
    After you connect successfully, you should see the following
    MongoDB shell prompt:

-   .. code-block::
+   .. code-block:: none
       :copyable: false

       rs0 [direct: primary] test>
@@ -168,11 +168,11 @@ Explore Change Streams
       from bson.json_util import dumps
       client = pymongo.MongoClient('mongodb://mongo1')
       db = client.get_database(name='Tutorial1')
-      pipeline = [ { "$match": { "$and": [ { "fullDocument.type":"temp" }, { "fullDocument.value":{ "$gte":100 } } ] } } ]
+      pipeline = [ { "$match": { "$and": [ { "fullDocument.type": "temp" }, { "fullDocument.value": { "$gte": 100 } } ] } } ]
       with db.sensors.watch(pipeline=pipeline) as stream:
         print('\nChange Stream is opened on the Tutorial1.sensors namespace. Currently watching for values > 100...\n\n')
         for change in stream:
-          print(dumps(change, indent=2))
+          print(dumps(change, indent = 2))

    Run the Python script:

@@ -205,15 +205,15 @@ Explore Change Streams
    .. code-block:: json
       :copyable: false

-      [ { "$match": { "$and": [ { "fullDocument.type":"temp" }, { "fullDocument.value":{ "$gte":100 } } ] } } ]
+      [ { "$match": { "$and": [ { "fullDocument.type": "temp" }, { "fullDocument.value": { "$gte": 100 } } ] } } ]

    Try inserting the following documents in in **ChangeStreamShell2** to verify the
    change stream only produces events when the documents match the filter:

    .. code-block:: javascript

-      db.sensors.insertOne( { 'type' : 'temp', 'value':99 } )
-      db.sensors.insertOne( { 'type' : 'pressure', 'value':22 } )
+      db.sensors.insertOne( { 'type' : 'temp', 'value': 99 } )
+      db.sensors.insertOne( { 'type' : 'pressure', 'value': 22 } )

 .. step:: (Optional) Stop the Docker Containers

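The hunks above show only the changed lines of the tutorial's filtered change stream script. Assembled from those fragments, a minimal sketch of the whole script looks roughly like the following; it assumes the tutorial's Docker network, where ``mongodb://mongo1`` resolves to the replica set member, and the ``import pymongo`` line is implied by the fragments rather than shown in the diff.

.. code-block:: python

   import pymongo
   from bson.json_util import dumps

   client = pymongo.MongoClient('mongodb://mongo1')
   db = client.get_database(name='Tutorial1')

   # Match only events whose full document is a temperature reading of 100 or more.
   pipeline = [
       {"$match": {"$and": [
           {"fullDocument.type": "temp"},
           {"fullDocument.value": {"$gte": 100}},
       ]}}
   ]

   with db.sensors.watch(pipeline=pipeline) as stream:
       print('\nChange Stream is opened on the Tutorial1.sensors namespace. '
             'Currently watching for values > 100...\n\n')
       for change in stream:
           print(dumps(change, indent=2))

Because the ``$match`` stage filters on ``fullDocument`` fields, the ``insertOne`` commands run in **ChangeStreamShell2** produce output only when the inserted document satisfies the filter.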

source/tutorials/migrate-time-series.txt

Lines changed: 2 additions & 6 deletions
@@ -18,10 +18,6 @@ Time series collections efficiently store time series data. Time series
 data consists of measurements taken at time intervals, metadata that describes
 the measurement, and the time of the measurement.

-over a period of time. Time series data consists of measurement data collected
-over time, metadata that describes the measurement, and the time of the
-measurement.
-
 To convert data from a MongoDB collection to a time series collection using
 the connector, you need to perform the following tasks:

@@ -33,8 +29,8 @@ the connector, you need to perform the following tasks:

 In this tutorial, you perform these preceding tasks to migrate stock data
 from a collection to a time series collection. The time series collection
-stores the data more efficiently and retains the ability to analyze stock
-performance over time using aggregation operators.
+stores and indexes the data more efficiently and retains the ability to analyze
+stock performance over time using aggregation operators.

 .. important:: Apple M1 incompatibility

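The reworded paragraph says that a time series collection stores and indexes the data more efficiently while still supporting aggregation operators. As an illustration only (the tutorial drives the migration through connector configuration, which this commit does not touch), here is a hedged PyMongo sketch of creating a time series collection and running such an aggregation; the connection string, database, collection, and field names are assumptions, not taken from the tutorial.

.. code-block:: python

   # Illustrative sketch only; 'Stocks', 'StockDataMigrate', 'time', 'symbol',
   # and 'price' are hypothetical names. Time series collections require
   # MongoDB 5.0 or later.
   import pymongo

   client = pymongo.MongoClient('mongodb://mongo1')
   db = client.get_database(name='Stocks')

   # A time series collection declares its time field (and optional metadata
   # field) when it is created, which lets the server store and index the
   # measurements efficiently.
   ts = db.create_collection(
       'StockDataMigrate',
       timeseries={'timeField': 'time', 'metaField': 'symbol'},
   )

   # Standard aggregation operators still apply, for example an average
   # price per stock symbol.
   for doc in ts.aggregate([
       {'$group': {'_id': '$symbol', 'avgPrice': {'$avg': '$price'}}}
   ]):
       print(doc)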

source/tutorials/replicate-with-cdc.txt

Lines changed: 1 addition & 1 deletion
@@ -251,7 +251,7 @@ Replicate Data with a CDC Handler
    .. code-block:: json

       use CDCTutorial
-      db.Source.insert({ proclaim: "Hello World!" });
+      db.Source.insertOne({ proclaim: "Hello World!" });

    Once MongoDB completes the insert command, you should receive an
    acknowledgment that resembles the following text:

source/tutorials/sink-connector.txt

Lines changed: 14 additions & 14 deletions
@@ -27,23 +27,23 @@ Get Started with the MongoDB Kafka Sink Connector
    Create an interactive shell session on the tutorial Docker Container
    using the following command:

-   .. code-block::
+   .. code-block:: bash
       :copyable: true

       docker run --rm --name SinkTutorialShell --network mongodb-kafka-base_localnet -it robwma/mongokafkatutorial:latest bash

    Create a source configuration file called ``simplesink.json`` with
    the following command:

-   .. code-block::
+   .. code-block:: bash
       :copyable: true

       nano simplesink.json

    Paste the following configuration information into the file and save
    your changes:

-   .. code-block::
+   .. code-block:: json
       :copyable: true
       :emphasize-lines: 7-9

@@ -70,7 +70,7 @@ Get Started with the MongoDB Kafka Sink Connector
    Run the following command in the shell to start the sink connector
    using the configuration file you created:

-   .. code-block::
+   .. code-block:: bash
       :copyable: true

       cx simplesink.json
@@ -116,8 +116,8 @@ Get Started with the MongoDB Kafka Sink Connector

 .. step:: Write Data to a Kafka Topic

-   In the same shell, create a Python script to open a change stream using
-   the PyMongo driver.
+   In the same shell, create a Python script to write data to a Kafka
+   topic.

    .. code-block:: bash
       :copyable: true
@@ -126,18 +126,18 @@ Get Started with the MongoDB Kafka Sink Connector

    Paste the following code into the file and save your changes:

-   .. code-block::
+   .. code-block:: python
       :copyable: true

       from kafka import KafkaProducer
       import json
       from json import dumps

-      p = KafkaProducer(bootstrap_servers=['broker:29092'],value_serializer=lambda x:dumps(x).encode('utf-8'))
+      p = KafkaProducer(bootstrap_servers = ['broker:29092'], value_serializer = lambda x:dumps(x).encode('utf-8'))

-      data={'name':'roscoe'}
+      data = {'name': 'roscoe'}

-      p.send('Tutorial2.pets', value=data)
+      p.send('Tutorial2.pets', value = data)

       p.flush()

@@ -153,31 +153,31 @@ Get Started with the MongoDB Kafka Sink Connector
    In the same shell, connect to MongoDB using ``mongosh``, the MongoDB
    shell by running the following command:

-   .. code-block::
+   .. code-block:: bash
       :copyable: true

       mongosh "mongodb://mongo1"

    After you connect successfully, you should see the following
    MongoDB shell prompt:

-   .. code-block::
+   .. code-block:: none
       :copyable: false

       rs0 [direct: primary] test>

    At the prompt, type the following commands to retrieve all the
    documents in the ``Tutorial2.pets`` MongoDB namespace:

-   .. code-block::
+   .. code-block:: javascript
       :copyable: true

       use Tutorial2
       db.pets.find()

    You should see the following document returned as the result:

-   .. code-block::
+   .. code-block:: json
       :copyable: false

       { _id: ObjectId("62659..."), name: 'roscoe' }
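The producer hunk above shows only the lines that changed in the tutorial's kafka-python script. Assembled from those fragments, a runnable sketch looks roughly like this; it assumes the tutorial's ``broker:29092`` listener, and the unused ``import json`` from the original fragment is omitted.

.. code-block:: python

   from json import dumps

   from kafka import KafkaProducer

   # Serialize each record value as UTF-8 encoded JSON.
   p = KafkaProducer(
       bootstrap_servers=['broker:29092'],
       value_serializer=lambda x: dumps(x).encode('utf-8'),
   )

   data = {'name': 'roscoe'}

   # The sink connector started from simplesink.json consumes this topic, so
   # the record should show up as a document in the Tutorial2.pets namespace.
   p.send('Tutorial2.pets', value=data)
   p.flush()

The later ``db.pets.find()`` step verifies exactly that: the ``roscoe`` document written to MongoDB by the sink connector.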

source/tutorials/source-connector.txt

Lines changed: 9 additions & 9 deletions
@@ -102,7 +102,7 @@ Get Started with the MongoDB Kafka Source Connector
    In the same shell, connect to MongoDB using ``mongosh``, the MongoDB
    shell by running the following command:

-   .. code-block:: shell
+   .. code-block:: bash

       mongosh "mongodb://mongo1"

@@ -155,7 +155,7 @@ Get Started with the MongoDB Kafka Source Connector
    Confirm the content of data on the new Kafka topic by running the
    following command:

-   .. code-block::
+   .. code-block:: bash

       kc Tutorial1.orders

@@ -185,7 +185,7 @@ Get Started with the MongoDB Kafka Source Connector
    ``payload`` that includes the ``fullDocument`` data as highlighted in
    the following formatted JSON document:

-   .. code-block::
+   .. code-block:: json
       :copyable: false
       :emphasize-lines: 12-17

@@ -225,7 +225,7 @@ Get Started with the MongoDB Kafka Source Connector

    Stop the connector using the following command:

-   .. code-block::
+   .. code-block:: bash

       del mongo-simple-source

@@ -249,7 +249,7 @@ Get Started with the MongoDB Kafka Source Connector
    Remove the existing configuration, add the following configuration,
    and save the file:

-   .. code-block::
+   .. code-block:: json

       {
         "name": "mongo-simple-source",
@@ -271,28 +271,28 @@ Get Started with the MongoDB Kafka Source Connector

    Connect to MongoDB using ``mongosh`` using the following command:

-   .. code-block:: shell
+   .. code-block:: bash

       mongosh "mongodb://mongo1"

    At the prompt, type the following commands to insert a new document:

-   .. code-block::
+   .. code-block:: bash

       use Tutorial1
       db.orders.insertOne( { 'order_id' : 2, 'item' : 'oatmeal' } )

    Confirm the content of data on the new Kafka topic by running the
    following command:

-   .. code-block::
+   .. code-block:: bash

       kc Tutorial1.orders

    The ``payload`` field in the "Value" document should contain only the
    following document data:

-   .. code-block::
+   .. code-block:: json
       :copyable: false

       {"_id": {"$oid": "626565..."}, "order_id": 2, "item": "oatmeal"}"}

source/tutorials/tutorial-setup.txt

Lines changed: 8 additions & 0 deletions
@@ -38,6 +38,14 @@ Set Up Your Development Environment with Docker
    contains helper scripts to simplify the commands you need to run in
    the tutorials.

+   Log into Docker on the command line, providing your credentials when
+   prompted, by running the following command:
+
+   .. code-block:: bash
+      :copyable: true
+
+      docker login
+
    Run the following command in your terminal to download the image from
    Docker Hub:

