
Commit c63af95

Leengit authored and SimoneBendazzoli93 committed
DOC: Use upstream/downstream instead of source/destination (Project-MONAI#316)
* DOC: Use upstream/downstream instead of source/destination

  For monai.deploy.core.Operator, instead of sometimes using "upstream" and "downstream" and sometimes using "source" and "destination", consistently use the former.

  Signed-off-by: Lee Newberg <lee.newberg@kitware.com>

* DOC: Use source/destination instead of upstream/downstream

  For monai.deploy.core.Operator, instead of sometimes using "upstream" and "downstream" and sometimes using "source" and "destination", consistently use the latter.

  Signed-off-by: Lee Newberg <lee.newberg@kitware.com>

Signed-off-by: Simone Bendazzoli <simben@kth.se>
1 parent d952ceb commit c63af95

File tree

11 files changed: +44, -44 lines changed


docs/source/developing_with_sdk/creating_application_class.md

Lines changed: 2 additions & 2 deletions
@@ -52,11 +52,11 @@ The resource requirements (such as `cpu`, `memory`, and `gpu`) for the applicati

In `compose()` method, operators are instantiated and connected through <a href="../modules/_autosummary/monai.deploy.core.Application.html#monai.deploy.core.Application.add_flow">self.add_flow()</a>.

-> add_flow(upstream_op, downstream_op, io_map=None)
+> add_flow(source_op, destination_op, io_map=None)

`io_map` is a dictionary of mapping from the source operator's label to the destination operator's label(s) and its type is `Dict[str, str|Set[str]]`.

-We can skip specifying `io_map` if both the number of `upstream_op`'s outputs and the number of `downstream_op`'s inputs are one.
+We can skip specifying `io_map` if both the number of `source_op`'s outputs and the number of `destination_op`'s inputs are one.
For example, if Operators named `task1` and `task2` has only one input and output (with the label `image`), `self.add_flow(task1, task2)` is same with `self.add_flow(task1, task2, {"image": "image"})` or `self.add_flow(task1, task2, {"image": {"image"}})`.

```python
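For readers skimming this diff, the snippet below is a minimal sketch, not part of the commit, of how the renamed `add_flow(source_op, destination_op, io_map)` call reads in an application. The operator class, its single "image" port, and the SDK 0.x decorator style used here are assumptions for illustration only.

```python
# Hypothetical single-port operators wired with the renamed add_flow() parameters.
import monai.deploy.core as md
from monai.deploy.core import Application, ExecutionContext, Image, InputContext, IOType, Operator, OutputContext


@md.input("image", Image, IOType.IN_MEMORY)
@md.output("image", Image, IOType.IN_MEMORY)
class PassThroughOperator(Operator):
    """Assumed example operator with one "image" input port and one "image" output port."""

    def compute(self, op_input: InputContext, op_output: OutputContext, context: ExecutionContext):
        # Forward the incoming image unchanged.
        op_output.set(op_input.get("image"), "image")


class ExampleApp(Application):
    def compose(self):
        task1 = PassThroughOperator()
        task2 = PassThroughOperator()
        # One output port and one input port, so io_map may be omitted...
        self.add_flow(task1, task2)
        # ...which is equivalent to either explicit form:
        # self.add_flow(task1, task2, {"image": "image"})
        # self.add_flow(task1, task2, {"image": {"image"}})
```

Because each operator declares exactly one output and one input port, the three `add_flow` calls shown above (one live, two commented) are equivalent.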

docs/srs.md

Lines changed: 2 additions & 2 deletions
@@ -89,7 +89,7 @@ MONAI Deploy App SDK 0.1.0

## [REQ] Representing Workflow With DAG

-The SDK shall enable dependencies among upstream and downstream operators in an application using a DAG so that app workflow can be modeled unambiguously. The SDK shall provide a mechanism to link an output port of an upstream operator to an input port of a downstream operator to form the DAG.
+The SDK shall enable dependencies among source and destination operators in an application using a DAG so that app workflow can be modeled unambiguously. The SDK shall provide a mechanism to link an output port of a source operator to an input port of a destination operator to form the DAG.

### Background

@@ -397,7 +397,7 @@ MONAI Deploy App SDK 0.1.0

## [REQ] Loading a DICOM 2d/3d dataset into a unified domain object

-The SDK shall enable applications to load a 2D/3D imaging dataset belonging to a single DICOM series into a unified "Image" domain object so that downstream operators can process this domain object based on the application's needs such as transformation and inference.
+The SDK shall enable applications to load a 2D/3D imaging dataset belonging to a single DICOM series into a unified "Image" domain object so that destination operators can process this domain object based on the application's needs such as transformation and inference.

### Background

examples/apps/ai_livertumor_seg_app/app.py

Lines changed: 2 additions & 2 deletions
@@ -58,7 +58,7 @@ def compose(self):
"Tumor",
]
)
-# Create the processing pipeline, by specifying the upstream and downstream operators, and
+# Create the processing pipeline, by specifying the source and destination operators, and
# ensuring the output from the former matches the input of the latter, in both name and type.
self.add_flow(study_loader_op, series_selector_op, {"dicom_study_list": "dicom_study_list"})
self.add_flow(

@@ -68,7 +68,7 @@ def compose(self):
# Add the publishing operator to save the input and seg images for Render Server.
# Note the PublisherOperator has temp impl till a proper rendering module is created.
self.add_flow(unetr_seg_op, publisher_op, {"saved_images_folder": "saved_images_folder"})
-# Note below the dicom_seg_writer requires two inputs, each coming from a upstream operator.
+# Note below the dicom_seg_writer requires two inputs, each coming from a source operator.
self.add_flow(
series_selector_op, dicom_seg_writer, {"study_selected_series_list": "study_selected_series_list"}
)

examples/apps/ai_spleen_seg_app/app.py

Lines changed: 2 additions & 2 deletions
@@ -64,14 +64,14 @@ def compose(self):
# Create DICOM Seg writer with segment label name in a string list
dicom_seg_writer = DICOMSegmentationWriterOperator(seg_labels=["Spleen"])

-# Create the processing pipeline, by specifying the upstream and downstream operators, and
+# Create the processing pipeline, by specifying the source and destination operators, and
# ensuring the output from the former matches the input of the latter, in both name and type.
self.add_flow(study_loader_op, series_selector_op, {"dicom_study_list": "dicom_study_list"})
self.add_flow(
series_selector_op, series_to_vol_op, {"study_selected_series_list": "study_selected_series_list"}
)
self.add_flow(series_to_vol_op, bundle_spleen_seg_op, {"image": "image"})
-# Note below the dicom_seg_writer requires two inputs, each coming from a upstream operator.
+# Note below the dicom_seg_writer requires two inputs, each coming from a source operator.
self.add_flow(
series_selector_op, dicom_seg_writer, {"study_selected_series_list": "study_selected_series_list"}
)

examples/apps/ai_unetr_seg_app/app.py

Lines changed: 1 addition & 1 deletion
@@ -54,7 +54,7 @@ def compose(self):
output_file="stl/multi-organs.stl", keep_largest_connected_component=False
)

-# Create the processing pipeline, by specifying the upstream and downstream operators, and
+# Create the processing pipeline, by specifying the source and destination operators, and
# ensuring the output from the former matches the input of the latter, in both name and type.
self.add_flow(study_loader_op, series_selector_op, {"dicom_study_list": "dicom_study_list"})
self.add_flow(

monai/deploy/core/application.py

Lines changed: 21 additions & 21 deletions
@@ -182,35 +182,35 @@ def add_operator(self, operator: Operator):
self._graph.add_operator(operator)

def add_flow(
-self, upstream_op: Operator, downstream_op: Operator, io_map: Optional[Dict[str, Union[str, Set[str]]]] = None
+self, source_op: Operator, destination_op: Operator, io_map: Optional[Dict[str, Union[str, Set[str]]]] = None
):
-"""Adds a flow from upstream to downstream.
+"""Adds a flow from source to destination.

-An output port of the upstream operator is connected to one of the
-input ports of a downstream operators.
+An output port of the source operator is connected to one of the
+input ports of a destination operators.

Args:
-upstream_op (Operator): An instance of the upstream operator of type Operator.
-downstream_op (Operator): An instance of the downstream operator of type Operator.
+source_op (Operator): An instance of the source operator of type Operator.
+destination_op (Operator): An instance of the destination operator of type Operator.
io_map (Optional[Dict[str, Union[str, Set[str]]]]): A dictionary of mapping from the source operator's label
to the destination operator's label(s).
"""

-# Ensure that the upstream and downstream operators are valid
-upstream_op.ensure_valid()
-downstream_op.ensure_valid()
+# Ensure that the source and destination operators are valid
+source_op.ensure_valid()
+destination_op.ensure_valid()

-op_output_labels = upstream_op.op_info.get_labels(IO.OUTPUT)
-op_input_labels = downstream_op.op_info.get_labels(IO.INPUT)
+op_output_labels = source_op.op_info.get_labels(IO.OUTPUT)
+op_input_labels = destination_op.op_info.get_labels(IO.INPUT)
if not io_map:
if len(op_output_labels) > 1:
raise IOMappingError(
-f"The upstream operator has more than one output port "
+f"The source operator has more than one output port "
f"({', '.join(op_output_labels)}) so mapping should be specified explicitly!"
)
if len(op_input_labels) > 1:
raise IOMappingError(
-f"The downstream operator has more than one output port ({', '.join(op_input_labels)}) "
+f"The destination operator has more than one output port ({', '.join(op_input_labels)}) "
f"so mapping should be specified explicitly!"
)
io_map = {"": {""}}

@@ -221,14 +221,14 @@ def add_flow(
if isinstance(v, str):
io_maps[k] = {v}

-# Verify that the upstream & downstream operator have the input and output ports specified by the io_map
+# Verify that the source & destination operator have the input and output ports specified by the io_map
output_labels = list(io_maps.keys())

if len(op_output_labels) == 1 and len(output_labels) != 1:
raise IOMappingError(
-f"The upstream operator({upstream_op.name}) has only one port with label "
+f"The source operator({source_op.name}) has only one port with label "
f"'{next(iter(op_output_labels))}' but io_map specifies {len(output_labels)} "
-f"labels({', '.join(output_labels)}) to the upstream operator's output port"
+f"labels({', '.join(output_labels)}) to the source operator's output port"
)

for output_label in output_labels:

@@ -239,7 +239,7 @@ def add_flow(
del io_maps[output_label]
break
raise IOMappingError(
-f"The upstream operator({upstream_op.name}) has no output port with label '{output_label}'. "
+f"The source operator({source_op.name}) has no output port with label '{output_label}'. "
f"It should be one of ({', '.join(op_output_labels)})."
)

@@ -249,9 +249,9 @@

if len(op_input_labels) == 1 and len(input_labels) != 1:
raise IOMappingError(
-f"The downstream operator({downstream_op.name}) has only one port with label "
+f"The destination operator({destination_op.name}) has only one port with label "
f"'{next(iter(op_input_labels))}' but io_map specifies {len(input_labels)} "
-f"labels({', '.join(input_labels)}) to the downstream operator's input port"
+f"labels({', '.join(input_labels)}) to the destination operator's input port"
)

for input_label in input_labels:

@@ -262,11 +262,11 @@
input_labels.add(next(iter(op_input_labels)))
break
raise IOMappingError(
-f"The downstream operator({downstream_op.name}) has no input port with label '{input_label}'. "
+f"The destination operator({destination_op.name}) has no input port with label '{input_label}'. "
f"It should be one of ({', '.join(op_input_labels)})."
)

-self._graph.add_flow(upstream_op, downstream_op, io_maps)
+self._graph.add_flow(source_op, destination_op, io_maps)

def get_package_info(self, model_path: Union[str, Path] = "") -> Dict:
"""Returns the package information of this application.

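For context on the renamed checks above, here is a small hypothetical sketch, not part of this commit, of how the validation surfaces to application code: if the source operator exposes more than one output port and no `io_map` is given, `add_flow()` raises `IOMappingError`. The operator classes, port labels, and the `monai.deploy.exceptions` import path are assumptions.

```python
# Hypothetical operators illustrating the renamed validation: an explicit io_map
# is required when the source operator declares more than one output port.
import monai.deploy.core as md
from monai.deploy.core import Application, ExecutionContext, Image, InputContext, IOType, Operator, OutputContext
from monai.deploy.exceptions import IOMappingError  # assumed import path


@md.input("image", Image, IOType.IN_MEMORY)
@md.output("image", Image, IOType.IN_MEMORY)
@md.output("mask", Image, IOType.IN_MEMORY)
class TwoOutputOperator(Operator):
    """Assumed source operator with two output ports ("image" and "mask")."""

    def compute(self, op_input: InputContext, op_output: OutputContext, context: ExecutionContext):
        pass  # processing omitted in this sketch


@md.input("image", Image, IOType.IN_MEMORY)
@md.output("image", Image, IOType.IN_MEMORY)
class SingleInputOperator(Operator):
    """Assumed destination operator with a single "image" input port."""

    def compute(self, op_input: InputContext, op_output: OutputContext, context: ExecutionContext):
        pass  # processing omitted in this sketch


class ExplicitMappingApp(Application):
    def compose(self):
        source_op = TwoOutputOperator()
        destination_op = SingleInputOperator()
        try:
            # Ambiguous: the source operator has two output ports, so the SDK
            # cannot infer which one should feed the destination operator.
            self.add_flow(source_op, destination_op)
        except IOMappingError:
            # Disambiguate with a source-output-label -> destination-input-label map.
            self.add_flow(source_op, destination_op, {"image": "image"})
```

The explicit mapping in the `except` branch picks the source's "image" output for the destination's single "image" input, which is exactly the disambiguation the error messages in this file ask for.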
monai/deploy/core/executors/multi_process_executor.py

Lines changed: 3 additions & 3 deletions
@@ -38,10 +38,10 @@
# # Figure out how to deal with duplicate nodes
# q.put(e[1])
# edge_data = g.get_edge_data(e[0], e[1])
-# output = node.get_output(edge_data["upstream_op_port"])
-# key1 = (e[0].get_uid(), "output", edge_data["upstream_op_port"])
+# output = node.get_output(edge_data["source_op_port"])
+# key1 = (e[0].get_uid(), "output", edge_data["source_op_port"])
# self._storage.store(key1, output)
-# key2 = (e[1].get_uid(), "input", edge_data["downstream_op_port"])
+# key2 = (e[1].get_uid(), "input", edge_data["destination_op_port"])
# self._storage.store(key2, output)

# def _launch_operator(self, op):

monai/deploy/core/executors/multi_threaded_executor.py

Lines changed: 3 additions & 3 deletions
@@ -38,8 +38,8 @@
# # Figure out how to deal with duplicate nodes
# q.put(e[1])
# edge_data = g.get_edge_data(e[0], e[1])
-# output = node.get_output(edge_data["upstream_op_port"])
-# key1 = (e[0].get_uid(), "output", edge_data["upstream_op_port"])
+# output = node.get_output(edge_data["source_op_port"])
+# key1 = (e[0].get_uid(), "output", edge_data["source_op_port"])
# self._storage.store(key1, output)
-# key2 = (e[1].get_uid(), "input", edge_data["downstream_op_port"])
+# key2 = (e[1].get_uid(), "input", edge_data["destination_op_port"])
# self._storage.store(key2, output)

monai/deploy/operators/monai_bundle_inference_operator.py

Lines changed: 2 additions & 2 deletions
@@ -215,7 +215,7 @@ class MonaiBundleInferenceOperator(InferenceOperator):

For image input and output, the type is the `Image` class. For output of probabilities, the type is `Dict`.

-This operator is expected to be linked with both upstream and downstream operators, e.g. receiving an `Image` object from
+This operator is expected to be linked with both source and destination operators, e.g. receiving an `Image` object from
the `DICOMSeriesToVolumeOperator`, and passing a segmentation `Image` to the `DICOMSegmentationWriterOperator`.
In such cases, the I/O storage type can only be `IN_MEMORY` due to the restrictions imposed by the application executor.
However, when used as the first operator in an application, its input storage type needs to be `DISK`, and the file needs

@@ -618,7 +618,7 @@ def _send_output(self, value: Any, name: str, metadata: Dict, op_output: OutputC
# out of the MONAI post processing is [CWHD] with dim for batch already squeezed out.
# Prediction image, e.g. segmentation image, needs to have its dimensions
# rearranged to fit the conventions used by Image class, i.e. [DHW], without channel dim.
-# Also, based on known use cases, e.g. prediction being seg image and the downstream
+# Also, based on known use cases, e.g. prediction being seg image and the destination
# operators expect the data type to be unit8, conversion needs to be done as well.
# Metadata, such as pixel spacing and orientation, also needs to be set in the Image object,
# which is why metadata is expected to be passed in.
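As a side note on the `_send_output` comment above, the following is a tiny sketch, not part of this commit, of the kind of conversion it describes: dropping the channel dimension of a [CWHD] prediction, reordering the axes to [DHW], and casting to uint8 (the comment's "unit8") before destination operators consume it. The shapes and the threshold are assumptions.

```python
# Hypothetical post-processing of a single-channel [C, W, H, D] prediction into
# the [D, H, W] uint8 layout expected by the destination operators.
import numpy as np

pred_cwhd = np.random.rand(1, 96, 96, 64)       # assumed model output: C=1, W=96, H=96, D=64
pred_whd = np.squeeze(pred_cwhd, axis=0)        # drop the channel dim -> [W, H, D]
pred_dhw = np.transpose(pred_whd, (2, 1, 0))    # reorder axes -> [D, H, W]
seg_dhw = (pred_dhw > 0.5).astype(np.uint8)     # binarize and convert to uint8
print(seg_dhw.shape, seg_dhw.dtype)             # (64, 96, 96) uint8
```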

notebooks/tutorials/01_simple_app.ipynb

Lines changed: 2 additions & 2 deletions
@@ -341,11 +341,11 @@
"In `compose()` method, objects of `SobelOperator`, `MedianOperator`, and `GaussianOperator` classes are created\n",
"and connected through <a href=\"../../modules/_autosummary/monai.deploy.core.Application.html#monai.deploy.core.Application.add_flow\">self.add_flow()</a>.\n",
"\n",
-"> add_flow(upstream_op, downstream_op, io_map=None)\n",
+"> add_flow(source_op, destination_op, io_map=None)\n",
"\n",
"`io_map` is a dictionary of mapping from the source operator's label to the destination operator's label(s) and its type is `Dict[str, str|Set[str]]`. \n",
"\n",
-"We can skip specifying `io_map` if both the number of `upstream_op`'s outputs and the number of `downstream_op`'s inputs are one so `self.add_flow(sobel_op, median_op)` is same with `self.add_flow(sobel_op, median_op, {\"image\": \"image\"})` or `self.add_flow(sobel_op, median_op, {\"image\": {\"image\"}})`.\n"
+"We can skip specifying `io_map` if both the number of `source_op`'s outputs and the number of `destination_op`'s inputs are one so `self.add_flow(sobel_op, median_op)` is same with `self.add_flow(sobel_op, median_op, {\"image\": \"image\"})` or `self.add_flow(sobel_op, median_op, {\"image\": {\"image\"}})`.\n"
]
},
{