Remove unused MLflow client arg from DFP inference implementations (#1700)

- Remove unused MLflow client arg from DFP inference stage and module implementations
- Add missing table of content items to Modular DFP guide

Closes #1693 
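
The signature change this commit makes can be illustrated with a minimal sketch (hypothetical stand-in class, not the real Morpheus `ModelCache`): `load_model` previously accepted an MLflow client argument that it never used, and now takes no arguments.

```python
from datetime import datetime


class ModelCache:
    """Hypothetical stand-in for the Morpheus ModelCache; names mirror the diff."""

    def __init__(self, model):
        self._model = model
        self._last_checked = datetime.now()

    def load_model(self):
        # Before this commit the signature was load_model(self, _): a client
        # argument was accepted and ignored. It is now removed entirely.
        self._last_checked = datetime.now()
        return self._model


cache = ModelCache(model="autoencoder-stub")
loaded_model = cache.load_model()  # callers no longer pass an MLflow client
print(loaded_model)
```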

## By Submitting this PR I confirm:
- I am familiar with the [Contributing Guidelines](https://github.com/nv-morpheus/Morpheus/blob/main/docs/source/developer_guide/contributing.md).
- When the PR is ready for review, new or existing tests cover these changes.
- When the PR is ready for review, the documentation is up to date with these changes.

Authors:
  - Eli Fajardo (https://github.com/efajardo-nv)

Approvers:
  - Michael Demoret (https://github.com/mdemoret-nv)

URL: #1700
efajardo-nv committed May 15, 2024
1 parent 1e8518b commit b61502a
Showing 4 changed files with 10 additions and 4 deletions.
@@ -45,6 +45,12 @@ limitations under the License.
- [DFP Post Processing](#dfp-post-processing)
- [Serialize](#serialize)
- [Write to File](#write-to-file)
+- [Running Example Modular DFP Pipelines](#running-example-modular-dfp-pipelines)
+- [System requirements](#system-requirements)
+- [Building the services](#building-the-services)
+- [Downloading the example datasets](#downloading-the-example-datasets)
+- [Run Morpheus pipeline](#run-morpheus-pipeline)
+- [Output Fields](#output-fields)

## Introduction

@@ -522,7 +528,7 @@ pip install s3fs
python examples/digital_fingerprinting/fetch_example_data.py all
```

-### Morpheus Pipeline
+### Run Morpheus pipeline
From the `examples/digital_fingerprinting/production` dir, run:
```bash
docker compose run morpheus_pipeline bash
@@ -97,7 +97,7 @@ def process_task(control_message: ControlMessage):
if (model_cache is None):
raise RuntimeError(f"Could not find model for user {user_id}")

-loaded_model = model_cache.load_model(client)
+loaded_model = model_cache.load_model()

# TODO(Devin): Recovery strategy should be more robust/configurable in practice
except Exception as exec_info:
@@ -98,7 +98,7 @@ def on_data(self, message: MultiDFPMessage) -> MultiDFPMessage:
if (model_cache is None):
raise RuntimeError(f"Could not find model for user {user_id}")

-loaded_model = model_cache.load_model(self._client)
+loaded_model = model_cache.load_model()

except Exception:
logger.exception("Error trying to get model", exc_info=True)
@@ -92,7 +92,7 @@ def last_used(self):
def last_checked(self):
return self._last_checked

-def load_model(self, _) -> AutoEncoder:
+def load_model(self) -> AutoEncoder:

now = datetime.now()

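The caller-side pattern from the hunks above, collapsed into a runnable sketch (stub types; `model_cache` and `user_id` appear in the diff, everything else here is hypothetical):

```python
class ModelCache:
    """Minimal stub of the cache the DFP inference stages query."""

    def __init__(self, model):
        self._model = model

    def load_model(self):
        return self._model


def load_user_model(model_cache, user_id):
    # Mirrors the diff: fail fast when no cached model exists for the
    # user, otherwise load it without passing an MLflow client.
    if model_cache is None:
        raise RuntimeError(f"Could not find model for user {user_id}")
    return model_cache.load_model()


print(load_user_model(ModelCache("autoencoder-stub"), user_id="alice"))
```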
