refactor(examples) Update quickstart-monai example (#3934)
jafermarq authored Aug 24, 2024
1 parent dbe9570 commit ecac7f5
Showing 13 changed files with 369 additions and 375 deletions.
1 change: 1 addition & 0 deletions examples/quickstart-monai/.gitignore
@@ -1 +1,2 @@
MedNIST*
.data_download.lock
96 changes: 42 additions & 54 deletions examples/quickstart-monai/README.md
dataset: [MedNIST]
framework: [MONAI]
---

# Federated Learning with MONAI and Flower (Quickstart Example)

This introductory example to Flower uses MONAI, but deep knowledge of MONAI is not required to run it; it will, however, help you understand how to adapt Flower to your use case. [MONAI](https://docs.monai.io/en/latest/index.html) (Medical Open Network for AI) is a PyTorch-based, open-source framework for deep learning in healthcare imaging and part of the PyTorch Ecosystem. Its ambitions are:

- developing a community of academic, industrial and clinical researchers collaborating on a common foundation;
- creating state-of-the-art, end-to-end training workflows for healthcare imaging;
- providing researchers with an optimized and standardized way to create and evaluate deep learning models.

This example uses a subset of the [MedMNIST](https://medmnist.com/) dataset including 6 classes, as done in [MONAI's classification demo](https://colab.research.google.com/drive/1wy8XUSnNWlhDNazFdvGBHLfdkGvOHBKe). Each client trains a [DenseNet121](https://docs.monai.io/en/stable/networks.html#densenet121) from MONAI.

> \[!NOTE\]
> This example uses [Flower Datasets](https://flower.ai/docs/datasets/) to partition the MedMNIST dataset. It's a good example of how to bring any dataset into Flower and partition it using any of the built-in [partitioners](https://flower.ai/docs/datasets/ref-api/flwr_datasets.partitioner.html) (e.g. `DirichletPartitioner`, `PathologicalPartitioner`). Learn [how to use partitioners](https://flower.ai/docs/datasets/tutorial-use-partitioners.html) in a step-by-step tutorial.
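
For illustration, the sketch below shows how a dataset could be partitioned with Flower Datasets using one of those partitioners. This is not the example's actual data-loading code: the Hugging Face dataset id and the `label` column name are placeholders.

```python
from flwr_datasets import FederatedDataset
from flwr_datasets.partitioner import DirichletPartitioner

# Placeholder dataset id and label column; the dataset reference and
# column names used by this example may differ.
partitioner = DirichletPartitioner(
    num_partitions=10, partition_by="label", alpha=0.5
)
fds = FederatedDataset(
    dataset="uoft-cs/cifar10", partitioners={"train": partitioner}
)

# Each (simulated) client loads only its own partition.
partition = fds.load_partition(0, "train")
```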
## Set up the project

### Clone the project

Start by cloning the example project:

```shell
git clone --depth=1 https://github.com/adap/flower.git _tmp \
        && mv _tmp/examples/quickstart-monai . \
        && rm -rf _tmp \
        && cd quickstart-monai
```

This will create a new directory called `quickstart-monai` with the following structure:

```shell
quickstart-monai
├── monaiexample
│   ├── __init__.py
│   ├── client_app.py   # Defines your ClientApp
│   ├── server_app.py   # Defines your ServerApp
│   └── task.py         # Defines your model, training and data loading
├── pyproject.toml      # Project metadata like dependencies and configs
└── README.md
```
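
The model defined in `task.py` is MONAI's `DenseNet121`. As a rough sketch of how it can be instantiated (the constructor arguments below are illustrative assumptions, not copied from `task.py`):

```python
from monai.networks.nets import DenseNet121

# A 2D classifier for 6 MedNIST-style classes; spatial_dims, in_channels
# and out_channels values here are assumptions for illustration only.
model = DenseNet121(spatial_dims=2, in_channels=1, out_channels=6)
```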

### Install dependencies and project

Install the dependencies defined in `pyproject.toml` as well as the `monaiexample` package.

```bash
pip install -e .
```

## Run the project

You can run your Flower project in both _simulation_ and _deployment_ mode without making changes to the code. If you are starting with Flower, we recommend using the _simulation_ mode, as it requires fewer components to be launched manually. By default, `flwr run` uses the Simulation Engine.

### Run with the Simulation Engine

> \[!TIP\]
> This example runs faster when the `ClientApp`s have access to a GPU. If your system has one, you can make use of it by configuring the `backend.client-resources` component in `pyproject.toml`. If you want to try running the example with a GPU right away, use the `local-simulation-gpu` federation as shown below.

```bash
# Run with the default federation (CPU only)
flwr run .
```

Run the project in the `local-simulation-gpu` federation, which assigns CPU and GPU resources to each `ClientApp`. By default, at most four `ClientApp`s run in parallel on the available GPU.

```bash
# Run with the `local-simulation-gpu` federation
flwr run . local-simulation-gpu
```

You can also override some of the settings for your `ClientApp` and `ServerApp` defined in `pyproject.toml`. For example:

```bash
flwr run . --run-config num-server-rounds=5,batch-size=32
```
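
These values reach the apps through their `Context`. Below is a minimal sketch of how a `ClientApp` and `ServerApp` can read them; the placeholder client, the `FedAvg` strategy, and the wiring shown here are assumptions for illustration, not this example's actual `client_app.py`/`server_app.py`.

```python
from flwr.client import ClientApp, NumPyClient
from flwr.common import Context
from flwr.server import ServerApp, ServerAppComponents, ServerConfig
from flwr.server.strategy import FedAvg


class PlaceholderClient(NumPyClient):
    """Stand-in client; the real example trains a MONAI DenseNet121."""

    def __init__(self, batch_size: int):
        self.batch_size = batch_size

    def fit(self, parameters, config):
        # Local training would happen here, using self.batch_size.
        return parameters, 0, {}


def client_fn(context: Context):
    # Values from pyproject.toml, possibly overridden via `--run-config`.
    batch_size = int(context.run_config["batch-size"])
    return PlaceholderClient(batch_size).to_client()


def server_fn(context: Context):
    num_rounds = int(context.run_config["num-server-rounds"])
    return ServerAppComponents(
        strategy=FedAvg(),
        config=ServerConfig(num_rounds=num_rounds),
    )


client_app = ClientApp(client_fn=client_fn)
server_app = ServerApp(server_fn=server_fn)
```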

### Run with the Deployment Engine

> \[!NOTE\]
> An update to this example will show how to run this Flower project with the Deployment Engine and TLS certificates, or with Docker.

61 changes: 0 additions & 61 deletions examples/quickstart-monai/client.py

This file was deleted.

158 changes: 0 additions & 158 deletions examples/quickstart-monai/data.py

This file was deleted.

33 changes: 0 additions & 33 deletions examples/quickstart-monai/model.py

This file was deleted.

