
Retire MXNet examples #2724

Merged: 11 commits, Dec 25, 2023
5 changes: 1 addition & 4 deletions README.md
@@ -34,7 +34,7 @@ design of Flower is based on a few guiding principles:
- **Framework-agnostic**: Different machine learning frameworks have different
strengths. Flower can be used with any machine learning framework, for
example, [PyTorch](https://pytorch.org),
[TensorFlow](https://tensorflow.org), [Hugging Face Transformers](https://huggingface.co/), [PyTorch Lightning](https://pytorchlightning.ai/), [MXNet](https://mxnet.apache.org/), [scikit-learn](https://scikit-learn.org/), [JAX](https://jax.readthedocs.io/), [TFLite](https://tensorflow.org/lite/), [fastai](https://www.fast.ai/), [Pandas](https://pandas.pydata.org/) for federated analytics, or even raw [NumPy](https://numpy.org/)
[TensorFlow](https://tensorflow.org), [Hugging Face Transformers](https://huggingface.co/), [PyTorch Lightning](https://pytorchlightning.ai/), [scikit-learn](https://scikit-learn.org/), [JAX](https://jax.readthedocs.io/), [TFLite](https://tensorflow.org/lite/), [fastai](https://www.fast.ai/), [Pandas](https://pandas.pydata.org/) for federated analytics, or even raw [NumPy](https://numpy.org/)
for users who enjoy computing gradients by hand.
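
For a concrete picture of what "framework-agnostic" means here, below is a minimal sketch of a Flower client that exchanges raw NumPy arrays. The toy model, the local update, and the server address are placeholder assumptions; only the `NumPyClient` interface and `start_numpy_client` come from the Flower API.

```python
import flwr as fl
import numpy as np

# Toy "model": a single weight vector updated with plain NumPy (placeholder).
weights = np.zeros(10)


class NumpyOnlyClient(fl.client.NumPyClient):
    def get_parameters(self, config):
        # Model parameters travel as a list of NumPy ndarrays.
        return [weights]

    def fit(self, parameters, config):
        # Receive global parameters, run a (placeholder) local update,
        # and return new parameters plus the number of local examples.
        updated = parameters[0] + 0.1  # stand-in for a real training step
        return [updated], 1, {}

    def evaluate(self, parameters, config):
        # Return loss, number of evaluation examples, and optional metrics.
        loss = float(np.linalg.norm(parameters[0]))
        return loss, 1, {}


# Hypothetical server address for illustration only.
fl.client.start_numpy_client(server_address="127.0.0.1:8080", client=NumpyOnlyClient())
```

Swapping NumPy for PyTorch, TensorFlow, or any other framework only changes how the arrays returned from `get_parameters`, `fit`, and `evaluate` are produced.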

- **Understandable**: Flower is written with maintainability in mind. The
@@ -81,7 +81,6 @@ Stay tuned, more tutorials are coming soon. Topics include **Privacy and Securit
- [Quickstart (PyTorch)](https://flower.dev/docs/framework/tutorial-quickstart-pytorch.html)
- [Quickstart (Hugging Face)](https://flower.dev/docs/framework/tutorial-quickstart-huggingface.html)
- [Quickstart (PyTorch Lightning [code example])](https://flower.dev/docs/framework/tutorial-quickstart-pytorch-lightning.html)
- [Quickstart (MXNet)](https://flower.dev/docs/framework/example-mxnet-walk-through.html)
- [Quickstart (Pandas)](https://flower.dev/docs/framework/tutorial-quickstart-pandas.html)
- [Quickstart (fastai)](https://flower.dev/docs/framework/tutorial-quickstart-fastai.html)
- [Quickstart (JAX)](https://flower.dev/docs/framework/tutorial-quickstart-jax.html)
@@ -124,7 +123,6 @@ Quickstart examples:
- [Quickstart (PyTorch Lightning)](https://github.com/adap/flower/tree/main/examples/quickstart-pytorch-lightning)
- [Quickstart (fastai)](https://github.com/adap/flower/tree/main/examples/quickstart-fastai)
- [Quickstart (Pandas)](https://github.com/adap/flower/tree/main/examples/quickstart-pandas)
- [Quickstart (MXNet)](https://github.com/adap/flower/tree/main/examples/quickstart-mxnet)
- [Quickstart (JAX)](https://github.com/adap/flower/tree/main/examples/quickstart-jax)
- [Quickstart (scikit-learn)](https://github.com/adap/flower/tree/main/examples/sklearn-logreg-mnist)
- [Quickstart (Android [TFLite])](https://github.com/adap/flower/tree/main/examples/android)
@@ -134,7 +132,6 @@ Other [examples](https://github.com/adap/flower/tree/main/examples):

- [Raspberry Pi & Nvidia Jetson Tutorial](https://github.com/adap/flower/tree/main/examples/embedded-devices)
- [PyTorch: From Centralized to Federated](https://github.com/adap/flower/tree/main/examples/pytorch-from-centralized-to-federated)
- [MXNet: From Centralized to Federated](https://github.com/adap/flower/tree/main/examples/mxnet-from-centralized-to-federated)
- [Advanced Flower with TensorFlow/Keras](https://github.com/adap/flower/tree/main/examples/advanced-tensorflow)
- [Advanced Flower with PyTorch](https://github.com/adap/flower/tree/main/examples/advanced-pytorch)
- Single-Machine Simulation of Federated Learning Systems ([PyTorch](https://github.com/adap/flower/tree/main/examples/simulation_pytorch)) ([TensorFlow](https://github.com/adap/flower/tree/main/examples/simulation_tensorflow))
2 changes: 2 additions & 0 deletions doc/source/ref-changelog.md
@@ -6,6 +6,8 @@

- **General updates to Flower Examples** ([#2381](https://github.com/adap/flower/pull/2381))

- **Retiring MXNet examples** ([#2724](https://github.com/adap/flower/pull/2724)). The development of the MXNet framework has ended and the project is now [archived on GitHub](https://github.com/apache/mxnet), so the existing MXNet examples won't receive further updates.

- **Update Flower Baselines**

- HFedXGBoost [#2226](https://github.com/adap/flower/pull/2226)
2 changes: 2 additions & 0 deletions doc/source/tutorial-quickstart-mxnet.rst
@@ -4,6 +4,8 @@
Quickstart MXNet
================

.. warning:: MXNet is no longer maintained and has been moved into `Attic <https://attic.apache.org/projects/mxnet.html>`_. As a result, we encourage you to use other ML frameworks, such as PyTorch, alongside Flower. This tutorial might be removed in future versions of Flower.

.. meta::
:description: Check out this Federated Learning quickstart tutorial for using Flower with MXNet to train a Sequential model on MNIST.
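
To make the warning's suggestion actionable, here is a rough sketch of the same client structure with PyTorch instead of MXNet. It is not taken from the quickstart-pytorch example; the network, data loading, and server address are placeholders, and only the `NumPyClient` interface and the `state_dict` round-trip reflect the usual Flower-plus-PyTorch pattern.

```python
from collections import OrderedDict

import flwr as fl
import torch
import torch.nn as nn

# Placeholder network; a real quickstart defines its own model and data loaders.
model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))


def get_weights():
    # Serialize PyTorch tensors to NumPy arrays so Flower can transport them.
    return [val.cpu().numpy() for val in model.state_dict().values()]


def set_weights(parameters):
    # Rebuild a state_dict from the NumPy arrays received from the server.
    params = zip(model.state_dict().keys(), parameters)
    model.load_state_dict(OrderedDict({k: torch.tensor(v) for k, v in params}), strict=True)


class TorchClient(fl.client.NumPyClient):
    def get_parameters(self, config):
        return get_weights()

    def fit(self, parameters, config):
        set_weights(parameters)
        # A real client would run one or more local training epochs here.
        return get_weights(), 1, {}

    def evaluate(self, parameters, config):
        set_weights(parameters)
        # A real client would compute loss and accuracy on a local test set here.
        return 0.0, 1, {}


fl.client.start_numpy_client(server_address="127.0.0.1:8080", client=TorchClient())
```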

2 changes: 2 additions & 0 deletions examples/mxnet-from-centralized-to-federated/README.md
@@ -1,5 +1,7 @@
# MXNet: From Centralized To Federated

> Note: the MXNet project has ended and is now in the [Attic](https://attic.apache.org/projects/mxnet.html). The MXNet GitHub repository has also [been archived](https://github.com/apache/mxnet). As a result, this example won't receive further updates, and using MXNet is no longer recommended.

This example demonstrates how an already existing centralized MXNet-based machine learning project can be federated with Flower.

This introductory example for Flower uses MXNet, but you're not required to be an MXNet expert to run the example. The example will help you understand how Flower can be used to build federated learning use cases based on an existing MXNet project.
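
The core of federating an existing centralized project is converting model parameters to and from NumPy arrays. As a hedged illustration only (assuming the Gluon API; the example's own code may use a different MXNet interface), that conversion might look like this:

```python
import mxnet as mx
from mxnet import gluon

# Placeholder Gluon network; the example defines its own model and training loop.
net = gluon.nn.Sequential()
net.add(gluon.nn.Dense(64, activation="relu"), gluon.nn.Dense(10))
net.initialize(mx.init.Xavier())
net(mx.nd.zeros((1, 784)))  # one forward pass so deferred parameters get their shapes


def get_parameters():
    # Convert every MXNet NDArray parameter into a NumPy array for Flower.
    return [param.data().asnumpy() for param in net.collect_params().values()]


def set_parameters(parameters):
    # Load NumPy arrays received from the server back into the Gluon parameters.
    for param, value in zip(net.collect_params().values(), parameters):
        param.set_data(mx.nd.array(value))
```

A Flower client then wraps these two helpers, calling the project's existing training and evaluation functions inside `fit` and `evaluate`.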
5 changes: 2 additions & 3 deletions examples/mxnet-from-centralized-to-federated/pyproject.toml
@@ -10,7 +10,6 @@ authors = ["The Flower Authors <hello@flower.dev>"]

[tool.poetry.dependencies]
python = ">=3.8,<3.11"
-flwr = ">=1.0,<2.0"
-# flwr = { path = "../../", develop = true } # Development
-mxnet = "1.6.0"
+flwr = "1.6.0"
+mxnet = "1.9.1"
numpy = "1.23.1"
4 changes: 2 additions & 2 deletions examples/mxnet-from-centralized-to-federated/requirements.txt
@@ -1,3 +1,3 @@
-flwr>=1.0,<2.0
-mxnet==1.6.0
+flwr==1.6.0
+mxnet==1.9.1
numpy==1.23.1
2 changes: 2 additions & 0 deletions examples/quickstart-mxnet/README.md
@@ -1,5 +1,7 @@
# Flower Example using MXNet

> Note: the MXNet project has ended and is now in the [Attic](https://attic.apache.org/projects/mxnet.html). The MXNet GitHub repository has also [been archived](https://github.com/apache/mxnet). As a result, this example won't receive further updates, and using MXNet is no longer recommended.

This example demonstrates how to run an MXNet machine learning project federated with Flower.

This introductory example for Flower uses MXNet, but you're not required to be an MXNet expert to run the example. The example will help you understand how Flower can be used to build federated learning use cases based on an existing MXNet project.
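
For orientation, the server side of a quickstart like this is typically only a few lines with flwr 1.x. The address, round count, and strategy below are illustrative assumptions rather than the example's actual server script:

```python
import flwr as fl

# Federated averaging with default settings (illustrative; a real example may configure more).
strategy = fl.server.strategy.FedAvg()

# Run three federated rounds; clients connect to this address.
fl.server.start_server(
    server_address="0.0.0.0:8080",
    config=fl.server.ServerConfig(num_rounds=3),
    strategy=strategy,
)
```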
5 changes: 2 additions & 3 deletions examples/quickstart-mxnet/pyproject.toml
@@ -10,7 +10,6 @@ authors = ["The Flower Authors <hello@flower.dev>"]

[tool.poetry.dependencies]
python = ">=3.8,<3.11"
-flwr = ">=1.0,<2.0"
-# flwr = { path = "../../", develop = true } # Development
-mxnet = "1.6.0"
+flwr = "1.6.0"
+mxnet = "1.9.1"
numpy = "1.23.1"
4 changes: 2 additions & 2 deletions examples/quickstart-mxnet/requirements.txt
@@ -1,3 +1,3 @@
-flwr>=1.0,<2.0
-mxnet==1.6.0
+flwr==1.6.0
+mxnet==1.9.1
numpy==1.23.1