diff --git a/README.md b/README.md index b8b62e8c0c43..750b5cdb4b93 100644 --- a/README.md +++ b/README.md @@ -34,7 +34,7 @@ design of Flower is based on a few guiding principles: - **Framework-agnostic**: Different machine learning frameworks have different strengths. Flower can be used with any machine learning framework, for example, [PyTorch](https://pytorch.org), - [TensorFlow](https://tensorflow.org), [Hugging Face Transformers](https://huggingface.co/), [PyTorch Lightning](https://pytorchlightning.ai/), [MXNet](https://mxnet.apache.org/), [scikit-learn](https://scikit-learn.org/), [JAX](https://jax.readthedocs.io/), [TFLite](https://tensorflow.org/lite/), [fastai](https://www.fast.ai/), [Pandas](https://pandas.pydata.org/) for federated analytics, or even raw [NumPy](https://numpy.org/) + [TensorFlow](https://tensorflow.org), [Hugging Face Transformers](https://huggingface.co/), [PyTorch Lightning](https://pytorchlightning.ai/), [scikit-learn](https://scikit-learn.org/), [JAX](https://jax.readthedocs.io/), [TFLite](https://tensorflow.org/lite/), [fastai](https://www.fast.ai/), [Pandas](https://pandas.pydata.org/) for federated analytics, or even raw [NumPy](https://numpy.org/) for users who enjoy computing gradients by hand. - **Understandable**: Flower is written with maintainability in mind. The @@ -81,7 +81,6 @@ Stay tuned, more tutorials are coming soon. 
Topics include **Privacy and Security** - [Quickstart (PyTorch)](https://flower.dev/docs/framework/tutorial-quickstart-pytorch.html) - [Quickstart (Hugging Face)](https://flower.dev/docs/framework/tutorial-quickstart-huggingface.html) - [Quickstart (PyTorch Lightning [code example])](https://flower.dev/docs/framework/tutorial-quickstart-pytorch-lightning.html) -- [Quickstart (MXNet)](https://flower.dev/docs/framework/example-mxnet-walk-through.html) - [Quickstart (Pandas)](https://flower.dev/docs/framework/tutorial-quickstart-pandas.html) - [Quickstart (fastai)](https://flower.dev/docs/framework/tutorial-quickstart-fastai.html) - [Quickstart (JAX)](https://flower.dev/docs/framework/tutorial-quickstart-jax.html) @@ -124,7 +123,6 @@ Quickstart examples: - [Quickstart (PyTorch Lightning)](https://github.com/adap/flower/tree/main/examples/quickstart-pytorch-lightning) - [Quickstart (fastai)](https://github.com/adap/flower/tree/main/examples/quickstart-fastai) - [Quickstart (Pandas)](https://github.com/adap/flower/tree/main/examples/quickstart-pandas) -- [Quickstart (MXNet)](https://github.com/adap/flower/tree/main/examples/quickstart-mxnet) - [Quickstart (JAX)](https://github.com/adap/flower/tree/main/examples/quickstart-jax) - [Quickstart (scikit-learn)](https://github.com/adap/flower/tree/main/examples/sklearn-logreg-mnist) - [Quickstart (Android [TFLite])](https://github.com/adap/flower/tree/main/examples/android) @@ -134,7 +132,6 @@ Other [examples](https://github.com/adap/flower/tree/main/examples): - [Raspberry Pi & Nvidia Jetson Tutorial](https://github.com/adap/flower/tree/main/examples/embedded-devices) - [PyTorch: From Centralized to Federated](https://github.com/adap/flower/tree/main/examples/pytorch-from-centralized-to-federated) -- [MXNet: From Centralized to Federated](https://github.com/adap/flower/tree/main/examples/mxnet-from-centralized-to-federated) - [Advanced Flower with
TensorFlow/Keras](https://github.com/adap/flower/tree/main/examples/advanced-tensorflow) - [Advanced Flower with PyTorch](https://github.com/adap/flower/tree/main/examples/advanced-pytorch) - Single-Machine Simulation of Federated Learning Systems ([PyTorch](https://github.com/adap/flower/tree/main/examples/simulation_pytorch)) ([TensorFlow](https://github.com/adap/flower/tree/main/examples/simulation_tensorflow)) diff --git a/doc/source/ref-changelog.md b/doc/source/ref-changelog.md index 507489e76e7b..c4aad511a4a5 100644 --- a/doc/source/ref-changelog.md +++ b/doc/source/ref-changelog.md @@ -6,6 +6,8 @@ - **General updates to Flower Examples** ([#2381](https://github.com/adap/flower/pull/2381)) +- **Retiring MXNet examples** The development of the MXNet framework has ended and the project is now [archived on GitHub](https://github.com/apache/mxnet). Existing MXNet examples won't receive updates ([#2724](https://github.com/adap/flower/pull/2724)) + - **Update Flower Baselines** - HFedXGBoost [#2226](https://github.com/adap/flower/pull/2226) diff --git a/doc/source/tutorial-quickstart-mxnet.rst b/doc/source/tutorial-quickstart-mxnet.rst index 149d060e4c00..ff8d4b2087dd 100644 --- a/doc/source/tutorial-quickstart-mxnet.rst +++ b/doc/source/tutorial-quickstart-mxnet.rst @@ -4,6 +4,8 @@ Quickstart MXNet ================ +.. warning:: MXNet is no longer maintained and has been moved into the `Attic <https://attic.apache.org/projects/mxnet.html>`_. As a result, we encourage you to use other ML frameworks alongside Flower, for example, PyTorch. This tutorial might be removed in future versions of Flower. + .. meta:: :description: Check out this Federated Learning quickstart tutorial for using Flower with MXNet to train a Sequential model on MNIST.
diff --git a/examples/mxnet-from-centralized-to-federated/README.md b/examples/mxnet-from-centralized-to-federated/README.md index 839d3b16a1cf..2c3f240d8978 100644 --- a/examples/mxnet-from-centralized-to-federated/README.md +++ b/examples/mxnet-from-centralized-to-federated/README.md @@ -1,5 +1,7 @@ # MXNet: From Centralized To Federated +> Note: the MXNet project has ended, and is now in the [Attic](https://attic.apache.org/projects/mxnet.html). The MXNet GitHub repository has also [been archived](https://github.com/apache/mxnet). As a result, this example won't receive further updates. Using MXNet is no longer recommended. + This example demonstrates how an already existing centralized MXNet-based machine learning project can be federated with Flower. This introductory example for Flower uses MXNet, but you're not required to be an MXNet expert to run the example. The example will help you understand how Flower can be used to build federated learning use cases based on an existing MXNet project.
diff --git a/examples/mxnet-from-centralized-to-federated/pyproject.toml b/examples/mxnet-from-centralized-to-federated/pyproject.toml index a0d31f76ebdd..952683eb90f6 100644 --- a/examples/mxnet-from-centralized-to-federated/pyproject.toml +++ b/examples/mxnet-from-centralized-to-federated/pyproject.toml @@ -10,7 +10,6 @@ authors = ["The Flower Authors "] [tool.poetry.dependencies] python = ">=3.8,<3.11" -flwr = ">=1.0,<2.0" -# flwr = { path = "../../", develop = true } # Development -mxnet = "1.6.0" +flwr = "1.6.0" +mxnet = "1.9.1" numpy = "1.23.1" diff --git a/examples/mxnet-from-centralized-to-federated/requirements.txt b/examples/mxnet-from-centralized-to-federated/requirements.txt index 73060e27c70c..8dd6f7150dfd 100644 --- a/examples/mxnet-from-centralized-to-federated/requirements.txt +++ b/examples/mxnet-from-centralized-to-federated/requirements.txt @@ -1,3 +1,3 @@ -flwr>=1.0,<2.0 -mxnet==1.6.0 +flwr==1.6.0 +mxnet==1.9.1 numpy==1.23.1 diff --git a/examples/quickstart-mxnet/README.md b/examples/quickstart-mxnet/README.md index 930cec5acdfd..37e01ef2707c 100644 --- a/examples/quickstart-mxnet/README.md +++ b/examples/quickstart-mxnet/README.md @@ -1,5 +1,7 @@ # Flower Example using MXNet +> Note: the MXNet project has ended, and is now in the [Attic](https://attic.apache.org/projects/mxnet.html). The MXNet GitHub repository has also [been archived](https://github.com/apache/mxnet). As a result, this example won't receive further updates. Using MXNet is no longer recommended. + This example demonstrates how to run an MXNet machine learning project federated with Flower. This introductory example for Flower uses MXNet, but you're not required to be an MXNet expert to run the example. The example will help you understand how Flower can be used to build federated learning use cases based on an existing MXNet project.
diff --git a/examples/quickstart-mxnet/pyproject.toml b/examples/quickstart-mxnet/pyproject.toml index a0d31f76ebdd..952683eb90f6 100644 --- a/examples/quickstart-mxnet/pyproject.toml +++ b/examples/quickstart-mxnet/pyproject.toml @@ -10,7 +10,6 @@ authors = ["The Flower Authors "] [tool.poetry.dependencies] python = ">=3.8,<3.11" -flwr = ">=1.0,<2.0" -# flwr = { path = "../../", develop = true } # Development -mxnet = "1.6.0" +flwr = "1.6.0" +mxnet = "1.9.1" numpy = "1.23.1" diff --git a/examples/quickstart-mxnet/requirements.txt b/examples/quickstart-mxnet/requirements.txt index 73060e27c70c..8dd6f7150dfd 100644 --- a/examples/quickstart-mxnet/requirements.txt +++ b/examples/quickstart-mxnet/requirements.txt @@ -1,3 +1,3 @@ -flwr>=1.0,<2.0 -mxnet==1.6.0 +flwr==1.6.0 +mxnet==1.9.1 numpy==1.23.1