refactor(examples) Refactor quickstart-sklearn-tabular to use the new Flower #3893
Conversation
[Resolved review comment (outdated) on examples/quickstart-sklearn-tabular/sklearnexample/server_app.py]
[Resolved review comment (outdated) on examples/quickstart-sklearn-tabular/sklearnexample/client_app.py]
[Resolved review comment (outdated) on examples/quickstart-sklearn-tabular/sklearnexample/server_app.py]
```python
def get_parameters(self, config):
    return get_model_parameters(self.model)
```
We are trying to update the examples in such a way that this method is no longer needed. For this, we need to initialize the global model when constructing the `ServerApp`.
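For context, a minimal sketch of that pattern, assuming the flwr `server_fn` / `ServerAppComponents` API (flwr ≥ 1.10); the iris-sized shapes and the inline `get_model_parameters` helper are placeholders standing in for the example's own code, not the final implementation:

```python
import numpy as np
from flwr.common import Context, ndarrays_to_parameters
from flwr.server import ServerApp, ServerAppComponents, ServerConfig
from flwr.server.strategy import FedAvg
from sklearn.linear_model import LogisticRegression


def get_model_parameters(model):
    # Mirrors the example's helper: model weights as a list of ndarrays
    return [model.coef_, model.intercept_]


def server_fn(context: Context) -> ServerAppComponents:
    # Build and zero-initialize the global model on the server, so no client
    # ever has to be sampled just to provide the initial parameters
    model = LogisticRegression(max_iter=1000)
    n_classes, n_features = 3, 4  # iris-sized placeholders
    model.classes_ = np.arange(n_classes)
    model.coef_ = np.zeros((n_classes, n_features))
    model.intercept_ = np.zeros((n_classes,))
    initial_parameters = ndarrays_to_parameters(get_model_parameters(model))
    strategy = FedAvg(initial_parameters=initial_parameters)
    return ServerAppComponents(strategy=strategy, config=ServerConfig(num_rounds=3))


app = ServerApp(server_fn=server_fn)
```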
Sorry, I don't quite get it. This function, whether called directly or not (i.e. via `self.get_parameters` or `get_model_parameters`), is needed since we need to get the model parameters each round, not only the global one once. Also, it exists in the mlx example.
Yes, we want to retain the functionality to extract the parameters (in this case the `get_model_parameters()` function), but having the `get_parameters()` method seems redundant in many cases: the only reason it exists is so that, if the strategy doesn't initialise the global model, one random client can be sampled and its `get_parameters()` method called. It seems better to always initialise the global model at the strategy, so this `get_parameters()` method is not needed (i.e. `fit()` can directly call the `get_model_parameters()` function when returning).
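A sketch of what the client side could then look like, assuming a `NumPyClient`; `set_model_params` here is a hypothetical setter mirroring the example's `get_model_parameters`:

```python
from flwr.client import NumPyClient


def get_model_parameters(model):
    # Same helper as in the example: weights as a list of ndarrays
    return [model.coef_, model.intercept_]


def set_model_params(model, parameters):
    # Hypothetical setter, the inverse of get_model_parameters
    model.coef_, model.intercept_ = parameters
    return model


class FlowerClient(NumPyClient):
    def __init__(self, model, X_train, y_train):
        self.model = model
        self.X_train, self.y_train = X_train, y_train

    # Note: no get_parameters() method; the strategy already holds the
    # initial global parameters, so no client is sampled to provide them

    def fit(self, parameters, config):
        set_model_params(self.model, parameters)
        self.model.fit(self.X_train, self.y_train)
        # fit() calls the external helper directly when returning
        return get_model_parameters(self.model), len(self.X_train), {}
```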
> But it seems to be better to always initialise the global model

What is the reason for that? For me, the discussion seems like a potato-potahto kind of thing. What difference does it really make? Either the `get_model_weights` logic is implemented as an external function and passed to the method, or directly as a method, right? So it doesn't really matter which function/method is used in `fit()`?
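For illustration only, the two shapes being compared might look like this (a sketch; the class names, attributes, and elided training bodies are placeholders, not the example's code):

```python
from flwr.client import NumPyClient


def get_model_weights(model):
    # Option A: parameter extraction lives in an external helper function
    return [model.coef_, model.intercept_]


class ClientA(NumPyClient):
    def fit(self, parameters, config):
        # ... apply `parameters` and train self.model ...
        return get_model_weights(self.model), len(self.X_train), {}


class ClientB(NumPyClient):
    # Option B: the same logic exposed as a method on the client
    def get_parameters(self, config):
        return [self.model.coef_, self.model.intercept_]

    def fit(self, parameters, config):
        # ... apply `parameters` and train self.model ...
        return self.get_parameters(config={}), len(self.X_train), {}
```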
[Resolved review comment (outdated) on examples/quickstart-sklearn-tabular/sklearnexample/client_app.py]
Some bits I missed earlier.
[Resolved review comment (outdated) on examples/quickstart-sklearn-tabular/sklearnexample/client_app.py]
[Resolved review comment (outdated) on examples/quickstart-sklearn-tabular/sklearnexample/client_app.py]
```diff
-poetry run python3 client.py --partition-id 0 # partition-id should be any of {0,1,2}
+flwr run . --run-config penalty="l1"
```
```diff
-flwr run . --run-config penalty="l1"
+flwr run . --run-config num-server-rounds=10
```
It seems the "l1" penalty is not supported, so I'd suggest showing the overriding functionality with another setting:

```
ValueError: Solver lbfgs supports only 'l2' or None penalties, got l1 penalty.
```
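For background on the override mechanism being exercised here: values passed via `flwr run . --run-config ...` reach the app through `context.run_config`. A minimal sketch, assuming a `penalty` key in the run config and a `NumPyClient`-based client (names are illustrative, not taken from the final example):

```python
from flwr.client import ClientApp, NumPyClient
from flwr.common import Context
from sklearn.linear_model import LogisticRegression


class FlowerClient(NumPyClient):
    def __init__(self, model):
        self.model = model


def client_fn(context: Context):
    # A value overridden via `flwr run . --run-config penalty="l1"` arrives
    # here; otherwise the default from pyproject.toml (if any) is used
    penalty = context.run_config.get("penalty", "l2")
    model = LogisticRegression(penalty=penalty, solver="saga", max_iter=1000)
    return FlowerClient(model).to_client()


app = ClientApp(client_fn=client_fn)
```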
I've changed the solver to "saga" now, which also supports `l1`. Though I cannot run

```bash
flwr run . --run-config penalty="l1"
```

I get:

```
CalledProcessError: Command '['flower-simulation', '--app', '.', '--num-supernodes', '3', '--run-config', 'penalty=l1']' returned non-zero exit status 1.
```
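For reference, the solver/penalty compatibility in scikit-learn that the earlier error pointed at can be checked standalone, independent of Flower:

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)

# "saga" supports the "l1", "l2", and "elasticnet" penalties, whereas the
# default "lbfgs" solver supports only "l2" or None
model = LogisticRegression(solver="saga", penalty="l1", max_iter=1000)
model.fit(X, y)

# With solver="lbfgs" and penalty="l1", scikit-learn raises:
# ValueError: Solver lbfgs supports only 'l2' or None penalties, got l1 penalty.
```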
Issue
The example does not use the new Flower API.
Proposal
Update the example to follow the structure and code style of the new Flower API.