# Federated Learning with XGBoost and Flower (Quickstart Example)
This example demonstrates how to perform eXtreme Gradient Boosting (XGBoost) within Flower using the `xgboost` package. We use the HIGGS dataset to perform a binary classification task, and a tree-based bagging method is used for aggregation on the server.
This project provides a minimal code example to get you started quickly. For a more comprehensive code example, take a look at `xgboost-comprehensive`.
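To give a concrete sense of how bagging-style federated XGBoost works on the client side, here is a minimal sketch of one local round: the client rebuilds the global booster it received, boosts it further on local data, and serializes the result for the server to aggregate. Function names such as `load_global_model` and `local_boost`, and the parameter values, are illustrative assumptions rather than the exact code shipped with this example.

```python
# Hedged sketch of a client-side boosting step; names and values are
# illustrative, not the exact code in this example.
import xgboost as xgb

# Typical parameters for the binary HIGGS task (values are assumptions)
params = {
    "objective": "binary:logistic",
    "eta": 0.1,
    "max_depth": 8,
    "eval_metric": "auc",
    "tree_method": "hist",
}

def load_global_model(global_model: bytes) -> xgb.Booster:
    """Rebuild the booster a client received from the server."""
    bst = xgb.Booster(params=params)
    bst.load_model(bytearray(global_model))
    return bst

def local_boost(bst: xgb.Booster, train_dmatrix: xgb.DMatrix, num_local_rounds: int) -> bytes:
    """Continue boosting the global model on local data, then serialize it."""
    for _ in range(num_local_rounds):
        # Append one more tree on top of the current ensemble
        bst.update(train_dmatrix, bst.num_boosted_rounds())
    # Serialize the updated booster; the server's bagging strategy
    # aggregates the trees contributed by all clients
    return bytes(bst.save_raw("json"))
```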
## Set up the project

### Clone the project
Start by cloning the example project:
```bash
git clone --depth=1 https://github.com/adap/flower.git _tmp \
        && mv _tmp/examples/xgboost-quickstart . \
        && rm -rf _tmp \
        && cd xgboost-quickstart
```
This will create a new directory called `xgboost-quickstart` with the following structure:
```
xgboost-quickstart
├── xgboost_quickstart
│   ├── __init__.py
│   ├── client_app.py   # Defines your ClientApp
│   ├── server_app.py   # Defines your ServerApp
│   └── task.py         # Defines your utilities and data loading
├── pyproject.toml      # Project metadata like dependencies and configs
└── README.md
```
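As a rough illustration of what `task.py` takes care of, the sketch below partitions the HIGGS dataset with Flower Datasets and turns one partition into an `xgboost.DMatrix`. The dataset identifier and the column names are assumptions for illustration; see `task.py` for the actual implementation.

```python
# Hedged sketch of per-client data loading; the dataset id and the
# column names ("inputs", "label") are assumptions.
import xgboost as xgb
from flwr_datasets import FederatedDataset
from flwr_datasets.partitioner import IidPartitioner

def load_data(partition_id: int, num_partitions: int) -> xgb.DMatrix:
    # Split HIGGS into IID partitions, one per client
    partitioner = IidPartitioner(num_partitions=num_partitions)
    fds = FederatedDataset(dataset="jxie/higgs", partitioners={"train": partitioner})
    partition = fds.load_partition(partition_id, split="train")
    partition.set_format("numpy")
    # Pack features and binary labels into xgboost's DMatrix format
    return xgb.DMatrix(partition["inputs"], label=partition["label"])
```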
### Install dependencies and project

Install the dependencies defined in `pyproject.toml` as well as the `xgboost_quickstart` package:

```bash
pip install -e .
```
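For reference, the dependency section of `pyproject.toml` looks roughly like the sketch below; the exact package list and version pins in the example may differ.

```toml
# Illustrative dependency section; exact pins may differ
[project]
name = "xgboost_quickstart"
dependencies = [
    "flwr[simulation]>=1.9.0",
    "flwr-datasets>=0.3.0",
    "xgboost>=2.0.0",
]
```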
## Run the project

You can run your Flower project in both simulation and deployment mode without making changes to the code. If you are new to Flower, we recommend starting with the simulation mode, as it requires fewer components to be launched manually. By default, `flwr run` will make use of the Simulation Engine.
### Run with the Simulation Engine

```bash
flwr run .
```
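The number of SuperNodes the Simulation Engine spins up is taken from the federation configuration in `pyproject.toml`. A sketch of that section, with an assumed client count, might look like this:

```toml
# Illustrative federation config; the SuperNode count is an assumption
[tool.flwr.federations]
default = "local-simulation"

[tool.flwr.federations.local-simulation]
options.num-supernodes = 2
```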
You can also override some of the settings for your `ClientApp` and `ServerApp` defined in `pyproject.toml`. For example:

```bash
flwr run . --run-config "num-server-rounds=5 params.eta=0.05"
```
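Keys passed via `--run-config` (or their defaults under `[tool.flwr.app.config]` in `pyproject.toml`) are exposed to your apps through `context.run_config`. Below is a hedged sketch of how the `ServerApp` might read them; the strategy wiring is omitted, and the key names simply mirror the command above.

```python
# Hedged sketch of reading run-config values in server_app.py; the
# example's actual strategy setup is omitted.
from flwr.common import Context
from flwr.server import ServerApp, ServerAppComponents, ServerConfig

def server_fn(context: Context) -> ServerAppComponents:
    # Values from `--run-config`, falling back to pyproject.toml defaults
    num_rounds = context.run_config["num-server-rounds"]
    eta = context.run_config["params.eta"]  # would be forwarded into the XGBoost params
    return ServerAppComponents(config=ServerConfig(num_rounds=num_rounds))

# `flwr run` discovers this object as the ServerApp entry point
app = ServerApp(server_fn=server_fn)
```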
> [!TIP]
> For a more detailed walk-through, check our quickstart XGBoost tutorial.
### Run with the Deployment Engine

> [!NOTE]
> An update to this example will show how to run this Flower application with the Deployment Engine and TLS certificates, or with Docker.