Flower Example using XGBoost#

This example demonstrates how to perform EXtreme Gradient Boosting (XGBoost) within Flower using the xgboost package. We use the HIGGS dataset to perform a binary classification task. A tree-based bagging method is used for aggregation on the server.

This project provides a minimal code example to enable you to get started quickly. For a more comprehensive code example, take a look at xgboost-comprehensive.

Project Setup#

Start by cloning the example project. We prepared a single-line command that you can copy into your shell which will check out the example for you:

git clone --depth=1 https://github.com/adap/flower.git && mv flower/examples/xgboost-quickstart . && rm -rf flower && cd xgboost-quickstart

This will create a new directory called xgboost-quickstart containing the following files:

-- README.md         <- You're reading this right now
-- server.py         <- Defines the server-side logic
-- client.py         <- Defines the client-side logic
-- run.sh            <- Commands to run experiments
-- pyproject.toml    <- Example dependencies

Installing Dependencies#

Project dependencies (such as xgboost and flwr) are defined in pyproject.toml. You can install the dependencies by invoking pip:

# From a new python environment, run:
pip install .

Then, to verify that everything works correctly you can run the following command:

python3 -c "import flwr"

If you don’t see any errors you’re good to go!
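If you also want to confirm that xgboost was installed alongside flwr, a quick check along the lines of the sketch below works from any Python shell (it is not part of the example itself):

# Sanity check (sketch): both packages import and report their versions.
import flwr
import xgboost

print("flwr", flwr.__version__)
print("xgboost", xgboost.__version__)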

Run Federated Learning with XGBoost and Flower#

Afterwards, you are ready to start the Flower server as well as the clients. Start the server in a terminal as follows:

python3 server.py
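For orientation, server.py is roughly structured like the sketch below. It assumes the example relies on Flower's built-in FedXgbBagging strategy; the exact arguments and number of rounds in the real file may differ.

# Minimal sketch of a bagging-based XGBoost server (assumed structure;
# see server.py in the example for the authoritative version).
import flwr as fl
from flwr.server.strategy import FedXgbBagging

# Aggregate the clients' boosted trees with bagging after every round.
strategy = FedXgbBagging(
    fraction_fit=1.0,        # train on all connected clients each round
    min_fit_clients=2,       # wait until two clients have joined
    min_available_clients=2,
    fraction_evaluate=1.0,
)

fl.server.start_server(
    server_address="0.0.0.0:8080",
    config=fl.server.ServerConfig(num_rounds=5),  # the number of rounds is an assumption
    strategy=strategy,
)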

Now you are ready to start the Flower clients, which will participate in the learning. To do so, open two more terminal windows and run the following commands.

Start client 1 in the first terminal:

python3 client.py --partition-id=0

Start client 2 in the second terminal:

python3 client.py --partition-id=1

You will see that XGBoost starts a federated training run.
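The --partition-id flag tells each client which shard of HIGGS to train on. The sketch below shows how client.py might derive its local data; the dataset id, partitioner, and variable names are assumptions, not the example's exact code.

# Sketch: select this client's HIGGS partition (illustrative names only).
import argparse

from flwr_datasets import FederatedDataset
from flwr_datasets.partitioner import IidPartitioner

parser = argparse.ArgumentParser(description="Flower XGBoost client")
parser.add_argument("--partition-id", type=int, default=0)
args = parser.parse_args()

# Split the HIGGS training data into two IID partitions, one per client.
partitioner = IidPartitioner(num_partitions=2)
fds = FederatedDataset(dataset="jxie/higgs", partitioners={"train": partitioner})
partition = fds.load_partition(args.partition_id, split="train")
# `partition` is then converted to an xgboost.DMatrix for local training.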

Alternatively, you can use run.sh to run the same experiment in a single terminal as follows:

poetry run ./run.sh

Look at the code and tutorial for a detailed explanation.