Logistic Regression Example using scikit-learn and Flower (Quickstart Example)


This Flower example uses scikit-learn's LogisticRegression model to train a federated learning system and will help you understand how to adapt Flower for use with scikit-learn. It uses Flower Datasets to download, partition, and preprocess the MNIST dataset, and it is straightforward to run.
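
The data loading and model setup live in sklearn_example/task.py. As a rough orientation, the sketch below shows how Flower Datasets and scikit-learn typically fit together for a setup like this; it is not the example's exact code, and the helper names load_data and get_model as well as the choice of an IID partitioner are illustrative assumptions.

# Illustrative sketch only -- see sklearn_example/task.py for the real code.
from flwr_datasets import FederatedDataset
from flwr_datasets.partitioner import IidPartitioner
from sklearn.linear_model import LogisticRegression


def load_data(partition_id: int, num_partitions: int):
    """Download MNIST, partition it, and return one client's NumPy arrays."""
    partitioner = IidPartitioner(num_partitions=num_partitions)
    fds = FederatedDataset(dataset="mnist", partitioners={"train": partitioner})
    partition = fds.load_partition(partition_id, "train").with_format("numpy")
    # Flatten the 28x28 images into 784-dimensional feature vectors
    X = partition["image"].reshape((len(partition), -1))
    y = partition["label"]
    return X, y


def get_model(penalty: str = "l2", local_epochs: int = 1) -> LogisticRegression:
    """Create a LogisticRegression model suited to round-based federated training."""
    return LogisticRegression(
        penalty=penalty,
        max_iter=local_epochs,  # a small number of local iterations per round
        warm_start=True,        # keep weights between fit() calls instead of resetting
    )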

Set up the project

Clone the project

Start by cloning the example project:

git clone --depth=1 https://github.com/adap/flower.git _tmp \
		&& mv _tmp/examples/sklearn-logreg-mnist . \
		&& rm -rf _tmp && cd sklearn-logreg-mnist

This will create a new directory called sklearn-logreg-mnist with the following structure (a simplified sketch of the client app follows the listing):

sklearn-logreg-mnist
├── README.md
├── pyproject.toml      # Project metadata like dependencies and configs
└── sklearn_example
    ├── __init__.py
    ├── client_app.py   # Defines your ClientApp
    ├── server_app.py   # Defines your ServerApp
    └── task.py         # Defines your model, training and data loading
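
As a rough idea of what client_app.py contains: it typically wraps local scikit-learn training in a NumPyClient and exposes it through a ClientApp, as sketched below. This is not the example's exact code; get_params and set_params are hypothetical helpers (the real task.py provides its own parameter handling), and load_data and get_model refer to the sketch above.

# Illustrative sketch only -- see sklearn_example/client_app.py for the real code.
import numpy as np
from flwr.client import ClientApp, NumPyClient
from flwr.common import Context
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import log_loss


def get_params(model: LogisticRegression):
    """Return the model weights as a list of NumPy arrays."""
    return [model.coef_, model.intercept_]


def set_params(model: LogisticRegression, parameters):
    """Copy weights received from the server into the scikit-learn model."""
    model.coef_ = parameters[0]
    model.intercept_ = parameters[1]
    model.classes_ = np.arange(10)  # MNIST digits; needed before predict()
    return model


class FlowerClient(NumPyClient):
    def __init__(self, model, X, y):
        self.model, self.X, self.y = model, X, y

    def fit(self, parameters, config):
        # Train on the local partition starting from the global weights
        set_params(self.model, parameters)
        self.model.fit(self.X, self.y)
        return get_params(self.model), len(self.X), {}

    def evaluate(self, parameters, config):
        # Evaluate the global weights on the local partition
        set_params(self.model, parameters)
        loss = log_loss(self.y, self.model.predict_proba(self.X), labels=np.arange(10))
        accuracy = self.model.score(self.X, self.y)
        return loss, len(self.X), {"accuracy": accuracy}


def client_fn(context: Context):
    # load_data and get_model are the hypothetical helpers sketched earlier
    partition_id = context.node_config["partition-id"]
    num_partitions = context.node_config["num-partitions"]
    X, y = load_data(partition_id, num_partitions)
    return FlowerClient(get_model(), X, y).to_client()


app = ClientApp(client_fn=client_fn)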

Install dependencies and project

Install the dependencies defined in pyproject.toml as well as the sklearn_example package.

pip install -e .

Run the project

You can run your Flower project in both simulation and deployment mode without making changes to the code. If you are starting with Flower, we recommend using simulation mode, as it requires fewer components to be launched manually. By default, flwr run will make use of the Simulation Engine.

Run with the Simulation Engine

flwr run .

You can also override some of the settings for your ClientApp and ServerApp defined in pyproject.toml. For example:

flwr run . --run-config "num-server-rounds=5 fraction-fit=0.25"
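
These keys reach the apps through the run config, whose defaults live in pyproject.toml. The sketch below shows how a ServerApp typically consumes them; it is not the example's exact code, and the zero-initialized starting weights are an illustrative assumption.

# Illustrative sketch only -- see sklearn_example/server_app.py for the real code.
import numpy as np
from flwr.common import Context, ndarrays_to_parameters
from flwr.server import ServerApp, ServerAppComponents, ServerConfig
from flwr.server.strategy import FedAvg


def server_fn(context: Context) -> ServerAppComponents:
    # Values set in pyproject.toml, or overridden via `flwr run . --run-config ...`
    num_rounds = context.run_config["num-server-rounds"]
    fraction_fit = context.run_config["fraction-fit"]

    # Start from zero-initialized weights (10 classes x 784 features) so the
    # strategy does not need to ask a client for the initial global model.
    initial_parameters = ndarrays_to_parameters([np.zeros((10, 784)), np.zeros(10)])

    strategy = FedAvg(fraction_fit=fraction_fit, initial_parameters=initial_parameters)
    config = ServerConfig(num_rounds=num_rounds)
    return ServerAppComponents(strategy=strategy, config=config)


app = ServerApp(server_fn=server_fn)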

[!TIP] For a more detailed walk-through, check out our quickstart PyTorch tutorial.

Run with the Deployment Engine

[!NOTE] An update to this example will show how to run this Flower application with the Deployment Engine and TLS certificates, or with Docker.