---
tags: [quickstart]
dataset: []
framework: [numpy]
---

# Federated Learning with NumPy and Flower (Quickstart Example)

[View on GitHub](https://github.com/adap/flower/blob/main/examples/quickstart-numpy)

This introductory Flower example uses a dummy NumPy model as well as dummy training and evaluation steps in the `ClientApp` to showcase the core functionality of Flower apps. It does not use a dataset. A minimal sketch of what such a dummy client roughly looks like is included at the end of this README.

## Set up the project

### Fetch the app

Install Flower:

```shell
pip install flwr
```

Fetch the app:

```shell
flwr new @flwrlabs/quickstart-numpy
```

This creates a new `quickstart-numpy` directory with the following structure:

```shell
quickstart-numpy
├── quickstart_numpy
│   ├── __init__.py
│   ├── client_app.py   # Defines your ClientApp
│   ├── server_app.py   # Defines your ServerApp
│   └── task.py         # Defines model creation
├── pyproject.toml      # Project metadata like dependencies and configs
└── README.md
```

### Install dependencies and project

Install the dependencies defined in `pyproject.toml` as well as the `quickstart_numpy` package:

```bash
pip install -e .
```

> **Tip:** Your `pyproject.toml` file can define more than just the dependencies of your Flower app. You can also use it to specify hyperparameters for your runs and control which Flower Runtime is used. By default, it uses the Simulation Runtime, but you can switch to the Deployment Runtime when needed.
> Learn more in the [TOML configuration guide](https://flower.ai/docs/framework/how-to-configure-pyproject-toml.html).

## Run with the Simulation Engine

In the `quickstart-numpy` directory, use `flwr run` to run a local simulation:

```bash
flwr run .
```

Refer to the [How to Run Simulations](https://flower.ai/docs/framework/how-to-run-simulations.html) guide in the documentation for advice on how to optimize your simulations.

## Run with the Deployment Engine

Follow this [how-to guide](https://flower.ai/docs/framework/how-to-run-flower-with-deployment-engine.html) to run the app in this example with Flower's Deployment Engine instead. After that, you might be interested in setting up [secure TLS-enabled communications](https://flower.ai/docs/framework/how-to-enable-tls-connections.html) and [SuperNode authentication](https://flower.ai/docs/framework/how-to-authenticate-supernodes.html) in your federation.

You can run Flower on Docker too! Check out the [Flower with Docker](https://flower.ai/docs/framework/docker/index.html) documentation.

## Resources

- Flower website: [flower.ai](https://flower.ai/)
- Check the documentation: [flower.ai/docs](https://flower.ai/docs/)
- Give Flower a ⭐️ on GitHub: [GitHub](https://github.com/adap/flower)
- Join the Flower community!
  - [Flower Slack](https://flower.ai/join-slack/)
  - [Flower Discuss](https://discuss.flower.ai/)
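
## Appendix: a sketch of the dummy ClientApp

For orientation, the sketch below shows roughly what the dummy model and the dummy training/evaluation steps amount to. It is written against Flower's `NumPyClient` API; the code that `flwr new` actually generates in `client_app.py` and `task.py` may differ between Flower versions, and the helper name `get_dummy_model` here is illustrative, not part of the generated project.

```python
import numpy as np

from flwr.client import ClientApp, NumPyClient


def get_dummy_model():
    # A "model" is just a small NumPy array here; no real learning happens.
    # (Illustrative helper; the generated task.py may name things differently.)
    return np.ones((3, 3))


class FlowerClient(NumPyClient):
    def get_parameters(self, config):
        # Return the current local "model" as a list of NumPy arrays.
        return [get_dummy_model()]

    def fit(self, parameters, config):
        # Dummy training step: ignore the incoming parameters and return
        # the local dummy model, a fake example count, and no metrics.
        return [get_dummy_model()], 1, {}

    def evaluate(self, parameters, config):
        # Dummy evaluation step: return a fixed loss, example count, and metric.
        return 0.0, 1, {"accuracy": 1.0}


def client_fn(context):
    # Called to construct a Client instance for each participating node.
    return FlowerClient().to_client()


app = ClientApp(client_fn=client_fn)
```

On the server side, `server_app.py` pairs this with an aggregation strategy (for example `FedAvg`); see the fetched project for the actual code.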