Flower Baselines Documentation
Welcome to Flower Baselines’ documentation. Flower is a friendly federated AI framework.
Join the Flower Community
The Flower Community is growing quickly: we're a friendly group of researchers, engineers, students, professionals, academics, and other enthusiasts.
Flower Baselines
Flower Baselines are a collection of organised directories used to reproduce results from well-known publications or benchmarks. You can browse the baselines that already exist below or contribute your own.
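The common pattern behind these baselines is a Flower strategy orchestrating a set of simulated clients. The snippet below is a minimal, illustrative sketch of that pattern using Flower's simulation API, not code taken from any particular baseline: `DummyClient`, the client count, the round count, and the zero-valued parameters are placeholder assumptions, and it assumes `flwr` is installed with the simulation extra. Each baseline replaces these placeholders with the model, data partitioning, strategy, and hyperparameters from its paper.

```python
import flwr as fl
import numpy as np


class DummyClient(fl.client.NumPyClient):
    """Placeholder client: a real baseline trains the paper's model here."""

    def get_parameters(self, config):
        # Placeholder parameters; a real baseline returns the model weights.
        return [np.zeros(10, dtype=np.float32)]

    def fit(self, parameters, config):
        # Local training on this client's data partition would happen here.
        return parameters, 1, {}

    def evaluate(self, parameters, config):
        # Local evaluation; returns (loss, num_examples, metrics).
        return 0.0, 1, {"accuracy": 0.0}


def client_fn(cid: str):
    return DummyClient().to_client()


# Plain FedAvg here; baselines swap in the strategy from their paper
# (e.g. FedProx, FedNova) together with the paper's hyperparameters.
strategy = fl.server.strategy.FedAvg(fraction_fit=1.0)

fl.simulation.start_simulation(
    client_fn=client_fn,
    num_clients=10,
    config=fl.server.ServerConfig(num_rounds=3),
    strategy=strategy,
)
```

In practice, the README inside each baseline directory is the authoritative guide for installing dependencies and reproducing the published results.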
| Method | Dataset | Tags |
|---|---|---|
| DASHA | CIFAR-10, Mushrooms, LibSVM | compression, heterogeneous setting, variance reduction, image classification |
| DepthFL | CIFAR-100 | image classification, system heterogeneity, cross-device, knowledge distillation |
| FedAvgM | CIFAR-10, Fashion-MNIST | non-iid, image classification |
| FedBN | MNIST, MNIST-M, SVHN, USPS, SynthDigits | data heterogeneity, feature shift, cross-silo |
| FedDebug | CIFAR-10, MNIST | malicious client, debugging, fault localization, image classification, data poisoning |
| FedMeta | FEMNIST, Shakespeare | meta learning, maml, meta-sgd, personalization |
| FedMLB | CIFAR-100, Tiny-ImageNet | data heterogeneity, knowledge distillation, image classification |
| FedNova | CIFAR-10 | normalized averaging, heterogeneous optimization, image classification |
| FedPara | CIFAR-10, CIFAR-100, MNIST | image classification, personalization, low-rank training, tensor decomposition |
| FedPer | CIFAR-10, FLICKR-AES | system heterogeneity, image classification, personalization, horizontal data partition |
| FedPFT | CIFAR-100, Caltech101 | foundation-models, pre-trained, one-shot, one-round |
| FedProx | MNIST | image classification, cross-device, stragglers |
| FedRep | CIFAR-10, CIFAR-100 | image classification, label heterogeneity, personalized federated learning |
| FedStar | Ambient Context, Speech Commands | audio classification, semi-supervised learning |
| FedVSSL | UCF-101, Kinetics-400 | action recognition, cross-device, ssl, video, videossl |
| FedWav2vec2 | TED-LIUM 3 | speech, asr, cross-device |
| FjORD | CIFAR-10 | federated learning, heterogeneity, efficient DNNs, distributed systems |
| FLANDERS | MNIST, Fashion-MNIST | robustness, model poisoning, anomaly detection, autoregressive model, regression, classification |
| HeteroFL | MNIST, CIFAR-10 | system heterogeneity, image classification |
| hfedxgboost | a9a, cod-rna, ijcnn1, space_ga, cpusmall, YearPredictionMSD | cross-silo, tree-based, XGBoost, classification, regression, tabular |
| MOON | CIFAR-10, CIFAR-100 | data heterogeneity, image classification, cross-silo, contrastive-learning |
| niid_bench | CIFAR-10, MNIST, Fashion-MNIST | data heterogeneity, image classification, benchmark |
| TAMUNA | MNIST | local training, communication compression, partial participation, variance reduction |
Tutorials
A learning-oriented series of tutorials, the best place to start.
Note
Coming soon
How-to guides
Problem-oriented how-to guides show, step by step, how to achieve a specific goal.
Explanations
Understanding-oriented concept guides explain and discuss key topics and underlying ideas behind Flower and collaborative AI.
Note
Coming soon
References
Information-oriented API reference and other reference material.
- DASHA: Distributed Nonconvex Optimization with Communication Compression and Optimal Oracle Complexity
- DepthFL: Depthwise Federated Learning for Heterogeneous Clients
- Measuring the effects of non-identical data distribution for federated visual classification
- FedBN: Federated Learning on Non-IID Features via Local Batch Normalization
- FedDebug: Systematic Debugging for Federated Learning Applications
- FedMeta: Federated Meta-Learning with Fast Convergence and Efficient Communication
- FedMLB: Multi-Level Branched Regularization for Federated Learning
- Tackling the Objective Inconsistency Problem in Heterogeneous Federated Optimization
- FedPara: Low-rank Hadamard Product for Communication-Efficient Federated Learning
- Federated Learning with Personalization Layers
- FedPFT: One-shot Federated Learning with Foundation Models
- FedProx: Federated Optimization in Heterogeneous Networks
- Exploiting Shared Representations for Personalized Federated Learning
- FedStar: Federated Self-training for Semi-supervised Audio Recognition
- Federated Self-supervised Learning for Video Understanding
- Federated Learning for ASR Based on wav2vec 2.0
- FjORD: Fair and Accurate Federated Learning under heterogeneous targets with Ordered Dropout
- FLANDERS: Protecting Federated Learning from Extreme Model Poisoning Attacks via Multidimensional Time Series Anomaly Detection
- HeteroFL: Computation And Communication Efficient Federated Learning For Heterogeneous Clients
- Gradient-less Federated Gradient Boosting Trees with Learnable Learning Rates
- Model-Contrastive Federated Learning
- Federated Learning on Non-IID Data Silos: An Experimental Study
- TAMUNA: Doubly Accelerated Federated Learning with Local Training, Compression, and Partial Participation