Photon: Federated LLM Pre-Training
Photon is the first complete system for federated LLM training, enabling global-scale, low-bandwidth pre-training. It trains 7B-parameter models faster than baselines, reaches better perplexity, and uses 64×–512× less communication.
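A toy sketch of the low-bandwidth idea behind systems like Photon (not Photon's actual code; the objective, learning rate, and client data below are illustrative): clients run many local gradient steps between synchronizations, so reducing the sync frequency by a factor K cuts communication rounds by K×.

```python
def local_sgd(w, data, steps, lr=0.1):
    """Run `steps` full-batch gradient steps on a 1-D least-squares objective."""
    for _ in range(steps):
        grad = sum(2 * (w - x) for x in data) / len(data)
        w -= lr * grad
    return w

def federated_train(client_data, total_steps, sync_every):
    """Alternate local training with server-side model averaging."""
    w = 0.0
    comm_rounds = 0
    for _ in range(total_steps // sync_every):
        local_models = [local_sgd(w, d, sync_every) for d in client_data]
        w = sum(local_models) / len(local_models)  # server averages client models
        comm_rounds += 1
    return w, comm_rounds

clients = [[1.0, 2.0], [3.0, 4.0]]
w_freq, r_freq = federated_train(clients, 64, sync_every=1)   # sync every step
w_rare, r_rare = federated_train(clients, 64, sync_every=64)  # one sync total
# r_freq == 64 vs r_rare == 1: 64x fewer communication rounds
```

Both runs converge to essentially the same model here; the infrequent-sync run simply pays far less communication, which is the trade-off low-bandwidth federated pre-training exploits at scale.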
DEPT is a communication-efficient framework for federated LLM pre-training. It enables vocabulary-agnostic training across heterogeneous data sources, cutting embedding memory by 4–5× and improving perplexity by up to 20%.
Flower and FLARE integrate to strengthen the FL ecosystem: Flower supports FL development and research, while FLARE provides a production-ready runtime. The integration enables seamless deployment of Flower applications on FLARE without code modification.
WorldLM enables globally federated LLM training across jurisdictions with diverse legal and data constraints via federations of federations, using partial model localization. It outperforms standard federated setups by up to 1.91×.
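A minimal sketch of partial model localization in a federation of federations (illustrative only, not WorldLM's implementation; the parameter names `backbone` and `embed` are hypothetical): shared parameters are averaged globally across sub-federations, while localized parameters are averaged only within each sub-federation, letting each retain region-specific components.

```python
def average(models, keys):
    """Coordinate-wise average of the given parameter keys across models."""
    return {k: sum(m[k] for m in models) / len(models) for k in keys}

def hierarchical_round(sub_federations, shared_keys, local_keys):
    # Step 1: each sub-federation averages all parameters across its clients.
    sub_models = [average(clients, shared_keys + local_keys)
                  for clients in sub_federations]
    # Step 2: only the shared parameters are averaged globally ...
    global_shared = average(sub_models, shared_keys)
    # ... while localized parameters stay within each sub-federation.
    return [{**m, **global_shared} for m in sub_models]

fed_a = [{"backbone": 1.0, "embed": 10.0}, {"backbone": 3.0, "embed": 12.0}]
fed_b = [{"backbone": 5.0, "embed": 20.0}]
out = hierarchical_round([fed_a, fed_b], ["backbone"], ["embed"])
# backbone is shared globally (same value for both federations);
# embed remains distinct per sub-federation
```

The design choice this illustrates: localization keeps the parts of the model most sensitive to local data distributions (here, the hypothetical `embed`) out of the global average, while the shared backbone still benefits from all data.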
MedPerf is an open framework for benchmarking medical AI via federated evaluation, enabling privacy-preserving, large-scale, real-world testing of models across healthcare facilities.