Publication
Federated Averaging (FedAvg) has emerged as the algorithm of choice for federated learning due to its simplicity and low communication cost. However, despite recent research efforts, its performance is not fully understood. We obtain tight convergence rates for FedAvg and prove that it suffers from 'client-drift' when the data is heterogeneous (non-iid), resulting in unstable and slow convergence.
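The client-drift phenomenon described above can be sketched on a toy problem. The following is a minimal illustration, not the paper's experimental setup: two clients hold 1-D quadratic losses with different curvatures (a simple stand-in for non-iid data), and FedAvg with multiple local steps converges to a fixed point that differs from the true global optimum. All names and constants are illustrative assumptions.

```python
# Minimal FedAvg sketch on two 1-D quadratic clients, illustrating
# client-drift under heterogeneous (non-iid) objectives.
# Illustrative toy example; not taken from the paper.

def fedavg(curvs, optima, lr=0.05, local_steps=20, rounds=500):
    """Run FedAvg on clients with losses f_i(w) = curvs[i]*(w - optima[i])**2 / 2."""
    w = 0.0
    for _ in range(rounds):
        updates = []
        for a, c in zip(curvs, optima):
            wi = w
            for _ in range(local_steps):   # local updates (full gradient here)
                wi -= lr * a * (wi - c)    # gradient of a*(w-c)^2/2 is a*(w-c)
            updates.append(wi)
        w = sum(updates) / len(updates)    # server averages the client models
    return w

curvs, optima = [1.0, 10.0], [0.0, 1.0]
# True optimum of the global loss sum_i f_i(w): curvature-weighted mean.
w_opt = sum(a * c for a, c in zip(curvs, optima)) / sum(curvs)

w_multi = fedavg(curvs, optima, local_steps=20)   # drifts away from w_opt
w_single = fedavg(curvs, optima, local_steps=1)   # matches plain gradient descent
```

With a single local step, FedAvg reduces to gradient descent on the average loss and reaches the true optimum; with many local steps, each client's update pulls toward its own minimizer, and the averaged fixed point is biased away from the global optimum. That bias is the client-drift effect.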