Posted on 2025-08-01, 14:24. Authored by J. Mills, J. Hu, G. Min.
Federated Learning (FL) is a rapidly evolving field within machine learning for collaboratively training models at the network edge in a privacy-preserving fashion, without training data leaving the devices where it was generated. The privacy-preserving nature of FL shows great promise for applications with sensitive data, such as healthcare, finance, and social media. However, there are barriers to real-world FL at the wireless network edge, stemming from massive wireless parallelism and the high communication cost of model transmission. This communication cost is heavily affected by the heterogeneous distribution of data across clients, and several recent works attempt to address the problem with novel client-side optimisation strategies. In this paper, we provide a tutorial on model training in FL, survey recent developments in client-side optimisation, and examine how these developments relate to the communication properties of FL. We then perform a set of comparison experiments on a representative subset of these strategies, gaining insights into their communication-convergence tradeoffs. Finally, we highlight challenges to client-side optimisation and provide suggestions for future developments in FL at the wireless edge.
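To make the client-side training loop the abstract refers to concrete, below is a minimal sketch of one FedAvg-style round: each client runs local SGD on its own data, and the server averages the resulting models weighted by local dataset size. This is an illustrative toy (a linear model with squared loss, NumPy arrays as parameters, and hypothetical helper names client_update and server_aggregate), not the paper's implementation.

import numpy as np

def client_update(w, X, y, lr=0.01, local_epochs=5):
    """Local SGD on one client's data for a linear model with squared loss."""
    w = w.copy()
    for _ in range(local_epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # gradient of mean squared error
        w -= lr * grad
    return w

def server_aggregate(client_weights, client_sizes):
    """Weighted average of client models (the FedAvg aggregation step)."""
    total = sum(client_sizes)
    return sum((n / total) * w for w, n in zip(client_weights, client_sizes))

# Toy example: three clients with heterogeneous (non-IID) local data,
# simulated here by shifting each client's feature distribution.
rng = np.random.default_rng(0)
w_global = np.zeros(3)
clients = [(rng.normal(loc=i, size=(20, 3)), rng.normal(size=20)) for i in range(3)]
for _ in range(10):  # ten communication rounds
    updates = [client_update(w_global, X, y) for X, y in clients]
    w_global = server_aggregate(updates, [len(y) for _, y in clients])
print("global model after 10 rounds:", w_global)

The sketch also hints at the communication-convergence tradeoff the experiments explore: raising local_epochs reduces the number of communication rounds needed, but on heterogeneous client data the local models drift apart, which is the problem the surveyed client-side optimisation strategies target.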
Funding
101008297
Engineering and Physical Sciences Research Council (EPSRC)