Resource-efficient federated learning robust to communication errors
Peer reviewed, Journal article
Accepted version
Date
2023
Original version
DOI: 10.1109/SSP53291.2023.10208024

Abstract
The effectiveness of federated learning (FL) in leveraging distributed datasets is highly contingent upon the accuracy of model exchanges between clients and servers. Communication errors caused by noisy links can negatively impact learning accuracy. To address this issue, we present an FL algorithm that is robust to communication errors while reducing the communication load on clients. To derive the proposed algorithm, we consider a weighted least-squares regression problem as a motivating example. We cast the considered problem as a distributed optimization problem over a federated network, which employs random scheduling to enhance communication efficiency, and solve it using the alternating direction method of multipliers. To improve robustness, we eliminate the local dual parameters and reduce the number of global model exchanges via a change of variable. We analyze the mean convergence of our proposed algorithm and demonstrate its effectiveness compared with related existing algorithms via simulations.