
Libra

Repository for the preprint paper "Understanding Generalization of Federated Learning: the Trade-off between Model Stability and Optimization".

Supplementary experiments: training TinyBERT on Federated AGNews

Figure 1: With the same $\eta_g$, a large number of local update steps $K$ causes FL to over-fit early. We observe that TinyBERT fails to converge when $K=1, 20$.
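
For context, a minimal sketch of one FedAvg-style round with $K$ local steps and a global learning rate $\eta_g$ is shown below. The quadratic client losses and all names (`local_sgd`, `fedavg_round`, `eta_l`) are illustrative assumptions, not code from this repository.

```python
# Minimal FedAvg-style round on toy quadratic client losses (illustrative only).
import numpy as np

def local_sgd(w_global, A, b, K, eta_l):
    """Run K local SGD steps on one client's loss 0.5 * ||A w - b||^2."""
    w = w_global.copy()
    for _ in range(K):
        grad = A.T @ (A @ w - b)
        w -= eta_l * grad
    return w

def fedavg_round(w_global, clients, K, eta_l, eta_g):
    """One communication round: local training, then a global step scaled by eta_g."""
    deltas = [local_sgd(w_global, A, b, K, eta_l) - w_global for A, b in clients]
    return w_global + eta_g * np.mean(deltas, axis=0)

rng = np.random.default_rng(0)
clients = [(rng.normal(size=(8, 4)), rng.normal(size=8)) for _ in range(5)]
w = np.zeros(4)
for t in range(20):
    w = fedavg_round(w, clients, K=15, eta_l=0.01, eta_g=1.0)
```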

Figure 2: Stabilizing federated optimization ($K=15$) with global learning rate decay $\eta_g^t = \eta_g \cdot \epsilon^{t}$.
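
The decay schedule in Figure 2 can be written in a few lines; the sketch below assumes illustrative values for $\eta_g$ and $\epsilon$, not the hyperparameters used in the experiments.

```python
# Global learning rate decay eta_g^t = eta_g * epsilon^t (values are assumed).
eta_g0 = 1.0      # initial global learning rate eta_g
epsilon = 0.98    # per-round decay factor

def eta_g(t: int) -> float:
    """Global learning rate at round t."""
    return eta_g0 * (epsilon ** t)

# Inside the server loop the decayed rate would scale the aggregated update, e.g.
#   w = w + eta_g(t) * mean_delta
print([round(eta_g(t), 4) for t in range(5)])  # [1.0, 0.98, 0.9604, 0.9412, 0.9224]
```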

Figure 3: FOSM ($K=3$): a large momentum $\beta$ enlarges the generalization gap, indicating worsened stability.
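
For illustration only, the sketch below shows a generic server-momentum update of the form $v^{t+1} = \beta v^{t} + \Delta^{t}$, $w^{t+1} = w^{t} + \eta_g v^{t+1}$; the exact FOSM update is defined in the paper, and the values of $\beta$, $\eta_g$, and the toy deltas here are assumptions.

```python
# Generic server-momentum update (illustrative, not the exact FOSM implementation).
import numpy as np

def server_momentum_step(w, v, delta, beta, eta_g):
    """Apply one server update given the averaged client delta."""
    v = beta * v + delta   # momentum buffer accumulates past aggregated updates
    w = w + eta_g * v      # a larger beta amplifies the effective server step
    return w, v

w, v = np.zeros(4), np.zeros(4)
rng = np.random.default_rng(0)
for t in range(10):
    delta = rng.normal(scale=0.1, size=4)  # stands in for the averaged client update
    w, v = server_momentum_step(w, v, delta, beta=0.9, eta_g=1.0)
```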

Reference

Please cite our paper if you find its insights helpful.

@article{zeng2024understanding,
  title={Understanding generalization of federated learning: The trade-off between model stability and optimization},
  author={Zeng, Dun and Wu, Zheshun and Liu, Shiyu and Pan, Yu and Tang, Xiaoying and Xu, Zenglin},
  journal={arXiv preprint arXiv:2411.16303},
  year={2024}
}
