%0 Journal Article
%T A Decentralized Federated Learning Algorithm Based on Gradient Norm Aware Minimization
%A 方诚意
%A 胡建华
%A 黄佳龙
%J Modeling and Simulation
%P 665-671
%@ 2324-870X
%D 2025
%I Hans Publishing
%R 10.12677/mos.2025.144319
%X Decentralized Federated Learning performs privacy-preserving distributed learning across a group of devices, reducing the communication costs and information leakage risks associated with centralized federated learning. However, non-independent and identically distributed (Non-IID) data among devices can degrade model performance. To address this issue, most algorithms adopt empirical risk minimization as the local optimizer, which often leads to overfitting during local client training and reduces the generalization ability of the global model. Using gradient norm-aware minimization, this paper proposes a Decentralized Federated Learning Algorithm based on Gradient Norm-Aware Minimization, which smooths the loss surface of the global model and enhances its generalization performance.
%K Decentralized Federated Learning
%K Distributed Algorithms
%K Non-IID Data
%U http://www.hanspub.org/journal/PaperInformation.aspx?PaperID=112196