Conventional federated learning (FL) frameworks typically rely on a central server to coordinate model aggregation, which introduces scalability bottlenecks and fails to address data heterogeneity across geographically distributed clients. This paper introduces Proximity-based Self-Federated Learning (PSFL), a decentralized approach in which clients autonomously form local federations based on geographical proximity and data similarity. In PSFL, nodes exchange model updates only within their neighborhoods and self-organize into specialized model groups without any central orchestration. This enables greater adaptability to non-IID data distributions while reducing communication overhead. Experiments on benchmark datasets demonstrate that PSFL outperforms traditional centralized FL in highly heterogeneous environments.
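To make the neighborhood-based aggregation idea concrete, the following is a minimal sketch, not the authors' implementation: each node selects peers that are both geographically close and hold similar data, then averages models only over that self-selected neighborhood. The node positions, label histograms, proximity radius, and similarity threshold are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N_NODES, DIM, N_CLASSES = 20, 10, 5
RADIUS = 0.3            # hypothetical geographical proximity cutoff
SIMILARITY_MIN = 0.8    # hypothetical data-similarity cutoff

positions = rng.random((N_NODES, 2))                     # node locations in the unit square
models = rng.normal(size=(N_NODES, DIM))                 # local model parameter vectors
label_hist = rng.dirichlet(np.ones(N_CLASSES), N_NODES)  # per-node label distributions

def cosine(a, b):
    """Cosine similarity between two distributions."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def neighborhood(i):
    """Peers within RADIUS of node i whose data distribution resembles node i's."""
    peers = []
    for j in range(N_NODES):
        close = np.linalg.norm(positions[i] - positions[j]) <= RADIUS
        similar = cosine(label_hist[i], label_hist[j]) >= SIMILARITY_MIN
        if close and similar:
            peers.append(j)
    return peers  # always contains i itself

# One decentralized aggregation round: every node replaces its model with the
# average over its self-selected neighborhood, with no central server involved.
models = np.stack([models[neighborhood(i)].mean(axis=0) for i in range(N_NODES)])

print("neighborhood of node 0:", neighborhood(0))
```

Running several such rounds would let nearby nodes with similar data converge toward shared, specialized models, which is the behavior the abstract attributes to PSFL; the exact federation-formation rule is defined in the full paper.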

Full paper