Federated learning traditionally depends on a centralized server to orchestrate model aggregation, which can limit scalability and resilience. This paper introduces a field-based coordination (FBC) paradigm that supports federated learning in a fully decentralized way. By leveraging computational fields, devices exchange local information that self-organizes into global coordination patterns, enabling model training without central control. The approach supports partitioned models, autonomous collaboration, and enhanced fault tolerance. Experimental evaluation shows that FBC achieves performance comparable to that of centralized approaches, while improving scalability and flexibility in distributed and dynamic environments.
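The core idea of local exchanges self-organizing into a global coordination pattern can be illustrated with a minimal gossip-averaging sketch: each device repeatedly averages its model parameters with those of its neighbors, and the whole network converges toward the global mean without any central server. This is an illustrative assumption, not the paper's actual FBC implementation; the topology, node names, and `gossip_round` function are all hypothetical.

```python
def gossip_round(models, neighbors):
    """One synchronous gossip step: each node replaces its model with
    the average of its own and its neighbors' parameter vectors."""
    new = {}
    for node, model in models.items():
        group = [model] + [models[n] for n in neighbors[node]]
        new[node] = [sum(vals) / len(group) for vals in zip(*group)]
    return new

# Illustrative ring topology of 4 devices, each with a 2-parameter model.
neighbors = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
models = {0: [1.0, 0.0], 1: [3.0, 2.0], 2: [5.0, 4.0], 3: [7.0, 6.0]}

for _ in range(50):
    models = gossip_round(models, neighbors)

# Purely local exchanges drive every node toward the global
# average [4.0, 3.0], with no central aggregator involved.
```

Because the mixing weights here are symmetric and sum to one at every node, repeated rounds contract all models toward the network-wide mean, which is the decentralized analogue of server-side federated averaging.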
