The growing demands of Society 5.0 call for intelligent systems that are both cooperative and energy-efficient. This paper introduces Sparse Self-Federated Learning (SSFL), a novel paradigm that combines sparsification with self-federation to reduce the communication and computation costs of distributed training. Sparsity keeps models lightweight while preserving accuracy, and self-federation lets nodes coordinate their training autonomously, without centralized orchestration. Experimental results show substantial gains in energy efficiency and scalability, highlighting the potential of SSFL for large-scale cooperative intelligence on edge devices and in cyber-physical systems.
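As a rough illustration of the mechanism the abstract describes, the sketch below pairs top-k sparsification of local updates with decentralized, gossip-style peer averaging in place of a central server. The peer-selection rule, the top-k operator, and all names (`top_k_sparsify`, `ssfl_round`, and so on) are assumptions made for illustration, not the paper's specified protocol.

```python
# Minimal sketch of the SSFL idea: each node trains locally, sparsifies its
# update (top-k), and averages with autonomously chosen peers. The gossip
# scheme and all names here are illustrative assumptions, not the paper's
# exact algorithm.
import numpy as np

def top_k_sparsify(update: np.ndarray, k: int) -> np.ndarray:
    """Keep only the k largest-magnitude entries; zero the rest."""
    sparse = np.zeros_like(update)
    idx = np.argpartition(np.abs(update), -k)[-k:]
    sparse[idx] = update[idx]
    return sparse

def local_update(model: np.ndarray, lr: float = 0.1) -> np.ndarray:
    """Placeholder for a local training step on the node's private data."""
    fake_grad = np.random.randn(*model.shape)  # stands in for a real gradient
    return -lr * fake_grad

def ssfl_round(models: list[np.ndarray], k: int) -> list[np.ndarray]:
    """One self-federated round: sparse updates shared with self-chosen peers."""
    n = len(models)
    updates = [top_k_sparsify(local_update(m), k) for m in models]
    new_models = []
    for i, model in enumerate(models):
        # Each node autonomously picks a small peer set; no central server.
        peers = np.random.choice([j for j in range(n) if j != i],
                                 size=2, replace=False)
        contributions = [updates[i]] + [updates[j] for j in peers]
        new_models.append(model + np.mean(contributions, axis=0))
    return new_models

models = [np.random.randn(1000) for _ in range(8)]  # 8 edge nodes, toy models
for _ in range(5):
    models = ssfl_round(models, k=50)  # only 5% of coordinates are exchanged
```

In this toy setup, the per-round communication per node is proportional to k rather than to the full model size, which is the source of the bandwidth and energy savings the abstract claims; the decentralized peer averaging removes the coordinator that classical federated learning requires.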