Computer Science > Machine Learning
Title: Resource-Aware Heterogeneous Federated Learning using Neural Architecture Search
(Submitted on 9 Nov 2022 (v1), last revised 1 May 2024 (this version, v2))
Abstract: Federated Learning (FL) is extensively used to train AI/ML models in distributed and privacy-preserving settings. Participant edge devices in FL systems typically hold non-independent and identically distributed (Non-IID) private data and unevenly distributed computational resources. Preserving user data privacy while optimizing AI/ML models in a heterogeneous federated network requires us to address both data and system/resource heterogeneity. To address these challenges, we propose Resource-aware Federated Learning (RaFL). RaFL allocates resource-aware specialized models to edge devices using Neural Architecture Search (NAS) and enables the deployment of heterogeneous model architectures via knowledge extraction and fusion. Combining NAS and FL enables on-demand customized model deployment for resource-diverse edge devices. Furthermore, we propose a multi-model architecture fusion scheme that aggregates the distributed learning results. Results demonstrate RaFL's superior resource efficiency compared to state-of-the-art (SoTA) approaches.
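The resource-aware allocation step described above can be illustrated with a minimal sketch: each edge device receives the most capable candidate architecture whose compute cost fits within its resource budget. The candidate pool, cost figures, and device names below are hypothetical placeholders, not taken from the paper's NAS search space.

```python
# Hypothetical sketch of resource-aware model allocation for heterogeneous
# edge devices. Architecture names and MFLOPs costs are illustrative only.
CANDIDATES = {
    "tiny": 60,
    "small": 150,
    "medium": 400,
    "large": 900,
}

def allocate(device_budgets):
    """Map each device id to the largest candidate architecture
    (by compute cost) that fits its MFLOPs budget."""
    allocation = {}
    for device, budget in device_budgets.items():
        feasible = [name for name, cost in CANDIDATES.items() if cost <= budget]
        # Fall back to the cheapest architecture if nothing fits the budget.
        allocation[device] = (
            max(feasible, key=CANDIDATES.get)
            if feasible
            else min(CANDIDATES, key=CANDIDATES.get)
        )
    return allocation

if __name__ == "__main__":
    budgets = {"phone": 200, "sensor": 50, "gateway": 1000}
    print(allocate(budgets))
    # → {'phone': 'small', 'sensor': 'tiny', 'gateway': 'large'}
```

In a full RaFL-style system, each device would then train its assigned specialized model locally, with a server-side fusion scheme aggregating knowledge across the heterogeneous architectures.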
Submission history
From: Sixing Yu [v1] Wed, 9 Nov 2022 09:38:57 GMT (1248kb,D)
[v2] Wed, 1 May 2024 03:31:12 GMT (3582kb,D)