Natl Sci Open, Volume 2, Number 1, 2023
Article Number: 20220043
Number of page(s): 17
Section: Information Sciences
DOI: https://doi.org/10.1360/nso/20220043
Published online: 10 January 2023
RESEARCH ARTICLE
DeceFL: a principled fully decentralized federated learning framework
1 School of Artificial Intelligence and Automation, Huazhong University of Science and Technology, Wuhan 430074, China
2 School of Mechanical Science and Engineering, Huazhong University of Science and Technology, Wuhan 430074, China
3 Department of Applied Mathematics, University of Waterloo, Waterloo N2L 3G1, Canada
4 State Key Laboratory of Synthetical Automation for Process Industries, Northeastern University, Shenyang 110819, China
5 School of Mathematical and Physical Sciences, Wuhan Textile University, Wuhan 430200, China
6 School of Electrical Engineering and Computer Science, and Digital Futures, KTH Royal Institute of Technology, Stockholm 10044, Sweden
7 Shenyang Institute of Automation (SIA), Chinese Academy of Sciences, Shenyang 110169, China
8 School of Mathematics, Frontiers Science Center for Mobile Information Communication and Security, Southeast University, Nanjing 211189, China
9 Purple Mountain Laboratories, Nanjing 211111, China
10 AVIC Chengdu Aircraft Industrial (Group) Co., Ltd., Chengdu 610091, China
* Corresponding author (email: yye@hust.edu.cn)
Received: 2 August 2022
Revised: 5 October 2022
Accepted: 28 October 2022
Traditional machine learning relies on a centralized data pipeline for model training in various applications; however, data are inherently fragmented. This decentralized nature of databases poses a serious challenge for collaboration: sending all decentralized datasets to a central server raises serious privacy concerns. Although there has been a joint effort to tackle this critical issue by proposing privacy-preserving machine learning frameworks, such as federated learning, most state-of-the-art frameworks are still built in a centralized way, in which a central client is needed to collect and distribute model information (instead of the data itself) from every other client, leading to a high communication burden and high vulnerability to a failure at, or an attack on, the central client. Here we propose a principled decentralized federated learning algorithm (DeceFL), which does not require a central client and relies only on local information transmission between clients and their neighbors, representing a fully decentralized learning framework. We further prove that every client reaches the global minimum with zero performance gap and achieves the same convergence rate O(1/T) (where T is the number of iterations of gradient descent) as centralized federated learning when the loss function is smooth and strongly convex. Finally, the proposed algorithm has been applied to a number of problems to illustrate its effectiveness for both convex and nonconvex loss functions, time-invariant and time-varying topologies, and IID and non-IID datasets, demonstrating its applicability to a wide range of real-world medical and industrial applications.
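To make the idea of neighbor-only model exchange concrete, the following is a minimal numerical sketch, not the paper's actual DeceFL algorithm: each client holds a private quadratic loss, mixes its model with its ring neighbors through a doubly stochastic weight matrix, and takes a local gradient step with a diminishing step size. All names (`W`, `eta`, `T`) and the toy losses are illustrative assumptions, not taken from the article.

```python
import numpy as np

# Toy decentralized gradient descent on a ring of 4 clients.
# Client i holds the local loss f_i(w) = 0.5 * ||w - c_i||^2,
# so the global minimizer is the mean of the c_i.
rng = np.random.default_rng(0)
n_clients, dim = 4, 3
c = rng.normal(size=(n_clients, dim))   # private "data" of each client
w = np.zeros((n_clients, dim))          # one local model per client

# Doubly stochastic mixing matrix for a ring topology:
# each client averages with itself and its two neighbors only.
W = np.zeros((n_clients, n_clients))
for i in range(n_clients):
    W[i, i] = 0.5
    W[i, (i - 1) % n_clients] = 0.25
    W[i, (i + 1) % n_clients] = 0.25

T = 5000
for t in range(T):
    eta = 1.0 / (t + 2)                 # diminishing step for exact convergence
    grads = w - c                       # gradient of each local quadratic loss
    w = W @ w - eta * grads             # mix with neighbors, then local step

w_star = c.mean(axis=0)                 # global minimizer of the averaged loss
print(np.max(np.abs(w - w_star)))       # every client is close to w_star
```

Only model vectors cross the network, never the private `c_i`, and no client communicates beyond its two neighbors; after enough rounds all clients agree on (approximately) the global minimizer.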
Key words: decentralized federated learning / smart manufacturing / control systems privacy
© The Author(s) 2023. Published by China Science Publishing & Media Ltd. and EDP Sciences.
This is an Open Access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.