Ying-Hsun Lai, Shin-Yeh Chen, Wen-Chi Chou, Hua-Yang Hsu and Han-Chieh Chao
Federated learning trains a neural network model using the clients' data to maintain the benefits of centralized model training while maintaining their privacy. However, if the client data are not independently and identically distributed (non-IID) becau...
Hanyue Xu, Kah Phooi Seng, Jeremy Smith and Li Minn Ang
In the context of smart cities, the integration of artificial intelligence (AI) and the Internet of Things (IoT) has led to the proliferation of AIoT systems, which handle vast amounts of data to enhance urban infrastructure and services. However, the co...
Pin-Hung Juan and Ja-Ling Wu
In this study, we present a federated learning approach that combines a multi-branch network and the Oort client selection algorithm to improve the performance of federated learning systems. This method successfully addresses the significant issue of non...
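Oort-style client selection ranks clients by a statistical utility (how informative their local losses are) discounted by a system-speed penalty for stragglers. The following is a minimal illustrative sketch of that idea, not the paper's or Oort's exact specification: the field names (`loss_sum`, `n_samples`, `duration`) and the penalty form are assumptions for illustration.

```python
import math

def oort_select(clients, budget, alpha=2.0):
    """Pick `budget` clients by utility = statistical term x speed penalty.

    clients: dict mapping client id -> {"loss_sum", "n_samples", "duration"}.
    The deadline is set by the `budget`-th fastest client; slower clients
    are penalized by (deadline / duration) ** alpha, as in Oort's spirit.
    """
    deadline = sorted(c["duration"] for c in clients.values())[budget - 1]

    def utility(c):
        # statistical utility: dataset size scaled by average loss magnitude
        stat = c["n_samples"] * math.sqrt(c["loss_sum"] / c["n_samples"])
        # system utility: penalize clients slower than the round deadline
        penalty = (deadline / c["duration"]) ** alpha if c["duration"] > deadline else 1.0
        return stat * penalty

    ranked = sorted(clients, key=lambda cid: utility(clients[cid]), reverse=True)
    return ranked[:budget]

# hypothetical client pool: high training loss + fast device wins selection
clients = {
    "fast_high_loss": {"loss_sum": 90.0, "n_samples": 30, "duration": 5.0},
    "slow_high_loss": {"loss_sum": 90.0, "n_samples": 30, "duration": 50.0},
    "fast_low_loss":  {"loss_sum": 3.0,  "n_samples": 30, "duration": 5.0},
    "mid":            {"loss_sum": 30.0, "n_samples": 30, "duration": 10.0},
}
selected = oort_select(clients, budget=2)
```

The slow client is demoted despite its high loss, which is the trade-off this family of selectors makes between statistical value and round latency.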
Yankai Lv, Haiyan Ding, Hao Wu, Yiji Zhao and Lei Zhang
Federated learning (FL) is an emerging decentralized machine learning framework enabling private global model training by collaboratively leveraging local client data without transferring it centrally. Unlike traditional distributed optimization, FL trai...
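The collaborative training pattern this abstract describes — clients compute updates on private local data and a server aggregates them without ever seeing the data — can be illustrated with a minimal FedAvg-style sketch. The toy linear model, learning rate, and round count below are assumptions for demonstration, not any paper's protocol.

```python
import numpy as np

def local_update(global_weights, data, lr=0.1):
    """One local gradient step on a client's private data (toy linear model)."""
    X, y = data
    grad = X.T @ (X @ global_weights - y) / len(y)  # mean-squared-error gradient
    return global_weights - lr * grad

def fedavg_round(global_weights, client_datasets):
    """Server aggregates client models, weighted by local dataset size."""
    updates = [local_update(global_weights, d) for d in client_datasets]
    sizes = np.array([len(d[1]) for d in client_datasets], dtype=float)
    return np.average(updates, axis=0, weights=sizes)

# two hypothetical clients whose data share the true model w* = [2.0]
rng = np.random.default_rng(0)
clients = []
for n in (50, 150):
    X = rng.normal(size=(n, 1))
    clients.append((X, X @ np.array([2.0])))

w = np.zeros(1)
for _ in range(200):
    w = fedavg_round(w, clients)
# w converges toward 2.0 without either client's raw data leaving "its device"
```

Only model parameters cross the client-server boundary here, which is the privacy argument FL rests on (modulo the gradient-leakage caveats the literature discusses).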
Kavitha Srinivasan, Sainath Prasanna, Rohit Midha and Shraddhaa Mohan
pp. 1-20
Advances have been made in the field of Machine Learning showing that it is an effective tool that can be used for solving real world problems. This success is hugely attributed to the availability of accessible data which is not the case for many fields...
Shao-Ming Lee and Ja-Ling Wu
Recently, federated learning (FL) has gradually become an important research topic in machine learning and information theory. FL emphasizes that clients jointly engage in solving learning tasks. In addition to data security issues, fundamental challenge...
Qianqian Tong, Guannan Liang, Jiahao Ding, Tan Zhu, Miao Pan and Jinbo Bi
Regularized sparse learning with the ℓ0-norm is important in many areas, including statistical learning and signal processing. Iterative hard thresholding (IHT) methods are the state-of-the-art for nonconvex-constrained sparse learning due to their ...
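The core of IHT is a projected gradient step: take a gradient step on the least-squares loss, then project onto the ℓ0 ball by keeping only the k largest-magnitude entries. A minimal sketch for a toy compressed-sensing problem (the step-size rule, problem sizes, and support indices are illustrative assumptions):

```python
import numpy as np

def hard_threshold(x, k):
    """Projection onto {x : ||x||_0 <= k}: keep the k largest-magnitude entries."""
    out = np.zeros_like(x)
    keep = np.argsort(np.abs(x))[-k:]
    out[keep] = x[keep]
    return out

def iht(A, b, k, iters=300):
    """Iterative hard thresholding for min ||Ax - b||^2 s.t. ||x||_0 <= k."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # conservative step from the spectral norm
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = hard_threshold(x - step * A.T @ (A @ x - b), k)
    return x

# toy recovery problem: a 3-sparse signal in 200 dimensions, 100 measurements
rng = np.random.default_rng(1)
A = rng.normal(size=(100, 200)) / np.sqrt(100)  # roughly unit-norm columns
x_true = np.zeros(200)
x_true[[3, 50, 120]] = [2.0, -2.0, 2.0]
x_hat = iht(A, A @ x_true, k=3)
```

The nonconvexity the abstract refers to lives entirely in the `hard_threshold` projection; the gradient step itself is the ordinary least-squares update.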
Sumit Rai, Arti Kumari and Dilip K. Prasad
Federated learning promises an elegant solution for learning global models across distributed and privacy-protected datasets. However, challenges related to skewed data distribution, limited computational and communication resources, data poisoning, and ...
Ahmed A. Al-Saedi, Veselka Boeva and Emiliano Casalicchio
Federated Learning (FL) provides a promising solution for preserving privacy in learning shared models on distributed devices without sharing local data on a central server. However, most existing work shows that FL incurs high communication costs. To ad...
Zheyi Chen, Weixian Liao, Pu Tian, Qianlong Wang and Wei Yu
Distributed machine learning paradigms have benefited from the concurrent advancement of deep learning and the Internet of Things (IoT), among which federated learning is one of the most promising frameworks, where a central server collaborates with loca...