ARTICLE

Federated Edge Intelligence and Edge Caching Mechanisms

Aristeidis Karras, Christos Karras, Konstantinos C. Giotopoulos, Dimitrios Tsolis, Konstantinos Oikonomou and Spyros Sioutas

Abstract

Federated learning (FL) has emerged as a promising technique for preserving user privacy and ensuring data security in distributed machine learning, particularly in edge intelligence and edge caching applications. Recognizing that imbalanced and noisy data commonly undermine scalability and resilience, our study introduces two algorithms designed for FL within a peer-to-peer framework, aimed at improving performance in decentralized and resource-limited settings. We further propose a client-balancing Dirichlet sampling algorithm with probabilistic guarantees that mitigates oversampling and optimizes the data distribution among clients, yielding more accurate and reliable model training. In our experiments, we employed 10, 20, and 40 Raspberry Pi devices as clients in a practical FL scenario that simulates real-world conditions. The well-known FedAvg algorithm was implemented, allowing each client to train for multiple epochs before its weights are aggregated. We also examined the influence of real-world dataset noise, concluding with a performance analysis that shows how the proposed methods advance robust and efficient FL and thereby improve decentralized machine learning applications such as edge intelligence and edge caching.
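The abstract names two mechanisms, Dirichlet-based client sampling and FedAvg aggregation with multi-epoch local training. The Python sketch below illustrates both in a generic form: a per-class Dirichlet partition of sample indices across clients and a weighted per-layer average of client weights. It is a minimal sketch, not the authors' implementation; the names dirichlet_partition and fedavg, the alpha concentration parameter, and the toy driver at the bottom are illustrative assumptions, and the partitioner omits the paper's probabilistic client-balancing guarantees.

import numpy as np

# NOTE: a generic sketch of Dirichlet partitioning and FedAvg aggregation,
# not the client-balancing algorithm proposed in the article.

def dirichlet_partition(labels, num_clients, alpha=0.5, seed=0):
    """Split sample indices across clients using a per-class Dirichlet prior.

    Smaller alpha yields more skewed (non-IID) per-client label mixes;
    larger alpha approaches an even split.
    """
    rng = np.random.default_rng(seed)
    labels = np.asarray(labels)
    client_indices = [[] for _ in range(num_clients)]
    for c in np.unique(labels):
        idx = np.where(labels == c)[0]
        rng.shuffle(idx)
        # Draw the fraction of this class assigned to each client.
        proportions = rng.dirichlet(alpha * np.ones(num_clients))
        cut_points = (np.cumsum(proportions)[:-1] * len(idx)).astype(int)
        for shard, bucket in zip(np.split(idx, cut_points), client_indices):
            bucket.extend(shard.tolist())
    return client_indices

def fedavg(client_weights, client_sizes):
    """FedAvg aggregation: per-layer average weighted by each client's data size."""
    total = float(sum(client_sizes))
    aggregated = [np.zeros_like(layer, dtype=float) for layer in client_weights[0]]
    for weights, n in zip(client_weights, client_sizes):
        for i, layer in enumerate(weights):
            aggregated[i] += (n / total) * layer
    return aggregated

if __name__ == "__main__":
    # Toy run: 1,000 labelled samples spread over 10 simulated clients.
    rng = np.random.default_rng(1)
    labels = rng.integers(0, 10, size=1000)
    parts = dirichlet_partition(labels, num_clients=10, alpha=0.3)
    sizes = [len(p) for p in parts]
    # Pretend each client trained locally and returned two weight tensors.
    weights = [[rng.random((4, 4)), rng.random(4)] for _ in parts]
    global_weights = fedavg(weights, sizes)
    print("client sizes:", sizes)
    print("aggregated layer shapes:", [w.shape for w in global_weights])

In the study itself the clients are 10, 20, or 40 Raspberry Pi devices that run multiple local epochs before their weights are merged; here they are simulated with random tensors only so the sketch runs end to end.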

Similar Articles

Stela Stoykova and Nikola Shakev    
The aim of this paper is to present a systematic literature review of the existing research, published between 2006 and 2023, in the field of artificial intelligence for management information systems. Of the 3946 studies that were considered by the auth... see more
Journal: Algorithms

 
Jaime Rincon, Vicente Julian and Carlos Carrascosa    
In recent years, federated learning has emerged as a new paradigm for training machine learning models oriented to distributed systems. The main idea is that each node of a distributed system independently trains a model and shares only model parameters, ... see more
Journal: Applied Sciences

 
Amit Portnoy, Yoav Tirosh and Danny Hendler    
The paper provides a solution for practical federated learning tasks in which a dataset is partitioned among potentially malicious clients. One such case is training a model on edge medical devices, where a compromised device could not only lead to lower... see more
Journal: Applied Sciences

 
Sumit Rai, Arti Kumari and Dilip K. Prasad    
Federated learning promises an elegant solution for learning global models across distributed and privacy-protected datasets. However, challenges related to skewed data distribution, limited computational and communication resources, data poisoning, and ... see more
Journal: AI

 
Olivier Debauche, Jean Bertin Nkamla Penka, Saïd Mahmoudi, Xavier Lessage, Moad Hani, Pierre Manneback, Uriel Kanku Lufuluabu, Nicolas Bert, Dounia Messaoudi and Adriano Guttadauria    
The aging of the world's population, the willingness of the elderly to remain independent, and the recent COVID-19 pandemic have demonstrated the urgent need for home-based diagnostic and patient monitoring systems to reduce the financial and organizational ... see more
Journal: Information