Future Internet, Vol. 15, No. 9 (2023)
Article

Exploring Homomorphic Encryption and Differential Privacy Techniques towards Secure Federated Learning Paradigm

Rezak Aziz, Soumya Banerjee, Samia Bouzefrane and Thinh Le Vinh

Abstract

The trend of the next generation of the internet has already been scrutinized by top analytics enterprises. According to Gartner investigations, it is predicted that, by 2024, 75% of the global population will have their personal data covered under privacy regulations. This alarming statistic necessitates the orchestration of several security components to address the enormous challenges posed by federated and distributed learning environments. Federated learning (FL) is a promising technique that allows multiple parties to collaboratively train a model without sharing their data. However, even though FL is seen as a privacy-preserving distributed machine learning method, recent works have demonstrated that FL is vulnerable to some privacy attacks. Homomorphic encryption (HE) and differential privacy (DP) are two promising techniques that can be used to address these privacy concerns. HE allows secure computations on encrypted data, while DP provides strong privacy guarantees by adding noise to the data. This paper first presents consistent attacks on privacy in federated learning and then provides an overview of HE and DP techniques for secure federated learning in next-generation internet applications. It discusses the strengths and weaknesses of these techniques in different settings as described in the literature, with a particular focus on the trade-off between privacy and convergence, as well as the computation overheads involved. The objective of this paper is to analyze the challenges associated with each technique and identify potential opportunities and solutions for designing a more robust, privacy-preserving federated learning framework.
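
As a rough illustration of the differential privacy side of this trade-off, the minimal sketch below clips each client's model update and adds Gaussian noise before server-side averaging, which is the basic mechanism behind differentially private federated averaging. It is not taken from the paper: the parameter names (clip_norm, noise_multiplier) and values are hypothetical, it uses plain NumPy, and it omits the homomorphic-encryption path entirely.

    import numpy as np

    def clip_update(update, clip_norm):
        # Bound each client's influence by clipping the update to a maximum L2 norm.
        norm = np.linalg.norm(update)
        return update * min(1.0, clip_norm / (norm + 1e-12))

    def dp_noisy_update(update, clip_norm=1.0, noise_multiplier=1.1, rng=None):
        # Gaussian mechanism: clip, then add noise scaled to the clipping bound.
        rng = rng if rng is not None else np.random.default_rng()
        clipped = clip_update(update, clip_norm)
        noise = rng.normal(0.0, noise_multiplier * clip_norm, size=update.shape)
        return clipped + noise

    def aggregate(updates):
        # Server-side step: average the (already noisy) client updates.
        return np.mean(updates, axis=0)

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        # Toy "model updates" from three clients (e.g., gradients of a shared model).
        client_updates = [rng.normal(size=10) for _ in range(3)]
        noisy = [dp_noisy_update(u, rng=rng) for u in client_updates]
        print(aggregate(noisy))

Larger noise_multiplier values give stronger privacy but slower or poorer convergence, which is exactly the privacy/utility trade-off the paper surveys.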

Similar articles

Ying-Hsun Lai, Shin-Yeh Chen, Wen-Chi Chou, Hua-Yang Hsu and Han-Chieh Chao    
Federated learning trains a neural network model using the client's data to maintain the benefits of centralized model training while maintaining their privacy. However, if the client data are not independently and identically distributed (non-IID) becau…
Journal: Future Internet

 
Haedam Kim, Suhyun Park, Hyemin Hong, Jieun Park and Seongmin Kim    
As the size of the IoT solutions and services market proliferates, industrial fields utilizing IoT devices are also diversifying. However, the proliferation of IoT devices, often intertwined with users' personal information and privacy, has led to a cont…
Journal: Future Internet

 
Jing Liu, Xuesong Hai and Keqin Li    
Massive amounts of data drive the performance of deep learning models, but in practice, data resources are often highly dispersed and bound by data privacy and security concerns, making it difficult for multiple data sources to share their local data dir…
Journal: Future Internet

 
Wei He and Mingze Chen    
The advancement of cutting-edge technologies significantly transforms urban lifestyles and is indispensable in sustainable urban design and planning. This systematic review focuses on the critical role of innovative technologies and digitalization, parti…
Journal: Buildings

 
Zacharias Anastasakis, Terpsichori-Helen Velivassaki, Artemis Voulkidis, Stavroula Bourou, Konstantinos Psychogyios, Dimitrios Skias and Theodore Zahariadis    
Federated Learning is identified as a reliable technique for distributed training of ML models. Specifically, a set of dispersed nodes may collaborate through a federation in producing a jointly trained ML model without disclosing their data to each othe…
Journal: Future Internet