ARTICLE
Algorithms, Vol. 15, No. 7 (2022)

ZenoPS: A Distributed Learning System Integrating Communication Efficiency and Security

Cong Xie, Oluwasanmi Koyejo and Indranil Gupta

Abstract

Distributed machine learning is primarily motivated by the promise of increased computation power for accelerating training and mitigating privacy concerns. Unlike machine learning on a single device, distributed machine learning requires collaboration and communication among the devices. This creates several new challenges: (1) the heavy communication overhead can be a bottleneck that slows down the training, and (2) the unreliable communication and weaker control over the remote entities make the distributed system vulnerable to systematic failures and malicious attacks. This paper presents a variant of stochastic gradient descent (SGD) with improved communication efficiency and security in distributed environments. Our contributions include (1) a new technique called error reset to adapt both infrequent synchronization and message compression for communication reduction in both synchronous and asynchronous training, (2) new score-based approaches for validating the updates, and (3) integration with both error reset and score-based validation. The proposed system provides communication reduction, both synchronous and asynchronous training, Byzantine tolerance, and local privacy preservation. We evaluate our techniques both theoretically and empirically.
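To make the high-level ideas above concrete, the following is a minimal, illustrative sketch, not the ZenoPS implementation: it combines local SGD steps (infrequent synchronization), top-k message compression with a carried-over residual (in the spirit of, though not necessarily identical to, the paper's error reset technique), and a toy score-based filter on the server side. All function names, the cosine-similarity score, and the thresholds are assumptions introduced here for illustration only.

import numpy as np

def topk_compress(v, k):
    # Keep the k largest-magnitude entries of v and zero out the rest.
    out = np.zeros_like(v)
    idx = np.argpartition(np.abs(v), -k)[-k:]
    out[idx] = v[idx]
    return out

class Worker:
    def __init__(self, model, grad_fn, lr=0.01, local_steps=5, k=100):
        self.model = model.copy()
        self.grad_fn = grad_fn            # returns a stochastic gradient for a given model
        self.lr = lr
        self.local_steps = local_steps    # several local steps = infrequent synchronization
        self.k = k
        self.residual = np.zeros_like(model)  # accumulated compression error

    def local_round(self, global_model):
        # Run local SGD, then send a compressed update; keep what was not sent.
        self.model = global_model.copy()
        for _ in range(self.local_steps):
            self.model -= self.lr * self.grad_fn(self.model)
        update = self.model - global_model + self.residual
        msg = topk_compress(update, self.k)   # message compression
        self.residual = update - msg          # residual carried to the next round
        return msg

def score(update, reference, eps=1e-8):
    # Toy validation score: cosine similarity to a trusted reference update.
    return float(update @ reference) / (np.linalg.norm(update) * np.linalg.norm(reference) + eps)

def aggregate(global_model, updates, reference, threshold=0.0):
    # Accept only updates whose score exceeds the threshold (crude Byzantine filtering).
    accepted = [u for u in updates if score(u, reference) > threshold]
    if accepted:
        global_model = global_model + np.mean(accepted, axis=0)
    return global_model

In this toy version the reference update would come from a small trusted dataset held by the validator; the actual ZenoPS scoring rules and their integration with error reset are specified in the paper.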

Keywords

Similar articles

 
Yuzhu Zhang and Hao Xu    
This study investigates the problem of decentralized dynamic resource allocation optimization for ad-hoc network communication with the support of reconfigurable intelligent surfaces (RIS), leveraging a reinforcement learning framework. In the present co...
Journal: Algorithms

 
Tuan Phong Tran, Anh Hung Ngoc Tran, Thuan Minh Nguyen and Myungsik Yoo    
Multi-access edge computing (MEC) brings computations closer to mobile users, thereby decreasing service latency and providing location-aware services. Nevertheless, given the constrained resources of the MEC server, it is crucial to provide a limited nu...
Journal: Applied Sciences

 
Changhao Wu, Siyang He, Zengshan Yin and Chongbin Guo    
Large-scale low Earth orbit (LEO) remote satellite constellations have become a brand new, massive source of space data. Federated learning (FL) is considered a promising distributed machine learning technology that can communicate optimally using these ...
Journal: Applied Sciences

 
Alya Alshammari and Khalil El Hindi    
The combination of collaborative deep learning and Cyber-Physical Systems (CPSs) has the potential to improve decision-making, adaptability, and efficiency in dynamic and distributed environments. However, it brings privacy, communication, and resource r...
Journal: Applied Sciences

 
Nyo Me Htun, Toshiaki Owari, Satoshi Tsuyuki and Takuya Hiroshima    
High-value timber species with economic and ecological importance are usually distributed at very low densities, such that accurate knowledge of the location of these trees within a forest is critical for forest management practices. Recent technological...
Journal: Algorithms