Home / Information / Vol. 15, Iss. 4 (2024) / Article

Quantum Convolutional Long Short-Term Memory Based on Variational Quantum Algorithms in the Era of NISQ

Zeyu Xu, Wenbin Yu, Chengjun Zhang and Yadang Chen


In the era of noisy intermediate-scale quantum (NISQ) computing, synergistic collaboration between quantum and classical computing models has emerged as a promising approach to tackling complex computational challenges. Long short-term memory (LSTM) is a widely used and effective network for modeling sequential data. However, as demands for data and spatial feature extraction increase, the training cost of LSTM exhibits exponential growth. In this study, we propose the quantum convolutional long short-term memory (QConvLSTM) model. By integrating classical convolutional LSTM (ConvLSTM) networks with variational quantum algorithms, we leverage variational quantum properties and the accelerating characteristics of quantum states to optimize the model training process. Experimental validation demonstrates that the proposed QConvLSTM model outperforms various LSTM variants. Additionally, we adopt a hierarchical tree-like circuit design to enhance the model's parallel computing capabilities while reducing its dependence on qubit counts and circuit depth. Moreover, the inherent noise resilience of variational quantum algorithms makes the model well suited to spatiotemporal sequence modeling tasks on NISQ devices.
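The abstract does not include code, and the authors' actual QConvLSTM circuit is not reproduced here. As a rough illustration of the core idea, replacing a classical gate activation with a variational quantum circuit whose measurement outcome plays the role of a sigmoid-like gate value, the sketch below simulates a tiny angle-encoded circuit (RY encoding, trainable RY rotations, a CNOT entangling chain) directly with NumPy statevectors. The circuit layout, function names, and the mapping of the Z expectation to a (0, 1) gate value are illustrative assumptions, not the paper's design.

```python
import numpy as np

def ry(theta):
    """Single-qubit RY rotation matrix."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def apply_single(state, gate, qubit, n_qubits):
    """Apply a 2x2 gate to one qubit of an n-qubit statevector."""
    state = state.reshape([2] * n_qubits)
    state = np.tensordot(gate, state, axes=([1], [qubit]))
    state = np.moveaxis(state, 0, qubit)
    return state.reshape(-1)

def apply_cnot(state, control, target, n_qubits):
    """Apply CNOT: flip the target amplitudes where the control is |1>."""
    state = state.reshape([2] * n_qubits).copy()
    idx0 = [slice(None)] * n_qubits
    idx1 = [slice(None)] * n_qubits
    idx0[control], idx0[target] = 1, 0
    idx1[control], idx1[target] = 1, 1
    tmp = state[tuple(idx0)].copy()
    state[tuple(idx0)] = state[tuple(idx1)]
    state[tuple(idx1)] = tmp
    return state.reshape(-1)

def vqc_gate(x, params):
    """Hypothetical variational 'gate' unit: angle-encode the inputs,
    apply a trainable RY layer and a CNOT chain, then map the Z
    expectation on qubit 0 into (0, 1), analogous to a sigmoid gate."""
    n = len(x)
    state = np.zeros(2 ** n)
    state[0] = 1.0                                  # |0...0>
    for q in range(n):                              # data encoding
        state = apply_single(state, ry(x[q]), q, n)
    for q in range(n):                              # trainable rotations
        state = apply_single(state, ry(params[q]), q, n)
    for q in range(n - 1):                          # entangling layer
        state = apply_cnot(state, q, q + 1, n)
    probs = np.abs(state) ** 2
    z = probs.reshape(2, -1).sum(axis=1)            # marginal of qubit 0
    exp_z = z[0] - z[1]                             # <Z> on qubit 0
    return (1.0 + exp_z) / 2.0                      # squash to (0, 1)

# With zero inputs and zero parameters the circuit stays in |0...0>,
# so the gate is fully open (1.0); an input of pi closes it (0.0).
print(vqc_gate(np.zeros(2), np.zeros(2)))           # -> 1.0
print(vqc_gate(np.array([np.pi, 0.0]), np.zeros(2)))  # -> 0.0
```

In a ConvLSTM-style cell, such a unit would be evaluated per feature channel after a classical convolution, with `params` trained by a classical optimizer in the usual variational hybrid loop; in the paper's setting the circuit would run on quantum hardware rather than a simulator.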
