Title

Reconstruct Recurrent Neural Networks via Flexible Sub-Models for Time Series Classification

Ye Ma, Qing Chang, Huanzhang Lu and Junliang Liu

Abstract

Recurrent neural networks (RNNs) remain challenging to train, and they still lack long-term memory and learning ability in sequential data classification and prediction. In this paper, we propose a flexible recurrent model, BIdirectional COnvolutional RaNdom RNNs (BICORN-RNNs), that incorporates a series of sub-models: random projection, convolutional operation, and bidirectional transmission. These sub-models improve classification accuracy, which is otherwise limited by the vanishing and exploding gradient problems. Experiments on public time series datasets demonstrate that our proposed method substantially outperforms a variety of existing models. Furthermore, the trade-off between accuracy and efficiency with respect to a variety of factors, including signal-to-noise ratio (SNR), sequence length, missing data, and overlapping, is also discussed.
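To make the abstract's architecture concrete, the following is a minimal, illustrative sketch (not the authors' code) of how the three named sub-models, random projection, convolutional operation, and bidirectional transmission, could be combined for time series classification. It assumes a PyTorch implementation; the layer sizes, the use of LSTM cells, and the classifier head are assumptions for illustration only.

```python
# Hypothetical sketch of a BICORN-style classifier; not the paper's implementation.
import torch
import torch.nn as nn

class BicornLikeRNN(nn.Module):
    def __init__(self, in_dim=1, proj_dim=32, conv_channels=64,
                 hidden_dim=128, num_classes=10):
        super().__init__()
        # Random projection: a fixed (non-trainable) linear map applied per time step.
        self.random_proj = nn.Linear(in_dim, proj_dim, bias=False)
        self.random_proj.weight.requires_grad_(False)
        # Convolutional operation along the time axis to capture local patterns.
        self.conv = nn.Conv1d(proj_dim, conv_channels, kernel_size=5, padding=2)
        # Bidirectional transmission: a bidirectional recurrent layer.
        self.rnn = nn.LSTM(conv_channels, hidden_dim, batch_first=True,
                           bidirectional=True)
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, x):                    # x: (batch, time, in_dim)
        x = self.random_proj(x)              # (batch, time, proj_dim)
        x = self.conv(x.transpose(1, 2))     # (batch, conv_channels, time)
        x = torch.relu(x).transpose(1, 2)    # back to (batch, time, conv_channels)
        _, (h, _) = self.rnn(x)              # h: (2, batch, hidden_dim)
        h = torch.cat([h[0], h[1]], dim=-1)  # concatenate forward/backward states
        return self.classifier(h)            # class logits

# Usage: classify a batch of 8 univariate series of length 100.
logits = BicornLikeRNN()(torch.randn(8, 100, 1))
print(logits.shape)  # torch.Size([8, 10])
```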
