- Title
Reconstruct Recurrent Neural Networks via Flexible Sub-Models for Time Series Classification.
- Authors
Ma, Ye; Chang, Qing; Lu, Huanzhang; Liu, Junliang
- Abstract
Recurrent neural networks (RNNs) remain challenging to train, and they still lack long-term memory and learning ability in sequential data classification and prediction. In this paper, we propose a flexible recurrent model, BIdirectional COnvolutional RaNdom RNNs (BICORN-RNNs), incorporating a series of sub-models: random projection, convolutional operation, and bidirectional transmission. These sub-models advance classification accuracy, which was previously limited by the gradient vanishing and exploding problems. Experiments on public time series datasets demonstrate that our proposed method substantially outperforms a variety of existing models. Furthermore, the trade-off between accuracy and efficiency with respect to a variety of factors, including SNR, sequence length, missing data, and overlapping, is also discussed.
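The abstract names three sub-models combined with an RNN: random projection of the input, a convolutional operation, and bidirectional transmission. The paper's exact formulation is not given here, so the following is only a minimal NumPy sketch of how such a pipeline could be wired together; all function names, dimensions, and the moving-average kernel are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_projection(x, out_dim):
    # Project each time step through a fixed Gaussian matrix
    # (hypothetical stand-in for the random-projection sub-model).
    R = rng.standard_normal((x.shape[-1], out_dim)) / np.sqrt(out_dim)
    return x @ R

def conv1d(x, kernel_size=3):
    # Channel-wise moving-average convolution along time (illustrative
    # stand-in for the convolutional sub-model).
    k = np.ones(kernel_size) / kernel_size
    return np.stack(
        [np.convolve(x[:, j], k, mode="same") for j in range(x.shape[1])],
        axis=1,
    )

def rnn_pass(x, W_x, W_h):
    # One directional pass of a vanilla tanh RNN over a (T, D) sequence.
    h = np.zeros(W_h.shape[0])
    hs = []
    for t in range(x.shape[0]):
        h = np.tanh(x[t] @ W_x + h @ W_h)
        hs.append(h)
    return np.stack(hs)

def bicorn_features(x, proj_dim=8, hidden=4):
    # Chain the sub-models: projection -> convolution -> forward and
    # backward RNN passes, concatenated per time step (bidirectional).
    z = conv1d(random_projection(x, proj_dim))
    W_x = rng.standard_normal((proj_dim, hidden)) * 0.1
    W_h = rng.standard_normal((hidden, hidden)) * 0.1
    fwd = rnn_pass(z, W_x, W_h)
    bwd = rnn_pass(z[::-1], W_x, W_h)[::-1]
    return np.concatenate([fwd, bwd], axis=1)  # shape (T, 2 * hidden)

x = rng.standard_normal((20, 16))  # toy series: 20 steps, 16 channels
feats = bicorn_features(x)
print(feats.shape)  # (20, 8)
```

In a classification setting, the per-step features would typically be pooled over time and fed to a classifier; the sketch stops at feature extraction since the paper's head architecture is not described in the abstract.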
- Subjects
ARTIFICIAL neural networks; RANDOM projection method; TIME series analysis
- Publication
Applied Sciences (2076-3417), 2018, Vol 8, Issue 4, p630
- ISSN
2076-3417
- Publication type
Article
- DOI
10.3390/app8040630