- Title
Efficient Attention Mechanism for Dynamic Convolution in Lightweight Neural Network.
- Authors
Ding, Enjie; Cheng, Yuhao; Xiao, Chengcheng; Liu, Zhongyu; Yu, Wanli; La Foresta, Fabio
- Abstract
Lightweight convolutional neural networks (CNNs) suffer from limited feature representation capabilities due to low computational budgets, resulting in degraded performance. To make CNNs more efficient, dynamic neural networks (DyNet) have been proposed, which increase model complexity by using the Squeeze-and-Excitation (SE) module to adaptively weight the importance of each convolution kernel through an attention mechanism. However, the attention mechanism in the SE network (SENet) uses all channel information in its calculations, which raises two essential challenges: (a) interference from internally redundant information and (b) an increased number of network computations. To address these problems, this work proposes a dynamic convolutional network (termed EAM-DyNet) that reduces the number of channels in feature maps by extracting only the useful spatial information. EAM-DyNet first uses random channel reduction and channel grouping reduction to remove redundant information. Because downsampling can discard useful information, it then applies adaptive average pooling to maintain information integrity. Extensive experiments on baseline networks demonstrate that EAM-DyNet outperforms existing approaches, achieving higher test accuracy with fewer network parameters.
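The abstract describes the mechanism in enough detail to sketch: an SE-style attention head that first reduces channels by grouping, then summarizes the spatial content with adaptive average pooling before producing per-kernel weights for a dynamic convolution. Below is a minimal, hypothetical PyTorch sketch of that idea; the module name GroupedPoolAttention and the parameter choices (groups, pool_size) are illustrative assumptions, not the authors' EAM-DyNet implementation.

```python
import torch
import torch.nn as nn


class GroupedPoolAttention(nn.Module):
    # Sketch of the abstract's idea (assumed interpretation): compute
    # per-kernel attention weights for a dynamic convolution from a
    # channel-reduced feature map instead of from all channels.
    def __init__(self, in_channels: int, num_kernels: int,
                 groups: int = 4, pool_size: int = 2):
        super().__init__()
        assert in_channels % groups == 0
        self.groups = groups
        # Adaptive average pooling keeps a small spatial summary rather
        # than collapsing to 1x1, limiting the loss of useful information.
        self.pool = nn.AdaptiveAvgPool2d(pool_size)
        self.fc = nn.Linear(groups * pool_size * pool_size, num_kernels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        # Channel grouping reduction: average each group of channels down
        # to one representative channel, discarding redundant channels.
        g = x.view(b, self.groups, c // self.groups, h, w).mean(dim=2)
        s = self.pool(g).flatten(1)  # (b, groups * pool_size**2)
        # Softmax over kernels yields the per-sample mixing weights a
        # dynamic convolution would use to combine its parallel kernels.
        return torch.softmax(self.fc(s), dim=1)  # (b, num_kernels)


if __name__ == "__main__":
    attn = GroupedPoolAttention(in_channels=64, num_kernels=4)
    weights = attn(torch.randn(2, 64, 32, 32))
    print(weights.shape)  # torch.Size([2, 4])
```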
- Subjects
CONVOLUTIONAL neural networks; DATA integrity; MACHINE learning
- Publication
Applied Sciences (2076-3417), 2021, Vol. 11, Issue 7, p. 3111
- ISSN
2076-3417
- Publication type
Article
- DOI
10.3390/app11073111