- Title
Context-Dependent Multimodal Sentiment Analysis Based on a Complex Attention Mechanism.
- Authors
Deng, Lujuan; Liu, Boyi; Li, Zuhe; Ma, Jiangtao; Li, Hanbing
- Abstract
Multimodal sentiment analysis aims to understand people's attitudes and opinions from different data forms. Traditional modality fusion methods for multimodal sentiment analysis concatenate or multiply various modalities without fully utilizing context information and the correlation between modalities. To solve this problem, this article proposes a multimodal sentiment analysis framework based on a recurrent neural network with a complex attention mechanism. First, the raw data is preprocessed and a numerical feature representation is obtained through feature extraction. Next, the numerical features are input into the recurrent neural network, and its outputs are fused across modalities by a complex attention mechanism layer. The objective of the complex attention mechanism is to leverage enhanced non-linearity to capture inter-modal correlations more effectively, thereby improving the performance of multimodal sentiment analysis. Finally, the fused results are fed into a classification layer, which produces the sentiment output. This process effectively captures the semantic information and contextual relationships of the input sequence and fuses the different pieces of modal information. Our model was tested on the CMU-MOSEI dataset, achieving an accuracy of 82.04%.
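The abstract does not specify the exact form of the complex attention layer, so the following is only an illustrative sketch of one way such a fusion step could look: two modalities' RNN feature sequences are combined into a complex-valued representation (one modality as the real part, the other as the imaginary part), attention weights are derived from the magnitudes of Hermitian inner products, and the weighted sum yields the fused features. All names (`complex_attention_fusion`) and design choices here are assumptions, not the authors' implementation.

```python
import numpy as np

def complex_attention_fusion(a, b):
    """Illustrative complex-valued attention fusion of two modalities.

    a, b: real-valued feature sequences of shape (seq_len, dim),
          e.g. per-timestep RNN outputs for two modalities.
    Returns fused features of shape (seq_len, 2 * dim).
    """
    # Pair the modalities into one complex-valued sequence.
    z = a + 1j * b                                   # (seq_len, dim), complex

    # Hermitian inner products give complex affinity scores;
    # their magnitudes serve as non-negative attention logits.
    scores = z @ z.conj().T                          # (seq_len, seq_len)
    weights = np.abs(scores)
    weights = weights / weights.sum(axis=-1, keepdims=True)  # row-normalize

    # Attend over the concatenated real features of both modalities.
    fused = weights @ np.concatenate([a, b], axis=-1)        # (seq_len, 2*dim)
    return fused
```

In this sketch the non-linearity comes from the magnitude operation on complex scores; the actual paper may use a different complex-valued formulation.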
- Subjects
RECURRENT neural networks; SENTIMENT analysis; FEATURE extraction
- Publication
Electronics (2079-9292), 2023, Vol 12, Issue 16, p3516
- ISSN
2079-9292
- Publication type
Article
- DOI
10.3390/electronics12163516