- Title
Cross-Modal Search for Social Networks via Adversarial Learning.
- Authors
Zhou, Nan; Du, Junping; Xue, Zhe; Liu, Chong; Li, Jinxuan
- Abstract
Cross-modal search has become a research hotspot in recent years. In contrast to traditional cross-modal search, social network cross-modal information search is restricted by data quality for arbitrary text and low-resolution visual features. In addition, the semantic sparseness of cross-modal data from social networks results in the text and visual modalities misleading each other. In this paper, we propose a cross-modal search method for social network data that capitalizes on adversarial learning (cross-modal search with adversarial learning: CMSAL). We adopt self-attention-based neural networks to generate modality-oriented representations for further intermodal correlation learning. A search module is implemented based on adversarial learning, through which the discriminator is designed to measure the distribution of generated features from intramodal and intermodal perspectives. Experiments on real-world datasets from Sina Weibo and Wikipedia, which have similar properties to social networks, show that the proposed method outperforms state-of-the-art cross-modal search methods.
- Subjects
WEIBO (Web resource); WIKIPEDIA; SOCIAL networks; LEARNING; INFORMATION networks; DATA quality
- Publication
Computational Intelligence & Neuroscience, 2020, p1
- ISSN
1687-5265
- Publication type
Article
- DOI
10.1155/2020/7834953