- Title
Abstract painting image orientation recognition based on eye movement and multitask learning.
- Authors
Bai, Ruyi; Guo, Xiaoying
- Abstract
Abstract paintings are produced by artists based on their concepts, employing color, texture, and other techniques. Determining the correct orientation of an abstract painting is challenging given its ambiguous nature. Previous studies on image orientation recognition faced three major difficulties. First, they relied heavily on pre-existing convolutional neural network models, such as VGG and AlexNet. Second, they focused largely on a single task—recognizing image orientation. Finally, ground truth data concerning the visual perception regions of images were often obtained through manual annotation. To overcome these issues, we introduce OC–OD: a multitask approach that fuses multiple feature layers for better performance. The orientation classification (OC) subtask is the primary task, whereas the visual perception region detection (OD) subtask is auxiliary. OC and OD share the same feature extraction layer, and OD serves to improve the performance of OC. Moreover, the ground truth data used in OD are obtained from gaze fixation density maps collected by an eye tracker while subjects viewed the images, rather than through manual annotation. Two datasets were chosen to compare the training impact of various model parameters and architectures. The experimental results were extensively compared, showing that our proposed approach significantly enhances orientation recognition accuracy and outperforms other state-of-the-art methods.
- Subjects
ABSTRACT painting; GAZE; IMAGE recognition (Computer vision); EYE movements; CONVOLUTIONAL neural networks; ARTIFICIAL neural networks; VISUAL perception
- Publication
Journal of Electronic Imaging, 2024, Vol. 33, Issue 2, p. 023018
- ISSN
1017-9909
- Publication type
Article
- DOI
10.1117/1.JEI.33.2.023018