- Title
A Joint Optimization Framework of the Embedding Model and Classifier for Meta-Learning.
- Authors
Liu, Zhongyu; Chu, Xu; Lu, Yan; Yu, Wanli; Miao, Shuguang; Ding, Enjie
- Abstract
Meta-learning aims to train a machine to learn new tasks quickly and accurately. Improving the performance of meta-learning models is important both for solving small-sample problems and for progress toward general artificial intelligence. A previously proposed meta-learning method based on feature embedding performs well on few-shot problems: a pretrained deep convolutional neural network serves as the embedding model for sample features, and the output of one of its layers is used as the feature representation of each sample. The main limitation of that method is that it can neither fuse the low-level texture features and high-level semantic features of the embedding model nor jointly optimize the embedding model and the classifier. Therefore, the current study proposes a multilayer adaptive joint training and optimization method for the embedding model. Its main characteristics are the use of a multilayer adaptive hierarchical loss to train the embedding model and the use of a quantum genetic algorithm to jointly optimize the embedding model and the classifier. The method was validated on multiple public datasets used for meta-learning model testing and shows higher accuracy than multiple baseline methods.
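As context for the embedding-based approach the abstract builds on, the following is a minimal sketch (not the authors' method) of few-shot classification with a frozen embedding model: support-set embeddings are averaged into class prototypes, and each query is assigned to the nearest prototype. The `embed` function here is a stand-in random projection; in the setting the abstract describes, it would be the output of one layer of a pretrained deep CNN.

```python
import numpy as np

rng = np.random.default_rng(0)

def embed(x, W):
    """Stand-in embedding: a fixed linear projection plus ReLU.
    In embedding-based meta-learning this would instead be the
    output of one layer of a pretrained deep CNN."""
    return np.maximum(x @ W, 0.0)

def prototypes(support_feats, support_labels, n_classes):
    """Average the support-set embeddings of each class into a prototype."""
    return np.stack([support_feats[support_labels == c].mean(axis=0)
                     for c in range(n_classes)])

def classify(query_feats, protos):
    """Assign each query to the class with the nearest prototype."""
    dists = ((query_feats[:, None, :] - protos[None, :, :]) ** 2).sum(-1)
    return dists.argmin(axis=1)

# Toy 2-way 5-shot episode: two well-separated Gaussian clusters.
W = rng.normal(size=(16, 32))
support_x = np.concatenate([rng.normal(0.0, 1.0, (5, 16)),
                            rng.normal(3.0, 1.0, (5, 16))])
support_y = np.array([0] * 5 + [1] * 5)
query_x = np.concatenate([rng.normal(0.0, 1.0, (20, 16)),
                          rng.normal(3.0, 1.0, (20, 16))])
query_y = np.array([0] * 20 + [1] * 20)

protos = prototypes(embed(support_x, W), support_y, n_classes=2)
pred = classify(embed(query_x, W), protos)
accuracy = (pred == query_y).mean()
print(accuracy)
```

On this toy episode the nearest-prototype rule separates the two classes easily; the limitation the paper targets is that a single frozen layer fixes the feature space, so neither multi-level feature fusion nor joint tuning of embedding and classifier is possible.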
- Subjects
ARTIFICIAL intelligence; CONVOLUTIONAL neural networks; PROBLEM solving; MACHINE learning; GENETIC algorithms; METAHEURISTIC algorithms; VIRTUAL networks
- Publication
Scientific Programming, 2021, p1
- ISSN
1058-9244
- Publication type
Article
- DOI
10.1155/2021/1538914