- Title
Facial Expression Recognition Model Depending on Optimized Support Vector Machine.
- Authors
Alhussan, Amel Ali; Talaat, Fatma M.; El-kenawy, El-Sayed M.; Abdelhamid, Abdelaziz A.; Ibrahim, Abdelhameed; Khafaga, Doaa Sami; Alnaggar, Mona
- Abstract
In computer vision, emotion recognition from facial expression images is an important research problem. Advances in deep learning in recent years have helped attain improved results on this task. According to recent studies, facial photographs representing a particular type of emotion may contain multiple facial expressions, so it is feasible and useful to convert face photos into collections of visual words and carry out global expression recognition. The main contribution of this paper is a facial expression recognition model (FERM) based on an optimized Support Vector Machine (SVM). To test the performance of the proposed model (FERM), the AffectNet dataset is used. AffectNet was built by querying three major search engines with 1250 emotion-related keywords in six different languages, collecting over 1,000,000 facial images from the Internet. FERM is composed of three main phases: (i) the data preparation phase, (ii) the grid-search optimization phase, and (iii) the categorization phase. Linear discriminant analysis (LDA) is used to categorize the data into eight labels (neutral, happy, sad, surprised, fear, disgust, angry, and contempt). The use of LDA markedly enhances the performance of SVM categorization. Grid search is used to find the optimal values of the SVM hyperparameters (C and gamma). The proposed optimized SVM algorithm achieved an accuracy of 99% and an F1 score of 98%.
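The LDA-plus-grid-search-optimized-SVM pipeline the abstract describes can be sketched with scikit-learn. This is a minimal illustration, not the authors' implementation: a small built-in dataset stands in for the AffectNet features, LDA is used here as a dimensionality-reduction step before the SVM, and the hyperparameter grid over C and gamma is an assumed example grid.

```python
# Hedged sketch of an LDA + grid-search-optimized SVM classifier.
# Assumptions: load_digits is a stand-in for extracted facial-expression
# features; the C/gamma grid values are illustrative, not the paper's.
from sklearn.datasets import load_digits
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import Pipeline
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)  # placeholder for image features
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# LDA projects features onto class-discriminative axes, then an
# RBF-kernel SVM performs the final categorization.
pipe = Pipeline([
    ("lda", LinearDiscriminantAnalysis()),
    ("svm", SVC(kernel="rbf")),
])

# Grid search over the SVM hyperparameters C and gamma, as in the paper.
param_grid = {
    "svm__C": [1, 10, 100],
    "svm__gamma": ["scale", 0.01, 0.001],
}
grid = GridSearchCV(pipe, param_grid, cv=3)
grid.fit(X_train, y_train)

acc = grid.score(X_test, y_test)
print("best params:", grid.best_params_)
print("test accuracy:", acc)
```

On a real facial-expression dataset the same structure applies; only the feature-extraction step and the grid values would change.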
- Subjects
FACIAL expression; SUPPORT vector machines; EMOTION recognition; FISHER discriminant analysis; DEEP learning; COMPUTER vision
- Publication
Computers, Materials & Continua, 2023, Vol 76, Issue 1, p499
- ISSN
1546-2218
- Publication type
Article
- DOI
10.32604/cmc.2023.039368