Research Article

Enhanced Emotion Recognition through Hybrid Deep Learning and SVM Integration

Year 2025, Volume: 14 Issue: 1, 348 - 360, 26.03.2025
https://doi.org/10.17798/bitlisfen.1588046

Abstract

Facial expression recognition systems play an important role in many fields such as medicine, education, and security, where they help make processes more effective and faster. In healthcare, facial expression recognition can be used to monitor emotional and psychological states, while in the security sector it supports critical applications such as lie detection. In education, students' facial expressions can be analyzed in real time to support the learning process. Because emotion recognition from facial expressions is relevant to so many fields, obtaining accurate and reliable results is of great importance. Therefore, to increase the performance of emotion recognition from facial expressions, this study proposes a hybrid approach that combines deep learning with classical machine learning: the ResNet50 model is used as a feature extractor and a Support Vector Machine (SVM) is used as the classifier. Using the CK+48 dataset, six basic emotions are classified: happiness, sadness, anger, fear, surprise, and disgust. Experimental results show that the proposed hybrid approach achieves high accuracy in emotion recognition and outperforms traditional machine-learning algorithms.
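As a rough illustration of the pipeline described above, the following minimal sketch (not the authors' code) extracts deep features from images with an ImageNet-pretrained ResNet50 and trains an SVM on them. The dataset path, the 224x224 resizing of the 48x48 CK+48 images, and the SVM kernel and C value are all assumptions made for the example, not settings reported in the paper.

```python
# Minimal sketch: ResNet50 as feature extractor + SVM as classifier.
# Assumes a folder-per-emotion layout of the CK+48 images; path and hyperparameters are illustrative.
import numpy as np
import tensorflow as tf
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Load images grouped in one folder per emotion class (hypothetical "CK+48" directory).
data = tf.keras.utils.image_dataset_from_directory(
    "CK+48", image_size=(224, 224), batch_size=32, shuffle=False)

# Pretrained ResNet50 used only as a fixed feature extractor (global average pooling output).
backbone = tf.keras.applications.ResNet50(
    include_top=False, weights="imagenet", pooling="avg", input_shape=(224, 224, 3))

features, labels = [], []
for images, y in data:
    x = tf.keras.applications.resnet50.preprocess_input(images)
    features.append(backbone.predict(x, verbose=0))
    labels.append(y.numpy())
X, y = np.concatenate(features), np.concatenate(labels)

# SVM on the deep features; linear kernel and C=1.0 are example choices only.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0)
clf = SVC(kernel="linear", C=1.0).fit(X_train, y_train)
print("Accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```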

Ethical Statement

This study complies with research and publication ethics.

Thanks

This study was developed from Muhammed Kerem TÜRKEŞ's master's thesis.

References

  • R. E. Riggio and J. Lee, “Emotional and interpersonal competencies and leader development,” Hum. Resour. Manag. Rev., vol. 17, no. 4, pp. 418–426, 2007, doi: 10.1016/j.hrmr.2007.08.008.
  • A. Petrovici and T. Dobrescu, “The Role of Emotional Intelligence in Building Interpersonal Communication Skills,” Procedia - Soc. Behav. Sci., vol. 116, pp. 1405–1410, 2014, doi: 10.1016/j.sbspro.2014.01.406.
  • A. G. Sanfey et al., “The neural basis of economic decision-making in the Ultimatum Game,” Science, vol. 300, no. 5626, pp. 1755–1758, 2003.
  • A. D. Angie, S. Connelly, E. P. Waples, and V. Kligyte, “The influence of discrete emotions on judgement and decision-making: A meta-analytic review,” Cogn. Emot., vol. 25, no. 8, pp. 1393–1422, 2011, doi: 10.1080/02699931.2010.550751.
  • A. Ortony, “Are All ‘Basic Emotions’ Emotions? A Problem for the (Basic) Emotions Construct,” Perspect. Psychol. Sci., vol. 17, no. 1, pp. 41–61, 2022, doi: 10.1177/1745691620985415.
  • F. Akar and İ. Akgül, “Derin Öğrenme Modeli ile Yüz İfadelerinden Duygu Tanıma,” Iğdır Üniversitesi Fen Bilim. Enstitüsü Derg., vol. 12, no. 1, pp. 69–79, 2022, doi: 10.21597/jist.976577.
  • D. Özdemir and S. Karaman, “Investigating interactions between students with mild mental retardation and humanoid robot in terms of feedback types,” Egit. ve Bilim, vol. 42, no. 191, pp. 109–138, 2017, doi: 10.15390/EB.2017.6948.
  • D. Mehta, M. F. H. Siddiqui, and A. Y. Javaid, “Recognition of emotion intensities using machine learning algorithms: A comparative study,” Sensors (Switzerland), vol. 19, no. 8, pp. 1–24, 2019, doi: 10.3390/s19081897.
  • C. Turan and K. M. Lam, “Histogram-based local descriptors for facial expression recognition (FER): A comprehensive study,” J. Vis. Commun. Image Represent., vol. 55, no. January, pp. 331–341, 2018, doi: 10.1016/j.jvcir.2018.05.024.
  • Y. Aydin, “A Comparative Analysis of Skin Cancer Detection Applications Using Histogram-Based Local Descriptors,” Diagnostics, vol. 13, no. 19, 2023, doi: 10.3390/diagnostics13193142.
  • B. Li and D. Lima, “Facial expression recognition via ResNet-50,” Int. J. Cogn. Comput. Eng., vol. 2, no. February, pp. 57–64, 2021, doi: 10.1016/j.ijcce.2021.02.002.
  • S. Bayrakdar, D. Akgün, and İ. Yücedağ, “An accelerated approach for facial expression analysis on video files,” Pamukkale Univ. J. Eng. Sci., vol. 23, no. 5, pp. 602–613, 2017, doi: 10.5505/pajes.2016.00908.
  • M. Mukhopadhyay, A. Dey, and S. Kahali, “A deep-learning-based facial expression recognition method using textural features,” Neural Comput. Appl., vol. 35, no. 9, pp. 6499–6514, 2023, doi: 10.1007/s00521-022-08005-7.
  • H. Sadeghi and A. A. Raie, “HistNet: Histogram-based convolutional neural network with Chi-squared deep metric learning for facial expression recognition,” Inf. Sci. (Ny)., vol. 608, pp. 472–488, 2022, doi: 10.1016/j.ins.2022.06.092.
  • M. Karnati, A. Seal, J. Jaworek-Korjakowska, and O. Krejcar, “Facial Expression Recognition In-the-Wild Using,” vol. 72, 2023.
  • H. B. U. Haq, W. Akram, M. N. Irshad, A. Kosar, and M. Abid, “Enhanced Real-Time Facial Expression Recognition Using Deep Learning,” Acadlore Trans. AI Mach. Learn., vol. 3, no. 1, pp. 24–35, 2024, doi: 10.56578/ataiml030103.
  • P. Lucey, J. F. Cohn, T. Kanade, J. Saragih, Z. Ambadar, and I. Matthews, “The extended Cohn-Kanade dataset (CK+): A complete dataset for action unit and emotion-specified expression,” 2010 IEEE Comput. Soc. Conf. Comput. Vis. Pattern Recognit. - Work. CVPRW 2010, no. July, pp. 94–101, 2010, doi: 10.1109/CVPRW.2010.5543262.
  • S. Li and W. Deng, “A Deeper Look at Facial Expression Dataset Bias,” IEEE Trans. Affect. Comput., vol. 13, no. 2, pp. 881–893, 2020, doi: 10.1109/TAFFC.2020.2973158.
  • K. He, X. Zhang, S. Ren, and J. Sun, “Deep residual learning for image recognition,” in Proc. IEEE Conf. Comput. Vis. Pattern Recognit. (CVPR), 2016, pp. 770–778, doi: 10.1109/CVPR.2016.90.
  • M. Hussain, J. J. Bird, and D. R. Faria, “A Study on CNN Transfer Learning for Image Classification,” in Advances in Computational Intelligence Systems, 2019, pp. 191–202.
  • R. Gandhi, “Support Vector Machine Introduction to Machine Learning Algorithms,” Towar. Data Sci., vol. 7, no. 06, 2018, [Online]. Available: https://towardsdatascience.com/support-vector-machine-introduction-to-machine-learning-algorithms-934a444fca47.
  • S. Suthaharan, “Support Vector Machine,” in Machine Learning Models and Algorithms for Big Data Classification: Thinking with Examples for Effective Learning, Springer, 2016, pp. 207–235.
  • S. Dutta and S. K. Bandyopadhyay, “Early Breast Cancer Prediction using Artificial Intelligence Methods,” J. Eng. Res. Reports, vol. 13, no. 2, pp. 48–54, 2020, doi: 10.9734/jerr/2020/v13i217105.
  • M. P. Deisenroth, A. A. Faisal, and C. S. Ong, Mathematics for Machine Learning. Cambridge University Press, 2020.
  • Y. Aydın, “Automated identification of copy‐move forgery using Hessian and patch feature,” J. Forensic Sci., vol. 69, no. 1, pp. 131–138, 2024.
  • A. Liu and H. Yue, “Facial Expression Recognition Based on CNN-LSTM,” Proc. 2023 7th Int. Conf. Electron. Inf. Technol. Comput. Eng., 2023, doi: 10.1145/3650400.3650480.
  • E. Owusu and I. Wiafe, “An advance ensemble classification for object recognition,” Neural Comput. Appl., vol. 33, no. 18, pp. 11661–11672, 2021, doi: 10.1007/s00521-021-05881-3.
  • N. Syalomta, N. Fasya Rahim, K. Usman, and N. K. Caecar Pratiwi, “Face Expressions Recognition Based on Image Processing using Convolutional Neural Network for Human Computer Interface,” SHS Web Conf., vol. 139, p. 03017, 2022, doi: 10.1051/shsconf/202213903017.
  • P. Deepan, R. Vidya, M. A. Reddy, N. Arul, J. Ravichandran, and S. Dhiravidaselvi, “A Hybrid Gabor Filter-Convolutional Neural Networks Model for Facial Emotion Recognition System,” Indian J. Sci. Technol., vol. 17, no. 35, pp. 3696–3703, 2024, doi: 10.17485/ijst/v17i35.1998.


Details

Primary Language English
Subjects Artificial Intelligence (Other)
Journal Section Research Article
Authors

Muhammed Kerem Türkeş 0000-0003-2403-2907

Yıldız Aydın 0000-0002-3877-6782

Publication Date March 26, 2025
Submission Date November 19, 2024
Acceptance Date February 20, 2025
Published in Issue Year 2025 Volume: 14 Issue: 1

Cite

IEEE M. K. Türkeş and Y. Aydın, “Enhanced Emotion Recognition through Hybrid Deep Learning and SVM Integration”, Bitlis Eren Üniversitesi Fen Bilimleri Dergisi, vol. 14, no. 1, pp. 348–360, 2025, doi: 10.17798/bitlisfen.1588046.
