Research Article

EMOTION DETECTION VIA BERT-BASED DEEP LEARNING APPROACHES IN NATURAL LANGUAGE PROCESSING

Year 2024, Volume: 9 Issue: 2, 103 - 114, 31.12.2024

Abstract

This study focuses on emotion detection using BERT-based deep learning approaches in the field of natural language processing (NLP). Unlike traditional methods, the BERT model exhibits superior performance in emotion analysis thanks to its ability to produce bidirectional contextual representations. The study used a dataset of social media posts written in the Sundanese language and classified four main emotional states (anger, enthusiasm, anxiety, and melancholy). In the data preprocessing stage, the particular characteristics of the language and the informal nature of social media text were taken into account. The performance of the BERT model was evaluated using metrics such as accuracy, precision, recall (sensitivity), and F1 score, and compared with other methods. Experimental results show that BERT-based models provide high accuracy and reliability in emotion detection tasks. In addition, the contextual understanding capability of the BERT model provided a significant advantage in overcoming previously encountered classification challenges. The findings show that BERT-based emotion detection models can be used effectively in applications such as social media analysis, customer feedback evaluation, and brand reputation management. This study contributes to the development of more effective and reliable methods for emotion and sentiment analysis in NLP.
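The abstract describes a standard fine-tuning pipeline: a pre-trained BERT encoder is adapted to a four-class emotion classification task and then scored with accuracy, precision, recall (sensitivity), and F1. The sketch below shows how such a pipeline could be assembled with the Hugging Face Transformers, Datasets, and scikit-learn libraries. It is not the authors' code: the checkpoint name (bert-base-multilingual-cased), the CSV file layout, the English label names, and the hyperparameters are illustrative assumptions rather than details reported in the paper.

# Minimal sketch (not the authors' code): fine-tune a BERT checkpoint for
# four-class emotion classification and report the metrics named in the abstract.
# Checkpoint, file names, labels, and hyperparameters are assumptions.
import pandas as pd
from datasets import Dataset
from sklearn.metrics import accuracy_score, precision_recall_fscore_support
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

LABELS = ["anger", "enthusiasm", "anxiety", "melancholy"]   # four emotion classes
CHECKPOINT = "bert-base-multilingual-cased"                 # assumed checkpoint

tokenizer = AutoTokenizer.from_pretrained(CHECKPOINT)
model = AutoModelForSequenceClassification.from_pretrained(CHECKPOINT, num_labels=len(LABELS))

def load_split(csv_path):
    # Hypothetical CSV with "text" (preprocessed post) and "label" (emotion name) columns.
    df = pd.read_csv(csv_path)
    df["label"] = df["label"].map({name: i for i, name in enumerate(LABELS)})
    return Dataset.from_pandas(df)

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128)

def compute_metrics(eval_pred):
    # Overall accuracy plus macro-averaged precision, recall (sensitivity), and F1.
    logits, labels = eval_pred
    preds = logits.argmax(axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="macro", zero_division=0)
    return {"accuracy": accuracy_score(labels, preds),
            "precision": precision, "recall": recall, "f1": f1}

train_ds = load_split("train.csv").map(tokenize, batched=True)
test_ds = load_split("test.csv").map(tokenize, batched=True)

args = TrainingArguments(output_dir="bert-emotion", num_train_epochs=3,
                         per_device_train_batch_size=16, learning_rate=2e-5)

trainer = Trainer(model=model, args=args, train_dataset=train_ds,
                  eval_dataset=test_ds, compute_metrics=compute_metrics)
trainer.train()
print(trainer.evaluate())   # accuracy, precision, recall, F1 on the held-out split

Macro averaging is used here so that each of the four classes contributes equally to the reported precision, recall, and F1; the paper does not state which averaging scheme was applied, so this choice is also an assumption.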

References

  • [1] Young T, Hazarika D, Poria S, Cambria E. Recent trends in deep learning based natural language processing. IEEE Computational Intelligence Magazine 2018; Accessed: Jul. 18, 2024. [Online]. Available: https://ieeexplore.ieee.org/abstract/document/8416973
  • [2] Liu B, Zhao J, Liu K, Xu L. Sentiment analysis: mining opinions, sentiments, and emotions. MIT Press, 2016.
  • [3] Zhang L, Wang S, Liu B. Deep learning for sentiment analysis: A survey. Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery 2018; 8(4). doi: 10.1002/widm.1253
  • [4] Taboada M, Brooke J, Tofiloski M, Voll K, Stede M. Lexicon-based methods for sentiment analysis. Computational Linguistics 2011; 37(2): 267-307. Accessed: Jul. 18, 2024. [Online]. Available: https://direct.mit.edu/coli/article-abstract/37/2/267/2105
  • [5] Zhang Y, Jin R, Zhou Z. Understanding bag-of-words model: a statistical framework. International Journal of Machine Learning and Cybernetics 2010; 1(1-4): 43-52. doi: 10.1007/s13042-010-0001-0
  • [6] Kim H, Jeong Y. Sentiment classification using convolutional neural networks. Applied Sciences 2019; 9(11): 2347. doi: 10.3390/app9112347
  • [7] Tai KS, Socher R, Manning CD. Improved semantic representations from tree-structured long short-term memory networks. Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing of the Asian Federation of Natural Language Processing, 2015; 1: 1556-1566. doi: 10.3115/v1/P15-1150
  • [8] Yang Z, Yang D, Dyer C, He X, Smola A, Hovy E. Hierarchical attention networks for document classification. Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2016; 1480-1489. Accessed: Jul. 18, 2024. Available: https://aclanthology.org/N16-1174.pdf
  • [9] Li M, Chen L, Zhao J, Li Q. Sentiment analysis of Chinese stock reviews based on BERT model. Applied Intelligence 2021; 51(7): 5016-5024. doi: 10.1007/s10489-020-02101-8
  • [10] Sun C, Qiu X, Xu Y, Huang X. How to Fine-Tune BERT for Text Classification? Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) 2019; 11856 LNAI: 194-206. doi: 10.1007/978-3-030-32381-3_16
  • [11] Li S, Zhao Z, Hu R, Li W, Liu T, Du X. Analogical reasoning on Chinese morphological and semantic relations. Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics, 2018; 2: 138-143. doi: 10.18653/v1/P18-2023
  • [12] Putra OV, Wasmanson FM, Harmini T, Utama SN. Sundanese Twitter Dataset for Emotion Classification. Proceedings of CENIM 2020 - International Conference on Computer Engineering, Network, and Intelligent Multimedia, Nov. 2020; 391-395. doi: 10.1109/CENIM51130.2020.9297929
  • [13] LeCun Y, Bengio Y, Hinton G. Deep learning. Nature 2015; 521: 436-444. doi: 10.1038/nature14539
  • [14] Goldberg Y. A primer on neural network models for natural language processing. Journal of Artificial Intelligence Research 2016; 57: 345-420. Accessed: Jul. 18, 2024. [Online]. Available: http://www.jair.org/index.php/jair/article/view/11030
  • [15] Devlin J, Chang MW, Lee K, Toutanova K. BERT: Pre-training of deep bidirectional transformers for language understanding. arXiv preprint arXiv:1810.04805, 2018. Accessed: May 16, 2024. [Online]. Available: https://arxiv.org/abs/1810.04805
  • [16] Rogers A, Kovaleva O, Rumshisky A. A primer in BERTology: What we know about how BERT works. Transactions of the Association for Computational Linguistics 2020; 8: 842-866. doi: 10.1162/tacl_a_00349
  • [17] Hung C, Tsai CF, Huang H. Extracting word-of-mouth sentiments via SentiWordNet for document quality classification. Recent Patents on Computer Science 2012; 5: 145-152. Accessed: Jul. 29, 2024. [Online]. Available: https://www.ingentaconnect.com/content/ben/cseng/2012/00000005/00000002/art00008
  • [18] Pal S, Ghosh S, Nag A. Sentiment analysis in the light of LSTM recurrent neural networks. International Journal of Synthetic Emotions (IJSE) 2018. Accessed: Jul. 29, 2024. [Online]. Available: https://www.igi-global.com/article/sentiment-analysis-in-the-light-of-lstm-recurrent-neural-networks/209424
  • [19] Lin J, Kolcz A. Large-scale machine learning at Twitter. Proceedings of the ACM SIGMOD International Conference on Management of Data 2012; 793-804. doi: 10.1145/2213836.2213958
  • [20] Ohana B, Tierney B. Sentiment classification of reviews using SentiWordNet. Computer Sciences, 2009. Accessed: Jul. 29, 2024. [Online]. Available: https://arrow.tudublin.ie/scschcomcon/293/
  • [21] Maas AL, Daly RE, Pham PT, Huang D, Ng AY, Potts C. Learning word vectors for sentiment analysis. Proceedings of the 49th Annual Meeting of the Association for Computational Linguistics, 2011; 142-150. Accessed: Jul. 29, 2024. [Online]. Available: https://aclanthology.org/P11-1015.pdf
  • [22] Park E, Kang J, Choi D, Han J. Understanding customers’ hotel revisiting behaviour: a sentiment analysis of online feedback reviews. Current Issues in Tourism 2020; 23(5): 605-611. doi: 10.1080/13683500.2018.1549025
  • [23] Pedregosa F, Varoquaux G, Gramfort A, Michel V, Thirion B, Grisel O, Blondel M, Prettenhofer P, Weiss R, Dubourg V, Vanderplas J, Passos A, Cournapeau D, Brucher M, Perrot M, Duchesnay E. Scikit-learn: Machine learning in Python. Journal of Machine Learning Research 2011; 12: 2825-2830. Accessed: Jul. 29, 2024. [Online]. Available: https://www.jmlr.org/papers/volume12/pedregosa11a/pedregosa11a.pdf
  • [24] Lin J, Kolcz A. Large-scale machine learning at Twitter. Proceedings of the ACM SIGMOD International Conference on Management of Data 2012; 793-804. doi: 10.1145/2213836.2213958
  • [25] Perlich C. Learning Curves in Machine Learning. 2010. Accessed: Jul. 29, 2024. Available: https://dominoweb.draco.res.ibm.com/reports/rc24756.pdf

Details

Primary Language English
Subjects Information Systems Development Methodologies and Practice
Journal Section Research Article
Authors

Zülfikar Aslan (ORCID: 0000-0002-2706-5715)

Publication Date December 31, 2024
Submission Date July 29, 2024
Acceptance Date December 25, 2024
Published in Issue Year 2024 Volume: 9 Issue: 2

Cite

APA Aslan, Z. (2024). EMOTION DETECTION VIA BERT-BASED DEEP LEARNING APPROACHES IN NATURAL LANGUAGE PROCESSING. The International Journal of Energy and Engineering Sciences, 9(2), 103-114.

IMPORTANT NOTES

No part of the material protected by this copyright may be reproduced or utilized in any form or by any means without the prior written permission of the copyright owners, unless the use constitutes fair dealing for the purpose of private study, research, or review. The authors reserve the right for their material to be used purely for educational and research purposes. All authors are responsible for originality, plagiarism, multiple publication, disclosure of conflicts of interest, and fundamental errors in the published works.

*Please note that all authors are responsible for originality, plagiarism, multiple publication, disclosure of conflicts of interest, and fundamental errors in the published works. Author(s) submitting a manuscript for publication in IJEES also accept that the manuscript may be screened for plagiarism using iThenticate software. For experimental work involving animals, approval from the relevant ethics committee should have been obtained beforehand, assuring that the experiments were conducted according to relevant national or international guidelines on the care and use of laboratory animals. Authors may be requested to provide evidence to this end.

**Authors are strongly advised to follow the IJEES policies regarding copyright/licensing and ethics before submitting their manuscripts.

