Can Lexicon-Based Sentiment Analysis Boost Performances of Transformer-Based Models?

Lindung Parningotan Manik, Harry Susianto, Arawinda Dinakaramani, Niken Pramanik, Totok Suhardijanto

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Sentiment analysis, an essential task in natural language processing, entails determining the sentiment expressed in a text. Transformer-based models such as BERT have attained state-of-the-art performance on sentiment analysis tasks. However, these models may struggle to distinguish sentiment-laden words. In response, we propose combining lexicon-based sentiment analysis with transformer-based models. This study investigates the effect of lexicon-based sentiment analysis, particularly SentiStrength, on BERT's efficacy in sentiment analysis tasks. Experimental evaluations reveal that incorporating sentiment lexicons improves accuracy and F1-score on classic sentiment analysis tasks compared to the baseline BERT model. Our findings demonstrate the value of incorporating external knowledge sources into transformer-based sentiment analysis models.
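The paper's metadata page does not include code, but a minimal sketch of one way lexicon scores could be fused with a BERT classifier is shown below, assuming a simple feature-concatenation strategy. The class name, the fusion approach, and the placeholder scores are illustrative assumptions, not the authors' method; in practice the scores would come from SentiStrength rather than hard-coded values.

```python
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class LexiconAugmentedBERT(nn.Module):
    """BERT classifier whose [CLS] embedding is concatenated with
    lexicon-derived sentiment scores (e.g., SentiStrength-style
    positive/negative strengths) before the classification head.
    This fusion strategy is an assumption for illustration."""

    def __init__(self, model_name="bert-base-uncased", num_labels=2, lexicon_dim=2):
        super().__init__()
        self.bert = AutoModel.from_pretrained(model_name)
        hidden = self.bert.config.hidden_size
        # Classification head sees BERT features plus lexicon features.
        self.classifier = nn.Linear(hidden + lexicon_dim, num_labels)

    def forward(self, input_ids, attention_mask, lexicon_scores):
        # lexicon_scores: (batch, lexicon_dim), e.g. [pos_strength, neg_strength]
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        cls = out.last_hidden_state[:, 0]  # [CLS] token embedding
        fused = torch.cat([cls, lexicon_scores], dim=-1)
        return self.classifier(fused)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = LexiconAugmentedBERT()
batch = tokenizer(["I love this!", "Terrible service."],
                  padding=True, return_tensors="pt")
# Placeholder scores; a real pipeline would obtain these from SentiStrength.
scores = torch.tensor([[4.0, -1.0], [1.0, -4.0]])
logits = model(batch["input_ids"], batch["attention_mask"], scores)
```

Concatenating lexicon features at the classification layer is only one plausible design; other fusion points (input-level token markup, attention biasing) would also fit the paper's stated goal of injecting external lexical knowledge into a transformer.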

Original language: English
Title of host publication: Proceedings of the 7th 2023 International Conference on New Media Studies, CONMEDIA 2023
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 314-319
Number of pages: 6
ISBN (Electronic): 9798350307504
DOIs
Publication status: Published - 2023
Event: 7th International Conference on New Media Studies, CONMEDIA 2023 - Bali, Indonesia
Duration: 6 Dec 2023 - 8 Dec 2023

Publication series

Name: Proceedings of the 7th 2023 International Conference on New Media Studies, CONMEDIA 2023

Conference

Conference: 7th International Conference on New Media Studies, CONMEDIA 2023
Country/Territory: Indonesia
City: Bali
Period: 6/12/23 - 8/12/23

Keywords

  • BERT
  • sentiment analysis
  • sentiment lexicons
  • SentiStrength
  • transformer-based models
