City Research Online

LT-SAA: A Lightweight Transformer with Sentiment-Aware Attention for Fine-Tuned BERT-Based Sentiment Classification

Nabi, M. R., Miah, M. B. A., Saedi, M. (ORCID: 0000-0001-6436-1057), Nobe, S., Hosen, A. S. M. S. & Aminuddin, A. (2025). LT-SAA: A Lightweight Transformer with Sentiment-Aware Attention for Fine-Tuned BERT-Based Sentiment Classification. Paper presented at the International Conference on Emerging Trends in Cybersecurity (ICETCS 2025), 27-28 Oct 2025, Wolverhampton, UK.

Abstract

Sentiment analysis of user-generated content on digital platforms such as Twitter is essential for understanding public opinion and emotional polarity across diverse situations. However, tweets are typically short and informal, which makes multi-class sentiment classification (positive, neutral, negative) challenging. Existing techniques often rely on generic attention mechanisms or implicit sentiment handling, struggle with complex expressions, and fail to perform equally well across all sentiment classes. This paper proposes LT-SAA (Lightweight Transformer with Sentiment-Aware Attention), a novel hybrid model that integrates fine-tuned BERT embeddings with a lightweight transformer block and a sentiment-aware attention layer guided by VADER sentiment scores. The model selectively fine-tunes the final six transformer layers of BERT to improve computational efficiency while preserving contextual understanding. The sentiment-aware attention mechanism, which dynamically weights features using normalized VADER scores, improves the model's capacity to concentrate on emotionally significant tokens. In experiments on the SemEval-2017 Task 4 Sub-task A (English) dataset, LT-SAA achieves an F1-score of 0.61, outperforming baseline models including fine-tuned BERT (F1: 0.593), Multi-view Ensemble (F1: 0.523), and Interpolated DNN (F1: 0.587). By using oversampling and class-weighting strategies, the proposed method demonstrates improved recall and better handling of class imbalance. LT-SAA offers a realistic and scalable framework for real-time social media monitoring by integrating symbolic sentiment cues with efficient transformer attention. Future work could apply it to cross-domain datasets such as Amazon reviews and add implicit sentiment detection for broader applicability.
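The abstract describes a sentiment-aware attention layer that dynamically weights token features using normalized VADER scores. The paper itself is not freely accessible, so the exact formulation is unknown; the following is only a minimal sketch of one plausible reading, in which absolute per-token VADER magnitudes are normalized and used to bias a softmax pooling over token embeddings (the function name and the embedding/pooling shapes are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

def sentiment_aware_attention(token_embeddings, vader_scores):
    """Hypothetical sketch: pool token embeddings with weights biased
    by normalized VADER sentiment magnitudes, so emotionally charged
    tokens contribute more to the sentence representation."""
    # Absolute scores: both strongly positive and strongly negative
    # tokens are treated as emotionally significant.
    mags = np.abs(np.asarray(vader_scores, dtype=float))
    # Normalize to [0, 1]; epsilon guards against all-neutral input.
    norm = mags / (mags.max() + 1e-9)
    # Softmax over the sentiment-biased relevance scores.
    weights = np.exp(norm - norm.max())
    weights /= weights.sum()
    # Attention-weighted sum pools the sequence into one vector.
    pooled = weights @ token_embeddings
    return weights, pooled
```

For a three-token sequence where the second token carries the strongest VADER score, e.g. `sentiment_aware_attention(np.ones((3, 4)), [0.0, 0.8, -0.2])`, that token receives the largest attention weight and the weights sum to one. In the full model these weights would presumably modulate features coming out of the BERT encoder rather than raw embeddings.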

Publication Type: Conference or Workshop Item (Paper)
Additional Information: This version of the article has been accepted for publication, after peer review (when applicable) and is subject to Springer Nature’s AM terms of use, but is not the Version of Record and does not reflect post-acceptance improvements, or any corrections. The Version of Record will be available online at: https://link.springer.com/series/7818
Publisher Keywords: Sentiment Analysis, Twitter, BERT, Attention Mechanisms, VADER
Subjects: Q Science > QA Mathematics > QA75 Electronic computers. Computer science
Departments: School of Science & Technology
School of Science & Technology > Department of Computer Science
Text - Accepted Version
This document is not freely accessible due to copyright restrictions.


