An Optimized BERT for Multimodal Sentiment Analysis

ACM Transactions on Multimedia Computing, Communications, and Applications (2022)

Abstract
Sentiment analysis of a single modality (e.g., text or image) has been broadly studied. However, much less attention has been paid to sentiment analysis of multi-modal data. As research on and applications of multi-modal data analysis become increasingly widespread, it is necessary to optimize BERT's internal structure. This paper proposes a Hierarchical Multi-head Self Attention and Gate Channel BERT, an optimized BERT model. The model is composed of three modules: the Hierarchical Multi-head Self Attention module realizes hierarchical feature extraction; the Gate Channel module replaces BERT's original feed-forward layer to realize information filtering; and a tensor fusion model based on the self-attention mechanism carries out the fusion of the different modal features. Experiments show that our method achieves promising results and improves accuracy by 5-6% compared with traditional models on the CMU-MOSI dataset.
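The abstract gives only a high-level description of the three modules. The sketch below is a minimal, hypothetical rendering of that description in PyTorch; the class names, layer dimensions, gating formulation, and fusion-by-concatenation choice are all assumptions for illustration, not the authors' published implementation.

```python
# Hypothetical sketch of the three modules described in the abstract.
# All names, dimensions, and the gating formulation are assumptions.
import torch
import torch.nn as nn


class GateChannel(nn.Module):
    """Assumed gating sublayer standing in for BERT's feed-forward layer:
    a learned sigmoid gate filters the representation channel-wise."""
    def __init__(self, d_model: int):
        super().__init__()
        self.gate = nn.Linear(d_model, d_model)
        self.value = nn.Linear(d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.sigmoid(self.gate(x)) * self.value(x)


class HierarchicalSelfAttention(nn.Module):
    """Assumed hierarchical extraction: stacked multi-head self-attention
    blocks, each followed by a Gate Channel sublayer instead of feed-forward."""
    def __init__(self, d_model: int = 768, n_heads: int = 12, n_levels: int = 3):
        super().__init__()
        self.levels = nn.ModuleList(
            nn.MultiheadAttention(d_model, n_heads, batch_first=True)
            for _ in range(n_levels)
        )
        self.gates = nn.ModuleList(GateChannel(d_model) for _ in range(n_levels))
        self.norms = nn.ModuleList(nn.LayerNorm(d_model) for _ in range(2 * n_levels))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        for attn, gate, norm_a, norm_g in zip(
            self.levels, self.gates, self.norms[0::2], self.norms[1::2]
        ):
            a, _ = attn(x, x, x)      # multi-head self-attention at this level
            x = norm_a(x + a)         # residual + norm, as in standard BERT
            x = norm_g(x + gate(x))   # Gate Channel replaces the feed-forward sublayer
        return x


class SelfAttentionFusion(nn.Module):
    """Assumed fusion step: concatenate per-modality features along the
    sequence axis, mix them with self-attention, then pool and predict."""
    def __init__(self, d_model: int = 768, n_heads: int = 12):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.head = nn.Linear(d_model, 1)  # sentiment score head (CMU-MOSI style)

    def forward(self, text, audio, vision):
        fused = torch.cat([text, audio, vision], dim=1)
        fused, _ = self.attn(fused, fused, fused)
        return self.head(fused.mean(dim=1))


if __name__ == "__main__":
    enc = HierarchicalSelfAttention()
    fuse = SelfAttentionFusion()
    t, a, v = (torch.randn(2, 20, 768) for _ in range(3))
    print(fuse(enc(t), enc(a), enc(v)).shape)  # torch.Size([2, 1])
```

In this reading, the hierarchical module plays the role of the modified BERT encoder applied per modality, and the fusion module corresponds to the self-attention-based tensor fusion stage; how the paper actually shares or separates encoders across modalities is not specified in the abstract.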
Keywords
bert