An explanation framework and method for AI-based text emotion analysis and visualisation

DECISION SUPPORT SYSTEMS (2024)

Abstract
With the rapid development of artificial intelligence, an increasing number of industries rely on the accuracy and efficiency of deep learning algorithms. However, owing to the inexplicability and black-box nature of deep neural networks, we can only obtain results without knowing the reasoning behind them. This engenders scepticism about, and resistance to, deep learning-based technologies in some quarters. In the context of emotion analysis used in business and public opinion monitoring, decision-makers sometimes find it difficult to trust the outcome without an explanation from the supposedly emotionless machines. Mathematically grounded explanation methods exist, but they often generalise emotion analysis as a classification task. Yet emotion should be treated differently from other task categories, because the generation of emotion involves human-specific factors and logic. This paper proposes an emotion analysis explanation framework grounded in psychological theories, focusing on the stimulus concept from classic emotion theories. The proposed framework emphasises the cause and trigger of emotions as the explanation for deep learning-based emotion analysis, and it includes two main components: the extraction of the emotion cause and the visualisation of emotion-triggering words. Compared with existing approaches, the proposed framework is based on the perspective of human psychology, giving it higher credibility and stronger theoretical support. In addition, we purposefully design and implement an intuitive visualisation for the framework, instead of complex numerical representations, to improve the comprehensibility of the explanation for a broader audience.
Keywords
Explainable AI, Explanation framework, Emotion analysis, Emotion theory
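The second component described in the abstract, visualising emotion-triggering words, is typically realised as word-level highlighting. The sketch below is a minimal illustration only, not the paper's actual implementation: it assumes per-word trigger scores have already been produced by some attribution or cause-extraction step, and renders them as shaded HTML. The function name `render_highlighted_html`, the example sentence, and the scores are all hypothetical.

```python
# Minimal illustrative sketch (hypothetical, not the paper's method):
# shade each word in proportion to an assumed "emotion trigger" score.
import html

def render_highlighted_html(words, scores, out_path="explanation.html"):
    """Write an HTML file in which each word is shaded by its normalised score."""
    max_score = max(scores) or 1.0
    spans = []
    for word, score in zip(words, scores):
        alpha = score / max_score  # normalise to [0, 1]
        spans.append(
            f'<span style="background-color: rgba(255, 99, 71, {alpha:.2f});'
            f' padding: 2px;">{html.escape(word)}</span>'
        )
    body = " ".join(spans)
    with open(out_path, "w", encoding="utf-8") as f:
        f.write(f"<html><body><p>{body}</p></body></html>")

if __name__ == "__main__":
    # Hypothetical word-level scores a cause-extraction model might assign.
    words = ["The", "delayed", "refund", "made", "customers", "furious"]
    scores = [0.02, 0.35, 0.40, 0.05, 0.10, 0.95]
    render_highlighted_html(words, scores)
    print("Wrote explanation.html")
```

Rendering the explanation as highlighted text rather than raw attribution numbers matches the abstract's stated goal of an intuitive visualisation that a broad, non-technical audience can read directly.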