ELMo Layer Embedding Comparison with Short Text Classification

2023 3rd Asian Conference on Innovation in Technology (ASIANCON)(2023)

Abstract
ELMo was the first context-based neural model for generating word embeddings, and it largely addressed the problem of polysemy. Word embeddings generated with ELMo have been used in downstream tasks such as text classification to improve classification performance. The ELMo architecture consists of three layers: a character convolution layer and two LSTM layers. These layers can be concatenated or used separately to generate different embedding representations. In existing work, the performance of ELMo's layer embeddings has not been properly evaluated for short text classification tasks. Moreover, it is hard to identify the best traditional classification algorithms to use with ELMo embeddings. Therefore, this study focuses on identifying the best layer embedding weighting scheme as well as the best traditional machine learning algorithm for short text classification tasks. Seven short-text datasets were selected for the experiment. According to the experiment, concatenation of all three layers was identified as the best layer embedding representation, yielding a performance improvement of around 2-4% over the other layer combinations. Moreover, the support vector machine algorithm showed the best classification performance among the machine learning algorithms tested.
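The layer-combination idea described above can be sketched as follows. This is a minimal illustration, not the paper's actual pipeline: the random vectors stand in for per-document mean-pooled ELMo layer outputs, and the shapes, labels, and split are illustrative assumptions.

```python
# Sketch: concatenating per-layer ELMo-style embeddings and classifying with an SVM.
# The random vectors below are placeholders standing in for real ELMo layer outputs.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_docs, dim = 200, 256  # per-layer embedding size (placeholder)

# Placeholder per-document embeddings from ELMo's three layers:
# one character-convolution layer and two biLSTM layers.
layer_embs = [rng.normal(size=(n_docs, dim)) for _ in range(3)]
y = rng.integers(0, 2, size=n_docs)  # binary labels (placeholder)

# Concatenation of all three layers -> a 3*dim feature vector per document.
X = np.concatenate(layer_embs, axis=1)
assert X.shape == (n_docs, 3 * dim)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = SVC(kernel="linear").fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```

Using a single layer instead of the concatenation would simply mean passing one element of `layer_embs` as `X`, which is how the per-layer comparison in the study could be reproduced.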
Keywords
Embeddings, ELMo, short text, classification, SVM