Efficient Text Classification with Echo State Networks

2021 International Joint Conference on Neural Networks (IJCNN)

Abstract
We consider echo state networks (ESNs) for text classification. More specifically, we investigate the learning capabilities of ESNs with pre-trained word embeddings as input features, trained on the IMDb and TREC datasets for sentiment and question classification, respectively. First, we introduce a customized training paradigm for processing multiple input time series (the input texts) associated with categorical targets (their corresponding classes). For the sentiment task, we use an additional frozen attention mechanism based on an external lexicon, which requires only negligible computational cost. Within this paradigm, ESNs can be trained in tens of seconds on a GPU. We show that ESNs significantly outperform Ridge regression baselines provided with the same embedded features. ESNs also compete with classical Bi-LSTM networks while training up to 23 times faster. These results show that ESNs are robust, efficient, and fast candidates for text classification tasks. Overall, this study falls within the context of lightweight, fast-to-train models for NLP.
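To make the approach concrete, here is a minimal sketch of the general ESN-plus-Ridge pipeline the abstract describes: word embeddings are fed through a fixed random reservoir, each text is summarized by its mean reservoir state, and only a Ridge-regression readout is trained. This is an illustrative NumPy reconstruction under standard ESN conventions, not the paper's exact architecture; the dimensions, leak rate, spectral radius, and mean-pooling choice are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_res, n_classes = 50, 300, 2  # assumed sizes, not from the paper

# Fixed (untrained) reservoir weights, rescaled so the spectral radius is
# below 1, a standard condition for the echo state property.
W_in = rng.uniform(-0.5, 0.5, (d_res, d_in))
W = rng.normal(0.0, 1.0, (d_res, d_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

def reservoir_states(text_embeddings, leak=0.5):
    """Run one text (a sequence of word vectors) through the leaky
    reservoir and return the mean reservoir state as its feature vector."""
    x = np.zeros(d_res)
    states = []
    for u in text_embeddings:
        x = (1 - leak) * x + leak * np.tanh(W_in @ u + W @ x)
        states.append(x)
    return np.mean(states, axis=0)

def train_readout(texts, labels, alpha=1.0):
    """Closed-form Ridge regression from mean reservoir states to
    one-hot class targets; only this readout is learned."""
    X = np.stack([reservoir_states(t) for t in texts])
    Y = np.eye(n_classes)[labels]
    return np.linalg.solve(X.T @ X + alpha * np.eye(d_res), X.T @ Y)

def predict(W_out, text):
    """Classify one text by the argmax of the linear readout."""
    return int(np.argmax(reservoir_states(text) @ W_out))
```

Because the reservoir is frozen, training reduces to one linear solve over the pooled states, which is what makes the method fast relative to backpropagating through a Bi-LSTM.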
Keywords
reservoir computing, echo state networks, natural language processing, text classification