Diving into Text Representation Learning with Deep Hashing

2021 International Conference on Electronic Information Engineering and Computer Science (EIECS), 2021

Abstract
With the development of deep neural networks, and especially the emergence of large-scale pre-trained language models, deep learning methods have substantially advanced the state of the art across a variety of subtasks in natural language processing. Nevertheless, the growing number of model parameters and the exponential growth of training time required for better performance have become problems that cannot be ignored. In this paper, inspired by the application of deep hashing to large-scale image retrieval, we extensively explore text representation learning based on deep hashing, and we evaluate the proposed method on three common natural language processing subtasks. Experimental results show that deep hashing can greatly reduce the physical storage cost of text representations with limited performance loss, which is of great significance for large-scale text representation and is worthy of further exploration.
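The storage savings the abstract claims come from replacing dense float embeddings with compact binary hash codes. A minimal sketch of this idea, assuming sign-based binarization of the embedding (a common deep-hashing output layer; the paper's exact method is not specified in the abstract) and an illustrative 768-dimensional embedding size:

```python
import numpy as np

# Illustrative dense text embeddings (e.g. 768-dim, BERT-sized); values are random here.
rng = np.random.default_rng(0)
embeddings = rng.standard_normal((1000, 768)).astype(np.float32)

# Binarize via sign, then pack 8 bits per byte to get compact hash codes.
bits = (embeddings > 0).astype(np.uint8)
codes = np.packbits(bits, axis=1)  # shape (1000, 96): 768 bits -> 96 bytes

float_bytes = embeddings.nbytes  # 1000 * 768 * 4 bytes
code_bytes = codes.nbytes        # 1000 * 96 bytes
print(f"compression: {float_bytes // code_bytes}x")  # 32x: one bit replaces one float32

# Retrieval over the codes uses Hamming distance instead of cosine similarity.
def hamming(a: np.ndarray, b: np.ndarray) -> int:
    return int(np.unpackbits(a ^ b).sum())

print(hamming(codes[0], codes[1]))
```

The 32x figure is simply the ratio of a 32-bit float to a single bit per dimension; actual systems may use shorter codes than the embedding dimensionality, increasing the savings further at some cost in accuracy.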
Keywords
Deep Hashing, Pre-trained Language Models, Transformer, Textual Representation Learning