Seq-Dnc-Seq: Context Aware Dialog Generation System Through External Memory

2019 International Joint Conference on Neural Networks (IJCNN), 2019

Cited by 3 | Views 12
Abstract
Most conventional Seq2seq-based chit-chat models process only one or two sentences at a time and are trained to answer specific patterns. In a real chit-chat conversation, however, a sentence is interpreted in different ways depending on the preceding context, and it is often difficult for existing Seq2seq-based chit-chat models to produce different responses to the same input. As a result, Seq2seq models struggle to track the previous context in a multi-turn chit-chat conversation. To overcome this problem, a dialogue generation model should have an external memory that stores the contextual information of the conversation. In this paper, we propose a new dialogue generation model, Seq-DNC-seq, which incorporates a differentiable neural computer (DNC) into the conventional Seq2seq model. The proposed Seq-DNC-seq model adds external memory to the conventional Seq2seq structure to generate an appropriate response based on the memory of previous conversations. Experimental results show that the proposed Seq-DNC-seq model successfully generates multi-turn chit-chat and that its output sentences differ depending on the preceding sentences, even for the same input text. This not only helps the agent overcome existing limitations of chit-chat conversation but also allows it to understand the context of longer conversations.
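The abstract describes the architecture only at a high level. The sketch below illustrates the general idea under stated assumptions: a Seq2seq encoder writes a summary of each turn into an external memory that persists across turns, and the decoder reads from that memory so earlier turns can shape the reply. The DNC's full addressing scheme (usage-based allocation, temporal link matrix) is simplified here to purely content-based read/write, and all names and sizes (SimpleExternalMemory, SeqMemSeq, 32 memory slots, etc.) are illustrative assumptions, not the authors' implementation.

```python
# Minimal, hypothetical sketch of the Seq-DNC-seq idea: a Seq2seq model with a
# persistent external memory between turns (simplified DNC stand-in).
import torch
import torch.nn as nn
import torch.nn.functional as F


class SimpleExternalMemory(nn.Module):
    """Content-addressed memory kept across turns."""

    def __init__(self, slots: int, width: int):
        super().__init__()
        self.register_buffer("memory", torch.zeros(slots, width))

    def write(self, key: torch.Tensor, value: torch.Tensor) -> None:
        # Softmax over cosine similarity decides how strongly each slot is updated.
        weights = F.softmax(F.cosine_similarity(self.memory, key, dim=-1), dim=0)
        self.memory = self.memory + weights.unsqueeze(1) * (value - self.memory)

    def read(self, key: torch.Tensor) -> torch.Tensor:
        weights = F.softmax(F.cosine_similarity(self.memory, key, dim=-1), dim=0)
        return weights @ self.memory  # (width,)


class SeqMemSeq(nn.Module):
    """Encoder -> external memory -> decoder, so earlier turns shape the reply."""

    def __init__(self, vocab: int, emb: int = 64, hidden: int = 128, slots: int = 32):
        super().__init__()
        self.embed = nn.Embedding(vocab, emb)
        self.encoder = nn.GRU(emb, hidden, batch_first=True)
        self.decoder = nn.GRU(emb + hidden, hidden, batch_first=True)
        self.memory = SimpleExternalMemory(slots, hidden)
        self.out = nn.Linear(hidden, vocab)

    def forward(self, src: torch.Tensor, tgt: torch.Tensor) -> torch.Tensor:
        # Encode the current utterance and store its summary in memory.
        _, h = self.encoder(self.embed(src))      # h: (1, 1, hidden)
        summary = h.squeeze(0).squeeze(0)         # (hidden,)
        self.memory.write(summary, summary)

        # Read a context vector from memory and feed it to every decoder step.
        context = self.memory.read(summary)       # (hidden,)
        tgt_emb = self.embed(tgt)                 # (1, T, emb)
        ctx = context.expand(tgt_emb.size(0), tgt_emb.size(1), -1)
        dec_out, _ = self.decoder(torch.cat([tgt_emb, ctx], dim=-1), h)
        return self.out(dec_out)                  # (1, T, vocab)


if __name__ == "__main__":
    model = SeqMemSeq(vocab=100)
    turn1 = model(torch.randint(0, 100, (1, 5)), torch.randint(0, 100, (1, 6)))
    turn2 = model(torch.randint(0, 100, (1, 5)), torch.randint(0, 100, (1, 6)))
    print(turn1.shape, turn2.shape)  # memory now carries information from turn 1
```

Because the memory buffer is not reset between forward calls, the second turn is decoded with state accumulated from the first, which is the behaviour the paper attributes to the DNC component: the same input can yield different responses depending on what was said before.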
Keywords
Context, chatbot, memory, DNC, seq2seq