Lexicon-Enhanced Transformer with Pointing for Domain-Specific Generative Question Answering

International Conference on Algorithms and Architectures for Parallel Processing (2020)

Abstract
To address the inaccurate generation caused by a lack of external knowledge in generative question answering systems, we propose a new answer generation model, LEP-Transformer, which integrates a domain lexicon and a copy mechanism. The model enables the Transformer to handle long-distance dependencies across different text granularities and to reproduce factual details when generating answers. Experimental results on two different datasets show that the model alleviates this problem and can model both short and long text sequences.
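The abstract does not give implementation details, but the copy (pointer) component it mentions can be illustrated with a minimal sketch. The PyTorch module below mixes the decoder's vocabulary distribution with a copy distribution built from cross-attention weights over the source tokens; the class name CopyHead, the gating layer, and all tensor shapes are illustrative assumptions, not the authors' LEP-Transformer code.

import torch
import torch.nn as nn
import torch.nn.functional as F

class CopyHead(nn.Module):
    """Mix a generation distribution over the vocabulary with a copy
    distribution over source positions, weighted by a learned gate."""

    def __init__(self, d_model: int, vocab_size: int):
        super().__init__()
        self.generate = nn.Linear(d_model, vocab_size)  # vocabulary logits
        self.copy_gate = nn.Linear(d_model, 1)          # produces p_copy in [0, 1]

    def forward(self, dec_state, cross_attn, src_ids):
        # dec_state:  (batch, tgt_len, d_model)  decoder hidden states
        # cross_attn: (batch, tgt_len, src_len)  attention weights over source tokens
        # src_ids:    (batch, src_len)           source token ids
        p_vocab = F.softmax(self.generate(dec_state), dim=-1)  # (b, t, V)
        p_copy = torch.sigmoid(self.copy_gate(dec_state))      # (b, t, 1)
        # Scatter the attention mass onto the vocabulary ids of the source tokens.
        copy_dist = torch.zeros_like(p_vocab)
        index = src_ids.unsqueeze(1).expand(-1, dec_state.size(1), -1)
        copy_dist.scatter_add_(-1, index, cross_attn)
        # Final distribution: interpolate between generating and copying.
        return (1.0 - p_copy) * p_vocab + p_copy * copy_dist

# Toy usage with random tensors (shapes only, no trained weights).
head = CopyHead(d_model=64, vocab_size=1000)
dec_state = torch.randn(2, 5, 64)
cross_attn = torch.softmax(torch.randn(2, 5, 7), dim=-1)
src_ids = torch.randint(0, 1000, (2, 7))
probs = head(dec_state, cross_attn, src_ids)  # (2, 5, 1000), rows sum to 1

At each decoding step the gate p_copy decides how much probability mass is drawn from the source sequence, which is what allows a copy-equipped generator to reproduce factual details such as entity names or numbers verbatim from the input.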
Keywords
Transformer, Domain lexicon, Copy mechanism, Knowledge fusion structure