Unsupervised And Weakly Supervised Approaches For Answer Selection Tasks With Scarce Annotations

Open Computer Science (2019)

Abstract
Addressing Answer Selection (AS) tasks with complex neural networks typically requires a large amount of annotated data to reach good accuracy. In this work, we are interested in simple models that can give good performance on datasets with few or no annotations. First, we propose new unsupervised baselines that leverage distributed word and sentence representations. Second, we compare the ability of our neural architectures to learn from a small number of annotated examples in a weakly supervised scheme, and we show how these methods can benefit from pre-training on an external dataset. With an emphasis on reproducibility of results, we show that our simple methods reach or approach state-of-the-art performance on four common AS datasets.
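The abstract does not detail the unsupervised baselines, but a typical baseline of this kind ranks candidate answers by the similarity of their sentence representation to the question's. The sketch below is a minimal illustration under that assumption, using pre-trained word vectors (e.g. GloVe) and mean pooling; it is not the authors' exact method.

```python
import numpy as np

def embed(sentence, word_vectors):
    """Average the pre-trained vectors of the tokens in a sentence.

    `word_vectors` is assumed to map tokens to fixed-size numpy arrays
    (e.g. loaded from GloVe); out-of-vocabulary tokens are skipped.
    """
    vecs = [word_vectors[w] for w in sentence.lower().split() if w in word_vectors]
    return np.mean(vecs, axis=0) if vecs else None

def rank_answers(question, candidates, word_vectors):
    """Rank candidate answers by cosine similarity to the question embedding."""
    q = embed(question, word_vectors)
    scored = []
    for cand in candidates:
        c = embed(cand, word_vectors)
        if q is None or c is None:
            scored.append((cand, 0.0))
            continue
        sim = float(np.dot(q, c) / (np.linalg.norm(q) * np.linalg.norm(c)))
        scored.append((cand, sim))
    # Most similar candidate first.
    return sorted(scored, key=lambda x: x[1], reverse=True)
```

In such a setup, the top-ranked candidate is taken as the selected answer, and no task-specific annotation is needed beyond the pre-trained embeddings.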
Keywords
neural networks, natural language processing, question answering, answer selection