Humans Outperform Machines at the Bilingual Shannon Game.

Entropy (2017)

Abstract
We provide an upper bound for the amount of information a human translator adds to an original text, i.e., how many bits of information we need to store a translation, given the original. We do this by creating a Bilingual Shannon Game that elicits character guesses from human subjects, then developing models to estimate the entropy of those guess sequences.
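In the classic Shannon-game analysis (Shannon, 1951), per-character entropy is bounded by the statistics of how many guesses a subject needs before naming each character correctly; the upper bound is the entropy of that guess-number distribution. The sketch below is an illustration of that idea only, not the models developed in the paper, and the guess counts are hypothetical.

```python
from collections import Counter
from math import log2

def entropy_upper_bound(guess_counts):
    """Shannon-style upper bound on per-character entropy (bits/char),
    computed from the number of guesses a subject needed per character.
    Illustrative sketch; not the estimation models from the paper."""
    total = len(guess_counts)
    freq = Counter(guess_counts)
    # q_i: fraction of characters identified on the i-th guess
    q = {i: c / total for i, c in freq.items()}
    # Upper bound: entropy of the guess-number distribution
    return -sum(p * log2(p) for p in q.values())

# Hypothetical data: 1 means the subject's first guess was correct
guesses = [1, 1, 2, 1, 3, 1, 1, 2, 1, 5]
print(f"Upper bound: {entropy_upper_bound(guesses):.2f} bits/char")
```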
Keywords
compression, multilingual, translation