Overview of the INEX 2010 Book Track: Scaling Up the Evaluation Using Crowdsourcing

INEX'10: Proceedings of the 9th International Conference on Initiative for the Evaluation of XML Retrieval: Comparative Evaluation of Focused Retrieval (2011)

Cited by 20 | Views 17
Abstract
The goal of the INEX Book Track is to evaluate approaches for supporting users in searching, navigating, and reading the full texts of digitized books. The investigation is focused around four tasks: 1) Best Books to Reference, 2) Prove It, 3) Structure Extraction, and 4) Active Reading. In this paper, we report on the setup and the results of these tasks in 2010. The main outcome of the track lies in the changes to the methodology for constructing the test collection for the evaluation of the Best Books and Prove It search tasks. In an effort to scale up the evaluation, we explored the use of crowdsourcing both to create the test topics and to gather the relevance labels for the topics over a corpus of 50k digitized books. The resulting test collection construction methodology combines editorial judgments contributed by INEX participants with crowdsourced relevance labels. We provide an analysis of the crowdsourced data and conclude that, with appropriate task design, crowdsourcing does provide a suitable framework for the evaluation of book search approaches.
Keywords
Best Books,digitized book,resulting test collection construction,test collection,test topic,INEX Book Track,INEX participant,book search approach,crowdsourced data,crowdsourced relevance label,book track