User-oriented evaluation methods for information retrieval: a case study based on conceptual models for query expansion

Exploring artificial intelligence in the new millennium (2003)

Abstract
This chapter discusses evaluation methods based on the use of nondichotomous relevance judgments in information retrieval (IR) experiments. It is argued that evaluation methods should credit IR methods for their ability to retrieve highly relevant documents; this is desirable from the user's point of view in modern large IR environments. The proposed methods are (1) a novel application of P-R curves and average precision computations based on separate recall bases for documents of different degrees of relevance, and (2) two novel measures computing the cumulative gain the user obtains by examining the retrieval result up to a given ranked position. We then demonstrate the use of these evaluation methods in a case study on the effectiveness of query types, based on combinations of query structures and expansion, in retrieving documents of various degrees of relevance. Query expansion is based on concepts, which are selected from a conceptual model and then expanded by semantic relationships given in the model. The test is run with a best-match retrieval system (InQuery) in a text database consisting of newspaper articles. The case study indicates the usability of domain-dependent conceptual models in query expansion for IR. The results show that expanded queries with a strong query structure are most effective in retrieving highly relevant documents. The differences between the query types are both statistically significant and practically substantial. More generally, the novel evaluation methods and the case study demonstrate that nondichotomous relevance assessments are applicable in IR experiments and allow harder testing of IR methods. The proposed methods are user-oriented because users' benefits and efforts (highly relevant documents retrieved and the number of documents to be examined) are taken into account.
Keywords
expanded query, evaluation method, IR method, case study, query expansion, modern large IR environment, query type, relevant document, information retrieval, conceptual model, user-oriented evaluation method, IR experiment