Improving search engine efficiency through contextual factor selection

AI MAGAZINE (2021)

Abstract
Learning to rank (LTR) is an important artificial intelligence (AI) approach supporting the operation of many search engines. In large-scale search systems, ranking results are continually improved by introducing more factors for LTR to consider. However, the more factors are considered, the more computational resources are required, which in turn increases system response latency. Therefore, removing redundant factors can significantly improve search engine efficiency. In this paper, we report on our experience incorporating our Contextual Factor Selection (CFS) deep reinforcement learning approach into the Taobao e-commerce platform to optimize the selection of factors based on the context of each search query, maintaining search result quality while significantly reducing latency. Online deployment on Taobao.com demonstrated that CFS reduces average search latency under everyday use scenarios by more than 40% compared to the previous approach, with comparable search result quality. Under peak usage during the Singles' Day Shopping Festival (November 11th) in 2017, CFS reduced average search latency by 20% compared to the previous approach.