Creating Experts From the Crowd: Techniques for Finding Workers for Difficult Tasks

IEEE Transactions on Multimedia (2014)

Abstract
Crowdsourcing is currently used for a range of applications, either by exploiting unsolicited user-generated content, such as spontaneously annotated images, or by using explicit crowdsourcing platforms such as Amazon Mechanical Turk to mass-outsource artificial-intelligence-type jobs. However, crowdsourcing is most often seen as the best option for tasks that require no more of workers than their untrained human intuition. This article describes our methods for identifying workers for crowdsourced tasks that are difficult for both machines and humans. It discusses the challenges we encountered in qualifying annotators and the steps we took to select the individuals most likely to do well at these tasks.
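The abstract does not spell out the qualification procedure itself. As a rough, hypothetical illustration of one common way to qualify annotators, the sketch below screens workers against gold-standard items with known answers and keeps only those above an accuracy threshold; the names (gold_answers, worker_responses, QUALIFICATION_THRESHOLD) and the threshold value are illustrative assumptions, not taken from the paper.

```python
# Hypothetical sketch of gold-standard qualification screening.
# All names and the 0.8 threshold are illustrative, not from the paper.

QUALIFICATION_THRESHOLD = 0.8  # assumed minimum accuracy on gold items


def qualify_workers(worker_responses, gold_answers, threshold=QUALIFICATION_THRESHOLD):
    """Return the set of worker IDs whose accuracy on gold-standard items
    meets or exceeds `threshold`.

    worker_responses: dict mapping worker_id -> {item_id: answer}
    gold_answers:     dict mapping item_id -> known correct answer
    """
    qualified = set()
    for worker_id, answers in worker_responses.items():
        graded = [
            answers[item] == correct
            for item, correct in gold_answers.items()
            if item in answers
        ]
        # Only workers who attempted gold items and scored above threshold pass.
        if graded and sum(graded) / len(graded) >= threshold:
            qualified.add(worker_id)
    return qualified


if __name__ == "__main__":
    gold = {"q1": "Berlin", "q2": "Nairobi"}
    responses = {
        "workerA": {"q1": "Berlin", "q2": "Nairobi"},  # 100% accuracy, passes
        "workerB": {"q1": "Berlin", "q2": "Cairo"},    # 50% accuracy, fails
    }
    print(qualify_workers(responses, gold))  # {'workerA'}
```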
Keywords
social networking (online), video signal processing, Amazon Mechanical Turk, annotators, artificial-intelligence-type jobs, crowdsourcing, multimodal location estimation, social media video, unsolicited user-generated content, annotation, cheat detection, mechanical turk, qualification