Hierarchical multi-armed bandits for discovering hidden populations.

ASONAM '19: International Conference on Advances in Social Networks Analysis and Mining, Vancouver, British Columbia, Canada, August 2019.

Abstract
This paper proposes a novel algorithm to discover hidden individuals in a social network. The problem is increasingly important for social scientists because the populations they study (e.g., individuals with mental illness) converse online. Since these populations do not use the category label (e.g., mental illness) to describe themselves, directly querying with text is non-trivial. To bypass the limitations of network-based and query-rewriting frameworks, we focus on identifying hidden populations through attributed search. We propose a hierarchical multi-armed bandit sampler (DT-TMP) that couples a decision tree with reinforcement learning to query the combinatorial attributed search space, exploring and expanding along high-yielding decision-tree branches. A comprehensive set of experiments over a suite of twelve sampling tasks on three online web platforms and three offline entity datasets reveals that DT-TMP outperforms all baseline samplers by up to 54% on Twitter and 48% on RateMDs. An extensive ablation study confirms DT-TMP's superior performance under different sampling scenarios.
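
The abstract describes a decision tree over attribute-combination queries whose branches are selected by a bandit policy. The sketch below is an illustrative reading of that idea, not the authors' implementation: it assumes a Thompson-sampling bandit over tree nodes, a hypothetical `query_api` stand-in for an attributed search endpoint, and made-up attribute domains and thresholds (`ATTRIBUTES`, `BUDGET`, `expand_threshold`).

```python
import random

# Sketch of a hierarchical bandit over attributed queries (assumptions noted above).
# Each tree node is a partial attribute query with Beta(successes, failures) stats;
# Thompson sampling walks to a promising leaf, the leaf is queried, and
# high-yielding leaves are expanded by fixing one more attribute value.

ATTRIBUTES = {                      # hypothetical attribute domains
    "location": ["NY", "CA", "TX"],
    "gender": ["male", "female"],
    "age_band": ["18-25", "26-40", "41+"],
}
BUDGET = 10                         # entities returned per query (assumed)

class Node:
    def __init__(self, query):
        self.query = query          # dict of fixed attribute -> value
        self.alpha, self.beta = 1.0, 1.0   # Beta prior on query yield
        self.children = []

    def sample(self):
        return random.betavariate(self.alpha, self.beta)

    def expand(self):
        free = [a for a in ATTRIBUTES if a not in self.query]
        for attr in free[:1]:                     # refine along one more attribute
            for val in ATTRIBUTES[attr]:
                self.children.append(Node({**self.query, attr: val}))

def query_api(query, budget=BUDGET):
    """Hypothetical attributed-search call: returns how many of `budget`
    returned entities belong to the hidden population (random stub here)."""
    return random.randint(0, budget)

def dt_bandit(steps=50, expand_threshold=0.6):
    root = Node({})
    root.expand()
    found = 0
    for _ in range(steps):
        # Descend the tree, picking the Thompson-sampled best child at each level.
        node = root
        while node.children:
            node = max(node.children, key=lambda c: c.sample())
        hits = query_api(node.query)
        found += hits
        # Update the leaf's Beta statistics with the observed yield.
        node.alpha += hits
        node.beta += BUDGET - hits
        # Expand high-yielding leaves into more specific attribute queries.
        if not node.children and node.alpha / (node.alpha + node.beta) > expand_threshold:
            node.expand()
    return found

if __name__ == "__main__":
    print("hidden-population entities found:", dt_bandit())
```

In this reading, the tree supplies the hierarchy (coarse queries near the root, specific attribute combinations at the leaves) while the bandit statistics decide where to spend the remaining query budget.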
Keywords
hidden populations,social network,query rewriting,decision tree branches,combinatorial attributed search space,online Web platforms,hierarchical multiarmed bandits,reinforcement learning,Twitter