Multi-network fusion for collective inference.

KDD (2010)

Abstract
Although much of the recent work in statistical relational learning has focused on homogeneous networks, many relational domains naturally consist of multiple observed networks, where each network source records a different type of relationship between the same set of entities. For example, data about organizations may contain both an email communication network and a network of coworker ties. Since collective classification models rely on propagating information throughout the relational network to improve predictions, multi-network methods will need to consider how to best combine relational information from various link sources. There are two opportunities to combine multi-source link information for relational classification: data fusion methods combine the available information during learning, while classification fusion methods learn and apply models independently on each network source, then combine model predictions during inference. Past work has focused primarily on data fusion techniques, where features and/or links from various sources are combined. However, as the number of links, sources, and/or features increases, this approach can lead to high variance in the learned model, and can also increase the amount of noise propagated during inference, which will degrade performance. In this work, we focus on classification fusion, which overcomes these limitations by learning independent models to reduce variance. In addition, we develop a novel approach to collective fusion, which interleaves the learned models during collective inference. We evaluate our methods on synthetic and real-world social network data, showing that collective fusion significantly outperforms other methods over a wide range of conditions.
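The contrast between the fusion strategies can be made concrete with a small sketch. The Python code below is an illustrative approximation only, not the paper's implementation: the names (NetworkModel, collective_fusion), the ICA-style aggregation of neighbor label estimates, and the logistic-regression base learner are all assumptions made for illustration.

# Hypothetical sketch of classification fusion with interleaved (collective) inference.
# Assumes dense numpy adjacency matrices and that every class appears in the training set.
import numpy as np
from sklearn.linear_model import LogisticRegression

class NetworkModel:
    """Per-network relational classifier: features are node attributes plus
    the neighbor-averaged class-probability estimates from one link source."""
    def __init__(self, adj):
        self.adj = adj                              # adjacency matrix for one network source
        self.clf = LogisticRegression(max_iter=1000)

    def _relational_features(self, attrs, label_probs):
        # Average the current class-probability estimates over each node's neighbors.
        deg = self.adj.sum(axis=1, keepdims=True).clip(min=1)
        neigh = self.adj @ label_probs / deg
        return np.hstack([attrs, neigh])

    def fit(self, attrs, labels, train_idx, n_classes):
        # Learn on this source alone, using only the known labels as neighbor evidence.
        probs = np.zeros((len(labels), n_classes))
        probs[train_idx, labels[train_idx]] = 1.0
        X = self._relational_features(attrs, probs)
        self.clf.fit(X[train_idx], labels[train_idx])

    def predict_proba(self, attrs, label_probs):
        return self.clf.predict_proba(self._relational_features(attrs, label_probs))

def collective_fusion(models, attrs, labels, train_idx, test_idx, n_classes, iters=10):
    """Interleave independently learned per-network models during collective
    inference: on each pass, every model refines the shared label estimates,
    which then feed the next model's relational features."""
    probs = np.full((len(labels), n_classes), 1.0 / n_classes)
    probs[train_idx] = 0.0
    probs[train_idx, labels[train_idx]] = 1.0       # clamp known labels
    for _ in range(iters):
        for m in models:                            # interleaving step across link sources
            probs[test_idx] = m.predict_proba(attrs, probs)[test_idx]
    return probs.argmax(axis=1)

Under the same assumptions, a data-fusion baseline would instead concatenate the relational features from all link sources into a single model, while plain classification fusion would run each model's inference to completion separately and only average the final predictions; the interleaving above is what lets evidence from one source inform the other during inference.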