Fast metadata-driven multiresolution tensor decomposition.

CIKM '11: International Conference on Information and Knowledge Management, Glasgow, Scotland, UK, October 2011.

Abstract
Tensors (multi-dimensional arrays) are widely used for representing high-order data in applications ranging from social networks and sensor data to Internet traffic. Multi-way data analysis techniques, in particular tensor decompositions, allow extraction of hidden correlations among multi-way data and thus are key components of many data analysis frameworks. Intuitively, these algorithms can be thought of as multi-way clustering schemes, which consider multiple facets of the data in identifying clusters, their weights, and the contributions of each data element. Unfortunately, algorithms for fitting multi-way models are, in general, iterative and very time consuming. In this paper, we observe that, in many applications, there is a priori background knowledge (or metadata) about one or more domain dimensions. This metadata is often in the form of a hierarchy that clusters the elements of a given data facet (or mode). We investigate whether such single-mode data hierarchies can be used to boost the efficiency of the tensor decomposition process, without significant impact on the final decomposition quality. We consider each domain hierarchy as a guide that provides higher- or lower-resolution views of the data in the tensor on demand, and we rely on these metadata-induced multi-resolution tensor representations to develop a multiresolution approach to tensor decomposition. We focus on an alternating least squares (ALS) based implementation of the PARAllel FACtors (PARAFAC) decomposition (which decomposes a tensor into a diagonal core tensor and a set of factor matrices). Experimental results show that, when the available metadata is used as a rough guide, the proposed multiresolution method helps fit PARAFAC models with consistent savings in execution time and memory consumption (for both dense and sparse tensor representations, under different parameter settings), while preserving the quality of the decomposition.
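To make the two ingredients of the abstract concrete, below is a minimal sketch (not the authors' implementation) of (a) a rank-R PARAFAC fit via ALS on a dense 3-way numpy tensor and (b) a toy coarsening step that aggregates one mode according to a flat cut of a metadata hierarchy. The function names, the random test tensor, and the cluster map are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def unfold(X, mode):
    """Mode-n matricization of a 3-way tensor (Kolda-Bader convention)."""
    return np.reshape(np.moveaxis(X, mode, 0), (X.shape[mode], -1), order='F')

def khatri_rao(U, V):
    """Column-wise Kronecker product of two factor matrices."""
    I, R = U.shape
    J, _ = V.shape
    return np.einsum('ir,jr->ijr', U, V).reshape(I * J, R)

def parafac_als(X, rank, n_iter=50, seed=0):
    """Fit X ~= sum_r a_r o b_r o c_r by alternating least squares."""
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((X.shape[0], rank))
    B = rng.standard_normal((X.shape[1], rank))
    C = rng.standard_normal((X.shape[2], rank))
    X1, X2, X3 = unfold(X, 0), unfold(X, 1), unfold(X, 2)
    for _ in range(n_iter):
        A = X1 @ khatri_rao(C, B) @ np.linalg.pinv((C.T @ C) * (B.T @ B))
        B = X2 @ khatri_rao(C, A) @ np.linalg.pinv((C.T @ C) * (A.T @ A))
        C = X3 @ khatri_rao(B, A) @ np.linalg.pinv((B.T @ B) * (A.T @ A))
    return A, B, C

def coarsen_mode0(X, cluster_of):
    """Aggregate mode-0 fibers according to a flat cut of a metadata hierarchy.
    cluster_of[i] is the (hypothetical) coarse cluster id of fine index i."""
    n_clusters = int(max(cluster_of)) + 1
    M = np.zeros((n_clusters, X.shape[0]))      # cluster membership matrix
    M[cluster_of, np.arange(X.shape[0])] = 1.0
    return np.einsum('ci,ijk->cjk', M, X)       # sum the fibers of each cluster

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.random((60, 40, 30))                 # illustrative dense tensor
    cluster_of = np.arange(60) // 10             # 6 coarse clusters of 10 elements
    X_coarse = coarsen_mode0(X, cluster_of)      # much smaller 6 x 40 x 30 tensor
    A, B, C = parafac_als(X_coarse, rank=5)      # cheaper ALS fit at low resolution
    print(A.shape, B.shape, C.shape)
```

The sketch only shows why coarsening one mode reduces the ALS cost (the unfoldings and Khatri-Rao products shrink with that mode); how the paper refines such a coarse decomposition back to the original resolution is not reproduced here.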