Learning Perceptual Texture Similarity And Relative Attributes From Computational Features

2016 International Joint Conference on Neural Networks (IJCNN)

Abstract
Previous work has shown that perceptual texture similarity and relative attributes cannot be well described by computational features alone. In this paper, we propose to predict human visual perception of texture images by learning a non-linear mapping from a computational feature space to a perceptual space. Hand-crafted features and deep features, which have been applied successfully to texture classification tasks, were extracted and used to train Random Forest and rankSVM models against perceptual data from psychophysical experiments. Three texture datasets were used to evaluate the proposed method, and the experiments show that the predictions of the learnt models correlate highly with human results.
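
A minimal sketch of the Random Forest branch of this idea, not the authors' code: features of texture pairs are regressed onto human perceptual similarity scores, and the predictions are compared to held-out human judgements by rank correlation. All data shapes, hyperparameters, and the use of scikit-learn are assumptions; feature extraction and the rankSVM branch are omitted.

# Sketch: learn a non-linear mapping from a computational feature space to
# perceptual similarity scores with a Random Forest regressor (assumptions only).
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from scipy.stats import spearmanr

# Hypothetical data: each row is a feature vector for a texture pair (e.g.
# concatenated or differenced hand-crafted/deep features); targets are
# perceptual similarity scores from psychophysical experiments.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 128))
y_train = rng.uniform(size=200)
X_test = rng.normal(size=(50, 128))
y_test = rng.uniform(size=50)

# Fit the non-linear feature-to-perception mapping.
model = RandomForestRegressor(n_estimators=500, random_state=0)
model.fit(X_train, y_train)

# Evaluate how well predicted similarities correlate with human judgements.
rho, _ = spearmanr(model.predict(X_test), y_test)
print(f"Spearman correlation with perceptual scores: {rho:.3f}")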
Keywords
perceptual texture similarity,relative attributes,human visual perception,texture images,nonlinear mapping,computational feature space,perceptual space,hand-crafted features,deep features,texture classification,random forest,rankSVM,psychophysical experiments,texture datasets