Label smoothing and task-adaptive loss function based on prototype network for few-shot learning

Neural Networks (2022)

Abstract
Prototype networks suffer from two problems: the label information is not sufficiently reliable, and the hyperparameters of the loss function cannot adapt to changes in image feature information. To address these problems, we propose a method that combines label smoothing with task-adaptive hyperparameters. First, the label information of an image is processed with label smoothing regularization. Then, for each classification task, the distance matrix of the image features and a logarithmic operation are used to fuse the distance matrix with the hyperparameters of the loss function. Finally, the hyperparameters are associated with the smoothed labels and the distance matrix for predictive classification. The method is validated on the miniImageNet, FC100 and tieredImageNet datasets. Compared with unsmoothed labels and fixed hyperparameters, the flexible hyperparameters in the loss function improve classification accuracy by 2%–3% under few-shot conditions. The results show that the proposed method suppresses the interference of false labels and that the flexibility of the hyperparameters improves classification accuracy.
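To make the two ingredients of the abstract concrete, the sketch below shows label smoothing combined with a distance-dependent loss temperature in a standard prototypical-network episode. It is a minimal illustration under assumptions: the function name, the log1p-based temperature, and the smoothing value of 0.1 are hypothetical choices standing in for the paper's exact fusion of the distance matrix with the loss hyperparameters.

```python
import torch
import torch.nn.functional as F

def prototypical_loss(support_emb, support_labels, query_emb, query_labels,
                      n_way, smoothing=0.1):
    """Sketch: prototype classification with label smoothing and a
    distance-dependent temperature (not the paper's exact formulation)."""
    # Class prototypes: mean embedding of each class's support examples.
    prototypes = torch.stack(
        [support_emb[support_labels == c].mean(dim=0) for c in range(n_way)])

    # Squared Euclidean distances from each query to each prototype: (n_query, n_way).
    dists = torch.cdist(query_emb, prototypes).pow(2)

    # Task-adaptive temperature derived from the distance matrix via a
    # logarithmic operation (hypothetical stand-in for the paper's fusion step).
    temperature = torch.log1p(dists.mean()).detach() + 1e-6

    logits = -dists / temperature

    # Cross-entropy against label-smoothed targets.
    log_probs = F.log_softmax(logits, dim=1)
    with torch.no_grad():
        smooth_targets = torch.full_like(log_probs, smoothing / (n_way - 1))
        smooth_targets.scatter_(1, query_labels.unsqueeze(1), 1.0 - smoothing)
    return -(smooth_targets * log_probs).sum(dim=1).mean()
```

The two pieces map onto the abstract's claims: the smoothed targets reduce the influence of unreliable (possibly false) labels, while the temperature changes with the scale of the per-task distance matrix instead of being fixed across tasks.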
Keywords
Flexible hyperparameters, Improved loss function, Few-shot learning, Image classification, Deep learning