Skin Cancer Detection using Knowledge Distillation

Rashmi Yadav, Aruna Bhat

2024 3rd International Conference for Innovation in Technology (INOCON), 2024

Abstract
Skin conditions are increasingly prevalent worldwide, and early detection and diagnosis are key to the treatment and prevention of skin disorders. In this research, we propose a knowledge-distillation approach to identify skin diseases. Our work has two parts. First, we train a deep neural network (the teacher model) on a large dataset publicly available on the Kaggle platform. Although the teacher model achieves high accuracy, its computational cost prevents deployment on low-power devices. To resolve this problem, we transfer the knowledge of the larger teacher model to a smaller student model using the knowledge distillation (KD) technique. Because the student model is simple and requires far fewer computations, it can be deployed on low-power devices. We evaluate the proposed framework on the publicly available dataset for binary classification. Our results show that the teacher model reaches 0.9531 training accuracy, 0.8197 validation accuracy, and 0.8758 test accuracy, while the distilled student model achieves 0.7970 accuracy on the test set.
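
The abstract describes the standard teacher-student distillation pipeline. The sketch below illustrates one possible way to set this up, assuming a PyTorch implementation with a pretrained DenseNet-121 teacher and a small CNN student for binary skin-lesion classification; the specific architectures, temperature, and loss weighting are illustrative assumptions, not the paper's reported settings.

```python
# Minimal knowledge-distillation sketch (assumed setup: PyTorch, a fine-tuned
# DenseNet-121 teacher, a small CNN student, 2-class skin-lesion labels).
# Temperature and alpha below are illustrative, not values from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision import models

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Blend soft-target KL loss (teacher guidance) with hard-label CE."""
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * (temperature ** 2)
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss

# Teacher: a large DenseNet adapted to 2 classes (benign/malignant).
teacher = models.densenet121(weights="IMAGENET1K_V1")
teacher.classifier = nn.Linear(teacher.classifier.in_features, 2)
teacher.eval()  # assumed to be already fine-tuned on the skin dataset

# Student: a much smaller CNN intended for low-power devices.
student = nn.Sequential(
    nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
    nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(32, 2),
)

optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)

def train_step(images, labels):
    # Teacher provides soft targets; only the student is updated.
    with torch.no_grad():
        teacher_logits = teacher(images)
    student_logits = student(images)
    loss = distillation_loss(student_logits, teacher_logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

In this formulation the temperature softens both probability distributions so the student learns the teacher's relative confidence across classes, while alpha balances imitation of the teacher against fitting the ground-truth labels.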
Keywords
Skin lesion, Detection, Knowledge distillation, DenseNet