Assessing Hyper Parameter Optimization and Speedup for Convolutional Neural Networks.

Int. J. Artif. Intell. Mach. Learn. (2020)

Abstract
The increased processing power of graphical processing units (GPUs) and the availability of large image datasets have fostered a renewed interest in extracting semantic information from images. Promising results on complex image categorization problems have been achieved with deep learning, using neural networks composed of many layers. The convolutional neural network (CNN) is one such architecture, and it offers particular advantages for image classification. Advances in CNNs enable training models on large labelled image datasets, but the hyper parameters of these models must be specified, which is challenging and complex because of their large number. Substantial computational power and processing time are required to determine the hyper parameter settings that yield a well-performing model. This article provides a survey of hyper parameter search and optimization methods for CNN architectures.
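As a rough illustration of the kind of search the survey addresses (this sketch is not taken from the article itself), a minimal random search over a hypothetical CNN hyper parameter space might look like the following. The parameter names, value ranges, and the `evaluate_fn` callback are assumptions introduced purely for demonstration; a real study would plug in its own training and validation routine.

```python
import random

# Hypothetical CNN hyper parameter search space; the parameter names
# and ranges are illustrative only, not taken from the surveyed paper.
SEARCH_SPACE = {
    "learning_rate": [1e-4, 3e-4, 1e-3, 3e-3, 1e-2],
    "batch_size": [32, 64, 128],
    "num_conv_layers": [2, 3, 4, 5],
    "filters_per_layer": [16, 32, 64, 128],
    "kernel_size": [3, 5],
    "dropout_rate": [0.0, 0.25, 0.5],
}


def sample_config(space):
    """Draw one random configuration from the search space."""
    return {name: random.choice(values) for name, values in space.items()}


def random_search(space, evaluate_fn, num_trials=20, seed=0):
    """Evaluate `num_trials` random configurations and keep the best.

    `evaluate_fn` is a user-supplied callable that trains a CNN with the
    given configuration and returns a validation score (higher is better).
    """
    random.seed(seed)
    best_config, best_score = None, float("-inf")
    for _ in range(num_trials):
        config = sample_config(space)
        score = evaluate_fn(config)
        if score > best_score:
            best_config, best_score = config, score
    return best_config, best_score
```

In practice the cost is dominated by `evaluate_fn`, since each trial requires training a CNN to convergence (or to an early-stopping point), which is why the computational burden noted in the abstract grows quickly with the number of hyper parameters and trials.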
Keywords
Artificial Intelligence, Cognitive Image Processing, Convolution, Deep Learning, Hidden Layers, Machine Learning, Object Recognition, Semantics