Winner takes all hashing for speeding up the training of neural networks in large class problems.
Pattern Recognition Letters (2017)
Highlights
- Use Winner Takes All (WTA) hashing to speed up the classification layer of neural networks.
- Study the number of units that require computation in a classification layer.
- Study the performance of WTA hashing in identifying the units that require computation.
- Demonstrate the speed-accuracy trade-off at test time when using WTA hashing.
- Show a 6x training speedup on the fall 2011 release of the ImageNet-21K dataset.
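The highlights above rely on WTA hashing, which encodes a vector by the rank order of its components rather than their magnitudes: each hash symbol records which of the first k elements of a random permutation of the input is largest. Below is a minimal sketch of that encoding step (the permutation count `n_perms` and window size `k` are illustrative choices, not the paper's settings):

```python
import numpy as np

def wta_hash(x, perms, k):
    """WTA-hash a vector: for each stored permutation, permute x,
    keep the first k coordinates, and record the argmax index."""
    codes = []
    for p in perms:
        window = x[p[:k]]          # first k elements under this permutation
        codes.append(int(np.argmax(window)))
    return np.array(codes)

rng = np.random.default_rng(0)
d, n_perms, k = 16, 8, 4           # example dimensions, chosen for illustration
perms = [rng.permutation(d) for _ in range(n_perms)]
x = rng.standard_normal(d)
codes = wta_hash(x, perms, k)      # n_perms symbols, each in [0, k)
```

Because each symbol depends only on comparisons between components, the code is invariant to any monotonic transform of the input, which is what makes it useful for cheaply shortlisting the classifier units most likely to fire.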
Keywords
41A05, 41A10, 65D05, 65D17