Approximate logic neuron model trained by states of matter search algorithm.

Knowledge-Based Systems (2019)

Abstract
An approximate logic neuron model (ALNM) is a single-neuron model with a dynamic dendritic structure. During training, the model can discard useless synapses and unnecessary dendritic branches via a neural pruning function, yielding a simplified dendritic morphology for each particular problem. The simplified ALNM can then be replaced by an equivalent logic circuit, which is easy to implement in hardware. However, the computational capacity of this model has been greatly restricted by its learning algorithm, the back-propagation (BP) algorithm, because BP is sensitive to initial values and prone to becoming trapped in local minima. To address this critical issue, we investigate heuristic optimization methods, which are recognized as global search algorithms. Through comparative experiments, the states of matter search (SMS) algorithm is verified to be the most suitable training method for the ALNM. To evaluate the performance of SMS, six benchmark datasets are used in the experiments, and the results are compared with those of the BP algorithm, other optimization methods, and several widely used classifiers. In addition, the classification performance of logic circuits trained by SMS is also presented in this study.
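To make the idea concrete, the following is a minimal sketch of training a dendritic neuron of this general shape with a population-based global search instead of BP. The model structure (sigmoidal synapses, multiplication within each dendritic branch, summation at the membrane, a sigmoidal soma) follows the standard dendritic-neuron formulation; the optimizer below is a simplified population-based stand-in for SMS, not the full SMS algorithm from the paper, and all function names, the slope constant `k`, and the toy XOR data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def alnm_forward(params, X, k=5.0):
    """Simplified dendritic neuron (illustrative, not the paper's exact model).
    params: (branches, features, 2) holding synaptic weights w and thresholds q."""
    w = params[:, :, 0]
    q = params[:, :, 1]
    # Synapse layer: sigmoid(k * (w*x - q)) per branch and feature.
    syn = 1.0 / (1.0 + np.exp(-k * (X[:, None, :] * w - q)))  # (N, B, F)
    dend = syn.prod(axis=2)   # multiplication within each dendritic branch
    memb = dend.sum(axis=1)   # membrane: sum over branches
    return 1.0 / (1.0 + np.exp(-k * (memb - 0.5)))            # soma output

def fitness(params, X, y):
    return np.mean((alnm_forward(params, X) - y) ** 2)

def train_population(X, y, pop=30, iters=200, shape=(2, 2, 2)):
    """Population-based global search (a simplified SMS-like scheme):
    candidates drift toward the current best, with large exploratory moves
    early ("gas" phase) shrinking to small local moves late ("solid" phase)."""
    P = rng.uniform(-2.0, 2.0, size=(pop,) + shape)
    fit = np.array([fitness(p, X, y) for p in P])
    for t in range(iters):
        step = 1.0 * (1.0 - t / iters) + 0.05   # annealed step size
        best = P[fit.argmin()]
        for i in range(pop):
            cand = (P[i]
                    + step * (best - P[i]) * rng.random(shape)
                    + step * rng.normal(0.0, 0.5, shape))
            f = fitness(cand, X, y)
            if f < fit[i]:                       # greedy acceptance
                P[i], fit[i] = cand, f
    return P[fit.argmin()], fit.min()

# Toy XOR-style task: not linearly separable, so BP from a bad start can stall.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)
best_params, best_err = train_population(X, y)
```

Because the search only accepts improving candidates, the final error is monotonically no worse than the best random initialization; unlike gradient descent, no derivatives of the multiplicative dendritic structure are needed.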
Keywords
Classification, States of matter search, Neural network, Pruning, Logic circuit