In-Materio Extreme Learning Machines.

Parallel Problem Solving from Nature (2022)

Abstract
Nanomaterial networks have been presented as a building block for unconventional in-Materio processors. Evolution in-Materio (EiM) has previously offered a way to configure and exploit physical materials for computation, but its ability to scale as datasets grow larger and more complex remains unclear. Extreme Learning Machines (ELMs) exploit a randomly initialised single-layer feed-forward neural network by training the output layer only. An analogy for a physical ELM is produced by exploiting nanomaterial networks as material neurons within the hidden layer. Circuit simulations are used to efficiently investigate diode-resistor networks, which act as our material neurons. These in-Materio ELMs (iM-ELMs) outperform common classification methods and traditional artificial ELMs with hidden layers of a similar size. For iM-ELMs using the same number of hidden-layer neurons, leveraging larger, more complex material-neuron topologies (with more nodes/electrodes) leads to better performance, showing that these larger materials have a greater capability to process data. Finally, iM-ELMs using virtual material neurons, where a single material is re-used as several virtual neurons, were found to achieve results comparable to iM-ELMs exploiting several different materials. However, while these virtual iM-ELMs provide significant flexibility, they sacrifice the highly parallelised nature of physically implemented iM-ELMs.
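For context, the sketch below shows the conventional artificial ELM that the paper uses as a baseline: a fixed, randomly initialised hidden layer followed by an output layer solved in closed form by least squares. The tanh hidden layer here is a hypothetical stand-in; in an iM-ELM, those activations would instead come from the (simulated) diode-resistor material neurons. Function names and parameters are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_fit(X, Y, n_hidden=64):
    """Train an ELM: the hidden layer is random and fixed;
    only the output weights are solved for (least squares)."""
    n_features = X.shape[1]
    W = rng.standard_normal((n_features, n_hidden))  # random input weights, never trained
    b = rng.standard_normal(n_hidden)                # random biases, never trained
    H = np.tanh(X @ W + b)                           # hidden-layer activations
    beta, *_ = np.linalg.lstsq(H, Y, rcond=None)     # output weights via least squares
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy usage: two Gaussian blobs with one-hot class targets.
X = np.vstack([rng.normal(-1, 0.5, (100, 2)), rng.normal(1, 0.5, (100, 2))])
Y = np.vstack([np.tile([1, 0], (100, 1)), np.tile([0, 1], (100, 1))])
W, b, beta = elm_fit(X, Y)
acc = (elm_predict(X, W, b, beta).argmax(1) == Y.argmax(1)).mean()
print(f"training accuracy: {acc:.2f}")
```

In the in-materio variant described by the abstract, `np.tanh(X @ W + b)` would be replaced by the measured or simulated responses of the nanomaterial networks; the closed-form solve for `beta` is what makes training the output layer cheap regardless of how the hidden-layer responses are obtained.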
Keywords
Evolution in-Materio, Evolvable processors, Extreme learning machines, Material neurons, Virtual neurons, Classification