Multimodal Time Series Learning of Robots Based on Distributed and Integrated Modalities: Verification with a Simulator and Actual Robots

ICRA (2023)

Abstract
We have developed an autonomous robot motion generation model based on distributed and integrated multimodal learning. Since each modality used as a robot sense, such as image, joint angle, and torque, has a different physical meaning and time characteristic, motion generation with multimodal learning has sometimes failed due to overlearning in one of the modalities. Inspired by the sensory processing of the human brain, our model mirrors the per-sense processing performed in the primary somatosensory cortex and the integrated processing of multiple senses in the association cortex and the primary motor cortex. Specifically, the proposed model utilizes two types of recurrent neural networks: sensory RNNs, which learn each sense as a time series, and a union RNN, which communicates with the sensory RNNs and learns sensory integration. Simulation results on multiple tasks showed that our model processes multiple modalities appropriately and generates smoother motions with lower jerk than the conventional model. We also demonstrated a chair assembly task by combining fixed motions and autonomous motions with our model.
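The abstract describes an architecture with one sensory RNN per modality and a union RNN that exchanges state with them. The paper does not give implementation details, so the following is only a minimal PyTorch sketch of that idea under assumed design choices: LSTM cells, hidden states exchanged by concatenation at each time step, and per-modality linear output heads (the class and parameter names are hypothetical).

```python
import torch
import torch.nn as nn

class SensoryIntegrationModel(nn.Module):
    """Sketch of distributed/integrated multimodal learning.

    Each sensory RNN processes its own modality, conditioned on the
    union RNN's hidden state; the union RNN in turn reads the
    concatenated sensory hidden states, modeling sensory integration.
    """

    def __init__(self, modality_dims, sensory_hidden=32, union_hidden=64):
        super().__init__()
        # One sensory LSTM cell per modality; its input is the modality
        # signal concatenated with the union hidden state.
        self.sensory = nn.ModuleList(
            nn.LSTMCell(d + union_hidden, sensory_hidden) for d in modality_dims
        )
        # The union LSTM cell integrates all sensory hidden states.
        self.union = nn.LSTMCell(sensory_hidden * len(modality_dims), union_hidden)
        # Per-modality heads predicting the next value of each signal.
        self.heads = nn.ModuleList(
            nn.Linear(sensory_hidden, d) for d in modality_dims
        )
        self.sensory_hidden = sensory_hidden
        self.union_hidden = union_hidden

    def forward(self, inputs):
        # inputs: list of tensors, one per modality, each (batch, time, dim)
        batch, steps = inputs[0].shape[:2]
        hs = [inputs[0].new_zeros(batch, self.sensory_hidden) for _ in inputs]
        cs = [inputs[0].new_zeros(batch, self.sensory_hidden) for _ in inputs]
        hu = inputs[0].new_zeros(batch, self.union_hidden)
        cu = inputs[0].new_zeros(batch, self.union_hidden)
        preds = [[] for _ in inputs]
        for t in range(steps):
            # Distributed step: each sense is processed separately.
            for i, cell in enumerate(self.sensory):
                x = torch.cat([inputs[i][:, t], hu], dim=-1)
                hs[i], cs[i] = cell(x, (hs[i], cs[i]))
            # Integrated step: the union RNN fuses all sensory states.
            hu, cu = self.union(torch.cat(hs, dim=-1), (hu, cu))
            for i, head in enumerate(self.heads):
                preds[i].append(head(hs[i]))
        return [torch.stack(p, dim=1) for p in preds]
```

A forward pass with, say, an image-feature stream and a joint-angle stream returns one predicted sequence per modality with the same shape as its input, which could then be trained against the next-step targets.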
Keywords
association cortex,autonomous motions,autonomous robot motion generation model,conventional model,different physical meaning,fixed motions,integrated modalities,integrated processing,joint angle,multimodal learning,multiple modalities,multiple senses,primary motor cortex,primary somatosensory cortex,sensory integration,sensory processing,sensory RNNs,smoother motions,time characteristic