Active Learning-Based Grasp for Accurate Industrial Manipulation

Xiaokuan Fu, Yong Liu, Zhilei Wang

IEEE Transactions on Automation Science and Engineering (2019)

Abstract
We propose an active learning-based grasp method for accurate industrial manipulation that combines the high accuracy of geometrically driven grasp methods with the generalization ability of data-driven grasp methods. Our grasp sequence consists of a pregrasp stage and a grasp stage, which together integrate active perception and manipulation. In the pregrasp stage, the manipulator actively moves and perceives the object. At each step, given the perception image, a motion is chosen so that the manipulator can adjust to a proper pose to grasp the object. We train a convolutional neural network to estimate this motion and combine the network with closed-loop control so that the end effector can move to the pregrasp state. In the grasp stage, the manipulator executes a fixed motion to complete the grasp task. This fixed motion can be conveniently acquired from a demonstration by a nonexpert. Our proposed method does not require prior knowledge of camera intrinsic parameters, hand-eye transformation, or manually designed object features. Instead, the training data sets containing this prior knowledge are collected through interactive perception. The method can be easily transferred to new tasks with little human intervention and is able to complete high-accuracy grasp tasks with a certain robustness to partial-observation conditions. In our circuit board grasping tests, we achieved a grasp accuracy of 0.8 mm and 0.6°.
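The pregrasp stage described above is, in essence, a learned visual-servoing loop: a network maps the current perception to a corrective motion, the manipulator executes it, and the cycle repeats until the end effector reaches the pregrasp state. The following is a minimal illustrative sketch of that loop, not the authors' code; `Pose`, `predict_motion`, the proportional gain, and the convergence tolerances (taken from the reported 0.8 mm / 0.6° accuracy) are all hypothetical stand-ins. In particular, `predict_motion` fakes the CNN's output from the pose error, whereas the paper estimates the motion directly from the perception image.

```python
# Sketch of the closed-loop pregrasp control described in the abstract.
# All names and parameters are hypothetical; predict_motion stands in
# for the trained CNN (which in the paper takes an image, not a pose).

from dataclasses import dataclass


@dataclass
class Pose:
    x: float      # mm
    y: float      # mm
    theta: float  # degrees


def predict_motion(current: Pose, target: Pose) -> Pose:
    """Stand-in for the CNN: return a corrective motion toward the
    pregrasp state, here simply a proportional step on the pose error."""
    gain = 0.5
    return Pose(gain * (target.x - current.x),
                gain * (target.y - current.y),
                gain * (target.theta - current.theta))


def pregrasp_servo(start: Pose, target: Pose,
                   tol_mm: float = 0.8, tol_deg: float = 0.6,
                   max_steps: int = 50) -> Pose:
    """Closed loop: predict a motion, execute it, re-perceive, and repeat
    until the residual error is within the reported grasp accuracy."""
    pose = start
    for _ in range(max_steps):
        err_mm = max(abs(target.x - pose.x), abs(target.y - pose.y))
        err_deg = abs(target.theta - pose.theta)
        if err_mm < tol_mm and err_deg < tol_deg:
            break  # pregrasp state reached; the fixed grasp motion follows
        m = predict_motion(pose, target)
        pose = Pose(pose.x + m.x, pose.y + m.y, pose.theta + m.theta)
    return pose


final = pregrasp_servo(Pose(30.0, -20.0, 15.0), Pose(0.0, 0.0, 0.0))
```

Once the loop terminates, the grasp stage would execute the fixed, demonstration-derived motion; that part is open loop and is not sketched here.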
Keywords
Task analysis,Manipulators,Cameras,Grasping,Three-dimensional displays,Robustness