In industrial manufacturing, the deployment of dual-arm robots in assembly tasks has become a trend. However, making dual-arm robots more intelligent in such applications remains an open and challenging issue. This paper proposes a novel framework that combines task-oriented motion planning with visual perception to facilitate robot deployment from perception to execution and to solve assembly problems with dual-arm robots. In this framework, visual perception is first employed to track the effects of robot behaviors and to observe the states of the workpieces, so that task performance can be abstracted into high-level states for intelligent reasoning. The assembly task and manipulation sequences are then obtained by analyzing and reasoning over the state transition trajectory of the environment and the workpieces. Next, the corresponding assembly manipulations are generated and parameterized according to the differences between adjacent states, combined with prebuilt knowledge of the scenarios. Experiments are conducted with a dual-arm robotic system (an ABB YuMi robot and an RGB-D camera) to validate the proposed framework. Experimental results demonstrate the effectiveness of the framework and its promising value for practical application.
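The core reasoning step described above — deriving a manipulation sequence from the differences between adjacent high-level states, using prebuilt scenario knowledge — can be sketched as follows. This is a minimal illustration only; all names (`State`, `KNOWLEDGE_BASE`, `plan_manipulations`, the status labels) are hypothetical assumptions, not the paper's actual API.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class State:
    """High-level workpiece state abstracted from visual perception (assumed form)."""
    part: str
    status: str  # e.g. "loose", "aligned", "inserted"

# Prebuilt scenario knowledge (illustrative): each observed state
# transition maps to a parameterized assembly manipulation.
KNOWLEDGE_BASE = {
    ("loose", "aligned"): "align",
    ("aligned", "inserted"): "insert",
}

def plan_manipulations(trajectory):
    """Derive a manipulation sequence from differences between adjacent states."""
    actions = []
    for prev, curr in zip(trajectory, trajectory[1:]):
        if prev.status != curr.status:
            # Look up the manipulation that explains this state change,
            # then parameterize it with the workpiece it applies to.
            action = KNOWLEDGE_BASE[(prev.status, curr.status)]
            actions.append((action, curr.part))
    return actions

# Example: a perceived state trajectory for a peg workpiece.
trajectory = [State("peg", "loose"), State("peg", "aligned"), State("peg", "inserted")]
print(plan_manipulations(trajectory))  # → [('align', 'peg'), ('insert', 'peg')]
```

The sketch mirrors the pipeline's logic: perception produces a state trajectory, reasoning over adjacent-state differences recovers the manipulation sequence, and prebuilt knowledge grounds each transition in an executable action.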