Visual-tactile manipulation to collect household waste in outdoor
Published in Revista Iberoamericana de Automática e Informática Industrial (RIAI), 2023
This work presents a perception system for robotic manipulation that assists with navigation and with the classification and collection of household waste in outdoor environments. The system comprises optical tactile sensors, RGBD cameras and a LiDAR, integrated on a mobile platform equipped with a robot manipulator and a robotic gripper. The software is divided into three modules: two vision-based and one tactile-based. The vision-based modules use CNNs to localize and recognize solid household waste and to estimate grasping points. The tactile-based module, which also uses CNNs together with image processing, adjusts the gripper opening to control the grasp from touch data. Our proposal achieves localization errors of around 6 %, a recognition accuracy of 98 %, and ensures grasp stability in 91 % of attempts. The total runtime of the three modules is under 750 ms.
Keywords: Visual detection, Object recognition, Object location, Tactile perception, Robotic manipulation
Recommended citation: Julio Castaño-Amorós, Ignacio de Loyola Páez-Ubieta, Pablo Gil, Santiago Puente (2023). "Visual-tactile manipulation to collect household waste in outdoor." Revista Iberoamericana de Automática e Informática Industrial (RIAI). 20, 163-174, doi: 10.4995/riai.2022.18534
Download Paper