Robotic assembly using RGBD-based object pose estimation and grasp detection
Ahmad, Saad (2020)
Master's Programme in Automation Engineering
Faculty of Engineering and Natural Sciences
This publication is copyrighted. You may download, display, and print it for your own personal use. Commercial use is prohibited.
Date of approval
2020-10-07
The permanent address of this publication is
https://urn.fi/URN:NBN:fi:tuni-202009267132
Abstract
A great deal of research in robotics has addressed grasp detection using image and depth-sensor data. The most recent of this research demonstrates the dominance of deep-learning methods on both known and novel objects. With the drastic shift towards data-driven approaches, the number and variety of datasets has naturally grown large. Although standardized object sets and benchmarking protocols have been repeatedly used and improved upon, the complete pipeline from object detection to pose estimation, dexterous grasping, and generalized manipulation is an intricate problem that is still being re-iterated with different object categories and varying manipulation tasks and constraints. In this context, this thesis replicates two state-of-the-art grasp/pose estimation methods: one class-agnostic and one multi-class trained. The estimated grasps are used to assess the performance and repeatability of pick-and-place and pick-and-oscillate tasks. The thesis also goes in depth through data collection, training, and evaluation of pose estimation on an entirely new dataset comprising a few complex industrial parts and a few of the standard parts from the Cranfield assembly. The aim of this research is to assess the reusability of modern pose- and grasp-estimation methods from the robotics literature in terms of retraining, performance, efficiency, and generalization, and to combine them into a grasp-manipulate pipeline in order to evaluate their true utility in robotic-manipulation research.
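The detection → pose estimation → grasping → manipulation pipeline mentioned in the abstract can be sketched in outline. The following is a minimal, hypothetical Python sketch of such a chain; every function body, class name, and value here is an illustrative placeholder, not the thesis implementation or any real detector/estimator API.

```python
from dataclasses import dataclass

# Hypothetical sketch of a detect -> estimate-pose -> grasp -> place chain.
# All stages are stand-ins: a real system would call an RGB-D object
# detector, a learned pose estimator, and a grasp planner here.

@dataclass
class Detection:
    label: str
    bbox: tuple  # (x, y, w, h) in pixels

@dataclass
class Pose:
    translation: tuple  # (x, y, z) in metres
    rotation: tuple     # quaternion (x, y, z, w)

def detect_objects(rgbd_frame) -> list:
    """Stand-in for an RGB-D object detector (returns fixed example data)."""
    return [Detection("cranfield_peg", (120, 80, 40, 40))]

def estimate_pose(det: Detection, rgbd_frame) -> Pose:
    """Stand-in for a learned pose estimator (class-agnostic or multi-class)."""
    return Pose((0.40, 0.10, 0.05), (0.0, 0.0, 0.0, 1.0))

def plan_grasp(pose: Pose) -> Pose:
    """Toy grasp planner: approach 10 cm above the estimated object pose."""
    x, y, z = pose.translation
    return Pose((x, y, z + 0.10), pose.rotation)

def pick_and_place(rgbd_frame, place_at: tuple) -> list:
    """Chain the stages for each detected object; return the action log."""
    log = []
    for det in detect_objects(rgbd_frame):
        pose = estimate_pose(det, rgbd_frame)
        grasp = plan_grasp(pose)
        log.append(f"grasp {det.label} at {grasp.translation}")
        log.append(f"place {det.label} at {place_at}")
    return log
```

The point of the sketch is the staged structure: each stage (detector, pose estimator, grasp planner) can be swapped independently, which is what makes comparing class-agnostic and multi-class methods within one pick-and-place pipeline feasible.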