Development of a Toolset for Calibrating Optical Motion Capture with External RGBD Cameras
Käpylä, Jani (2023)
Bachelor's Programme in Computing and Electrical Engineering
Faculty of Information Technology and Communication Sciences
This publication is copyrighted. You may download, display and print it for Your own personal use. Commercial use is prohibited.
Date of approval
2023-12-20
The permanent address of the publication is
https://urn.fi/URN:NBN:fi:tuni-2023121610919
Abstract
Motion capture (mocap) is a procedure for tracking an object's or actor's movement in space. It is widely utilized in the entertainment industry as a source of animation and to drive live productions. In research, mocap can serve as a source of ground truth, for example, in robotics and computer vision. Camera-related applications require precise synchronization and knowledge of the exact transformation from the camera's optical center to the mocap frame. Once this transformation is known, it is possible to transform coordinates from the mocap frame to the camera coordinates.
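The coordinate transformation described above can be sketched as follows. This is a minimal illustration, not the thesis implementation; the rotation `R` and translation `t` are assumed example values describing the camera's optical center pose in the mocap frame.

```python
import numpy as np

# Hypothetical extrinsics (assumed values for illustration):
# R, t express the camera's optical-center pose in the mocap frame.
R = np.eye(3)                      # camera axes aligned with mocap axes
t = np.array([0.0, 0.0, 2.0])      # camera 2 m along the mocap z-axis

def mocap_to_camera(p_mocap, R, t):
    # Inverting the camera pose: p_cam = R^T (p_mocap - t)
    return R.T @ (p_mocap - t)

p = np.array([1.0, 0.0, 2.0])      # a point given in the mocap frame
p_cam = mocap_to_camera(p, R, t)   # the same point in camera coordinates
```

With the identity rotation above, the mapping reduces to a pure translation, so the point `[1, 0, 2]` lands at `[1, 0, 0]` in camera coordinates.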
The objective of this work is to integrate an optical motion capture system with external depth and color cameras. This includes camera calibration and synchronization of the system. Synchronization is exceptionally important due to the working principle and sensitivity of the depth and mocap cameras involved. The transformation between the mocap and the color camera is calibrated with a custom checkerboard tracked by the mocap. A toolset and pipeline for these tasks are created for future use. The work is performed by first studying the related literature and then implementing the different parts in C++ and Python. Existing libraries are utilized where possible. Mocap calibration is performed with the Kabsch-Umeyama algorithm, which is based on the least squares method. The system as a whole is evaluated and analyzed at every stage.
The produced system is tested by creating a proof-of-concept data set for object tracking and segmentation. The results show that the implemented system produces accurate positional ground truth for the camera's optical center. However, more care should be taken in operating the mocap, as the tracking data had problems in some frames. The whole calibration procedure could be made more straightforward by choosing a calibration method that does not require tracking the checkerboard.
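The Kabsch-Umeyama algorithm mentioned in the abstract finds the least-squares rigid transform between two corresponding point sets via a singular value decomposition. The sketch below is a generic illustration of that algorithm under the usual formulation, not the author's actual implementation; the function name and array layout are assumptions.

```python
import numpy as np

def kabsch_umeyama(A, B):
    """Least-squares rigid transform (R, t) such that A ≈ R @ B + t.

    A, B: (N, 3) arrays of corresponding 3-D points.
    Generic sketch of the Kabsch-Umeyama algorithm (rotation and
    translation only, unit scale assumed).
    """
    assert A.shape == B.shape
    mu_A = A.mean(axis=0)
    mu_B = B.mean(axis=0)
    # Cross-covariance of the centred point sets
    H = (B - mu_B).T @ (A - mu_A)
    U, S, Vt = np.linalg.svd(H)
    # Reflection correction keeps R a proper rotation (det R = +1)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])
    R = Vt.T @ D @ U.T
    t = mu_A - R @ mu_B
    return R, t

# Usage: recover a known rotation and translation from noise-free points
rng = np.random.default_rng(0)
B = rng.normal(size=(10, 3))
theta = 0.5
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([1.0, -2.0, 0.5])
A = B @ R_true.T + t_true
R_est, t_est = kabsch_umeyama(A, B)
```

On noise-free correspondences the estimate matches the true transform to machine precision; in the calibration setting, `A` and `B` would be the same marker positions observed in the two frames being aligned.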
Collections
- Bachelor's theses [8430]