Lightweight Monocular Depth with a Novel Neural Architecture Search Method
Huynh, Lam; Nguyen, Phong; Matas, Jiri; Rahtu, Esa; Heikkila, Janne (2022)
IEEE, 2022
This publication is copyrighted. You may download, display and print it for your own personal use. Commercial use is prohibited.
The permanent address of the publication is
https://urn.fi/URN:NBN:fi:tuni-202211078205
Description
Peer reviewed
Abstract
This paper presents a novel neural architecture search method, called LiDNAS, for generating lightweight monocular depth estimation models. Unlike previous neural architecture search (NAS) approaches, in which finding optimized networks is computationally demanding, the introduced Assisted Tabu Search enables efficient architecture exploration. Moreover, we construct the search space on a pre-defined backbone network to balance layer diversity against search space size. The LiDNAS method outperforms the state-of-the-art NAS approach proposed for disparity and depth estimation in terms of both search efficiency and output model performance. The LiDNAS-optimized models achieve results superior to the compact depth estimation state of the art on NYU-Depth-v2, KITTI, and ScanNet, while being 7%-500% more compact in size, i.e., in the number of model parameters.
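The abstract refers to tabu-search-based exploration of a search space built on a fixed backbone. As an orientation for readers unfamiliar with the idea, the sketch below shows a generic tabu search over discrete per-cell layer configurations; the choice space, proxy score, and neighbourhood rules are invented for illustration and are not the paper's Assisted Tabu Search or the actual LiDNAS search space.

import random

# Illustrative only: generic tabu search over discrete architecture choices.
# Each of NUM_CELLS cells in a fixed-length backbone picks a
# (kernel_size, expansion_ratio) pair.
CHOICES = [(3, 3), (3, 6), (5, 3), (5, 6)]
NUM_CELLS = 8

def proxy_score(arch):
    """Hypothetical proxy: reward larger kernels/expansions (stand-in for
    accuracy) while penalising a rough parameter-count estimate."""
    acc_proxy = sum(k + e for k, e in arch)
    param_proxy = sum(k * k * e for k, e in arch)
    return acc_proxy - 0.05 * param_proxy

def neighbours(arch):
    """All architectures reachable by changing a single cell's choice."""
    for i in range(NUM_CELLS):
        for choice in CHOICES:
            if choice != arch[i]:
                yield arch[:i] + (choice,) + arch[i + 1:]

def tabu_search(iterations=50, tabu_size=10, seed=0):
    random.seed(seed)
    current = tuple(random.choice(CHOICES) for _ in range(NUM_CELLS))
    best, best_score = current, proxy_score(current)
    tabu = [current]
    for _ in range(iterations):
        # Move to the best non-tabu neighbour (classic tabu-search step).
        candidates = [a for a in neighbours(current) if a not in tabu]
        if not candidates:
            break
        current = max(candidates, key=proxy_score)
        tabu.append(current)
        if len(tabu) > tabu_size:
            tabu.pop(0)  # short-term memory: forget the oldest visited point
        score = proxy_score(current)
        if score > best_score:
            best, best_score = current, score
    return best, best_score

if __name__ == "__main__":
    arch, score = tabu_search()
    print("best architecture:", arch, "proxy score:", round(score, 2))

In a real NAS setting the proxy score would be replaced by a trained-or-estimated accuracy measure together with a parameter or latency budget, which is far more expensive than the toy function used here.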
Collections
- TUNICRIS-julkaisut [19195]