Paper: Depth Anything: Unleashing the Power of Large-Scale Unlabeled Data by Lihe Yang, Bingyi Kang, Zilong Huang, Xiaogang Xu, Jiashi Feng, Hengshuang Zhao, 19 Jan. 2024
Code: https://github.com/LiheYoung/Depth-Anything
Project Page: https://depth-anything.github.io/
Conference: CVPR 2024
Category: foundation models, monocular depth estimation
- Context & Background
- Method
- Qualitative Results
- Experiments & Ablations
- Further Readings & Resources
Why is depth such an important modality, and why use deep learning for it?
Put simply: to navigate 3D space, an agent needs to know where everything is and how far away it is. Classical applications include collision avoidance, drivable-space detection, placing objects into virtual or augmented reality, creating 3D assets, guiding a robot to grasp an object, and many…