Method: ZeroPose

User jianqiujianqiu
Training image modalities: None
Test image modalities: RGB

Submitted to: BOP Challenge 2023

Training data: GSO dataset rendered by Megapose

Onboarding data: No need for onboarding

Used 3D models: Default, CAD

Notes: These results are from the ZeroPose method without multi-hypothesis refinement.

Abstract: We present a CAD model-based zero-shot pose estimation pipeline called ZeroPose. Existing pose estimation methods still require expensive training when applied to an unseen object, which greatly hinders their scalability in practical industrial applications. In contrast, the proposed method accurately estimates pose parameters for previously unseen objects without any training. Specifically, we design a two-step pipeline consisting of CAD model-based zero-shot instance segmentation and a zero-shot pose estimator. For the first step, we propose a simple but effective way to leverage CAD models and the visual foundation models SAM and ImageBind to segment the unseen object of interest at the instance level. For the second step, we exploit the rich geometric information in the CAD model of the rigid object to propose a lightweight hierarchical geometric structure matching mechanism that achieves zero-shot pose estimation. Extensive experimental results on the seven core datasets of the BOP challenge show that the proposed zero-shot instance segmentation method achieves performance comparable to a supervised Mask R-CNN, and the zero-shot pose estimation results outperform SOTA pose estimators with better efficiency.
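The second step described above matches geometric structure between the CAD model and the observed points to recover a rigid pose. The abstract does not give implementation details, so the following is only a minimal illustrative sketch of coarse-to-fine rigid matching: a coarse centroid alignment followed by nearest-neighbour correspondences and a least-squares rigid fit (Kabsch algorithm). The function names `kabsch` and `zero_shot_pose` are hypothetical and not part of the ZeroPose codebase.

```python
import numpy as np

def kabsch(P, Q):
    """Least-squares rigid fit (R, t) such that R @ P[i] + t ~= Q[i].

    Standard Kabsch algorithm on point correspondences; stands in here
    for the fine-alignment stage of a coarse-to-fine matching pipeline.
    P, Q: (N, 3) arrays of corresponding 3D points.
    """
    Pc, Qc = P.mean(axis=0), Q.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (P - Pc).T @ (Q - Qc)
    U, _, Vt = np.linalg.svd(H)
    # Correct an improper rotation (reflection) if one appears.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = Qc - R @ Pc
    return R, t

def zero_shot_pose(cad_points, scene_points):
    """Toy two-stage matching (hypothetical sketch, NOT the ZeroPose code):
    coarse centroid alignment, then nearest-neighbour correspondences
    refined by a rigid Kabsch fit."""
    # Coarse stage: translate CAD points so the centroids coincide.
    shift = scene_points.mean(axis=0) - cad_points.mean(axis=0)
    coarse = cad_points + shift
    # Fine stage: match each coarse point to its nearest scene point,
    # then solve for the rigid transform on those correspondences.
    d2 = ((coarse[:, None, :] - scene_points[None, :, :]) ** 2).sum(-1)
    matches = scene_points[d2.argmin(axis=1)]
    return kabsch(cad_points, matches)
```

A real pipeline would iterate this match-and-fit loop (as in ICP) and use learned or hierarchical geometric features rather than raw nearest neighbours, but the coarse-to-fine structure is the same.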

If you have any questions, feel free to contact us at

Computer specifications: RTX 3090

Public submissions

Date Submission name Dataset
2023-08-23 02:24 - HB
2023-08-23 10:46 - ITODD
2023-08-23 11:00 - YCB-V
2023-08-23 11:32 - T-LESS
2023-08-23 12:06 - TUD-L
2023-08-23 12:10 - LM-O
2023-08-23 12:47 - IC-BIN
2023-09-16 13:06 - IC-BIN
2023-09-16 13:50 - LM-O
2023-09-17 09:25 - HB
2023-09-17 09:39 - ITODD
2023-09-17 09:40 - T-LESS
2023-09-17 10:25 - YCB-V
2023-09-17 12:19 - TUD-L