Submission: ZeroPose-Multi-Hypo-Refinement/TUD-L

Submission name ZeroPose-Multi-Hypo-Refinement/TUD-L
Submission time (UTC) Sept. 20, 2023, 1:04 p.m.
User jianqiujianqiu
Task Model-based 6D localization of unseen objects
Dataset TUD-L
Description
Evaluation scores
AR: 0.790
AR_MSPD: 0.811
AR_MSSD: 0.812
AR_VSD: 0.748
average_time_per_image: -1.000
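
For context, the overall AR in the BOP evaluation is the mean of the recalls computed with the VSD, MSSD, and MSPD pose-error functions; the scores above reproduce this:

```python
# BOP's overall Average Recall is the mean of the VSD, MSSD and MSPD recalls.
ar_vsd, ar_mssd, ar_mspd = 0.748, 0.812, 0.811
ar = (ar_vsd + ar_mssd + ar_mspd) / 3
print(f"AR = {ar:.3f}")  # AR = 0.790, matching the reported score
```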

Method: ZeroPose-Multi-Hypo-Refinement

User jianqiujianqiu
Publication https://arxiv.org/abs/2305.17934
Implementation
Training image modalities RGB-D
Test image modalities RGB-D
Description

Submitted to: BOP Challenge 2023

Training data: GSO dataset rendered by Megapose

Onboarding data: No need for onboarding

Used 3D models: Default, CAD

Notes: These results are from the ZeroPose method with multi-hypothesis refinement (Megapose Refiner).

Abstract: We present a CAD model-based zero-shot pose estimation pipeline called ZeroPose. Existing pose estimation methods still require expensive training when applied to an unseen object, which greatly hinders their scalability in practical industrial applications. In contrast, the proposed method accurately estimates pose parameters for previously unseen objects without any training. Specifically, we design a two-step pipeline consisting of CAD model-based zero-shot instance segmentation and a zero-shot pose estimator. In the first step, we leverage CAD models and the visual foundation models SAM and ImageBind in a simple but effective way to segment the unseen object of interest at the instance level. In the second step, we exploit the rich geometric information in the rigid object's CAD model and propose a lightweight hierarchical geometric structure matching mechanism that achieves zero-shot pose estimation. Extensive experiments on the seven core datasets of the BOP challenge show that the proposed zero-shot instance segmentation method achieves performance comparable to supervised Mask R-CNN, and the zero-shot pose estimation results outperform SOTA pose estimators with better efficiency.
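
For intuition only, the sketch below outlines the two-step pipeline and the multi-hypothesis refinement step described above. It is not the authors' implementation: the helpers segment_unseen_instances, match_geometric_structure, and refine_with_megapose are hypothetical placeholders standing in for the segmentation, coarse matching, and refinement stages.

```python
# Illustrative sketch only: the helper functions are hypothetical placeholders,
# not the actual ZeroPose / Megapose API.

def estimate_unseen_object_poses(rgb, depth, cad_model, n_hypotheses=5):
    # Step 1: CAD model-based zero-shot instance segmentation, e.g. built on
    # visual foundation models such as SAM and ImageBind (placeholder call).
    masks = segment_unseen_instances(rgb, cad_model)

    poses = []
    for mask in masks:
        # Step 2: hierarchical geometric structure matching between the masked
        # depth / point cloud and the CAD model, keeping several coarse pose
        # hypotheses rather than a single estimate (placeholder call).
        hypotheses = match_geometric_structure(depth, mask, cad_model,
                                               top_k=n_hypotheses)

        # Multi-hypothesis refinement: refine each coarse hypothesis with a
        # render-and-compare refiner (the Megapose Refiner in this submission)
        # and keep the candidate with the best refinement score.
        refined = [refine_with_megapose(rgb, depth, cad_model, h)
                   for h in hypotheses]
        poses.append(max(refined, key=lambda r: r["score"]))
    return poses
```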

If you have any questions, feel free to contact us at jianqiuer@gmail.com.

Computer specifications V100