Submission name | |
---|---|
Submission time (UTC) | Aug. 22, 2023, 10:19 a.m. |
User | jianqiujianqiu |
Task | Model-based 6D localization of unseen objects |
Dataset | YCB-V |
Description | |
Evaluation scores | |
User | jianqiujianqiu |
---|---|
Publication | https://arxiv.org/abs/2305.17934 |
Implementation | |
Training image modalities | RGB-D |
Test image modalities | RGB-D |
Description | Submitted to: BOP Challenge 2023. Training data: GSO dataset rendered by MegaPose. Onboarding data: no onboarding needed. Used 3D models: default, CAD. Notes: these results are from the ZeroPose method with multi-hypothesis refinement (MegaPose refiner). Abstract: We present a CAD model-based zero-shot pose estimation pipeline called ZeroPose. Existing pose estimation methods still require expensive training when applied to an unseen object, which greatly limits their scalability in industrial applications. In contrast, the proposed method accurately estimates the pose parameters of previously unseen objects without any training. Specifically, we design a two-step pipeline consisting of CAD model-based zero-shot instance segmentation and a zero-shot pose estimator. For the first step, we leverage CAD models and the visual foundation models SAM and ImageBind in a simple but effective way to segment the unseen object of interest at the instance level. For the second step, we exploit the rich geometric information in the CAD model of the rigid object to propose a lightweight hierarchical geometric structure matching mechanism that achieves zero-shot pose estimation. Extensive experiments on the seven core datasets of the BOP challenge show that the proposed zero-shot instance segmentation method achieves performance comparable to supervised Mask R-CNN, and the zero-shot pose estimation results outperform SOTA pose estimators with better efficiency. If you have any questions, feel free to contact us at jianqiuer@gmail.com. (A minimal sketch of the matching-and-alignment step follows below the table.) |
Computer specifications | V100 |
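
The abstract describes a two-step pipeline: zero-shot instance segmentation of the target object, then pose estimation by matching geometric structure between the object's CAD model and the observed depth points. The sketch below is a minimal illustration under stated assumptions, not the authors' implementation: the paper's hierarchical geometric structure matching is reduced to plain nearest-neighbour matching of hypothetical per-point descriptors, followed by the standard Kabsch (SVD-based) least-squares rigid alignment that correspondence-based pose estimators share. All function names and the synthetic data are placeholders.

```python
# Minimal sketch: nearest-neighbour descriptor matching (a stand-in for the
# paper's hierarchical geometric structure matching) plus Kabsch rigid
# alignment. Synthetic data only; all names are hypothetical.
import numpy as np

def best_rigid_transform(src: np.ndarray, dst: np.ndarray):
    """Least-squares R, t such that dst ~= src @ R.T + t (Kabsch via SVD)."""
    src_mean, dst_mean = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_mean).T @ (dst - dst_mean)   # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_mean - R @ src_mean
    return R, t

def match_by_descriptor(model_feats: np.ndarray, scene_feats: np.ndarray):
    """For each model descriptor, index of the most similar scene descriptor."""
    a = model_feats / np.linalg.norm(model_feats, axis=1, keepdims=True)
    b = scene_feats / np.linalg.norm(scene_feats, axis=1, keepdims=True)
    return (a @ b.T).argmax(axis=1)             # cosine-similarity argmax

# --- toy end-to-end run on synthetic data --------------------------------
rng = np.random.default_rng(0)
model_pts = rng.normal(size=(200, 3))           # points from a "CAD model"

angle = np.pi / 5                               # ground-truth pose (z-rotation)
R_gt = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                 [np.sin(angle),  np.cos(angle), 0.0],
                 [0.0,            0.0,           1.0]])
t_gt = np.array([0.1, -0.3, 0.5])

perm = rng.permutation(200)                     # unknown point ordering
scene_pts = (model_pts @ R_gt.T + t_gt)[perm]   # "observed" depth points

feats = rng.normal(size=(200, 32))              # hypothetical descriptors,
scene_feats = feats[perm]                       # shared across both clouds

idx = match_by_descriptor(feats, scene_feats)
R_est, t_est = best_rigid_transform(model_pts, scene_pts[idx])
print(np.allclose(R_est, R_gt), np.allclose(t_est, t_gt))  # True True
```

Once correspondences are established, the closed-form SVD step recovers the full 6D pose in one shot; in the actual method the descriptors and the coarse-to-fine matching come from the hierarchy described in the paper, and the resulting pose is further refined with the MegaPose refiner.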