Model-based 2D detection of unseen objects – TUD-L

This leaderboard shows the ranking for model-based 2D detection of unseen objects on TUD-L. The metrics (AP/AR variants) are defined in Section 2 of the BOP 2022 paper. The reported time is the average processing time per image.
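As a rough guide to how AP columns of this kind are computed, the sketch below implements a COCO-style average precision: detections are matched greedily to ground-truth boxes at a given IoU threshold, precision/recall is accumulated, and the final AP averages over IoU thresholds 0.50:0.05:0.95. This is an illustrative simplification (all-point integration, single class, single image), not the official BOP evaluation code; all function names are my own.

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def average_precision(dets, gts, thr):
    """AP at one IoU threshold.

    dets: list of (score, box) detections; gts: list of ground-truth boxes.
    """
    dets = sorted(dets, key=lambda d: -d[0])  # highest-confidence first
    matched = set()
    tp, fp = [], []
    for score, box in dets:
        # Greedily match to the best still-unmatched ground truth.
        best, best_j = 0.0, -1
        for j, gt in enumerate(gts):
            if j in matched:
                continue
            o = iou(box, gt)
            if o > best:
                best, best_j = o, j
        if best >= thr:
            matched.add(best_j)
            tp.append(1); fp.append(0)
        else:
            tp.append(0); fp.append(1)
    # Accumulate precision/recall down the ranked list and integrate.
    # (Real COCO evaluation uses a 101-point interpolated precision envelope;
    # this plain summation is a simplification.)
    ap, cum_tp, cum_fp, prev_recall = 0.0, 0, 0, 0.0
    for t, f in zip(tp, fp):
        cum_tp += t; cum_fp += f
        recall = cum_tp / len(gts)
        precision = cum_tp / (cum_tp + cum_fp)
        ap += (recall - prev_recall) * precision
        prev_recall = recall
    return ap

def coco_ap(dets, gts):
    """AP averaged over IoU thresholds 0.50, 0.55, ..., 0.95."""
    thrs = [0.50 + 0.05 * i for i in range(10)]
    return sum(average_precision(dets, gts, t) for t in thrs) / len(thrs)
```

The AP50 and AP75 columns correspond to `average_precision` at thresholds 0.50 and 0.75, while the headline AP column is the 10-threshold average; the S/M/L suffixes restrict evaluation to small, medium, and large objects.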

| Date (UTC) | Submission | Test image | AP | AP50 | AP75 | APS | APM | APL | AR1 | AR10 | AR100 | ARS | ARM | ARL | Time (s) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 2024-11-16 | MUSE (full) | RGB | 0.595 | 0.878 | 0.728 | 0.096 | 0.609 | 0.659 | 0.674 | 0.698 | 0.698 | 0.500 | 0.686 | 0.777 | 0.622 |
| 2023-12-05 | SAM6D-FastSAM | RGB-D | 0.573 | 0.865 | 0.672 | 0.577 | 0.583 | 0.621 | 0.648 | 0.667 | 0.667 | 0.600 | 0.662 | 0.718 | 0.333 |
| 2024-09-18 | F3DT2D (test01/blenderproc) | RGB | 0.573 | 0.856 | 0.696 | 0.026 | 0.573 | 0.700 | 0.660 | 0.689 | 0.689 | 0.475 | 0.676 | 0.782 | 0.449 |
| 2024-03-22 | SAM6D-FastSAM(RGB) | RGB | 0.546 | 0.830 | 0.633 | 0.536 | 0.554 | 0.608 | 0.631 | 0.658 | 0.658 | 0.575 | 0.655 | 0.712 | 0.186 |
| 2023-12-05 | SAM6D | RGB-D | 0.537 | 0.850 | 0.617 | 0.575 | 0.549 | 0.446 | 0.634 | 0.655 | 0.659 | 0.700 | 0.652 | 0.605 | 2.393 |
| 2023-08-02 | CNOS (FastSAM) | RGB | 0.534 | 0.829 | 0.623 | 0.507 | 0.553 | 0.378 | 0.635 | 0.655 | 0.655 | 0.675 | 0.652 | 0.603 | 0.163 |
| 2023-11-23 | ViewInvDet | RGB | 0.508 | 0.813 | 0.584 | 0.599 | 0.533 | 0.436 | 0.620 | 0.648 | 0.648 | 0.650 | 0.649 | 0.633 | 1.268 |
| 2024-05-08 | NIDS-Net_WA_Sappe | RGB | 0.486 | 0.829 | 0.522 | 0.363 | 0.525 | 0.414 | 0.584 | 0.598 | 0.598 | 0.475 | 0.592 | 0.595 | 0.489 |
| 2024-05-08 | NIDS-Net_WA | RGB | 0.460 | 0.807 | 0.481 | 0.316 | 0.507 | 0.288 | 0.567 | 0.581 | 0.581 | 0.475 | 0.574 | 0.521 | 0.485 |
| 2024-05-08 | NIDS-Net_basic | RGB | 0.434 | 0.775 | 0.462 | 0.047 | 0.474 | 0.309 | 0.554 | 0.577 | 0.577 | 0.475 | 0.573 | 0.501 | 0.487 |
| 2023-09-17 | ZeroPose | RGB | 0.431 | 0.650 | 0.514 | 0.030 | 0.442 | 0.446 | 0.540 | 0.607 | 0.614 | 0.775 | 0.611 | 0.510 | 3.406 |
| 2023-08-02 | CNOS (SAM) | RGB | 0.368 | 0.598 | 0.399 | 0.308 | 0.383 | 0.245 | 0.453 | 0.476 | 0.476 | 0.675 | 0.477 | 0.420 | 1.623 |