Model-based 2D segmentation of unseen objects – TUD-L

This leaderboard shows the ranking for model-based 2D segmentation of unseen objects on TUD-L. The AP/AR columns follow COCO-style conventions: AP is averaged over IoU thresholds from 0.50 to 0.95, AP50 and AP75 use fixed thresholds of 0.50 and 0.75, the S/M/L suffixes split results by object size, and AR1/AR10/AR100 cap the number of detections per image. The reported time is the average processing time per image.

Date (UTC) Submission Test image AP AP50 AP75 APS APM APL AR1 AR10 AR100 ARS ARM ARL Time (s)
2023-12-05 SAM6D RGB-D 0.569 0.885 0.674 0.647 0.567 0.526 0.629 0.649 0.650 0.750 0.640 0.612 2.393
2024-11-16 MUSE (full) RGB 0.565 0.855 0.688 0.129 0.569 0.623 0.624 0.652 0.652 0.600 0.637 0.737 0.622
2024-05-27 NIDS-Net_WA_Sappe RGB 0.556 0.878 0.617 0.501 0.560 0.634 0.605 0.614 0.614 0.600 0.600 0.697 0.489
2024-05-08 NIDS-Net_WA RGB 0.535 0.869 0.578 0.449 0.547 0.492 0.592 0.604 0.604 0.600 0.589 0.678 0.485
2024-05-07 NIDS-Net_basic RGB 0.520 0.844 0.562 0.064 0.527 0.566 0.582 0.604 0.604 0.600 0.591 0.672 0.487
2023-12-05 SAM6D-FastSAM RGB-D 0.517 0.866 0.593 0.595 0.518 0.580 0.568 0.577 0.577 0.625 0.567 0.644 0.333
2024-03-22 SAM6D-FastSAM(RGB) RGB 0.501 0.838 0.571 0.568 0.496 0.579 0.557 0.574 0.574 0.600 0.563 0.648 0.186
2024-03-22 SAM6D(RGB) RGB 0.498 0.824 0.569 0.317 0.495 0.448 0.584 0.620 0.623 0.675 0.614 0.596 1.980
2023-08-02 CNOS (FastSAM) RGB 0.480 0.818 0.541 0.523 0.484 0.359 0.553 0.566 0.566 0.650 0.556 0.571 0.163
2023-11-21 ViewInvDet RGB 0.464 0.825 0.506 0.520 0.467 0.453 0.540 0.566 0.566 0.600 0.555 0.606 1.268
2023-08-23 ZeroPose RGB 0.421 0.647 0.512 0.030 0.424 0.442 0.511 0.582 0.586 0.750 0.581 0.530 3.406
2023-08-02 CNOS (SAM) RGB 0.391 0.599 0.458 0.322 0.400 0.269 0.450 0.473 0.473 0.675 0.472 0.444 1.623
2023-09-27 lcc-fastsam (3rd stage) RGB 0.153 0.288 0.146 0.012 0.161 0.142 0.249 0.291 0.291 0.175 0.276 0.395 0.356
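To make the metrics concrete, the sketch below computes mask IoU and average precision at a single IoU threshold in the COCO style (area under the precision-recall curve). This is an illustrative simplification with hypothetical inputs, not the official evaluation code, which additionally averages over IoU thresholds, categories, and detection limits.

```python
import numpy as np

def mask_iou(pred, gt):
    """Intersection-over-union between two boolean segmentation masks."""
    inter = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    return inter / union if union > 0 else 0.0

def average_precision(scores, matches, num_gt):
    """AP at one IoU threshold: area under the precision-recall curve.

    scores  -- confidence of each prediction
    matches -- True where the prediction matched a ground-truth instance
               (e.g. mask_iou >= 0.5 for AP50)
    num_gt  -- total number of ground-truth instances
    """
    order = np.argsort(scores)[::-1]          # rank predictions by confidence
    tp = np.cumsum(np.asarray(matches)[order])
    fp = np.cumsum(~np.asarray(matches)[order])
    recall = tp / num_gt
    precision = tp / (tp + fp)
    # All-point interpolation: accumulate precision over each recall step.
    ap, prev_r = 0.0, 0.0
    for p, r in zip(precision, recall):
        ap += p * (r - prev_r)
        prev_r = r
    return ap
```

For example, two confident predictions that both match distinct ground-truth masks out of two total instances yield an AP of 1.0 at that threshold.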
