On this page, we provide detailed results listing the performance of all methods for all metrics on all classes. Please refer to the Benchmark Suite for details on the evaluation protocol and metrics. Jump to the individual tables via the following links:
Pixel-Level Semantic Labeling Task
Instance-Level Semantic Labeling Task
Panoptic Semantic Labeling Task
3D Vehicle Detection Task
Pixel-Level Semantic Labeling Task
IoU on class-level
iIoU on class-level
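For orientation (the Benchmark Suite remains the authoritative reference), the pixel-level scores follow the standard definitions:
IoU = TP / (TP + FP + FN)
iIoU = iTP / (iTP + FP + iFN)
where iTP and iFN weight each pixel's contribution by the ratio of the class' average instance size to the size of the ground-truth instance it belongs to, so that large instances do not dominate the class score.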
Instance-Level Semantic Labeling Task
AP on class-level
AP50% on class-level
AP100m on class-level
AP50m on class-level
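For orientation, AP is the region-level average precision averaged over ten overlap thresholds:
AP = mean over o in {0.50, 0.55, ..., 0.95} of AP(o)
AP50% uses a single overlap threshold of 50%, while AP100m and AP50m restrict the evaluation to objects within 100 m and 50 m, respectively; see the Benchmark Suite for the exact protocol.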
Panoptic Semantic Labeling Task
PQ on class-level
SQ on class-level
RQ on class-level
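For orientation, the panoptic scores follow the standard panoptic quality definition, with predicted and ground-truth segments matched at IoU > 0.5:
PQ = sum of IoU over matched segments (TP) / (|TP| + 0.5 |FP| + 0.5 |FN|)
SQ = sum of IoU over matched segments (TP) / |TP|
RQ = |TP| / (|TP| + 0.5 |FP| + 0.5 |FN|)
so that PQ = SQ x RQ, with SQ measuring the average segmentation overlap of matched segments and RQ an F1-style recognition term; the Benchmark Suite describes the exact matching rules.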
3D Vehicle Detection Task
All average metrics