

Poster

DAB-DETR: Dynamic Anchor Boxes are Better Queries for DETR

Shilong Liu · Feng Li · Hao Zhang · Xiao Yang · Xianbiao Qi · Hang Su · Jun Zhu · Lei Zhang

Keywords: [ object detection ] [ transformer ]


Abstract:

In this paper we present a novel query formulation using dynamic anchor boxes for DETR and offer a deeper understanding of the role of queries in DETR. The new formulation directly uses box coordinates as queries in Transformer decoders and dynamically updates them layer by layer. Using box coordinates not only lets us exploit explicit positional priors to improve the query-to-feature similarity and eliminate the slow training convergence of DETR, but also allows us to modulate the positional attention map with the box width and height information. Such a design makes it clear that queries in DETR can be implemented as soft ROI pooling performed layer by layer in a cascaded manner. As a result, our method achieves the best performance among DETR-like detectors under the same setting, e.g., 45.7% AP with an R50 backbone trained for 50 epochs. We also conducted extensive experiments to confirm our analysis and verify the effectiveness of our method. Code will be released soon.
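To make the idea concrete, below is a minimal PyTorch-style sketch of one decoder layer that uses a 4D anchor box (x, y, w, h) as its query: the anchor center is turned into a sinusoidal positional query, the width/height modulate the positional term of cross-attention, and the layer predicts an offset that refines the anchor for the next layer. This is not the authors' released implementation; all module and function names (e.g. `DynamicAnchorDecoderLayer`, `sine_embed`) are hypothetical illustrations of the mechanism described in the abstract.

```python
import math
import torch
import torch.nn as nn


def sine_embed(coords, dim, temperature=10000):
    """Sinusoidal embedding of normalized coordinates in [0, 1].

    coords: (num_queries, batch, k) -> (num_queries, batch, k * dim)
    """
    scale = 2 * math.pi
    dim_t = torch.arange(dim, dtype=torch.float32, device=coords.device)
    dim_t = temperature ** (2 * torch.div(dim_t, 2, rounding_mode="floor") / dim)
    pos = coords.unsqueeze(-1) * scale / dim_t                      # (..., k, dim)
    pos = torch.stack((pos[..., 0::2].sin(), pos[..., 1::2].cos()), dim=-1)
    return pos.flatten(-3)                                          # concat over k


def inverse_sigmoid(x, eps=1e-5):
    x = x.clamp(eps, 1 - eps)
    return torch.log(x / (1 - x))


class DynamicAnchorDecoderLayer(nn.Module):
    """One decoder layer (sketch): cross-attend around the current anchor box,
    then predict an offset that updates the anchor for the next layer."""

    def __init__(self, d_model=256, nhead=8):
        super().__init__()
        self.cross_attn = nn.MultiheadAttention(d_model, nhead)
        self.wh_head = nn.Linear(d_model, 2)       # reference w, h from content
        self.box_head = nn.Sequential(
            nn.Linear(d_model, d_model), nn.ReLU(), nn.Linear(d_model, 4))
        self.norm = nn.LayerNorm(d_model)

    def forward(self, content, anchors, memory, memory_pos):
        # content: (num_queries, batch, d_model); anchors: (num_queries, batch, 4)
        # holding normalized (x, y, w, h); memory: (hw, batch, d_model).
        d_model = content.shape[-1]

        # Positional query comes directly from the anchor center (x, y) ...
        query_pos = sine_embed(anchors[..., :2], dim=d_model // 2)

        # ... and its x/y halves are rescaled by the box width and height,
        # so the positional attention map adapts to the object's scale.
        ref_wh = self.wh_head(content).sigmoid()                    # (nq, b, 2)
        wh_scale = ref_wh / anchors[..., 2:].clamp(min=1e-4)
        query_pos = query_pos * wh_scale.repeat_interleave(d_model // 2, dim=-1)

        attn_out, _ = self.cross_attn(query=content + query_pos,
                                      key=memory + memory_pos,
                                      value=memory)
        content = self.norm(content + attn_out)

        # Layer-by-layer anchor update, predicted in inverse-sigmoid space.
        anchors = (inverse_sigmoid(anchors) + self.box_head(content)).sigmoid()
        return content, anchors.detach()


# Example usage with random tensors (shapes only, for illustration):
layer = DynamicAnchorDecoderLayer()
content = torch.zeros(300, 2, 256)        # 300 queries, batch of 2
anchors = torch.rand(300, 2, 4)           # learned (x, y, w, h) anchor boxes
memory = torch.rand(600, 2, 256)          # flattened encoder features
memory_pos = torch.rand(600, 2, 256)      # positional encoding of the features
content, anchors = layer(content, anchors, memory, memory_pos)
```

Stacking several such layers gives the cascade described above: each layer attends to a soft, scale-adaptive region around its current anchor and then refines that anchor for the next layer.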
