Wentong Li, Yijie Chen, Kaixuan Hu, Jianke Zhu* (arXiv)
- Based on the OrientedRepPoints detector, we achieved 2nd place on Task 2 and 3rd place on Task 1 of the 2021 Learning to Understand Aerial Images (LUAI) challenge. For the detailed code and an introduction, please refer to this repository and Zhihu.
- For detailed installation instructions, please see this CSDN blog (thanks to the author @SSSlasH of this blog).
- The code for MMRotate is now available.
- For general object detection, RepPoints + our APAA obtains a +2.5 AP improvement (36.3 → 38.8) with R-50 on the COCO dataset.
Please refer to for installation and dataset preparation.
This repo is based on . Please see for the basic usage.
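For a quick sanity check after installation and dataset preparation, a minimal inference sketch is given below. It assumes the mmdetection-style Python API (`init_detector` / `inference_detector`); the config and checkpoint paths are hypothetical placeholders, not files shipped with this repo.

```python
# Minimal inference sketch (assumes an mmdetection-style API; the config and
# checkpoint paths below are hypothetical placeholders).
from mmdet.apis import init_detector, inference_detector

config_file = 'configs/oriented_reppoints_r50_demo.py'    # hypothetical path
checkpoint_file = 'work_dirs/oriented_reppoints_r50.pth'  # hypothetical path

# Build the model from the config and load the trained weights.
model = init_detector(config_file, checkpoint_file, device='cuda:0')

# Run inference on a single image patch; the result is a per-class list of
# detections (oriented boxes with scores) in this codebase.
result = inference_detector(model, 'demo/demo_patch.png')
print(result)
```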
The results on the DOTA test set are shown in the table below. For more detailed results, please see the paper.
| Model | Backbone | Data Aug (HSV+Rotation) | mAP | Model | Log |
| --- | --- | --- | --- | --- | --- |
| OrientedRepPoints | R-50 | | 75.97 | model | log |
| OrientedRepPoints | R-101 | | 76.52 | model | log |
| OrientedRepPoints | Swin-Tiny | √ | 78.11 | model | log |
Note:
- The pretrained model swin_tiny_patch4_window7_224 of Swin-Tiny for PyTorch 1.4.0 is here.
- We recommend using our demo configs with 4 GPUs.
- The results are obtained on the original DOTA images, cropped into 1024x1024 patches (a cropping sketch is shown after this list).
- Scale jitter is employed during training; see the paper for more details.
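The official DOTA development kit should be used to split the images when reproducing the numbers above; purely as an illustration of the 1024x1024 cropping mentioned in the notes, here is a minimal sketch. The 200-pixel overlap and the zero-padding of border patches are assumptions for this example.

```python
# Sketch of cropping a large DOTA image into 1024x1024 patches. The 200-pixel
# overlap is an assumption for illustration; use the official DOTA_devkit
# splitting tool to reproduce the reported results exactly.
import numpy as np
from PIL import Image

def crop_patches(image_path, patch_size=1024, overlap=200):
    img = np.array(Image.open(image_path).convert('RGB'))
    h, w = img.shape[:2]
    stride = patch_size - overlap
    patches = []
    for y in range(0, max(h - overlap, 1), stride):
        for x in range(0, max(w - overlap, 1), stride):
            patch = img[y:y + patch_size, x:x + patch_size]
            # Zero-pad border patches so every patch has the same size.
            pad_h = patch_size - patch.shape[0]
            pad_w = patch_size - patch.shape[1]
            if pad_h or pad_w:
                patch = np.pad(patch, ((0, pad_h), (0, pad_w), (0, 0)))
            patches.append(((x, y), patch))  # keep the offset for merging later
    return patches
```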
The mAOE results on the DOTA val set are shown in the table below.
| Model | Backbone | mAOE | Download |
| --- | --- | --- | --- |
| OrientedRepPoints | R-50 | 5.93° | model |
Note: The orientation error evaluation (mAOE) is calculated on the val subset (only the train subset is used for training).
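For intuition only, the sketch below computes a mean absolute orientation error over already-matched prediction/ground-truth angle pairs; the actual per-class matching and averaging protocol for mAOE follows the paper.

```python
# Rough sketch of an orientation-error measure over already-matched pairs of
# predicted and ground-truth boxes (the matching itself is omitted here; the
# exact mAOE protocol follows the paper).
import numpy as np

def mean_orientation_error(pred_angles_deg, gt_angles_deg):
    """Mean absolute angular difference in degrees, wrapped to [0, 90]."""
    diff = np.abs(np.asarray(pred_angles_deg) - np.asarray(gt_angles_deg)) % 180.0
    diff = np.minimum(diff, 180.0 - diff)  # e.g. 179 deg vs 1 deg -> 2 deg error
    return float(diff.mean())

# Example: two matched boxes with small angular deviations -> 3.0 degrees.
print(mean_orientation_error([10.0, 88.0], [12.0, 92.0]))
```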
The visualization code for oriented bounding boxes and learning points is .
- Oriented bounding box
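As a minimal illustration of drawing an oriented bounding box (independent of the repo's own visualization script), the sketch below uses OpenCV's rotated-rectangle convention; the box parameters are made-up values.

```python
# Minimal sketch of drawing one oriented bounding box with OpenCV; the box
# parameters (center, size, angle) are made-up values for illustration.
import cv2
import numpy as np

img = np.zeros((256, 256, 3), dtype=np.uint8)

# ((cx, cy), (w, h), angle in degrees) -- the OpenCV rotated-rect convention.
rect = ((128.0, 128.0), (120.0, 60.0), 30.0)
corners = cv2.boxPoints(rect).astype(np.int32)  # 4 corner points of the box

cv2.polylines(img, [corners], isClosed=True, color=(0, 255, 0), thickness=2)
cv2.imwrite('obb_demo.png', img)
```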
@inproceedings{orientedreppoints,
  title={Oriented RepPoints for Aerial Object Detection},
  author={Li, Wentong and Chen, Yijie and Hu, Kaixuan and Zhu, Jianke},
  booktitle={The IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
  year={2022}
}
Here are some great resources that we benefit from. We would especially like to thank the authors of: