
IEEE BigData Cup 2024: Building Extraction

This repository contains our source code for IEEE BigData Cup 2024: Building Extraction, a competition held at the IEEE BigData 2024 conference. Our model secured second place on the competition leaderboard.

The code is built on the MMDetection framework. The sections below explain how to train a model for the building extraction task and how to test the trained model.

Installation

Please follow the MMDetection installation instructions to install MMCV and MMEngine. After installing them, clone this repository, change into the cloned directory, and install this framework as follows:

$ pip install -v -e .
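To quickly verify the installation, you can check that the core packages import and report their versions. The snippet below is only a small sanity check, not part of the repository; check_install.py is a hypothetical file name.

# check_install.py -- hypothetical helper script, not included in the repository
import mmcv
import mmdet
import mmengine

# Print the installed versions to confirm the editable install is visible to Python.
print('mmcv:', mmcv.__version__)
print('mmdet:', mmdet.__version__)
print('mmengine:', mmengine.__version__)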

Data preparation

Image and annotations

To align with the COCO annotation format, the annotations provided by the competition organizer need to be slightly modified and saved in the data/annotations directory. Please download the competition dataset from this link and convert it as follows:

$ python prepare_bigdata.py --data_path path/to/dataset --save_path data

The modified annotations are stored in the data/annotations directory, and the images are copied to the data/images directory.
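For reference, the converted files follow the standard COCO instance segmentation layout, sketched below as a Python dict. The field names are the usual COCO ones; the numeric values and category name are illustrative assumptions, and the exact contents written by prepare_bigdata.py may differ.

# Sketch of a COCO-style instance segmentation annotation file, shown as a Python dict.
# All numeric values and the category name are placeholders, not taken from the dataset.
coco_annotations = {
    'images': [
        {'id': 1, 'file_name': 'CBD_0001_0_0.jpg', 'width': 512, 'height': 512},
    ],
    'annotations': [
        {
            'id': 1,
            'image_id': 1,
            'category_id': 1,
            'segmentation': [[10.0, 10.0, 60.0, 10.0, 60.0, 60.0, 10.0, 60.0]],  # polygon (x, y) pairs
            'bbox': [10.0, 10.0, 50.0, 50.0],  # [x, y, width, height]
            'area': 2500.0,
            'iscrowd': 0,
        },
    ],
    'categories': [
        {'id': 1, 'name': 'building'},
    ],
}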

Additionally, we used an external dataset, the Alabama Buildings Segmentation dataset. After downloading the dataset, please generate the annotation data with the following command:

$ python prepare_alabama.py --data_path path/to/dataset --save_path data

The final folder structure is as follows:

.
├── data
│   ├── images
│   │   ├── bigdata_train
│   │   │   ├── CBD_0001_0_0.jpg
│   │   │   ├── CBD_0001_0_1.jpg
│   │   │   :
│   │   ├── bigdata_val
│   │   ├── bigdata_test
│   │   └── alabama
│   └── annotations
│       ├── bigdata_train.json
│       ├── bigdata_val.json
│       ├── bigdata_test.json
│       └── alabama.json
:

Initialization parameters

The model is initialized with COCO-pretrained weights. Please download the parameters pretrained on the COCO dataset from this link and store the file at the top level of the repository as follows:

.
├── configs
├── convert_to_submit.py
:
├── rtmdet-ins_x_8xb16-300e_coco_20221124_111313-33d4595b.pth
: 
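For reference, MMDetection configs typically pick up such a checkpoint through the load_from setting; the line below is a minimal sketch of that mechanism, and the actual configs under configs/building may reference the file differently.

# Excerpt from a hypothetical MMDetection config (a Python file):
# initialize training from the COCO-pretrained RTMDet-Ins checkpoint at the repository root.
load_from = 'rtmdet-ins_x_8xb16-300e_coco_20221124_111313-33d4595b.pth'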

Pretrained model

If you want to test our model without training it, you can download the pretrained model parameters from this link.

Training

Training can be started with the following commands.

Training with a single GPU:

$ python tools/train.py configs/building/rtmdet-ins_x_allaug_alabama_val.py

Training with 4 GPUs:

$ ./tools/dist_train.sh configs/building/rtmdet-ins_x_allaug_alabama_val.py 4

The training outputs (checkpoints and logs) will be stored in the work_dirs/rtmdet-ins_x_allaug_alabama_val directory.

Testing

After training is complete, you can evaluate the model on the competition test set. The following command runs inference on the test set and saves the results to pred_test/dets.pkl.

$ python tools/test.py \
      configs/building/rtmdet-ins_x_allaug_alabama_val.py \
      work_dirs/rtmdet-ins_x_allaug_alabama_val/epoch_24.pth \
      --work-dir pred_test \
      --out pred_test/dets.pkl
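If you want to inspect the raw predictions before converting them, the pickle file can be loaded directly. The sketch below assumes dets.pkl holds a list with one prediction entry per test image, which is how recent MMDetection versions dump results; the exact structure may vary with the installed version.

import pickle

# Load the raw detection results dumped by tools/test.py.
with open('pred_test/dets.pkl', 'rb') as f:
    results = pickle.load(f)

print('number of test images:', len(results))
print(results[0])  # typically contains bounding boxes, scores, labels, and masks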

To submit your results to the competition, the output pickle file must be converted to CSV format. Use the command below to convert dets.pkl; by default, the output file is named dets.csv.

$ python convert_to_submit.py --input_path pred_test/dets.pkl
