In this work, we tackle the problem of detecting and segmenting small and complex-shaped buildings in Electro-Optical (EO) and SAR satellite imagery. We propose a novel architecture, the Deep Multi-scale Aware Overcomplete Network (DeepMAO), which comprises an overcomplete branch that focuses on fine structural features and an undercomplete (U-Net) branch that captures coarse, semantic-rich features. Additionally, we propose a novel self-regulating augmentation strategy, “Loss-Mix,” which increases the pixel representation of misclassified pixels.
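To give a rough feel for the two-branch idea, here is a minimal PyTorch sketch: an undercomplete path that downsamples for coarse semantics and an overcomplete path that upsamples beyond the input resolution for fine structure, fused at the input scale. Layer counts, channel widths, and the fusion rule are illustrative assumptions, not the exact configuration used in the paper or this repo.

```python
# Illustrative two-branch sketch (NOT the repo's DeepMAO implementation;
# widths, depths, and fusion are assumptions for demonstration only).
import torch
import torch.nn as nn
import torch.nn.functional as F

class ConvBlock(nn.Module):
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.block = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 3, padding=1),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.block(x)

class TwoBranchSketch(nn.Module):
    """Undercomplete branch: downsample for coarse semantics.
    Overcomplete branch: upsample beyond input size for fine structure.
    Both are resized back to the input resolution and fused."""
    def __init__(self, in_ch=3, num_classes=1):
        super().__init__()
        self.under_enc = ConvBlock(in_ch, 32)
        self.under_dec = ConvBlock(32, 32)
        self.over_enc = ConvBlock(in_ch, 16)
        self.over_dec = ConvBlock(16, 16)
        self.head = nn.Conv2d(32 + 16, num_classes, 1)

    def forward(self, x):
        h, w = x.shape[-2:]
        # Undercomplete path: pool down, then restore to input size.
        u = F.max_pool2d(self.under_enc(x), 2)
        u = F.interpolate(self.under_dec(u), size=(h, w), mode="bilinear",
                          align_corners=False)
        # Overcomplete path: upsample past input size, then bring back.
        o = F.interpolate(self.over_enc(x), scale_factor=2, mode="bilinear",
                          align_corners=False)
        o = F.interpolate(self.over_dec(o), size=(h, w), mode="bilinear",
                          align_corners=False)
        return self.head(torch.cat([u, o], dim=1))

# Quick shape check.
print(TwoBranchSketch()(torch.randn(1, 3, 256, 256)).shape)  # (1, 1, 256, 256)
```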
Check out our paper here and presentation slides here.
We recommend using a conda environment to run the package. Run the following commands to install and activate all the necessary modules:
conda env create -f environment.yml
conda activate solaris_new
You can download the MSAW (Multi-Sensor All-Weather Mapping) dataset here.
Update the data paths in the argparse arguments of 'DeepMAO.py'.
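For orientation, the path arguments will look roughly like the sketch below. The argument names here are placeholders, not the actual flags defined in 'DeepMAO.py'; match them to the names in that file.

```python
# Hypothetical sketch of path arguments (names are placeholders; use the
# argument names actually defined in DeepMAO.py).
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("--train_dir", default="/path/to/MSAW/train",
                    help="directory with training images")
parser.add_argument("--mask_dir", default="/path/to/MSAW/masks",
                    help="directory with building masks")
parser.add_argument("--val_csv", default="/path/to/val_masks.csv",
                    help="CSV listing the validation split")
args = parser.parse_args()
```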
Our training and validation split is based on the official SN6 challenge repo; the validation file list is given in val_masks.csv (see the sketch after the training command below). To train the model from scratch, run the following command:
./train.sh
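If you want to inspect the validation split yourself, you can load val_masks.csv directly. A minimal sketch, assuming the CSV has a header row (the column layout is an assumption; check the actual file):

```python
# Minimal sketch for inspecting the validation split listed in val_masks.csv
# (column layout is an assumption; check the file's actual header).
import pandas as pd

val_df = pd.read_csv("val_masks.csv")
print(len(val_df), "validation entries")
print(val_df.head())
```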
To run inference only, use the checkpoint provided here and remove the '--train' parameter in the train.sh file. The provided checkpoint is only for the EO modality of the MSAW dataset.
To further improve the results, run the post-processing script:
./postprocessing.sh
To view the predictions, run the following commands:
mkdir plots_of_predictions
python visual.py
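visual.py is the supported way to generate plots. If you prefer to overlay a single prediction manually, here is a minimal matplotlib sketch; the file names are placeholders, and it assumes a single-channel prediction mask saved as an image.

```python
# Minimal sketch for overlaying a predicted mask on an input tile
# (file names are placeholders; visual.py is the supported entry point).
import numpy as np
import matplotlib.pyplot as plt
from PIL import Image

image = np.array(Image.open("example_tile.png"))
mask = np.array(Image.open("example_prediction.png").convert("L"))

plt.imshow(image)
plt.imshow(mask, cmap="Reds", alpha=0.4)  # semi-transparent prediction overlay
plt.axis("off")
plt.savefig("plots_of_predictions/example_overlay.png", bbox_inches="tight")
```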
If you find this repo useful for your work, please cite our paper:
@inproceedings{sikdar2023deepmao,
title={DeepMAO: Deep Multi-Scale Aware Overcomplete Network for Building Segmentation in Satellite Imagery},
author={Sikdar, Aniruddh and Udupa, Sumanth and Gurunath, Prajwal and Sundaram, Suresh},
booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
pages={487--496},
year={2023}
}
This repo is based on the SpaceNet 6 challenge.