Master #9 (Open)

wants to merge 579 commits into master
Conversation

bowenroom
Owner

Thanks for your contribution; we appreciate it a lot. The following instructions will help make your pull request healthier and get feedback more easily. If you do not understand some items, don't worry, just make the pull request and seek help from maintainers.

Motivation

Please describe the motivation of this PR and the goal you want to achieve through this PR.

Modification

Please briefly describe what modification is made in this PR.

BC-breaking (Optional)

Does the modification introduce changes that break the backward-compatibility of the downstream repos?
If so, please describe how it breaks the compatibility and how the downstream projects should modify their code to keep compatibility with this PR.

Use cases (Optional)

If this PR introduces a new feature, it is better to list some use cases here, and update the documentation.

Checklist

  1. Pre-commit or other linting tools are used to fix potential lint issues.
  2. The modification is covered by complete unit tests. If not, please add more unit tests to ensure correctness.
  3. If the modification has potential influence on downstream projects, this PR should be tested with downstream projects, like MMDet or MMDet3D.
  4. The documentation has been modified accordingly, e.g. docstrings or example tutorials.

RockeyCoss and others added 30 commits January 11, 2022 12:27
* [Feature] add auto resume

* Update mmseg/utils/find_latest_checkpoint.py

Co-authored-by: Miao Zheng <76149310+MeowZheng@users.noreply.github.com>

* Update mmseg/utils/find_latest_checkpoint.py

Co-authored-by: Miao Zheng <76149310+MeowZheng@users.noreply.github.com>

* modify docstring

* Update mmseg/utils/find_latest_checkpoint.py

Co-authored-by: Miao Zheng <76149310+MeowZheng@users.noreply.github.com>

* add copyright

Co-authored-by: Miao Zheng <76149310+MeowZheng@users.noreply.github.com>
* Fix typo in usage example

* original mosaic code in mmdet

* Adjust mosaic to the semantic segmentation

* Remove bbox test in test_mosaic

* Add unittests

* Fix resize mode for seg_fields

* Fix repr error

* modify Mosaic docs

* modify from Mosaic to RandomMosaic

* Add docstring

* modify Mosaic docstring

* [Docs] Add a blank line before Returns:

* add blank lines

Co-authored-by: MeowZheng <meowzheng@outlook.com>
* Fix typo in usage example

* original MultiImageMixDataset code in mmdet

* Add MultiImageMixDataset unittests in test_dataset_wrapper

* fix lint error

* fix value name ann_file to ann_dir

* modify retrieve_data_cfg (#1)

* remove dynamic_scale & add palette

* modify retrieve_data_cfg method

* modify retrieve_data_cfg func

* fix error

* improve the unittests coverage

* fix unittests error

* Dataset (#2)

* add cfg-options

* Add unittest in test_build_dataset

* add blank line

* add blank line

* add a blank line

Co-authored-by: Miao Zheng <76149310+MeowZheng@users.noreply.github.com>

Co-authored-by: Younghoon-Lee <72462227+Younghoon-Lee@users.noreply.github.com>
Co-authored-by: MeowZheng <meowzheng@outlook.com>
Co-authored-by: Miao Zheng <76149310+MeowZheng@users.noreply.github.com>
* [Feature] add log collector

* Update .dev/log_collector/readme.md

Co-authored-by: Miao Zheng <76149310+MeowZheng@users.noreply.github.com>

* Update .dev/log_collector/example_config.py

Co-authored-by: Miao Zheng <76149310+MeowZheng@users.noreply.github.com>

* fix typo and so on

* modify readme

* fix some bugs and revise the readme.md

* more elegant

* Update .dev/log_collector/readme.md

Co-authored-by: Junjun2016 <hejunjun@sjtu.edu.cn>

Co-authored-by: Miao Zheng <76149310+MeowZheng@users.noreply.github.com>
Co-authored-by: Junjun2016 <hejunjun@sjtu.edu.cn>
* fix stdc1 download link

* fix stdc1 download link
* Update README.md

Update README to add OpenMMLab website and platform link

* Update README_zh-CN.md

Update README_zh-CN to add website and platform link in chinese
* add isprs potsdam dataset

* add isprs dataset configs

* fix lint error

* fix potsdam conversion bug

* fix error in potsdam class

* fix error in potsdam class

* add vaihingen dataset

* add vaihingen dataset

* add vaihingen dataset

* fix some description errors.

* fix some description errors.

* fix some description errors.

* upload models & logs of Potsdam

* remove vaihingen and add unit test

* add chinese readme

* add pseudodataset

* use mmcv and add class_names

* use f-string

* add new dataset unittest

* add docstring and remove global variables args

* fix metafile error in PSPNet

* fix pretrained value

* Add dataset info

* fix typo

Co-authored-by: MengzhangLI <mcmong@pku.edu.cn>
…s default work-dir (#1126)

* [Feature] benchmark can add work_dir and repeat times

* change the parameter's name

* change the name of the log file

* add skp road

* add default work dir

* make it optional

* Update tools/benchmark.py

Co-authored-by: Miao Zheng <76149310+MeowZheng@users.noreply.github.com>

* Update tools/benchmark.py

Co-authored-by: Miao Zheng <76149310+MeowZheng@users.noreply.github.com>

* fix typo

* modify json name

Co-authored-by: Miao Zheng <76149310+MeowZheng@users.noreply.github.com>
* add cocostuff in class_names

* add more class names
* Fix typo in usage example

* original MultiImageMixDataset code in mmdet

* Add MultiImageMixDataset unittests in test_dataset_wrapper

* fix lint error

* fix value name ann_file to ann_dir

* modify retrieve_data_cfg (#1)

* remove dynamic_scale & add palette

* modify retrieve_data_cfg method

* modify retrieve_data_cfg func

* fix error

* improve the unittests coverage

* fix unittests error

* Dataset (#2)

* add cfg-options

* Add unittest in test_build_dataset

* add blank line

* add blank line

* add a blank line

Co-authored-by: Miao Zheng <76149310+MeowZheng@users.noreply.github.com>

* [Fix] Add MultiImageMixDataset unittests

Co-authored-by: Younghoon-Lee <72462227+Younghoon-Lee@users.noreply.github.com>
Co-authored-by: MeowZheng <meowzheng@outlook.com>
Co-authored-by: Miao Zheng <76149310+MeowZheng@users.noreply.github.com>
* Add Vaihingen

* upload models&logs of vaihingen

* fix unit test

* fix dataset pipeline

* fix unit test coverage

* fix vaihingen docstring
* add vaihingen in readme

* add vaihingen in readme

* add vaihingen in readme
* [Docs] Add MultiImageMixDataset tutorial

* modify to randommosaic

* fix markdown
)

* fix README.md in configs

* fix README.md in configs

* modify [ALGORITHM] to [BACKBONE] in backbone config README.md
* segmenter: add model

* update

* readme: update

* config: update

* segmenter: update readme

* segmenter: update

* segmenter: update

* segmenter: update

* configs: set checkpoint path to pretrain folder

* segmenter: modify vit-s/lin, remove data config

* readme: update

* configs: transfer from _base_ to segmenter

* configs: add 8x1 suffix

* configs: remove redundant lines

* configs: cleanup

* first attempt

* swipe CI error

* Update mmseg/models/decode_heads/__init__.py

Co-authored-by: Junjun2016 <hejunjun@sjtu.edu.cn>

* segmenter_linear: use fcn backbone

* segmenter_mask: update

* models: add segmenter vit

* decoders: yapf+remove unused imports

* apply precommit

* segmenter/linear_head: fix

* segmenter/linear_header: fix

* segmenter: fix mask transformer

* fix error

* segmenter/mask_head: use trunc_normal init

* refactor segmenter head

* Fetch upstream (#1)

* [Feature] Change options to cfg-option (#1129)

* [Feature] Change option to cfg-option

* add expire date and fix the docs

* modify docstring

* [Fix] Add <!-- [ABSTRACT] --> in metafile #1127

* [Fix] Fix correct num_classes of HRNet in LoveDA dataset #1136

* Bump to v0.20.1 (#1138)

* bump version 0.20.1

* bump version 0.20.1

* [Fix] revise --option to --options #1140

Co-authored-by: Rockey <41846794+RockeyCoss@users.noreply.github.com>
Co-authored-by: MengzhangLI <mcmong@pku.edu.cn>

* decode_head: switch from linear to fcn

* fix init list formatting

* configs: remove variants, keep only vit-s on ade

* align inference metric of vit-s-mask

* configs: add vit t/b/l

* Update mmseg/models/decode_heads/segmenter_mask_head.py

Co-authored-by: Miao Zheng <76149310+MeowZheng@users.noreply.github.com>

* Update mmseg/models/decode_heads/segmenter_mask_head.py

Co-authored-by: Miao Zheng <76149310+MeowZheng@users.noreply.github.com>

* Update mmseg/models/decode_heads/segmenter_mask_head.py

Co-authored-by: Miao Zheng <76149310+MeowZheng@users.noreply.github.com>

* Update mmseg/models/decode_heads/segmenter_mask_head.py

Co-authored-by: Miao Zheng <76149310+MeowZheng@users.noreply.github.com>

* Update mmseg/models/decode_heads/segmenter_mask_head.py

Co-authored-by: Miao Zheng <76149310+MeowZheng@users.noreply.github.com>

* model_converters: use torch instead of einops

* setup: remove einops

* segmenter_mask: fix missing imports

* add necessary imported init function

* segmenter/seg-l: set resolution to 640

* segmenter/seg-l: fix test size

* fix vitjax2mmseg

* add README and unittest

* fix unittest

* add docstring

* refactor config and add pretrained link

* fix typo

* add paper name in readme

* change segmenter config names

* fix typo in readme

* fix typos in readme

* fix segmenter typo

* fix segmenter typo

* delete redundant comma in config files

* delete redundant comma in config files

* fix convert script

* update latest master version

Co-authored-by: MengzhangLI <mcmong@pku.edu.cn>
Co-authored-by: Junjun2016 <hejunjun@sjtu.edu.cn>
Co-authored-by: Rockey <41846794+RockeyCoss@users.noreply.github.com>
Co-authored-by: Miao Zheng <76149310+MeowZheng@users.noreply.github.com>
* Fix bug in non-distributed training

* Fix bug in non-distributed testing

* delete uncomment lines

* add args.gpus
* [Enhance] New-style CPU training and inference.

* assert mmcv version

* SyncBN to BN in training and testing

* SyncBN to BN in training and testing

* upload untracked files to this branch

* delete gpu_ids

* fix bugs

* assert args.gpu_id in train.py

* use cfg.gpu_ids = [args.gpu_id]

* use cfg.gpu_ids = [args.gpu_id]

* fix typo

* fix typo

* fix typos
* change version to v0.21.0

* change version to v0.21.0

* change version to v0.21.0

* change version to v0.21.0
1. Fix img path typo in `useful_tools.md`, `zh_cn/model_zoo.md`, and `zh_cn/train.md`
2. Add missing content in `zh_cn/useful_tools.md` to match `en/useful_tools.md`
* [Improve] Use MMCV load_state_dict func in ViT/Swin

* use CheckpointLoader instead
* [Improve] Add exception for PointRend to support CPU-only usage

* fixed linting
* Bump v0.21.1

* add improvements in changelog

* add improvements in changelog

* fix cn readme

* change changelog
jinwonkim93 and others added 30 commits January 11, 2023 13:59
)

## Motivation

Add link for high quality synthetic face occlusion dataset.

## Modification

readme.md
## Motivation

The docstring in the class PascalContextDataset59 is misleading. Try to fix it.

## Modification

The docstring in the class PascalContextDataset59 is changed.
….x (#2457)

## Motivation

Introducing new models and features into OpenMMLab's algorithm libraries
has long been criticized as troublesome due to the rigorous requirements
on code quality, which can hinder the fast iteration of SOTA models and
might discourage potential contributors from sharing their latest work here.

Ref: #2412

## Modification

This PR adds a new `projects/` folder, which will be a place for some
experimental models/features. Implementations inside might not be quite
perfect, but are already good enough to produce some exciting results. We
hope that this PR can help us better embrace contributions from our
community. We also add the first example project to illustrate what we
expect a good project to have.
## Motivation

1. CircleCI failed without zlib1g-dev installation
2. Add approve button

## Modification

1. .circleci/
## Motivation

Support `get_classes` and `get_palette` for the Occluded Face dataset.

## Modification

- Add `occludedface_classes()`
- Add `occludedface_palette()`
- Modify `dataset_aliases`
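
For illustration, a minimal sketch of what such helpers can look like in mmseg's `class_names.py` module (the class names, palette colors, and alias entry below are placeholders, not necessarily the dataset's real values):

```python
def occludedface_classes():
    """Occluded Face class names for external use (illustrative values)."""
    return ['background', 'face']


def occludedface_palette():
    """Occluded Face palette for external use (illustrative values)."""
    return [[0, 0, 0], [128, 0, 0]]


# dataset_aliases would gain a matching entry, e.g.:
# dataset_aliases['occludedface'] = ['occludedface', 'occluded_face']
```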
## Motivation

Based on the ImageNet dataset, we propose the ImageNet-S dataset, which has 1.2 million training images and 50k high-quality semantic segmentation annotations, to support unsupervised/semi-supervised semantic segmentation on the ImageNet dataset.

paper:
Large-scale Unsupervised Semantic Segmentation (TPAMI 2022)
[Paper link](https://arxiv.org/abs/2106.03149)

## Modification

1. Support the imagenet-s dataset and its configuration
2. Add the dataset preparation in the documentation
…2500)

## Motivation

I want to fix a bug through this PR. The bug occurs when two options --
`reduce_zero_label=True`, and custom classes are used.
`reduce_zero_label` remaps the GT seg labels by remapping the zero-class
to 255 which is ignored. Conceptually, this should occur *before* the
`label_map` is applied, which maps *already reduced labels*. However,
currently, the `label_map` is applied before the zero label is reduced.

## Modification

The modification is simple:
- I've just interchanged the order of the two operations by moving 4
lines from bottom to top.
- I've added a test that passes when the fix is introduced, and fails on
the original `master` branch.
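
To make the intended ordering concrete, here is a minimal sketch (simplified and illustrative, assuming uint8 labels with 255 as the ignore index; it is not the repository code):

```python
import numpy as np

def convert_labels(gt_seg, label_map, reduce_zero_label):
    """Reduce the zero label first, then remap to the custom-class indices."""
    gt_seg = gt_seg.copy()
    if reduce_zero_label:
        # class 0 becomes the ignore index 255; all other classes shift down by 1
        gt_seg[gt_seg == 0] = 255
        gt_seg = gt_seg - 1
        gt_seg[gt_seg == 254] = 255
    # label_map maps *already reduced* labels, so it must be applied afterwards
    reduced = gt_seg.copy()
    for old_id, new_id in label_map.items():
        gt_seg[reduced == old_id] = new_id
    return gt_seg
```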

## BC-breaking (Optional)

I do not anticipate this change breaking any backward-compatibility.

## Checklist

- [x] Pre-commit or other linting tools are used to fix the potential
lint issues.
  - _I've fixed all linting/pre-commit errors._
- [x] The modification is covered by complete unit tests. If not, please
add more unit test to ensure the correctness.
  - _I've added a unit test._ 
- [x] If the modification has potential influence on downstream
projects, this PR should be tested with downstream projects, like MMDet
or MMDet3D.
  - _I don't think this change affects MMDet or MMDet3D._
- [x] The documentation has been modified accordingly, like docstring or
example tutorials.
- _This change fixes an existing bug and doesn't require modifying any
documentation/docstring._
## Motivation

This fixes #2493. When the `label_map` is created, the index for ignored
classes was being set to -1, whereas the index that is actually ignored
is 255. This worked indirectly since -1 was underflowed to 255 when
converting to uint8.

The same fix was made in the 1.x by #2332 but this fix was never made to
`master`.

## Modification

The only small modification is setting the index of ignored classes to
255 instead of -1.
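
For context, a small illustrative sketch of why the old -1 mapping happened to work and what the fix looks like (the class names here are hypothetical):

```python
import numpy as np

# Why -1 "worked" indirectly: casting -1 to uint8 wraps around to 255.
print(np.array([-1]).astype(np.uint8))  # -> [255]

# With custom classes ('cat', 'dog') chosen from ('background', 'cat', 'dog'),
# the corrected label_map sends the dropped class straight to the ignore index:
label_map = {0: 255, 1: 0, 2: 1}
```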

## Checklist

- [x] Pre-commit or other linting tools are used to fix the potential
lint issues.
  - _I've fixed all linting/pre-commit errors._
- [x] The modification is covered by complete unit tests. If not, please
add more unit test to ensure the correctness.
  - _No unit tests need to be added. Unit tests that are affected were modified._
- [x] If the modification has potential influence on downstream
projects, this PR should be tested with downstream projects, like MMDet
or MMDet3D.
  - _I don't think this change affects MMDet or MMDet3D._
- [x] The documentation has been modified accordingly, like docstring or
example tutorials.
- _This change fixes an existing bug and doesn't require modifying any
documentation/docstring._
…2520)

## Motivation

open-mmlab/mmeval#85

---------

Co-authored-by: Miao Zheng <76149310+MeowZheng@users.noreply.github.com>
Co-authored-by: Zaida Zhou <58739961+zhouzaida@users.noreply.github.com>
## Motivation

Through this PR, I (1) fix a bug, (2) perform some associated cleanup, and (3) add a unit test. The bug occurs during evaluation when two options -- `reduce_zero_label=True` and custom classes -- are used. The bug was that `reduce_zero_label` was not properly propagated (see details below).

## Modification

1. **Bugfix**

The bug occurs [in the initialization of `CustomDataset`](https://github.com/open-mmlab/mmsegmentation/blob/5d49918b3c48df5544213562aa322bfa89d67ef1/mmseg/datasets/custom.py#L108-L110) where the `reduce_zero_label` flag is not propagated to its member `self.gt_seg_map_loader_cfg`:

```python
self.gt_seg_map_loader = LoadAnnotations(
) if gt_seg_map_loader_cfg is None else LoadAnnotations(
    **gt_seg_map_loader_cfg)
```

Because the `reduce_zero_label` flag was not being propagated, the zero label reduction was being [unnecessarily and explicitly duplicated during the evaluation](https://github.com/open-mmlab/mmsegmentation/blob/5d49918b3c48df5544213562aa322bfa89d67ef1/mmseg/core/evaluation/metrics.py#L66-L69).

As pointed out in a previous PR (#2500), `reduce_zero_label` must occur before applying the `label_map`. Due to this bug, the order gets reversed when both features are used simultaneously.

This has been fixed to:

```python
self.gt_seg_map_loader = LoadAnnotations(
    reduce_zero_label=reduce_zero_label, **gt_seg_map_loader_cfg)
```

2. **Cleanup**

Due to the bug fix, since both `reduce_zero_label` and `label_map` are now applied in `get_gt_seg_map_by_idx()` (i.e. `LoadAnnotations.__call__()`), the evaluation does not need to perform them anymore. However, for backward compatibility, the evaluation keeps its previous input arguments.

It was pointed out for `label_map` in a previous issue (#1415) that the `label_map` should not be applied in the evaluation. This was handled by [passing an empty dict](https://github.com/open-mmlab/mmsegmentation/blob/5d49918b3c48df5544213562aa322bfa89d67ef1/mmseg/datasets/custom.py#L306-L311):

```python
# as the labels has been converted when dataset initialized
# in `get_palette_for_custom_classes ` this `label_map`
# should be `dict()`, see
# #1415
# for more ditails
label_map=dict(),
reduce_zero_label=self.reduce_zero_label))
```

Similar to this, I now also set `reduce_zero_label=False` since it is now also being handled by `get_gt_seg_map_by_idx()` (i.e. `LoadAnnotations.__call__()`).

3. **Unit test**

I've added a unit test that tests the `CustomDataset.pre_eval()` function when `reduce_zero_label=True` and custom classes are used. The test fails on the original `master` branch but passes with this fix.

## BC-breaking (Optional)

I do not anticipate this change breaking any backward-compatibility.

## Checklist

- [x] Pre-commit or other linting tools are used to fix the potential lint issues.
  - _I've fixed all linting/pre-commit errors._
- [x] The modification is covered by complete unit tests. If not, please add more unit test to ensure the correctness.
  - _I've added a test that passes when the fix is introduced, and fails on the original master branch._
- [x] If the modification has potential influence on downstream projects, this PR should be tested with downstream projects, like MMDet or MMDet3D.
  - _I don't think this change affects MMDet or MMDet3D._
- [x] The documentation has been modified accordingly, like docstring or example tutorials.
  - _This change fixes an existing bug and doesn't require modifying any documentation/docstring._
The documentation of the use_sigmoid argument in CrossEntropyLoss
currently suggests the sigmoid would be applied in addition to the
softmax function. This change fixes this typo.
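For context, a minimal sketch of how the flag is typically set in a loss config (mmseg's config-dict style); `use_sigmoid` selects one activation or the other, never both:

```python
# softmax-based cross entropy (the default, multi-class case)
loss_softmax = dict(type='CrossEntropyLoss', use_sigmoid=False, loss_weight=1.0)

# sigmoid-based (binary) cross entropy
loss_sigmoid = dict(type='CrossEntropyLoss', use_sigmoid=True, loss_weight=1.0)
```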
Add Twitter, Discord, Medium, and YouTube links
## Motivation

We are from NVIDIA and we have developed a simplified and
inference-efficient transformer for dense prediction tasks. The method
is based on SegFormer with hardware-friendly design choices, resulting
in better accuracy and over 2x faster inference as compared
to the baseline. We believe this model would be of particular interest
to those who want to deploy an efficient vision transformer for
production, and it is easily adaptable to other tasks. Therefore, we
would like to contribute our method to mmsegmentation in order to
benefit a larger audience.

The paper was accepted to the [Transformer for Vision
workshop](https://sites.google.com/view/t4v-cvpr22/papers?authuser=0)
at CVPR 2022; here below are some resource links:
Paper: https://arxiv.org/pdf/2204.13791.pdf (Table 3 shows the semseg results)
Code: https://github.com/NVIDIA/DL4AGX/tree/master/DEST
A webinar on its application: https://www.nvidia.com/en-us/on-demand/session/other2022-drivetraining/

## Modification

Add the backbone (`smit.py`) and head (`dest_head.py`) of DEST.
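
As a rough illustration of how these two components would be wired together in an mmseg config (the registered names `SMIT` and `DESTHead` and the hyper-parameters below are assumptions for the sketch, not the PR's actual config):

```python
# Hypothetical config sketch following mmseg's EncoderDecoder convention.
model = dict(
    type='EncoderDecoder',
    backbone=dict(type='SMIT'),        # assumed registry name for smit.py
    decode_head=dict(
        type='DESTHead',               # assumed registry name for dest_head.py
        num_classes=19,                # e.g. Cityscapes
        loss_decode=dict(type='CrossEntropyLoss', use_sigmoid=False)))
```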

## BC-breaking (Optional)

N/A

## Use cases (Optional)

N/A

---------

Co-authored-by: MeowZheng <meowzheng@outlook.com>
Just fixes a small typo in the example.
## Motivation

Support SegNeXt.

Due to too many commits & changed files caused by the WIP being open too
long (perhaps this could be resolved by `git merge` or `git rebase`),
this PR is created only as a backup of the old PR
#2247

Co-authored-by: MeowZheng <meowzheng@outlook.com>
Co-authored-by: Miao Zheng <76149310+MeowZheng@users.noreply.github.com>
## Motivation

Transfer the keys of each `mscan_x.pth` pretrained model of SegNeXt, and
upload them to the website.

The reason for transferring keys is that we modify the original repo's
[`.dwconv.dwconv.xxx`](https://github.com/Visual-Attention-Network/SegNeXt/blob/main/mmseg/models/backbones/mscan.py#L21)
to
[`.dwconv.xxx`](https://github.com/open-mmlab/mmsegmentation/blob/master/mmseg/models/backbones/mscan.py#L43).
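A minimal sketch of the kind of key remapping this involves (illustrative; the actual conversion script may differ):

```python
import torch

def convert_mscan_keys(src_path, dst_path):
    """Rename '.dwconv.dwconv.' keys to '.dwconv.' in an MSCAN checkpoint."""
    ckpt = torch.load(src_path, map_location='cpu')
    state_dict = ckpt.get('state_dict', ckpt)
    new_state_dict = {
        k.replace('.dwconv.dwconv.', '.dwconv.'): v
        for k, v in state_dict.items()
    }
    torch.save({'state_dict': new_state_dict}, dst_path)
```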
Use the word "Library" instead of using the word "toolbox".
## Motivation

Added Ascend NPU device support in mmseg.

## Modification

The main modification points are as follows:
We added NPU device support in both the DDP and DP scenarios.

## BC-breaking (Optional)

None

## Use cases (Optional)

We tested
[fcn_unet_s5-d16_4x4_512x1024_160k_cityscapes.py](https://github.com/open-mmlab/mmsegmentation/blob/master/configs/unet/fcn_unet_s5-d16_4x4_512x1024_160k_cityscapes.py).
…#2730)

Note that this PR is a modified version of the withdrawn PR
#1748

## Motivation

In recent years, panoptic segmentation has come more into focus
in research. Weber et al.
[[Link]](http://www.cvlibs.net/publications/Weber2021NEURIPSDATA.pdf)
have published a quite nice dataset, which is in the same style as
Cityscapes, but for KITTI sequences. Since Cityscapes and KITTI-STEP
share the same classes and also a comparable domain (dashcam view),
interesting investigations, e.g. about relations between the domains,
can be done.

Note that KITTI-STEP provides panoptic segmentation annotations, which
are out of scope for mmsegmentation.

## Modification

Mostly, I added the new dataset and dataset preparation file. To
simplify the first usage of the new dataset, I also added configs for
the dataset, segformer and deeplabv3plus.

## BC-breaking (Optional)

No BC-breaking

## Use cases (Optional)

Researchers want to test their new methods, e.g. for interpretable AI in
the context of semantic segmentation. They want to show that their
method is reproducible on comparable datasets. Thus, they can compare
Cityscapes and KITTI-STEP.

---------

Co-authored-by: CSH <40987381+csatsurnh@users.noreply.github.com>
Co-authored-by: csatsurnh <cshan1995@126.com>
Co-authored-by: 谢昕辰 <xiexinch@outlook.com>
Thanks for your contribution; we appreciate it a lot. The following
instructions will help make your pull request healthier and get feedback
more easily. If you do not understand some items, don't worry, just
make the pull request and seek help from maintainers.

## Motivation

The focal Tversky loss was proposed in https://arxiv.org/abs/1810.07842.
It has nearly 600 citations and has been shown to be extremely useful
for highly imbalanced (medical) datasets. To add support for the focal
Tversky loss, only a few lines of changes are needed for the Tversky loss.

## Modification

Add `gamma` as (optional) argument in the constructor of `TverskyLoss`.
This parameter is then passed to `tversky_loss` to compute the focal
Tversky loss.
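
For reference, a sketch of how `gamma` turns the Tversky loss into its focal variant, following the paper's formulation (binary, single-class case; the real `TverskyLoss` in mmseg handles multi-class inputs and more options):

```python
import torch

def focal_tversky_loss(pred, target, alpha=0.3, beta=0.7, gamma=1.0, eps=1e-6):
    """Sketch: TI = TP / (TP + alpha*FP + beta*FN), loss = (1 - TI) ** (1/gamma).

    With gamma == 1 this reduces to the plain Tversky loss; gamma > 1 follows
    the focal formulation of https://arxiv.org/abs/1810.07842.
    """
    pred = pred.reshape(-1)
    target = target.reshape(-1).float()
    tp = (pred * target).sum()
    fp = (pred * (1 - target)).sum()
    fn = ((1 - pred) * target).sum()
    tversky_index = (tp + eps) / (tp + alpha * fp + beta * fn + eps)
    return (1 - tversky_index) ** (1 / gamma)
```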

## BC-breaking (Optional)

Does the modification introduce changes that break the
backward-compatibility of the downstream repos?
If so, please describe how it breaks the compatibility and how the
downstream projects should modify their code to keep compatibility with
this PR.

## Use cases (Optional)

If this PR introduces a new feature, it is better to list some use cases
here, and update the documentation.

## Checklist

1. Pre-commit or other linting tools are used to fix potential lint issues.
2. The modification is covered by complete unit tests. If not, please add
more unit tests to ensure correctness.
3. If the modification has potential influence on downstream projects,
this PR should be tested with downstream projects, like MMDet or
MMDet3D.
4. The documentation has been modified accordingly, like docstring or
example tutorials.

Reopening of previous
[PR](#2783).