A version of the semantic segmentation task for regression #849
Comments
@isaaccorley this may be useful for our GHG project.
I actually have a draft of this regression trainer which uses MSE loss. I'll make a PR soon.
@Geethen Can you list a few architectures you would like to see available in the segmentation trainer? From experience, a vanilla UNet with some training tricks is generally hard to beat, which is why we chose segmentation models pytorch as the backend.
@isaaccorley I have heard this before as well. Nevertheless, it would be cool to have some of these recent models available, e.g. SegNeXt and STEGO; there is lots of material in this repo: https://github.com/robmarkcole/satellite-image-deep-learning#Segmentation. Two models with a proven record (they tend to be adaptations of UNet, as you mentioned) are CORN and CORAL for ordinal regression: https://github.com/Raschka-research-group/coral-pytorch
@Geethen, btw the Google Dynamic World paper uses a simple fully convolutional net. I think the global tree cover layer also uses a simple FCN (but it takes in a time series and has recurrent layers).
Summary
Semantic segmentation models do support regression if 'classes' is changed to 1 and the 'activation' is changed to 'identity'. It would be useful to have a task created for this in torchgeo.
Also, it would be useful if segmentation models supported more recent architectures (not a torchgeo problem); that is out of my depth to implement.
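The classes=1 / identity-activation recipe above can be sketched in plain PyTorch. This is a hypothetical minimal stand-in for a segmentation backbone (the tiny conv net here is not torchgeo's trainer or an smp model); the two essential choices are a single output channel and no final activation, so raw outputs can be compared against continuous per-pixel targets with MSE loss:

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for a segmentation model configured for regression,
# i.e. classes=1 and activation="identity" in segmentation-models terms.
class TinyRegressionNet(nn.Module):
    def __init__(self, in_channels: int = 3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_channels, 16, 3, padding=1),
            nn.ReLU(),
            nn.Conv2d(16, 1, 3, padding=1),  # one channel, no final activation
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

model = TinyRegressionNet()
criterion = nn.MSELoss()  # regression loss instead of cross-entropy
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

images = torch.randn(4, 3, 32, 32)    # fake batch of 3-band imagery
targets = torch.randn(4, 1, 32, 32)   # continuous per-pixel targets

optimizer.zero_grad()
preds = model(images)
loss = criterion(preds, targets)
loss.backward()
optimizer.step()
```

With segmentation models pytorch installed, the same idea is presumably just `smp.Unet(classes=1, activation=None)` trained against `nn.MSELoss`.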
Rationale
No response
Implementation
No response
Alternatives
No response
Additional information
No response