The official code for "Improving Deep Regression with Ordinal Entropy" (ICLR 2023). [PDF].
We currently provide detailed code for the experiments on the synthetic dataset, together with a new visualization experiment for easy reproduction.
- run main.py
We add a new visualization experiment on the synthetic dataset for easy reproduction, since the visualization experiments in our paper are on the depth estimation task, which may take some effort to reproduce.
- run vis_tsne.py to obtain the features
- run vis_sphere.py to visualize the obtained features on a sphere
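For reference, the sketch below illustrates how such a sphere plot can be produced. It is not the code in vis_sphere.py; the file names features.npy and targets.npy and the 3-D feature shape are assumptions for illustration only.

```python
# Illustrative sketch only (not the repo's vis_sphere.py): project saved features
# onto the unit sphere and color them by their regression target.
# Assumed inputs: "features.npy" with shape (N, 3) and "targets.npy" with shape (N,).
import numpy as np
import matplotlib.pyplot as plt

features = np.load("features.npy")
targets = np.load("targets.npy")

# Normalize each feature vector so it lies on the unit sphere.
features = features / (np.linalg.norm(features, axis=1, keepdims=True) + 1e-12)

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
sc = ax.scatter(features[:, 0], features[:, 1], features[:, 2],
                c=targets, cmap="viridis", s=5)
fig.colorbar(sc, ax=ax, label="target value")
ax.set_box_aspect((1, 1, 1))
plt.show()
```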
For the Linear task:
- train.npy: the training set
- test.npy: the test set; please download it here.
For the non-linear task:
- train_sde.npy: the training set
- test_sde.npy: the test set
The datasets above are generated with this code: DeepONet.
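To sanity-check the downloaded files before training, a small NumPy sketch like the one below can be used; it only assumes the files are standard .npy arrays in the working directory.

```python
# Quick inspection of the synthetic dataset files (local paths are assumed).
import numpy as np

for name in ["train.npy", "test.npy", "train_sde.npy", "test_sde.npy"]:
    data = np.load(name, allow_pickle=True)
    print(name, getattr(data, "shape", None))  # confirm the array layout before training
```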
The code for the Depth Baseline can be found here:
The code for the Crowd Counting Baseline can be found here:
The ordinal entropy code for the two tasks can be found here:
- ./DepthEstimation&CrowdCounting/OrdinalEntropy.py
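For intuition only, the sketch below shows a simplified ordinal-entropy-style regularizer: normalized features are pushed apart in proportion to the distance between their targets. It is not the loss implemented in OrdinalEntropy.py, and the function name and weighting scheme are illustrative assumptions.

```python
# Simplified, illustrative sketch of an ordinal-entropy-style regularizer.
# This is NOT the exact loss in OrdinalEntropy.py; refer to that file for the real one.
import torch
import torch.nn.functional as F

def ordinal_entropy_sketch(features: torch.Tensor, targets: torch.Tensor) -> torch.Tensor:
    """features: (N, D) encoder outputs, targets: (N,) regression labels."""
    f = F.normalize(features.reshape(features.size(0), -1), dim=1)  # unit-norm features
    feat_dist = torch.cdist(f, f, p=2)                              # pairwise feature distances
    t = targets.reshape(-1, 1).float()
    target_dist = torch.cdist(t, t, p=2)                            # pairwise label distances
    target_dist = target_dist / (target_dist.max() + 1e-12)         # weights in [0, 1]
    # Larger label gaps should correspond to larger feature distances, so the
    # regularizer is the negative of the label-weighted mean feature distance.
    return -(target_dist * feat_dist).mean()
```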
The ordinal entropy can be added to the New-CRFs and CSRNet baselines by:
- change the output of the models from
return x
to
if self.training:
    return x, encoding
else:
    return x
- add the ordinal entropy term to the loss: change
outputs = model(inputs, targets, epoch)
to
outputs, features = model(inputs, targets, epoch)
oe_loss = ordinalentropy(features, targets)
loss = loss + oe_loss
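Put together, the two changes above look roughly like the following schematic; the model, shapes, and criterion below are generic placeholders, not the actual New-CRFs or CSRNet code.

```python
# Schematic placeholder model and loop showing where the encoding is exposed
# and where the ordinal entropy term is added; not the actual baseline code.
import torch
import torch.nn as nn

class RegressionModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(nn.Conv2d(3, 64, 3, padding=1), nn.ReLU())
        self.head = nn.Conv2d(64, 1, 1)

    def forward(self, x):
        encoding = self.encoder(x)      # features used by the ordinal entropy term
        x = self.head(encoding)
        if self.training:
            return x, encoding          # expose the encoding only during training
        else:
            return x

model = RegressionModel().train()
inputs = torch.randn(2, 3, 32, 32)      # placeholder batch
targets = torch.randn(2, 1, 32, 32)     # placeholder regression targets
criterion = nn.MSELoss()

outputs, features = model(inputs)
loss = criterion(outputs, targets)
# oe_loss = ordinalentropy(features, targets)  # from OrdinalEntropy.py
# loss = loss + oe_loss
```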
The visualization results can be obtained by:
- run vis_sphere.py to visualize the obtained features on a sphere
The code for the Age Estimation Baseline can be found here:
The ordinal entropy code for Age Estimation can be found here:
- ./AgeEstimation/OrdinalEntropy.py
The ordinal entropy can be added to the Age Estimation baselines in a similar way to that shown above.
S. Zhang, L. Yang, M. Bi Mi, X. Zheng, A. Yao, "Improving Deep Regression with Ordinal Entropy," in ICLR, 2023. [PDF].