Tensorflow Keras implementation of ordinal regression (also known as ordinal classification) using:

- CORAL: consistent rank logits, by Cao, Mirjalili, & Raschka (2019)
- CORN: conditional ordinal regression for neural networks, by Shi, Cao, & Raschka (2021)
This package includes:

- Ordinal output layers: CoralOrdinal() and CornOrdinal()
- Ordinal loss functions: OrdinalCrossEntropy() and CornOrdinalCrossEntropy()
- Ordinal error metric: MeanAbsoluteErrorLabels()
- Ordinal activation functions: ordinal_softmax() and corn_ordinal_softmax()
- Ordinal label prediction functions: cumprobs_to_label() (see the sketch below)
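To make the connection between these pieces concrete, the sketch below shows how ordinal outputs are typically turned into label predictions: the CoralOrdinal() layer emits NUM_CLASSES - 1 cumulative logits, a sigmoid turns them into cumulative probabilities P(label > k), and the predicted label is the number of those probabilities above 0.5. This is a minimal sketch using plain TensorFlow ops with made-up logit values; the packaged helpers (ordinal_softmax(), cumprobs_to_label()) wrap this kind of conversion with their own signatures.

```python
import tensorflow as tf

# Suppose `logits` came from a CoralOrdinal(num_classes=5) layer:
# shape (batch_size, num_classes - 1), one logit per cumulative threshold.
logits = tf.constant([[2.1, 0.4, -0.3, -1.7],
                      [3.0, 2.2, 1.1, 0.2]])

# Cumulative probabilities P(label > k) for k = 0 .. num_classes - 2.
cumprobs = tf.math.sigmoid(logits)

# Predicted label = number of cumulative probabilities above 0.5.
labels = tf.reduce_sum(tf.cast(cumprobs > 0.5, tf.int32), axis=1)
print(labels.numpy())  # [2 4] for the example logits above
```

For CORN outputs the logits are conditional, so the cumulative probabilities come from a running product of sigmoids (which the corn_ordinal_softmax() activation is meant to handle) before the same thresholding step.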
This is a work in progress, so please post any issues to the issue queue. The package was developed as part of the Berkeley D-Lab's hate speech measurement project and paper (Kennedy et al. 2020).
Acknowledgments: Many thanks to Sebastian Raschka for the help in porting from the PyTorch source repository.
Key pending items:
- Function docstrings
- Docs
- Tests
Install the stable version via pip:

```bash
pip install coral-ordinal
```

Install the most recent code on GitHub via pip:

```bash
pip install git+https://github.com/ck37/coral-ordinal/
```
This package relies on Python 3.6+, Tensorflow 2.2+, and numpy.
This is a quick example to show a basic model implementation. With actual data one would also want to specify the input shape.
```python
import tensorflow as tf
import coral_ordinal as coral

NUM_CLASSES = 5

model = tf.keras.Sequential()
model.add(tf.keras.layers.Dense(32, activation = "relu"))
# The ordinal variable has 5 labels, 0 through 4.
model.add(coral.CoralOrdinal(num_classes = NUM_CLASSES))
model.compile(loss = coral.OrdinalCrossEntropy(),
              metrics = [coral.MeanAbsoluteErrorLabels()])
```
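Here is a hedged end-to-end sketch of fitting such a model and decoding its predictions. The data is random and only meant to make the shapes concrete; the feature dimension (8), epoch count, and sigmoid-threshold decoding are illustrative assumptions rather than part of the package's documented API.

```python
import numpy as np
import tensorflow as tf
import coral_ordinal as coral

NUM_CLASSES = 5

# Random stand-in data: 100 examples, 8 features, labels in 0..4.
X = np.random.normal(size = (100, 8)).astype("float32")
y = np.random.randint(0, NUM_CLASSES, size = 100)

model = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation = "relu", input_shape = (8,)),
    coral.CoralOrdinal(num_classes = NUM_CLASSES),
])
model.compile(loss = coral.OrdinalCrossEntropy(),
              metrics = [coral.MeanAbsoluteErrorLabels()])
model.fit(X, y, epochs = 2, verbose = 0)

# predict() returns the NUM_CLASSES - 1 cumulative logits; decode them
# to integer labels by counting sigmoid probabilities above 0.5.
cumprobs = tf.math.sigmoid(model.predict(X))
predicted_labels = np.sum(cumprobs.numpy() > 0.5, axis = 1)
```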
See this colab notebook for extended examples of ordinal regression with MNIST (multilayer perceptron) and Amazon reviews (universal sentence encoder).
Note that the minimum value of the ordinal variable needs to be 0. If your labeled data ranges from 1 to 5, you will need to subtract 1 so that it is scaled to be 0 to 4.
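As a hypothetical example, a 1-to-5 rating column could be shifted down by one before being passed to the model:

```python
import numpy as np

raw_ratings = np.array([1, 3, 5, 2])  # labels on the original 1-5 scale
labels = raw_ratings - 1              # rescaled to 0-4 for the ordinal layers
```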
Cao, W., Mirjalili, V., & Raschka, S. (2019). Rank-consistent ordinal regression for neural networks. arXiv preprint arXiv:1901.07884.
Shi, X., Cao, W., & Raschka, S. (2021). Deep neural networks for rank-consistent ordinal regression based on conditional probabilities. arXiv preprint arXiv:2111.08851.
Kennedy, C. J., Bacon, G., Sahn, A., & von Vacano, C. (2020). Constructing interval variables via faceted Rasch measurement and multitask deep learning: a hate speech application. arXiv preprint arXiv:2009.10277.