- Release EfficientNet model implementation (#475)
- Add support to convert any PyTorch model to a `ClassyModel`, with the ability to attach heads to it (#461)
- Squeeze-and-Excitation support for `ResNe(X)t` and `DenseNet` models (#426, #427)
- Made `ClassyHook`s registrable (#401) and configurable (#402)
- Migrated to TorchElastic v0.2.0 (#464)
- Add `SyncBatchNorm` support (#423)
- Implement mixup train augmentation (#469)
- Support LARC for the SGD optimizer (#408)
- Added convenience wrappers for `Iterable` datasets (#455)
- Tensorboard improvements:
  - Invalid (`NaN`/`Inf`) loss detection
  - Revamped logging (#478)
- Add `bn_weight_decay` configuration option for `ResNe(X)t` models
- Support specifying `update_interval` to parameter schedulers (#418)
- `ClassificationTask` API improvement and `train_step`, `eval_step` simplification
- Rename `lr` to `value` in parameter schedulers (#417)
- `checkpoint_folder` renamed to `checkpoint_load_path` (#379)
- Head support on `DenseNet` (#383)
- Cleaner abstraction in `ClassyTask`/`ClassyTrainer`: `eval_step`, `on_start`, `on_end`, …
- Speed metrics in Tensorboard (#385)
- `test_phase_period` in `ClassificationTask` (#395)
- Support for losses with trainable parameters (#394)
- Added presets for some typical `ResNe(X)t` configurations (#405)
- Adam optimizer (#301)
- R(2+1)d units (#322)
- Mixed precision training (#338)
- One-hot targets in meters (#349)
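The mixup augmentation blends pairs of samples and their labels with a Beta-sampled mixing weight, following Zhang et al.'s formulation. A minimal numpy sketch of the idea (not the Classy Vision transform itself; the function name and signature here are illustrative):

```python
import numpy as np

def mixup(x1, y1, x2, y2, alpha=0.2, rng=None):
    """Blend two samples and their (one-hot) labels with weight lam ~ Beta(alpha, alpha)."""
    if rng is None:
        rng = np.random.default_rng()
    lam = rng.beta(alpha, alpha)
    x = lam * x1 + (1.0 - lam) * x2
    y = lam * y1 + (1.0 - lam) * y2
    return x, y, lam
```

Because the mixed label is a convex combination of one-hot vectors, mixup naturally pairs with meters that accept one-hot (soft) targets.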
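Supporting one-hot targets in meters amounts to accepting an `(N, num_classes)` target matrix instead of `(N,)` integer indices. A small illustrative helper (not Classy Vision API) showing the conversion:

```python
import numpy as np

def to_one_hot(targets, num_classes):
    """Convert integer class indices of shape (N,) to a one-hot matrix (N, num_classes)."""
    targets = np.asarray(targets)
    one_hot = np.zeros((targets.shape[0], num_classes), dtype=np.float32)
    one_hot[np.arange(targets.shape[0]), targets] = 1.0
    return one_hot
```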
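LARC scales each layer's learning rate by a "trust ratio" of weight norm to gradient norm, clipped at the base learning rate. A hedged numpy sketch of that per-layer computation (the coefficient value and function name are assumptions for illustration, not the wrapped optimizer's API):

```python
import numpy as np

def larc_local_lr(weights, grads, base_lr, trust_coefficient=0.02, eps=1e-8):
    """LARC's clipped per-layer learning rate:
    min(base_lr, trust_coefficient * ||w|| / (||g|| + eps))."""
    w_norm = np.linalg.norm(weights)
    g_norm = np.linalg.norm(grads)
    adaptive_lr = trust_coefficient * w_norm / (g_norm + eps)
    # LARC (clipping mode) never raises the rate above base_lr
    return min(base_lr, adaptive_lr)
```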
This release has been tested on the latest PyTorch (1.4) and torchvision (0.5) releases. It also includes bug fixes and other smaller features.