ICLR 2020 Interesting papers
There is never nothing we can do.
- [Promising results!] U-GAT-IT: Unsupervised Generative Attentional Networks with Adaptive Layer-Instance Normalization for Image-to-Image Translation [TensorFlow]: style transfer
- [Discriminator] Real or not real, that is the question [PyTorch]: changes the discriminator output from a scalar to a distribution (a vector of size 8, trained with a KL loss), similar to an extended soft label (sketch below).
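A minimal sketch of this "realness" idea, assuming features from some backbone; the 8-bin anchor distributions, dimensions, and names are illustrative, not the paper's exact choices:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

NUM_OUTCOMES = 8  # size of the "realness" distribution

class RealnessHead(nn.Module):
    """Discriminator head that outputs a distribution instead of a scalar."""
    def __init__(self, feat_dim=128):
        super().__init__()
        self.fc = nn.Linear(feat_dim, NUM_OUTCOMES)

    def forward(self, features):
        # Log-probabilities over the 8 realness outcomes.
        return F.log_softmax(self.fc(features), dim=1)

# Hypothetical anchor distributions for real and fake samples.
anchor_real = torch.tensor([0.0, 0.0, 0.0, 0.0, 0.1, 0.2, 0.3, 0.4])
anchor_fake = torch.tensor([0.4, 0.3, 0.2, 0.1, 0.0, 0.0, 0.0, 0.0])

def discriminator_loss(log_p_real, log_p_fake):
    # KL(anchor || prediction) for real and fake batches.
    return (F.kl_div(log_p_real, anchor_real.expand_as(log_p_real), reduction="batchmean")
            + F.kl_div(log_p_fake, anchor_fake.expand_as(log_p_fake), reduction="batchmean"))
```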
- [Understanding generative models] Do Deep Generative Models Know What They Don't Know?
- ⭐⭐⭐⭐⭐ [Robust GAN] Optimal strategies against generative attacks [PyTorch]: a GAN-in-the-middle setting; derives the optimal strategy for defending against attacks, with an analysis of how to attack using leaked samples.
- [Loss function] Improving Adversarial Robustness Requires Revisiting Misclassified Examples: bookmarked.
- [Comparative discriminator] Self-Adversarial Learning with Comparative Discrimination for Text Generation
- Controlling generative models with continuous factors of variations
- [Learning parameters as loss weights] You Only Train Once: Loss-Conditional Training of Deep Networks: conditions the network on sampled loss weights, so a single training run covers a whole family of weighted losses (sketch below).
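A toy sketch of loss-conditional training, assuming a simple regressor with two loss terms; every name, dimension, and hyperparameter below is illustrative:

```python
import torch
import torch.nn as nn

class ConditionedNet(nn.Module):
    """Network that receives the loss weight lambda as an extra input."""
    def __init__(self, in_dim=16, out_dim=1):
        super().__init__()
        self.body = nn.Sequential(
            nn.Linear(in_dim + 1, 64), nn.ReLU(), nn.Linear(64, out_dim))

    def forward(self, x, lam):
        # Concatenate the sampled loss weight to every input row.
        return self.body(torch.cat([x, lam.expand(x.size(0), 1)], dim=1))

net = ConditionedNet()
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(100):
    x, y = torch.randn(32, 16), torch.randn(32, 1)
    lam = torch.rand(1, 1)  # sample a new loss weight each step
    pred = net(x, lam)
    # Train on the lambda-weighted loss; L2 vs. L1 is a stand-in here.
    loss = lam * (pred - y).pow(2).mean() + (1 - lam) * (pred - y).abs().mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```

At test time, one sweeps `lam` to get the behavior of many differently weighted models from the single conditioned network.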
- [Augmentation] Adversarial AutoAugment: RL-based adversarial search for augmentation policies.
- Identity Crisis: Memorization and Generalization Under Extreme Overparameterization: studies fully-connected layers; low-level layers tend to learn an identity mapping, while high-level layers output near-constant features.
- Target-Embedding Autoencoders for Supervised Representation Learning: an autoencoder usually reconstructs X, but this paper reconstructs Y (where Y is high-dimensional). Application: multivariate sequence forecasting (sketch below).
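A rough sketch of the target-embedding idea, assuming simple linear encoders; the dimensions and the unweighted sum of losses are assumptions:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TargetEmbeddingAE(nn.Module):
    """Autoencode the high-dimensional target Y; map X into the same latent."""
    def __init__(self, x_dim=32, y_dim=128, z_dim=16):
        super().__init__()
        self.enc_y = nn.Linear(y_dim, z_dim)  # embed the target
        self.dec_y = nn.Linear(z_dim, y_dim)  # reconstruct the target
        self.enc_x = nn.Linear(x_dim, z_dim)  # predict the target embedding

    def loss(self, x, y):
        z_y = self.enc_y(y)
        recon = F.mse_loss(self.dec_y(z_y), y)   # Y -> z -> Y reconstruction
        align = F.mse_loss(self.enc_x(x), z_y)   # X is mapped onto z_Y
        return recon + align
```

At inference only the X branch is used: predict `z = enc_x(x)`, then decode with `dec_y(z)`.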
- ⭐⭐⭐⭐⭐ [Unsupervised clustering] Self-labelling via simultaneous clustering and representation learning [PyTorch]: alternates between representation learning with the current labels and updating the (initially random) labels, where the label assignment is solved as a linear program (sketch below).
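A rough sketch of the balanced label-assignment step, using a Sinkhorn-Knopp relaxation of the linear program; the shapes, epsilon, and iteration count are assumptions:

```python
import torch

def sinkhorn_labels(logits, n_iters=50, eps=0.05):
    """Turn (N, K) model logits into balanced soft pseudo-labels."""
    q = torch.exp((logits - logits.max()) / eps).t()  # (K, N), stabilized
    q /= q.sum()
    K, N = q.shape
    for _ in range(n_iters):
        q /= q.sum(dim=1, keepdim=True)  # rows: equal mass per cluster
        q /= K
        q /= q.sum(dim=0, keepdim=True)  # cols: one unit per sample
        q /= N
    return (q * N).t()  # (N, K): each row is a soft assignment
```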
- [Interpretation] Rotation-invariant clustering of neuronal responses in primary visual cortex
- [Multi-view information] Learning Robust Representations via Multi-View Information Bottleneck
- Mutual Mean-Teaching: Pseudo Label Refinery for Unsupervised Domain Adaptation on Person Re-identification: person re-identification; a mean-teacher setup with four models and a triplet loss.
- Latent Normalizing Flows for Many-to-Many Cross-Domain Mappings [PyTorch]: image captioning
- From Inference to Generation: End-to-end Fully Self-supervised Generation of Human Face from Speech: generates a face from a voice; interesting.
- Deep Graph Matching Consensus [PyTorch]: the core idea (neighbors should carry the same information across two similar images) is good and might be useful elsewhere.
- DropEdge: Towards Deep Graph Convolutional Networks on Node Classification [PyTorch]
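A minimal sketch of the edge-dropping step behind DropEdge, assuming the common (2, E) COO `edge_index` convention; the drop rate is an assumption:

```python
import torch

def drop_edge(edge_index, drop_rate=0.2, training=True):
    """Randomly remove a fraction of edges before each GCN forward pass."""
    if not training or drop_rate == 0.0:
        return edge_index
    num_edges = edge_index.size(1)
    keep = torch.rand(num_edges) >= drop_rate  # Bernoulli keep mask
    return edge_index[:, keep]
```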
- ⭐⭐⭐⭐ A critical analysis of self-supervision, or what we can learn from a single image: self-supervision on a single image can learn high-quality low-level features.
- [Application] Automatically Discovering and Learning New Visual Categories with Ranking Statistics [PyTorch]: even application-oriented work can get into ICLR now.. self-supervision + supervised learning + pseudo-labels + incremental learning
- [Filter noisily labeled data, then train semi-supervised] SELF: Learning to Filter Noisy Labels with Self-Ensembling
- [MixMatch upgrade] ReMixMatch: Semi-Supervised Learning with Distribution Matching and Augmentation Anchoring
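A hedged sketch of the distribution-matching ingredient: guessed label distributions on unlabeled data are rescaled by the ratio of the labeled class marginal to a running mean of model predictions. The momentum value and class names are assumptions:

```python
import torch

class DistributionAlignment:
    """Align guessed label distributions with the labeled class marginal."""
    def __init__(self, num_classes, momentum=0.999):
        self.p_model = torch.full((num_classes,), 1.0 / num_classes)
        self.momentum = momentum

    def __call__(self, probs, p_labeled):
        # probs: (B, C) guessed distributions; p_labeled: (C,) marginal.
        self.p_model = (self.momentum * self.p_model
                        + (1 - self.momentum) * probs.detach().mean(dim=0))
        aligned = probs * (p_labeled / (self.p_model + 1e-6))
        return aligned / aligned.sum(dim=1, keepdim=True)
```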
- [Semi-supervised] DivideMix: Learning with Noisy Labels as Semi-supervised Learning: filters out noisy labels, treats those samples as unlabeled data, and applies pseudo-labels for MixMatch (sketch of the split below).
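A sketch of the loss-based clean/noisy split, assuming per-sample cross-entropy losses as input; the threshold and GMM settings are assumptions:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def split_clean_noisy(losses, threshold=0.5):
    """Fit a 2-component GMM to per-sample losses; low-mean component = clean."""
    losses = np.asarray(losses, dtype=np.float64).reshape(-1, 1)
    losses = (losses - losses.min()) / (losses.max() - losses.min() + 1e-8)
    gmm = GaussianMixture(n_components=2, max_iter=100, reg_covar=5e-4)
    gmm.fit(losses)
    clean_comp = gmm.means_.argmin()
    p_clean = gmm.predict_proba(losses)[:, clean_comp]
    return p_clean > threshold  # boolean mask: True = treated as labeled
```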
- Weakly Supervised Clustering by Exploiting Unique Class Count: predicts the number of classes present within one image.
- [Certainty and diversity] Deep Batch Active Learning by Diverse, Uncertain Gradient Lower Bounds [Oral]
- [Regularization] Simple and Effective Regularization Methods for Training on Noisily Labeled Data with Generalization Guarantee [Code]
- [Semantic segmentation] FasterSeg: Searching for Faster Real-time Semantic Segmentation [PyTorch]: network architecture search + teacher/student knowledge distillation
- [Activation] Enhancing Adversarial Defense by k-Winners-Take-All [PyTorch]: for robustness; similar to ReLU, but keeps a fixed ratio of the largest activations instead of thresholding at 0 (sketch below).
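A minimal sketch of such an activation; the keep-ratio value is an assumption:

```python
import torch
import torch.nn as nn

class KWTA(nn.Module):
    """Keep the largest fraction `ratio` of activations per sample, zero the rest."""
    def __init__(self, ratio=0.1):
        super().__init__()
        self.ratio = ratio

    def forward(self, x):
        flat = x.flatten(start_dim=1)                   # (B, D)
        k = max(1, int(self.ratio * flat.size(1)))
        kth_val = flat.topk(k, dim=1).values[:, -1:]    # k-th largest per sample
        mask = (flat >= kth_val).type_as(flat)
        return (flat * mask).view_as(x)
```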
- [Curriculum Loss] Curriculum Loss: Robust Learning and Generalization against Label Corruption
- [Mixup] Mixup Inference: Better Exploiting Mixup to Defend Adversarial Attacks [PyTorch]: mixes inputs with other samples at inference time to weaken adversarial perturbations and improve robustness (sketch below).
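A hedged sketch of the inference-time mixing, assuming a pool of clean samples to mix with; the mixing weight, number of rounds, and names are illustrative:

```python
import torch

@torch.no_grad()
def mixup_inference(model, x, pool, n_mix=10, lam=0.6):
    """Average predictions over inputs mixed with randomly drawn clean samples."""
    probs = 0.0
    for _ in range(n_mix):
        idx = torch.randint(0, pool.size(0), (x.size(0),))
        mixed = lam * x + (1 - lam) * pool[idx]
        probs = probs + torch.softmax(model(mixed), dim=1)
    return probs / n_mix
```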
- [Optimization] Don't Use Large Mini-batches, Use Local SGD: multi-GPU training; run more local update steps before each communication round (sketch below).
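A toy single-process sketch of the local-update-then-average pattern; all names and the number of local steps are assumptions:

```python
import torch

def local_sgd_round(workers, optimizers, data_iters, loss_fn, local_steps=8):
    """Each worker takes several local SGD steps, then parameters are averaged."""
    for model, opt, batches in zip(workers, optimizers, data_iters):
        for _ in range(local_steps):
            x, y = next(batches)
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()
    # Communication step: average parameters across all workers.
    with torch.no_grad():
        for params in zip(*(w.parameters() for w in workers)):
            mean = torch.stack(list(params)).mean(dim=0)
            for p in params:
                p.copy_(mean)
```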
- [Optimization] Large Batch Optimization for Deep Learning: Training BERT in 76 minutes [TensorFlow]
- Learning to Balance: Bayesian Meta-Learning for Imbalanced and Out-of-distribution Tasks [Code][Oral]
- A Meta-Transfer Objective for Learning to Disentangle Causal Mechanisms
- Unpaired Point Cloud Completion on Real Scans using Adversarial Training [TensorFlow]: GAN + super-resolution
- [Model Compression] Once for All: Train One Network and Specialize it for Efficient Deployment [PyTorch]