v0.9.0

@sleepinyourhat sleepinyourhat released this 06 May 12:24
This is the initial work-in-progress release, coinciding with the launch of SuperGLUE.

Highlights:

We currently support two-phase training (pretraining followed by target-task training) with a choice of shared encoders, including:

  • BERT
  • OpenAI GPT
  • Plain Transformer
  • Ordered Neurons (ON-LSTM) Grammar Induction Model
  • PRPN Grammar Induction Model

We also have support for SuperGLUE baselines, sentence encoder probing experiments, and STILTS-style training.

Examples

Example configurations can be found at https://github.com/nyu-mll/jiant/tree/master/config/examples