After pre-training, convert the model checkpoint to HuggingFace format with:

PYTHONPATH=absolute_path_of(./transformers/) python ./transformers/transformers/convert_segatron_to_huggingface.py --segatron_path <your model file path> --bert_config_file <bert config file path> --huggingface_path <target path>

Here PYTHONPATH must point to the absolute path of the ./transformers/ directory shipped with this repository.
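As a concrete sketch, assuming the repository root is the current working directory and using placeholder paths for the checkpoint, BERT config, and output directory (all three path values below are hypothetical):

# Illustrative invocation only; the three paths below are placeholders.
PYTHONPATH=$(pwd)/transformers python ./transformers/transformers/convert_segatron_to_huggingface.py \
    --segatron_path checkpoints/segatron_model.pt \
    --bert_config_file configs/bert_config.json \
    --huggingface_path converted/segatron-hf/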
To fine-tune the converted model on the GLUE benchmark, run:

./transformers/examples/eval_finetune/train_glue.sh 0,1,2,3,4,5,6,7 16000

and then run the accompanying grid search for GLUE:

./transformers/examples/eval_finetune/grid_search_glue.sh
For SQuAD v1.1:

./transformers/examples/eval_finetune/train_squad.sh 0,1,2,3,4,5,6,7 16000

For SQuAD v2.0:

./transformers/examples/eval_finetune/train_squad2.sh 0,1,2,3,4,5,6,7 16000

For RACE:

./transformers/examples/eval_finetune/run_race.sh 0,1,2,3,4,5,6,7 16000
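The fine-tuning commands above appear to pass a comma-separated list of GPU IDs as the first argument; that reading is an assumption based on the eight-GPU invocations, so check the script headers before relying on it. Under that assumption, a machine with only two GPUs might launch SQuAD fine-tuning as:

# Hypothetical two-GPU run; assumes the first argument selects GPU IDs,
# as suggested by the eight-GPU commands above. The second argument is
# kept unchanged from the reference commands.
./transformers/examples/eval_finetune/train_squad.sh 0,1 16000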