Multi-Mask Tensorized Self-Attention (MTSA) Mechanism

  • This repository contains the experimental code and APIs for MTSA
  • All code is implemented in Python + TensorFlow

Repo Contents

  1. MTSA_API: a universal, unified TensorFlow implementation of Multi-Mask Tensorized Self-Attention (MTSA), including its stacked version.
  2. Proj_SNLI_mtsa: implementation of MTSA on the Stanford Natural Language Inference (SNLI) dataset.
  3. Proj_NLI_Trans_mtsa: implementation of MTSA on SNLI and MultiNLI for transfer learning.
  4. Proj_SRL_mtsa: implementation of MTSA on the Semantic Role Labeling task (CoNLL-05 dataset).
  5. Proj_TREC_mtsa: implementation of MTSA on the TREC question-type classification dataset.
  6. Proj_SST_mtsa: implementation of MTSA on the 5-class Stanford Sentiment Treebank.
  7. Proj_SentCls_mtsa: implementation of MTSA on sentence classification datasets (e.g., CR, MPQA, and SUBJ).
  8. Proj_NMT: implementation of MTSA on neural machine translation.

Requirements

For API

  • Python 2 or Python 3
  • tensorflow>=1.2

For Experimental Projects

  • Python 2 for Proj_SRL_mtsa and Python 3 for the remaining projects
  • tensorflow>=1.2

Python Packages

  • tqdm
  • nltk (with Models/punkt)

Experimental Project Framework

This framework applies to every project except Proj_SRL_mtsa:

PROJECT FILE TREE:

ROOT
--dataset[d]
----glove[d]
----$task_dataset_name$[d]
--src[d]
----model[d]
------template.py[f]
------$model_name$.py[f]
----nn_utils[d]
----utils[d]
------file.py[f]
------nlp.py[f]
------record_log.py[f]
------time_counter.py[f]
----dataset.py[f]
----evaluator.py[f]
----graph_handler.py[f]
----perform_recorder.py[f]
--result[d]
----processed_data[d]
----model[d]
------$model_specific_dir$[d]
--------ckpt[d]
--------log_files[d]
--------summary[d]
--------answer[d]
--configs.py[f]
--$task$_main.py[f]
--$task$_log_analysis.py[f]

The result directory is created on the first run. Each file [f] and directory [d] is detailed as follows:

./configs.py: performs parameter parsing and defines global variables, e.g., parameter definitions/default values; name definitions for the train/dev/test data, model, processed data, checkpoints, etc.; and directory definitions (data, result, $model_specific_dir$, etc.) with their corresponding path generation.
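As an illustration of how such a configuration script typically works, here is a minimal sketch of parameter parsing and parameter-based path generation. All flag names, defaults, and the exact signature of get_params_str are hypothetical, not taken from the repo:

```python
import argparse
import os

def build_parser():
    # Hypothetical flags; the real configs.py defines many more.
    parser = argparse.ArgumentParser()
    parser.add_argument('--network_type', type=str, default='mtsa')
    parser.add_argument('--dropout', type=float, default=0.7)
    return parser

def get_params_str(args, param_names):
    # Join selected name/value pairs into a string usable as a directory name.
    return '_'.join('%s_%s' % (name, getattr(args, name)) for name in param_names)

def model_specific_dir(result_root, args):
    # Derive a $model_specific_dir$-style path from the parameter combination.
    return os.path.join(result_root, 'model',
                        get_params_str(args, ['network_type', 'dropout']))
```

Deriving directory names from the parameter string is what keeps runs with different settings separated: each distinct parameter combination maps to its own result directory.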

./$task$_main.py: the main entry Python script to run the project.

./$task$_log_analysis.py: provides functions to analyze the training log file.

./dataset/: the directory containing the datasets for the current project.

  • ./dataset/glove: contains the pre-trained GloVe files.

  • ./dataset/$task_dataset_name$/: the dataset directory for the current task; it is introduced concretely in each project directory.

  • ./src/dataset.py: a class to process the raw data from the dataset, covering tokenization, token dictionary generation, data digitization, and neural-network input generation. It also provides several methods: generate_batch_sample_iter for random mini-batch iteration, get_statistic for sentence-length statistics, and an interface for removing training samples with overly long sentences.

  • ./src/evaluator.py: a class for model evaluation.

  • ./src/graph_handler.py: a class for handling graph: session initialization, summary saving, model restore etc.

  • ./src/perform_recorder.py: a class that saves the checkpoints of the top-n models by dev accuracy for later loading.

  • ./src/model/: the directory containing the TensorFlow model files.

  • ./src/model/template.py: an abstract Python class providing the network placeholders, global TensorFlow variables, the TF loss and accuracy functions, EMA for trainable variables and summaries, the training operation, feed-dict generation, and the training-step function.

  • ./src/model/$model_name$.py: the main TF neural network model, which extends template.py and implements the abstract build_network interface.

  • ./src/nn_utils/: a package of various TensorFlow layers implemented by the repo author.

  • ./src/utils/file.py: file I/O functions.

  • ./src/utils/nlp.py: natural language processing functions.

  • ./src/utils/record_log.py: a log recorder class, plus a shared instance used throughout the project.

  • ./src/utils/time_counter.py: a time counter class that collects training time; note that it excludes data preparation and counts only the time spent in training steps.
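The random mini-batch iteration provided by generate_batch_sample_iter in ./src/dataset.py can be sketched roughly as follows. This is a simplified stand-in that assumes the dataset holds a plain list of digitized samples; the real implementation is more involved:

```python
import random

def generate_batch_sample_iter(samples, batch_size, shuffle=True):
    # Yield mini-batches in random order, covering each sample once per pass.
    indices = list(range(len(samples)))
    if shuffle:
        random.shuffle(indices)
    for start in range(0, len(indices), batch_size):
        yield [samples[i] for i in indices[start:start + batch_size]]
```

Iterating `for batch in generate_batch_sample_iter(train_samples, 32):` then visits every training sample exactly once per epoch, in a fresh random order each time.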

./result/: the directory holding the results.

  • ./result/processed_data/: a directory holding the pickled Dataset instance. The file name is generated by get_params_str in ./configs.py from the related parameters.
  • ./result/model/$model_specific_dir$/: the name of this directory is likewise generated by get_params_str in ./configs.py, so each combination of parameters is associated with its own directory.
  • ./result/model/$model_specific_dir$/ckpt/: a directory saving the top-n model checkpoints.
  • ./result/model/$model_specific_dir$/log_files/: a directory saving the log files.
  • ./result/model/$model_specific_dir$/summary/: a directory saving TensorBoard summaries and TensorFlow graph meta files.
  • ./result/model/$model_specific_dir$/answer/: a directory saving extra prediction results for some of the projects.
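The top-n checkpoint bookkeeping done by perform_recorder.py can be illustrated with a small sketch. The class below is hypothetical and simplified; the real recorder also manages checkpoint files on disk:

```python
import heapq

class PerformRecorder(object):
    # Keep the n best (dev_accuracy, checkpoint_path) pairs; a min-heap
    # makes the worst retained checkpoint cheap to find and evict.
    def __init__(self, top_n=3):
        self.top_n = top_n
        self.best = []  # heap of (accuracy, path) tuples

    def update(self, accuracy, ckpt_path):
        # Record a new checkpoint; return the evicted checkpoint path
        # (so the caller can delete it on disk), or None if nothing is evicted.
        heapq.heappush(self.best, (accuracy, ckpt_path))
        if len(self.best) > self.top_n:
            return heapq.heappop(self.best)[1]
        return None
```

After each dev evaluation, update is called with the new accuracy; only the n best-performing checkpoints survive, which bounds disk usage during long training runs.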

The details of hyper-parameter parsing are elaborated in configs.py.

Paper Info

The full paper can be viewed here.

Please cite this paper if it or its code is useful for your work.


@inproceedings{shen2019tensorized,
	Author = {Shen, Tao and Zhou, Tianyi and Long, Guodong and Jiang, Jing and Zhang, Chengqi},
	Booktitle = {(NAACL) Annual Conference of the North American Chapter of the Association for Computational Linguistics},
	Pages = {1256--1266},
	Title = {Tensorized Self-Attention: Efficiently Modeling Pairwise and Global Dependencies Together},
	Year = {2019}
}

Contact Information

Please feel free to open an issue if you encounter any problems or bugs.
