word vectors, visualizing attention #3

Open
andreasvc opened this issue Sep 6, 2017 · 2 comments
@andreasvc

How do I obtain the GloVe vectors in the right format? I downloaded the pretrained vectors from https://nlp.stanford.edu/projects/glove/, but it's not clear how to convert them to the expected format.

I tried running the code without word vectors (is this supposed to work?), but I get an exception:

/home/andreas/src/Structured-Self-Attentive-Sentence-Embedding/train.py in train(epoch_number)
     79         total_pure_loss += loss.data
     80
---> 81         if attention:  # add penalization term
     82             attentionT = torch.transpose(attention, 1, 2).contiguous()
     83             extra_loss = Frobenius(torch.bmm(attention, attentionT) - I[:attention.size(0)])

/home/andreas/.local/lib/python2.7/site-packages/torch/autograd/variable.pyc in __bool__(self)
    121             return False
    122         raise RuntimeError("bool value of Variable objects containing non-empty " +
--> 123                            torch.typename(self.data) + " is ambiguous")
    124
    125     __nonzero__ = __bool__

RuntimeError: bool value of Variable objects containing non-empty torch.FloatTensor is ambiguous

(If I replace this condition with False, the code works.)

Lastly, how can I produce the kind of attention visualizations shown in the paper?

@andreasvc
Author

I figured out that the word vector file is generated by running train.py in the linked SNLI example.
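
If you'd rather build the file directly from the downloaded GloVe vectors, a conversion along these lines should work (this assumes the expected format is a torch-saved word-to-vector dict; the exact format and keys should be checked against the loading code in train.py):

import torch

# Hypothetical converter from the plain-text GloVe format to a
# torch-saved word -> FloatTensor dict; check train.py for the
# exact format it actually loads.
word2vec = {}
with open('glove.6B.300d.txt') as f:
    for line in f:
        parts = line.rstrip().split(' ')
        word, values = parts[0], parts[1:]
        word2vec[word] = torch.FloatTensor([float(v) for v in values])
torch.save(word2vec, 'glove.6B.300d.pt')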

As for the exception, replacing if attention: with the following seems to work:

if (attention.data != 0).sum() != 0:
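
To spell out why this works: if attention: asks for the truth value of a Variable holding a multi-element FloatTensor, which PyTorch refuses as ambiguous, whereas the element-wise test reduces to a single number first. A minimal reproduction with the old Variable API from the traceback:

import torch
from torch.autograd import Variable

attention = Variable(torch.zeros(2, 3))
# bool(attention) would raise: "bool value of Variable objects containing
# non-empty torch.FloatTensor is ambiguous"
print((attention.data != 0).sum() != 0)  # False for an all-zeros tensor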

@kaushalshetty

@andreasvc I have done the attention visualization as in the paper. Do check out:
https://github.com/kaushalshetty/Structured-Self-Attention
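
If you want to roll your own, the basic idea is to color each token by its attention weight, e.g. as inline-styled HTML. Here is a minimal sketch with made-up tokens and weights (in a real run, weights would be one row of the model's attention matrix for a sentence):

# Minimal sketch: render one attention hop as an HTML heatmap over tokens,
# similar to the figures in the paper. `tokens` and `weights` are
# placeholder inputs.
def attention_to_html(tokens, weights):
    spans = []
    for tok, w in zip(tokens, weights):
        # Scale the opacity of a red background by the attention weight.
        spans.append('<span style="background-color: rgba(255,0,0,%.2f)">%s</span>'
                     % (w, tok))
    return ' '.join(spans)

tokens = 'this movie was surprisingly good'.split()
weights = [0.05, 0.10, 0.05, 0.50, 0.30]
with open('attention.html', 'w') as f:
    f.write(attention_to_html(tokens, weights))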
