
Live demo site produces different cover text compared to source code #5

Open · arxivcrawler opened this issue Sep 29, 2019 · 3 comments

@arxivcrawler

"Hello, this is my message." is the secret message that I am using. When I put this into the live demo site, the cover text produced is:

" His resignation was held on March 1, 1788, so that he may return to the presidency of the United States in January 1820.

The year 1788 marks the year of Washington's retirement from the presidency. In 1788"

When I run the secret message through the source code, I get this text instead:

" His resignation was declared the official declaration of war in the Union and he called it quits.

He left Arlington Cemetery to pursue his education at the University of Virginia before entering the Navy"

The source code properly decodes it back into the original message when the model used is 'gpt2', which I presume is the small model. I am also presuming the live website uses 'gpt2' as the parameter, since the PPL is measured by gpt2.

The results don't match when I use 'gpt2-medium', and 'gpt2-large' does not work at all (it says it's an invalid parameter). When I upgrade pytorch_transformers to the latest version, it returns the BPE error.
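
As a sanity check on the environment (a minimal sketch, assuming the pytorch_transformers 1.x class attribute `pretrained_vocab_files_map`, which may not exist in other releases), this prints the installed versions and the GPT-2 shortcut names the installed library actually ships. If 'gpt2-large' is not in the list, that would explain the invalid-parameter error:

```python
# Sketch: report library versions and the GPT-2 checkpoints this
# install can resolve by shortcut name. 'gpt2-large' appears to have
# been added to the shortcut maps only in later pytorch_transformers
# releases, so older installs reject it as an invalid parameter.
import torch
import pytorch_transformers
from pytorch_transformers import GPT2Tokenizer

print("pytorch_transformers:", pytorch_transformers.__version__)
print("torch:", torch.__version__)

# pretrained_vocab_files_map is a class attribute on the 1.x tokenizers
shortcuts = GPT2Tokenizer.pretrained_vocab_files_map["vocab_file"].keys()
print("known GPT-2 shortcuts:", sorted(shortcuts))
```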

@flarn2006

I noticed this as well; I wasn't sure what was going on.

@vijeyanidhi

@arxivcrawler
Was the environment you used for gpt2:
pytorch_transformers==1.1.0
torch==1.0.1
bitarray==1.0.1

or was pytorch_transformers at version 1.4.0?

I am not able to decode the message back to the original when I try it with plain gpt2 (not the medium or large version, just gpt2).

I also tried the workaround mentioned in #1, but nothing seems to work for me.

@zhangwei1992-s

pytorch_transformers==1.1.0
torch==1.0.1
bitarray==1.0.1

I just ran the model using the environment described in your reply and it worked. When I update pytorch-transformers to 1.2.0, I am no longer able to decode the message back.
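
If I understand the scheme correctly, arithmetic decoding only inverts when the decoder reproduces exactly the token stream the encoder saw, so a tokenizer/BPE change between releases would be enough to break the round trip. A minimal sketch to test that (using only GPT2Tokenizer.encode from the 1.x API; the probe string is arbitrary): run it once under 1.1.0 and once under 1.2.0 and compare the output.

```python
# Sketch: fingerprint the tokenizer. If this hash differs between
# pytorch_transformers 1.1.0 and 1.2.0, encode-time and decode-time
# tokenization are not byte-identical, which by itself would explain
# the failed decode. The probe string is arbitrary.
import hashlib
from pytorch_transformers import GPT2Tokenizer

tok = GPT2Tokenizer.from_pretrained("gpt2")
probe = "His resignation was declared the official declaration of war"
ids = tok.encode(probe)
print("token ids:", ids)
print("fingerprint:", hashlib.sha1(str(ids).encode("utf-8")).hexdigest())
```

If the fingerprints differ, pinning pytorch_transformers==1.1.0 (the environment above) seems to be the safest option, since a message encoded under one tokenizer presumably cannot be decoded under the other.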
