"Hello, this is my message."
is the secret message that I am using. When I put this into the live demo site, the cover text produced is:
" His resignation was held on March 1, 1788, so that he may return to the presidency of the United States in January 1820.
The year 1788 marks the year of Washington's retirement from the presidency. In 1788"
When I run the secret message through the source code, the text produced is:
" His resignation was declared the official declaration of war in the Union and he called it quits.
He left Arlington Cemetery to pursue his education at the University of Virginia before entering the Navy"
The source code properly decodes it back into the original message when the model used is 'gpt2', which I presume is the small model. I am also presuming the live website uses 'gpt2' as the parameter, since the PPL is measured with gpt2.
The results don't match when I use 'gpt2-medium', and 'gpt2-large' does not seem to work (it says it is an invalid parameter). Also, when I upgrade pytorch_transformers to the latest version, it returns a BPE error.
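This failure mode is consistent with how this kind of language-model steganography works: the decoder only recovers the message if it reproduces the exact next-token probability distributions the encoder used, so both sides must load the same model name under the same library version. A minimal sketch (not the repo's own code; the context string and the comparison are only illustrative) of how different model sizes yield different distributions:

```python
import torch
from pytorch_transformers import GPT2LMHeadModel, GPT2Tokenizer

def next_token_logits(model_name, context):
    # Load the tokenizer and LM for the given model size.
    tokenizer = GPT2Tokenizer.from_pretrained(model_name)
    model = GPT2LMHeadModel.from_pretrained(model_name)
    model.eval()
    ids = torch.tensor([tokenizer.encode(context)])
    with torch.no_grad():
        logits = model(ids)[0]  # (1, seq_len, vocab_size)
    return logits[0, -1]        # scores for the next token

context = "His resignation was"
small = next_token_logits("gpt2", context)
medium = next_token_logits("gpt2-medium", context)
# The two models rank next tokens differently, so a bitstream encoded
# against one model cannot be decoded against the other.
print(torch.equal(small.argsort(), medium.argsort()))  # False
```

That would explain why 'gpt2' round-trips (it matches what the website encoded with) while 'gpt2-medium' produces different cover text and fails to decode the website's output.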
I just ran the model using the environment described in your reply and it worked. When I update pytorch-transformers to 1.2.0, I am not able to decode the message back.
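Since decoding reportedly breaks after moving to pytorch-transformers 1.2.0, one way to guard against the environment drifting is a quick version assertion at startup. A small sketch (the exact known-good version comes from the environment in the reply, so the bound below is only the one reported in this thread):

```python
import pytorch_transformers

# Decoding reportedly fails under 1.2.0 (BPE error / mismatched output),
# so fail fast if the installed version is newer than the tested one.
assert pytorch_transformers.__version__ < "1.2.0", (
    "pin pytorch-transformers below 1.2.0 to match the encoder's tokenizer")
```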