
Not getting randomized output for the same prompt despite seed changing #723

Closed
dogjamboree opened this issue Apr 2, 2023 · 3 comments

@dogjamboree

Recently I noticed that I'm getting the exact same answer to my prompt every time, where this wasn't the case before. I have no idea what triggered this.

I have a version of chat-with-bob.txt where I ask it to tell me a children's story based on preferences I specified in the prompt. Until now it was obviously using a different seed every time, and thus generating a new story every time, but out of nowhere I'm getting an identical story on every run (despite the fact that the output IS showing a different seed on each execution).

As a test, I changed a couple of characters in the prompt in chat-with-bob.txt to see if the output would change. It did indeed generate an entirely different story, but it then exhibited the same behavior and repeatedly gave me the same story when I ran the identical prompt. FYI, I changed "fantasy story" to "fantastical story" in chat-with-bob.txt.

I just did a fresh pull of the code and compiled from scratch a few minutes ago, so everything is up to date. Using the Alpaca/33B-LORA-merged/ model, BTW.

See chat-with-bob.txt and output.txt for examples.

chat-with-bob.txt
output.txt

@msftceo

msftceo commented Apr 3, 2023

Try increasing top_k to be larger than 1. top_k is the size of the pool of candidate tokens the sampler picks from, so with top_k = 1 it always takes the single most likely token and the seed has no effect.
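The effect described above can be sketched in a few lines. This is a hypothetical toy model of top-k sampling, not llama.cpp's actual sampler (which also applies temperature, top-p, and repeat penalties); the function name and toy logits are made up for illustration:

```python
import math
import random

def sample_top_k(logits, top_k, rng):
    """Pick a token id from the top_k highest-scoring logits.

    Illustrative sketch only; `logits` is a list of raw scores,
    one per token in the vocabulary.
    """
    # Keep only the k most likely candidate token ids.
    pool = sorted(range(len(logits)), key=lambda i: logits[i], reverse=True)[:top_k]
    # Softmax over the surviving logits to get sampling weights.
    mx = max(logits[i] for i in pool)
    weights = [math.exp(logits[i] - mx) for i in pool]
    # With top_k == 1 the pool holds a single token, so the draw is
    # deterministic no matter what the RNG seed is.
    return rng.choices(pool, weights=weights, k=1)[0]

logits = [0.1, 2.5, 1.7, 0.3]  # toy scores for a 4-token vocabulary

# top_k = 1: every seed yields the argmax token, id 1.
print({sample_top_k(logits, 1, random.Random(seed)) for seed in range(50)})  # {1}

# top_k = 3: the seed now changes which of the top 3 tokens is drawn.
print({sample_top_k(logits, 3, random.Random(seed)) for seed in range(50)})
```

With top_k = 1 the candidate pool collapses to the argmax, so changing the seed changes nothing; any top_k greater than 1 restores seed-dependent variety.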

@dogjamboree
Author

Wow, you're right, I hadn't realized I changed that setting but I did. That fixed it. Thanks -- closing this issue!

@shubham8550

> Wow, you're right, I hadn't realized I changed that setting but I did. That fixed it. Thanks -- closing this issue!

Hi, I am facing a similar issue right now, and I am using top_k = 40.
