We have recently implemented the Bing Chat API.
The problem is that, unlike ChatGPT, it refuses to perform tasks or to play role-playing games.
With AUTOGPT.PY it sometimes works very well, almost scarily well; other times it refuses to perform tasks (such as writing articles or other tasks it deems unsuitable for it).
On BabyAGI and Camel, on the other hand, it refuses to perform the tasks, because those projects are based on role-playing PromptTemplate prompts, and Bing Chat is set not to execute these role-play tasks.
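For context, the role-play templates those projects use look roughly like this (a minimal sketch with LangChain's PromptTemplate; the wording and variable names are illustrative, not the projects' actual prompts):

```python
from langchain.prompts import PromptTemplate

# Illustrative only: roughly the shape of the role-play instructions
# CAMEL/BabyAGI-style agents send; not the actual CAMEL or BabyAGI templates.
role_play_template = PromptTemplate(
    input_variables=["assistant_role", "task"],
    template=(
        "Never forget you are a {assistant_role}. "
        "You must stay in character and complete the task I give you.\n"
        "Task: {task}"
    ),
)

prompt = role_play_template.format(
    assistant_role="Python Programmer",
    task="Write an article about AI agents",
)
# Bing Chat tends to refuse prompts phrased like this ("you are...", "you must...").
```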
Any suggestions to bypass the problem?
Hmm, I've heard that Bing reacts negatively to messages with an accusatory/imperative tone (using "you must" or "do this").
I wonder if we could add some sort of randomized jailbreak prompt to the default Bing prompts.
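Something along these lines could work as a starting point (a rough sketch; the `soften_prompt` helper and the preamble strings are placeholders I made up, not a tested jailbreak):

```python
import random

# Hypothetical helper: prepend one of several polite, randomized preambles
# instead of a fixed jailbreak string, and reword imperative phrasing.
SOFT_PREAMBLES = [
    "If you are able to, could you please help with the following?",
    "When convenient, I'd appreciate your help with this request:",
    "Here is something I'd like assistance with, if possible:",
]

def soften_prompt(prompt: str) -> str:
    """Rewrite an imperative agent prompt into a softer request for Bing."""
    softened = prompt.replace("You must", "Please").replace("Do this", "Could you")
    return f"{random.choice(SOFT_PREAMBLES)}\n\n{softened}"

print(soften_prompt("You must write an article about AI agents."))
```

Randomizing the preamble per request would also avoid sending the exact same prefix from every user.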
Be careful with that, though, because Microsoft is banning people for using the same jailbreak prompt many times. If you include one in your project, it might get many people banned at the same time!