Suggestion: Chat UI - Add an optional “memory” section for important events. #124
Comments
One way to create a memory of old chat messages is to summarize the chat history with a summarization model like BART. @SillyLossy did this in his fork of the web UI. It is now possible to easily create extensions that define custom prompt generators:

```python
import gradio as gr
import modules.shared as shared
from modules.chat import clean_chat_message
from modules.extensions import apply_extensions
from modules.text_generation import encode, get_max_prompt_length

def custom_generate_chat_prompt(user_input, max_new_tokens, name1, name2, context, chat_prompt_size, impersonate=False):
    user_input = clean_chat_message(user_input)
    rows = [f"{context.strip()}\n"]

    # Reserve room for the soft prompt, if one is loaded
    if shared.soft_prompt:
        chat_prompt_size -= shared.soft_prompt_tensor.shape[1]
    max_length = min(get_max_prompt_length(max_new_tokens), chat_prompt_size)

    # Walk the chat history backwards, adding messages until the prompt is full
    i = len(shared.history['internal']) - 1
    while i >= 0 and len(encode(''.join(rows), max_new_tokens)[0]) < max_length:
        rows.insert(1, f"{name2}: {shared.history['internal'][i][1].strip()}\n")
        if not (shared.history['internal'][i][0] == '<|BEGIN-VISIBLE-CHAT|>'):
            rows.insert(1, f"{name1}: {shared.history['internal'][i][0].strip()}\n")
        i -= 1

    if not impersonate:
        rows.append(f"{name1}: {user_input}\n")
        rows.append(apply_extensions(f"{name2}:", "bot_prefix"))
        limit = 3
    else:
        rows.append(f"{name1}:")
        limit = 2

    # Drop the oldest messages until the prompt fits within the length limit
    while len(rows) > limit and len(encode(''.join(rows), max_new_tokens)[0]) >= max_length:
        rows.pop(1)

    prompt = ''.join(rows)
    return prompt

def ui():
    pass
```

This custom function has access to the complete chat log (stored in `shared.history`).
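For readers wondering what the summarization step might look like, here is a minimal sketch assuming the Hugging Face `transformers` summarization pipeline with a BART checkpoint; the model name, length limits, and `summarize_history` helper are illustrative assumptions, not code from @SillyLossy's fork:

```python
from transformers import pipeline

# Minimal sketch (assumed: model choice, character-based truncation, output lengths)
summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

def summarize_history(internal_history, max_chars=3000):
    # Flatten the (user, bot) pairs from shared.history['internal'] into plain text
    text = "\n".join(f"{user}\n{bot}" for user, bot in internal_history)
    # Keep the input within a size BART can handle; real code would chunk by tokens
    text = text[-max_chars:]
    summary = summarizer(text, max_length=120, min_length=30, do_sample=False)
    return summary[0]["summary_text"]
```

The returned summary could then be inserted as an extra row near the top of `rows` in `custom_generate_chat_prompt`, so the condensed "memory" survives even after the raw messages are trimmed away.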
I see, that would be a more elegant solution compared to manually adding the memory. I'll wait until @SillyLossy's work is completed, then. Thanks for answering :D
A custom field between the character's definition and the chat history is still useful, for example if you want to set a topic for the conversation. Something akin to a softer bias, maybe?
@Xabab that can also be added using a custom prompt extension. I will try to create a prototype with both of these ideas for us to experiment with and see what works.
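To make that more concrete, here is a hedged sketch of what such a prototype extension could look like; the `params` dict, the Gradio textbox, and the `build_prompt_as_above` helper (standing in for the row-building logic from the earlier example) are illustrative assumptions, not an actual implementation:

```python
import gradio as gr

# Sketch only: a user-editable "memory"/topic field injected into every prompt
params = {"memory": ""}

def update_memory(text):
    params["memory"] = text

def ui():
    # Simple textbox the user can edit from the web UI
    textbox = gr.Textbox(value=params["memory"], label="Memory / topic")
    textbox.change(update_memory, textbox, None)

def custom_generate_chat_prompt(user_input, max_new_tokens, name1, name2, context, chat_prompt_size, impersonate=False):
    # Place the memory between the character definition and the chat history
    if params["memory"].strip():
        context = f"{context.strip()}\n{params['memory'].strip()}"
    # build_prompt_as_above is a hypothetical placeholder for the row-building
    # and trimming logic shown in the earlier example
    return build_prompt_as_above(user_input, max_new_tokens, name1, name2, context,
                                 chat_prompt_size, impersonate)
```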
I really love this idea! How do you put the memories into the extension's memory?
This issue has been closed due to inactivity for 30 days. If you believe it is still relevant, please leave a comment below.
No prototype? That seems like an awfully needed feature, with most available language models not being able to process more than 2k tokens at once.
@hpnyaggerman #1548 gives some hope.
Currently, the prompt is built using the character JSON + example dialog + past dialogs.
Past dialogs will be cut off when the total length exceeds the token limit.
Hence, when the dialogs get too long, the character might forget important events (you marry her, turn her into xxx, take her on a date …) from past dialogs.
This can be remedied by either:
Therefore, I suggest we have a "memory" section (inspired by the KoboldAI UI), where the user can manually enter important events. The memory will be added to the prompt so the bot won't forget about it. 🤔
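To make the proposal concrete, here is a hedged sketch of how such a memory section could slot into the prompt assembly described above; the function and parameter names are illustrative, and the token counting is simplified to a character budget:

```python
# Illustrative sketch only: assembling the prompt with a user-entered memory section
def build_prompt(character_context, example_dialogue, memory, history, bot_name, max_chars=6000):
    header = [character_context.strip(), example_dialogue.strip()]
    if memory.strip():
        header.append(f"Memory: {memory.strip()}")  # important events the bot must not forget

    # Add the most recent messages first, dropping older ones once the budget is spent
    budget = max_chars - len("\n".join(header))
    recent = []
    for line in reversed(history):
        if budget - len(line) < 0:
            break
        recent.insert(0, line)
        budget -= len(line)

    return "\n".join(header + recent + [f"{bot_name}:"])
```

Because the memory lives in the fixed header rather than the trimmed history, it survives no matter how long the conversation gets.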