I've read the paper "Beyond Goldfish Memory: Long-Term Open-Domain Conversation", but I still cannot understand where the LTM resides; in debug mode, I can see that BB2 can show [...]. Also, in a scenario where a single instance of BB2 is used by N people, how will it be able to establish long-term knowledge for the right person? Does this mean that if N people use a single instance of BB2, it will write everything those N people have said into its memory, so the memory ends up as shared/consolidated knowledge? If so, does that mean that if you want BB2 to have a memory dedicated to each of the N users, you essentially need N instances of BB2 as well?
Replies: 1 comment
The LongTermMemory module stores the memories; we store both the encodings of the memories and the memories themselves. A read from the memory involves a retrieval step using DPR, such that memories are returned according to a similarity score with the given context. The memory is written to work appropriately with batched inputs, such that the module stores a different set of memories for each batch element (memory is not shared across batch items).
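To make that concrete, here is a minimal sketch of a memory store that keeps both the raw memory text and an encoding, reads by similarity with the current context, and scopes every read and write to a batch index. This is an illustration only, not ParlAI's LongTermMemory code: the names (ToyLongTermMemory, encode, cosine) are made up for this example, and the bag-of-words encoder with cosine scoring stands in for the DPR encoder and its similarity score.

```python
# Sketch only: a toy per-batch-element long-term memory.
# The bag-of-words encoder below is a stand-in for DPR.
import math
from collections import Counter, defaultdict
from typing import Dict, List, Tuple


def encode(text: str) -> Counter:
    # Stand-in for a DPR encoding: a simple bag-of-words vector.
    return Counter(text.lower().split())


def cosine(a: Counter, b: Counter) -> float:
    # Similarity score between two encodings.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


class ToyLongTermMemory:
    """Stores raw memories and their encodings, keyed by batch index."""

    def __init__(self) -> None:
        # batch index -> list of (memory text, memory encoding)
        self._store: Dict[int, List[Tuple[str, Counter]]] = defaultdict(list)

    def write(self, batch_idx: int, memory: str) -> None:
        # Keep both the text and its encoding, as described above.
        self._store[batch_idx].append((memory, encode(memory)))

    def read(self, batch_idx: int, context: str, n_docs: int = 2) -> List[str]:
        # Retrieval step: score only this batch element's memories
        # against the current context and return the top matches.
        ctx = encode(context)
        scored = sorted(
            self._store[batch_idx],
            key=lambda m: cosine(ctx, m[1]),
            reverse=True,
        )
        return [text for text, _ in scored[:n_docs]]


# Two users mapped to two different batch indices: their memories
# never mix, because reads and writes are scoped by batch_idx.
memory = ToyLongTermMemory()
memory.write(0, "I have a dog named Rex.")
memory.write(1, "I play the violin on weekends.")
print(memory.read(0, "Tell me about your dog"))          # only user 0's memories
print(memory.read(1, "What instrument do you play?"))    # only user 1's memories
```

Under this reading, serving N users from one model instance amounts to keeping each user on a stable batch index (or an equivalent per-user key) so their memories stay separate, rather than running N copies of BB2; how you map users to batch elements in a deployment is up to the surrounding application.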