Problem about update_memory #25
According to the paper, the memory module should receive gradients, so the memory needs to be updated before the embeddings are computed. However, the n-th batch's messages cannot be used for that update because of the information-leakage problem, so the (n-1)-th batch's messages are used instead. Only with `memory_update_at_start=True` does the memory module receive gradients and have its parameters updated.
The proper order is: update the memory with the (n-1)-th batch's messages, compute the n-th batch's embeddings and loss from that updated memory, and only then store the n-th batch's raw messages so they are applied at the start of the next batch.
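A toy sketch of that ordering (this is not the repository's implementation; the GRU-based updater, the pending-message dict, and the dummy loss are all made up for illustration, and only the update-before-embed / store-after pattern matters):

```python
import torch
import torch.nn as nn

class ToyMemory(nn.Module):
    """Minimal stand-in for a TGN-style memory with update-at-start semantics."""

    def __init__(self, n_nodes, dim):
        super().__init__()
        self.updater = nn.GRUCell(dim, dim)               # learnable memory updater
        self.register_buffer("memory", torch.zeros(n_nodes, dim))
        self.pending = {}                                  # node -> message from the PREVIOUS batch

    def get_updated_memory(self):
        # Apply pending (previous-batch) messages WITHOUT persisting them,
        # so the result stays in the autograd graph and the updater gets gradients.
        mem = self.memory.clone()
        for node, msg in self.pending.items():
            mem[node] = self.updater(msg.unsqueeze(0), mem[node].unsqueeze(0)).squeeze(0)
        return mem

    def persist_and_store(self, new_messages):
        # Persist the previous-batch updates (detached from the graph) and
        # store the CURRENT batch's messages; they are applied at the next step.
        with torch.no_grad():
            self.memory.copy_(self.get_updated_memory())
        self.pending = {n: m.detach() for n, m in new_messages.items()}

dim, n_nodes = 8, 4
mem = ToyMemory(n_nodes, dim)
opt = torch.optim.Adam(mem.parameters(), lr=1e-3)

for step in range(3):
    batch_messages = {0: torch.randn(dim), 1: torch.randn(dim)}  # fake interactions for this batch
    updated = mem.get_updated_memory()      # memory built from the LAST batch's messages
    loss = updated.pow(2).sum()             # stand-in for the link-prediction loss
    if updated.requires_grad:               # nothing is pending on the very first step
        opt.zero_grad()
        loss.backward()
        opt.step()
    mem.persist_and_store(batch_messages)   # this batch's messages wait for the next step
```

The point of the sketch is that the memory used for the loss is recomputed from stored messages inside the training graph, which is why the updater's parameters receive gradients.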
Hi @shadow150519, after reading your reply I have a few questions I'd like to discuss. Thanks for your reply :)
Hi @emalgorithm, I have some questions from reading your code.

When `memory_update_at_start=True`, the message aggregator and message function (`msg_agg` and `msg_func`) run twice: once before `compute_embedding` and once after it. Before `compute_embedding`, the `get_updated_memory` function computes the memory for all nodes; after `compute_embedding`, the `update_memory` function updates only the positive nodes. The code comment there says "Persist the updates to the memory only for sources and destinations (since now we have new messages for them)", but the messages for this batch are only stored after the memory update, so `update_memory` is actually applying messages from the previous batch. This raises a problem: `update_memory(positives, self.memory.messages)` updates the positive nodes of the current batch, while the messages being applied come from the last batch. I don't understand why the code does this; maybe it's a bug? I think the memory of all nodes should be updated here (or the previous batch's positive nodes should be recorded), or the memory should be persisted directly inside `get_updated_memory` (i.e., replace it with `update_memory`).
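To make the described mismatch concrete, here is a small made-up example (the node IDs are invented; only the set logic reflects the behaviour described above): `positives` comes from the current batch, while `self.memory.messages` still holds the previous batch's messages, so the two sets generally differ.

```python
# Hypothetical node IDs, purely for illustration of the behaviour described above.
messages_from_last_batch = {3, 7, 12}   # nodes that still have stored (previous-batch) messages
positives_this_batch = {7, 20, 31}      # sources/destinations of the current batch

# update_memory(positives, self.memory.messages) only persists updates for
# nodes that are in BOTH sets:
persisted_now = positives_this_batch & messages_from_last_batch        # {7}

# Nodes with pending previous-batch messages that are NOT positives now stay
# un-persisted until they appear in a later batch (get_updated_memory still
# uses their messages when computing embeddings):
left_pending = messages_from_last_batch - positives_this_batch         # {3, 12}

# Positives with no stored messages yet are effectively a no-op here:
no_op = positives_this_batch - messages_from_last_batch                # {20, 31}

print(persisted_now, left_pending, no_op)
```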