The dataset size N is used to calculate the values of weight_regularizer and dropout_regularizer. What if we don't know the dataset size in advance, as in lifelong reinforcement learning, where re-training happens periodically as new data arrives?
I cannot figure out how to use this Concrete Dropout implementation in such cases.
Could you please suggest how to handle such situations with Concrete Dropout? I would really appreciate it.
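One direction I have been considering, as a rough sketch that assumes the regularizers simply scale as 1/N (weight_regularizer ≈ l²/N and dropout_regularizer ≈ 2/N, with the model precision folded into the length scale l), is to keep a running count of all examples seen so far and recompute both coefficients from that count before each periodic re-training round. The names below (concrete_dropout_regularizers, the simulated batch sizes) are just illustrative, not part of this repository:

```python
# Hypothetical sketch: keep a running count of examples seen so far and
# recompute the Concrete Dropout regularizer coefficients from it before
# each periodic re-training round in a lifelong / streaming setting.

length_scale = 1e-4   # assumed prior length scale l (hyper-parameter)

def concrete_dropout_regularizers(n_seen):
    """Regularizer coefficients for the n_seen examples accumulated so far,
    assuming both simply scale as 1/N (model precision folded into l)."""
    weight_regularizer = length_scale ** 2 / n_seen
    dropout_regularizer = 2.0 / n_seen
    return weight_regularizer, dropout_regularizer

# Simulated stream: number of new examples arriving at each re-training step.
n_seen = 0
for batch_size in [1000, 500, 2000]:
    n_seen += batch_size
    wd, dd = concrete_dropout_regularizers(n_seen)
    print(f"N={n_seen}: weight_regularizer={wd:.3e}, dropout_regularizer={dd:.3e}")
    # Here one would rebuild the ConcreteDropout-wrapped layers with the
    # updated coefficients and re-train on the accumulated / replayed data.
```

Since both coefficients shrink as N grows, keeping an out-of-date (too small) N would over-regularize, which is why this sketch recomputes them at every round. Would something along these lines be reasonable?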
Thanks in advance,
Nikhil