Fix: temporarily cap the value of the exp term in torch_sahp.py
iLampard authored Sep 3, 2024
1 parent efe3cdd · commit 940b5a6
Showing 1 changed file with 1 addition and 1 deletion.
easy_tpp/model/torch_model/torch_sahp.py (2 changes: 1 addition & 1 deletion)
@@ -79,7 +79,7 @@ def state_decay(self, encode_state, mu, eta, gamma, duration_t):
         # [batch_size, hidden_dim]
         states = torch.matmul(encode_state, mu) + (
                 torch.matmul(encode_state, eta) - torch.matmul(encode_state, mu)) * torch.exp(
-            -torch.matmul(encode_state, gamma) * duration_t)
+            -torch.matmul(encode_state, gamma) * torch.clip(duration_t, max=10))  # a temp fix to avoid exploding the exp term
         return states

     def forward(self, time_seqs, time_delta_seqs, event_seqs, attention_mask):
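For context: state_decay implements the SAHP decay states = mu_p + (eta_p - mu_p) * exp(-gamma_p * duration_t), where each *_p is a linear projection of the encoder state. Because the gamma projection is learned, it can go negative during training, which makes the exponent positive and lets exp overflow to inf on long inter-event gaps; that is presumably the "exploding" exp term the commit caps. Below is a minimal, self-contained sketch of the patched computation (a standalone function with illustrative shapes, not the repository's method on the model class):

import torch

def state_decay(encode_state, mu, eta, gamma, duration_t):
    # SAHP-style decay from the eta projection toward the mu baseline,
    # at a rate set by the gamma projection of the encoder state.
    mu_p = torch.matmul(encode_state, mu)
    eta_p = torch.matmul(encode_state, eta)
    gamma_p = torch.matmul(encode_state, gamma)
    # Capping duration_t bounds the exponent even when gamma_p < 0,
    # mirroring the commit's temporary fix (cap value 10 from the diff).
    return mu_p + (eta_p - mu_p) * torch.exp(
        -gamma_p * torch.clip(duration_t, max=10))

torch.manual_seed(0)
encode_state = torch.randn(2, 4)             # [batch_size, hidden_dim]
mu, eta, gamma = (torch.randn(4, 4) for _ in range(3))
duration_t = torch.tensor([[1.0e3], [2.0]])  # one very long inter-event gap
# Without the clip, exp(-gamma_p * 1e3) hits inf wherever gamma_p < 0;
# with it, the states stay finite.
print(state_decay(encode_state, mu, eta, gamma, duration_t))

Note the tradeoff: clipping duration_t (rather than the exponent itself) also alters the decayed states for well-behaved gamma_p whenever the gap exceeds 10 time units, effectively freezing the decay beyond that horizon. Clamping the full product -gamma_p * duration_t, or constraining gamma_p to be positive (e.g. via a softplus), would target the overflow more directly; the commit message itself flags this as a temporary fix.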
