Set drop_prob = 0 for causal models (#125)
*Description of changes:* This PR sets `drop_prob = 0` when training causal models, since randomly dropped (missing) values are problematic for causal model training.
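
For illustration, here is a minimal sketch of what `drop_prob` controls and why this change disables it for causal models. This is not the repository's actual code: `apply_drop_augmentation` is a hypothetical helper, and it assumes that dropped values are replaced with NaN during preprocessing.

```python
import numpy as np

def apply_drop_augmentation(target: np.ndarray, drop_prob: float) -> np.ndarray:
    """Hypothetical illustration: randomly replace observed values with NaN
    to augment a (float) training series."""
    if drop_prob <= 0:
        return target
    mask = np.random.rand(len(target)) < drop_prob
    augmented = target.copy()
    augmented[mask] = np.nan
    return augmented

# With this PR, the dropout probability is kept only for seq2seq models:
model_type = "causal"  # or "seq2seq"
drop_prob = 0.2
effective_drop_prob = drop_prob if model_type == "seq2seq" else 0.0

series = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
augmented = apply_drop_augmentation(series, effective_drop_prob)
# For causal models, effective_drop_prob == 0.0, so the series is returned
# unchanged and no artificial missing values enter causal training.
```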


By submitting this pull request, I confirm that you can use, modify,
copy, and redistribute this contribution, under the terms of your
choice.
abdulfatir authored Jun 16, 2024
1 parent 2f92a12 commit d2e0c9d
Showing 1 changed file with 1 addition and 1 deletion.
scripts/training/train.py (1 addition, 1 deletion)
@@ -320,7 +320,7 @@ def __init__(
         self.tokenizer = tokenizer
         self.context_length = context_length
         self.prediction_length = prediction_length
-        self.drop_prob = drop_prob
+        self.drop_prob = drop_prob if model_type == "seq2seq" else 0.0
         self.min_past = min_past or prediction_length
         self.model_type = model_type
         self.imputation_method = imputation_method or LeavesMissingValues()
