The behavior of how `safe_div` uses `default_value` is inconsistent depending on whether `denom` is a `float` or a `Tensor`.

https://github.com/pytorch/captum/blob/master/captum/_utils/common.py#L26-L37

As described in the comments, `default_value` should be used directly as the division result when `denom` is 0. This is indeed implemented accordingly when `denom` is a `float`. However, when `denom` is a `Tensor`, `default_value` is actually used as the default `denom` instead.

I think the 2nd behavior is more reasonable, i.e., when `denom` is a `float`, update to `numerator / (denom if denom != 0.0 else default_value)`. It would also be better to rename the parameter to `default_denom` to avoid further confusion.

The remaining question is whether there is any need or use case for the 1st behavior.
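For context, here is a minimal sketch of the two code paths as described above (an assumption about the rough shape of the function, not the exact implementation at the link; the `1.0` default is also assumed):

```python
import torch
from torch import Tensor
from typing import Union

def safe_div(numerator: Tensor, denom: Union[Tensor, float],
             default_value: float = 1.0):
    if isinstance(denom, (int, float)):
        # float path: default_value is returned as the division *result*
        return numerator / denom if denom != 0.0 else default_value
    # Tensor path: default_value is substituted as the *denominator*
    # wherever denom is zero before dividing
    return numerator / torch.where(denom != 0, denom,
                                   torch.full_like(denom, default_value))

num = torch.tensor([6.0])
safe_div(num, 0.0, 2.0)                  # 2.0 (default used as the result)
safe_div(num, torch.tensor([0.0]), 2.0)  # tensor([3.]) (default used as denom: 6 / 2)
```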
@aobo-y, that's a good point! I agree, let's change it to `return numerator / (denom if denom != 0.0 else default_value)`. `default_denom` is a good name.
Thank you!
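If it helps, a sketch of the unified behavior with the rename applied (just an illustration of the agreed direction, not a merged patch; the `1.0` default is an assumption):

```python
import torch
from torch import Tensor
from typing import Union

def safe_div(numerator: Tensor, denom: Union[Tensor, float],
             default_denom: float = 1.0) -> Tensor:
    # Both paths now treat the default as a fallback *denominator*.
    if isinstance(denom, (int, float)):
        return numerator / (denom if denom != 0.0 else default_denom)
    return numerator / torch.where(denom != 0, denom,
                                   torch.full_like(denom, default_denom))
```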