Merge key_padding_mask into attn_mask_rel_pos in WavLM (pytorch#3265)
Summary: When `key_padding_mask` is not `None`, it needs to be combined with `attn_mask_rel_pos` as one mask for the `scaled_dot_product_attention` function.

Pull Request resolved: pytorch#3265
Reviewed By: hwangjeff
Differential Revision: D44901093
Pulled By: nateanl
fbshipit-source-id: 73ca7af48faf7f4eb36b35b603187a11e5582c70
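The merge described above can be sketched as follows. This is a hedged illustration, not the WavLM code itself: it uses NumPy in place of torch, and the function name `merge_key_padding_mask` is hypothetical. The idea is that a boolean `key_padding_mask` of shape `(batch, seq)` is broadcast over heads and query positions and folded into the additive relative-position mask, so that padded key positions receive `-inf` before softmax.

```python
import numpy as np

def merge_key_padding_mask(attn_mask, key_padding_mask, num_heads):
    """Fold a boolean key padding mask into an additive attention mask.

    attn_mask:        float mask, shape (batch * num_heads, seq, seq),
                      e.g. relative-position bias added to attention logits.
    key_padding_mask: bool mask, shape (batch, seq); True marks padding.
    Returns a merged additive mask of shape (batch * num_heads, seq, seq).
    """
    batch, seq = key_padding_mask.shape
    merged = attn_mask.reshape(batch, num_heads, seq, seq).copy()
    # Broadcast padding over heads (axis 1) and query positions (axis 2);
    # every padded key column is forced to -inf so softmax assigns it ~0.
    pad = np.broadcast_to(key_padding_mask[:, None, None, :], merged.shape)
    merged[pad] = -np.inf
    return merged.reshape(batch * num_heads, seq, seq)

# Usage: batch of 2, 4 heads, sequence length 3, zero base mask.
attn = np.zeros((2 * 4, 3, 3))
pad = np.array([[False, False, True],
                [False, True, True]])
out = merge_key_padding_mask(attn, pad, num_heads=4)
```

After merging, the single `out` tensor can be passed as the one `attn_mask` argument to a scaled-dot-product-attention call, which is the shape of fix this commit applies.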