
Remove torch.jit.fuser("fuser2") in test (#7069)
* [WIP] Remove torch.jit.fuser("fuser2") in test

Internally we're considering removing support for fuser2, so we need to remove this special case from the test.

* completely remove special-casing
davidberard98 committed Jan 10, 2023
1 parent 35f68a0 commit 2b16299
Showing 1 changed file with 1 addition and 7 deletions.
8 changes: 1 addition & 7 deletions test/test_ops.py
@@ -1555,13 +1555,7 @@ def test_jit(self, alpha, gamma, reduction, device, dtype, seed):
     torch.random.manual_seed(seed)
     inputs, targets = self._generate_diverse_input_target_pair(dtype=dtype, device=device)
     focal_loss = ops.sigmoid_focal_loss(inputs, targets, gamma=gamma, alpha=alpha, reduction=reduction)
-    if device == "cpu":
-        scripted_focal_loss = script_fn(inputs, targets, gamma=gamma, alpha=alpha, reduction=reduction)
-    else:
-        with torch.jit.fuser("fuser2"):
-            # Use fuser2 to prevent a bug on fuser: https://github.com/pytorch/pytorch/issues/75476
-            # We may remove this condition once the bug is resolved
-            scripted_focal_loss = script_fn(inputs, targets, gamma=gamma, alpha=alpha, reduction=reduction)
+    scripted_focal_loss = script_fn(inputs, targets, gamma=gamma, alpha=alpha, reduction=reduction)

     tol = 1e-3 if dtype is torch.half else 1e-5
     torch.testing.assert_close(focal_loss, scripted_focal_loss, rtol=tol, atol=tol)
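For reference, the simplified pattern left behind by this change — script the function once, then compare eager and scripted outputs with no per-device fuser special-casing — can be sketched with a toy loss. This is a hypothetical stand-in for illustration; the real test uses torchvision's `ops.sigmoid_focal_loss` and the test class's `_generate_diverse_input_target_pair` helper.

```python
import torch
import torch.nn.functional as F

def toy_loss(inputs: torch.Tensor, targets: torch.Tensor) -> torch.Tensor:
    # Simple stand-in for ops.sigmoid_focal_loss (hypothetical example):
    # any scriptable elementwise loss works for demonstrating the pattern.
    return F.binary_cross_entropy_with_logits(inputs, targets)

# Compile the function with TorchScript, exactly once, for all devices.
script_fn = torch.jit.script(toy_loss)

torch.manual_seed(0)
inputs = torch.randn(4, 3)
targets = torch.rand(4, 3)  # BCE targets must lie in [0, 1]

eager = toy_loss(inputs, targets)
scripted = script_fn(inputs, targets)

# Same assertion style as the test: eager and scripted must agree closely.
torch.testing.assert_close(eager, scripted, rtol=1e-5, atol=1e-5)
```

The point of the commit is that this single call path now runs unconditionally; the CUDA branch that wrapped scripting in `torch.jit.fuser("fuser2")` is gone along with the workaround comment it carried.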
