[PT FE] Fix sporadic issue in quantized tests (#23520)
### Details:
 - *Relax the tolerance condition in quantized tests to remove sporadic failures.*

### Tickets:
 - *CVS-129734*
mvafin authored Mar 19, 2024
1 parent 9c80612 commit 7d5e4af
Showing 1 changed file with 2 additions and 1 deletion.
tests/layer_tests/pytorch_tests/pytorch_layer_test_class.py (3 changes: 2 additions & 1 deletion)
```diff
@@ -196,7 +196,8 @@ def numpy_to_torch_recursively(x):
             if not quantized_ops and n_is_not_close > 0:
                 is_ok = False
                 print("Max diff is {}".format(max_diff))
-            elif quantized_ops and (n_is_not_close > int(np.log10(cur_fw_res.size)) or max_diff > np.array(quant_size + fw_eps).max()):
+            elif quantized_ops and max_diff > np.array(quant_size + fw_eps).max():
+                # To remove sporadic issues, allow any number of errors within 1 quant
                 is_ok = False
                 print("Errors outside threshold range: {} with max diff {}, expected at most {} with max diff {}".format(
                     n_is_not_close, max_diff, int(np.log10(cur_fw_res.size)), quant_size + fw_eps))
```
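In short, the failure criterion for quantized ops drops the element-count bound `n_is_not_close > int(np.log10(cur_fw_res.size))` and keeps only the hard cap on the maximum per-element error. A minimal sketch of the relaxed check is below, assuming `cur_ov_res` as the name of the converted model's output; only `cur_fw_res`, `quant_size`, `fw_eps`, and `max_diff` appear in the diff, so the helper itself is illustrative, not the test class API:

```python
import numpy as np

def quantized_results_close(cur_fw_res, cur_ov_res, quant_size, fw_eps):
    """Sketch of the relaxed acceptance check for quantized ops.

    After this change, the number of mismatching elements (n_is_not_close)
    is not a failure criterion on its own; only the maximum absolute
    difference is compared against one quantization step plus the
    framework epsilon.
    """
    diff = np.abs(np.asarray(cur_fw_res, dtype=np.float64)
                  - np.asarray(cur_ov_res, dtype=np.float64))
    max_diff = diff.max()
    # Allow any number of per-element errors, as long as each one stays
    # within one quant step (plus eps).
    return max_diff <= np.array(quant_size + fw_eps).max()
```

With the old condition, a test could fail intermittently when slightly more than `log10(size)` elements landed one quant step away from the reference, even though every individual error was within tolerance; that count no longer matters.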
