torchbench_bfloat16_training xpu train phlippe_resnet fail_accuracy
The absolute error is not very large; this model could pass if the tolerance were increased to 5e-3.
Public PR to raise tolerance: pytorch/pytorch#134192
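For context, here is a minimal sketch of the kind of RMSE-based accuracy check that produces the log line above. The real check lives in torch/_dynamo/utils.py; the exact pass condition below is an assumption, chosen because it is consistent with the logged numbers and with the claim that tol=5e-3 would let the model pass.

```python
import torch

def rmse(a: torch.Tensor, b: torch.Tensor) -> float:
    """Root-mean-square error between two tensors, computed in fp64."""
    return torch.sqrt(torch.mean((a.double() - b.double()) ** 2)).item()

def passes_accuracy(res, ref, fp64_ref, multiplier=3.0, tol=1e-3):
    # Hypothetical pass condition modeled on the logged values; the real
    # check in torch/_dynamo/utils.py may differ in detail.
    res_error = rmse(res, fp64_ref)  # logged as "RMSE (res-fp64)"
    ref_error = rmse(ref, fp64_ref)  # logged as "RMSE (ref-fp64)"
    return res_error <= multiplier * max(ref_error, tol)

# Plugging in the logged numbers (res_error=0.00734, ref_error=0.00047):
#   tol = 1e-3: 0.00734 <= 3.0 * max(0.00047, 0.001) = 0.003  -> fail_accuracy
#   tol = 5e-3: 0.00734 <= 3.0 * max(0.00047, 0.005) = 0.015  -> pass
```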
This datatype is not included in the Meta dashboard and is not targeted for PT 2.6.
Update weekly accuracy reference (#1223), commit ababdb4
Last reference update: 2024-07-09. Related issues:
- [x] #1216
- [x] #1217
- [x] #1219
- [x] #1220
- [ ] #1221
- [x] #1222
- [ ] #1256
- [ ] #1260
- [ ] #1261
- [ ] #1262
- [ ] #1263
- [ ] #1264
- [ ] #1273
- [ ] #1274
- [ ] #1275
- [ ] #1276
- [ ] #1277
- [ ] #1278
- [ ] #508
- [ ] #509
- [ ] #510
🐛 Describe the bug
torchbench_bfloat16_training, xpu train, phlippe_resnet:

```
E0626 09:53:20.652000 139764145854272 torch/_dynamo/utils.py:1478] RMSE (res-fp64): 0.00734, (ref-fp64): 0.00047 and shape=torch.Size([]). res.dtype: torch.bfloat16, multiplier: 3.000000, tol: 0.001000
fail_accuracy
loading model: 0it [00:00, ?it/s]
```
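A run along these lines should reproduce the failure. The flags come from the standard dynamo benchmark runner under benchmarks/dynamo in the PyTorch repo; the exact invocation used by the CI harness is an assumption.

```bash
# Hypothetical reproduction using the stock torchbench runner.
python benchmarks/dynamo/torchbench.py --accuracy --training --bfloat16 \
    --inductor --device xpu --only phlippe_resnet
```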
Versions
```
torch-xpu-ops: 31c4001
pytorch: 0f81473d7b4a1bf09246410712df22541be7caf3 + PRs: 127277, 129120
device: PVC 1100, 803.61, 0.5.1
```