TPU test flake: test_diagonal_write_transposed_r3 (__main__.TestAtenXlaTensor) #8985


Open
tengyifei opened this issue Apr 16, 2025 · 0 comments
Labels: bug (Something isn't working), testing (Testing and coverage related issues), xla:tpu (TPU specific issues and PRs)

Comments

@tengyifei (Collaborator)

Seen in https://github.com/pytorch/xla/actions/runs/14486615445/job/40635200589

======================================================================
FAIL: test_diagonal_write_transposed_r3 (__main__.TestAtenXlaTensor)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/runner/_work/xla/xla/pytorch/xla/test/test_operations.py", line 1399, in test_diagonal_write_transposed_r3
    self.runAtenTest([torch.randn(5, 8, 7)], test_fn)
  File "/home/runner/_work/xla/xla/pytorch/xla/test/test_utils.py", line 393, in runAtenTest
    self.compareResults(results, xla_results, rel_err=rel_err, abs_err=abs_err)
  File "/home/runner/_work/xla/xla/pytorch/xla/test/test_utils.py", line 378, in compareResults
    self.assertEqualRel(
  File "/home/runner/_work/xla/xla/pytorch/xla/test/test_utils.py", line 332, in assertEqualRel
    self.fail('Relative error higher than the maximum tolerance')
AssertionError: Relative error higher than the maximum tolerance

----------------------------------------------------------------------
Ran 211 tests in 62.095s

FAILED (failures=1, skipped=27)
a=1.64124596118927	b=2.545245885848999	diff=0.903999924659729	index=(1, 1, 0)
a=1.0509189367294312	b=1.9549188613891602	diff=0.903999924659729	index=(2, 1, 1)
a=0.24387085437774658	b=1.1478707790374756	diff=0.903999924659729	index=(3, 1, 2)
a=-0.4914337992668152	b=0.4125661849975586	diff=0.9039999842643738	index=(4, 1, 3)

max_diff=0.9039999842643738	max_rel=1.8395153284072876	index=(4, 1, 3)
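
All four mismatched elements lie on the diagonal t[j+1, 1, j] for j = 0..3 and differ by the same constant (~0.904), which looks consistent with an in-place write through a diagonal view being reflected on one backend but not the other. Below is a rough standalone sketch of the kind of operation the test appears to exercise, inferred from the test name and the mismatch indices; the exact test_fn body, offset, and written value in test_operations.py may differ.

```python
# Hypothetical standalone sketch (NOT the verbatim test body): write in-place
# through a diagonal view taken across transposed dims of a rank-3 tensor,
# then compare CPU and XLA results.
import torch
import torch_xla.core.xla_model as xm

def diagonal_write(t):
    # With dim1=2, dim2=0, offset=1 the view covers t[j+1, :, j] for j = 0..3,
    # so for a (5, 8, 7) input it has shape (8, 4) with the diagonal last.
    d = torch.diagonal(t, offset=1, dim1=2, dim2=0)
    d[1] += 0.904  # constant chosen to mirror the diff seen in the log
    return t

torch.manual_seed(0)
base = torch.randn(5, 8, 7)

cpu_out = diagonal_write(base.clone())
xla_out = diagonal_write(base.clone().to(xm.xla_device()))

# .cpu() forces the lazy XLA graph to execute before comparing; the flake
# shows up as a mismatch along the written diagonal.
print(torch.allclose(cpu_out, xla_out.cpu(), rtol=1e-3, atol=1e-3))
```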

ysiraichi added the bug, testing, and xla:tpu labels on Apr 17, 2025