Enable skipIfXpu to support class-level skipping #151420
EikanWang wants to merge 8 commits into gh/EikanWang/81/base from
Conversation
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/151420
Note: Links to docs will display an error until the docs builds have been completed.
✅ You can merge normally! (4 Unrelated Failures)
As of commit c5787b8 with merge base cd7bc60 (FLAKY - The following jobs failed but were likely due to flakiness present on trunk:
This comment was automatically generated by Dr. CI and updates every 15 minutes.
    )
    expected_ref = torch.xpu.is_available()
    expected_res = "test_case_1" not in rc
    self.assertEqual(expected_ref, expected_res)
Since the test class is decorated with skipIfXpu, all of its test cases are skipped only on XPU.
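A minimal sketch of the behaviour this comment describes, using `unittest.skipIf` as a standard-library stand-in for PyTorch's `skipIfXpu` and a hard-coded `XPU_AVAILABLE` flag in place of `torch.xpu.is_available()` (both are assumptions for illustration, not the PR's actual implementation):

```python
import contextlib
import io
import unittest

# Hypothetical stand-in for torch.xpu.is_available(); hard-coded to True
# so the skip behaviour can be demonstrated without a real XPU build.
XPU_AVAILABLE = True

# Class-level decoration: every test method in the class is skipped when
# the condition holds, mirroring what class-level skipIfXpu enables.
@unittest.skipIf(XPU_AVAILABLE, "skipped on XPU")
class TestSkippedOnXpu(unittest.TestCase):
    def test_case_1(self):
        print("test_case_1")  # body never runs when the class is skipped

    def test_case_2(self):
        print("test_case_2")  # likewise never runs

buf = io.StringIO()
with contextlib.redirect_stdout(buf):
    unittest.TextTestRunner(stream=io.StringIO()).run(
        unittest.defaultTestLoader.loadTestsFromTestCase(TestSkippedOnXpu)
    )
rc = buf.getvalue()
# Neither literal appears in the captured output: the whole class was skipped.
print("test_case_1" not in rc, "test_case_2" not in rc)  # → True True
```

This matches the assertion pattern in the diff above: when XPU is available, `"test_case_1" not in rc` is expected to be true for every test in the class.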
    expected_ref = torch.xpu.is_available()
    expected_res = "test_case_2" not in rc
    self.assertEqual(expected_ref, expected_res)
test_case_2 is decorated with skipIfXpu while the other test cases are not. Therefore, the literal "test_case_2" should not appear in the output when XPU is available.
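The method-level counterpart can be sketched the same way, again with `unittest.skipIf` standing in for `skipIfXpu` and a hard-coded `XPU_AVAILABLE` flag in place of `torch.xpu.is_available()` (both assumptions for illustration): only the decorated test is skipped, so its literal is absent from the output while the others appear.

```python
import contextlib
import io
import unittest

# Hypothetical stand-in for torch.xpu.is_available(); assumed True here.
XPU_AVAILABLE = True

class TestPartialSkip(unittest.TestCase):
    def test_case_1(self):
        print("test_case_1")  # runs: this method is not decorated

    # Method-level decoration: only this one test is skipped on "XPU".
    @unittest.skipIf(XPU_AVAILABLE, "skipped on XPU")
    def test_case_2(self):
        print("test_case_2")  # never runs when XPU is "available"

buf = io.StringIO()
with contextlib.redirect_stdout(buf):
    unittest.TextTestRunner(stream=io.StringIO()).run(
        unittest.defaultTestLoader.loadTestsFromTestCase(TestPartialSkip)
    )
rc = buf.getvalue()
# test_case_1 ran and printed; test_case_2 was skipped, as the comment states.
print("test_case_1" in rc, "test_case_2" not in rc)  # → True True
```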
@pytorchbot rebase

@pytorchbot started a rebase job onto refs/remotes/origin/viable/strict. Check the current status here

Rebase failed due to Command Raised by https://github.com/pytorch/pytorch/actions/runs/14999417333

Looks like this PR hasn't been updated in a while so we're going to go ahead and mark this as
Stack from ghstack (oldest at bottom):