Add test that `import torch` doesn't modify global logging state #87629
zou3519 wants to merge 2 commits into gh/zou3519/560/base
Conversation
Fixes #87626. Also adds the same test for `import functorch`. Users have complained when we modify the global logging state, which has happened in the past.

Test Plan:
- tested locally; I added `logging.basicConfig` to `torch/__init__.py` and checked that the test got triggered

[ghstack-poisoned]
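The approach described above can be sketched as a subprocess check: launch a fresh interpreter, import the module, and assert that nothing was written to stdout or stderr. This is a minimal sketch under stated assumptions, not the PR's actual test; `import_is_silent` is a hypothetical helper, and `json` stands in for `torch` so the sketch runs anywhere.

```python
import subprocess
import sys

def import_is_silent(module_name: str) -> bool:
    # Hypothetical helper (not from the PR): run the import in a fresh
    # interpreter with all warnings enabled, capturing stderr alongside
    # stdout. A well-behaved import prints nothing.
    out = subprocess.check_output(
        [sys.executable, "-W", "all", "-c", f"import {module_name}"],
        stderr=subprocess.STDOUT,
    ).decode("utf-8")
    return out == ""

print(import_is_silent("json"))
```

Comparing the captured output against the empty string is what makes the Windows warning discussed below matter: any warning emitted during import lands in the captured text and fails the comparison.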
🔗 Helpful Links: 🧪 See artifacts and rendered test results at hud.pytorch.org/pr/87629
Note: Links to docs will display an error until the docs builds have been completed.
❌ 2 Failures: as of commit ff1a590, the following jobs have failed:
This comment was automatically generated by Dr. CI and updates every 15 minutes.
ghstack-source-id: 4a7a7a2
Pull Request resolved: #87629
albanD
left a comment
Sounds good. minor comment about windows
```python
    cwd=os.path.dirname(os.path.realpath(__file__)),
).decode("utf-8")
self.assertEquals(out, "")

@unittest.skipIf(IS_WINDOWS, "importing torch+CUDA on CPU results in warning")
```
No need to skip since you don't check for warnings here?
The warning is going to change the stdout/stderr output, right? Then that affects the string we compare against.
test/test_testing.py (Outdated)

```python
out = subprocess.check_output(
    [sys.executable, "-W", "all", "-c", "; ".join(commands)],
    stderr=subprocess.STDOUT,
    # On Windows, opening the subprocess with the default CWD makes `import torch`
```
not needed if windows is skipped?
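The snippet under review joins a list of `commands` into a single `-c` string for a fresh interpreter. A hedged sketch of what such a commands list could look like for checking global logging state (the actual contents of `commands` in the PR may differ; `json` stands in for `torch` here):

```python
import subprocess
import sys

# Statements run inside a fresh interpreter: record the root logger's state,
# perform the import, then fail loudly if the import changed that state.
commands = [
    "import logging",
    "handlers_before = len(logging.root.handlers)",
    "level_before = logging.root.level",
    "import json",  # stand-in for `import torch`
    "assert len(logging.root.handlers) == handlers_before",
    "assert logging.root.level == level_before",
]

out = subprocess.check_output(
    [sys.executable, "-W", "all", "-c", "; ".join(commands)],
    stderr=subprocess.STDOUT,
).decode("utf-8")
print(repr(out))
```

An empty `out` means the asserts inside the subprocess passed and no warnings were printed, which is why any platform-specific import warning has to be accounted for.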
ghstack-source-id: bff2f26
Pull Request resolved: #87629
Test failure is unrelated (tbh I can't find the failing test in the log, but I verified).
@pytorchbot merge -f "test failures are unrelated"
Merge started. Your change will be merged immediately since you used the force (-f) flag, bypassing any CI checks (ETA: 1-5 minutes). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
…orch#87629)

Fixes pytorch#87626. Also adds the same test for `import functorch`. Users have complained when we modify the global logging state, which has happened in the past.

Test Plan:
- tested locally; I added `logging.basicConfig` to `torch/__init__.py` and checked that the test got triggered

Pull Request resolved: pytorch#87629
Approved by: https://github.com/albanD
Stack from ghstack:

- #87629 Add test that `import torch` doesn't modify global logging state

Fixes #87626

Also adds the same test for `import functorch`. Users have complained when we modify the global logging state, which has happened in the past.

Test Plan:
- tested locally; I added `logging.basicConfig` to `torch/__init__.py` and checked that the test got triggered
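As a small illustration of why the `logging.basicConfig` trick in the test plan works (hypothetical snippet, not from the PR): `basicConfig` mutates the process-wide root logger, which is exactly the kind of global state change the new test is meant to catch.

```python
import logging

before = len(logging.root.handlers)
logging.basicConfig()  # attaches a StreamHandler to the root logger if it has none
after = len(logging.root.handlers)

# In a fresh interpreter: before == 0 and after == 1. Any library that calls
# basicConfig() at import time silently reconfigures the host application's
# logging, which is why users have complained about it in the past.
print(before, after)
```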