[dynamo] Fix dunder attr access on WrapperUserFunctionVariable (lru_cache, wraps) #176934
guilhermeleobas wants to merge 7 commits into gh/guilhermeleobas/310/base
Conversation
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/176934
Note: Links to docs will display an error until the docs builds have been completed.
⏳ No Failures, 83 Pending — as of commit 9bc8c84 with merge base 3fab23b. UNSTABLE: the following jobs are marked as unstable, possibly due to flakiness on trunk.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
…ache, wraps) `WrapperUserFunctionVariable` now inherits from `BaseUserFunctionVariable` instead of `VariableTracker`, gaining the shared `var_getattr` implementation that handles `__name__`, `__qualname__`, `__doc__`, `__module__`, `__code__`, `__dict__`, `__annotations__`, and `__type_params__`. This fixes `functools.wraps` applied to `lru_cache`-wrapped functions at trace time — previously, accessing `__name__`, `__dict__`, etc. on the wrapper object would graph-break. Co-authored Claude Sonnet 4.6 ghstack-source-id: 9b69323 Pull-Request: #176934
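The pattern this fixes can be sketched in plain Python, with no Dynamo involved; this is a minimal, hypothetical example of the attribute accesses that previously graph-broke when traced:

```python
import functools

# lru_cache wraps cached_fn in a wrapper object; lru_cache itself calls
# functools.update_wrapper, so the wrapper exposes __name__, __doc__, etc.
@functools.lru_cache(maxsize=None)
def cached_fn(x):
    """Double x."""
    return x * 2

# functools.wraps reads __name__, __qualname__, __doc__, __module__,
# __dict__, ... off the lru_cache wrapper - exactly the accesses Dynamo
# must now resolve at trace time instead of graph-breaking.
@functools.wraps(cached_fn)
def outer(x):
    return cached_fn(x)

print(outer.__name__)   # "cached_fn"
print(outer.__doc__)    # "Double x."
```

Under `torch.compile`, a function body performing these accesses would previously fall back to eager at the attribute lookup; with this PR they are resolved by the shared `var_getattr`.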
def var_getattr(self, tx: "InstructionTranslator", name: str):
    fn_dict = self.get_dict_vt(tx)
    # missing: __globals__, __closure__, __kwdefaults__, __defaults__
Is this a TODO we plan on adding later? If so, can we make this clear?
Or, if these are intentionally not supported here, we should call that out.
It's intentional.
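For context on why that split is natural: `functools.wraps` only copies the metadata dunders and merges `__dict__`, while `__globals__`, `__closure__`, `__defaults__`, and `__kwdefaults__` are function internals that `wraps` never touches. The stdlib constants below are real `functools` attributes and make the boundary explicit:

```python
import functools

# Attributes assigned onto the wrapper by functools.wraps/update_wrapper
# by default (Python 3.12+ also includes "__type_params__"):
print(functools.WRAPPER_ASSIGNMENTS)

# __dict__ is not assigned; it is merged via dict.update():
print(functools.WRAPPER_UPDATES)   # ('__dict__',)
```

Note that none of `__globals__`, `__closure__`, `__defaults__`, or `__kwdefaults__` appear in either tuple, which matches the "intentional" answer above.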
torch/_dynamo/variables/functions.py
  def var_getattr(self, tx: "InstructionTranslator", name: str) -> VariableTracker:
-     if name == "__dict__":
+     if name in ("__dict__",):
          return self.get_dict_vt(tx)
Why use `in` for a single-element check?

I forgot to undo this change before committing. I'll revert it.
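The `__dict__` branch under discussion matters because `wraps` merges the wrappee's `__dict__` into the new wrapper (via `WRAPPER_UPDATES`), so a trace-time `__dict__` access has to return a real dict variable. A small sketch of that behavior in plain Python:

```python
import functools

@functools.lru_cache(maxsize=None)
def g(x):
    return x

g.tag = "cached"          # lands in the lru_cache wrapper's __dict__

@functools.wraps(g)
def h(x):
    return g(x)

# wraps merged g.__dict__ into h.__dict__, so the attribute carries over:
print(h.tag)              # "cached"
```

This is why `var_getattr` routes `__dict__` through `get_dict_vt` rather than treating it like the other (plain-value) metadata dunders.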
@pytorchbot merge

Merge started. Your change will be merged once all checks pass (ETA 0-4 hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.

Merge failed. Reason: 1 job has failed, first few of them are: inductor / inductor-cpu-build / build. Details for Dev Infra team: raised by workflow job.
@pytorchbot merge -i

Merge started. Your change will be merged while ignoring the following 3 checks: inductor / inductor-cpu-build / build, inductor / unit-test / inductor-test / test (inductor_cpp_wrapper, 1, 2, linux.g5.4xlarge.nvidia.gpu), inductor / unit-test / inductor-test / test (inductor_cpp_wrapper, 2, 2, linux.g5.4xlarge.nvidia.gpu). Learn more about merging in the wiki.

Merge started. Your change will be merged once all checks pass (ETA 0-4 hours).
@pytorchbot revert -m "This is being reverted due to an internal build dependency issue.

❌ 🤖 pytorchbot command failed:

@pytorchbot revert -m "This is being reverted due to an internal build dependency issue. The code changes themselves are correct and the tests pass. However, when importing this change internally, it triggers a dependency blocklist violation; please find Meta folks to help fix this internally, see D96281773" -c ghfirst

@pytorchbot successfully started a revert job. Check the current status here.

@guilhermeleobas your PR has been successfully reverted.
…e (lru_cache, wraps) (#176934)" This reverts commit 609e0ed. Reverted #176934 on behalf of https://github.com/yangw-dev due to: This is being reverted due to an internal build dependency issue. The code changes themselves are correct and the tests pass. However, when importing this change internally, it triggers a dependency blocklist violation; please find Meta folks to help fix this internally, see D96281773 ([comment](#176934 (comment)))
I apologize, this should not have been reverted; the diff train is a bit messed up today. I'll remerge this; nothing is needed from the author.
@pytorchbot merge -i

Merge started. Your change will be merged while ignoring the following 1 check: inductor / inductor-cpu-build / build. Learn more about merging in the wiki.
@pytorchbot merge -f 'accidentally revert this pr, due to misinfo from internal diff train'
The merge job was canceled or timed out. This most often happens if two merge requests were issued for the same PR, or if the merge job was waiting for more than 6 hours for tests to finish. In the latter case, please do not hesitate to reissue the merge command.

Merge started. Your change will be merged immediately since you used the force (-f) flag, bypassing any CI checks (ETA: 1-5 minutes). Please use the force flag only as a last resort. Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
@yangw-dev Do I need to do anything?
…ache, wraps) (pytorch#176934) `WrapperUserFunctionVariable` now inherits from `BaseUserFunctionVariable` instead of `VariableTracker`, gaining the shared `var_getattr` implementation that handles `__name__`, `__qualname__`, `__doc__`, `__module__`, `__code__`, `__dict__`, `__annotations__`, and `__type_params__`. This fixes `functools.wraps` applied to `lru_cache`-wrapped functions at trace time — previously, accessing `__name__`, `__dict__`, etc. on the wrapper object would graph-break. Co-authored Claude Sonnet 4.6 Pull Request resolved: pytorch#176934 Approved by: https://github.com/Lucaskabela, https://github.com/williamwen42
…ache, wraps) (pytorch#176934) `WrapperUserFunctionVariable` now inherits from `BaseUserFunctionVariable` instead of `VariableTracker`, gaining the shared `var_getattr` implementation that handles `__name__`, `__qualname__`, `__doc__`, `__module__`, `__code__`, `__dict__`, `__annotations__`, and `__type_params__`. This fixes `functools.wraps` applied to `lru_cache`-wrapped functions at trace time — previously, accessing `__name__`, `__dict__`, etc. on the wrapper object would graph-break. Co-authored Claude Sonnet 4.6 Pull Request resolved: pytorch#176934 Approved by: https://github.com/Lucaskabela, https://github.com/williamwen42 ghstack dependencies: pytorch#176623
…e (lru_cache, wraps) (pytorch#176934)" This reverts commit b09edda. Reverted pytorch#176934 on behalf of https://github.com/malfet due to Broke dynamo_wrapped tests, see https://hud.pytorch.org/hud/pytorch/pytorch/e274afff409c9c595b25206160e7c1dd4c6bbf6a/1?per_page=50&name_filter=dynamo_wrapped&mergeEphemeralLF=true ([comment](pytorch#176934 (comment)))
Stack from ghstack (oldest at bottom):

`WrapperUserFunctionVariable` now inherits from `BaseUserFunctionVariable` instead of `VariableTracker`, gaining the shared `var_getattr` implementation that handles `__name__`, `__qualname__`, `__doc__`, `__module__`, `__code__`, `__dict__`, `__annotations__`, and `__type_params__`. This fixes `functools.wraps` applied to `lru_cache`-wrapped functions at trace time — previously, accessing `__name__`, `__dict__`, etc. on the wrapper object would graph-break.
Co-authored Claude Sonnet 4.6
cc @voznesenskym @penguinwu @EikanWang @jgong5 @Guobing-Chen @XiaobingSuper @zhuhaozhe @blzheng @wenzhe-nrv @jiayisunx @kadeng @chauhang @amjames @Lucaskabela @jataylo