
[fx] Bypass custom __setattr__ in Node.__init__ #135079

Closed

jansel wants to merge 9 commits into gh/jansel/382/base from gh/jansel/382/head

Conversation

[ghstack-poisoned]
@pytorch-bot

pytorch-bot bot commented Sep 4, 2024

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/135079

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit 5613da3 with merge base 58f2477:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@jansel jansel marked this pull request as draft September 4, 2024 21:45
@jansel jansel requested a review from oulgen September 6, 2024 03:22
@jansel jansel marked this pull request as ready for review September 6, 2024 03:22
@pytorchmergebot
Collaborator

Reverting PR 135079 failed

Reason: Command git -C /home/runner/work/pytorch/pytorch revert --no-edit 7ffb3b201c86f6a84e069a4cacf939183b16ebed returned non-zero exit code 1

Auto-merging torch/_inductor/ir.py
CONFLICT (content): Merge conflict in torch/_inductor/ir.py
error: could not revert 7ffb3b201c... [inductor] Remove LoopBody.reads,writes,other (#135256)
hint: After resolving the conflicts, mark them with
hint: "git add/rm <pathspec>", then run
hint: "git revert --continue".
hint: You can instead skip this commit with "git revert --skip".
hint: To abort and get back to the state before "git revert",
hint: run "git revert --abort".
hint: Disable this message with "git config advice.mergeConflict false"
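The failed automated revert above hit a textual merge conflict in `torch/_inductor/ir.py` because a later commit touched the same lines. The hints in the log describe the manual fix. As a rough, self-contained sketch of that workflow (using a throwaway repo and a hypothetical file `f.txt`, not the actual pytorch tree):

```shell
set -e
tmp="$(mktemp -d)"
cd "$tmp"
git init -q repo
cd repo
git config user.email "test@example.com"
git config user.name "Test"

echo a > f.txt && git add f.txt && git commit -qm c1
echo b > f.txt && git commit -qam c2
echo c > f.txt && git commit -qam c3

# Reverting c2 conflicts, because c3 rewrote the same line c2 touched.
if ! git revert --no-edit HEAD~1 2>/dev/null; then
  echo a > f.txt                     # resolve: restore the pre-c2 content
  git add f.txt                      # mark the conflict resolved
  GIT_EDITOR=true git revert --continue
fi
cat f.txt
```

One could also run `git revert --skip` to drop the commit from the revert, or `git revert --abort` to return to the pre-revert state, exactly as the hints say.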
Details for Dev Infra team. Raised by workflow job.

izaitsevfb added a commit that referenced this pull request Sep 10, 2024
pytorchmergebot pushed a commit that referenced this pull request Sep 10, 2024
…135562)

This reverts commit 66da3b3.

#135079 breaks internal tests and needs to be reverted. Revert with mergebot doesn't work as this PR is technically part of the stack, but, according to @jansel, it should be possible to revert it individually.
Pull Request resolved: #135562
Approved by: https://github.com/jansel, https://github.com/seemethere
jansel added a commit to jansel/pytorch that referenced this pull request Sep 11, 2024
Relands pytorch#135079 which was reverted by pytorch#135562

I broke this up into three parts to test internally.
pytorchmergebot pushed a commit that referenced this pull request Sep 12, 2024
Relands #135079 which was reverted by #135562

I broke this up into three parts to test internally.
Pull Request resolved: #135733
Approved by: https://github.com/oulgen
pytorchmergebot pushed a commit that referenced this pull request Sep 12, 2024
Relands #135079 which was reverted by #135562

I broke this up into three parts to test internally.
Pull Request resolved: #135735
Approved by: https://github.com/oulgen
jansel added a commit to jansel/pytorch that referenced this pull request Sep 12, 2024
Relands pytorch#135079 which was reverted by pytorch#135562

I broke this up into three parts to test internally.
atalman pushed a commit that referenced this pull request Sep 19, 2024
#135625)

Revert "[fx] Bypass custom __setattr__ in Node.__init__ (#135079)" (#135562)

This reverts commit 66da3b3.

#135079 breaks internal tests and needs to be reverted. Revert with mergebot doesn't work as this PR is technically part of the stack, but, according to @jansel, it should be possible to revert it individually.
Pull Request resolved: #135562
Approved by: https://github.com/jansel, https://github.com/seemethere

Co-authored-by: Ivan Zaitsev <ivanzaitsev@fb.com>
Chao1Han pushed a commit to Chao1Han/pytorch that referenced this pull request Sep 20, 2024
This is roughly a 7% speedup in inductor compile time for hf_Bert_large.  The time spent in `LoopBody.__init__` improves from 15% to 8% of `fx_codegen_and_compile`.

Before
![image](https://github.com/user-attachments/assets/7de0f28e-35bd-472f-b4be-b52733d2a85c)

After
![image](https://github.com/user-attachments/assets/5f0cf11a-43c5-43ae-b13c-f32383a75a7f)

Overall
![image](https://github.com/user-attachments/assets/6a369d8c-fb5e-4ad2-9504-0fc745ad6568)

Pull Request resolved: pytorch#135235
Approved by: https://github.com/oulgen
ghstack dependencies: pytorch#135070, pytorch#135076, pytorch#135082, pytorch#135084, pytorch#135079
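The commit message above attributes the compile-time win to skipping per-attribute bookkeeping during node construction, which is what bypassing a custom `__setattr__` in `__init__` achieves. As a rough illustration of the pattern (a hypothetical `TrackedNode` class, not the actual torch.fx implementation):

```python
class TrackedNode:
    """Hypothetical stand-in for a class with an expensive custom __setattr__."""

    def __init__(self, name, op):
        # Fast path: object.__setattr__ goes straight to the instance dict,
        # skipping the override below and its per-write bookkeeping.
        object.__setattr__(self, "mutations", 0)
        object.__setattr__(self, "name", name)
        object.__setattr__(self, "op", op)

    def __setattr__(self, attr, value):
        # Slow path for post-construction writes: count each mutation.
        object.__setattr__(self, "mutations", self.mutations + 1)
        object.__setattr__(self, attr, value)


n = TrackedNode("x", "placeholder")
assert n.mutations == 0   # no bookkeeping fired during __init__
n.name = "y"
assert n.mutations == 1   # normal writes still go through __setattr__
```

Since `__init__` typically sets every attribute exactly once on a fresh object, the override's checks add no value there, and skipping them can matter when millions of nodes are constructed during compilation.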
Chao1Han pushed a commit to Chao1Han/pytorch that referenced this pull request Sep 20, 2024
Chao1Han pushed a commit to Chao1Han/pytorch that referenced this pull request Sep 20, 2024
…)" (pytorch#135562)

This reverts commit 66da3b3.

pytorch#135079 breaks internal tests and needs to be reverted. Revert with mergebot doesn't work as this PR is technically part of the stack, but, according to @jansel, it should be possible to revert it individually.
Pull Request resolved: pytorch#135562
Approved by: https://github.com/jansel, https://github.com/seemethere
Chao1Han pushed a commit to Chao1Han/pytorch that referenced this pull request Sep 20, 2024
…135733)

Relands pytorch#135079 which was reverted by pytorch#135562

I broke this up into three parts to test internally.
Pull Request resolved: pytorch#135733
Approved by: https://github.com/oulgen
Chao1Han pushed a commit to Chao1Han/pytorch that referenced this pull request Sep 20, 2024
…135735)

Relands pytorch#135079 which was reverted by pytorch#135562

I broke this up into three parts to test internally.
Pull Request resolved: pytorch#135735
Approved by: https://github.com/oulgen
@github-actions github-actions bot deleted the gh/jansel/382/head branch October 12, 2024 02:04

Labels

Merged release notes: fx release notes category

4 participants