Try finally block for with context on graph break instruction #213
anijain2305 merged 6 commits into pytorch:main
Conversation
@jansel this is ready for review.
torchdynamo/variables/misc.py
Outdated
    return [
        codegen.create_load_global("torch"),
        codegen.create_load_attr("_C"),
        create_instruction("LOAD_METHOD", 2, "_set_grad_enabled"),
I don't think we can know that name index 2 will be `_set_grad_enabled`.
`LOAD_METHOD` looks up `code.co_names[2]`, so we need to add `_set_grad_enabled` to `co_names` if it is not there already.
We have some helpers for this already in codegen.
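
For intuition, here is a minimal sketch of the `co_names` bookkeeping described above; `get_name_index` and the bare `names` list are hypothetical stand-ins, not torchdynamo's actual codegen helpers:

```python
def get_name_index(names, name):
    # LOAD_METHOD's integer argument indexes into code.co_names, so the
    # name must actually exist at that index; append it if missing.
    if name not in names:
        names.append(name)
    return names.index(name)

names = ["torch", "_C"]
idx = get_name_index(names, "_set_grad_enabled")
# idx == 2 only because we just appended the entry; hard-coding 2 would
# be wrong for any other co_names layout.
```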
torchdynamo/variables/misc.py
Outdated
    return ([], [])

    def set_grad_insts(mode):
        codegen.load_import_from("torch", "_C")
This has the side effect of creating a bunch of instructions in `codegen.output_instructions`. These won't be split into the prologue/epilogue. Perhaps we should refactor this to return the instructions you want.
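
A sketch of what that refactor could look like, assuming `create_instruction` behaves as in the diff above; the exact opcode sequence is illustrative, not the PR's final code:

```python
def set_grad_insts(mode):
    # Build and return the bytecode for torch._C._set_grad_enabled(mode)
    # instead of appending to codegen.output_instructions as a side
    # effect, so the caller can place it in the prologue or epilogue.
    return [
        create_instruction("LOAD_GLOBAL", argval="torch"),
        create_instruction("LOAD_ATTR", argval="_C"),
        create_instruction("LOAD_METHOD", argval="_set_grad_enabled"),
        create_instruction("LOAD_CONST", argval=mode),
        create_instruction("CALL_METHOD", 1),
        create_instruction("POP_TOP"),
    ]
```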
torchdynamo/variables/misc.py
Outdated
    codegen.load_import_from("torch", "_C")
    codegen.load_import_from("torch._C", "_set_grad_enabled")
    return [
        codegen.create_load_global("torch"),
The user might have overloaded the global variable `torch` (`load_import_from` checks for this).
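
To illustrate the hazard (hypothetical user code, not from the PR):

```python
import torch as _torch

torch = "shadowed"  # user rebinds the global name `torch`

def fn(x):
    # Generated bytecode that does LOAD_GLOBAL "torch" here would pick
    # up the string above, not the module, so codegen must not assume
    # the global name still refers to the real torch module.
    return _torch.sin(x)
```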
Thanks for the review. I have incorporated your comments. PTAL.
Fixes #207 and also hf_Reformer with AOTAutograd training. Relying on CI to test for different Python versions.
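
For context, a minimal shape of the pattern this PR targets (inferred from the title and #207, not a verbatim repro): a graph break inside a `with` block, where the generated continuation code must restore the grad state with a try/finally:

```python
import torch

def fn(x):
    with torch.no_grad():
        y = x + 1
        print("graph break")  # unsupported call forces a graph break
        return y * 2

# Without try/finally around the generated torch._C._set_grad_enabled
# calls, an exception at the break point could leak the no_grad state.
```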