Optional ScalarType support for native functions & JIT #15154

tugrulates wants to merge 2 commits into pytorch:master from tugrulates:optional-scalartype-base
Conversation
```diff
 else if(n->kind().is_prim()){
   switch(n->kind()){
     case prim::Constant:
+    case prim::None:
```
This just tells the batching pass to copy None values from the JIT graph into the batch JIT graph as-is.
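As a pure-Python sketch of the dispatch above (the node representation and the `to_batch_node` helper are hypothetical stand-ins for the C++ JIT code):

```python
# Sketch: prim::Constant and prim::None nodes are copied into the
# batch graph unchanged, rather than being rewritten into batched ops.
# A node is modeled here as a (kind, value) tuple for illustration.
PASS_THROUGH_KINDS = {"prim::Constant", "prim::None"}

def to_batch_node(node):
    """Return the node to place in the batch graph."""
    kind, value = node
    if kind in PASS_THROUGH_KINDS:
        return (kind, value)  # copied as-is, including None constants
    # Placeholder for the real batching rules applied to other ops.
    return (kind, ("batched", value))

print(to_batch_node(("prim::None", None)))
```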
```diff
 {"Storage", ParameterType::STORAGE},
 {"PyObject*", ParameterType::PYOBJECT},
 {"ScalarType", ParameterType::SCALARTYPE},
-{"optional<ScalarType>", ParameterType::SCALARTYPE},
```
This wasn't used and isn't needed.
```diff
 if arg.get('default') is not None:
     default = arg['default']
-    if default == 'nullptr' or default == 'nullopt' or default == '{}':
+    if default == 'nullptr' or default == 'nullopt' or default == 'c10::nullopt' or default == '{}':
```
why do you need c10::nullopt? where does that come from?
I remember JIT was generating that (register_aten_ops) last week. It doesn't seem to be needed any more, so I removed this.
Ah, this was still needed. It's added in pytorch/aten/src/ATen/native_parse.py, line 26 (at b6edd7b).
I will either revert this or find a different fix in the next PR.
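The check under discussion can be sketched in isolation (the function name and dict shape here are illustrative, not the actual `native_parse.py` code):

```python
# Sketch: normalize the textual defaults that all mean "no value" in the
# generated code. 'c10::nullopt' is the spelling the JIT op registration
# (register_aten_ops) was emitting at the time this thread was written.
NONE_DEFAULTS = {'nullptr', 'nullopt', 'c10::nullopt', '{}'}

def parse_default(arg):
    """Return the parsed default for an argument dict, or None if absent."""
    default = arg.get('default')
    if default is not None and default in NONE_DEFAULTS:
        return None  # treated as an absent optional value
    return default

print(parse_default({'name': 'dtype', 'default': 'c10::nullopt'}))
```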
facebook-github-bot
left a comment
@tugrulates has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
How hard is it to just use …
wanchaol
left a comment
Thanks for adding this! The PR LGTM. Can you add a JIT test for an op that takes Optional[ScalarType], pass a None into it, and see if it works end to end?
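The requested end-to-end test would live in the JIT test suite against a real op; as a plain-Python analogue of the behavior being asked for (no torch here, and `op_with_optional_dtype` is a made-up stand-in for an op with schema `op(Tensor self, ScalarType? dtype=None)`):

```python
from typing import Optional

def op_with_optional_dtype(values, dtype: Optional[type] = None):
    """Made-up op: casts to `dtype`, or keeps the input's own type if
    dtype is None -- mirroring how a None ScalarType should behave."""
    target = dtype if dtype is not None else type(values[0])
    return [target(v) for v in values]

# Passing None must work end to end and fall back to the input dtype.
print(op_with_optional_dtype([1, 2, 3], dtype=None))
print(op_with_optional_dtype([1, 2, 3], dtype=float))
```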
Could we add a test for this by creating a dummy aten op on …
I'm about to convert …
Added another commit which converts all optional types except …

Why leave Tensor out? Why not wait for …? #12582 added optional support by following …

In any case, I can still drop the last commit and use …
Existing JIT tests that use … There is currently no op using … I'm holding off on sending them until this PR is resolved. I plan to add a JIT test for …
I don't see the real value of migrating from …

Also …
Reverted the last commit that changed the schema syntax to …
facebook-github-bot
left a comment
@tugrulates has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
For #6593 and #9515
This completes the support for an optional ScalarType in native functions, JIT, and autograd.
Note: Mostly following the existing implementation for optional that was added in #12582.
This PR introduces a way to make functions accept an optional dtype, which will unblock #9515 by allowing a `dtype` param for the type promotion interface.

An alternative approach could have been using `ScalarType::Undefined` for the same purpose without optional, though it would have been a bit hacky.

Here's an example use of this in action: 971f69e
There are already a bunch of native functions that take an optional `dtype` through function overloading. #15133 is the attempt to migrate all of those. I will send those changes separately after this, since some functions (e.g. `sum`) need quite a bit of change in the codebase. See the commits over there.
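To illustrate the overloading pattern being migrated (the function name and use of Python types for dtypes are illustrative, not the actual ATen code): instead of two overloads, `sum(Tensor)` and `sum(Tensor, ScalarType)`, a single signature with an optional dtype covers both calls:

```python
from typing import Optional

# Before: two overloads, sum(self) and sum(self, dtype).
# After: one signature, sum(self, dtype=None), where None means
# "use the usual type promotion" instead of an explicit cast.
def sum_with_optional_dtype(values, dtype: Optional[type] = None):
    total = sum(values)
    return dtype(total) if dtype is not None else total

print(sum_with_optional_dtype([1, 2, 3]))              # no dtype given
print(sum_with_optional_dtype([1, 2, 3], dtype=float)) # explicit dtype
```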