
[jit] Fix handling of IntList[k] parameters #6965

Merged
ezyang merged 5 commits into pytorch:master from elanmart:jit-intlist
Apr 30, 2018

Conversation

@elanmart
Contributor

Fixes #6734

If there's anything wrong with this PR, it's my fault. If there's anything good about it, credit goes to @t-vi.

@ezyang ezyang added the oncall: jit Add this issue/PR to JIT oncall triage queue label Apr 26, 2018
@ezyang
Contributor

ezyang commented Apr 26, 2018

CC @apaszke @zdevito @jamesr66a

Contributor

@apaszke apaszke left a comment


Thanks a lot @elanmart and @t-vi! I think we can do this without going through the powerset, so it would be nice to change it.

Actually I think it would be even better if we had a different auto-generated JIT pass, that normalizes out all of those cases that are normally handled by the Python arg parser. Right now it's hard to handle those nodes in all optimization passes, because they can appear in a lot of possible configurations. What we should do instead is do this normalization before even attempting to run anything else on the graph. In this case, it would check if a particular node has IntList[x] args declared, and would change any int attributes under this name to correctly expanded IntList attrs. However that's a slightly bigger change, so I'm ok with merging this to unblock the sum PR, and I'll fix it later myself.
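To make the normalization idea concrete, here is a minimal, purely illustrative sketch (not PyTorch code) of what expanding an IntList[k] parameter would look like: a single int attribute is broadcast to a list of length k, mirroring what the Python arg parser does (e.g. `kernel_size=3` becoming `[3, 3]` for a 2-D op). The function name and error messages are assumptions for the example.

```python
def expand_intlist(value, k):
    """Expand a scalar int into a list of length k; pass lists
    through after checking their length (a sketch of the IntList[k]
    broadcasting the Python arg parser performs)."""
    if isinstance(value, int):
        return [value] * k
    if isinstance(value, (list, tuple)):
        if len(value) != k:
            raise ValueError("expected %d ints, got %d" % (k, len(value)))
        return list(value)
    raise TypeError("expected int or list of ints")

print(expand_intlist(3, 2))       # [3, 3]
print(expand_intlist([1, 2], 2))  # [1, 2]
```

A normalization pass as described would run this expansion on each node's attributes before any optimization pass sees the graph, so later passes only ever encounter the fully expanded IntList form.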

test/test_jit.py Outdated

This comment was marked as off-topic.

@t-vi
Collaborator

t-vi commented Apr 26, 2018

I must admit that the faults in the implementation strategy are probably mine here.

This is what I had in mind when I looked into making the JIT actually use the python_arg_parser code to determine which function to call: I would change this to imitate (or reuse) python_arg_parser.h/cpp, looking up the name and then iterating over the positional arguments, since the parameters processed by the JIT are not automatically converted into the right descriptor string. That would be a largish change, though. In my preliminary patch, I implemented a FunctionSignature.parse_jit (or match_jit) function and a FunctionParameter.check_jit function to do this. The JIT dispatch would then have the full signature (which might also be useful for handling "exotic" parameters such as device).

My understanding is that this is not the way you want it, but how would you want the JIT to work here?
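As a rough illustration of the matching scheme described above (the names below are hypothetical stand-ins, not the actual patch): given several overload signatures for one op name, iterate over the positional arguments and pick the first signature whose per-parameter checks all pass.

```python
# Hypothetical sketch of python_arg_parser-style overload matching.
# check_jit is a stand-in for a FunctionParameter.check_jit-like
# predicate; match_jit is a stand-in for FunctionSignature.match_jit.

def check_jit(param_type, value):
    """Return True if value is acceptable for this parameter type."""
    checks = {
        "int": lambda v: isinstance(v, int),
        "IntList": lambda v: isinstance(v, (list, tuple))
                             and all(isinstance(x, int) for x in v),
    }
    return checks.get(param_type, lambda v: False)(value)

def match_jit(signatures, args):
    """Return the first signature whose positional parameters all
    accept the given args, else None."""
    for sig in signatures:
        types = sig["params"]
        if len(types) == len(args) and all(
            check_jit(t, a) for t, a in zip(types, args)
        ):
            return sig
    return None

sigs = [{"name": "sum", "params": ["IntList"]},
        {"name": "sum", "params": ["int"]}]
print(match_jit(sigs, [2])["params"])       # ['int']
print(match_jit(sigs, [[0, 1]])["params"])  # ['IntList']
```

With the full matched signature in hand, dispatch could then normalize int-vs-IntList forms in one place instead of generating a separate entry per combination.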

@elanmart
Contributor Author

elanmart commented Apr 26, 2018

@apaszke

Actually I think it would be even better if we had a different auto-generated JIT pass, that normalizes out all of those cases that are normally handled by the Python arg parser. (...) However that's a slightly bigger change, so I'm ok with merging this to unblock the sum PR, and I'll fix it later myself.

I agree this PR doesn't seem like the nicest possible change. I'd be happy to take a try at a more elaborate one, but I'm afraid I wouldn't know where to start with it.

So perhaps we can unblock #6152 , and I'll try to work with @t-vi to come up with a proper solution?

@apaszke
Contributor

apaszke commented Apr 27, 2018

@elanmart can you please just tell me how many lines there are in the generated aten_dispatch.cpp after this patch?

@elanmart
Contributor Author

$ wc -l torch/csrc/jit/generated/aten_dispatch.cpp 
21619 torch/csrc/jit/generated/aten_dispatch.cpp

@apaszke
Contributor

apaszke commented Apr 27, 2018

Yeah, so the powerset makes it 2x larger 😕

wc -l torch/csrc/jit/generated/aten_dispatch.cpp 
10897 torch/csrc/jit/generated/aten_dispatch.cpp

Eh, it's not great, but I guess we'll need to stick with it for now.
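The "powerset" blowup presumably refers to emitting one dispatch variant per subset of IntList[k] parameters that may arrive as a plain int. A quick illustrative calculation (assumed parameter names, not actual op signatures) shows why the generated file roughly doubles from 10,897 to 21,619 lines:

```python
# Illustrative only: with n IntList[k] parameters, enumerating every
# subset that could be passed as a single int yields 2**n dispatch
# variants, so the generated code grows exponentially in n.
from itertools import combinations

def intlist_variants(params):
    """All subsets of params treated as 'passed as a single int'."""
    out = []
    for r in range(len(params) + 1):
        out.extend(combinations(params, r))
    return out

variants = intlist_variants(["kernel_size", "stride", "padding"])
print(len(variants))  # 8 == 2**3
```

A single normalization pass, as proposed above in the thread, would collapse all of these variants back into one canonical entry per overload.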

Copy link
Contributor

@apaszke apaszke left a comment


Let's clean up indices_to_replace and merge this.


* emit additional declarations and handle positional arg. case
* apply minor tweaks
* py-2 fix
* Address Tom's comments
* move logic to gen_jit_dispatch, start adding tests
* add test
@ezyang ezyang merged commit bc62645 into pytorch:master Apr 30, 2018
Jorghi12 pushed a commit to wsttiger/pytorch that referenced this pull request May 10, 2018
*  squash commits

* emit additional declarations and handle positional arg. case
* apply minor tweaks
* py-2 fix
* Address Tom's comments
* move logic to gen_jit_dispatch, start adding tests
* add test

* address review comments

* address review comment

* fix build issue. change argument indices to argument names. Get rid of deepcopy

* py-2 flake8 fix
weiyangfb pushed a commit to weiyangfb/pytorch that referenced this pull request Jun 11, 2018

Labels

oncall: jit Add this issue/PR to JIT oncall triage queue open source


Development

Successfully merging this pull request may close these issues.

[JIT] jit does not properly handle IntList[x] params

4 participants