Introduce __array_priority__ on torch.Tensor #9651
t-vi wants to merge 3 commits into pytorch:master
Conversation
This causes numpy to yield to the torch functions: e.g., instead of the numpy array/scalar __mul__ converting the tensor to an array, it will now arrange for the Tensor __rmul__ to be called. Fixes case 2 of pytorch#9468. It also makes cases 3 and 4 equivalent but does not fix them.
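A minimal sketch of the dispatch change, using a toy class rather than torch.Tensor itself: a high `__array_priority__` makes NumPy's binary operators return NotImplemented, so Python falls back to the right operand's reflected method instead of NumPy converting the object to an array.

```python
import numpy as np

class PrioritizedTensor:
    # A high __array_priority__ tells ndarray binary ops to defer
    # (return NotImplemented), so Python calls our __rmul__.
    __array_priority__ = 1000

    def __rmul__(self, other):
        return "Tensor.__rmul__ called"

class PlainTensor:
    # Without a priority, NumPy implicitly converts this object instead.
    def __rmul__(self, other):
        return "Tensor.__rmul__ called"

arr = np.ones(3)
print(arr * PrioritizedTensor())        # 'Tensor.__rmul__ called'
print(type(arr * PlainTensor()))        # an ndarray: the old implicit conversion
```

This is the "numpy yields to torch" behaviour from the description; the old behaviour (second print) is the silent array-making the tests were relying on.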
Ha, so some of the tests depended on the old behaviour of silently making arrays of things. I'll fix the tests...
@pytorchbot retest this please
facebook-github-bot
left a comment
@ssnl has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
A much more elaborate approach would be to make the array priority of new tensors configurable (easy) and possibly even give calculation results the max of the inputs' priorities (quite invasive).
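The "max of the inputs" rule mentioned above could look something like this; `result_priority` is a hypothetical illustrative helper, not an existing PyTorch or NumPy API.

```python
def result_priority(*operands, default=0.0):
    """Hypothetical: priority a calculation result would inherit.

    Objects without __array_priority__ count as NumPy's default (0.0),
    so the result defers exactly when any input would have deferred.
    """
    return max(
        (getattr(op, "__array_priority__", default) for op in operands),
        default=default,
    )
```

The invasive part is not computing the max but threading it through every operation that produces a new tensor.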
ezyang
left a comment
I don't really know what __array_priority__ etiquette is, but this totally seems reasonable.
facebook-github-bot
left a comment
@ezyang has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
This is BC breaking though.
ezyang
left a comment
need to work out BC considerations
So we passed a tensor to scipy.stats.binom. Now if you look at https://github.com/scipy/scipy/blob/master/scipy/stats/_distn_infrastructure.py, you see that most functions use asarray. So in the spectrum of "mixing non-tensor and tensor" you could have these options.
That means that functions which didn't use asarray, relied on some binary op doing an implicit conversion, and actually need an array now fail. So in summary, I'm not terribly worried about the backwards compatibility - it was more than fragile before - if scipy decided to do ...
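A sketch of why the scipy-style asarray paths keep working while implicit-conversion paths change, again with a toy class standing in for torch.Tensor: `np.asarray` uses the `__array__` protocol and ignores `__array_priority__`, whereas binary ops respect the priority and defer.

```python
import numpy as np

class PrioritizedTensor:
    __array_priority__ = 1000  # binary ops: NumPy defers to us

    def __init__(self, data):
        self._data = list(data)

    def __array__(self, dtype=None, copy=None):
        # Explicit conversion still works; asarray ignores the priority.
        return np.array(self._data, dtype=dtype)

    def __rmul__(self, other):
        return "Tensor.__rmul__ called"

t = PrioritizedTensor([1.0, 2.0])
print(np.asarray(t))    # array([1., 2.]): code calling asarray is unaffected
print(np.ones(2) * t)   # 'Tensor.__rmul__ called': implicit conversion is gone
```

So code that explicitly converts (as most of `_distn_infrastructure.py` does) is unaffected; only code that counted on a binary op silently producing an array sees the change.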
Summary: This causes numpy to yield to the torch functions: e.g., instead of the numpy array/scalar __mul__ converting the tensor to an array, it will now arrange for the Tensor __rmul__ to be called. Fixes case 2 of pytorch#9468. It also makes cases 3 and 4 equivalent but does not fix them. Pull Request resolved: pytorch#9651 Differential Revision: D8948079 Pulled By: ezyang fbshipit-source-id: bd42c04e96783da0bd340f37f4ac3559e9bbf8db