torch/csrc/jit/fusion_compiler.cpp
Outdated
{aten::sub, "(${0} - ${2}*${1})"},
{aten::rand_like, "uniform(rnd())"},
// min, max
{aten::clamp, "fmaxf(fminf(${0}, ${2}), ${1})"},
Unfortunately this will conflict with #10981 😕
torch/csrc/jit/autodiff.cpp
Outdated
// boundary and the factor is 1 when the boundary is NaN
// the ! is expressed as "1-" for lack of a "not" function and
// the fuser insisting on float
return {(inputs.at(0).isnan() ? inputs.at(0) : grads.at(0))
apaszke
left a comment
LGTM, but I'd like to simplify the derivative formula
torch/csrc/jit/autodiff.cpp
Outdated
return {grads.at(0) * (outputs.at(0) > at::Scalar(0)).type_as(outputs.at(0))};

} else if (node->matches("aten::clamp(Tensor self, Scalar min, Scalar max) -> Tensor")) {
  // we do two type_as as it's free (hopefully) and the "*" only works with float
torch/csrc/jit/autodiff.cpp
Outdated
// but that is hard to reliably code here, so we have 0 as gradient
// when the input is NaN (unless grads is NaN or infinite)
return {grads.at(0)
  * (1-(inputs.at(0).isnan()).type_as(inputs.at(0)))
Thank you, @apaszke, for your feedback! I also put back support for DifferentiableGraph in assertAllFused.
@ezyang I think it is good, but I don't know what to make of the CI failure - it doesn't look related at first sight, but from clicking on "previous build" a couple of times, it's apparently not shared by other PRs.
facebook-github-bot
left a comment
apaszke has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
This patch adds fused forward and backward for clamp to the jit.
This is one item of #11118. If it's OK, I'd be happy to also add more items from #11118.
The patch depends on #11150, which I merged into master as a base. I'll rebase once that or #10981 is merged.
This is my first serious JIT patch; thank you, @ngimel and the others, for your guidance. All errors are my own.