
Remove TORCH_ASSERT #9575

Closed
goldsborough wants to merge 2 commits into pytorch:master from goldsborough:variable-error

Conversation

@goldsborough (Contributor) commented Jul 18, 2018

I got some tensor->variable conversion exceptions from torch/csrc/autograd/variable.h, which used the TORCH_ASSERTM macros instead of AT_CHECK, so they had no backtraces. This was such a substantial loss for debuggability that I decided to update the whole codebase to use the backtrace-enabled ATen macros instead of TORCH_ASSERT and JIT_ASSERT, the latter being an alias of the former.

@ezyang @apaszke @zdevito

@goldsborough (Contributor Author)

@pytorchbot retest this please

1 similar comment
@goldsborough (Contributor Author)

@pytorchbot retest this please

@apaszke (Contributor) commented Jul 19, 2018

Can we please hold off on this patch for a bit? It will conflict with so many JIT changes... Is the cleanup really worth it? We could just #define those macros to the AT_* implementations.

@goldsborough (Contributor Author)

I thought it would just be unnecessary to have JIT_* be the same as AT_*. Is it a huge pain to do the codemod locally in your PRs? (If you globally replace JIT_ASSERT with AT_CHECK before rebasing, you'll have no conflicts.) The JIT is always under heavy development, so I fear there would never be a "perfect" time for this kind of codemod.
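A hypothetical version of that local codemod (the paths and sandbox file below are illustrative only; note also that the real AT_CHECK takes an extra message argument, so a plain rename may still need manual fixups):

```shell
# Demonstrate the rename on a throwaway directory rather than a real checkout.
mkdir -p /tmp/jit_codemod_demo
printf 'JIT_ASSERT(x > 0);\n' > /tmp/jit_codemod_demo/example.cpp

# The codemod itself: find files mentioning JIT_ASSERT and rewrite in place.
grep -rl 'JIT_ASSERT' /tmp/jit_codemod_demo \
  | xargs sed -i 's/JIT_ASSERT/AT_CHECK/g'

cat /tmp/jit_codemod_demo/example.cpp  # now reads AT_CHECK(x > 0);
```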

@goldsborough (Contributor Author)

OK, I'll leave JIT_* as defines.

@facebook-github-bot (Contributor) left a comment

@goldsborough has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@goldsborough changed the title from "Remove TORCH_ASSERT and JIT_ASSERT" to "Remove TORCH_ASSERT" on Jul 19, 2018
@goldsborough (Contributor Author)

@pytorchbot retest this please

@facebook-github-bot (Contributor) left a comment

@goldsborough is landing this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

Missed stuff in tools/

Replace barf with AT_ERROR

Removed %s and fixed interpreter.cpp calls

Remove noexcept from methods that throw

Don't use AT_ERROR in destructor

Undo changes in jit
@facebook-github-bot (Contributor) left a comment

@goldsborough has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@goldsborough (Contributor Author)

@pytorchbot retest this please

zdevito pushed a commit to zdevito/ATen that referenced this pull request Jul 25, 2018
Summary:
I got some tensor->variable conversion exceptions from `torch/csrc/autograd/variable.h`, which used the `TORCH_ASSERTM` macros instead of `AT_CHECK`, so they had no backtraces. This was such a substantial loss for debuggability that I decided to update the whole codebase to use the backtrace-enabled ATen macros instead of `TORCH_ASSERT` and `JIT_ASSERT`, the latter being an alias of the former.

ezyang apaszke zdevito
Pull Request resolved: pytorch/pytorch#9575

Differential Revision: D8924566

Pulled By: goldsborough

fbshipit-source-id: 7a4013b13eec9dbf024cef94cf49fca72f61d441
@goldsborough goldsborough deleted the variable-error branch July 25, 2018 04:01
jramseyer pushed a commit to jramseyer/pytorch that referenced this pull request Jul 30, 2018
goodlux pushed a commit to goodlux/pytorch that referenced this pull request Aug 15, 2018
@ezyang ezyang added the merged label Jun 26, 2019