Update autograd docs #5907
Conversation
Force-pushed 541c339 to 0cf136c
Advise to replace with 'grad_tensors'.
Force-pushed 0cf136c to deeee80
ssnl left a comment:
Generally looks great! Have some very minor nits and suggestions. :)
    declare them with ``requires_grad=False`` or detach them from the computation graph with
    :func:`torch.Tensor.detach`.
    - Methods such as ``var.backward(), var.detach(), var.register_hook()`` now work on tensors
      with the same name.
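The migration described in the quoted docs can be illustrated with a minimal sketch (plain PyTorch 0.4+ API; the variable names are illustrative, not from the PR):

```python
import torch

# Tensors track no history by default (requires_grad=False), and
# detach() cuts an existing result out of the computation graph.
x = torch.ones(2, 2, requires_grad=True)
y = (x * 2).detach()   # same values, no recorded history
z = torch.ones(2, 2)   # requires_grad defaults to False

print(y.requires_grad)  # False
print(z.requires_grad)  # False

# Methods that used to live on Variable now work directly on tensors:
loss = (x * 3).sum()
loss.backward()
print(x.grad)           # d(loss)/dx = 3 everywhere
```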
    outputs (sequence of Tensor): outputs of the differentiated function.
    inputs (sequence of Tensor): Inputs w.r.t. which the gradient will be
        returned (and not accumulated into ``.grad``).
    grad_outputs (sequence of Tensor): Gradients w.r.t. each output.
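A short sketch of the ``torch.autograd.grad`` docstring being edited above (the function and keywords are PyTorch's; the concrete values are illustrative):

```python
import torch

x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
y = x ** 2                       # output of the differentiated function

# grad_outputs is the vector in the vector-Jacobian product.
(g,) = torch.autograd.grad(outputs=y, inputs=x,
                           grad_outputs=torch.ones_like(y))
print(g)        # dy/dx = 2x -> tensor([2., 4., 6.])

# As the docstring says: the gradient is returned, not accumulated
# into .grad, so x.grad stays untouched.
print(x.grad)   # None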
    None values can be specified for scalar Tensors or ones that don't require
    grad. If a None value would be acceptable for all grad_tensors, then this
    argument is optional.
    retain_graph (bool, optional): If ``False``, the graph used to compute the grad
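The ``grad_tensors`` and ``retain_graph`` arguments documented above can be sketched as follows (PyTorch 0.4+ API; values are illustrative):

```python
import torch

x = torch.tensor([1.0, 2.0], requires_grad=True)
y = x * x

# Non-scalar output, so grad_tensors supplies the weighting vector.
# retain_graph=True keeps the graph alive for a second backward pass.
torch.autograd.backward([y], grad_tensors=[torch.ones_like(y)],
                        retain_graph=True)
torch.autograd.backward([y], grad_tensors=[torch.ones_like(y)])

# Unlike grad(), backward() accumulates into .grad: 2x + 2x.
print(x.grad)   # tensor([4., 8.])
```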
    a :class:`torch.Tensor`. Below please find a quick guide on what has changed:

    - ``Variable(tensor, requires_grad=True)`` is now ``torch.tensor(tensor, requires_grad=True)``
    - ``var.data`` is now just ``tensor``. If you don't want to record operations on tensors,
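The construction change in the first bullet can be sketched as (PyTorch 0.4+ API; the data is illustrative):

```python
import torch

data = [1.0, 2.0, 3.0]

# Old:  v = Variable(torch.Tensor(data), requires_grad=True)
# New: the factory takes requires_grad directly.
v = torch.tensor(data, requires_grad=True)

# detach() gives the history-free view that var.data used to provide.
d = v.detach()
print(v.requires_grad, d.requires_grad)  # True False
```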
    r"""Context-manager that sets gradient calculation to on or off.

    - `set_grad_enabled` will enable or disable grads based on its argument `mode`.
    + ``set_grad_enabled`` will enable or disable grads based on its argument ``mode``.
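A minimal sketch of the context manager whose docstring is quoted above (PyTorch 0.4+ API; the tensors are illustrative):

```python
import torch

x = torch.ones(1, requires_grad=True)

# mode=False disables grad recording inside the block.
with torch.set_grad_enabled(False):
    y = x * 2
print(y.requires_grad)   # False

# mode can be a runtime value, which is its advantage over no_grad():
is_train = True
with torch.set_grad_enabled(is_train):
    z = x * 2
print(z.requires_grad)   # True
```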
Will do another pass tomorrow after CI finishes.
    ``requires_grad`` set to ``True``. In addition, ``Variable(tensor)`` now returns
    a :class:`torch.Tensor`. Below please find a quick guide on what has changed:

    - ``Variable(tensor, requires_grad=True)`` is now ``torch.tensor(tensor, requires_grad=True)``
    ``requires_grad`` set to ``True``. In addition, ``Variable(tensor)`` now returns
    a :class:`torch.Tensor`. Below please find a quick guide on what has changed:

    - ``Variable(tensor, requires_grad=True)`` is now ``torch.tensor(tensor, requires_grad=True)``.
    if grad_tensors is None:
        grad_tensors = grad_variables
    else:
        warnings.warn("'grad_tensors' and 'grad_variables' (deprecated) arguments both "
    warnings.warn("'grad_tensors' and 'grad_variables' (deprecated) arguments both "
                  "passed to backward(). Using 'grad_tensors'.")

    tensors = (tensors,) if isinstance(tensors, Variable) else tuple(tensors)
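The deprecation pattern in the diff above can be sketched framework-free (the ``backward`` stub below is illustrative, not PyTorch's implementation):

```python
import warnings

def backward(grad_tensors=None, grad_variables=None):
    # Accept both the old and the new keyword; prefer the new one,
    # warning when the deprecated one is involved.
    if grad_variables is not None:
        warnings.warn("'grad_variables' is deprecated. Use 'grad_tensors'.")
        if grad_tensors is None:
            grad_tensors = grad_variables
        else:
            warnings.warn("'grad_tensors' and 'grad_variables' (deprecated) "
                          "arguments both passed to backward(). "
                          "Using 'grad_tensors'.")
    return grad_tensors

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    out = backward(grad_variables=[1, 2])
print(out)            # [1, 2] -- old keyword forwarded to the new one
print(len(caught))    # 1 deprecation warning was raised
```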
module 'torch' has no attribute 'no_grad'
@lianzhibin if you are using 0.3.1 or earlier, then you are looking at the wrong docs. This PR only updates the docs for master.
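For reference, ``torch.no_grad`` exists on master / 0.4+ as a context manager (the tensors below are illustrative):

```python
import torch

x = torch.ones(2, requires_grad=True)

# Operations inside no_grad() are not recorded in the graph.
with torch.no_grad():
    y = x * 2
print(y.requires_grad)   # False
```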
* Update autograd docs
* Deprecate 'grad_variables' in backward(). Advise to replace with 'grad_tensors'.
* Resolve saved_variables/saved_tensors
* Tensor section
* Address comments
* Address comments
* Address comments
There are some FIXMEs right now that I'll make another pass through. Opening this PR for discussion.