
add checks in autograd c++ ops for double backward without retain_variables #1293

Closed

soumith wants to merge 2 commits into master from autogradopschecks

Conversation

soumith (Collaborator) commented Apr 19, 2017

Fixes #1288

auto BatchNormBackward::apply(const variable_list& grad_outputs) -> variable_list {
  auto& input = this->input.unpack();

  if (!input) throw std::runtime_error("Trying to backward through the "


apaszke (Contributor) left a comment


Can you rebase that commit on top of autograd?

#ifndef PYTORCH_AUTOGRAD_ERRORS_H
#define PYTORCH_AUTOGRAD_ERRORS_H

#define PT_ERR_BACKWARD_TWICE "Trying to backward through the " \


@@ -0,0 +1,9 @@
+#ifndef PYTORCH_AUTOGRAD_ERRORS_H
+#define PYTORCH_AUTOGRAD_ERRORS_H


